Mar 14 06:58:56 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 14 06:58:56 crc restorecon[4833]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 14 06:58:56 crc restorecon[4833]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc 
restorecon[4833]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc 
restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 
06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc 
restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:56 crc 
restorecon[4833]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56
crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 
06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:56 crc 
restorecon[4833]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc 
restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc 
restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 
crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc 
restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc 
restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:56 crc 
restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc 
restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 06:58:57 crc restorecon[4833]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 14 06:58:57 crc restorecon[4833]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 14 06:58:57 crc kubenswrapper[5129]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 14 06:58:57 crc kubenswrapper[5129]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 14 06:58:57 crc kubenswrapper[5129]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 14 06:58:57 crc kubenswrapper[5129]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 14 06:58:57 crc kubenswrapper[5129]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 14 06:58:57 crc kubenswrapper[5129]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.771297 5129 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781571 5129 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781640 5129 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781655 5129 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781678 5129 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781693 5129 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781707 5129 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781719 5129 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781732 5129 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781743 5129 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781755 5129 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781767 5129 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781777 5129 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781787 5129 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781798 5129 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781808 5129 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781818 5129 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781827 5129 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781838 5129 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781848 5129 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781858 5129 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781867 5129 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781877 5129 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781887 5129 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781897 5129 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781907 5129 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781917 5129 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781926 5129 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781936 5129 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781948 5129 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781958 5129 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781968 5129 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781978 5129 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781988 5129 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.781999 5129 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782008 5129 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782018 5129 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782028 5129 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782038 5129 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782051 5129 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782062 5129 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782075 5129 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782092 5129 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782103 5129 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782115 5129 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782127 5129 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782137 5129 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782147 5129 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782158 5129 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782171 5129 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782183 5129 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782195 5129 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782206 5129 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782216 5129 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782229 5129 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782240 5129 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782250 5129 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782260 5129 feature_gate.go:330] unrecognized feature gate: Example
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782269 5129 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782279 5129 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782288 5129 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782299 5129 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782309 5129 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782318 5129 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782331 5129 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782341 5129 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782351 5129 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782361 5129 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782371 5129 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782380 5129 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782391 5129 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.782409 5129 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782668 5129 flags.go:64] FLAG: --address="0.0.0.0"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782695 5129 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782714 5129 flags.go:64] FLAG: --anonymous-auth="true"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782728 5129 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782744 5129 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782756 5129 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782772 5129 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782785 5129 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782797 5129 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782809 5129 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782822 5129 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782834 5129 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782846 5129 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782858 5129 flags.go:64] FLAG: --cgroup-root=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782869 5129 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782881 5129 flags.go:64] FLAG: --client-ca-file=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782892 5129 flags.go:64] FLAG: --cloud-config=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782903 5129 flags.go:64] FLAG: --cloud-provider=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782915 5129 flags.go:64] FLAG: --cluster-dns="[]"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782932 5129 flags.go:64] FLAG: --cluster-domain=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782944 5129 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782956 5129 flags.go:64] FLAG: --config-dir=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782967 5129 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782979 5129 flags.go:64] FLAG: --container-log-max-files="5"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.782993 5129 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783005 5129 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783017 5129 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783055 5129 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783070 5129 flags.go:64] FLAG: --contention-profiling="false"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783081 5129 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783094 5129 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783113 5129 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783124 5129 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783139 5129 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783150 5129 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783161 5129 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783173 5129 flags.go:64] FLAG: --enable-load-reader="false"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783184 5129 flags.go:64] FLAG: --enable-server="true"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783196 5129 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783209 5129 flags.go:64] FLAG: --event-burst="100"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783221 5129 flags.go:64] FLAG: --event-qps="50"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783233 5129 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783245 5129 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783256 5129 flags.go:64] FLAG: --eviction-hard=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783271 5129 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783282 5129 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783294 5129 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783306 5129 flags.go:64] FLAG: --eviction-soft=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783317 5129 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783329 5129 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783340 5129 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783352 5129 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783363 5129 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783375 5129 flags.go:64] FLAG: --fail-swap-on="true"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783386 5129 flags.go:64] FLAG: --feature-gates=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783400 5129 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783411 5129 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783423 5129 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783435 5129 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783447 5129 flags.go:64] FLAG: --healthz-port="10248"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783458 5129 flags.go:64] FLAG: --help="false"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783470 5129 flags.go:64] FLAG: --hostname-override=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783481 5129 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783500 5129 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783515 5129 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783526 5129 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783536 5129 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783548 5129 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783559 5129 flags.go:64] FLAG: --image-service-endpoint=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783570 5129 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783581 5129 flags.go:64] FLAG: --kube-api-burst="100"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783593 5129 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783640 5129 flags.go:64] FLAG: --kube-api-qps="50"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783654 5129 flags.go:64] FLAG: --kube-reserved=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783665 5129 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783677 5129 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783689 5129 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783700 5129 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783712 5129 flags.go:64] FLAG: --lock-file=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783723 5129 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783734 5129 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783746 5129 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783765 5129 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783776 5129 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783787 5129 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783798 5129 flags.go:64] FLAG: --logging-format="text"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783809 5129 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783822 5129 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783834 5129 flags.go:64] FLAG: --manifest-url=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783845 5129 flags.go:64] FLAG: --manifest-url-header=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783860 5129 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783872 5129 flags.go:64] FLAG: --max-open-files="1000000"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783885 5129 flags.go:64] FLAG: --max-pods="110"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783897 5129 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783909 5129 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783921 5129 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783932 5129 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783944 5129 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783957 5129 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783969 5129 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.783997 5129 flags.go:64] FLAG: --node-status-max-images="50"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784009 5129 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784021 5129 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784032 5129 flags.go:64] FLAG: --pod-cidr=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784044 5129 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784061 5129 flags.go:64] FLAG: --pod-manifest-path=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784073 5129 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784085 5129 flags.go:64] FLAG: --pods-per-core="0"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784097 5129 flags.go:64] FLAG: --port="10250"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784109 5129 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784120 5129 flags.go:64] FLAG: --provider-id=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784131 5129 flags.go:64] FLAG: --qos-reserved=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784143 5129 flags.go:64] FLAG: --read-only-port="10255"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784155 5129 flags.go:64] FLAG: --register-node="true"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784167 5129 flags.go:64] FLAG: --register-schedulable="true"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784178 5129 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784198 5129 flags.go:64] FLAG: --registry-burst="10"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784209 5129 flags.go:64] FLAG: --registry-qps="5"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784221 5129 flags.go:64] FLAG: --reserved-cpus=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784232 5129 flags.go:64] FLAG: --reserved-memory=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784247 5129 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784259 5129 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784271 5129 flags.go:64] FLAG: --rotate-certificates="false"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784282 5129 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784294 5129 flags.go:64] FLAG: --runonce="false"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784305 5129 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784317 5129 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784329 5129 flags.go:64] FLAG: --seccomp-default="false"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784341 5129 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784353 5129 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784365 5129 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784377 5129 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784389 5129 flags.go:64] FLAG: --storage-driver-password="root"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784400 5129 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784412 5129 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784423 5129 flags.go:64] FLAG: --storage-driver-user="root"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784437 5129 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784449 5129 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784461 5129 flags.go:64] FLAG: --system-cgroups=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784473 5129 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784492 5129 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784503 5129 flags.go:64] FLAG: --tls-cert-file=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784514 5129 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784529 5129 flags.go:64] FLAG: --tls-min-version=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784541 5129 flags.go:64] FLAG: --tls-private-key-file=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784553 5129 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784564 5129 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784576 5129 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784587 5129 flags.go:64] FLAG: --v="2"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784636 5129 flags.go:64] FLAG: --version="false"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784653 5129 flags.go:64] FLAG: --vmodule=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784666 5129 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.784679 5129 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.784981 5129 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785000 5129 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785013 5129 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785025 5129 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785039 5129 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785052 5129 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785065 5129 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785079 5129 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785090 5129 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785100 5129 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785109 5129 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785123 5129 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785136 5129 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785147 5129 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785158 5129 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785169 5129 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785180 5129 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785191 5129 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785201 5129 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785215 5129 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785226 5129 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785237 5129 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785247 5129 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785259 5129 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785269 5129 feature_gate.go:330] unrecognized feature gate: Example
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785278 5129 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785288 5129 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785298 5129 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785308 5129 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785318 5129 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785328 5129 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785338 5129 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785348 5129 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785358 5129 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785368 5129 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785378 5129 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785387 5129 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785398 5129 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785408 5129 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785419 5129 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785429 5129 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785439 5129 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785450 5129 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785462 5129 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785473 5129 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785482 5129 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785492 5129 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785505 5129 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785519 5129 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785530 5129 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785541 5129 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785551 5129 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785561 5129 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785574 5129 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785585 5129 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785598 5129 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785657 5129 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785668 5129 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785681 5129 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785693 5129 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785704 5129 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785715 5129 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785726 5129 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785736 5129 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785746 5129 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785757 5129 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785770 5129 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785781 5129 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785791 5129 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785807 5129 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.785818 5129 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.785834 5129 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.798683 5129 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.798712 5129 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.798823 5129 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.798836 5129 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.798850 5129 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.798863 5129 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.798872 5129 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.798881 5129 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.798889 5129 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.798898 5129 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.798906 5129 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.798917 5129 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.798927 5129 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.798935 5129 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.798943 5129 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.798951 5129 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.798959 5129 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.798967 5129 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.798975 5129 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799017 5129 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799025 5129 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799034 5129 feature_gate.go:330] unrecognized feature gate: Example Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799041 5129 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799051 5129 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799058 5129 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799066 5129 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799074 5129 feature_gate.go:330] 
unrecognized feature gate: HardwareSpeed Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799083 5129 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799091 5129 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799099 5129 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799107 5129 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799115 5129 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799124 5129 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799132 5129 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799140 5129 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799147 5129 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799155 5129 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799163 5129 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799170 5129 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799178 5129 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799186 5129 feature_gate.go:330] unrecognized feature gate: 
AutomatedEtcdBackup Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799193 5129 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799201 5129 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799208 5129 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799216 5129 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799224 5129 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799232 5129 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799240 5129 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799248 5129 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799258 5129 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799268 5129 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799276 5129 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799284 5129 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799292 5129 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799300 5129 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799307 5129 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799315 5129 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799323 5129 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799331 5129 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799338 5129 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799346 5129 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799354 5129 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799362 5129 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799370 5129 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799379 5129 feature_gate.go:330] unrecognized feature gate: 
GatewayAPI Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799387 5129 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799395 5129 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799403 5129 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799411 5129 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799419 5129 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799427 5129 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799434 5129 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799442 5129 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.799454 5129 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799821 5129 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799839 5129 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799849 5129 
feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799858 5129 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799866 5129 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799875 5129 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799884 5129 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799892 5129 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799902 5129 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799912 5129 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799924 5129 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799932 5129 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799940 5129 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799949 5129 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799958 5129 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799966 5129 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799974 5129 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799981 5129 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799991 5129 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.799999 5129 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800007 5129 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800015 5129 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800022 5129 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800030 5129 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 14 06:58:57 crc kubenswrapper[5129]: 
W0314 06:58:57.800038 5129 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800047 5129 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800054 5129 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800062 5129 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800070 5129 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800078 5129 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800086 5129 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800093 5129 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800101 5129 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800108 5129 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800116 5129 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800124 5129 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800132 5129 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800140 5129 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800147 5129 feature_gate.go:330] unrecognized feature gate: 
SigstoreImageVerification Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800155 5129 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800163 5129 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800171 5129 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800179 5129 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800187 5129 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800194 5129 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800203 5129 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800211 5129 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800220 5129 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800229 5129 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800238 5129 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800248 5129 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800259 5129 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800270 5129 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800280 5129 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800289 5129 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800296 5129 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800305 5129 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800315 5129 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800325 5129 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800333 5129 feature_gate.go:330] unrecognized feature gate: Example Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800341 5129 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800350 5129 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800358 5129 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800366 5129 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800374 5129 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800382 5129 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 14 06:58:57 crc 
kubenswrapper[5129]: W0314 06:58:57.800390 5129 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800397 5129 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800405 5129 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800413 5129 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.800420 5129 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.800431 5129 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.800674 5129 server.go:940] "Client rotation is on, will bootstrap in background" Mar 14 06:58:57 crc kubenswrapper[5129]: E0314 06:58:57.813112 5129 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.818344 5129 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.818513 5129 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.820638 5129 server.go:997] "Starting client certificate rotation" Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.820683 5129 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.820919 5129 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.845301 5129 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 14 06:58:57 crc kubenswrapper[5129]: E0314 06:58:57.848639 5129 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.849165 5129 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.866767 5129 log.go:25] "Validated CRI v1 runtime API" Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.908488 5129 log.go:25] "Validated CRI v1 image API" Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.910876 5129 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.918149 5129 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-14-06-48-30-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.918293 5129 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:45 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.953819 5129 manager.go:217] Machine: {Timestamp:2026-03-14 06:58:57.949126142 +0000 UTC m=+0.701041416 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:f9bf91bc-f395-4d83-a8e9-849213d9a3dc BootID:56e5df46-9227-44ac-8b48-da482731e804 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:45 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:6f:42:a8 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:6f:42:a8 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:b7:fa:dc Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:3a:30:7d Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:ee:cc:4e Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:37:78:a0 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:3d:df:8f Speed:-1 Mtu:1496} {Name:ens7.44 MacAddress:52:54:00:af:b0:60 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:46:38:59:64:ff:43 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fe:da:15:94:c3:f3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 
BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.954137 5129 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.954371 5129 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.957586 5129 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.957861 5129 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.957899 5129 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.959649 5129 topology_manager.go:138] "Creating topology manager with none policy"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.959677 5129 container_manager_linux.go:303] "Creating device plugin manager"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.960324 5129 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.960355 5129 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.960557 5129 state_mem.go:36] "Initialized new in-memory state store"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.960674 5129 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.965073 5129 kubelet.go:418] "Attempting to sync node with API server"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.965100 5129 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.965119 5129 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.965137 5129 kubelet.go:324] "Adding apiserver pod source"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.965150 5129 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.970039 5129 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.971398 5129 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.972746 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused
Mar 14 06:58:57 crc kubenswrapper[5129]: E0314 06:58:57.972838 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError"
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.972827 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused
Mar 14 06:58:57 crc kubenswrapper[5129]: E0314 06:58:57.972927 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.974312 5129 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.976075 5129 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.976118 5129 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.976134 5129 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.976147 5129 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.976168 5129 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.976181 5129 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.976194 5129 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.976216 5129 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.976231 5129 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.976245 5129 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.976288 5129 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.976301 5129 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.977033 5129 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.977674 5129 server.go:1280] "Started kubelet"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.978964 5129 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.978986 5129 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.979276 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused
Mar 14 06:58:57 crc systemd[1]: Started Kubernetes Kubelet.
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.980416 5129 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.980858 5129 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.980913 5129 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.981448 5129 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.981520 5129 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 14 06:58:57 crc kubenswrapper[5129]: E0314 06:58:57.982069 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.982111 5129 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 14 06:58:57 crc kubenswrapper[5129]: W0314 06:58:57.982472 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused
Mar 14 06:58:57 crc kubenswrapper[5129]: E0314 06:58:57.982524 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.983148 5129 server.go:460] "Adding debug handlers to kubelet server"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.983404 5129 factory.go:55] Registering systemd factory
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.986629 5129 factory.go:221] Registration of the systemd container factory successfully
Mar 14 06:58:57 crc kubenswrapper[5129]: E0314 06:58:57.986768 5129 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="200ms"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.987047 5129 factory.go:153] Registering CRI-O factory
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.987064 5129 factory.go:221] Registration of the crio container factory successfully
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.987118 5129 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.987141 5129 factory.go:103] Registering Raw factory
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.987153 5129 manager.go:1196] Started watching for new ooms in manager
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.987561 5129 manager.go:319] Starting recovery of all containers
Mar 14 06:58:57 crc kubenswrapper[5129]: E0314 06:58:57.992230 5129 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.113:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ca2f7b06c3632 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:57.977636402 +0000 UTC m=+0.729551616,LastTimestamp:2026-03-14 06:58:57.977636402 +0000 UTC m=+0.729551616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997433 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997496 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997506 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997516 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997527 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997537 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997548 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997557 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997567 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997577 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997586 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997595 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997623 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997633 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997673 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997693 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997702 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997712 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997723 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997735 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997747 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997756 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997766 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997775 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997784 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997796 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997809 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997818 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997827 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997836 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997845 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997856 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997865 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997874 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997883 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997893 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997905 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997915 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997924 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997935 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997944 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997954 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997964 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997973 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997984 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 14 06:58:57 crc kubenswrapper[5129]: I0314 06:58:57.997993 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998003 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998012 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998022 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998032 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998041 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998051 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998092 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998102 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998113 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998122 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998131 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998140 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998149 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998157 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998166 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998174 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998202 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998230 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998243 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998253 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998288 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998298 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998311 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998319 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998331 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998339 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998347 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998356 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998365 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998401 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998411 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998420 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998431 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998440 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998449 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998458 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998467 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998476 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93"
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998485 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998495 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998505 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998514 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998523 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998556 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998568 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998577 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998587 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998611 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998621 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998630 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998640 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998649 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998658 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998668 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998680 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998689 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998700 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998709 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998725 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998741 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998752 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998762 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998772 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998782 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998791 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998800 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998809 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998818 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998827 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998835 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998844 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998852 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998860 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:57.998869 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001597 5129 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001670 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001694 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001714 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001731 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001749 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001768 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001786 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001803 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001820 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001837 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001854 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001872 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001890 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001907 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001925 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001943 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001962 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001981 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.001999 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002016 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002058 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002077 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002096 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002113 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002131 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002148 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002166 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002188 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002205 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 14 
06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002222 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002239 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002259 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002278 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002296 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002313 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002331 5129 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002350 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002409 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002428 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002445 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002464 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002483 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002500 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002523 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002547 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002589 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002634 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002652 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002672 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002690 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002707 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002724 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002743 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002762 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" 
Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002780 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002797 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002815 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002834 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002854 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002877 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002895 
5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002916 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002935 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002952 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002970 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.002987 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003006 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003023 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003041 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003061 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003079 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003096 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003112 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003130 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003148 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003165 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003182 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003200 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003216 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003234 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003254 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003276 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003294 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003311 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003330 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" 
seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003349 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003365 5129 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003384 5129 reconstruct.go:97] "Volume reconstruction finished" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.003398 5129 reconciler.go:26] "Reconciler: start to sync state" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.019991 5129 manager.go:324] Recovery completed Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.032384 5129 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.035017 5129 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.035076 5129 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.035108 5129 kubelet.go:2335] "Starting kubelet main sync loop" Mar 14 06:58:58 crc kubenswrapper[5129]: E0314 06:58:58.035254 5129 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 14 06:58:58 crc kubenswrapper[5129]: W0314 06:58:58.037222 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Mar 14 06:58:58 crc kubenswrapper[5129]: E0314 06:58:58.037313 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.037716 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.039201 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.039245 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.039257 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.040027 5129 
cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.040053 5129 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.040078 5129 state_mem.go:36] "Initialized new in-memory state store" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.065331 5129 policy_none.go:49] "None policy: Start" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.066672 5129 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.066702 5129 state_mem.go:35] "Initializing new in-memory state store" Mar 14 06:58:58 crc kubenswrapper[5129]: E0314 06:58:58.083164 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.123820 5129 manager.go:334] "Starting Device Plugin manager" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.123921 5129 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.123939 5129 server.go:79] "Starting device plugin registration server" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.124467 5129 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.124492 5129 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.124653 5129 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.125814 5129 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.125841 5129 plugin_manager.go:118] "Starting 
Kubelet Plugin Manager" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.135867 5129 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.135966 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.137148 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.137197 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.137232 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.137474 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:58 crc kubenswrapper[5129]: E0314 06:58:58.137688 5129 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.137842 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.137899 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.138901 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.138930 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.138942 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.138988 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.139030 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.139042 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.139157 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.139184 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.139214 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.140446 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.140491 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.140498 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.140521 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.140577 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.140588 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.140951 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.141161 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.141225 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.141732 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.141758 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.141771 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.141980 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.142137 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.142192 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.142801 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.142829 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.142842 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.143518 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.143586 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.143623 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.143550 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.143721 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.143760 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.144110 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.144175 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.145632 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.145664 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.145677 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:58 crc kubenswrapper[5129]: E0314 06:58:58.188303 5129 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="400ms" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.205512 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.205659 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.205697 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.205724 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.205749 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.205816 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.205845 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.205867 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.205888 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.205909 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.205932 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.205952 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.205970 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.205988 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.206009 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.224658 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.226202 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.226251 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.226265 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.226360 5129 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:58:58 crc kubenswrapper[5129]: E0314 06:58:58.226922 5129 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.113:6443: connect: connection refused" node="crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307506 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307569 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307594 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307659 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307692 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307715 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307734 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307756 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307777 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307797 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307793 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 
crc kubenswrapper[5129]: I0314 06:58:58.307814 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307857 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307879 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307890 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307817 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307906 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307915 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307962 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307982 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.307992 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.308007 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.308049 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.308097 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.308124 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.308182 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.308201 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.308219 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.308291 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.308420 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.427234 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.429053 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.429147 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.429246 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.429288 5129 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:58:58 crc kubenswrapper[5129]: E0314 06:58:58.430379 5129 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.113:6443: connect: connection refused" node="crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.466645 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.492161 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: W0314 06:58:58.512683 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-4a4bfaacd33160dab1b555ea5bb18acb9dc82b171f8e0d95de036d4b1e81e4e8 WatchSource:0}: Error finding container 4a4bfaacd33160dab1b555ea5bb18acb9dc82b171f8e0d95de036d4b1e81e4e8: Status 404 returned error can't find the container with id 4a4bfaacd33160dab1b555ea5bb18acb9dc82b171f8e0d95de036d4b1e81e4e8 Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.512832 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.522115 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.526953 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 06:58:58 crc kubenswrapper[5129]: W0314 06:58:58.527938 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-1684f0d04a59cd826acb893b8c904521ce1afa17d7b45d32f0dd38242d58af5a WatchSource:0}: Error finding container 1684f0d04a59cd826acb893b8c904521ce1afa17d7b45d32f0dd38242d58af5a: Status 404 returned error can't find the container with id 1684f0d04a59cd826acb893b8c904521ce1afa17d7b45d32f0dd38242d58af5a Mar 14 06:58:58 crc kubenswrapper[5129]: W0314 06:58:58.530956 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-4bc9e3de0dd1fc280b76cf417c1fbe38d307dc63b028e87dbae4214819eb2a0e WatchSource:0}: Error finding container 4bc9e3de0dd1fc280b76cf417c1fbe38d307dc63b028e87dbae4214819eb2a0e: Status 404 returned error can't find the container with id 4bc9e3de0dd1fc280b76cf417c1fbe38d307dc63b028e87dbae4214819eb2a0e Mar 14 06:58:58 crc kubenswrapper[5129]: W0314 06:58:58.549188 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-689e85f1dcf87b54a26228f208ec5c722ebaff4cc8e50c49e91b84d008080eeb WatchSource:0}: Error finding container 689e85f1dcf87b54a26228f208ec5c722ebaff4cc8e50c49e91b84d008080eeb: Status 404 returned error can't find the container with id 689e85f1dcf87b54a26228f208ec5c722ebaff4cc8e50c49e91b84d008080eeb Mar 14 06:58:58 crc kubenswrapper[5129]: E0314 06:58:58.589743 5129 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection 
refused" interval="800ms" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.830947 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.833894 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.833940 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.833953 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.833983 5129 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:58:58 crc kubenswrapper[5129]: E0314 06:58:58.834541 5129 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.113:6443: connect: connection refused" node="crc" Mar 14 06:58:58 crc kubenswrapper[5129]: W0314 06:58:58.845296 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Mar 14 06:58:58 crc kubenswrapper[5129]: E0314 06:58:58.845414 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:58 crc kubenswrapper[5129]: W0314 06:58:58.973271 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Mar 14 06:58:58 crc kubenswrapper[5129]: E0314 06:58:58.973485 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:58 crc kubenswrapper[5129]: I0314 06:58:58.980583 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Mar 14 06:58:59 crc kubenswrapper[5129]: I0314 06:58:59.042178 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1684f0d04a59cd826acb893b8c904521ce1afa17d7b45d32f0dd38242d58af5a"} Mar 14 06:58:59 crc kubenswrapper[5129]: I0314 06:58:59.043913 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4a4bfaacd33160dab1b555ea5bb18acb9dc82b171f8e0d95de036d4b1e81e4e8"} Mar 14 06:58:59 crc kubenswrapper[5129]: I0314 06:58:59.045676 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e0f526525e5076ee70cb47d0fca7970cd8fa3454c9d17636e073eb9983e537ee"} Mar 14 06:58:59 crc kubenswrapper[5129]: I0314 06:58:59.047038 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"689e85f1dcf87b54a26228f208ec5c722ebaff4cc8e50c49e91b84d008080eeb"} Mar 14 06:58:59 crc kubenswrapper[5129]: I0314 06:58:59.048901 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4bc9e3de0dd1fc280b76cf417c1fbe38d307dc63b028e87dbae4214819eb2a0e"} Mar 14 06:58:59 crc kubenswrapper[5129]: W0314 06:58:59.210640 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Mar 14 06:58:59 crc kubenswrapper[5129]: E0314 06:58:59.210725 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:58:59 crc kubenswrapper[5129]: W0314 06:58:59.353706 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Mar 14 06:58:59 crc kubenswrapper[5129]: E0314 06:58:59.353783 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" 
logger="UnhandledError" Mar 14 06:58:59 crc kubenswrapper[5129]: E0314 06:58:59.391161 5129 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="1.6s" Mar 14 06:58:59 crc kubenswrapper[5129]: I0314 06:58:59.634695 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:58:59 crc kubenswrapper[5129]: I0314 06:58:59.637446 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:58:59 crc kubenswrapper[5129]: I0314 06:58:59.637523 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:58:59 crc kubenswrapper[5129]: I0314 06:58:59.637549 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:58:59 crc kubenswrapper[5129]: I0314 06:58:59.637590 5129 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:58:59 crc kubenswrapper[5129]: E0314 06:58:59.638298 5129 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.113:6443: connect: connection refused" node="crc" Mar 14 06:58:59 crc kubenswrapper[5129]: I0314 06:58:59.980545 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.002944 5129 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 06:59:00 crc kubenswrapper[5129]: E0314 06:59:00.004106 5129 
certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.054225 5129 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401" exitCode=0 Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.054295 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401"} Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.054373 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.056061 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.056107 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.056125 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.057735 5129 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e" exitCode=0 Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.057812 5129 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e"} Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.057880 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.059183 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.059414 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.059460 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.059473 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.060293 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.060329 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.060346 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.060503 5129 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77" exitCode=0 Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.060529 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77"} Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.060736 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.065475 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.065529 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.065546 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.069484 5129 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="606d6d86bf5050324c43cf977dae10a46e1e8f28346febf07a950544ee4f2ef9" exitCode=0 Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.069619 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.069565 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"606d6d86bf5050324c43cf977dae10a46e1e8f28346febf07a950544ee4f2ef9"} Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.070826 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.070853 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.070865 5129 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.076356 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1cf7436ce9438c258552ab9d2cdb6fc34a3f0369aa4151691f33d5e9997208bd"} Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.076402 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7235ddb1c63b5f61521cf2e92d0070dc5c8d6f9899565129aa993711c1715caf"} Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.076421 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d84344093736ed5f4571f4a04217738adc1db9b27b3a238149dfcd03aa6725b2"} Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.076437 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a6c2f9f77fd61eea301a8bb0388b078488c80453835df5e3ca468d99fe97193c"} Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.076456 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.077948 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.078008 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 
06:59:00.078035 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:00 crc kubenswrapper[5129]: I0314 06:59:00.979966 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused
Mar 14 06:59:00 crc kubenswrapper[5129]: E0314 06:59:00.992692 5129 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="3.2s"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.079636 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783"}
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.079746 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.080650 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.080682 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.080694 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.082684 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"579eedbe1341c93457c7fbd71eda29dcd1172371231300b16d9319491196cc0e"}
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.082710 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0f5e09d308061549f8c3183d7885644cd785cc8769880e0643f2bdc30e8a361c"}
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.082720 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c928fbb323aa809b60e546f9ab99a82ad8ea28ae853a828483412b8a3175f965"}
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.082737 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.083470 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.083509 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.083517 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.085367 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115"}
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.085396 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1"}
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.085412 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780"}
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.085423 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0"}
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.089465 5129 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034" exitCode=0
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.089560 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.089553 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034"}
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.089695 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.090794 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.090845 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.090863 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.092802 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.092836 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.092847 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.199908 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.239007 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.239959 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.239991 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.240000 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.240020 5129 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 14 06:59:01 crc kubenswrapper[5129]: E0314 06:59:01.240385 5129 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.113:6443: connect: connection refused" node="crc"
Mar 14 06:59:01 crc kubenswrapper[5129]: W0314 06:59:01.270575 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused
Mar 14 06:59:01 crc kubenswrapper[5129]: E0314 06:59:01.270695 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.401126 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 14 06:59:01 crc kubenswrapper[5129]: I0314 06:59:01.417546 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.095542 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.095536 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b05c5fc68b5433415bc0b470cd77e8527086c069717e147b31d2658c420baf69"}
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.096259 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.096289 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.096299 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.101587 5129 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01" exitCode=0
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.101673 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.101754 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.101765 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01"}
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.101805 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.103008 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.103305 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.103340 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.103351 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.104062 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.104086 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.104098 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.104555 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.104586 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.104644 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.105456 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.105497 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.105513 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:02 crc kubenswrapper[5129]: I0314 06:59:02.677440 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.108965 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b6d124c1940bbbc2625b57188d7eaafc0fa13d4366961c1870f336f9779d968f"}
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.109062 5129 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.109000 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.109125 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.109152 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.109064 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"68d22d9ee2e77a2eda2714189e26eeafd53bd1c9e78e7a365627c323ff58de99"}
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.109305 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.109338 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"08240fa12d4a763177663f53078814fc63dfd7b6380b488118f464eb3cef8256"}
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.109375 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e517883a289ad10a8078720d229aab5e813291e72e307d8e6e837c05da74ff93"}
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.111570 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.111640 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.111659 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.112914 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.112968 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.112987 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.115006 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.115070 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.115107 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:03 crc kubenswrapper[5129]: I0314 06:59:03.234783 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 14 06:59:04 crc kubenswrapper[5129]: I0314 06:59:04.115797 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"99b1dcae08e7a810e4cd9309ac9aa20cf694a33bac6ad060d64909c27704c7ab"}
Mar 14 06:59:04 crc kubenswrapper[5129]: I0314 06:59:04.115864 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:04 crc kubenswrapper[5129]: I0314 06:59:04.115891 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:04 crc kubenswrapper[5129]: I0314 06:59:04.116923 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:04 crc kubenswrapper[5129]: I0314 06:59:04.116963 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:04 crc kubenswrapper[5129]: I0314 06:59:04.116980 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:04 crc kubenswrapper[5129]: I0314 06:59:04.117047 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:04 crc kubenswrapper[5129]: I0314 06:59:04.117076 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:04 crc kubenswrapper[5129]: I0314 06:59:04.117091 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:04 crc kubenswrapper[5129]: I0314 06:59:04.335646 5129 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 14 06:59:04 crc kubenswrapper[5129]: I0314 06:59:04.441472 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:04 crc kubenswrapper[5129]: I0314 06:59:04.443031 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:04 crc kubenswrapper[5129]: I0314 06:59:04.443104 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:04 crc kubenswrapper[5129]: I0314 06:59:04.443131 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:04 crc kubenswrapper[5129]: I0314 06:59:04.443180 5129 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 14 06:59:04 crc kubenswrapper[5129]: I0314 06:59:04.514691 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 14 06:59:05 crc kubenswrapper[5129]: I0314 06:59:05.118624 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:05 crc kubenswrapper[5129]: I0314 06:59:05.118641 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:05 crc kubenswrapper[5129]: I0314 06:59:05.119784 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:05 crc kubenswrapper[5129]: I0314 06:59:05.119817 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:05 crc kubenswrapper[5129]: I0314 06:59:05.119826 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:05 crc kubenswrapper[5129]: I0314 06:59:05.119939 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:05 crc kubenswrapper[5129]: I0314 06:59:05.119968 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:05 crc kubenswrapper[5129]: I0314 06:59:05.119980 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:05 crc kubenswrapper[5129]: I0314 06:59:05.127484 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 14 06:59:05 crc kubenswrapper[5129]: I0314 06:59:05.127653 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:05 crc kubenswrapper[5129]: I0314 06:59:05.128486 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:05 crc kubenswrapper[5129]: I0314 06:59:05.128507 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:05 crc kubenswrapper[5129]: I0314 06:59:05.128516 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:06 crc kubenswrapper[5129]: I0314 06:59:06.006695 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 14 06:59:06 crc kubenswrapper[5129]: I0314 06:59:06.120837 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:06 crc kubenswrapper[5129]: I0314 06:59:06.120884 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:06 crc kubenswrapper[5129]: I0314 06:59:06.121954 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:06 crc kubenswrapper[5129]: I0314 06:59:06.121995 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:06 crc kubenswrapper[5129]: I0314 06:59:06.122002 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:06 crc kubenswrapper[5129]: I0314 06:59:06.122032 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:06 crc kubenswrapper[5129]: I0314 06:59:06.122042 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:06 crc kubenswrapper[5129]: I0314 06:59:06.122008 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:06 crc kubenswrapper[5129]: I0314 06:59:06.352832 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 14 06:59:07 crc kubenswrapper[5129]: I0314 06:59:07.123325 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:07 crc kubenswrapper[5129]: I0314 06:59:07.124856 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:07 crc kubenswrapper[5129]: I0314 06:59:07.124933 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:07 crc kubenswrapper[5129]: I0314 06:59:07.124958 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:07 crc kubenswrapper[5129]: I0314 06:59:07.342197 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 14 06:59:07 crc kubenswrapper[5129]: I0314 06:59:07.342434 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:07 crc kubenswrapper[5129]: I0314 06:59:07.343957 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:07 crc kubenswrapper[5129]: I0314 06:59:07.344003 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:07 crc kubenswrapper[5129]: I0314 06:59:07.344019 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:08 crc kubenswrapper[5129]: E0314 06:59:08.138657 5129 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 14 06:59:10 crc kubenswrapper[5129]: I0314 06:59:10.342358 5129 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 06:59:10 crc kubenswrapper[5129]: I0314 06:59:10.342441 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 06:59:11 crc kubenswrapper[5129]: W0314 06:59:11.903805 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 14 06:59:11 crc kubenswrapper[5129]: I0314 06:59:11.904019 5129 trace.go:236] Trace[401736760]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (14-Mar-2026 06:59:01.902) (total time: 10001ms):
Mar 14 06:59:11 crc kubenswrapper[5129]: Trace[401736760]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:59:11.903)
Mar 14 06:59:11 crc kubenswrapper[5129]: Trace[401736760]: [10.001256031s] [10.001256031s] END
Mar 14 06:59:11 crc kubenswrapper[5129]: E0314 06:59:11.904052 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 14 06:59:11 crc kubenswrapper[5129]: I0314 06:59:11.913827 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:11Z is after 2026-02-23T05:33:13Z
Mar 14 06:59:11 crc kubenswrapper[5129]: W0314 06:59:11.919643 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:11Z is after 2026-02-23T05:33:13Z
Mar 14 06:59:11 crc kubenswrapper[5129]: E0314 06:59:11.919734 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 14 06:59:11 crc kubenswrapper[5129]: E0314 06:59:11.923034 5129 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:11Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 14 06:59:11 crc kubenswrapper[5129]: E0314 06:59:11.924375 5129 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:11Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ca2f7b06c3632 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:57.977636402 +0000 UTC m=+0.729551616,LastTimestamp:2026-03-14 06:58:57.977636402 +0000 UTC m=+0.729551616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 06:59:11 crc kubenswrapper[5129]: W0314 06:59:11.926146 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:11Z is after 2026-02-23T05:33:13Z
Mar 14 06:59:11 crc kubenswrapper[5129]: E0314 06:59:11.926249 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 14 06:59:11 crc kubenswrapper[5129]: E0314 06:59:11.931642 5129 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:11Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 14 06:59:11 crc kubenswrapper[5129]: E0314 06:59:11.932793 5129 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 14 06:59:11 crc kubenswrapper[5129]: W0314 06:59:11.935302 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:11Z is after 2026-02-23T05:33:13Z
Mar 14 06:59:11 crc kubenswrapper[5129]: E0314 06:59:11.935388 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 14 06:59:11 crc kubenswrapper[5129]: I0314 06:59:11.938569 5129 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:49328->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Mar 14 06:59:11 crc kubenswrapper[5129]: I0314 06:59:11.938676 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:49328->192.168.126.11:17697: read: connection reset by peer"
Mar 14 06:59:11 crc kubenswrapper[5129]: I0314 06:59:11.938693 5129 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 14 06:59:11 crc kubenswrapper[5129]: I0314 06:59:11.938751 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 14 06:59:11 crc kubenswrapper[5129]: I0314 06:59:11.969803 5129 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 14 06:59:11 crc kubenswrapper[5129]: I0314 06:59:11.969877 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 14 06:59:12 crc kubenswrapper[5129]: I0314 06:59:12.105300 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:12Z is after 2026-02-23T05:33:13Z
Mar 14 06:59:12 crc kubenswrapper[5129]: I0314 06:59:12.684435 5129 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]log ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]etcd ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/openshift.io-api-request-count-filter ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/openshift.io-startkubeinformers ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/priority-and-fairness-config-consumer ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/priority-and-fairness-filter ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/start-apiextensions-informers ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/start-apiextensions-controllers ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/crd-informer-synced ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/start-system-namespaces-controller ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/start-cluster-authentication-info-controller ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/start-legacy-token-tracking-controller ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/start-service-ip-repair-controllers ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Mar 14 06:59:12 crc kubenswrapper[5129]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/priority-and-fairness-config-producer ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/bootstrap-controller ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/start-kube-aggregator-informers ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/apiservice-status-local-available-controller ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/apiservice-status-remote-available-controller ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/apiservice-registration-controller ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/apiservice-wait-for-first-sync ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/apiservice-discovery-controller ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/kube-apiserver-autoregistration ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]autoregister-completion ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/apiservice-openapi-controller ok
Mar 14 06:59:12 crc kubenswrapper[5129]: [+]poststarthook/apiservice-openapiv3-controller ok
Mar 14 06:59:12 crc kubenswrapper[5129]: livez check failed
Mar 14 06:59:12 crc kubenswrapper[5129]: I0314 06:59:12.684487 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 14 06:59:12 crc kubenswrapper[5129]: I0314 06:59:12.988837 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:12Z is after 2026-02-23T05:33:13Z
Mar 14 06:59:13 crc kubenswrapper[5129]: I0314 06:59:13.141945 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 14 06:59:13 crc kubenswrapper[5129]: I0314 06:59:13.144161 5129 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b05c5fc68b5433415bc0b470cd77e8527086c069717e147b31d2658c420baf69" exitCode=255
Mar 14 06:59:13 crc kubenswrapper[5129]: I0314 06:59:13.144201 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b05c5fc68b5433415bc0b470cd77e8527086c069717e147b31d2658c420baf69"}
Mar 14 06:59:13 crc kubenswrapper[5129]: I0314 06:59:13.144341 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:13 crc kubenswrapper[5129]: I0314 06:59:13.145348 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:13 crc kubenswrapper[5129]: I0314 06:59:13.145419 5129 kubelet_node_status.go:724] "Recording event message for node"
node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:13 crc kubenswrapper[5129]: I0314 06:59:13.145440 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:13 crc kubenswrapper[5129]: I0314 06:59:13.146339 5129 scope.go:117] "RemoveContainer" containerID="b05c5fc68b5433415bc0b470cd77e8527086c069717e147b31d2658c420baf69" Mar 14 06:59:13 crc kubenswrapper[5129]: I0314 06:59:13.982634 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:13Z is after 2026-02-23T05:33:13Z Mar 14 06:59:14 crc kubenswrapper[5129]: I0314 06:59:14.148852 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 14 06:59:14 crc kubenswrapper[5129]: I0314 06:59:14.151509 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7d848faf746df38e69f47f831785ad2f29d0d86901741c1da56540735baab062"} Mar 14 06:59:14 crc kubenswrapper[5129]: I0314 06:59:14.151755 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:14 crc kubenswrapper[5129]: I0314 06:59:14.152947 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:14 crc kubenswrapper[5129]: I0314 06:59:14.152980 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:14 crc kubenswrapper[5129]: I0314 06:59:14.152990 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 14 06:59:14 crc kubenswrapper[5129]: I0314 06:59:14.598262 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 14 06:59:14 crc kubenswrapper[5129]: I0314 06:59:14.598497 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:14 crc kubenswrapper[5129]: I0314 06:59:14.600170 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:14 crc kubenswrapper[5129]: I0314 06:59:14.600216 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:14 crc kubenswrapper[5129]: I0314 06:59:14.600232 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:14 crc kubenswrapper[5129]: I0314 06:59:14.616706 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 14 06:59:14 crc kubenswrapper[5129]: I0314 06:59:14.985056 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:14Z is after 2026-02-23T05:33:13Z Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.135471 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.135712 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.137457 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:15 
crc kubenswrapper[5129]: I0314 06:59:15.137525 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.137548 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.157150 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.158134 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.161045 5129 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7d848faf746df38e69f47f831785ad2f29d0d86901741c1da56540735baab062" exitCode=255 Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.161135 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7d848faf746df38e69f47f831785ad2f29d0d86901741c1da56540735baab062"} Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.161199 5129 scope.go:117] "RemoveContainer" containerID="b05c5fc68b5433415bc0b470cd77e8527086c069717e147b31d2658c420baf69" Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.161265 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.161369 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.162553 5129 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.162582 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.162628 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.162853 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.163090 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.163114 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.163516 5129 scope.go:117] "RemoveContainer" containerID="7d848faf746df38e69f47f831785ad2f29d0d86901741c1da56540735baab062" Mar 14 06:59:15 crc kubenswrapper[5129]: E0314 06:59:15.163743 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.390033 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:59:15 crc kubenswrapper[5129]: W0314 06:59:15.661353 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:15Z is after 2026-02-23T05:33:13Z Mar 14 06:59:15 crc kubenswrapper[5129]: E0314 06:59:15.661825 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 06:59:15 crc kubenswrapper[5129]: I0314 06:59:15.985750 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:15Z is after 2026-02-23T05:33:13Z Mar 14 06:59:16 crc kubenswrapper[5129]: I0314 06:59:16.164510 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:16 crc kubenswrapper[5129]: I0314 06:59:16.166092 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:16 crc kubenswrapper[5129]: I0314 06:59:16.166171 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:16 crc kubenswrapper[5129]: I0314 06:59:16.166196 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:16 crc kubenswrapper[5129]: I0314 06:59:16.167323 5129 scope.go:117] "RemoveContainer" 
containerID="7d848faf746df38e69f47f831785ad2f29d0d86901741c1da56540735baab062" Mar 14 06:59:16 crc kubenswrapper[5129]: E0314 06:59:16.167724 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:16 crc kubenswrapper[5129]: W0314 06:59:16.260998 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:16Z is after 2026-02-23T05:33:13Z Mar 14 06:59:16 crc kubenswrapper[5129]: E0314 06:59:16.261104 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 06:59:16 crc kubenswrapper[5129]: W0314 06:59:16.325421 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:16Z is after 2026-02-23T05:33:13Z Mar 14 06:59:16 crc kubenswrapper[5129]: E0314 06:59:16.325520 5129 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 06:59:16 crc kubenswrapper[5129]: I0314 06:59:16.988062 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:16Z is after 2026-02-23T05:33:13Z Mar 14 06:59:17 crc kubenswrapper[5129]: I0314 06:59:17.169671 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 14 06:59:17 crc kubenswrapper[5129]: I0314 06:59:17.682443 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:59:17 crc kubenswrapper[5129]: I0314 06:59:17.682594 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:17 crc kubenswrapper[5129]: I0314 06:59:17.684104 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:17 crc kubenswrapper[5129]: I0314 06:59:17.684144 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:17 crc kubenswrapper[5129]: I0314 06:59:17.684162 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:17 crc kubenswrapper[5129]: I0314 06:59:17.684803 5129 scope.go:117] "RemoveContainer" 
containerID="7d848faf746df38e69f47f831785ad2f29d0d86901741c1da56540735baab062" Mar 14 06:59:17 crc kubenswrapper[5129]: E0314 06:59:17.684998 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:17 crc kubenswrapper[5129]: I0314 06:59:17.688028 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:59:17 crc kubenswrapper[5129]: I0314 06:59:17.984468 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:17Z is after 2026-02-23T05:33:13Z Mar 14 06:59:18 crc kubenswrapper[5129]: E0314 06:59:18.139533 5129 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 06:59:18 crc kubenswrapper[5129]: I0314 06:59:18.175311 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:18 crc kubenswrapper[5129]: I0314 06:59:18.176564 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:18 crc kubenswrapper[5129]: I0314 06:59:18.176645 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:18 crc kubenswrapper[5129]: I0314 06:59:18.176663 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 14 06:59:18 crc kubenswrapper[5129]: I0314 06:59:18.177442 5129 scope.go:117] "RemoveContainer" containerID="7d848faf746df38e69f47f831785ad2f29d0d86901741c1da56540735baab062" Mar 14 06:59:18 crc kubenswrapper[5129]: E0314 06:59:18.177741 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:18 crc kubenswrapper[5129]: I0314 06:59:18.324224 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:18 crc kubenswrapper[5129]: I0314 06:59:18.326059 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:18 crc kubenswrapper[5129]: I0314 06:59:18.326114 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:18 crc kubenswrapper[5129]: I0314 06:59:18.326132 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:18 crc kubenswrapper[5129]: I0314 06:59:18.326163 5129 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:59:18 crc kubenswrapper[5129]: E0314 06:59:18.329080 5129 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:18Z is after 2026-02-23T05:33:13Z" node="crc" Mar 14 06:59:18 crc kubenswrapper[5129]: E0314 06:59:18.335331 5129 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:18Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 14 06:59:19 crc kubenswrapper[5129]: W0314 06:59:19.349075 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:19Z is after 2026-02-23T05:33:13Z Mar 14 06:59:19 crc kubenswrapper[5129]: E0314 06:59:19.349163 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:19Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 06:59:19 crc kubenswrapper[5129]: I0314 06:59:19.659516 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:19Z is after 2026-02-23T05:33:13Z Mar 14 06:59:19 crc kubenswrapper[5129]: I0314 06:59:19.984337 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:19Z is after 2026-02-23T05:33:13Z Mar 14 
06:59:20 crc kubenswrapper[5129]: I0314 06:59:20.342934 5129 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:59:20 crc kubenswrapper[5129]: I0314 06:59:20.343019 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 06:59:20 crc kubenswrapper[5129]: I0314 06:59:20.632756 5129 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 06:59:20 crc kubenswrapper[5129]: E0314 06:59:20.636791 5129 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 06:59:20 crc kubenswrapper[5129]: I0314 06:59:20.983724 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:20Z is after 2026-02-23T05:33:13Z Mar 14 06:59:21 crc kubenswrapper[5129]: I0314 06:59:21.635087 5129 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:59:21 crc kubenswrapper[5129]: I0314 06:59:21.635344 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:21 crc kubenswrapper[5129]: I0314 06:59:21.637071 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:21 crc kubenswrapper[5129]: I0314 06:59:21.637125 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:21 crc kubenswrapper[5129]: I0314 06:59:21.637138 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:21 crc kubenswrapper[5129]: I0314 06:59:21.637806 5129 scope.go:117] "RemoveContainer" containerID="7d848faf746df38e69f47f831785ad2f29d0d86901741c1da56540735baab062" Mar 14 06:59:21 crc kubenswrapper[5129]: E0314 06:59:21.638037 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:21 crc kubenswrapper[5129]: E0314 06:59:21.930017 5129 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:21Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ca2f7b06c3632 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:57.977636402 +0000 UTC m=+0.729551616,LastTimestamp:2026-03-14 06:58:57.977636402 +0000 UTC m=+0.729551616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:21 crc kubenswrapper[5129]: I0314 06:59:21.983431 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:21Z is after 2026-02-23T05:33:13Z Mar 14 06:59:22 crc kubenswrapper[5129]: W0314 06:59:22.775767 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:22Z is after 2026-02-23T05:33:13Z Mar 14 06:59:22 crc kubenswrapper[5129]: E0314 06:59:22.775858 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:22Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 06:59:22 crc kubenswrapper[5129]: I0314 06:59:22.984212 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:22Z is after 2026-02-23T05:33:13Z Mar 14 06:59:23 crc kubenswrapper[5129]: I0314 06:59:23.983547 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:23Z is after 2026-02-23T05:33:13Z Mar 14 06:59:24 crc kubenswrapper[5129]: W0314 06:59:24.697104 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:24Z is after 2026-02-23T05:33:13Z Mar 14 06:59:24 crc kubenswrapper[5129]: E0314 06:59:24.697246 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 06:59:24 crc kubenswrapper[5129]: I0314 06:59:24.983861 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:24Z is after 2026-02-23T05:33:13Z Mar 14 06:59:25 crc 
kubenswrapper[5129]: I0314 06:59:25.329209 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:25 crc kubenswrapper[5129]: I0314 06:59:25.331148 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:25 crc kubenswrapper[5129]: I0314 06:59:25.331389 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:25 crc kubenswrapper[5129]: I0314 06:59:25.331593 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:25 crc kubenswrapper[5129]: I0314 06:59:25.331832 5129 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:59:25 crc kubenswrapper[5129]: E0314 06:59:25.337347 5129 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:25Z is after 2026-02-23T05:33:13Z" node="crc" Mar 14 06:59:25 crc kubenswrapper[5129]: E0314 06:59:25.344049 5129 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:25Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 14 06:59:25 crc kubenswrapper[5129]: I0314 06:59:25.984032 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:25Z is after 2026-02-23T05:33:13Z Mar 14 06:59:26 crc 
kubenswrapper[5129]: I0314 06:59:26.983193 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:26Z is after 2026-02-23T05:33:13Z Mar 14 06:59:27 crc kubenswrapper[5129]: I0314 06:59:27.984700 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:27Z is after 2026-02-23T05:33:13Z Mar 14 06:59:28 crc kubenswrapper[5129]: E0314 06:59:28.139698 5129 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 06:59:28 crc kubenswrapper[5129]: W0314 06:59:28.394200 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:28Z is after 2026-02-23T05:33:13Z Mar 14 06:59:28 crc kubenswrapper[5129]: E0314 06:59:28.394319 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 06:59:28 crc kubenswrapper[5129]: I0314 06:59:28.984806 5129 csi_plugin.go:884] Failed to contact API server 
when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:28Z is after 2026-02-23T05:33:13Z Mar 14 06:59:29 crc kubenswrapper[5129]: I0314 06:59:29.984504 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:29Z is after 2026-02-23T05:33:13Z Mar 14 06:59:30 crc kubenswrapper[5129]: I0314 06:59:30.343005 5129 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 06:59:30 crc kubenswrapper[5129]: I0314 06:59:30.343102 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:59:30 crc kubenswrapper[5129]: I0314 06:59:30.343180 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:59:30 crc kubenswrapper[5129]: I0314 06:59:30.343353 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:30 crc kubenswrapper[5129]: I0314 06:59:30.345093 5129 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:30 crc kubenswrapper[5129]: I0314 06:59:30.345166 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:30 crc kubenswrapper[5129]: I0314 06:59:30.345194 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:30 crc kubenswrapper[5129]: I0314 06:59:30.346006 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"d84344093736ed5f4571f4a04217738adc1db9b27b3a238149dfcd03aa6725b2"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 14 06:59:30 crc kubenswrapper[5129]: I0314 06:59:30.346286 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://d84344093736ed5f4571f4a04217738adc1db9b27b3a238149dfcd03aa6725b2" gracePeriod=30 Mar 14 06:59:30 crc kubenswrapper[5129]: I0314 06:59:30.688972 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 14 06:59:30 crc kubenswrapper[5129]: I0314 06:59:30.689632 5129 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d84344093736ed5f4571f4a04217738adc1db9b27b3a238149dfcd03aa6725b2" exitCode=255 Mar 14 06:59:30 crc kubenswrapper[5129]: I0314 06:59:30.689675 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d84344093736ed5f4571f4a04217738adc1db9b27b3a238149dfcd03aa6725b2"} Mar 14 06:59:30 crc kubenswrapper[5129]: I0314 06:59:30.689705 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4933abe33b7df00e5183eceab4fa4714af0304d9910886fb723e907c75dcd018"} Mar 14 06:59:30 crc kubenswrapper[5129]: I0314 06:59:30.689799 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:30 crc kubenswrapper[5129]: I0314 06:59:30.690771 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:30 crc kubenswrapper[5129]: I0314 06:59:30.690805 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:30 crc kubenswrapper[5129]: I0314 06:59:30.690816 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:30 crc kubenswrapper[5129]: I0314 06:59:30.983213 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:30Z is after 2026-02-23T05:33:13Z Mar 14 06:59:31 crc kubenswrapper[5129]: E0314 06:59:31.936836 5129 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:31Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ca2f7b06c3632 default 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:57.977636402 +0000 UTC m=+0.729551616,LastTimestamp:2026-03-14 06:58:57.977636402 +0000 UTC m=+0.729551616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:31 crc kubenswrapper[5129]: I0314 06:59:31.983118 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:31Z is after 2026-02-23T05:33:13Z Mar 14 06:59:32 crc kubenswrapper[5129]: I0314 06:59:32.338084 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:32 crc kubenswrapper[5129]: I0314 06:59:32.339569 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:32 crc kubenswrapper[5129]: I0314 06:59:32.339691 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:32 crc kubenswrapper[5129]: I0314 06:59:32.339718 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:32 crc kubenswrapper[5129]: I0314 06:59:32.339763 5129 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:59:32 crc kubenswrapper[5129]: E0314 06:59:32.342843 5129 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-03-14T06:59:32Z is after 2026-02-23T05:33:13Z" node="crc" Mar 14 06:59:32 crc kubenswrapper[5129]: E0314 06:59:32.348072 5129 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:32Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 14 06:59:32 crc kubenswrapper[5129]: I0314 06:59:32.985280 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:32Z is after 2026-02-23T05:33:13Z Mar 14 06:59:33 crc kubenswrapper[5129]: I0314 06:59:33.035956 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:33 crc kubenswrapper[5129]: I0314 06:59:33.037437 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:33 crc kubenswrapper[5129]: I0314 06:59:33.037476 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:33 crc kubenswrapper[5129]: I0314 06:59:33.037485 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:33 crc kubenswrapper[5129]: I0314 06:59:33.038022 5129 scope.go:117] "RemoveContainer" containerID="7d848faf746df38e69f47f831785ad2f29d0d86901741c1da56540735baab062" Mar 14 06:59:33 crc kubenswrapper[5129]: I0314 06:59:33.698945 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 14 06:59:33 crc kubenswrapper[5129]: I0314 06:59:33.700238 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"16d4f165194fa463998c280a8f21ef69a0ef221d5644f5645db71df018206ce6"} Mar 14 06:59:33 crc kubenswrapper[5129]: I0314 06:59:33.700372 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:33 crc kubenswrapper[5129]: I0314 06:59:33.701114 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:33 crc kubenswrapper[5129]: I0314 06:59:33.701143 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:33 crc kubenswrapper[5129]: I0314 06:59:33.701153 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:33 crc kubenswrapper[5129]: I0314 06:59:33.981715 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:33Z is after 2026-02-23T05:33:13Z Mar 14 06:59:34 crc kubenswrapper[5129]: I0314 06:59:34.705825 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 14 06:59:34 crc kubenswrapper[5129]: I0314 06:59:34.706905 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 14 06:59:34 crc kubenswrapper[5129]: I0314 06:59:34.710268 5129 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="16d4f165194fa463998c280a8f21ef69a0ef221d5644f5645db71df018206ce6" exitCode=255 Mar 14 06:59:34 crc kubenswrapper[5129]: I0314 06:59:34.710330 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"16d4f165194fa463998c280a8f21ef69a0ef221d5644f5645db71df018206ce6"} Mar 14 06:59:34 crc kubenswrapper[5129]: I0314 06:59:34.710415 5129 scope.go:117] "RemoveContainer" containerID="7d848faf746df38e69f47f831785ad2f29d0d86901741c1da56540735baab062" Mar 14 06:59:34 crc kubenswrapper[5129]: I0314 06:59:34.710666 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:34 crc kubenswrapper[5129]: I0314 06:59:34.712407 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:34 crc kubenswrapper[5129]: I0314 06:59:34.712483 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:34 crc kubenswrapper[5129]: I0314 06:59:34.712509 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:34 crc kubenswrapper[5129]: I0314 06:59:34.713509 5129 scope.go:117] "RemoveContainer" containerID="16d4f165194fa463998c280a8f21ef69a0ef221d5644f5645db71df018206ce6" Mar 14 06:59:34 crc kubenswrapper[5129]: E0314 06:59:34.713859 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:34 crc kubenswrapper[5129]: I0314 06:59:34.984752 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:35 crc kubenswrapper[5129]: I0314 06:59:35.389878 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:59:35 crc kubenswrapper[5129]: I0314 06:59:35.713953 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 14 06:59:35 crc kubenswrapper[5129]: I0314 06:59:35.716031 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:35 crc kubenswrapper[5129]: I0314 06:59:35.716935 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:35 crc kubenswrapper[5129]: I0314 06:59:35.716989 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:35 crc kubenswrapper[5129]: I0314 06:59:35.717008 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:35 crc kubenswrapper[5129]: I0314 06:59:35.717766 5129 scope.go:117] "RemoveContainer" containerID="16d4f165194fa463998c280a8f21ef69a0ef221d5644f5645db71df018206ce6" Mar 14 06:59:35 crc kubenswrapper[5129]: E0314 06:59:35.718043 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:35 crc kubenswrapper[5129]: I0314 06:59:35.985651 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:36 crc kubenswrapper[5129]: I0314 06:59:36.006932 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:59:36 crc kubenswrapper[5129]: I0314 06:59:36.007062 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:36 crc kubenswrapper[5129]: I0314 06:59:36.008043 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:36 crc kubenswrapper[5129]: I0314 06:59:36.008073 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:36 crc kubenswrapper[5129]: I0314 06:59:36.008084 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:36 crc kubenswrapper[5129]: I0314 06:59:36.892479 5129 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 06:59:36 crc kubenswrapper[5129]: I0314 06:59:36.910941 5129 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 14 06:59:36 crc kubenswrapper[5129]: I0314 06:59:36.984024 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:37 crc kubenswrapper[5129]: I0314 06:59:37.342103 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 06:59:37 crc kubenswrapper[5129]: I0314 06:59:37.342379 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:37 crc kubenswrapper[5129]: I0314 06:59:37.343845 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:37 crc kubenswrapper[5129]: I0314 06:59:37.343919 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:37 crc kubenswrapper[5129]: I0314 06:59:37.343930 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:37 crc kubenswrapper[5129]: I0314 06:59:37.986949 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:38 crc kubenswrapper[5129]: E0314 06:59:38.142638 5129 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 06:59:38 crc kubenswrapper[5129]: I0314 06:59:38.987854 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:39 crc kubenswrapper[5129]: I0314 06:59:39.343997 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 14 06:59:39 crc kubenswrapper[5129]: I0314 06:59:39.345784 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:39 crc kubenswrapper[5129]: I0314 06:59:39.345934 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:39 crc kubenswrapper[5129]: I0314 06:59:39.346028 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:39 crc kubenswrapper[5129]: I0314 06:59:39.346222 5129 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 06:59:39 crc kubenswrapper[5129]: E0314 06:59:39.355237 5129 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 14 06:59:39 crc kubenswrapper[5129]: E0314 06:59:39.355340 5129 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 14 06:59:39 crc kubenswrapper[5129]: I0314 06:59:39.986192 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:40 crc kubenswrapper[5129]: I0314 06:59:40.343108 5129 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 
06:59:40 crc kubenswrapper[5129]: I0314 06:59:40.343231 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 06:59:40 crc kubenswrapper[5129]: I0314 06:59:40.984382 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:41 crc kubenswrapper[5129]: W0314 06:59:41.181233 5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 14 06:59:41 crc kubenswrapper[5129]: E0314 06:59:41.181326 5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 14 06:59:41 crc kubenswrapper[5129]: I0314 06:59:41.635857 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 06:59:41 crc kubenswrapper[5129]: I0314 06:59:41.636100 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 06:59:41 crc kubenswrapper[5129]: I0314 06:59:41.637944 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:41 
crc kubenswrapper[5129]: I0314 06:59:41.637998 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:41 crc kubenswrapper[5129]: I0314 06:59:41.638020 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:41 crc kubenswrapper[5129]: I0314 06:59:41.638883 5129 scope.go:117] "RemoveContainer" containerID="16d4f165194fa463998c280a8f21ef69a0ef221d5644f5645db71df018206ce6" Mar 14 06:59:41 crc kubenswrapper[5129]: E0314 06:59:41.639189 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:41 crc kubenswrapper[5129]: E0314 06:59:41.944709 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b06c3632 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:57.977636402 +0000 UTC m=+0.729551616,LastTimestamp:2026-03-14 06:58:57.977636402 +0000 UTC m=+0.729551616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:41 crc kubenswrapper[5129]: E0314 06:59:41.949451 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b4181302 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039231234 +0000 UTC m=+0.791146428,LastTimestamp:2026-03-14 06:58:58.039231234 +0000 UTC m=+0.791146428,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:41 crc kubenswrapper[5129]: E0314 06:59:41.955762 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b4186b7c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039253884 +0000 UTC m=+0.791169078,LastTimestamp:2026-03-14 06:58:58.039253884 +0000 UTC m=+0.791169078,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:41 crc kubenswrapper[5129]: E0314 06:59:41.962547 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b41894f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039264504 +0000 UTC m=+0.791179698,LastTimestamp:2026-03-14 06:58:58.039264504 +0000 UTC m=+0.791179698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:41 crc kubenswrapper[5129]: E0314 06:59:41.968770 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b97aa8bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.129578172 +0000 UTC m=+0.881493366,LastTimestamp:2026-03-14 06:58:58.129578172 +0000 UTC m=+0.881493366,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:41 crc kubenswrapper[5129]: E0314 06:59:41.976269 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b4181302\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b4181302 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039231234 +0000 UTC m=+0.791146428,LastTimestamp:2026-03-14 06:58:58.137181973 +0000 UTC m=+0.889097157,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:41 crc kubenswrapper[5129]: I0314 06:59:41.983234 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:41 crc kubenswrapper[5129]: E0314 06:59:41.983197 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b4186b7c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b4186b7c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039253884 +0000 UTC m=+0.791169078,LastTimestamp:2026-03-14 06:58:58.137204754 +0000 UTC m=+0.889119938,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:41 crc kubenswrapper[5129]: E0314 06:59:41.987714 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b41894f8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b41894f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039264504 +0000 UTC m=+0.791179698,LastTimestamp:2026-03-14 06:58:58.137240694 +0000 UTC m=+0.889155878,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:41 crc kubenswrapper[5129]: E0314 06:59:41.993799 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b4181302\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b4181302 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039231234 +0000 UTC m=+0.791146428,LastTimestamp:2026-03-14 06:58:58.138921847 +0000 UTC m=+0.890837031,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:41 crc kubenswrapper[5129]: E0314 06:59:41.998647 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b4186b7c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b4186b7c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039253884 +0000 UTC m=+0.791169078,LastTimestamp:2026-03-14 06:58:58.138938667 +0000 UTC m=+0.890853851,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.005046 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b41894f8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b41894f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039264504 +0000 UTC m=+0.791179698,LastTimestamp:2026-03-14 06:58:58.138978118 +0000 UTC m=+0.890893312,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.009254 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b4181302\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b4181302 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039231234 +0000 UTC 
m=+0.791146428,LastTimestamp:2026-03-14 06:58:58.139021398 +0000 UTC m=+0.890936592,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.016696 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b4186b7c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b4186b7c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039253884 +0000 UTC m=+0.791169078,LastTimestamp:2026-03-14 06:58:58.139038018 +0000 UTC m=+0.890953212,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.023386 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b41894f8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b41894f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039264504 +0000 UTC m=+0.791179698,LastTimestamp:2026-03-14 06:58:58.139048158 +0000 UTC m=+0.890963352,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.029973 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b4181302\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b4181302 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039231234 +0000 UTC m=+0.791146428,LastTimestamp:2026-03-14 06:58:58.140484617 +0000 UTC m=+0.892399811,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.036966 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b4181302\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b4181302 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039231234 +0000 UTC m=+0.791146428,LastTimestamp:2026-03-14 06:58:58.140511587 +0000 UTC m=+0.892426781,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.043741 5129 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b4186b7c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b4186b7c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039253884 +0000 UTC m=+0.791169078,LastTimestamp:2026-03-14 06:58:58.140553018 +0000 UTC m=+0.892468242,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.050085 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b4186b7c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b4186b7c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039253884 +0000 UTC m=+0.791169078,LastTimestamp:2026-03-14 06:58:58.140576408 +0000 UTC m=+0.892491602,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.056069 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b41894f8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b41894f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039264504 +0000 UTC m=+0.791179698,LastTimestamp:2026-03-14 06:58:58.140631989 +0000 UTC m=+0.892547223,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.060415 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b41894f8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b41894f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039264504 +0000 UTC m=+0.791179698,LastTimestamp:2026-03-14 06:58:58.140779781 +0000 UTC m=+0.892694975,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.064713 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b4181302\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b4181302 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039231234 +0000 UTC m=+0.791146428,LastTimestamp:2026-03-14 06:58:58.141750774 +0000 UTC m=+0.893665978,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.069327 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b4186b7c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b4186b7c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039253884 +0000 UTC m=+0.791169078,LastTimestamp:2026-03-14 06:58:58.141766845 +0000 UTC m=+0.893682039,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.074357 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b41894f8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b41894f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039264504 +0000 UTC m=+0.791179698,LastTimestamp:2026-03-14 06:58:58.141779655 +0000 UTC m=+0.893694849,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.078349 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b4181302\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b4181302 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039231234 +0000 UTC m=+0.791146428,LastTimestamp:2026-03-14 06:58:58.142822328 +0000 UTC m=+0.894737522,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.082518 5129 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca2f7b4186b7c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca2f7b4186b7c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.039253884 +0000 UTC 
m=+0.791169078,LastTimestamp:2026-03-14 06:58:58.142835148 +0000 UTC m=+0.894750332,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.093348 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca2f7d0e9f8d2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.522749138 +0000 UTC m=+1.274664362,LastTimestamp:2026-03-14 06:58:58.522749138 +0000 UTC m=+1.274664362,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.098197 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f7d163aa5b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.530724443 +0000 UTC m=+1.282639627,LastTimestamp:2026-03-14 06:58:58.530724443 +0000 UTC m=+1.282639627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.102137 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f7d1a78a0f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.535172623 +0000 UTC m=+1.287087807,LastTimestamp:2026-03-14 06:58:58.535172623 +0000 UTC m=+1.287087807,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.107838 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f7d2774763 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.548787043 +0000 UTC m=+1.300702267,LastTimestamp:2026-03-14 06:58:58.548787043 +0000 UTC m=+1.300702267,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.111959 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f7d2dbb1f0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:58.55536792 +0000 UTC 
m=+1.307283114,LastTimestamp:2026-03-14 06:58:58.55536792 +0000 UTC m=+1.307283114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.116532 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f7f25c8f24 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.083906852 +0000 UTC m=+1.835822046,LastTimestamp:2026-03-14 06:58:59.083906852 +0000 UTC m=+1.835822046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.122222 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f7f2d58249 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container 
wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.091833417 +0000 UTC m=+1.843748611,LastTimestamp:2026-03-14 06:58:59.091833417 +0000 UTC m=+1.843748611,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.128955 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca2f7f2ec6724 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.093333796 +0000 UTC m=+1.845248990,LastTimestamp:2026-03-14 06:58:59.093333796 +0000 UTC m=+1.845248990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.135506 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f7f2ef3c27 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.093519399 +0000 UTC m=+1.845434593,LastTimestamp:2026-03-14 06:58:59.093519399 +0000 UTC m=+1.845434593,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.139323 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f7f333fbff openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.098024959 +0000 UTC m=+1.849940153,LastTimestamp:2026-03-14 06:58:59.098024959 +0000 UTC m=+1.849940153,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.143823 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f7f3373d04 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.098238212 +0000 UTC m=+1.850153436,LastTimestamp:2026-03-14 06:58:59.098238212 +0000 UTC m=+1.850153436,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.150237 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f7f3639b40 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.10114592 +0000 UTC m=+1.853061134,LastTimestamp:2026-03-14 06:58:59.10114592 +0000 UTC m=+1.853061134,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.154738 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f7f39e5e61 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.104996961 +0000 UTC m=+1.856912155,LastTimestamp:2026-03-14 06:58:59.104996961 +0000 UTC m=+1.856912155,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.159310 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca2f7f3cd0f65 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.108056933 +0000 UTC m=+1.859972127,LastTimestamp:2026-03-14 06:58:59.108056933 +0000 UTC m=+1.859972127,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.165538 5129 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f7f3e95970 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.109910896 +0000 UTC m=+1.861826110,LastTimestamp:2026-03-14 06:58:59.109910896 +0000 UTC m=+1.861826110,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.170014 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f7f3f6d3b8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.110794168 +0000 UTC m=+1.862709362,LastTimestamp:2026-03-14 06:58:59.110794168 +0000 UTC m=+1.862709362,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.175441 5129 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f8061edb77 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.415407479 +0000 UTC m=+2.167322703,LastTimestamp:2026-03-14 06:58:59.415407479 +0000 UTC m=+2.167322703,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.180841 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f806eb7b5d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.428817757 +0000 UTC m=+2.180732981,LastTimestamp:2026-03-14 06:58:59.428817757 +0000 UTC m=+2.180732981,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.186669 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f80703dcb3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.430415539 +0000 UTC m=+2.182330763,LastTimestamp:2026-03-14 06:58:59.430415539 +0000 UTC m=+2.182330763,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.193108 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f8153f0fe7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.669176295 +0000 UTC m=+2.421091529,LastTimestamp:2026-03-14 06:58:59.669176295 +0000 UTC m=+2.421091529,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.197338 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f816646dd7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.688402391 +0000 UTC m=+2.440317605,LastTimestamp:2026-03-14 06:58:59.688402391 +0000 UTC m=+2.440317605,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.201243 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f8167e0cfc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.690081532 +0000 UTC m=+2.441996756,LastTimestamp:2026-03-14 06:58:59.690081532 +0000 UTC m=+2.441996756,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.206869 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f824207df9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.918831097 +0000 UTC m=+2.670746301,LastTimestamp:2026-03-14 06:58:59.918831097 +0000 UTC 
m=+2.670746301,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.211405 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f824ccb489 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.930117257 +0000 UTC m=+2.682032491,LastTimestamp:2026-03-14 06:58:59.930117257 +0000 UTC m=+2.682032491,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.219480 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f82c7a9444 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.058952772 +0000 UTC m=+2.810867986,LastTimestamp:2026-03-14 06:59:00.058952772 +0000 UTC m=+2.810867986,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.227382 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f82c9e045c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.061275228 +0000 UTC m=+2.813190422,LastTimestamp:2026-03-14 06:59:00.061275228 +0000 UTC m=+2.813190422,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.233132 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca2f82d0fc803 openshift-machine-config-operator 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.068730883 +0000 UTC m=+2.820646107,LastTimestamp:2026-03-14 06:59:00.068730883 +0000 UTC m=+2.820646107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.241011 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f82d71177d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.075108221 +0000 UTC m=+2.827023445,LastTimestamp:2026-03-14 06:59:00.075108221 +0000 UTC m=+2.827023445,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 
06:59:42.246530 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca2f83c888f5b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.328304475 +0000 UTC m=+3.080219659,LastTimestamp:2026-03-14 06:59:00.328304475 +0000 UTC m=+3.080219659,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.251321 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f83c89ea59 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.328393305 +0000 UTC m=+3.080308489,LastTimestamp:2026-03-14 06:59:00.328393305 +0000 UTC m=+3.080308489,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.256396 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f83c8a9e9f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.328439455 +0000 UTC m=+3.080354639,LastTimestamp:2026-03-14 06:59:00.328439455 +0000 UTC m=+3.080354639,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.260461 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f83c92b15f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.328968543 +0000 UTC m=+3.080883727,LastTimestamp:2026-03-14 06:59:00.328968543 +0000 UTC m=+3.080883727,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.264862 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f83d5fbaa6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.342405798 +0000 UTC m=+3.094320982,LastTimestamp:2026-03-14 06:59:00.342405798 +0000 UTC m=+3.094320982,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.269855 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca2f83d74f252 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.343796306 +0000 UTC m=+3.095711490,LastTimestamp:2026-03-14 
06:59:00.343796306 +0000 UTC m=+3.095711490,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.274871 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f83d785fe5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.344020965 +0000 UTC m=+3.095936149,LastTimestamp:2026-03-14 06:59:00.344020965 +0000 UTC m=+3.095936149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.279146 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f83db84dc3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.348210627 +0000 UTC m=+3.100125811,LastTimestamp:2026-03-14 06:59:00.348210627 +0000 UTC m=+3.100125811,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.283325 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f83df54cd0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.35220808 +0000 UTC m=+3.104123264,LastTimestamp:2026-03-14 06:59:00.35220808 +0000 UTC m=+3.104123264,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.289052 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f83e2f89ba openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.356024762 +0000 UTC m=+3.107939936,LastTimestamp:2026-03-14 06:59:00.356024762 +0000 UTC m=+3.107939936,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.292861 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f84a0aa6be openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.55493395 +0000 UTC m=+3.306849144,LastTimestamp:2026-03-14 06:59:00.55493395 +0000 UTC m=+3.306849144,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.301145 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f84a32bd39 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.557561145 +0000 UTC m=+3.309476389,LastTimestamp:2026-03-14 06:59:00.557561145 +0000 UTC m=+3.309476389,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.305487 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f84b0d92fd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.571902717 +0000 UTC m=+3.323817921,LastTimestamp:2026-03-14 06:59:00.571902717 +0000 UTC m=+3.323817921,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.310397 5129 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f84b29e126 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.573757734 +0000 UTC m=+3.325672948,LastTimestamp:2026-03-14 06:59:00.573757734 +0000 UTC m=+3.325672948,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.315637 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f84b4cafae openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.57603883 +0000 UTC m=+3.327954024,LastTimestamp:2026-03-14 06:59:00.57603883 +0000 UTC 
m=+3.327954024,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.319805 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f84b5bc7d0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.577028048 +0000 UTC m=+3.328943232,LastTimestamp:2026-03-14 06:59:00.577028048 +0000 UTC m=+3.328943232,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.324130 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f855fa4a33 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.755188275 +0000 UTC m=+3.507103459,LastTimestamp:2026-03-14 06:59:00.755188275 +0000 UTC m=+3.507103459,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.329271 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f85654685b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.761094235 +0000 UTC m=+3.513009419,LastTimestamp:2026-03-14 06:59:00.761094235 +0000 UTC m=+3.513009419,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.335953 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca2f856d82892 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.769728658 +0000 UTC m=+3.521643842,LastTimestamp:2026-03-14 06:59:00.769728658 +0000 UTC m=+3.521643842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.340250 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f85848820b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.793868811 +0000 UTC m=+3.545783995,LastTimestamp:2026-03-14 06:59:00.793868811 +0000 UTC m=+3.545783995,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.344549 5129 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f8585bad6b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.795125099 +0000 UTC m=+3.547040283,LastTimestamp:2026-03-14 06:59:00.795125099 +0000 UTC m=+3.547040283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.348105 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f861eabbf3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.955495411 +0000 UTC m=+3.707410615,LastTimestamp:2026-03-14 06:59:00.955495411 +0000 UTC m=+3.707410615,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.351976 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f86273bd16 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.964474134 +0000 UTC m=+3.716389318,LastTimestamp:2026-03-14 06:59:00.964474134 +0000 UTC m=+3.716389318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.357473 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f862816c03 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:00.965370883 +0000 UTC m=+3.717286077,LastTimestamp:2026-03-14 06:59:00.965370883 +0000 UTC m=+3.717286077,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.362710 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f86a38c744 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:01.094827844 +0000 UTC m=+3.846743028,LastTimestamp:2026-03-14 06:59:01.094827844 +0000 UTC m=+3.846743028,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.367414 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f86d27d175 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:01.144047989 +0000 UTC m=+3.895963203,LastTimestamp:2026-03-14 06:59:01.144047989 +0000 UTC m=+3.895963203,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.372563 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2f86dc33d64 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:01.1542337 +0000 UTC m=+3.906148884,LastTimestamp:2026-03-14 06:59:01.1542337 +0000 UTC m=+3.906148884,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.377482 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f87a116644 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:01.360682564 +0000 UTC m=+4.112597788,LastTimestamp:2026-03-14 06:59:01.360682564 +0000 UTC m=+4.112597788,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.384512 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f87ba6ecb7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:01.387259063 +0000 UTC m=+4.139174277,LastTimestamp:2026-03-14 06:59:01.387259063 +0000 UTC m=+4.139174277,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.392095 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f8a6899f82 openshift-etcd 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:02.106759042 +0000 UTC m=+4.858674256,LastTimestamp:2026-03-14 06:59:02.106759042 +0000 UTC m=+4.858674256,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.396552 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f8b661c222 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:02.372581922 +0000 UTC m=+5.124497126,LastTimestamp:2026-03-14 06:59:02.372581922 +0000 UTC m=+5.124497126,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.403373 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f8b706e805 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:02.383405061 +0000 UTC m=+5.135320265,LastTimestamp:2026-03-14 06:59:02.383405061 +0000 UTC m=+5.135320265,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.407981 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f8b714efa7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:02.384324519 +0000 UTC m=+5.136239713,LastTimestamp:2026-03-14 06:59:02.384324519 +0000 UTC m=+5.136239713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.409649 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ca2f8c3bc4cf8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:02.596619512 +0000 UTC m=+5.348534686,LastTimestamp:2026-03-14 06:59:02.596619512 +0000 UTC m=+5.348534686,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.413932 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f8c4c30369 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:02.613836649 +0000 UTC m=+5.365751833,LastTimestamp:2026-03-14 06:59:02.613836649 +0000 UTC m=+5.365751833,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.416391 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f8c4d55476 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:02.615037046 +0000 UTC m=+5.366952240,LastTimestamp:2026-03-14 06:59:02.615037046 +0000 UTC m=+5.366952240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.418935 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f8cf9a41ae openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:02.79571499 +0000 UTC m=+5.547630204,LastTimestamp:2026-03-14 06:59:02.79571499 +0000 UTC m=+5.547630204,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.422755 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f8d0876ef2 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:02.81125861 +0000 UTC m=+5.563173804,LastTimestamp:2026-03-14 06:59:02.81125861 +0000 UTC m=+5.563173804,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.429216 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f8d09810f2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:02.812348658 +0000 UTC m=+5.564263862,LastTimestamp:2026-03-14 06:59:02.812348658 +0000 UTC m=+5.564263862,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.434455 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ca2f8dcd039c3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:03.017355715 +0000 UTC m=+5.769270899,LastTimestamp:2026-03-14 06:59:03.017355715 +0000 UTC m=+5.769270899,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.439269 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f8dd8d9c20 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:03.0297672 +0000 UTC m=+5.781682384,LastTimestamp:2026-03-14 06:59:03.0297672 +0000 UTC m=+5.781682384,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.444880 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f8dd9ed679 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:03.030896249 +0000 UTC m=+5.782811443,LastTimestamp:2026-03-14 06:59:03.030896249 +0000 UTC m=+5.782811443,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.451438 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca2f8e8ac67e7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:03.216334823 +0000 UTC m=+5.968250017,LastTimestamp:2026-03-14 06:59:03.216334823 +0000 UTC m=+5.968250017,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.457751 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ca2f8e943b954 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:03.226251604 +0000 UTC m=+5.978166788,LastTimestamp:2026-03-14 06:59:03.226251604 +0000 UTC m=+5.978166788,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.465869 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 14 06:59:42 crc kubenswrapper[5129]: &Event{ObjectMeta:{kube-controller-manager-crc.189ca2fa916bc918 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 14 06:59:42 crc kubenswrapper[5129]: body: Mar 14 06:59:42 crc kubenswrapper[5129]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:10.342416664 +0000 UTC m=+13.094331848,LastTimestamp:2026-03-14 06:59:10.342416664 +0000 UTC m=+13.094331848,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} 
Mar 14 06:59:42 crc kubenswrapper[5129]: > Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.472433 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2fa916cb410 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:10.342476816 +0000 UTC m=+13.094392020,LastTimestamp:2026-03-14 06:59:10.342476816 +0000 UTC m=+13.094392020,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.479862 5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 14 06:59:42 crc kubenswrapper[5129]: &Event{ObjectMeta:{kube-apiserver-crc.189ca2faf0907309 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get 
"https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:49328->192.168.126.11:17697: read: connection reset by peer
Mar 14 06:59:42 crc kubenswrapper[5129]: body:
Mar 14 06:59:42 crc kubenswrapper[5129]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:11.938654985 +0000 UTC m=+14.690570239,LastTimestamp:2026-03-14 06:59:11.938654985 +0000 UTC m=+14.690570239,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 14 06:59:42 crc kubenswrapper[5129]: >
Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.486107    5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2faf0915004 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:49328->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:11.938711556 +0000 UTC m=+14.690626780,LastTimestamp:2026-03-14 06:59:11.938711556 +0000 UTC m=+14.690626780,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.493087    5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 14 06:59:42 crc kubenswrapper[5129]: 	&Event{ObjectMeta:{kube-apiserver-crc.189ca2faf091aee7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Mar 14 06:59:42 crc kubenswrapper[5129]: 	body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 14 06:59:42 crc kubenswrapper[5129]: 	
Mar 14 06:59:42 crc kubenswrapper[5129]: 	,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:11.938735847 +0000 UTC m=+14.690651071,LastTimestamp:2026-03-14 06:59:11.938735847 +0000 UTC m=+14.690651071,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 14 06:59:42 crc kubenswrapper[5129]: >
Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.498644    5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2faf092472c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:11.938774828 +0000 UTC m=+14.690690052,LastTimestamp:2026-03-14 06:59:11.938774828 +0000 UTC m=+14.690690052,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.505963    5129 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ca2faf091aee7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 14 06:59:42 crc kubenswrapper[5129]: 	&Event{ObjectMeta:{kube-apiserver-crc.189ca2faf091aee7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Mar 14 06:59:42 crc kubenswrapper[5129]: 	body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 14 06:59:42 crc kubenswrapper[5129]: 	
Mar 14 06:59:42 crc kubenswrapper[5129]: 	,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:11.938735847 +0000 UTC m=+14.690651071,LastTimestamp:2026-03-14 06:59:11.969856907 +0000 UTC m=+14.721772131,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 14 06:59:42 crc kubenswrapper[5129]: >
Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.511169    5129 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ca2faf092472c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca2faf092472c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:11.938774828 +0000 UTC m=+14.690690052,LastTimestamp:2026-03-14 06:59:11.969911789 +0000 UTC m=+14.721827003,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.514352    5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 14 06:59:42 crc kubenswrapper[5129]: 	&Event{ObjectMeta:{kube-apiserver-crc.189ca2fb1d04c47e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500
Mar 14 06:59:42 crc kubenswrapper[5129]: 	body: [+]ping ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]log ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]etcd ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/openshift.io-api-request-count-filter ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/openshift.io-startkubeinformers ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/start-apiserver-admission-initializer ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/generic-apiserver-start-informers ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/priority-and-fairness-config-consumer ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/priority-and-fairness-filter ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/storage-object-count-tracker-hook ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/start-apiextensions-informers ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/start-apiextensions-controllers ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/crd-informer-synced ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/start-system-namespaces-controller ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/start-cluster-authentication-info-controller ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/start-legacy-token-tracking-controller ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/start-service-ip-repair-controllers ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/priority-and-fairness-config-producer ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/bootstrap-controller ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/aggregator-reload-proxy-client-cert ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/start-kube-aggregator-informers ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/apiservice-status-local-available-controller ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/apiservice-status-remote-available-controller ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/apiservice-registration-controller ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/apiservice-wait-for-first-sync ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/apiservice-discovery-controller ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/kube-apiserver-autoregistration ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]autoregister-completion ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/apiservice-openapi-controller ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	[+]poststarthook/apiservice-openapiv3-controller ok
Mar 14 06:59:42 crc kubenswrapper[5129]: 	livez check failed
Mar 14 06:59:42 crc kubenswrapper[5129]: 	
Mar 14 06:59:42 crc kubenswrapper[5129]: 	,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:12.684475518 +0000 UTC m=+15.436390702,LastTimestamp:2026-03-14 06:59:12.684475518 +0000 UTC m=+15.436390702,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 14 06:59:42 crc kubenswrapper[5129]: >
Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.520963    5129 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca2fa916bc918\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 14 06:59:42 crc kubenswrapper[5129]: 	&Event{ObjectMeta:{kube-controller-manager-crc.189ca2fa916bc918 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
Mar 14 06:59:42 crc kubenswrapper[5129]: 	body: 
Mar 14 06:59:42 crc kubenswrapper[5129]: 	,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:10.342416664 +0000 UTC m=+13.094331848,LastTimestamp:2026-03-14 06:59:20.342992818 +0000 UTC m=+23.094908042,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 14 06:59:42 crc kubenswrapper[5129]: >
Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.525580    5129 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca2fa916cb410\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2fa916cb410 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:10.342476816 +0000 UTC m=+13.094392020,LastTimestamp:2026-03-14 06:59:20.343056549 +0000 UTC m=+23.094971773,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.530676    5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 14 06:59:42 crc kubenswrapper[5129]: 	&Event{ObjectMeta:{kube-controller-manager-crc.189ca2ff398d849c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 14 06:59:42 crc kubenswrapper[5129]: 	body: 
Mar 14 06:59:42 crc kubenswrapper[5129]: 	,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:30.343068828 +0000 UTC m=+33.094984052,LastTimestamp:2026-03-14 06:59:30.343068828 +0000 UTC m=+33.094984052,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 14 06:59:42 crc kubenswrapper[5129]: >
Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.536075    5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2ff398e9bcc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:30.3431403 +0000 UTC m=+33.095055524,LastTimestamp:2026-03-14 06:59:30.3431403 +0000 UTC m=+33.095055524,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.542909    5129 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2ff39be29f7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:30.346256887 +0000 UTC m=+33.098172111,LastTimestamp:2026-03-14 06:59:30.346256887 +0000 UTC m=+33.098172111,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.548045    5129 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca2f7f3639b40\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f7f3639b40 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.10114592 +0000 UTC m=+1.853061134,LastTimestamp:2026-03-14 06:59:30.463981572 +0000 UTC m=+33.215896786,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.555167    5129 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca2f8061edb77\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f8061edb77 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.415407479 +0000 UTC m=+2.167322703,LastTimestamp:2026-03-14 06:59:30.671061389 +0000 UTC m=+33.422976613,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.561686    5129 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca2f806eb7b5d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2f806eb7b5d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:58:59.428817757 +0000 UTC m=+2.180732981,LastTimestamp:2026-03-14 06:59:30.682099453 +0000 UTC m=+33.434014647,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.570265    5129 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca2ff398d849c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 14 06:59:42 crc kubenswrapper[5129]: 	&Event{ObjectMeta:{kube-controller-manager-crc.189ca2ff398d849c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 14 06:59:42 crc kubenswrapper[5129]: 	body: 
Mar 14 06:59:42 crc kubenswrapper[5129]: 	,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:30.343068828 +0000 UTC m=+33.094984052,LastTimestamp:2026-03-14 06:59:40.34320443 +0000 UTC m=+43.095119624,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 14 06:59:42 crc kubenswrapper[5129]: >
Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.577150    5129 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca2ff398e9bcc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca2ff398e9bcc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 06:59:30.3431403 +0000 UTC m=+33.095055524,LastTimestamp:2026-03-14 06:59:40.343290303 +0000 UTC m=+43.095205497,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 06:59:42 crc kubenswrapper[5129]: W0314 06:59:42.660526    5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 14 06:59:42 crc kubenswrapper[5129]: E0314 06:59:42.660685    5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 14 06:59:42 crc kubenswrapper[5129]: I0314 06:59:42.985231    5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 06:59:43 crc kubenswrapper[5129]: I0314 06:59:43.986822    5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 06:59:44 crc kubenswrapper[5129]: I0314 06:59:44.984206    5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 06:59:45 crc kubenswrapper[5129]: I0314 06:59:45.985033    5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 06:59:46 crc kubenswrapper[5129]: I0314 06:59:46.356253    5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:46 crc kubenswrapper[5129]: I0314 06:59:46.357632    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:46 crc kubenswrapper[5129]: I0314 06:59:46.357690    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:46 crc kubenswrapper[5129]: I0314 06:59:46.357702    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:46 crc kubenswrapper[5129]: I0314 06:59:46.357733    5129 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 14 06:59:46 crc kubenswrapper[5129]: E0314 06:59:46.360647    5129 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 14 06:59:46 crc kubenswrapper[5129]: E0314 06:59:46.361253    5129 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 14 06:59:46 crc kubenswrapper[5129]: I0314 06:59:46.984652    5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 06:59:47 crc kubenswrapper[5129]: I0314 06:59:47.360568    5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 14 06:59:47 crc kubenswrapper[5129]: I0314 06:59:47.360756    5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:47 crc kubenswrapper[5129]: I0314 06:59:47.361863    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:47 crc kubenswrapper[5129]: I0314 06:59:47.361899    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:47 crc kubenswrapper[5129]: I0314 06:59:47.361912    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:47 crc kubenswrapper[5129]: I0314 06:59:47.367223    5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 14 06:59:47 crc kubenswrapper[5129]: I0314 06:59:47.750428    5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:47 crc kubenswrapper[5129]: I0314 06:59:47.751215    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:47 crc kubenswrapper[5129]: I0314 06:59:47.751266    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:47 crc kubenswrapper[5129]: I0314 06:59:47.751279    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:47 crc kubenswrapper[5129]: W0314 06:59:47.760534    5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 14 06:59:47 crc kubenswrapper[5129]: E0314 06:59:47.760585    5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 14 06:59:47 crc kubenswrapper[5129]: I0314 06:59:47.983889    5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 06:59:48 crc kubenswrapper[5129]: E0314 06:59:48.143412    5129 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 14 06:59:48 crc kubenswrapper[5129]: W0314 06:59:48.341372    5129 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 14 06:59:48 crc kubenswrapper[5129]: E0314 06:59:48.341425    5129 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 14 06:59:48 crc kubenswrapper[5129]: I0314 06:59:48.983946    5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 06:59:49 crc kubenswrapper[5129]: I0314 06:59:49.987181    5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 06:59:50 crc kubenswrapper[5129]: I0314 06:59:50.985385    5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 06:59:51 crc kubenswrapper[5129]: I0314 06:59:51.206586    5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 14 06:59:51 crc kubenswrapper[5129]: I0314 06:59:51.206854    5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:51 crc kubenswrapper[5129]: I0314 06:59:51.208859    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:51 crc kubenswrapper[5129]: I0314 06:59:51.209027    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:51 crc kubenswrapper[5129]: I0314 06:59:51.209180    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:51 crc kubenswrapper[5129]: I0314 06:59:51.986376    5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 06:59:52 crc kubenswrapper[5129]: I0314 06:59:52.985775    5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 06:59:53 crc kubenswrapper[5129]: I0314 06:59:53.361355    5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:53 crc kubenswrapper[5129]: I0314 06:59:53.362761    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:53 crc kubenswrapper[5129]: I0314 06:59:53.362831    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:53 crc kubenswrapper[5129]: I0314 06:59:53.362856    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:53 crc kubenswrapper[5129]: I0314 06:59:53.362899    5129 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 14 06:59:53 crc kubenswrapper[5129]: E0314 06:59:53.366653    5129 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 14 06:59:53 crc kubenswrapper[5129]: E0314 06:59:53.367067    5129 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 14 06:59:53 crc kubenswrapper[5129]: I0314 06:59:53.987155    5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 06:59:54 crc kubenswrapper[5129]: I0314 06:59:54.983384    5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 06:59:55 crc kubenswrapper[5129]: I0314 06:59:55.035660    5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:55 crc kubenswrapper[5129]: I0314 06:59:55.036801    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:55 crc kubenswrapper[5129]: I0314 06:59:55.036844    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:55 crc kubenswrapper[5129]: I0314 06:59:55.036856    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:55 crc kubenswrapper[5129]: I0314 06:59:55.037433    5129 scope.go:117] "RemoveContainer" containerID="16d4f165194fa463998c280a8f21ef69a0ef221d5644f5645db71df018206ce6"
Mar 14 06:59:55 crc kubenswrapper[5129]: I0314 06:59:55.772245    5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 14 06:59:55 crc kubenswrapper[5129]: I0314 06:59:55.774173    5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee"}
Mar 14 06:59:55 crc kubenswrapper[5129]: I0314 06:59:55.774328    5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:55 crc kubenswrapper[5129]: I0314 06:59:55.775210    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 06:59:55 crc kubenswrapper[5129]: I0314 06:59:55.775238    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 06:59:55 crc kubenswrapper[5129]: I0314 06:59:55.775248    5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 06:59:55 crc kubenswrapper[5129]: I0314 06:59:55.984540    5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 06:59:56 crc kubenswrapper[5129]: I0314 06:59:56.779879    5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 14 06:59:56 crc kubenswrapper[5129]: I0314 06:59:56.780634    5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 14 06:59:56 crc kubenswrapper[5129]: I0314 06:59:56.782921    5129 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee" exitCode=255
Mar 14 06:59:56 crc kubenswrapper[5129]: I0314 06:59:56.782982    5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee"}
Mar 14 06:59:56 crc kubenswrapper[5129]: I0314 06:59:56.783030    5129 scope.go:117] "RemoveContainer" containerID="16d4f165194fa463998c280a8f21ef69a0ef221d5644f5645db71df018206ce6"
Mar 14 06:59:56 crc kubenswrapper[5129]: I0314 06:59:56.783389    5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 06:59:56 crc kubenswrapper[5129]: I0314 06:59:56.785330    5129 kubelet_node_status.go:724] "Recording event message for node"
node="crc" event="NodeHasSufficientMemory" Mar 14 06:59:56 crc kubenswrapper[5129]: I0314 06:59:56.785375 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 06:59:56 crc kubenswrapper[5129]: I0314 06:59:56.785391 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 06:59:56 crc kubenswrapper[5129]: I0314 06:59:56.788074 5129 scope.go:117] "RemoveContainer" containerID="846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee" Mar 14 06:59:56 crc kubenswrapper[5129]: E0314 06:59:56.788526 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 06:59:56 crc kubenswrapper[5129]: I0314 06:59:56.984157 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:57 crc kubenswrapper[5129]: I0314 06:59:57.789950 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 06:59:57 crc kubenswrapper[5129]: I0314 06:59:57.990647 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:58 crc kubenswrapper[5129]: E0314 06:59:58.143516 5129 eviction_manager.go:285] "Eviction manager: failed 
to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 06:59:58 crc kubenswrapper[5129]: I0314 06:59:58.987072 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 06:59:59 crc kubenswrapper[5129]: I0314 06:59:59.984765 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 07:00:00 crc kubenswrapper[5129]: I0314 07:00:00.367157 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 07:00:00 crc kubenswrapper[5129]: I0314 07:00:00.369895 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:00 crc kubenswrapper[5129]: I0314 07:00:00.369976 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:00 crc kubenswrapper[5129]: I0314 07:00:00.370003 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:00 crc kubenswrapper[5129]: I0314 07:00:00.370051 5129 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 07:00:00 crc kubenswrapper[5129]: E0314 07:00:00.374151 5129 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 14 07:00:00 crc kubenswrapper[5129]: E0314 07:00:00.374312 5129 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get 
resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 14 07:00:00 crc kubenswrapper[5129]: I0314 07:00:00.984079 5129 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 07:00:01 crc kubenswrapper[5129]: I0314 07:00:01.635261 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:00:01 crc kubenswrapper[5129]: I0314 07:00:01.635437 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 07:00:01 crc kubenswrapper[5129]: I0314 07:00:01.636566 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:01 crc kubenswrapper[5129]: I0314 07:00:01.636617 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:01 crc kubenswrapper[5129]: I0314 07:00:01.636632 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:01 crc kubenswrapper[5129]: I0314 07:00:01.637264 5129 scope.go:117] "RemoveContainer" containerID="846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee" Mar 14 07:00:01 crc kubenswrapper[5129]: E0314 07:00:01.637447 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 07:00:01 crc kubenswrapper[5129]: I0314 07:00:01.987221 5129 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 07:00:02 crc kubenswrapper[5129]: I0314 07:00:02.585074 5129 csr.go:261] certificate signing request csr-vgjnx is approved, waiting to be issued Mar 14 07:00:02 crc kubenswrapper[5129]: I0314 07:00:02.593702 5129 csr.go:257] certificate signing request csr-vgjnx is issued Mar 14 07:00:02 crc kubenswrapper[5129]: I0314 07:00:02.649168 5129 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 14 07:00:02 crc kubenswrapper[5129]: I0314 07:00:02.820131 5129 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 14 07:00:03 crc kubenswrapper[5129]: I0314 07:00:03.594897 5129 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-07 05:35:31.392750437 +0000 UTC Mar 14 07:00:03 crc kubenswrapper[5129]: I0314 07:00:03.594961 5129 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7174h35m27.7977926s for next certificate rotation Mar 14 07:00:05 crc kubenswrapper[5129]: I0314 07:00:05.389994 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:00:05 crc kubenswrapper[5129]: I0314 07:00:05.390290 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 07:00:05 crc kubenswrapper[5129]: I0314 07:00:05.391944 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:05 crc kubenswrapper[5129]: I0314 07:00:05.392032 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 
07:00:05 crc kubenswrapper[5129]: I0314 07:00:05.392076 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:05 crc kubenswrapper[5129]: I0314 07:00:05.392941 5129 scope.go:117] "RemoveContainer" containerID="846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee" Mar 14 07:00:05 crc kubenswrapper[5129]: E0314 07:00:05.393209 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.374511 5129 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.375898 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.375968 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.375991 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.376155 5129 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.384269 5129 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.384697 5129 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 14 07:00:07 crc kubenswrapper[5129]: E0314 07:00:07.384745 5129 kubelet_node_status.go:585] 
"Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.387895 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.387959 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.387981 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.388003 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.388021 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:07Z","lastTransitionTime":"2026-03-14T07:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:07 crc kubenswrapper[5129]: E0314 07:00:07.403225 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.410315 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.410371 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.410382 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.410400 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.410414 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:07Z","lastTransitionTime":"2026-03-14T07:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:07 crc kubenswrapper[5129]: E0314 07:00:07.418841 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.425935 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.425971 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.425983 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.425999 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.426008 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:07Z","lastTransitionTime":"2026-03-14T07:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:07 crc kubenswrapper[5129]: E0314 07:00:07.435382 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.442418 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.442475 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.442493 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.442516 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:07 crc kubenswrapper[5129]: I0314 07:00:07.442533 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:07Z","lastTransitionTime":"2026-03-14T07:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:07 crc kubenswrapper[5129]: E0314 07:00:07.452511 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:07 crc kubenswrapper[5129]: E0314 07:00:07.452679 5129 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 07:00:07 crc kubenswrapper[5129]: E0314 07:00:07.452708 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:07 crc kubenswrapper[5129]: E0314 07:00:07.553378 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:07 crc kubenswrapper[5129]: E0314 07:00:07.654073 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:07 crc kubenswrapper[5129]: E0314 07:00:07.754220 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:07 crc kubenswrapper[5129]: E0314 07:00:07.854698 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:07 crc kubenswrapper[5129]: E0314 07:00:07.955618 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[5129]: E0314 07:00:08.056179 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[5129]: E0314 
07:00:08.143732 5129 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[5129]: E0314 07:00:08.156529 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[5129]: E0314 07:00:08.257150 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[5129]: E0314 07:00:08.357681 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[5129]: E0314 07:00:08.458067 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[5129]: E0314 07:00:08.558643 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[5129]: E0314 07:00:08.659301 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[5129]: E0314 07:00:08.759404 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[5129]: E0314 07:00:08.859776 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:08 crc kubenswrapper[5129]: E0314 07:00:08.960801 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[5129]: E0314 07:00:09.061240 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[5129]: E0314 07:00:09.162275 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[5129]: E0314 07:00:09.262440 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[5129]: E0314 07:00:09.363120 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[5129]: E0314 07:00:09.463586 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[5129]: E0314 07:00:09.564676 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[5129]: E0314 07:00:09.665571 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[5129]: E0314 07:00:09.766652 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[5129]: E0314 07:00:09.867254 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:09 crc kubenswrapper[5129]: I0314 07:00:09.945523 5129 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 14 07:00:09 crc kubenswrapper[5129]: E0314 07:00:09.968373 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[5129]: E0314 07:00:10.069039 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[5129]: E0314 07:00:10.169943 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[5129]: E0314 07:00:10.271000 5129 kubelet_node_status.go:503] "Error 
getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[5129]: E0314 07:00:10.372047 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[5129]: E0314 07:00:10.472318 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[5129]: E0314 07:00:10.572783 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[5129]: E0314 07:00:10.673939 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[5129]: E0314 07:00:10.774064 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[5129]: E0314 07:00:10.875003 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:10 crc kubenswrapper[5129]: E0314 07:00:10.975421 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[5129]: E0314 07:00:11.075553 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[5129]: E0314 07:00:11.175946 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[5129]: E0314 07:00:11.276489 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[5129]: E0314 07:00:11.377561 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[5129]: E0314 07:00:11.478272 
5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[5129]: E0314 07:00:11.578512 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[5129]: E0314 07:00:11.679494 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[5129]: E0314 07:00:11.780081 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[5129]: E0314 07:00:11.880494 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:11 crc kubenswrapper[5129]: E0314 07:00:11.981490 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[5129]: E0314 07:00:12.082717 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[5129]: E0314 07:00:12.183031 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[5129]: E0314 07:00:12.283264 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[5129]: E0314 07:00:12.384100 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[5129]: E0314 07:00:12.484298 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[5129]: E0314 07:00:12.584532 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc 
kubenswrapper[5129]: E0314 07:00:12.684780 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[5129]: E0314 07:00:12.785826 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[5129]: E0314 07:00:12.886806 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:12 crc kubenswrapper[5129]: E0314 07:00:12.987247 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[5129]: E0314 07:00:13.087683 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[5129]: E0314 07:00:13.188864 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[5129]: E0314 07:00:13.289753 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[5129]: E0314 07:00:13.390561 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[5129]: E0314 07:00:13.490861 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[5129]: E0314 07:00:13.591635 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[5129]: E0314 07:00:13.691893 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[5129]: E0314 07:00:13.792586 5129 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[5129]: E0314 07:00:13.893155 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:13 crc kubenswrapper[5129]: E0314 07:00:13.994162 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:14 crc kubenswrapper[5129]: E0314 07:00:14.094713 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:14 crc kubenswrapper[5129]: E0314 07:00:14.195100 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:14 crc kubenswrapper[5129]: E0314 07:00:14.296154 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:14 crc kubenswrapper[5129]: E0314 07:00:14.397219 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:14 crc kubenswrapper[5129]: E0314 07:00:14.498009 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:14 crc kubenswrapper[5129]: E0314 07:00:14.598742 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:14 crc kubenswrapper[5129]: E0314 07:00:14.699893 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:14 crc kubenswrapper[5129]: E0314 07:00:14.800972 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:14 crc kubenswrapper[5129]: E0314 07:00:14.901547 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[5129]: E0314 07:00:15.002406 5129 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[5129]: E0314 07:00:15.102817 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[5129]: E0314 07:00:15.203248 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[5129]: E0314 07:00:15.303597 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[5129]: E0314 07:00:15.403806 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[5129]: E0314 07:00:15.504550 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[5129]: E0314 07:00:15.605200 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[5129]: E0314 07:00:15.706316 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[5129]: E0314 07:00:15.806490 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[5129]: E0314 07:00:15.907436 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:15 crc kubenswrapper[5129]: I0314 07:00:15.975085 5129 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 14 07:00:16 crc kubenswrapper[5129]: E0314 07:00:16.007632 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:16 crc kubenswrapper[5129]: E0314 
07:00:16.108539 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:16 crc kubenswrapper[5129]: E0314 07:00:16.208999 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:16 crc kubenswrapper[5129]: E0314 07:00:16.309460 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:16 crc kubenswrapper[5129]: E0314 07:00:16.409771 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:16 crc kubenswrapper[5129]: E0314 07:00:16.510224 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:16 crc kubenswrapper[5129]: E0314 07:00:16.610574 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:16 crc kubenswrapper[5129]: E0314 07:00:16.711395 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:16 crc kubenswrapper[5129]: E0314 07:00:16.811941 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:16 crc kubenswrapper[5129]: E0314 07:00:16.912734 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.013470 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.113576 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.214797 5129 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 
07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.284769 5129 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.317708 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.317751 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.317763 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.317780 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.317793 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:17Z","lastTransitionTime":"2026-03-14T07:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.420662 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.420721 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.420738 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.420761 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.420780 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:17Z","lastTransitionTime":"2026-03-14T07:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.522933 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.522999 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.523018 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.523041 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.523058 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:17Z","lastTransitionTime":"2026-03-14T07:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.624964 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.624998 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.625009 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.625024 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.625034 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:17Z","lastTransitionTime":"2026-03-14T07:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.688933 5129 apiserver.go:52] "Watching apiserver" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.697580 5129 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.697966 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.698517 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.698549 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.698716 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.698762 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.698790 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.698831 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.699242 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.699221 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.699315 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.701264 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.701487 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.702559 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.703286 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.703338 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.703506 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.703714 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.704854 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.705557 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.727284 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:17 crc 
kubenswrapper[5129]: I0314 07:00:17.727312 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.727320 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.727332 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.727341 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:17Z","lastTransitionTime":"2026-03-14T07:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.729657 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.738891 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.747421 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.758938 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.763940 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.763967 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.763977 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.763990 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.764025 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:17Z","lastTransitionTime":"2026-03-14T07:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.768097 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.774031 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.777036 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.777076 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.777088 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.777103 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.777113 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:17Z","lastTransitionTime":"2026-03-14T07:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.778162 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.783465 5129 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.787583 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.793819 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.797534 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.797564 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.797574 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.797587 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.797619 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:17Z","lastTransitionTime":"2026-03-14T07:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.810158 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.813226 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.813266 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.813280 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.813297 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.813309 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:17Z","lastTransitionTime":"2026-03-14T07:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.821058 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.824523 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.824572 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.824588 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.824629 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.824646 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:17Z","lastTransitionTime":"2026-03-14T07:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.833713 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.833852 5129 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.835246 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.835292 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.835308 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.835325 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.835338 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:17Z","lastTransitionTime":"2026-03-14T07:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858005 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858087 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858136 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858181 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858227 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858271 5129 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858318 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858362 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858416 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858462 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858497 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod 
"7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858497 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858514 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858626 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858665 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858693 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858722 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858743 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858753 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858785 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858814 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858846 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858876 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858842 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858906 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858937 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858946 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858970 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.858997 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859023 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859051 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859079 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859104 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859107 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859130 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859156 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859188 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859217 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859243 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859274 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859306 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859334 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859365 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859396 5129 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859426 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859453 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859485 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859513 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859540 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859571 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859627 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859716 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859748 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859779 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859810 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859890 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859921 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859985 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860011 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860037 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860066 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860091 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860120 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860190 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860224 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860253 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860281 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860311 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860343 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860372 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860401 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 
14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860434 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860467 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860493 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860522 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860553 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860578 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860632 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859177 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860667 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859298 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859481 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860701 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860734 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860772 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860804 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860838 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860870 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860898 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860929 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860961 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860993 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.861027 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.861056 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.861087 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.861115 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.861144 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.861172 
5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.861199 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.861226 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.861252 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.861283 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.861841 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.861891 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.861939 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.861980 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.862076 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.862143 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859716 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859742 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.859825 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860106 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860146 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860159 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860312 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860377 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860619 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860505 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860681 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860786 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860993 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860991 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.861067 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.862040 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.862091 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.862078 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.860184 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.862165 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.861267 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.862476 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.862446 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.862537 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.862676 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.862786 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.862947 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.862990 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.863180 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.863188 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.863247 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.863247 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.863368 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.863397 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.863701 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.863760 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.863758 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.864256 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.864285 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.864300 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.864304 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.864324 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.864579 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.868241 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.868308 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.864787 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.864831 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.865167 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.865269 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.866014 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.866246 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.866420 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.866666 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.868439 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.866826 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.866850 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.867118 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.867284 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.867311 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.867047 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.867844 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.868329 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.868846 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.868933 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.868973 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869016 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869063 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869070 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869104 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869132 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869141 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869183 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869226 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869265 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869308 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869321 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869349 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869395 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869512 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869542 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869638 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869658 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869711 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869726 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869711 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870418 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870453 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870477 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870507 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870536 5129 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870570 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870614 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870642 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870649 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870673 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870676 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870698 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870724 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870752 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870774 5129 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870800 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870825 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870848 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870873 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870897 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870924 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870951 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870978 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.871004 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.871026 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") 
" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.871021 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.871051 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.871077 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.871102 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.871127 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.871154 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.871178 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.871199 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.871223 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.871319 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.871430 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.861318 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.871822 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.871823 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.871849 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869757 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.869793 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.870149 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.872156 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.872269 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.872354 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.872377 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.872400 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.872114 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.872425 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.872522 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.872570 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.873007 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.873073 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.873322 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.873343 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.873079 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.873520 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.873548 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.873390 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.873705 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.873766 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.873809 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.873841 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.873874 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.873907 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.873945 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.873985 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874019 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874047 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874071 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874094 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874107 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874124 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874307 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874388 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874441 
5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874491 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874540 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874633 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874648 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874678 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874692 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874728 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874763 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874788 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874795 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874842 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874874 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874908 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874943 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.874975 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875008 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875045 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875079 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875113 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875151 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 14 07:00:17 crc 
kubenswrapper[5129]: I0314 07:00:17.875186 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875275 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875311 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875346 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875382 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875417 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875573 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875661 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875714 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875770 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875824 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 14 
07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875878 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876013 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876067 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876105 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876157 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876205 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876250 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876302 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876353 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876396 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876434 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876469 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876516 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876566 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876655 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876782 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876830 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876862 5129 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876889 5129 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876918 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876945 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876972 
5129 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876999 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877024 5129 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877051 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877079 5129 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877108 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877135 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877161 5129 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877187 5129 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877208 5129 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877229 5129 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877248 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877272 5129 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877297 5129 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877325 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc 
kubenswrapper[5129]: I0314 07:00:17.877351 5129 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877376 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877405 5129 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877432 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877458 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877482 5129 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877499 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877518 5129 
reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877536 5129 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877555 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877573 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877594 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877669 5129 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877694 5129 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877717 5129 reconciler_common.go:293] "Volume detached for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877738 5129 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877757 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877776 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877798 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877819 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877837 5129 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877856 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877875 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877894 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877913 5129 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877931 5129 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877949 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877969 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877988 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878007 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878026 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878045 5129 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878063 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878082 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878101 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878119 5129 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") 
on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878137 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878155 5129 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878174 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878193 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878211 5129 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878230 5129 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878249 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878269 5129 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878289 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878308 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878326 5129 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878346 5129 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878371 5129 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878396 5129 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878422 5129 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878451 5129 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878478 5129 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878503 5129 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878527 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878547 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878565 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878584 5129 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878644 
5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878672 5129 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878701 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878726 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878753 5129 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878781 5129 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878810 5129 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878836 5129 reconciler_common.go:293] "Volume detached 
for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878861 5129 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878887 5129 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878913 5129 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878938 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878963 5129 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878989 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879015 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879041 5129 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879067 5129 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879097 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879121 5129 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879145 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879169 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879194 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879220 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879249 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879279 5129 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879305 5129 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879329 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879354 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875008 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod 
"8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.880542 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875018 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875111 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.880573 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875140 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875244 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875326 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875486 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.880663 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875839 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875839 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876212 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876262 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.876476 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877505 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877619 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877645 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.877671 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878308 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878327 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878374 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878545 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878571 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878700 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878807 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.878979 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879232 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879243 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879564 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879538 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879660 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.879796 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.880025 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.880175 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.880317 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.880339 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.880413 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.880491 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.881163 5129 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.881430 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.880567 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). 
InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.875475 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.880885 5129 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.881721 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.880918 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:18.38089486 +0000 UTC m=+81.132810124 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.881818 5129 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.881887 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.882077 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:18.382035925 +0000 UTC m=+81.133951149 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.882153 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:18.382133328 +0000 UTC m=+81.134048572 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.882283 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.882695 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.882835 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.882932 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.882908 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.883081 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.883174 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.883802 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.883810 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.884241 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.888017 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.892199 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.892978 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.893348 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.893651 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.894128 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.894419 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.894304 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.894546 5129 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.893741 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.894951 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:18.394812753 +0000 UTC m=+81.146727947 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.895180 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.895374 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.895376 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.895663 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.895913 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.896395 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.896764 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.896855 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.896885 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.896906 5129 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:17 crc kubenswrapper[5129]: E0314 07:00:17.896982 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:18.396952278 +0000 UTC m=+81.148867542 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.897082 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.897171 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.897255 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.897272 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.897533 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.897774 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.897837 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.898081 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.898103 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.898331 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.898505 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.900049 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.900951 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.902001 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.902232 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.902468 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.902735 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.902838 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.902972 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.903862 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.904753 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.907129 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.910038 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.918157 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.925258 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.930971 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.934147 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.937648 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.937680 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.937689 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.937703 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.937712 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:17Z","lastTransitionTime":"2026-03-14T07:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.980070 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.980446 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.980554 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.980303 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.980892 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.980921 5129 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.980934 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.980945 5129 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.980955 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.980967 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.980977 5129 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.980985 5129 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.980994 5129 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981003 5129 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981012 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981021 5129 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981030 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981039 5129 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981047 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981057 5129 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981066 5129 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981077 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981086 5129 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981096 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981106 5129 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981114 5129 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981123 5129 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node 
\"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981132 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981141 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981152 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981160 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981169 5129 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981179 5129 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981188 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981198 5129 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981208 5129 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981217 5129 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981225 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981233 5129 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981240 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981248 5129 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 
14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981258 5129 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981267 5129 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981276 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981284 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981292 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981300 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981308 5129 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981317 5129 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981325 5129 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981332 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981341 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981351 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981359 5129 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981368 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981376 5129 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981384 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981393 5129 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981400 5129 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981408 5129 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981416 5129 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981423 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981433 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981441 5129 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981448 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981456 5129 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981464 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981471 5129 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981479 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981486 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc 
kubenswrapper[5129]: I0314 07:00:17.981494 5129 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981502 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981510 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981518 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981527 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981538 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981550 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981560 5129 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981573 5129 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981584 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981592 5129 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981627 5129 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981638 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981647 5129 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981655 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981663 5129 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981671 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981678 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981685 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981693 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:17 crc kubenswrapper[5129]: I0314 07:00:17.981701 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.010453 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.018714 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.023190 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 07:00:18 crc kubenswrapper[5129]: W0314 07:00:18.040115 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-43885453f72e95d24288078b25167e7feea47d9eb793bcc6cec05d651e9e3a60 WatchSource:0}: Error finding container 43885453f72e95d24288078b25167e7feea47d9eb793bcc6cec05d651e9e3a60: Status 404 returned error can't find the container with id 43885453f72e95d24288078b25167e7feea47d9eb793bcc6cec05d651e9e3a60 Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.042205 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.042593 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.042822 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.043008 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.043171 5129 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:18Z","lastTransitionTime":"2026-03-14T07:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.047156 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.049898 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.051996 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.053363 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.053397 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.055049 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.057061 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.059134 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.061396 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.063401 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.064938 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.066234 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.068804 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.069801 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.071045 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.071646 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.072484 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.074001 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.075385 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.076172 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.077089 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.078666 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.079429 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.080438 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.081786 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" 
path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.082669 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.082829 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.083571 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.084237 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.085407 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.085938 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.086873 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.087313 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.087896 5129 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.087995 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.090319 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.091101 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.091909 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.093923 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.094716 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.095358 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.096498 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.097524 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.098032 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.098625 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.098989 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.099555 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.100619 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.101075 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.101946 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 14 07:00:18 crc 
kubenswrapper[5129]: I0314 07:00:18.102439 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.103745 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.104220 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.105278 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.105749 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.106241 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.107515 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.108907 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 14 07:00:18 crc 
kubenswrapper[5129]: I0314 07:00:18.112134 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.128554 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.145518 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.145553 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.145562 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.145576 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.145587 5129 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:18Z","lastTransitionTime":"2026-03-14T07:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.248768 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.248809 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.248819 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.248862 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.248873 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:18Z","lastTransitionTime":"2026-03-14T07:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.351350 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.351401 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.351418 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.351438 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.351454 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:18Z","lastTransitionTime":"2026-03-14T07:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.384219 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.384328 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:18 crc kubenswrapper[5129]: E0314 07:00:18.384371 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:19.384351112 +0000 UTC m=+82.136266296 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.384396 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:18 crc kubenswrapper[5129]: E0314 07:00:18.384405 5129 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:18 crc kubenswrapper[5129]: E0314 07:00:18.384449 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:19.384436924 +0000 UTC m=+82.136352108 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:18 crc kubenswrapper[5129]: E0314 07:00:18.384484 5129 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:18 crc kubenswrapper[5129]: E0314 07:00:18.384507 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:19.384501056 +0000 UTC m=+82.136416240 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.453960 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.454004 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.454017 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.454033 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.454044 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:18Z","lastTransitionTime":"2026-03-14T07:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.485132 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:18 crc kubenswrapper[5129]: I0314 07:00:18.485175 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:18 crc kubenswrapper[5129]: E0314 07:00:18.485329 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:18 crc kubenswrapper[5129]: E0314 07:00:18.485351 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:18 crc kubenswrapper[5129]: E0314 07:00:18.485373 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:18 crc kubenswrapper[5129]: E0314 07:00:18.485384 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:18 crc kubenswrapper[5129]: E0314 07:00:18.485397 5129 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:18 crc kubenswrapper[5129]: E0314 07:00:18.485406 5129 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:18 crc kubenswrapper[5129]: E0314 07:00:18.485463 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:19.485444948 +0000 UTC m=+82.237360142 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:18 crc kubenswrapper[5129]: E0314 07:00:18.485502 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:19.4854943 +0000 UTC m=+82.237409494 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.140109 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:19 crc kubenswrapper[5129]: E0314 07:00:19.140327 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.140708 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.140845 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:19 crc kubenswrapper[5129]: E0314 07:00:19.141000 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:19 crc kubenswrapper[5129]: E0314 07:00:19.141138 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.144592 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.144691 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.144707 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.144725 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.144737 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:19Z","lastTransitionTime":"2026-03-14T07:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.148104 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad"} Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.148160 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0"} Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.148179 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"43885453f72e95d24288078b25167e7feea47d9eb793bcc6cec05d651e9e3a60"} Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.152006 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365"} Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.152096 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e560c9160844b13d8ead8321064ee0e5039fcbbce4f915c1bef5d5760e7e9456"} Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.154946 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c5a907d75340e00966e426f732ba16d9db88ace1672fdf7eec713816ee7606cc"} Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.167854 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.185427 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.196495 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.206849 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.218673 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.229206 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.237514 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.247277 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.247358 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.247370 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.247388 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.247400 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:19Z","lastTransitionTime":"2026-03-14T07:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.248285 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.252629 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-866b9"] Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.253008 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-lf9lh"] Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.253180 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-866b9" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.253391 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.255044 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.255082 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.256357 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.256397 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.256531 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.256579 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.256897 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.256896 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.261585 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.276164 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.286288 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.296401 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.307152 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.322156 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.333266 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.344081 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/58bd6165-e663-4c4e-82ae-6009ff348000-rootfs\") pod \"machine-config-daemon-lf9lh\" 
(UID: \"58bd6165-e663-4c4e-82ae-6009ff348000\") " pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.344171 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/58bd6165-e663-4c4e-82ae-6009ff348000-mcd-auth-proxy-config\") pod \"machine-config-daemon-lf9lh\" (UID: \"58bd6165-e663-4c4e-82ae-6009ff348000\") " pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.344224 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsfvx\" (UniqueName: \"kubernetes.io/projected/25b72bf7-03b0-43ea-be16-8b484c6e018a-kube-api-access-rsfvx\") pod \"node-resolver-866b9\" (UID: \"25b72bf7-03b0-43ea-be16-8b484c6e018a\") " pod="openshift-dns/node-resolver-866b9" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.344252 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58bd6165-e663-4c4e-82ae-6009ff348000-proxy-tls\") pod \"machine-config-daemon-lf9lh\" (UID: \"58bd6165-e663-4c4e-82ae-6009ff348000\") " pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.344305 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94gxc\" (UniqueName: \"kubernetes.io/projected/58bd6165-e663-4c4e-82ae-6009ff348000-kube-api-access-94gxc\") pod \"machine-config-daemon-lf9lh\" (UID: \"58bd6165-e663-4c4e-82ae-6009ff348000\") " pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.344399 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"hosts-file\" (UniqueName: \"kubernetes.io/host-path/25b72bf7-03b0-43ea-be16-8b484c6e018a-hosts-file\") pod \"node-resolver-866b9\" (UID: \"25b72bf7-03b0-43ea-be16-8b484c6e018a\") " pod="openshift-dns/node-resolver-866b9" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.346422 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.350256 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.350286 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.350311 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.350330 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.350342 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:19Z","lastTransitionTime":"2026-03-14T07:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.354838 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.363742 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.374682 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.383792 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.445618 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.445703 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/25b72bf7-03b0-43ea-be16-8b484c6e018a-hosts-file\") pod 
\"node-resolver-866b9\" (UID: \"25b72bf7-03b0-43ea-be16-8b484c6e018a\") " pod="openshift-dns/node-resolver-866b9" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.445729 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.445746 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/58bd6165-e663-4c4e-82ae-6009ff348000-rootfs\") pod \"machine-config-daemon-lf9lh\" (UID: \"58bd6165-e663-4c4e-82ae-6009ff348000\") " pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.445782 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/58bd6165-e663-4c4e-82ae-6009ff348000-rootfs\") pod \"machine-config-daemon-lf9lh\" (UID: \"58bd6165-e663-4c4e-82ae-6009ff348000\") " pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:00:19 crc kubenswrapper[5129]: E0314 07:00:19.445798 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:21.445773497 +0000 UTC m=+84.197688681 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.445918 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/58bd6165-e663-4c4e-82ae-6009ff348000-mcd-auth-proxy-config\") pod \"machine-config-daemon-lf9lh\" (UID: \"58bd6165-e663-4c4e-82ae-6009ff348000\") " pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.445940 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsfvx\" (UniqueName: \"kubernetes.io/projected/25b72bf7-03b0-43ea-be16-8b484c6e018a-kube-api-access-rsfvx\") pod \"node-resolver-866b9\" (UID: \"25b72bf7-03b0-43ea-be16-8b484c6e018a\") " pod="openshift-dns/node-resolver-866b9" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.445960 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.445932 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/25b72bf7-03b0-43ea-be16-8b484c6e018a-hosts-file\") pod \"node-resolver-866b9\" (UID: 
\"25b72bf7-03b0-43ea-be16-8b484c6e018a\") " pod="openshift-dns/node-resolver-866b9" Mar 14 07:00:19 crc kubenswrapper[5129]: E0314 07:00:19.446003 5129 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:19 crc kubenswrapper[5129]: E0314 07:00:19.446141 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:21.446110138 +0000 UTC m=+84.198025362 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.445978 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58bd6165-e663-4c4e-82ae-6009ff348000-proxy-tls\") pod \"machine-config-daemon-lf9lh\" (UID: \"58bd6165-e663-4c4e-82ae-6009ff348000\") " pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:00:19 crc kubenswrapper[5129]: E0314 07:00:19.446195 5129 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.446223 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94gxc\" (UniqueName: \"kubernetes.io/projected/58bd6165-e663-4c4e-82ae-6009ff348000-kube-api-access-94gxc\") pod \"machine-config-daemon-lf9lh\" (UID: 
\"58bd6165-e663-4c4e-82ae-6009ff348000\") " pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:00:19 crc kubenswrapper[5129]: E0314 07:00:19.446335 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:21.446277913 +0000 UTC m=+84.198193167 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.446688 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/58bd6165-e663-4c4e-82ae-6009ff348000-mcd-auth-proxy-config\") pod \"machine-config-daemon-lf9lh\" (UID: \"58bd6165-e663-4c4e-82ae-6009ff348000\") " pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.452220 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58bd6165-e663-4c4e-82ae-6009ff348000-proxy-tls\") pod \"machine-config-daemon-lf9lh\" (UID: \"58bd6165-e663-4c4e-82ae-6009ff348000\") " pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.452536 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.452570 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.452580 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.452591 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.452618 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:19Z","lastTransitionTime":"2026-03-14T07:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.465141 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsfvx\" (UniqueName: \"kubernetes.io/projected/25b72bf7-03b0-43ea-be16-8b484c6e018a-kube-api-access-rsfvx\") pod \"node-resolver-866b9\" (UID: \"25b72bf7-03b0-43ea-be16-8b484c6e018a\") " pod="openshift-dns/node-resolver-866b9" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.465632 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94gxc\" (UniqueName: \"kubernetes.io/projected/58bd6165-e663-4c4e-82ae-6009ff348000-kube-api-access-94gxc\") pod \"machine-config-daemon-lf9lh\" (UID: \"58bd6165-e663-4c4e-82ae-6009ff348000\") " pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.547520 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.547568 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:19 crc kubenswrapper[5129]: E0314 07:00:19.547745 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:19 crc kubenswrapper[5129]: E0314 07:00:19.547766 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:19 crc kubenswrapper[5129]: E0314 07:00:19.547780 5129 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:19 crc kubenswrapper[5129]: E0314 07:00:19.547782 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:19 crc kubenswrapper[5129]: E0314 07:00:19.547837 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:21.547821703 +0000 UTC m=+84.299736897 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:19 crc kubenswrapper[5129]: E0314 07:00:19.547843 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:19 crc kubenswrapper[5129]: E0314 07:00:19.547870 5129 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:19 crc kubenswrapper[5129]: E0314 07:00:19.547950 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:21.547921636 +0000 UTC m=+84.299836860 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.554595 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.554645 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.554656 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.554672 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.554695 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:19Z","lastTransitionTime":"2026-03-14T07:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.569025 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-866b9" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.573381 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.614934 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-r4btb"] Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.615307 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.615556 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-h6665"] Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.616737 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7hfdh"] Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.616895 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.617961 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.618901 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.619190 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.619307 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.619350 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.619350 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.619504 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.619657 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.624352 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.624439 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.624439 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.624686 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.624958 5129 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.625066 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.625186 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.635850 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\"
:\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.647957 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-cni-binary-copy\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.647990 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-etc-kubernetes\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648012 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-slash\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648037 5129 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-ovnkube-script-lib\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648087 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-system-cni-dir\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648116 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-host-run-netns\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648154 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-run-ovn-kubernetes\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648179 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-run-ovn\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648213 5129 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-host-run-multus-certs\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648241 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/138e84c2-72d7-4e5b-9949-879dc02d95ce-cnibin\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648260 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-multus-conf-dir\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648279 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-run-netns\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648310 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/138e84c2-72d7-4e5b-9949-879dc02d95ce-tuning-conf-dir\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc 
kubenswrapper[5129]: I0314 07:00:19.648328 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-cnibin\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648346 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-os-release\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648364 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-host-var-lib-cni-bin\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648386 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648407 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-multus-daemon-config\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc 
kubenswrapper[5129]: I0314 07:00:19.648427 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g42rk\" (UniqueName: \"kubernetes.io/projected/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-kube-api-access-g42rk\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648473 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-log-socket\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648495 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-host-run-k8s-cni-cncf-io\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648515 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-host-var-lib-kubelet\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648534 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-run-systemd\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc 
kubenswrapper[5129]: I0314 07:00:19.648554 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-var-lib-openvswitch\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648578 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/138e84c2-72d7-4e5b-9949-879dc02d95ce-cni-binary-copy\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648598 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-multus-cni-dir\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648647 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-host-var-lib-cni-multus\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648668 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-etc-openvswitch\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648686 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-run-openvswitch\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648705 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-ovnkube-config\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648862 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-env-overrides\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.648953 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/138e84c2-72d7-4e5b-9949-879dc02d95ce-os-release\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.649006 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/138e84c2-72d7-4e5b-9949-879dc02d95ce-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.649040 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-hostroot\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.649071 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-systemd-units\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.649111 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/138e84c2-72d7-4e5b-9949-879dc02d95ce-system-cni-dir\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.649134 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-multus-socket-dir-parent\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.649151 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-cni-netd\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.649175 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-ovn-node-metrics-cert\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.649211 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s56xc\" (UniqueName: \"kubernetes.io/projected/138e84c2-72d7-4e5b-9949-879dc02d95ce-kube-api-access-s56xc\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.649234 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-node-log\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.649252 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-cni-bin\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.649275 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-hssn4\" (UniqueName: \"kubernetes.io/projected/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-kube-api-access-hssn4\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.649294 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-kubelet\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.649996 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"app
rover\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc 
kubenswrapper[5129]: I0314 07:00:19.657284 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.657440 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.657460 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.657486 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.657503 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:19Z","lastTransitionTime":"2026-03-14T07:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.660984 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.688928 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.717017 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.728282 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.741591 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.749641 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-systemd-units\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.749671 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/138e84c2-72d7-4e5b-9949-879dc02d95ce-os-release\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.749688 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/138e84c2-72d7-4e5b-9949-879dc02d95ce-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.749704 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-hostroot\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.749720 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/138e84c2-72d7-4e5b-9949-879dc02d95ce-system-cni-dir\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.749733 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-multus-socket-dir-parent\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.749747 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-cni-netd\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.749762 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-ovn-node-metrics-cert\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.749777 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hssn4\" (UniqueName: \"kubernetes.io/projected/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-kube-api-access-hssn4\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.749785 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/138e84c2-72d7-4e5b-9949-879dc02d95ce-os-release\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.749792 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s56xc\" (UniqueName: \"kubernetes.io/projected/138e84c2-72d7-4e5b-9949-879dc02d95ce-kube-api-access-s56xc\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.749843 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-node-log\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.749865 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-cni-bin\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.749886 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-kubelet\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.749918 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-cni-binary-copy\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.749940 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-etc-kubernetes\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.749960 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-slash\") pod \"ovnkube-node-7hfdh\" (UID: 
\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.749980 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-ovnkube-script-lib\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750010 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-system-cni-dir\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750031 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-host-run-netns\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750055 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-run-ovn-kubernetes\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750076 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-run-ovn\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 
07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750105 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-host-run-multus-certs\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750138 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/138e84c2-72d7-4e5b-9949-879dc02d95ce-cnibin\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750158 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-multus-conf-dir\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750158 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-system-cni-dir\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750178 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-run-netns\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750186 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-kubelet\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750160 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-etc-kubernetes\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750205 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-host-var-lib-cni-bin\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750209 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-node-log\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750257 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-host-run-multus-certs\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750248 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-slash\") pod \"ovnkube-node-7hfdh\" (UID: 
\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750298 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-run-ovn-kubernetes\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750282 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-host-run-netns\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750258 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/138e84c2-72d7-4e5b-9949-879dc02d95ce-tuning-conf-dir\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750331 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-run-ovn\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750363 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-multus-socket-dir-parent\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 
14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750363 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-cnibin\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750386 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-hostroot\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750398 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-os-release\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750192 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-systemd-units\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750437 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750478 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-host-run-k8s-cni-cncf-io\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750511 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-multus-daemon-config\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750542 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g42rk\" (UniqueName: \"kubernetes.io/projected/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-kube-api-access-g42rk\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750571 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-log-socket\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750634 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-host-var-lib-kubelet\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750652 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/138e84c2-72d7-4e5b-9949-879dc02d95ce-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750666 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-run-systemd\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750698 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-var-lib-openvswitch\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750742 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-ovnkube-config\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750777 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/138e84c2-72d7-4e5b-9949-879dc02d95ce-cni-binary-copy\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750807 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-multus-cni-dir\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750810 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/138e84c2-72d7-4e5b-9949-879dc02d95ce-tuning-conf-dir\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750836 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-host-var-lib-cni-multus\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750408 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/138e84c2-72d7-4e5b-9949-879dc02d95ce-system-cni-dir\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750867 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-etc-openvswitch\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750879 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-multus-conf-dir\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750892 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-cni-binary-copy\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750899 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-run-openvswitch\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750911 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-run-netns\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750915 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-cnibin\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750909 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-ovnkube-script-lib\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750947 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-os-release\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750970 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-host-run-k8s-cni-cncf-io\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750975 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-cni-netd\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750945 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-host-var-lib-cni-bin\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750160 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-cni-bin\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750938 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.751007 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-etc-openvswitch\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750855 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/138e84c2-72d7-4e5b-9949-879dc02d95ce-cnibin\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.751035 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-host-var-lib-cni-multus\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.751090 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-run-openvswitch\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.750928 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-env-overrides\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.751115 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-host-var-lib-kubelet\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.751090 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-multus-cni-dir\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.751161 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-run-systemd\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.751193 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-var-lib-openvswitch\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.751221 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-log-socket\") pod \"ovnkube-node-7hfdh\" (UID: 
\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.751290 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-env-overrides\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.751299 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-multus-daemon-config\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.751406 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-ovnkube-config\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.751938 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/138e84c2-72d7-4e5b-9949-879dc02d95ce-cni-binary-copy\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.753338 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.759753 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-ovn-node-metrics-cert\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.759864 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.759891 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.759901 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.759915 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.759924 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:19Z","lastTransitionTime":"2026-03-14T07:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.763631 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s56xc\" (UniqueName: \"kubernetes.io/projected/138e84c2-72d7-4e5b-9949-879dc02d95ce-kube-api-access-s56xc\") pod \"multus-additional-cni-plugins-h6665\" (UID: \"138e84c2-72d7-4e5b-9949-879dc02d95ce\") " pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.767256 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g42rk\" (UniqueName: \"kubernetes.io/projected/e37bb55b-4ace-4d62-9711-88d8a1bb8cd8-kube-api-access-g42rk\") pod \"multus-r4btb\" (UID: \"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\") " pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.768210 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.769437 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hssn4\" (UniqueName: \"kubernetes.io/projected/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-kube-api-access-hssn4\") pod \"ovnkube-node-7hfdh\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.777670 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.787665 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.798266 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.802523 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-r4btb" Mar 14 07:00:19 crc kubenswrapper[5129]: W0314 07:00:19.814238 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode37bb55b_4ace_4d62_9711_88d8a1bb8cd8.slice/crio-3bcc14858ea9d690055654109d37b57deb2a5df01f87e1659102e5c7b3336c72 WatchSource:0}: Error finding container 3bcc14858ea9d690055654109d37b57deb2a5df01f87e1659102e5c7b3336c72: Status 404 returned error can't find the container with id 3bcc14858ea9d690055654109d37b57deb2a5df01f87e1659102e5c7b3336c72 Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.814349 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.814472 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-h6665" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.820425 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.838177 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.854060 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\
"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.861564 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.862094 5129 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.862110 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.862123 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.862140 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:19Z","lastTransitionTime":"2026-03-14T07:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.873728 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.902659 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.918301 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.939476 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.956020 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.965631 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.965668 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.965679 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.965696 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:19 crc kubenswrapper[5129]: I0314 07:00:19.965705 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:19Z","lastTransitionTime":"2026-03-14T07:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.068099 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.068138 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.068149 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.068166 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.068177 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:20Z","lastTransitionTime":"2026-03-14T07:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.159407 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.159460 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.159473 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"34c9416de16b62e492e842fd9c1980442a90c0cff656161833243e30b9f292cd"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.160715 5129 generic.go:334] "Generic (PLEG): container finished" podID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerID="3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b" exitCode=0 Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.160789 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerDied","Data":"3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.160850 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerStarted","Data":"4c9f9bb5f86d206984cc6834ec5a637cd228953370d0b89a6cb62d56cf3cb0ca"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 
07:00:20.161982 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" event={"ID":"138e84c2-72d7-4e5b-9949-879dc02d95ce","Type":"ContainerStarted","Data":"0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.162014 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" event={"ID":"138e84c2-72d7-4e5b-9949-879dc02d95ce","Type":"ContainerStarted","Data":"b772565f482eebfe50dd6b622fb091aa6bca2bb1bd68a6f0cda83a12c2508950"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.162863 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-866b9" event={"ID":"25b72bf7-03b0-43ea-be16-8b484c6e018a","Type":"ContainerStarted","Data":"9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.162931 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-866b9" event={"ID":"25b72bf7-03b0-43ea-be16-8b484c6e018a","Type":"ContainerStarted","Data":"85e06428a9d1156955750fd83ac205d4fce264cf74c3688c69cc480ca0a8ed8f"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.163868 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r4btb" event={"ID":"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8","Type":"ContainerStarted","Data":"b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.163906 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r4btb" event={"ID":"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8","Type":"ContainerStarted","Data":"3bcc14858ea9d690055654109d37b57deb2a5df01f87e1659102e5c7b3336c72"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.170184 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.170223 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.170233 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.170265 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.170277 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:20Z","lastTransitionTime":"2026-03-14T07:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.174517 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.189301 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.203673 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.218476 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.237884 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.251127 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.265557 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.272068 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.272100 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.272110 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.272128 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.272139 5129 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:20Z","lastTransitionTime":"2026-03-14T07:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.284270 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.302655 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d20
84179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.315660 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.328165 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.344266 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.360071 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.374466 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.374499 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.374507 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.374519 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.374536 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:20Z","lastTransitionTime":"2026-03-14T07:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.376491 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.391934 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.411754 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.424343 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.441740 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.456577 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.470212 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.476894 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.476965 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.476978 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.477006 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.477020 5129 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:20Z","lastTransitionTime":"2026-03-14T07:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.481016 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.494935 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d20
84179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.579822 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.579884 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.579895 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:20 crc 
kubenswrapper[5129]: I0314 07:00:20.579916 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.579928 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:20Z","lastTransitionTime":"2026-03-14T07:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.681690 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.681735 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.681747 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.681763 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.681774 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:20Z","lastTransitionTime":"2026-03-14T07:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.784114 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.784145 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.784153 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.784167 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.784177 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:20Z","lastTransitionTime":"2026-03-14T07:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.886991 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.887269 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.887278 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.887292 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.887304 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:20Z","lastTransitionTime":"2026-03-14T07:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.989633 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.989675 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.989688 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.989703 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:20 crc kubenswrapper[5129]: I0314 07:00:20.989715 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:20Z","lastTransitionTime":"2026-03-14T07:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.035877 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.035934 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.035986 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:21 crc kubenswrapper[5129]: E0314 07:00:21.036133 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:21 crc kubenswrapper[5129]: E0314 07:00:21.036498 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:21 crc kubenswrapper[5129]: E0314 07:00:21.036632 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.049352 5129 scope.go:117] "RemoveContainer" containerID="846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee" Mar 14 07:00:21 crc kubenswrapper[5129]: E0314 07:00:21.049752 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.051351 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.092349 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.092382 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.092390 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.092406 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.092416 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:21Z","lastTransitionTime":"2026-03-14T07:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.170456 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerStarted","Data":"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48"} Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.170506 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerStarted","Data":"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97"} Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.170520 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerStarted","Data":"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026"} Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.170535 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerStarted","Data":"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d"} Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.170547 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerStarted","Data":"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743"} Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.171910 5129 generic.go:334] "Generic (PLEG): container finished" podID="138e84c2-72d7-4e5b-9949-879dc02d95ce" containerID="0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee" exitCode=0 Mar 14 07:00:21 crc 
kubenswrapper[5129]: I0314 07:00:21.171976 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" event={"ID":"138e84c2-72d7-4e5b-9949-879dc02d95ce","Type":"ContainerDied","Data":"0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee"} Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.173752 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953"} Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.175799 5129 scope.go:117] "RemoveContainer" containerID="846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee" Mar 14 07:00:21 crc kubenswrapper[5129]: E0314 07:00:21.175975 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.184411 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.195891 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:21 crc 
kubenswrapper[5129]: I0314 07:00:21.195929 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.195940 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.195957 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.195985 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:21Z","lastTransitionTime":"2026-03-14T07:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.203716 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.222170 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba
4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.236011 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.248964 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.262183 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.273838 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.286459 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.298725 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.298768 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.298780 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.298796 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.298812 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:21Z","lastTransitionTime":"2026-03-14T07:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.299974 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.315785 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.325079 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.335314 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.344932 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.357318 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.378569 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.396589 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.401905 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.401940 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.401952 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.401969 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.401980 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:21Z","lastTransitionTime":"2026-03-14T07:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.417883 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.430436 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.442054 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T0
7:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.457192 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.465326 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.465437 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.465462 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:21 crc 
kubenswrapper[5129]: E0314 07:00:21.465558 5129 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:21 crc kubenswrapper[5129]: E0314 07:00:21.465617 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:25.465593672 +0000 UTC m=+88.217508856 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:21 crc kubenswrapper[5129]: E0314 07:00:21.465891 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:25.465883802 +0000 UTC m=+88.217798976 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:21 crc kubenswrapper[5129]: E0314 07:00:21.465927 5129 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:21 crc kubenswrapper[5129]: E0314 07:00:21.465948 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:25.465943324 +0000 UTC m=+88.217858508 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.469365 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.481202 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.485973 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-lknrr"] Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.486316 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lknrr" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.488185 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.488336 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.488337 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.488396 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.500977 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.504788 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.504818 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.504829 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.504843 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.504853 5129 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:21Z","lastTransitionTime":"2026-03-14T07:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.515053 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.534593 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.547659 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.558555 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.566736 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngnq4\" (UniqueName: \"kubernetes.io/projected/1202ae8e-98e0-4dc5-99aa-680871888bd6-kube-api-access-ngnq4\") pod \"node-ca-lknrr\" (UID: \"1202ae8e-98e0-4dc5-99aa-680871888bd6\") " pod="openshift-image-registry/node-ca-lknrr" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.566787 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.566815 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/1202ae8e-98e0-4dc5-99aa-680871888bd6-serviceca\") pod \"node-ca-lknrr\" (UID: \"1202ae8e-98e0-4dc5-99aa-680871888bd6\") " pod="openshift-image-registry/node-ca-lknrr" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.566929 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.566992 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1202ae8e-98e0-4dc5-99aa-680871888bd6-host\") pod \"node-ca-lknrr\" (UID: \"1202ae8e-98e0-4dc5-99aa-680871888bd6\") " pod="openshift-image-registry/node-ca-lknrr" Mar 14 07:00:21 crc kubenswrapper[5129]: E0314 07:00:21.566943 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:21 crc kubenswrapper[5129]: E0314 07:00:21.567034 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:21 crc kubenswrapper[5129]: E0314 07:00:21.567046 5129 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:21 crc kubenswrapper[5129]: E0314 07:00:21.567086 5129 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:25.567073501 +0000 UTC m=+88.318988685 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:21 crc kubenswrapper[5129]: E0314 07:00:21.567009 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:21 crc kubenswrapper[5129]: E0314 07:00:21.567182 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:21 crc kubenswrapper[5129]: E0314 07:00:21.567202 5129 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:21 crc kubenswrapper[5129]: E0314 07:00:21.567269 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:25.567252027 +0000 UTC m=+88.319167211 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.570076 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPat
h\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.580380 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.591731 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.603114 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.606566 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.606620 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.606632 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.606648 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.606660 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:21Z","lastTransitionTime":"2026-03-14T07:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.617884 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.627713 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.636998 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.667867 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngnq4\" (UniqueName: \"kubernetes.io/projected/1202ae8e-98e0-4dc5-99aa-680871888bd6-kube-api-access-ngnq4\") pod \"node-ca-lknrr\" (UID: \"1202ae8e-98e0-4dc5-99aa-680871888bd6\") " pod="openshift-image-registry/node-ca-lknrr" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.667910 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1202ae8e-98e0-4dc5-99aa-680871888bd6-serviceca\") pod \"node-ca-lknrr\" (UID: \"1202ae8e-98e0-4dc5-99aa-680871888bd6\") " pod="openshift-image-registry/node-ca-lknrr" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.667963 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1202ae8e-98e0-4dc5-99aa-680871888bd6-host\") pod \"node-ca-lknrr\" (UID: \"1202ae8e-98e0-4dc5-99aa-680871888bd6\") " pod="openshift-image-registry/node-ca-lknrr" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.668016 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/1202ae8e-98e0-4dc5-99aa-680871888bd6-host\") pod \"node-ca-lknrr\" (UID: \"1202ae8e-98e0-4dc5-99aa-680871888bd6\") " pod="openshift-image-registry/node-ca-lknrr" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.670306 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1202ae8e-98e0-4dc5-99aa-680871888bd6-serviceca\") pod \"node-ca-lknrr\" (UID: \"1202ae8e-98e0-4dc5-99aa-680871888bd6\") " pod="openshift-image-registry/node-ca-lknrr" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.677870 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.708355 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.708390 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.708399 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.708414 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.708423 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:21Z","lastTransitionTime":"2026-03-14T07:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.711491 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngnq4\" (UniqueName: \"kubernetes.io/projected/1202ae8e-98e0-4dc5-99aa-680871888bd6-kube-api-access-ngnq4\") pod \"node-ca-lknrr\" (UID: \"1202ae8e-98e0-4dc5-99aa-680871888bd6\") " pod="openshift-image-registry/node-ca-lknrr" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.739694 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"
startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.778917 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:21Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.797695 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lknrr" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.810171 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.810215 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.810224 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.810241 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.810254 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:21Z","lastTransitionTime":"2026-03-14T07:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:21 crc kubenswrapper[5129]: W0314 07:00:21.859755 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1202ae8e_98e0_4dc5_99aa_680871888bd6.slice/crio-dc0177137209f5061cae4f5798ccabb2797c278b61738b21d0da4ac11796287a WatchSource:0}: Error finding container dc0177137209f5061cae4f5798ccabb2797c278b61738b21d0da4ac11796287a: Status 404 returned error can't find the container with id dc0177137209f5061cae4f5798ccabb2797c278b61738b21d0da4ac11796287a Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.914003 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.914308 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.914319 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.914333 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:21 crc kubenswrapper[5129]: I0314 07:00:21.914343 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:21Z","lastTransitionTime":"2026-03-14T07:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.016971 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.017010 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.017019 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.017033 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.017042 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:22Z","lastTransitionTime":"2026-03-14T07:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.118968 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.118999 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.119007 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.119019 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.119029 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:22Z","lastTransitionTime":"2026-03-14T07:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.179525 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerStarted","Data":"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2"} Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.181294 5129 generic.go:334] "Generic (PLEG): container finished" podID="138e84c2-72d7-4e5b-9949-879dc02d95ce" containerID="e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee" exitCode=0 Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.181380 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" event={"ID":"138e84c2-72d7-4e5b-9949-879dc02d95ce","Type":"ContainerDied","Data":"e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee"} Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.182784 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lknrr" event={"ID":"1202ae8e-98e0-4dc5-99aa-680871888bd6","Type":"ContainerStarted","Data":"80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882"} Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.182815 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lknrr" event={"ID":"1202ae8e-98e0-4dc5-99aa-680871888bd6","Type":"ContainerStarted","Data":"dc0177137209f5061cae4f5798ccabb2797c278b61738b21d0da4ac11796287a"} Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.196844 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.208119 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.220182 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.222400 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.222440 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.222451 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.222467 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.222478 5129 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:22Z","lastTransitionTime":"2026-03-14T07:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.231401 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.249585 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.267990 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.281438 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.293963 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.306769 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.321690 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.324664 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.324697 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.324709 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.324726 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.324737 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:22Z","lastTransitionTime":"2026-03-14T07:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.333234 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.343730 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.353722 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d20
84179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.365568 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.382325 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.420428 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.427218 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.427278 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.427290 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.427307 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.427318 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:22Z","lastTransitionTime":"2026-03-14T07:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.457675 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.499889 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.530423 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.530453 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.530462 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.530475 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.530483 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:22Z","lastTransitionTime":"2026-03-14T07:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.538268 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.579199 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.619477 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.633526 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.633594 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.633675 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.633701 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.633801 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:22Z","lastTransitionTime":"2026-03-14T07:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.660687 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.696564 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.735711 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.735752 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.735762 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.735776 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.735787 5129 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:22Z","lastTransitionTime":"2026-03-14T07:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.740012 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.782875 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.818630 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:22Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.838157 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.838192 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.838200 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.838212 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.838221 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:22Z","lastTransitionTime":"2026-03-14T07:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.940247 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.940285 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.940295 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.940313 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:22 crc kubenswrapper[5129]: I0314 07:00:22.940333 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:22Z","lastTransitionTime":"2026-03-14T07:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.036274 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.036293 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.036293 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:23 crc kubenswrapper[5129]: E0314 07:00:23.036381 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:23 crc kubenswrapper[5129]: E0314 07:00:23.036472 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:23 crc kubenswrapper[5129]: E0314 07:00:23.036541 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.042469 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.042498 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.042508 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.042524 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.042534 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:23Z","lastTransitionTime":"2026-03-14T07:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.144498 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.144848 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.144862 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.144879 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.144887 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:23Z","lastTransitionTime":"2026-03-14T07:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.186353 5129 generic.go:334] "Generic (PLEG): container finished" podID="138e84c2-72d7-4e5b-9949-879dc02d95ce" containerID="04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d" exitCode=0 Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.186420 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" event={"ID":"138e84c2-72d7-4e5b-9949-879dc02d95ce","Type":"ContainerDied","Data":"04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d"} Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.205572 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:23Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.224588 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:23Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.239653 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:23Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.246453 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.246475 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.246483 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.246495 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.246504 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:23Z","lastTransitionTime":"2026-03-14T07:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.251795 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:23Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.263330 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:23Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.274428 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d20
84179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:23Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.288865 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:23Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.301897 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:23Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.317379 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:23Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.331669 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:23Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.343841 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:23Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.348663 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.348708 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.348721 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.348742 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.348753 5129 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:23Z","lastTransitionTime":"2026-03-14T07:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.353193 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:23Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.371455 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:23Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.450776 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.450815 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.450825 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.450840 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.450849 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:23Z","lastTransitionTime":"2026-03-14T07:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.557692 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.557771 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.557797 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.557831 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.557868 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:23Z","lastTransitionTime":"2026-03-14T07:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.661189 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.661272 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.661287 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.661304 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.661314 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:23Z","lastTransitionTime":"2026-03-14T07:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.763275 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.763316 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.763328 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.763346 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.763358 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:23Z","lastTransitionTime":"2026-03-14T07:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.866787 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.866854 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.866927 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.866957 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.866978 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:23Z","lastTransitionTime":"2026-03-14T07:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.969807 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.969841 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.969850 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.969864 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:23 crc kubenswrapper[5129]: I0314 07:00:23.969874 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:23Z","lastTransitionTime":"2026-03-14T07:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.073392 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.073446 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.073457 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.073474 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.073486 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:24Z","lastTransitionTime":"2026-03-14T07:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.176286 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.176598 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.176778 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.176916 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.177124 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:24Z","lastTransitionTime":"2026-03-14T07:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.192249 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerStarted","Data":"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7"} Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.195078 5129 generic.go:334] "Generic (PLEG): container finished" podID="138e84c2-72d7-4e5b-9949-879dc02d95ce" containerID="aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56" exitCode=0 Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.195131 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" event={"ID":"138e84c2-72d7-4e5b-9949-879dc02d95ce","Type":"ContainerDied","Data":"aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56"} Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.206768 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T07:00:24Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.221369 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:24Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.232381 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d
007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:24Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.248374 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:24Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.259054 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:24Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.273262 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:24Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.279516 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.279546 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.279557 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.279991 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.280019 5129 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:24Z","lastTransitionTime":"2026-03-14T07:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.285250 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:24Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.304399 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:24Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.317437 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:24Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.328947 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:24Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.339354 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:24Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.349471 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:24Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.362948 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:24Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.382563 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.382626 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.382637 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.382652 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.382663 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:24Z","lastTransitionTime":"2026-03-14T07:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.485789 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.485822 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.485831 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.485863 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.485874 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:24Z","lastTransitionTime":"2026-03-14T07:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.589816 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.589873 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.589894 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.589921 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.589942 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:24Z","lastTransitionTime":"2026-03-14T07:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.692337 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.692387 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.692403 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.692425 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.692449 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:24Z","lastTransitionTime":"2026-03-14T07:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.795405 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.795442 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.795450 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.795473 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.795482 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:24Z","lastTransitionTime":"2026-03-14T07:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.897666 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.897702 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.897711 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.897725 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:24 crc kubenswrapper[5129]: I0314 07:00:24.897735 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:24Z","lastTransitionTime":"2026-03-14T07:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.000204 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.000239 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.000250 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.000271 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.000282 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:25Z","lastTransitionTime":"2026-03-14T07:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.036317 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.036408 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:25 crc kubenswrapper[5129]: E0314 07:00:25.036493 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:25 crc kubenswrapper[5129]: E0314 07:00:25.036720 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.036337 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:25 crc kubenswrapper[5129]: E0314 07:00:25.036883 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.102996 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.103244 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.103354 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.103445 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.103529 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:25Z","lastTransitionTime":"2026-03-14T07:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.202904 5129 generic.go:334] "Generic (PLEG): container finished" podID="138e84c2-72d7-4e5b-9949-879dc02d95ce" containerID="80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7" exitCode=0 Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.202967 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" event={"ID":"138e84c2-72d7-4e5b-9949-879dc02d95ce","Type":"ContainerDied","Data":"80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7"} Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.205661 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.205700 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.205712 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.205729 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.205741 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:25Z","lastTransitionTime":"2026-03-14T07:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.234226 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:25Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.251748 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:25Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.270196 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:25Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.283818 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:25Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.300718 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:25Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.308267 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.308295 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.308306 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.308323 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.308334 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:25Z","lastTransitionTime":"2026-03-14T07:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.321565 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:25Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.339480 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:25Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.361571 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:25Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.379068 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T07:00:25Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.388555 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:25Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.398845 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d
007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:25Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.411075 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:25Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.412250 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.412292 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.412304 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.412320 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.412334 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:25Z","lastTransitionTime":"2026-03-14T07:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.421005 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:25Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.504002 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:25 crc kubenswrapper[5129]: E0314 07:00:25.504178 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:33.504145881 +0000 UTC m=+96.256061075 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.504379 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.504413 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:25 crc kubenswrapper[5129]: E0314 07:00:25.504638 5129 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:25 crc kubenswrapper[5129]: E0314 07:00:25.504699 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:33.504687928 +0000 UTC m=+96.256603122 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:25 crc kubenswrapper[5129]: E0314 07:00:25.504743 5129 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:25 crc kubenswrapper[5129]: E0314 07:00:25.504836 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:33.504814261 +0000 UTC m=+96.256729455 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.514197 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.514239 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.514253 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.514272 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:25 crc 
kubenswrapper[5129]: I0314 07:00:25.514286 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:25Z","lastTransitionTime":"2026-03-14T07:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.605465 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.605541 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:25 crc kubenswrapper[5129]: E0314 07:00:25.605672 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:25 crc kubenswrapper[5129]: E0314 07:00:25.605686 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:25 crc kubenswrapper[5129]: E0314 07:00:25.605697 5129 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:25 crc kubenswrapper[5129]: E0314 07:00:25.605693 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:25 crc kubenswrapper[5129]: E0314 07:00:25.605735 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:33.605723212 +0000 UTC m=+96.357638396 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:25 crc kubenswrapper[5129]: E0314 07:00:25.605740 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:25 crc kubenswrapper[5129]: E0314 07:00:25.605759 5129 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:25 crc kubenswrapper[5129]: E0314 07:00:25.605822 5129 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:33.605803455 +0000 UTC m=+96.357718649 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.616093 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.616116 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.616125 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.616137 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.616147 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:25Z","lastTransitionTime":"2026-03-14T07:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.719040 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.719118 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.719139 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.719159 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.719821 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:25Z","lastTransitionTime":"2026-03-14T07:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.822213 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.822421 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.822429 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.822442 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.822452 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:25Z","lastTransitionTime":"2026-03-14T07:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.924642 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.924678 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.924687 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.924702 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:25 crc kubenswrapper[5129]: I0314 07:00:25.924711 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:25Z","lastTransitionTime":"2026-03-14T07:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.026288 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.026321 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.026329 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.026342 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.026350 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:26Z","lastTransitionTime":"2026-03-14T07:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.128089 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.128139 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.128155 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.128177 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.128192 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:26Z","lastTransitionTime":"2026-03-14T07:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.209393 5129 generic.go:334] "Generic (PLEG): container finished" podID="138e84c2-72d7-4e5b-9949-879dc02d95ce" containerID="cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c" exitCode=0 Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.209454 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" event={"ID":"138e84c2-72d7-4e5b-9949-879dc02d95ce","Type":"ContainerDied","Data":"cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c"} Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.230242 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.230279 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.230290 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.230306 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.230319 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:26Z","lastTransitionTime":"2026-03-14T07:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.230906 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerStarted","Data":"a17aedcaad4c7fe9af8aaf10beade38255efe19e1139c2a4c02e57779342f667"} Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.231164 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.241641 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.266464 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.278119 5129 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.284749 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d664
38c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.290755 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.295632 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e
12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\
"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.303531 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b
235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.314952 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.326263 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.332456 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.332478 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.332485 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.332499 5129 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.332509 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:26Z","lastTransitionTime":"2026-03-14T07:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.336174 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.355785 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.375758 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.388377 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.399289 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.409729 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.423980 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.432191 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.436900 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.436936 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.436945 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.436966 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.436986 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:26Z","lastTransitionTime":"2026-03-14T07:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.447167 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.461893 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.473243 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.492695 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17aedcaad4c7fe9af8aaf10beade38255efe19e1139c2a4c02e57779342f667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.514098 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.526511 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.540125 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.540389 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.540401 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.540421 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.540434 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:26Z","lastTransitionTime":"2026-03-14T07:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.541592 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.560118 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae
5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.572365 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.584405 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.596659 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:26Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.641912 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.641944 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.641955 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.641971 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.641982 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:26Z","lastTransitionTime":"2026-03-14T07:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.743841 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.743873 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.743881 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.743895 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.743905 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:26Z","lastTransitionTime":"2026-03-14T07:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.846570 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.846633 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.846647 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.846664 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.846676 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:26Z","lastTransitionTime":"2026-03-14T07:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.948670 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.948732 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.948755 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.948790 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:26 crc kubenswrapper[5129]: I0314 07:00:26.948811 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:26Z","lastTransitionTime":"2026-03-14T07:00:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.036026 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.036040 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:27 crc kubenswrapper[5129]: E0314 07:00:27.036288 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:27 crc kubenswrapper[5129]: E0314 07:00:27.036147 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.036387 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:27 crc kubenswrapper[5129]: E0314 07:00:27.036452 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.051520 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.051550 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.051558 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.051572 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.051581 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:27Z","lastTransitionTime":"2026-03-14T07:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.154052 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.154084 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.154094 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.154111 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.154120 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:27Z","lastTransitionTime":"2026-03-14T07:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.236632 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" event={"ID":"138e84c2-72d7-4e5b-9949-879dc02d95ce","Type":"ContainerStarted","Data":"82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba"} Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.237270 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.237302 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.250681 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.256406 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.256446 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.256457 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.256476 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.256488 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:27Z","lastTransitionTime":"2026-03-14T07:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.258480 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.262736 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.278170 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.289416 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.299300 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.316553 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17aedcaad4c7fe9af8aaf10beade38255efe19e1139c2a4c02e57779342f667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.328264 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.340376 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.353287 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.359082 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.359108 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.359116 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:27 crc 
kubenswrapper[5129]: I0314 07:00:27.359128 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.359137 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:27Z","lastTransitionTime":"2026-03-14T07:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.369425 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc
2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.378546 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.388567 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.400311 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.409885 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"ipta
bles-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.423086 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.437375 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d20
84179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.449106 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.458519 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.464744 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.464838 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.464860 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.464884 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.464905 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:27Z","lastTransitionTime":"2026-03-14T07:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.474976 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.486733 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.497504 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.517737 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17aedcaad4c7fe9af8aaf10beade38255efe19e1139c2a4c02e57779342f667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.530774 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.545013 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\
\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.558034 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.566845 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.566889 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:27 crc 
kubenswrapper[5129]: I0314 07:00:27.566899 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.566915 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.566926 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:27Z","lastTransitionTime":"2026-03-14T07:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.569931 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:27Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.670937 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.670977 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.670987 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:27 crc 
kubenswrapper[5129]: I0314 07:00:27.671002 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.671012 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:27Z","lastTransitionTime":"2026-03-14T07:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.772614 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.772812 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.772904 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.772975 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.773029 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:27Z","lastTransitionTime":"2026-03-14T07:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.875902 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.875946 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.875956 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.875970 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.875980 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:27Z","lastTransitionTime":"2026-03-14T07:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.978295 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.978341 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.978349 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.978363 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:27 crc kubenswrapper[5129]: I0314 07:00:27.978372 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:27Z","lastTransitionTime":"2026-03-14T07:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.055057 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:28Z 
is after 2025-08-24T17:21:41Z" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.063359 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.073516 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.080488 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.080524 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.080556 5129 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.080576 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.080587 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:28Z","lastTransitionTime":"2026-03-14T07:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.096624 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17aedcaad4c7fe9af8aaf10beade38255efe19e1139c2a4c02e57779342f667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77
3257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.111511 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.122998 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.130706 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.130740 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.130752 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.130768 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.130780 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:28Z","lastTransitionTime":"2026-03-14T07:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.137291 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b37131742
0024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:28 crc kubenswrapper[5129]: E0314 07:00:28.147169 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.154476 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.154726 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.154805 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.154907 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.154984 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:28Z","lastTransitionTime":"2026-03-14T07:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.156066 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.167556 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:28 crc kubenswrapper[5129]: E0314 07:00:28.170738 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.173900 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.173936 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.173945 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.173960 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.173969 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:28Z","lastTransitionTime":"2026-03-14T07:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.182260 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:28 crc kubenswrapper[5129]: E0314 07:00:28.183992 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.187586 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.187629 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.187644 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.187659 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.187668 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:28Z","lastTransitionTime":"2026-03-14T07:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.194913 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:28 crc kubenswrapper[5129]: E0314 07:00:28.201323 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.204229 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.204265 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.204276 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.204290 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.204301 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:28Z","lastTransitionTime":"2026-03-14T07:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.206455 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:28 crc kubenswrapper[5129]: E0314 07:00:28.216223 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:28 crc kubenswrapper[5129]: E0314 07:00:28.216652 5129 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.218120 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d20
84179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.218553 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.218623 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.218634 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:28 crc 
kubenswrapper[5129]: I0314 07:00:28.218649 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.218658 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:28Z","lastTransitionTime":"2026-03-14T07:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.321414 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.321466 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.321482 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.321506 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.321521 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:28Z","lastTransitionTime":"2026-03-14T07:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.423705 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.423749 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.423761 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.423779 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.423791 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:28Z","lastTransitionTime":"2026-03-14T07:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.525444 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.525490 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.525501 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.525516 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.525529 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:28Z","lastTransitionTime":"2026-03-14T07:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.627812 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.628056 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.628121 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.628195 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.628265 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:28Z","lastTransitionTime":"2026-03-14T07:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.729790 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.729991 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.730058 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.730125 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.730197 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:28Z","lastTransitionTime":"2026-03-14T07:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.832786 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.832821 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.832828 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.832842 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.832851 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:28Z","lastTransitionTime":"2026-03-14T07:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.935068 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.935100 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.935123 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.935137 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:28 crc kubenswrapper[5129]: I0314 07:00:28.935146 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:28Z","lastTransitionTime":"2026-03-14T07:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.035584 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.035640 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.035733 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:29 crc kubenswrapper[5129]: E0314 07:00:29.035828 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:29 crc kubenswrapper[5129]: E0314 07:00:29.035958 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:29 crc kubenswrapper[5129]: E0314 07:00:29.036103 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.036974 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.037033 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.037046 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.037061 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.037075 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:29Z","lastTransitionTime":"2026-03-14T07:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.044829 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.138830 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.138873 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.138881 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.138894 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.138902 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:29Z","lastTransitionTime":"2026-03-14T07:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.241640 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.241673 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.241685 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.241701 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.241713 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:29Z","lastTransitionTime":"2026-03-14T07:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.244336 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovnkube-controller/0.log" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.247419 5129 generic.go:334] "Generic (PLEG): container finished" podID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerID="a17aedcaad4c7fe9af8aaf10beade38255efe19e1139c2a4c02e57779342f667" exitCode=1 Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.247502 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerDied","Data":"a17aedcaad4c7fe9af8aaf10beade38255efe19e1139c2a4c02e57779342f667"} Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.248027 5129 scope.go:117] "RemoveContainer" containerID="a17aedcaad4c7fe9af8aaf10beade38255efe19e1139c2a4c02e57779342f667" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.263293 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:29Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.285147 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:29Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.314475 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17aedcaad4c7fe9af8aaf10beade38255efe19e1139c2a4c02e57779342f667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17aedcaad4c7fe9af8aaf10beade38255efe19e1139c2a4c02e57779342f667\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"16 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 07:00:28.896779 6916 
handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0314 07:00:28.897682 6916 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0314 07:00:28.897724 6916 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0314 07:00:28.897767 6916 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 07:00:28.897801 6916 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0314 07:00:28.897838 6916 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 07:00:28.897698 6916 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 07:00:28.897852 6916 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 07:00:28.897767 6916 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0314 07:00:28.897859 6916 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0314 07:00:28.897902 6916 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0314 07:00:28.897926 6916 handler.go:208] Removed *v1.Node event handler 7\\\\nI0314 07:00:28.897952 6916 factory.go:656] Stopping watch factory\\\\nI0314 07:00:28.897978 6916 ovnkube.go:599] Stopped ovnkube\\\\nI0314 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:29Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.334325 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:29Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.344960 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.344999 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.345015 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.345030 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.345039 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:29Z","lastTransitionTime":"2026-03-14T07:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.348722 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:29Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.357864 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:29Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.368532 5129 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:29Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.378560 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:29Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.391386 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:29Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.401456 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:29Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.409879 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:29Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.419368 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d20
84179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:29Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.430216 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:29Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.439039 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:29Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.447715 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.447743 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.447750 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.447762 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.447771 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:29Z","lastTransitionTime":"2026-03-14T07:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.550479 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.550520 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.550530 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.550548 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.550557 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:29Z","lastTransitionTime":"2026-03-14T07:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.653094 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.653430 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.653442 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.653458 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.653469 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:29Z","lastTransitionTime":"2026-03-14T07:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.756111 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.756145 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.756155 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.756171 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.756182 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:29Z","lastTransitionTime":"2026-03-14T07:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.857642 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.857693 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.857707 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.857722 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.857734 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:29Z","lastTransitionTime":"2026-03-14T07:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.960807 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.960850 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.960862 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.960879 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:29 crc kubenswrapper[5129]: I0314 07:00:29.960890 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:29Z","lastTransitionTime":"2026-03-14T07:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.063297 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.063330 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.063338 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.063350 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.063359 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.200691 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.200724 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.200732 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.200747 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.200758 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.252086 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovnkube-controller/0.log" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.255081 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerStarted","Data":"28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0"} Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.255523 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.268364 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde
52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.279326 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528d
ebb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.293099 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.302925 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.303235 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.303316 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.303413 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[5129]: 
I0314 07:00:30.303523 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.305990 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.318460 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.329870 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.354325 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17aedcaad4c7fe9af8aaf10beade38255efe19e1139c2a4c02e57779342f667\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"16 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 07:00:28.896779 6916 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0314 07:00:28.897682 6916 handler.go:190] Sending 
*v1.Pod event handler 3 for removal\\\\nI0314 07:00:28.897724 6916 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0314 07:00:28.897767 6916 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 07:00:28.897801 6916 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0314 07:00:28.897838 6916 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 07:00:28.897698 6916 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 07:00:28.897852 6916 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 07:00:28.897767 6916 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0314 07:00:28.897859 6916 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0314 07:00:28.897902 6916 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0314 07:00:28.897926 6916 handler.go:208] Removed *v1.Node event handler 7\\\\nI0314 07:00:28.897952 6916 factory.go:656] Stopping watch factory\\\\nI0314 07:00:28.897978 6916 ovnkube.go:599] Stopped ovnkube\\\\nI0314 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.367191 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.381928 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.398522 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.406524 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.406557 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.406566 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.406581 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.406591 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.412746 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.426251 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.436559 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.448846 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.508550 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.508583 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.508593 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.508623 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.508635 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.610331 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.610367 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.610378 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.610393 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.610403 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.712216 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.712247 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.712260 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.712275 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.712285 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.815448 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.815489 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.815498 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.815513 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.815522 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.917433 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.917474 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.917484 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.917499 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:30 crc kubenswrapper[5129]: I0314 07:00:30.917509 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:30Z","lastTransitionTime":"2026-03-14T07:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.019654 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.019712 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.019729 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.019753 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.019771 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.035555 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.035638 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.035566 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:31 crc kubenswrapper[5129]: E0314 07:00:31.035695 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:31 crc kubenswrapper[5129]: E0314 07:00:31.035865 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:31 crc kubenswrapper[5129]: E0314 07:00:31.035993 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.092111 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4"] Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.092632 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.096218 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.096783 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.109161 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.122174 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[5129]: 
I0314 07:00:31.122208 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.122217 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.122232 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.122243 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.122690 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.143566 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17aedcaad4c7fe9af8aaf10beade38255efe19e1139c2a4c02e57779342f667\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"16 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 07:00:28.896779 6916 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0314 07:00:28.897682 6916 handler.go:190] Sending 
*v1.Pod event handler 3 for removal\\\\nI0314 07:00:28.897724 6916 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0314 07:00:28.897767 6916 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 07:00:28.897801 6916 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0314 07:00:28.897838 6916 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 07:00:28.897698 6916 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 07:00:28.897852 6916 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 07:00:28.897767 6916 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0314 07:00:28.897859 6916 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0314 07:00:28.897902 6916 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0314 07:00:28.897926 6916 handler.go:208] Removed *v1.Node event handler 7\\\\nI0314 07:00:28.897952 6916 factory.go:656] Stopping watch factory\\\\nI0314 07:00:28.897978 6916 ovnkube.go:599] Stopped ovnkube\\\\nI0314 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.153630 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d0fa377b-2382-4ada-aec5-c103d2ca74f0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qs4z4\" (UID: \"d0fa377b-2382-4ada-aec5-c103d2ca74f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.153678 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d0fa377b-2382-4ada-aec5-c103d2ca74f0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qs4z4\" (UID: 
\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.153755 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbjnv\" (UniqueName: \"kubernetes.io/projected/d0fa377b-2382-4ada-aec5-c103d2ca74f0-kube-api-access-pbjnv\") pod \"ovnkube-control-plane-749d76644c-qs4z4\" (UID: \"d0fa377b-2382-4ada-aec5-c103d2ca74f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.153806 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d0fa377b-2382-4ada-aec5-c103d2ca74f0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qs4z4\" (UID: \"d0fa377b-2382-4ada-aec5-c103d2ca74f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.162256 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba
4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.177948 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.190703 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.205659 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.220532 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.224269 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.224321 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.224336 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.224357 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.224372 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.236740 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.251333 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.254640 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbjnv\" (UniqueName: \"kubernetes.io/projected/d0fa377b-2382-4ada-aec5-c103d2ca74f0-kube-api-access-pbjnv\") pod \"ovnkube-control-plane-749d76644c-qs4z4\" (UID: \"d0fa377b-2382-4ada-aec5-c103d2ca74f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.254676 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d0fa377b-2382-4ada-aec5-c103d2ca74f0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qs4z4\" (UID: \"d0fa377b-2382-4ada-aec5-c103d2ca74f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.254719 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d0fa377b-2382-4ada-aec5-c103d2ca74f0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qs4z4\" (UID: \"d0fa377b-2382-4ada-aec5-c103d2ca74f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.254741 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d0fa377b-2382-4ada-aec5-c103d2ca74f0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qs4z4\" (UID: \"d0fa377b-2382-4ada-aec5-c103d2ca74f0\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.255212 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d0fa377b-2382-4ada-aec5-c103d2ca74f0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qs4z4\" (UID: \"d0fa377b-2382-4ada-aec5-c103d2ca74f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.255931 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d0fa377b-2382-4ada-aec5-c103d2ca74f0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qs4z4\" (UID: \"d0fa377b-2382-4ada-aec5-c103d2ca74f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.260132 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d0fa377b-2382-4ada-aec5-c103d2ca74f0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qs4z4\" (UID: \"d0fa377b-2382-4ada-aec5-c103d2ca74f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.261063 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovnkube-controller/1.log" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.262215 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovnkube-controller/0.log" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.265125 5129 generic.go:334] "Generic (PLEG): container finished" podID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" 
containerID="28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0" exitCode=1 Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.265163 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerDied","Data":"28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0"} Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.265201 5129 scope.go:117] "RemoveContainer" containerID="a17aedcaad4c7fe9af8aaf10beade38255efe19e1139c2a4c02e57779342f667" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.266077 5129 scope.go:117] "RemoveContainer" containerID="28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0" Mar 14 07:00:31 crc kubenswrapper[5129]: E0314 07:00:31.266258 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.271088 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.277525 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbjnv\" (UniqueName: \"kubernetes.io/projected/d0fa377b-2382-4ada-aec5-c103d2ca74f0-kube-api-access-pbjnv\") pod \"ovnkube-control-plane-749d76644c-qs4z4\" (UID: \"d0fa377b-2382-4ada-aec5-c103d2ca74f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.283357 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3
5512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.298945 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d20
84179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.310018 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.319669 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.326450 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.326505 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.326515 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.326528 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.326539 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.331784 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.341377 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.350073 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.362185 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.373717 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.385920 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.394543 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.404633 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d
007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.407673 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.417065 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd9
7f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\
"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.433822 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9b
d4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.445232 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.445278 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.445289 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.445305 5129 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.445316 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.459867 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.487857 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.502459 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.519134 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17aedcaad4c7fe9af8aaf10beade38255efe19e1139c2a4c02e57779342f667\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"16 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 07:00:28.896779 6916 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0314 07:00:28.897682 6916 handler.go:190] Sending 
*v1.Pod event handler 3 for removal\\\\nI0314 07:00:28.897724 6916 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0314 07:00:28.897767 6916 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 07:00:28.897801 6916 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0314 07:00:28.897838 6916 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 07:00:28.897698 6916 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 07:00:28.897852 6916 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 07:00:28.897767 6916 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0314 07:00:28.897859 6916 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0314 07:00:28.897902 6916 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0314 07:00:28.897926 6916 handler.go:208] Removed *v1.Node event handler 7\\\\nI0314 07:00:28.897952 6916 factory.go:656] Stopping watch factory\\\\nI0314 07:00:28.897978 6916 ovnkube.go:599] Stopped ovnkube\\\\nI0314 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"message\\\":\\\"ntity/network-node-identity-vrzqb in node crc\\\\nI0314 07:00:30.608046 7050 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0314 07:00:30.607340 7050 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-h6665\\\\nI0314 07:00:30.608058 7050 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0314 07:00:30.608063 7050 ovn.go:134] Ensuring zone local for Pod 
openshift-multus/multus-additional-cni-plugins-h6665 in node crc\\\\nI0314 07:00:30.608065 7050 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:30.607282 7050 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-866b9\\\\nI0314 07:00:30.608079 7050 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-866b9\\\\nI0314 07:00:30.608088 7050 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-866b9 in node crc\\\\nI0314 07:00:30.608095 7050 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-866b9 after 0 failed attempt(s)\\\\nF0314 07:00:30.608100 7050 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-ne
td\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.530971 5129 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60
c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.548318 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.548361 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.548369 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.548385 5129 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.548393 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.650741 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.650772 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.650781 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.650796 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.650806 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.753519 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.753567 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.753575 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.753592 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.753632 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.856158 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.856191 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.856199 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.856213 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.856223 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.959061 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.959103 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.959113 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.959132 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:31 crc kubenswrapper[5129]: I0314 07:00:31.959145 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:31Z","lastTransitionTime":"2026-03-14T07:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.061186 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.061238 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.061252 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.061270 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.061319 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.164479 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.164532 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.164545 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.164563 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.164577 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.267125 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.267179 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.267192 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.267207 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.267216 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.268892 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" event={"ID":"d0fa377b-2382-4ada-aec5-c103d2ca74f0","Type":"ContainerStarted","Data":"1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2"} Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.268924 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" event={"ID":"d0fa377b-2382-4ada-aec5-c103d2ca74f0","Type":"ContainerStarted","Data":"6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e"} Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.268936 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" event={"ID":"d0fa377b-2382-4ada-aec5-c103d2ca74f0","Type":"ContainerStarted","Data":"8e5aab77bd39834d733f555929f98e385e5ac0df0d525850af05db36b8966628"} Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.270240 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovnkube-controller/1.log" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.272755 5129 scope.go:117] "RemoveContainer" containerID="28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0" Mar 14 07:00:32 crc kubenswrapper[5129]: E0314 07:00:32.272955 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.284120 5129 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\"
:\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.293845 5129 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.306406 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.317079 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.328775 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.340522 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.361824 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17aedcaad4c7fe9af8aaf10beade38255efe19e1139c2a4c02e57779342f667\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:28Z\\\",\\\"message\\\":\\\"16 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 07:00:28.896779 6916 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0314 07:00:28.897682 6916 handler.go:190] Sending 
*v1.Pod event handler 3 for removal\\\\nI0314 07:00:28.897724 6916 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0314 07:00:28.897767 6916 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 07:00:28.897801 6916 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0314 07:00:28.897838 6916 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 07:00:28.897698 6916 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 07:00:28.897852 6916 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 07:00:28.897767 6916 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0314 07:00:28.897859 6916 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0314 07:00:28.897902 6916 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0314 07:00:28.897926 6916 handler.go:208] Removed *v1.Node event handler 7\\\\nI0314 07:00:28.897952 6916 factory.go:656] Stopping watch factory\\\\nI0314 07:00:28.897978 6916 ovnkube.go:599] Stopped ovnkube\\\\nI0314 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"message\\\":\\\"ntity/network-node-identity-vrzqb in node crc\\\\nI0314 07:00:30.608046 7050 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0314 07:00:30.607340 7050 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-h6665\\\\nI0314 07:00:30.608058 7050 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0314 07:00:30.608063 7050 ovn.go:134] Ensuring zone local for Pod 
openshift-multus/multus-additional-cni-plugins-h6665 in node crc\\\\nI0314 07:00:30.608065 7050 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:30.607282 7050 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-866b9\\\\nI0314 07:00:30.608079 7050 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-866b9\\\\nI0314 07:00:30.608088 7050 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-866b9 in node crc\\\\nI0314 07:00:30.608095 7050 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-866b9 after 0 failed attempt(s)\\\\nF0314 07:00:30.608100 7050 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-ne
td\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.369712 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.369752 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.369766 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.369789 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.369803 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.376176 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.389821 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.401925 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.416223 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.427034 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.436864 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.446579 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.456785 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d20
84179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.468443 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.471640 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:32 crc 
kubenswrapper[5129]: I0314 07:00:32.471663 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.471674 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.471688 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.471696 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.481204 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.496726 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14
T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.510480 5129 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.534379 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"message\\\":\\\"ntity/network-node-identity-vrzqb in node crc\\\\nI0314 07:00:30.608046 7050 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0314 07:00:30.607340 7050 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-h6665\\\\nI0314 07:00:30.608058 7050 obj_retry.go:386] Retry 
successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0314 07:00:30.608063 7050 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-h6665 in node crc\\\\nI0314 07:00:30.608065 7050 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:30.607282 7050 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-866b9\\\\nI0314 07:00:30.608079 7050 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-866b9\\\\nI0314 07:00:30.608088 7050 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-866b9 in node crc\\\\nI0314 07:00:30.608095 7050 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-866b9 after 0 failed attempt(s)\\\\nF0314 07:00:30.608100 7050 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f0
09550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.545912 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba
4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.559549 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.570354 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.573117 5129 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.573205 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.573262 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.573429 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.573522 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.582168 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.591618 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.601908 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.613898 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.627624 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.637982 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.647372 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d20
84179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:32Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.675850 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.675893 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.675904 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc 
kubenswrapper[5129]: I0314 07:00:32.675945 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.675960 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.781782 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.782083 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.785314 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.785445 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.785588 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.888306 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.888369 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.888386 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.888410 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.888428 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.991786 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.991864 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.991880 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.991909 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:32 crc kubenswrapper[5129]: I0314 07:00:32.991929 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:32Z","lastTransitionTime":"2026-03-14T07:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.036250 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.036283 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.036431 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.036446 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.036546 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.036698 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.095590 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.095683 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.095704 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.095734 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.095753 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.199352 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.199405 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.199422 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.199444 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.199458 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.295528 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-l2tzv"] Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.296852 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.297091 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.303084 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.303181 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.303207 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.303240 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.303263 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.317081 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:33Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.339015 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T07:00:33Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.354721 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:33Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.372263 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f5
8a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:33Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.378204 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs\") pod \"network-metrics-daemon-l2tzv\" (UID: \"ffc61f17-7577-4872-ad5d-7b33780d3d21\") " pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.378250 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87wfz\" (UniqueName: \"kubernetes.io/projected/ffc61f17-7577-4872-ad5d-7b33780d3d21-kube-api-access-87wfz\") pod \"network-metrics-daemon-l2tzv\" (UID: \"ffc61f17-7577-4872-ad5d-7b33780d3d21\") " pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.394178 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:33Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.409151 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.409185 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.409195 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.409209 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.409219 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.411940 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:33Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.429317 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:33Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.448425 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:33Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.479916 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs\") pod \"network-metrics-daemon-l2tzv\" (UID: \"ffc61f17-7577-4872-ad5d-7b33780d3d21\") " pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.479993 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87wfz\" (UniqueName: \"kubernetes.io/projected/ffc61f17-7577-4872-ad5d-7b33780d3d21-kube-api-access-87wfz\") pod \"network-metrics-daemon-l2tzv\" (UID: \"ffc61f17-7577-4872-ad5d-7b33780d3d21\") " pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.480158 5129 secret.go:188] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.480264 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs podName:ffc61f17-7577-4872-ad5d-7b33780d3d21 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:33.980238874 +0000 UTC m=+96.732154068 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs") pod "network-metrics-daemon-l2tzv" (UID: "ffc61f17-7577-4872-ad5d-7b33780d3d21") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.483210 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"message\\\":\\\"ntity/network-node-identity-vrzqb in node crc\\\\nI0314 07:00:30.608046 7050 
metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0314 07:00:30.607340 7050 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-h6665\\\\nI0314 07:00:30.608058 7050 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0314 07:00:30.608063 7050 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-h6665 in node crc\\\\nI0314 07:00:30.608065 7050 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:30.607282 7050 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-866b9\\\\nI0314 07:00:30.608079 7050 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-866b9\\\\nI0314 07:00:30.608088 7050 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-866b9 in node crc\\\\nI0314 07:00:30.608095 7050 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-866b9 after 0 failed attempt(s)\\\\nF0314 07:00:30.608100 7050 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f0
09550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:33Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.505238 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87wfz\" (UniqueName: \"kubernetes.io/projected/ffc61f17-7577-4872-ad5d-7b33780d3d21-kube-api-access-87wfz\") pod \"network-metrics-daemon-l2tzv\" (UID: \"ffc61f17-7577-4872-ad5d-7b33780d3d21\") " pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.509736 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:33Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.511892 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.511935 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.511952 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.511974 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[5129]: 
I0314 07:00:33.511989 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.536284 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:33Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.554531 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:33Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.570720 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T07:00:33Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.581213 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.581951 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.581994 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.582131 5129 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.582236 5129 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.582511 5129 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:00:49.582462485 +0000 UTC m=+112.334377699 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.582834 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:49.582796695 +0000 UTC m=+112.334711959 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.583036 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:49.583014512 +0000 UTC m=+112.334929846 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.587585 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffc61f17-7577-4872-ad5d-7b33780d3d21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l2tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:33Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:33 crc 
kubenswrapper[5129]: I0314 07:00:33.604331 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:33Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.616180 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.616445 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.616646 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 
07:00:33.616788 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.616948 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.621550 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:33Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.683214 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.683293 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.683426 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.683444 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.683457 5129 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.683507 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:49.68349147 +0000 UTC m=+112.435406664 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.684005 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.684129 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.684241 5129 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.684419 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:49.684385067 +0000 UTC m=+112.436300331 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.720971 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.721023 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.721034 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.721053 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.721066 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.824848 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.824892 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.824904 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.824920 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.824931 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.926770 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.926802 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.926813 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.926827 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.926837 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:33Z","lastTransitionTime":"2026-03-14T07:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:33 crc kubenswrapper[5129]: I0314 07:00:33.987551 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs\") pod \"network-metrics-daemon-l2tzv\" (UID: \"ffc61f17-7577-4872-ad5d-7b33780d3d21\") " pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.987741 5129 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:33 crc kubenswrapper[5129]: E0314 07:00:33.987845 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs podName:ffc61f17-7577-4872-ad5d-7b33780d3d21 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:34.98782299 +0000 UTC m=+97.739738254 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs") pod "network-metrics-daemon-l2tzv" (UID: "ffc61f17-7577-4872-ad5d-7b33780d3d21") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.029844 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.030110 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.030182 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.030249 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.030315 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.037220 5129 scope.go:117] "RemoveContainer" containerID="846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee" Mar 14 07:00:34 crc kubenswrapper[5129]: E0314 07:00:34.037476 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.132991 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.133080 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.133102 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.133137 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.133157 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.235544 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.235626 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.235640 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.235660 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.235673 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.338824 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.338875 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.338889 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.338907 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.338920 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.460044 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.461133 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.461159 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.461179 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.461192 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.564160 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.564211 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.564222 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.564239 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.564249 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.667966 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.668019 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.668035 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.668086 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.668099 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.771340 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.771383 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.771391 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.771407 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.771422 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.875133 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.875185 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.875195 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.875214 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.875225 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.978668 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.978721 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.978733 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.978752 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:34 crc kubenswrapper[5129]: I0314 07:00:34.978766 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:34Z","lastTransitionTime":"2026-03-14T07:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:34.999961 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs\") pod \"network-metrics-daemon-l2tzv\" (UID: \"ffc61f17-7577-4872-ad5d-7b33780d3d21\") " pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:35 crc kubenswrapper[5129]: E0314 07:00:35.000164 5129 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:35 crc kubenswrapper[5129]: E0314 07:00:35.000253 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs podName:ffc61f17-7577-4872-ad5d-7b33780d3d21 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:37.00023116 +0000 UTC m=+99.752146384 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs") pod "network-metrics-daemon-l2tzv" (UID: "ffc61f17-7577-4872-ad5d-7b33780d3d21") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.035465 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.035546 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.035660 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.035466 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:35 crc kubenswrapper[5129]: E0314 07:00:35.035731 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:00:35 crc kubenswrapper[5129]: E0314 07:00:35.035858 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:35 crc kubenswrapper[5129]: E0314 07:00:35.035974 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:35 crc kubenswrapper[5129]: E0314 07:00:35.036154 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.081699 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.081777 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.081795 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.081816 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.081832 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:35Z","lastTransitionTime":"2026-03-14T07:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.184877 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.184916 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.184927 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.184946 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.184958 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:35Z","lastTransitionTime":"2026-03-14T07:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.288141 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.288203 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.288220 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.288286 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.288307 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:35Z","lastTransitionTime":"2026-03-14T07:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.390982 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.391106 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.391138 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.391168 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.391186 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:35Z","lastTransitionTime":"2026-03-14T07:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.494215 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.494293 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.494306 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.494330 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.494349 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:35Z","lastTransitionTime":"2026-03-14T07:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.596628 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.596678 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.596690 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.596709 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.596722 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:35Z","lastTransitionTime":"2026-03-14T07:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.699773 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.699848 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.699867 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.699900 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.699919 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:35Z","lastTransitionTime":"2026-03-14T07:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.802343 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.802374 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.802383 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.802397 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.802408 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:35Z","lastTransitionTime":"2026-03-14T07:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.906318 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.906497 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.906516 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.906544 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:35 crc kubenswrapper[5129]: I0314 07:00:35.906559 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:35Z","lastTransitionTime":"2026-03-14T07:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.009676 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.009752 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.009769 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.009786 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.009824 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:36Z","lastTransitionTime":"2026-03-14T07:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.112520 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.112571 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.112581 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.112598 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.112625 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:36Z","lastTransitionTime":"2026-03-14T07:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.215498 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.215549 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.215558 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.215573 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.215584 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:36Z","lastTransitionTime":"2026-03-14T07:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.319011 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.319077 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.319093 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.319119 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.319147 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:36Z","lastTransitionTime":"2026-03-14T07:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.422103 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.422146 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.422162 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.422178 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.422189 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:36Z","lastTransitionTime":"2026-03-14T07:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.524825 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.524863 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.524877 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.524893 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.524905 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:36Z","lastTransitionTime":"2026-03-14T07:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.627406 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.627446 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.627456 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.627472 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.627483 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:36Z","lastTransitionTime":"2026-03-14T07:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.729701 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.729741 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.729750 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.729766 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.729775 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:36Z","lastTransitionTime":"2026-03-14T07:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.832738 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.833002 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.833082 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.833146 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.833200 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:36Z","lastTransitionTime":"2026-03-14T07:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.935754 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.935819 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.935843 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.935872 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:36 crc kubenswrapper[5129]: I0314 07:00:36.935895 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:36Z","lastTransitionTime":"2026-03-14T07:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.022496 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs\") pod \"network-metrics-daemon-l2tzv\" (UID: \"ffc61f17-7577-4872-ad5d-7b33780d3d21\") " pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:37 crc kubenswrapper[5129]: E0314 07:00:37.022783 5129 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:37 crc kubenswrapper[5129]: E0314 07:00:37.023041 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs podName:ffc61f17-7577-4872-ad5d-7b33780d3d21 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:41.023014666 +0000 UTC m=+103.774929930 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs") pod "network-metrics-daemon-l2tzv" (UID: "ffc61f17-7577-4872-ad5d-7b33780d3d21") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.036276 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:37 crc kubenswrapper[5129]: E0314 07:00:37.036378 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.036421 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.036480 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:37 crc kubenswrapper[5129]: E0314 07:00:37.036900 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.036686 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:37 crc kubenswrapper[5129]: E0314 07:00:37.036591 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:37 crc kubenswrapper[5129]: E0314 07:00:37.036985 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.040876 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.040914 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.040926 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.040942 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.040954 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:37Z","lastTransitionTime":"2026-03-14T07:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.143430 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.143482 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.143496 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.143514 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.143526 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:37Z","lastTransitionTime":"2026-03-14T07:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.245806 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.245844 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.245855 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.245871 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.245883 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:37Z","lastTransitionTime":"2026-03-14T07:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.348221 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.348265 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.348327 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.348346 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.348358 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:37Z","lastTransitionTime":"2026-03-14T07:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.450580 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.450648 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.450661 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.450680 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.450693 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:37Z","lastTransitionTime":"2026-03-14T07:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.553522 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.553585 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.553593 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.553626 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.553636 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:37Z","lastTransitionTime":"2026-03-14T07:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.655690 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.655732 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.655743 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.655760 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.655771 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:37Z","lastTransitionTime":"2026-03-14T07:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.758333 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.758368 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.758395 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.758411 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.758420 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:37Z","lastTransitionTime":"2026-03-14T07:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.861041 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.861078 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.861087 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.861102 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.861114 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:37Z","lastTransitionTime":"2026-03-14T07:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.964197 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.964261 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.964278 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.964301 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:37 crc kubenswrapper[5129]: I0314 07:00:37.964320 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:37Z","lastTransitionTime":"2026-03-14T07:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.049370 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.062530 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.067234 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.067278 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.067290 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.067309 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.067322 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:38Z","lastTransitionTime":"2026-03-14T07:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.082991 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.103550 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.117023 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.131660 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.149571 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.165304 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.169642 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.169675 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.169705 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.169727 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.169739 5129 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:38Z","lastTransitionTime":"2026-03-14T07:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.187185 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"message\\\":\\\"ntity/network-node-identity-vrzqb in node crc\\\\nI0314 07:00:30.608046 7050 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0314 07:00:30.607340 7050 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-h6665\\\\nI0314 07:00:30.608058 7050 obj_retry.go:386] Retry 
successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0314 07:00:30.608063 7050 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-h6665 in node crc\\\\nI0314 07:00:30.608065 7050 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:30.607282 7050 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-866b9\\\\nI0314 07:00:30.608079 7050 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-866b9\\\\nI0314 07:00:30.608088 7050 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-866b9 in node crc\\\\nI0314 07:00:30.608095 7050 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-866b9 after 0 failed attempt(s)\\\\nF0314 07:00:30.608100 7050 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f0
09550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.202152 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba
4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.216182 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.230151 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.246894 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.258279 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.267731 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffc61f17-7577-4872-ad5d-7b33780d3d21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l2tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc 
kubenswrapper[5129]: I0314 07:00:38.273314 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.273358 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.273368 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.273387 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.273397 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:38Z","lastTransitionTime":"2026-03-14T07:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.280312 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.376249 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.376292 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.376301 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.376316 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.376326 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:38Z","lastTransitionTime":"2026-03-14T07:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.391342 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.391390 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.391399 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.391415 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.391424 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:38Z","lastTransitionTime":"2026-03-14T07:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:38 crc kubenswrapper[5129]: E0314 07:00:38.403738 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.407769 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.407819 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.407837 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.407860 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.407876 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:38Z","lastTransitionTime":"2026-03-14T07:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:38 crc kubenswrapper[5129]: E0314 07:00:38.421947 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.426486 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.426526 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.426535 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.426550 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.426560 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:38Z","lastTransitionTime":"2026-03-14T07:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:38 crc kubenswrapper[5129]: E0314 07:00:38.436703 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.440341 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.440385 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.440393 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.440407 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.440416 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:38Z","lastTransitionTime":"2026-03-14T07:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:38 crc kubenswrapper[5129]: E0314 07:00:38.452195 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.455625 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.455674 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.455689 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.455708 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.455721 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:38Z","lastTransitionTime":"2026-03-14T07:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:38 crc kubenswrapper[5129]: E0314 07:00:38.466585 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:38Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:38 crc kubenswrapper[5129]: E0314 07:00:38.467096 5129 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.478385 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.478492 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.478561 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.478646 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.478708 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:38Z","lastTransitionTime":"2026-03-14T07:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.580517 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.580550 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.580560 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.580576 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.580585 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:38Z","lastTransitionTime":"2026-03-14T07:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.682914 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.682978 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.682995 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.683016 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.683033 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:38Z","lastTransitionTime":"2026-03-14T07:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.786206 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.786259 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.786276 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.786301 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.786324 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:38Z","lastTransitionTime":"2026-03-14T07:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.889560 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.889622 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.889635 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.889650 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.889659 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:38Z","lastTransitionTime":"2026-03-14T07:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.992495 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.992559 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.992571 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.992588 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:38 crc kubenswrapper[5129]: I0314 07:00:38.992619 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:38Z","lastTransitionTime":"2026-03-14T07:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.036072 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.036148 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.036199 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:39 crc kubenswrapper[5129]: E0314 07:00:39.036219 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:39 crc kubenswrapper[5129]: E0314 07:00:39.036304 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:00:39 crc kubenswrapper[5129]: E0314 07:00:39.036484 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.036530 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:39 crc kubenswrapper[5129]: E0314 07:00:39.036851 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.095591 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.095654 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.095665 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.095682 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.095693 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:39Z","lastTransitionTime":"2026-03-14T07:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.198444 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.198708 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.198790 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.198883 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.198986 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:39Z","lastTransitionTime":"2026-03-14T07:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.301454 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.301513 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.301530 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.301554 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.301570 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:39Z","lastTransitionTime":"2026-03-14T07:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.403945 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.403991 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.404008 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.404036 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.404054 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:39Z","lastTransitionTime":"2026-03-14T07:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.506709 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.506918 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.506976 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.507049 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.507146 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:39Z","lastTransitionTime":"2026-03-14T07:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.609823 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.609854 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.609864 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.609876 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.609884 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:39Z","lastTransitionTime":"2026-03-14T07:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.712710 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.712758 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.712772 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.712788 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.712800 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:39Z","lastTransitionTime":"2026-03-14T07:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.815512 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.815544 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.815555 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.815569 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.815578 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:39Z","lastTransitionTime":"2026-03-14T07:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.918353 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.918401 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.918417 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.918435 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:39 crc kubenswrapper[5129]: I0314 07:00:39.918450 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:39Z","lastTransitionTime":"2026-03-14T07:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.022343 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.022384 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.022395 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.022412 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.022426 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:40Z","lastTransitionTime":"2026-03-14T07:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.124964 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.125016 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.125031 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.125050 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.125066 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:40Z","lastTransitionTime":"2026-03-14T07:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.227932 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.227988 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.228001 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.228020 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.228033 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:40Z","lastTransitionTime":"2026-03-14T07:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.330343 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.330389 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.330400 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.330417 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.330429 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:40Z","lastTransitionTime":"2026-03-14T07:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.433138 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.433520 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.433755 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.433922 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.434066 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:40Z","lastTransitionTime":"2026-03-14T07:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.536540 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.536591 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.536626 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.536649 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.536661 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:40Z","lastTransitionTime":"2026-03-14T07:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.639368 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.640031 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.640124 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.640244 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.640324 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:40Z","lastTransitionTime":"2026-03-14T07:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.744090 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.744152 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.744167 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.744197 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.744209 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:40Z","lastTransitionTime":"2026-03-14T07:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.847203 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.847647 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.847801 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.847961 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.848102 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:40Z","lastTransitionTime":"2026-03-14T07:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.951166 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.951575 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.951631 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.951667 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:40 crc kubenswrapper[5129]: I0314 07:00:40.951690 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:40Z","lastTransitionTime":"2026-03-14T07:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.035833 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.035907 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:41 crc kubenswrapper[5129]: E0314 07:00:41.035979 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.035996 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:41 crc kubenswrapper[5129]: E0314 07:00:41.036127 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.035851 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:41 crc kubenswrapper[5129]: E0314 07:00:41.036292 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:41 crc kubenswrapper[5129]: E0314 07:00:41.036528 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.054034 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.054077 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.054091 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.054115 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.054132 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:41Z","lastTransitionTime":"2026-03-14T07:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.067295 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs\") pod \"network-metrics-daemon-l2tzv\" (UID: \"ffc61f17-7577-4872-ad5d-7b33780d3d21\") " pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:41 crc kubenswrapper[5129]: E0314 07:00:41.067469 5129 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:41 crc kubenswrapper[5129]: E0314 07:00:41.067569 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs podName:ffc61f17-7577-4872-ad5d-7b33780d3d21 nodeName:}" failed. No retries permitted until 2026-03-14 07:00:49.067539556 +0000 UTC m=+111.819454810 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs") pod "network-metrics-daemon-l2tzv" (UID: "ffc61f17-7577-4872-ad5d-7b33780d3d21") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.157573 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.157691 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.157716 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.157746 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.157770 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:41Z","lastTransitionTime":"2026-03-14T07:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.260592 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.260675 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.260691 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.260713 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.260733 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:41Z","lastTransitionTime":"2026-03-14T07:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.363862 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.364167 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.364282 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.364393 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.364509 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:41Z","lastTransitionTime":"2026-03-14T07:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.467558 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.467938 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.468100 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.468320 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.468515 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:41Z","lastTransitionTime":"2026-03-14T07:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.571250 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.571301 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.571315 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.571333 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.571344 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:41Z","lastTransitionTime":"2026-03-14T07:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.674871 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.675400 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.675516 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.675585 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.675674 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:41Z","lastTransitionTime":"2026-03-14T07:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.779195 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.779232 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.779241 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.779254 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.779262 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:41Z","lastTransitionTime":"2026-03-14T07:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.882903 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.882945 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.882953 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.882969 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.882978 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:41Z","lastTransitionTime":"2026-03-14T07:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.989803 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.989859 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.989877 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.989916 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:41 crc kubenswrapper[5129]: I0314 07:00:41.989931 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:41Z","lastTransitionTime":"2026-03-14T07:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.092642 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.092695 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.092706 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.092723 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.092735 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:42Z","lastTransitionTime":"2026-03-14T07:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.196204 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.196262 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.196274 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.196292 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.196308 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:42Z","lastTransitionTime":"2026-03-14T07:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.299298 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.299371 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.299396 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.299428 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.299449 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:42Z","lastTransitionTime":"2026-03-14T07:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.401941 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.401977 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.401991 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.402009 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.402022 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:42Z","lastTransitionTime":"2026-03-14T07:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.504745 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.504836 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.504859 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.504887 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.504906 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:42Z","lastTransitionTime":"2026-03-14T07:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.607013 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.607055 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.607069 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.607085 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.607097 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:42Z","lastTransitionTime":"2026-03-14T07:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.710240 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.710296 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.710429 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.710455 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.710479 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:42Z","lastTransitionTime":"2026-03-14T07:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.813947 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.814041 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.814054 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.814072 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.814088 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:42Z","lastTransitionTime":"2026-03-14T07:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.917855 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.917953 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.917979 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.918015 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:42 crc kubenswrapper[5129]: I0314 07:00:42.918040 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:42Z","lastTransitionTime":"2026-03-14T07:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.021518 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.021560 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.021568 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.021582 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.021593 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:43Z","lastTransitionTime":"2026-03-14T07:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.035405 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.035474 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.035502 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.035405 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:43 crc kubenswrapper[5129]: E0314 07:00:43.035593 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:43 crc kubenswrapper[5129]: E0314 07:00:43.035848 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:00:43 crc kubenswrapper[5129]: E0314 07:00:43.035898 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:43 crc kubenswrapper[5129]: E0314 07:00:43.035948 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.125203 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.125262 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.125283 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.125314 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.125337 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:43Z","lastTransitionTime":"2026-03-14T07:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.228278 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.228336 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.228354 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.228378 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.228397 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:43Z","lastTransitionTime":"2026-03-14T07:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.331130 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.331199 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.331217 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.331242 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.331260 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:43Z","lastTransitionTime":"2026-03-14T07:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.433835 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.433868 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.433878 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.433892 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.433900 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:43Z","lastTransitionTime":"2026-03-14T07:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.537539 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.537671 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.537690 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.537763 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.537784 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:43Z","lastTransitionTime":"2026-03-14T07:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.639898 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.639938 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.639950 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.639966 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.639978 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:43Z","lastTransitionTime":"2026-03-14T07:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.742842 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.742937 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.742966 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.742994 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.743011 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:43Z","lastTransitionTime":"2026-03-14T07:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.846635 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.846717 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.846738 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.846784 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.846810 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:43Z","lastTransitionTime":"2026-03-14T07:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.952087 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.952178 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.952248 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.952281 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:43 crc kubenswrapper[5129]: I0314 07:00:43.952298 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:43Z","lastTransitionTime":"2026-03-14T07:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.057519 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.060650 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.060674 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.060683 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.060694 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.060702 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:44Z","lastTransitionTime":"2026-03-14T07:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.162879 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.162914 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.162924 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.162937 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.162946 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:44Z","lastTransitionTime":"2026-03-14T07:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.265379 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.265428 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.265437 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.265465 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.265474 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:44Z","lastTransitionTime":"2026-03-14T07:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.368161 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.368224 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.368242 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.368264 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.368281 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:44Z","lastTransitionTime":"2026-03-14T07:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.471197 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.471271 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.471294 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.471337 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.471361 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:44Z","lastTransitionTime":"2026-03-14T07:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.574703 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.574770 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.574793 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.574824 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.574847 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:44Z","lastTransitionTime":"2026-03-14T07:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.677867 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.677951 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.678041 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.678068 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.678089 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:44Z","lastTransitionTime":"2026-03-14T07:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.781105 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.781157 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.781170 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.781190 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.781203 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:44Z","lastTransitionTime":"2026-03-14T07:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.884016 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.884054 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.884062 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.884078 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.884086 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:44Z","lastTransitionTime":"2026-03-14T07:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.986738 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.986783 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.986797 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.986814 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:44 crc kubenswrapper[5129]: I0314 07:00:44.986828 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:44Z","lastTransitionTime":"2026-03-14T07:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.036238 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.036265 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.036322 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:45 crc kubenswrapper[5129]: E0314 07:00:45.036354 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.036381 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:45 crc kubenswrapper[5129]: E0314 07:00:45.036488 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:00:45 crc kubenswrapper[5129]: E0314 07:00:45.036642 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:45 crc kubenswrapper[5129]: E0314 07:00:45.036671 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.089909 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.089955 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.089971 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.089992 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.090010 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:45Z","lastTransitionTime":"2026-03-14T07:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.192385 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.192454 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.192479 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.192506 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.192527 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:45Z","lastTransitionTime":"2026-03-14T07:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.295464 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.295537 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.295555 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.295580 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.295628 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:45Z","lastTransitionTime":"2026-03-14T07:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.398453 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.398506 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.398519 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.398539 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.398551 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:45Z","lastTransitionTime":"2026-03-14T07:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.502286 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.502335 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.502347 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.502367 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.502382 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:45Z","lastTransitionTime":"2026-03-14T07:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.604950 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.605008 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.605025 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.605047 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.605064 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:45Z","lastTransitionTime":"2026-03-14T07:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.707966 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.708020 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.708035 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.708057 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.708074 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:45Z","lastTransitionTime":"2026-03-14T07:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.810532 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.810573 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.810586 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.810616 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.810629 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:45Z","lastTransitionTime":"2026-03-14T07:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.913961 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.914024 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.914043 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.914068 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:45 crc kubenswrapper[5129]: I0314 07:00:45.914085 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:45Z","lastTransitionTime":"2026-03-14T07:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.016533 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.016582 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.016596 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.016638 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.016654 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:46Z","lastTransitionTime":"2026-03-14T07:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.120687 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.120758 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.120775 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.120801 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.120822 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:46Z","lastTransitionTime":"2026-03-14T07:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.223396 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.223464 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.223481 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.223507 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.223526 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:46Z","lastTransitionTime":"2026-03-14T07:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.326327 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.326391 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.326409 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.326435 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.326453 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:46Z","lastTransitionTime":"2026-03-14T07:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.431285 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.431376 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.431396 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.431423 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.431442 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:46Z","lastTransitionTime":"2026-03-14T07:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.535006 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.535058 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.535067 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.535079 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.535088 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:46Z","lastTransitionTime":"2026-03-14T07:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.638161 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.638211 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.638229 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.638250 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.638268 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:46Z","lastTransitionTime":"2026-03-14T07:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.741287 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.741363 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.741375 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.741390 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.741401 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:46Z","lastTransitionTime":"2026-03-14T07:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.849293 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.849380 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.849405 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.849439 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.849720 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:46Z","lastTransitionTime":"2026-03-14T07:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.953983 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.954022 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.954032 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.954049 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:46 crc kubenswrapper[5129]: I0314 07:00:46.954062 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:46Z","lastTransitionTime":"2026-03-14T07:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.035481 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.035557 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:47 crc kubenswrapper[5129]: E0314 07:00:47.035696 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.035731 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.035749 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:47 crc kubenswrapper[5129]: E0314 07:00:47.036393 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:00:47 crc kubenswrapper[5129]: E0314 07:00:47.035862 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:47 crc kubenswrapper[5129]: E0314 07:00:47.036436 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.036498 5129 scope.go:117] "RemoveContainer" containerID="28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.056486 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.056509 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.056517 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.056528 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.056537 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:47Z","lastTransitionTime":"2026-03-14T07:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.158771 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.158804 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.158812 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.158824 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.158833 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:47Z","lastTransitionTime":"2026-03-14T07:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.260875 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.260920 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.260940 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.260958 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.260969 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:47Z","lastTransitionTime":"2026-03-14T07:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.329004 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovnkube-controller/1.log" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.331213 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerStarted","Data":"1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81"} Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.332076 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.349549 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a
1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:47Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.363640 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.363691 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.363708 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.363730 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.363747 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:47Z","lastTransitionTime":"2026-03-14T07:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.373948 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8519-8ade-4cb4-9d40-ddb5867683d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08240fa12d4a763177663f53078814fc63dfd7b6380b488118f464eb3cef8256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731473
1ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d22d9ee2e77a2eda2714189e26eeafd53bd1c9e78e7a365627c323ff58de99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d124c1940bbbc2625b57188d7eaafc0fa13d4366961c1870f336f9779d968f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b1dcae08e7a810e4cd9309ac9aa20cf694a33bac6ad060d64909c27704c7ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e517883a289ad10a8078720d229aab5e813291e72e307d8e6e837c05da74ff93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"ce
rt-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"image\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:47Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.391757 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:47Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.412541 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:47Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.434798 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:47Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.452503 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:47Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.466580 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.466638 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.466649 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.466664 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.466676 5129 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:47Z","lastTransitionTime":"2026-03-14T07:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.475546 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"message\\\":\\\"ntity/network-node-identity-vrzqb in node crc\\\\nI0314 07:00:30.608046 7050 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0314 07:00:30.607340 7050 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-h6665\\\\nI0314 07:00:30.608058 7050 obj_retry.go:386] Retry 
successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0314 07:00:30.608063 7050 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-h6665 in node crc\\\\nI0314 07:00:30.608065 7050 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:30.607282 7050 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-866b9\\\\nI0314 07:00:30.608079 7050 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-866b9\\\\nI0314 07:00:30.608088 7050 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-866b9 in node crc\\\\nI0314 07:00:30.608095 7050 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-866b9 after 0 failed attempt(s)\\\\nF0314 07:00:30.608100 7050 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:47Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.488190 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba
4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:47Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.500427 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:47Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.513973 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:47Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.524823 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T07:00:47Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.533909 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffc61f17-7577-4872-ad5d-7b33780d3d21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l2tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:47Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:47 crc 
kubenswrapper[5129]: I0314 07:00:47.546218 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:47Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.559084 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:47Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.569077 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.569121 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.569134 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.569156 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.569168 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:47Z","lastTransitionTime":"2026-03-14T07:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.571987 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:47Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.588792 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-14T07:00:47Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.598747 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:47Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.672474 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.672530 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.672555 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.672576 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.672591 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:47Z","lastTransitionTime":"2026-03-14T07:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.775283 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.775346 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.775364 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.775391 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.775410 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:47Z","lastTransitionTime":"2026-03-14T07:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.878188 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.878247 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.878256 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.878268 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.878277 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:47Z","lastTransitionTime":"2026-03-14T07:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.980820 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.980873 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.980890 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.980911 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:47 crc kubenswrapper[5129]: I0314 07:00:47.980929 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:47Z","lastTransitionTime":"2026-03-14T07:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.063458 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8519-8ade-4cb4-9d40-ddb5867683d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08240fa12d4a763177663f53078814fc63dfd7b6380b488118f464eb3cef8256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d22d9ee2e77a2eda2714189e26eeafd53bd1c9e78e7a365627c323ff58de99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d124c1940bbbc2625b57188d7eaafc0fa13d4366961c1870f336f9779d968f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b1dcae08e7a810e4cd9309ac9aa20cf694a33bac6ad060d64909c27704c7ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e517883a289ad10a8078720d229aab5e813291e72e307d8e6e837c05da74ff93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-14T06:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.083293 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.083336 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.083348 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.083366 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.083381 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:48Z","lastTransitionTime":"2026-03-14T07:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.083834 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z 
is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.096560 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.113937 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.129441 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.153253 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.171528 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.186137 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 
07:00:48.186182 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.186193 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.186208 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.186219 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:48Z","lastTransitionTime":"2026-03-14T07:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.196334 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"message\\\":\\\"ntity/network-node-identity-vrzqb in node crc\\\\nI0314 07:00:30.608046 7050 
metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0314 07:00:30.607340 7050 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-h6665\\\\nI0314 07:00:30.608058 7050 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0314 07:00:30.608063 7050 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-h6665 in node crc\\\\nI0314 07:00:30.608065 7050 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:30.607282 7050 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-866b9\\\\nI0314 07:00:30.608079 7050 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-866b9\\\\nI0314 07:00:30.608088 7050 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-866b9 in node crc\\\\nI0314 07:00:30.608095 7050 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-866b9 after 0 failed attempt(s)\\\\nF0314 07:00:30.608100 7050 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.211284 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.226349 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.242482 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.258705 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.273246 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.282358 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffc61f17-7577-4872-ad5d-7b33780d3d21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l2tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc 
kubenswrapper[5129]: I0314 07:00:48.290272 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.290320 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.290331 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.290351 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.290364 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:48Z","lastTransitionTime":"2026-03-14T07:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.293544 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.303460 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.312724 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d20
84179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.336327 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovnkube-controller/2.log" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.337109 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovnkube-controller/1.log" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.339223 5129 
generic.go:334] "Generic (PLEG): container finished" podID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerID="1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81" exitCode=1 Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.339262 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerDied","Data":"1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81"} Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.339299 5129 scope.go:117] "RemoveContainer" containerID="28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.340040 5129 scope.go:117] "RemoveContainer" containerID="1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81" Mar 14 07:00:48 crc kubenswrapper[5129]: E0314 07:00:48.340245 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.354202 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.364388 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.376679 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d
007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.392867 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.392918 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.392932 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.392950 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.392963 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:48Z","lastTransitionTime":"2026-03-14T07:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.402186 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8519-8ade-4cb4-9d40-ddb5867683d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08240fa12d4a763177663f53078814fc63dfd7b6380b488118f464eb3cef8256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d22d9ee2e77a2eda2714189e26eeafd53bd1c9e78e7a365627c323ff58de99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d124c1940bbbc2625b57188d7eaafc0fa13d4366961c1870f336f9779d968f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b1dcae08e7a810e4cd9309ac9aa20cf694a33bac6ad060d64909c27704c7ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e517883a289ad10a8078720d229aab5e813291e72e307d8e6e837c05da74ff93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-14T06:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.414977 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.425252 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.436365 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.454397 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7d47be841ec02a017334c9b4f07c9318931ec36fb1eec9a2932651050f9a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:30Z\\\",\\\"message\\\":\\\"ntity/network-node-identity-vrzqb in node crc\\\\nI0314 07:00:30.608046 7050 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0314 07:00:30.607340 7050 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-h6665\\\\nI0314 07:00:30.608058 7050 obj_retry.go:386] Retry 
successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0314 07:00:30.608063 7050 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-h6665 in node crc\\\\nI0314 07:00:30.608065 7050 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:30.607282 7050 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-866b9\\\\nI0314 07:00:30.608079 7050 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-866b9\\\\nI0314 07:00:30.608088 7050 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-866b9 in node crc\\\\nI0314 07:00:30.608095 7050 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-866b9 after 0 failed attempt(s)\\\\nF0314 07:00:30.608100 7050 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:47Z\\\",\\\"message\\\":\\\"rator LB template configs for network=default: []services.lbConfig(nil)\\\\nI0314 07:00:47.811328 7316 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:47.811486 7316 services_controller.go:356] Processing sync for service openshift-route-controller-manager/route-controller-manager for network=default\\\\nI0314 07:00:47.811866 7316 services_controller.go:434] Service openshift-config-operator/metrics retrieved from lister for network=default: 
\\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007031e9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\
":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.472006 5129 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60
c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.483794 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.495147 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.495407 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.495518 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.495634 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.495726 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:48Z","lastTransitionTime":"2026-03-14T07:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.524623 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.537375 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffc61f17-7577-4872-ad5d-7b33780d3d21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l2tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc 
kubenswrapper[5129]: I0314 07:00:48.556338 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.569814 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.580551 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.593948 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.597707 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.597759 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.597776 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.597799 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.597818 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:48Z","lastTransitionTime":"2026-03-14T07:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.605548 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.700217 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:48 crc 
kubenswrapper[5129]: I0314 07:00:48.700263 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.700291 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.700307 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.700317 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:48Z","lastTransitionTime":"2026-03-14T07:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.803296 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.803354 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.803366 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.803382 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.803395 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:48Z","lastTransitionTime":"2026-03-14T07:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.862485 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.862541 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.862550 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.862586 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.862630 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:48Z","lastTransitionTime":"2026-03-14T07:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:48 crc kubenswrapper[5129]: E0314 07:00:48.877881 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.881021 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.881061 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.881072 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.881091 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.881102 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:48Z","lastTransitionTime":"2026-03-14T07:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:48 crc kubenswrapper[5129]: E0314 07:00:48.899331 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.903218 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.903326 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.903388 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.903459 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.903529 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:48Z","lastTransitionTime":"2026-03-14T07:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:48 crc kubenswrapper[5129]: E0314 07:00:48.919434 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.923400 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.923495 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.923532 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.923553 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.923565 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:48Z","lastTransitionTime":"2026-03-14T07:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:48 crc kubenswrapper[5129]: E0314 07:00:48.941066 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.944712 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.944894 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.944971 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.945050 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.945107 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:48Z","lastTransitionTime":"2026-03-14T07:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:48 crc kubenswrapper[5129]: E0314 07:00:48.957906 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:48Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:48 crc kubenswrapper[5129]: E0314 07:00:48.958061 5129 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.959948 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.959981 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.959991 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.960006 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:48 crc kubenswrapper[5129]: I0314 07:00:48.960015 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:48Z","lastTransitionTime":"2026-03-14T07:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.035977 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.036067 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.036192 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.036191 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.036429 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.036572 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.036703 5129 scope.go:117] "RemoveContainer" containerID="846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee" Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.036771 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.036904 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.063425 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.063468 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.063483 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.063503 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.063519 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:49Z","lastTransitionTime":"2026-03-14T07:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.153540 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs\") pod \"network-metrics-daemon-l2tzv\" (UID: \"ffc61f17-7577-4872-ad5d-7b33780d3d21\") " pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.153725 5129 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.153806 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs podName:ffc61f17-7577-4872-ad5d-7b33780d3d21 nodeName:}" failed. No retries permitted until 2026-03-14 07:01:05.15378332 +0000 UTC m=+127.905698524 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs") pod "network-metrics-daemon-l2tzv" (UID: "ffc61f17-7577-4872-ad5d-7b33780d3d21") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.165736 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.165779 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.165791 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.165809 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.165821 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:49Z","lastTransitionTime":"2026-03-14T07:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.268577 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.268622 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.268631 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.268644 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.268653 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:49Z","lastTransitionTime":"2026-03-14T07:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.343770 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovnkube-controller/2.log" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.346796 5129 scope.go:117] "RemoveContainer" containerID="1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81" Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.346915 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.347639 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.349572 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee"} Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.350010 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.360307 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.370662 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.370680 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.370689 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.370700 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.370708 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:49Z","lastTransitionTime":"2026-03-14T07:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.372418 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.379892 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffc61f17-7577-4872-ad5d-7b33780d3d21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l2tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc 
kubenswrapper[5129]: I0314 07:00:49.387170 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.397830 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.409335 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.420702 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.429820 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.441769 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d
007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.458951 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8519-8ade-4cb4-9d40-ddb5867683d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08240fa12d4a763177663f53078814fc63dfd7b6380b488118f464eb3cef8256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d22d9ee2e77a2eda2714189e26eeafd53bd1c9e78e7a365627c323ff58de99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d124c1940bbbc2625b57188d7eaafc0fa13d4366961c1870f336f9779d968f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b1dcae08e7a810e4cd9309ac9aa20cf694a33bac6ad060d64909c27704c7ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e517883a289ad10a8078720d229aab5e813291e72e307d8e6e837c05da74ff93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca948
6bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.469147 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.472670 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:49 crc 
kubenswrapper[5129]: I0314 07:00:49.472695 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.472704 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.472716 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.472725 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:49Z","lastTransitionTime":"2026-03-14T07:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.481216 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.494705 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.509729 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.523230 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.542865 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:47Z\\\",\\\"message\\\":\\\"rator LB template configs for network=default: []services.lbConfig(nil)\\\\nI0314 07:00:47.811328 7316 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:47.811486 7316 services_controller.go:356] Processing sync for service 
openshift-route-controller-manager/route-controller-manager for network=default\\\\nI0314 07:00:47.811866 7316 services_controller.go:434] Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007031e9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f0
09550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.555667 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba
4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.565499 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.575811 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.575841 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.575951 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.575963 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.575980 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.575991 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:49Z","lastTransitionTime":"2026-03-14T07:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.586537 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.603674 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8519-8ade-4cb4-9d40-ddb5867683d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08240fa12d4a763177663f53078814fc63dfd7b6380b488118f464eb3cef8256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d22d9ee2e77a2eda2714189e26eeafd53bd1c9e78e7a365627c323ff58de99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d124c1940bbbc2625b57188d7eaafc0fa13d4366961c1870f336f9779d968f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b1dcae08e7a810e4cd9309ac9aa20cf694a33bac6ad060d64909c27704c7ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e517883a289ad10a8078720d229aab5e813291e72e307d8e6e837c05da74ff93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.616362 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.625724 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a
45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.641830 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.654866 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.658905 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.659065 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:01:21.659045185 +0000 UTC m=+144.410960379 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.659135 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.659162 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.659258 5129 
configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.659307 5129 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.659321 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:01:21.659307704 +0000 UTC m=+144.411222888 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.659344 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:01:21.659334395 +0000 UTC m=+144.411249589 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.674505 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:47Z\\\",\\\"message\\\":\\\"rator LB template configs for network=default: []services.lbConfig(nil)\\\\nI0314 07:00:47.811328 7316 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:47.811486 7316 services_controller.go:356] Processing sync for service 
openshift-route-controller-manager/route-controller-manager for network=default\\\\nI0314 07:00:47.811866 7316 services_controller.go:434] Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007031e9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f0
09550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.678750 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.678816 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.678835 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.678859 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.678875 5129 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:49Z","lastTransitionTime":"2026-03-14T07:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.690979 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.708090 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.720178 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e180
1ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.736173 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffc61f17-7577-4872-ad5d-7b33780d3d21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l2tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc 
kubenswrapper[5129]: I0314 07:00:49.751723 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.760153 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.760234 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.760348 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.760382 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.760398 5129 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.760461 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 07:01:21.760437201 +0000 UTC m=+144.512352405 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.760352 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.760495 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.760507 5129 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:49 crc kubenswrapper[5129]: E0314 07:00:49.760542 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 07:01:21.760530144 +0000 UTC m=+144.512445338 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.764549 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.782091 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.782170 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.782197 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.782228 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.782251 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:49Z","lastTransitionTime":"2026-03-14T07:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.783232 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.801882 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:49Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.884276 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.884323 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.884333 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.884352 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.884362 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:49Z","lastTransitionTime":"2026-03-14T07:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.986835 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.986868 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.986879 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.986893 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:49 crc kubenswrapper[5129]: I0314 07:00:49.986902 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:49Z","lastTransitionTime":"2026-03-14T07:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.089709 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.089783 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.089801 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.089825 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.089843 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:50Z","lastTransitionTime":"2026-03-14T07:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.192155 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.192207 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.192224 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.192246 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.192261 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:50Z","lastTransitionTime":"2026-03-14T07:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.294576 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.294652 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.294665 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.294687 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.294698 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:50Z","lastTransitionTime":"2026-03-14T07:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.398185 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.398243 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.398252 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.398269 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.398280 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:50Z","lastTransitionTime":"2026-03-14T07:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.500076 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.500133 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.500144 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.500163 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.500174 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:50Z","lastTransitionTime":"2026-03-14T07:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.602167 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.602208 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.602216 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.602230 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.602241 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:50Z","lastTransitionTime":"2026-03-14T07:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.705035 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.705095 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.705112 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.705132 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.705147 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:50Z","lastTransitionTime":"2026-03-14T07:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.807970 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.808015 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.808027 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.808043 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.808055 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:50Z","lastTransitionTime":"2026-03-14T07:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.911206 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.911247 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.911258 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.911274 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:50 crc kubenswrapper[5129]: I0314 07:00:50.911286 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:50Z","lastTransitionTime":"2026-03-14T07:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.014487 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.014537 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.014552 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.014575 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.014591 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:51Z","lastTransitionTime":"2026-03-14T07:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.036524 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.036531 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:51 crc kubenswrapper[5129]: E0314 07:00:51.036662 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.036540 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:51 crc kubenswrapper[5129]: E0314 07:00:51.036729 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.036530 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:51 crc kubenswrapper[5129]: E0314 07:00:51.036793 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:51 crc kubenswrapper[5129]: E0314 07:00:51.036842 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.117855 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.117910 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.117927 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.117987 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.118038 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:51Z","lastTransitionTime":"2026-03-14T07:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.221391 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.221477 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.221495 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.221517 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.221562 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:51Z","lastTransitionTime":"2026-03-14T07:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.325674 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.325724 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.325736 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.325756 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.325769 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:51Z","lastTransitionTime":"2026-03-14T07:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.427864 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.427907 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.427918 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.427934 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.427945 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:51Z","lastTransitionTime":"2026-03-14T07:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.530146 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.530194 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.530203 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.530217 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.530225 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:51Z","lastTransitionTime":"2026-03-14T07:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.632720 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.632773 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.632782 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.632796 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.632806 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:51Z","lastTransitionTime":"2026-03-14T07:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.736290 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.736347 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.736381 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.736401 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.736415 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:51Z","lastTransitionTime":"2026-03-14T07:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.839571 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.839677 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.839700 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.839778 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.839805 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:51Z","lastTransitionTime":"2026-03-14T07:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.942502 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.942688 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.942751 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.942783 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:51 crc kubenswrapper[5129]: I0314 07:00:51.942850 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:51Z","lastTransitionTime":"2026-03-14T07:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.045516 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.045574 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.045592 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.045647 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.045665 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:52Z","lastTransitionTime":"2026-03-14T07:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.148349 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.148397 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.148413 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.148436 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.148452 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:52Z","lastTransitionTime":"2026-03-14T07:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.250506 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.250584 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.250634 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.250662 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.250688 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:52Z","lastTransitionTime":"2026-03-14T07:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.353295 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.353351 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.353367 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.353390 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.353415 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:52Z","lastTransitionTime":"2026-03-14T07:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.456876 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.456930 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.456947 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.456968 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.456984 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:52Z","lastTransitionTime":"2026-03-14T07:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.559710 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.559774 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.559796 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.559827 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.559944 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:52Z","lastTransitionTime":"2026-03-14T07:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.663310 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.663396 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.663423 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.663452 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.663473 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:52Z","lastTransitionTime":"2026-03-14T07:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.766700 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.766774 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.766796 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.766824 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.766845 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:52Z","lastTransitionTime":"2026-03-14T07:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.869483 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.869513 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.869523 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.869536 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.869546 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:52Z","lastTransitionTime":"2026-03-14T07:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.971259 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.971312 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.971329 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.971350 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:52 crc kubenswrapper[5129]: I0314 07:00:52.971368 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:52Z","lastTransitionTime":"2026-03-14T07:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.036398 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.036441 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.036453 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.036409 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:53 crc kubenswrapper[5129]: E0314 07:00:53.036593 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:53 crc kubenswrapper[5129]: E0314 07:00:53.036758 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:53 crc kubenswrapper[5129]: E0314 07:00:53.036832 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:00:53 crc kubenswrapper[5129]: E0314 07:00:53.036910 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.073710 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.073784 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.073801 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.073825 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.073841 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:53Z","lastTransitionTime":"2026-03-14T07:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.176412 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.176470 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.176490 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.176514 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.176532 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:53Z","lastTransitionTime":"2026-03-14T07:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.278925 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.278974 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.278984 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.279002 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.279013 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:53Z","lastTransitionTime":"2026-03-14T07:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.381850 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.381892 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.381909 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.381927 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.381938 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:53Z","lastTransitionTime":"2026-03-14T07:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.484442 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.484501 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.484519 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.484542 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.484560 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:53Z","lastTransitionTime":"2026-03-14T07:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.587667 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.587727 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.587741 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.587763 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.587779 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:53Z","lastTransitionTime":"2026-03-14T07:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.690117 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.690159 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.690175 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.690191 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.690201 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:53Z","lastTransitionTime":"2026-03-14T07:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.792260 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.792308 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.792324 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.792341 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.792352 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:53Z","lastTransitionTime":"2026-03-14T07:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.895127 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.895186 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.895196 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.895210 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.895220 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:53Z","lastTransitionTime":"2026-03-14T07:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.998412 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.998491 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.998517 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.998563 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:53 crc kubenswrapper[5129]: I0314 07:00:53.998636 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:53Z","lastTransitionTime":"2026-03-14T07:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.101497 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.101579 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.101590 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.101629 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.101643 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:54Z","lastTransitionTime":"2026-03-14T07:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.204689 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.204765 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.204789 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.204821 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.204868 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:54Z","lastTransitionTime":"2026-03-14T07:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.308089 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.308159 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.308182 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.308210 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.308231 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:54Z","lastTransitionTime":"2026-03-14T07:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.411281 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.411312 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.411323 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.411339 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.411348 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:54Z","lastTransitionTime":"2026-03-14T07:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.513972 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.514042 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.514064 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.514093 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.514115 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:54Z","lastTransitionTime":"2026-03-14T07:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.616699 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.616761 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.616776 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.616798 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.616815 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:54Z","lastTransitionTime":"2026-03-14T07:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.719582 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.719653 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.719663 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.719678 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.719689 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:54Z","lastTransitionTime":"2026-03-14T07:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.822441 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.822487 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.822496 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.822516 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.822525 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:54Z","lastTransitionTime":"2026-03-14T07:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.924841 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.924875 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.924886 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.924901 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:54 crc kubenswrapper[5129]: I0314 07:00:54.924912 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:54Z","lastTransitionTime":"2026-03-14T07:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.027317 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.027372 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.027385 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.027402 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.027420 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:55Z","lastTransitionTime":"2026-03-14T07:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.035707 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.035743 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.035767 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.035728 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:55 crc kubenswrapper[5129]: E0314 07:00:55.035874 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:55 crc kubenswrapper[5129]: E0314 07:00:55.035982 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:00:55 crc kubenswrapper[5129]: E0314 07:00:55.036037 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:55 crc kubenswrapper[5129]: E0314 07:00:55.036128 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.130053 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.130107 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.130122 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.130143 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.130159 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:55Z","lastTransitionTime":"2026-03-14T07:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.233096 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.233141 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.233178 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.233194 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.233205 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:55Z","lastTransitionTime":"2026-03-14T07:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.336104 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.336164 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.336178 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.336195 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.336209 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:55Z","lastTransitionTime":"2026-03-14T07:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.439345 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.439419 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.439453 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.439482 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.439504 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:55Z","lastTransitionTime":"2026-03-14T07:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.542768 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.542848 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.542872 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.542901 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.542921 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:55Z","lastTransitionTime":"2026-03-14T07:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.645921 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.645977 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.645994 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.646016 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.646032 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:55Z","lastTransitionTime":"2026-03-14T07:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.749317 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.749380 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.749402 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.749429 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.749454 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:55Z","lastTransitionTime":"2026-03-14T07:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.852735 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.852808 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.852829 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.852856 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.852877 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:55Z","lastTransitionTime":"2026-03-14T07:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.956408 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.956469 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.956486 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.956508 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:55 crc kubenswrapper[5129]: I0314 07:00:55.956524 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:55Z","lastTransitionTime":"2026-03-14T07:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.059471 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.059534 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.059551 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.059575 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.059595 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:56Z","lastTransitionTime":"2026-03-14T07:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.162982 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.163061 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.163072 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.163086 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.163114 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:56Z","lastTransitionTime":"2026-03-14T07:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.266427 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.266503 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.266521 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.266970 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.267030 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:56Z","lastTransitionTime":"2026-03-14T07:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.370043 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.370099 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.370110 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.370145 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.370156 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:56Z","lastTransitionTime":"2026-03-14T07:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.472895 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.472986 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.473035 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.473063 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.473080 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:56Z","lastTransitionTime":"2026-03-14T07:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.575991 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.576041 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.576052 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.576070 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.576083 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:56Z","lastTransitionTime":"2026-03-14T07:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.679738 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.679818 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.679836 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.679865 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.679889 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:56Z","lastTransitionTime":"2026-03-14T07:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.782516 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.782588 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.782647 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.782678 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.782695 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:56Z","lastTransitionTime":"2026-03-14T07:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.886749 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.886824 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.886841 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.886869 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.886889 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:56Z","lastTransitionTime":"2026-03-14T07:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.990799 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.990937 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.990961 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.991033 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:56 crc kubenswrapper[5129]: I0314 07:00:56.991062 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:56Z","lastTransitionTime":"2026-03-14T07:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.036117 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.036171 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.036200 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.036117 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:57 crc kubenswrapper[5129]: E0314 07:00:57.036382 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:57 crc kubenswrapper[5129]: E0314 07:00:57.036522 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:57 crc kubenswrapper[5129]: E0314 07:00:57.036718 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:57 crc kubenswrapper[5129]: E0314 07:00:57.036823 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.095210 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.095294 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.095315 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.095346 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.095370 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:57Z","lastTransitionTime":"2026-03-14T07:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.198102 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.198169 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.198186 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.198210 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.198230 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:57Z","lastTransitionTime":"2026-03-14T07:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.301114 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.301164 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.301174 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.301190 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.301202 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:57Z","lastTransitionTime":"2026-03-14T07:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.403482 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.403543 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.403554 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.403571 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.403588 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:57Z","lastTransitionTime":"2026-03-14T07:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.506027 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.506071 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.506081 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.506094 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.506104 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:57Z","lastTransitionTime":"2026-03-14T07:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.609029 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.609132 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.609151 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.609178 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.609198 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:57Z","lastTransitionTime":"2026-03-14T07:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.711228 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.711270 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.711282 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.711302 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.711313 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:57Z","lastTransitionTime":"2026-03-14T07:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.814123 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.814162 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.814170 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.814191 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.814201 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:57Z","lastTransitionTime":"2026-03-14T07:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.916734 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.916796 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.916807 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.916821 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:57 crc kubenswrapper[5129]: I0314 07:00:57.916832 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:57Z","lastTransitionTime":"2026-03-14T07:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:58 crc kubenswrapper[5129]: E0314 07:00:58.017932 5129 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 14 07:00:58 crc kubenswrapper[5129]: I0314 07:00:58.061573 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8519-8ade-4cb4-9d40-ddb5867683d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08240fa12d4a763177663f53078814fc63dfd7b6380b488118f464eb3cef8256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\
\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d22d9ee2e77a2eda2714189e26eeafd53bd1c9e78e7a365627c323ff58de99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d124c1940bbbc2625b57188d7eaafc0fa13d4366961c1870f336f9779d968f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b1dcae08e7a810e4cd9309ac9aa20cf694a33bac6ad060d64909c27704c7ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e517883a289ad10a8078720d229aab5e813291e72e307d8e6e837c05da74ff93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a044468
6b16799089f2ca01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:58Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:58 crc kubenswrapper[5129]: I0314 07:00:58.080875 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:58Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:58 crc kubenswrapper[5129]: I0314 07:00:58.098782 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:58Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:58 crc kubenswrapper[5129]: I0314 07:00:58.116941 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:58Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:58 crc kubenswrapper[5129]: I0314 07:00:58.130460 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:58Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:58 crc kubenswrapper[5129]: I0314 07:00:58.145161 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:58Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:58 crc kubenswrapper[5129]: I0314 07:00:58.157624 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:58Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:58 crc kubenswrapper[5129]: E0314 07:00:58.161705 5129 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 07:00:58 crc kubenswrapper[5129]: I0314 07:00:58.177038 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:47Z\\\",\\\"message\\\":\\\"rator LB template configs for network=default: []services.lbConfig(nil)\\\\nI0314 07:00:47.811328 7316 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:47.811486 7316 services_controller.go:356] Processing sync for service 
openshift-route-controller-manager/route-controller-manager for network=default\\\\nI0314 07:00:47.811866 7316 services_controller.go:434] Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007031e9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f0
09550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:58Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:58 crc kubenswrapper[5129]: I0314 07:00:58.186831 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:58Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:58 crc kubenswrapper[5129]: I0314 07:00:58.199503 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:58Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:58 crc kubenswrapper[5129]: I0314 07:00:58.211822 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:58Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:58 crc kubenswrapper[5129]: I0314 07:00:58.224566 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:58Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:58 crc kubenswrapper[5129]: I0314 07:00:58.233399 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T07:00:58Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:58 crc kubenswrapper[5129]: I0314 07:00:58.242596 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffc61f17-7577-4872-ad5d-7b33780d3d21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l2tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:58Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:58 crc 
kubenswrapper[5129]: I0314 07:00:58.251940 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:58Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:58 crc kubenswrapper[5129]: I0314 07:00:58.264266 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:58Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:58 crc kubenswrapper[5129]: I0314 07:00:58.277577 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d20
84179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:58Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.036408 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.036510 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.036444 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:00:59 crc kubenswrapper[5129]: E0314 07:00:59.036717 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:00:59 crc kubenswrapper[5129]: E0314 07:00:59.036813 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:00:59 crc kubenswrapper[5129]: E0314 07:00:59.036963 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.037378 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:00:59 crc kubenswrapper[5129]: E0314 07:00:59.037711 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.103581 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.103664 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.103681 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.103702 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.103717 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:59Z","lastTransitionTime":"2026-03-14T07:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:59 crc kubenswrapper[5129]: E0314 07:00:59.123092 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:59Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.128102 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.128143 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.128155 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.128172 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.128182 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:59Z","lastTransitionTime":"2026-03-14T07:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:59 crc kubenswrapper[5129]: E0314 07:00:59.144875 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:59Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.149536 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.149650 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.149674 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.149706 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.149732 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:59Z","lastTransitionTime":"2026-03-14T07:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:59 crc kubenswrapper[5129]: E0314 07:00:59.164831 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:59Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.171400 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.171485 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.171517 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.171548 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.171574 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:59Z","lastTransitionTime":"2026-03-14T07:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:59 crc kubenswrapper[5129]: E0314 07:00:59.184761 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:59Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.188218 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.188260 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.188273 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.188290 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:00:59 crc kubenswrapper[5129]: I0314 07:00:59.188302 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:00:59Z","lastTransitionTime":"2026-03-14T07:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:00:59 crc kubenswrapper[5129]: E0314 07:00:59.205720 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:00:59Z is after 2025-08-24T17:21:41Z" Mar 14 07:00:59 crc kubenswrapper[5129]: E0314 07:00:59.205829 5129 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.035851 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:01 crc kubenswrapper[5129]: E0314 07:01:01.036579 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.035950 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:01 crc kubenswrapper[5129]: E0314 07:01:01.036851 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.035923 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:01 crc kubenswrapper[5129]: E0314 07:01:01.037036 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.036098 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:01 crc kubenswrapper[5129]: E0314 07:01:01.037246 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.640927 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.657485 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffc61f17-7577-4872-ad5d-7b33780d3d21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l2tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:01Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:01 crc 
kubenswrapper[5129]: I0314 07:01:01.674242 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:01Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.692964 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:01Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.712531 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:01Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.737798 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:01Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.749934 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T07:01:01Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.762486 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:01Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.773019 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:01Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.785479 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d20
84179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:01Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.807565 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8519-8ade-4cb4-9d40-ddb5867683d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08240fa12d4a763177663f53078814fc63dfd7b6380b488118f464eb3cef8256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d22d9ee2e77a2eda2714189e26eeafd53bd1c9e78e7a365627c323ff58de99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d124c1940bbbc2625b57188d7eaafc0fa13d4366961c1870f336f9779d968f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b1dcae08e7a810e4cd9309ac9aa20cf694a33bac6ad060d64909c27704c7ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e517883a289ad10a8078720d229aab5e813291e72e307d8e6e837c05da74ff93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:01Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.821483 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:01Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.831832 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a
45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:01Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.843094 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:01Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.860194 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:47Z\\\",\\\"message\\\":\\\"rator LB template configs for network=default: []services.lbConfig(nil)\\\\nI0314 07:00:47.811328 7316 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:47.811486 7316 services_controller.go:356] Processing sync for service 
openshift-route-controller-manager/route-controller-manager for network=default\\\\nI0314 07:00:47.811866 7316 services_controller.go:434] Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007031e9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f0
09550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:01Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.872482 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c
60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:01Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.885870 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:01Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:01 crc kubenswrapper[5129]: I0314 07:01:01.897359 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:01Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:03 crc kubenswrapper[5129]: I0314 07:01:03.036254 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:03 crc kubenswrapper[5129]: I0314 07:01:03.036342 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:03 crc kubenswrapper[5129]: E0314 07:01:03.036447 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:03 crc kubenswrapper[5129]: I0314 07:01:03.036506 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:03 crc kubenswrapper[5129]: I0314 07:01:03.036640 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:03 crc kubenswrapper[5129]: E0314 07:01:03.036796 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:03 crc kubenswrapper[5129]: E0314 07:01:03.036877 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:03 crc kubenswrapper[5129]: E0314 07:01:03.036982 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:03 crc kubenswrapper[5129]: E0314 07:01:03.162811 5129 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 07:01:04 crc kubenswrapper[5129]: I0314 07:01:04.036429 5129 scope.go:117] "RemoveContainer" containerID="1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81" Mar 14 07:01:04 crc kubenswrapper[5129]: E0314 07:01:04.036635 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" Mar 14 07:01:05 crc kubenswrapper[5129]: I0314 07:01:05.035585 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:05 crc kubenswrapper[5129]: I0314 07:01:05.035644 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:05 crc kubenswrapper[5129]: I0314 07:01:05.035692 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:05 crc kubenswrapper[5129]: E0314 07:01:05.035846 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:05 crc kubenswrapper[5129]: I0314 07:01:05.035908 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:05 crc kubenswrapper[5129]: E0314 07:01:05.035951 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:05 crc kubenswrapper[5129]: E0314 07:01:05.036102 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:05 crc kubenswrapper[5129]: E0314 07:01:05.036319 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:05 crc kubenswrapper[5129]: I0314 07:01:05.219215 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs\") pod \"network-metrics-daemon-l2tzv\" (UID: \"ffc61f17-7577-4872-ad5d-7b33780d3d21\") " pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:05 crc kubenswrapper[5129]: E0314 07:01:05.219386 5129 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:01:05 crc kubenswrapper[5129]: E0314 07:01:05.219448 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs podName:ffc61f17-7577-4872-ad5d-7b33780d3d21 nodeName:}" failed. No retries permitted until 2026-03-14 07:01:37.21943241 +0000 UTC m=+159.971347594 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs") pod "network-metrics-daemon-l2tzv" (UID: "ffc61f17-7577-4872-ad5d-7b33780d3d21") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.036375 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.036508 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:07 crc kubenswrapper[5129]: E0314 07:01:07.036567 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.036478 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:07 crc kubenswrapper[5129]: E0314 07:01:07.036722 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:07 crc kubenswrapper[5129]: E0314 07:01:07.036893 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.036909 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:07 crc kubenswrapper[5129]: E0314 07:01:07.037017 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.406368 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4btb_e37bb55b-4ace-4d62-9711-88d8a1bb8cd8/kube-multus/0.log" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.406426 5129 generic.go:334] "Generic (PLEG): container finished" podID="e37bb55b-4ace-4d62-9711-88d8a1bb8cd8" containerID="b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828" exitCode=1 Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.406457 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r4btb" event={"ID":"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8","Type":"ContainerDied","Data":"b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828"} Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.406893 5129 scope.go:117] "RemoveContainer" containerID="b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.437575 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:47Z\\\",\\\"message\\\":\\\"rator LB template configs for network=default: []services.lbConfig(nil)\\\\nI0314 07:00:47.811328 7316 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:47.811486 7316 services_controller.go:356] Processing sync for service 
openshift-route-controller-manager/route-controller-manager for network=default\\\\nI0314 07:00:47.811866 7316 services_controller.go:434] Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007031e9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f0
09550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:07Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.461627 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c
60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:07Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.481773 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:07Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.500856 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:07Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.521279 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:07Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.536044 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:07Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.555192 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:07Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.568794 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:07Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.590886 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:07Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.606690 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T07:01:07Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.622729 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffc61f17-7577-4872-ad5d-7b33780d3d21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l2tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:07Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:07 crc 
kubenswrapper[5129]: I0314 07:01:07.639698 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:07Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.655492 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:07Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.670346 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d20
84179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:07Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.689495 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8519-8ade-4cb4-9d40-ddb5867683d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08240fa12d4a763177663f53078814fc63dfd7b6380b488118f464eb3cef8256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d22d9ee2e77a2eda2714189e26eeafd53bd1c9e78e7a365627c323ff58de99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d124c1940bbbc2625b57188d7eaafc0fa13d4366961c1870f336f9779d968f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b1dcae08e7a810e4cd9309ac9aa20cf694a33bac6ad060d64909c27704c7ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e517883a289ad10a8078720d229aab5e813291e72e307d8e6e837c05da74ff93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:07Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.704565 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:01:06Z\\\",\\\"message\\\":\\\"2026-03-14T07:00:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6925a09f-85a6-49ca-98a8-e18b249dfc72\\\\n2026-03-14T07:00:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6925a09f-85a6-49ca-98a8-e18b249dfc72 to /host/opt/cni/bin/\\\\n2026-03-14T07:00:21Z [verbose] multus-daemon started\\\\n2026-03-14T07:00:21Z [verbose] Readiness Indicator file check\\\\n2026-03-14T07:01:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:07Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:07 crc kubenswrapper[5129]: I0314 07:01:07.716219 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:07Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.048168 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.063721 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.078734 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d
007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.108863 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8519-8ade-4cb4-9d40-ddb5867683d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08240fa12d4a763177663f53078814fc63dfd7b6380b488118f464eb3cef8256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d22d9ee2e77a2eda2714189e26eeafd53bd1c9e78e7a365627c323ff58de99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d124c1940bbbc2625b57188d7eaafc0fa13d4366961c1870f336f9779d968f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b1dcae08e7a810e4cd9309ac9aa20cf694a33bac6ad060d64909c27704c7ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e517883a289ad10a8078720d229aab5e813291e72e307d8e6e837c05da74ff93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca948
6bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.123163 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:01:06Z\\\",\\\"message\\\":\\\"2026-03-14T07:00:21+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6925a09f-85a6-49ca-98a8-e18b249dfc72\\\\n2026-03-14T07:00:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6925a09f-85a6-49ca-98a8-e18b249dfc72 to /host/opt/cni/bin/\\\\n2026-03-14T07:00:21Z [verbose] multus-daemon started\\\\n2026-03-14T07:00:21Z [verbose] Readiness Indicator file check\\\\n2026-03-14T07:01:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.135649 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.145937 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: E0314 07:01:08.163545 5129 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.171281 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:47Z\\\",\\\"message\\\":\\\"rator LB template configs for network=default: []services.lbConfig(nil)\\\\nI0314 07:00:47.811328 7316 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:47.811486 7316 services_controller.go:356] Processing sync for service 
openshift-route-controller-manager/route-controller-manager for network=default\\\\nI0314 07:00:47.811866 7316 services_controller.go:434] Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007031e9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f0
09550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.185709 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c
60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.198680 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.214959 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.225959 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffc61f17-7577-4872-ad5d-7b33780d3d21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l2tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc 
kubenswrapper[5129]: I0314 07:01:08.235920 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.250324 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.266875 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.284895 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.301522 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.411951 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4btb_e37bb55b-4ace-4d62-9711-88d8a1bb8cd8/kube-multus/0.log" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.412057 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r4btb" event={"ID":"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8","Type":"ContainerStarted","Data":"a958c6cbc92804a3a1c2ba1bec4777732a39f16f1cce025d4ed6f18a8e488f02"} Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.429398 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffc61f17-7577-4872-ad5d-7b33780d3d21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l2tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc 
kubenswrapper[5129]: I0314 07:01:08.444268 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.457058 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.471534 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.489816 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.503465 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.515768 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.525683 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.536003 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d20
84179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.553424 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8519-8ade-4cb4-9d40-ddb5867683d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08240fa12d4a763177663f53078814fc63dfd7b6380b488118f464eb3cef8256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d22d9ee2e77a2eda2714189e26eeafd53bd1c9e78e7a365627c323ff58de99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d124c1940bbbc2625b57188d7eaafc0fa13d4366961c1870f336f9779d968f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b1dcae08e7a810e4cd9309ac9aa20cf694a33bac6ad060d64909c27704c7ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e517883a289ad10a8078720d229aab5e813291e72e307d8e6e837c05da74ff93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.567293 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a958c6cbc92804a3a1c2ba1bec4777732a39f16f1cce025d4ed6f18a8e488f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:01:06Z\\\",\\\"message\\\":\\\"2026-03-14T07:00:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6925a09f-85a6-49ca-98a8-e18b249dfc72\\\\n2026-03-14T07:00:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6925a09f-85a6-49ca-98a8-e18b249dfc72 to /host/opt/cni/bin/\\\\n2026-03-14T07:00:21Z [verbose] multus-daemon started\\\\n2026-03-14T07:00:21Z [verbose] Readiness Indicator file check\\\\n2026-03-14T07:01:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.579450 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.593121 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.620018 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:47Z\\\",\\\"message\\\":\\\"rator LB template configs for network=default: []services.lbConfig(nil)\\\\nI0314 07:00:47.811328 7316 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:47.811486 7316 services_controller.go:356] Processing sync for service 
openshift-route-controller-manager/route-controller-manager for network=default\\\\nI0314 07:00:47.811866 7316 services_controller.go:434] Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007031e9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f0
09550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.636385 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c
60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.652576 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:08 crc kubenswrapper[5129]: I0314 07:01:08.667305 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:08Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.035976 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.036002 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.036038 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:09 crc kubenswrapper[5129]: E0314 07:01:09.036100 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.036143 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:09 crc kubenswrapper[5129]: E0314 07:01:09.036257 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:09 crc kubenswrapper[5129]: E0314 07:01:09.036313 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:09 crc kubenswrapper[5129]: E0314 07:01:09.036361 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.237721 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.237789 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.237801 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.237819 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.237833 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:01:09Z","lastTransitionTime":"2026-03-14T07:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:01:09 crc kubenswrapper[5129]: E0314 07:01:09.249511 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:09Z is after 2025-08-24T17:21:41Z"
Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.253206 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.253238 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.253246 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.253259 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.253269 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:01:09Z","lastTransitionTime":"2026-03-14T07:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:01:09 crc kubenswrapper[5129]: E0314 07:01:09.264650 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:09Z is after 2025-08-24T17:21:41Z"
Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.267767 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.267822 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.267831 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.267844 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.267853 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:01:09Z","lastTransitionTime":"2026-03-14T07:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:01:09 crc kubenswrapper[5129]: E0314 07:01:09.283710 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:09Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.287438 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.287487 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.287502 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.287519 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.287531 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:01:09Z","lastTransitionTime":"2026-03-14T07:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:01:09 crc kubenswrapper[5129]: E0314 07:01:09.298847 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:09Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.302275 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.302310 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.302320 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.302333 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:01:09 crc kubenswrapper[5129]: I0314 07:01:09.302342 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:01:09Z","lastTransitionTime":"2026-03-14T07:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:01:09 crc kubenswrapper[5129]: E0314 07:01:09.315053 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:09Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:09 crc kubenswrapper[5129]: E0314 07:01:09.315215 5129 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 07:01:11 crc kubenswrapper[5129]: I0314 07:01:11.036030 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:11 crc kubenswrapper[5129]: I0314 07:01:11.036129 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:11 crc kubenswrapper[5129]: E0314 07:01:11.036145 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:11 crc kubenswrapper[5129]: I0314 07:01:11.036177 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:11 crc kubenswrapper[5129]: E0314 07:01:11.036293 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:11 crc kubenswrapper[5129]: I0314 07:01:11.036319 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:11 crc kubenswrapper[5129]: E0314 07:01:11.036407 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:11 crc kubenswrapper[5129]: E0314 07:01:11.036458 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:11 crc kubenswrapper[5129]: I0314 07:01:11.045665 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 14 07:01:11 crc kubenswrapper[5129]: I0314 07:01:11.046144 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 14 07:01:13 crc kubenswrapper[5129]: I0314 07:01:13.036104 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:13 crc kubenswrapper[5129]: I0314 07:01:13.036341 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:13 crc kubenswrapper[5129]: I0314 07:01:13.036592 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:13 crc kubenswrapper[5129]: E0314 07:01:13.036674 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:13 crc kubenswrapper[5129]: E0314 07:01:13.036409 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:13 crc kubenswrapper[5129]: I0314 07:01:13.036545 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:13 crc kubenswrapper[5129]: E0314 07:01:13.036895 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:13 crc kubenswrapper[5129]: E0314 07:01:13.036955 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:13 crc kubenswrapper[5129]: E0314 07:01:13.164492 5129 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 07:01:15 crc kubenswrapper[5129]: I0314 07:01:15.035481 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:15 crc kubenswrapper[5129]: I0314 07:01:15.035563 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:15 crc kubenswrapper[5129]: I0314 07:01:15.035563 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:15 crc kubenswrapper[5129]: E0314 07:01:15.035650 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:15 crc kubenswrapper[5129]: E0314 07:01:15.035769 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:15 crc kubenswrapper[5129]: E0314 07:01:15.035830 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:15 crc kubenswrapper[5129]: I0314 07:01:15.035860 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:15 crc kubenswrapper[5129]: E0314 07:01:15.036065 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:17 crc kubenswrapper[5129]: I0314 07:01:17.036090 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:17 crc kubenswrapper[5129]: I0314 07:01:17.036175 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:17 crc kubenswrapper[5129]: I0314 07:01:17.036207 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:17 crc kubenswrapper[5129]: I0314 07:01:17.036125 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:17 crc kubenswrapper[5129]: E0314 07:01:17.036235 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:17 crc kubenswrapper[5129]: E0314 07:01:17.036285 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:17 crc kubenswrapper[5129]: E0314 07:01:17.036351 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:17 crc kubenswrapper[5129]: E0314 07:01:17.036449 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.038018 5129 scope.go:117] "RemoveContainer" containerID="1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.056189 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.068054 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.080530 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffc61f17-7577-4872-ad5d-7b33780d3d21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l2tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc 
kubenswrapper[5129]: I0314 07:01:18.093761 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.106362 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.122144 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.134716 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.145762 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.157466 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d
007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: E0314 07:01:18.165278 5129 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.174394 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8519-8ade-4cb4-9d40-ddb5867683d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08240fa12d4a763177663f53078814fc63dfd7b6380b488118f464eb3cef8256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a564
6fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d22d9ee2e77a2eda2714189e26eeafd53bd1c9e78e7a365627c323ff58de99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d124c1940bbbc2625b57188d7eaafc0fa13d4366961c1870f336f9779d968f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b1dcae08e7a810e4cd9309ac9aa20cf694a33bac6ad060d64909c27704c7ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e517883a289ad10a8078720d229aab5e813291e72e307d8e6e837c05da74ff93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294
233a0444686b16799089f2ca01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.185732 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a958c6cbc92804a3a1c2ba1bec4777732a39f16f1cce025d4ed6f18a8e488f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:01:06Z\\\",\\\"message\\\":\\\"2026-03-14T07:00:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6925a09f-85a6-49ca-98a8-e18b249dfc72\\\\n2026-03-14T07:00:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6925a09f-85a6-49ca-98a8-e18b249dfc72 to /host/opt/cni/bin/\\\\n2026-03-14T07:00:21Z [verbose] multus-daemon started\\\\n2026-03-14T07:00:21Z [verbose] 
Readiness Indicator file check\\\\n2026-03-14T07:01:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.195164 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b0
3897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.207510 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.219996 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.233082 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.255366 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:47Z\\\",\\\"message\\\":\\\"rator LB template configs for network=default: []services.lbConfig(nil)\\\\nI0314 07:00:47.811328 7316 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:47.811486 7316 services_controller.go:356] Processing sync for service 
openshift-route-controller-manager/route-controller-manager for network=default\\\\nI0314 07:00:47.811866 7316 services_controller.go:434] Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007031e9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f0
09550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.271933 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c
60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.285843 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6623d6f5-a916-41d7-aec4-09da71c1fa91\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4933abe33b7df00e5183eceab4fa4714af0304d9910886fb723e907c75dcd018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d84344093736ed5f4571f4a04217738adc1db9b27b3a238149dfcd03aa6725b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:30Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0314 06:59:00.044096 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0314 06:59:00.046837 1 observer_polling.go:159] Starting file observer\\\\nI0314 06:59:00.083448 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0314 06:59:00.087238 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0314 06:59:30.349556 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0314 06:59:30.349674 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:29Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6c2f9f77fd61eea301a8bb0388b078488c80453835df5e3ca468d99fe97193c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7235ddb1c63b5f61521cf2e92d0070dc5c8d6f9899565129aa993711c1715caf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf7436ce9438c258552ab9d2cdb6fc34a3f0369aa4151691f33d5e9997208bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.304837 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"433c295d-49dc-47f2-b715-eb10c8c16f13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c928fbb323aa809b60e546f9ab99a82ad8ea28ae853a828483412b8a3175f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5e09d308061549f8c3183d7885644cd785cc8769880e0643f2bdc30e8a361c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://579eedbe1341c93457c7fbd71eda29dcd1172371231300b16d9319491196cc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606d6d86bf5050324c43cf977dae10a46e1e8f28346febf07a950544ee4f2ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://606d6d86bf5050324c43cf977dae10a46e1e8f28346febf07a950544ee4f2ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.446199 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovnkube-controller/2.log" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.448183 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerStarted","Data":"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297"} Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.448918 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.460354 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.491507 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.509458 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d
007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.531967 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8519-8ade-4cb4-9d40-ddb5867683d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08240fa12d4a763177663f53078814fc63dfd7b6380b488118f464eb3cef8256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d22d9ee2e77a2eda2714189e26eeafd53bd1c9e78e7a365627c323ff58de99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d124c1940bbbc2625b57188d7eaafc0fa13d4366961c1870f336f9779d968f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b1dcae08e7a810e4cd9309ac9aa20cf694a33bac6ad060d64909c27704c7ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e517883a289ad10a8078720d229aab5e813291e72e307d8e6e837c05da74ff93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca948
6bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.543747 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a958c6cbc92804a3a1c2ba1bec4777732a39f16f1cce025d4ed6f18a8e488f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:01:06Z\\\",\\\"message\\\":\\\"2026-03-14T07:00:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6925a09f-85a6-49ca-98a8-e18b249dfc72\\\\n2026-03-14T07:00:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6925a09f-85a6-49ca-98a8-e18b249dfc72 to /host/opt/cni/bin/\\\\n2026-03-14T07:00:21Z [verbose] multus-daemon started\\\\n2026-03-14T07:00:21Z [verbose] 
Readiness Indicator file check\\\\n2026-03-14T07:01:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.553338 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b0
3897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.564865 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c
60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.575204 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6623d6f5-a916-41d7-aec4-09da71c1fa91\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4933abe33b7df00e5183eceab4fa4714af0304d9910886fb723e907c75dcd018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d84344093736ed5f4571f4a04217738adc1db9b27b3a238149dfcd03aa6725b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:30Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0314 06:59:00.044096 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0314 06:59:00.046837 1 observer_polling.go:159] Starting file observer\\\\nI0314 06:59:00.083448 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0314 06:59:00.087238 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0314 06:59:30.349556 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0314 06:59:30.349674 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:29Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6c2f9f77fd61eea301a8bb0388b078488c80453835df5e3ca468d99fe97193c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7235ddb1c63b5f61521cf2e92d0070dc5c8d6f9899565129aa993711c1715caf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf7436ce9438c258552ab9d2cdb6fc34a3f0369aa4151691f33d5e9997208bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.585710 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"433c295d-49dc-47f2-b715-eb10c8c16f13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c928fbb323aa809b60e546f9ab99a82ad8ea28ae853a828483412b8a3175f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5e09d308061549f8c3183d7885644cd785cc8769880e0643f2bdc30e8a361c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://579eedbe1341c93457c7fbd71eda29dcd1172371231300b16d9319491196cc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606d6d86bf5050324c43cf977dae10a46e1e8f28346febf07a950544ee4f2ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://606d6d86bf5050324c43cf977dae10a46e1e8f28346febf07a950544ee4f2ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.598659 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.609668 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.619326 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.634373 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:47Z\\\",\\\"message\\\":\\\"rator LB template configs for network=default: []services.lbConfig(nil)\\\\nI0314 07:00:47.811328 7316 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:47.811486 7316 services_controller.go:356] Processing sync for service 
openshift-route-controller-manager/route-controller-manager for network=default\\\\nI0314 07:00:47.811866 7316 services_controller.go:434] Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007031e9f \\\\u003cnil\\\\u003e}] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountP
ath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.643378 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.653872 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.663689 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.675487 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.686845 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:18 crc kubenswrapper[5129]: I0314 07:01:18.699621 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffc61f17-7577-4872-ad5d-7b33780d3d21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l2tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:18Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:19 crc 
kubenswrapper[5129]: I0314 07:01:19.036052 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.036130 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.036167 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:19 crc kubenswrapper[5129]: E0314 07:01:19.036227 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.036241 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:19 crc kubenswrapper[5129]: E0314 07:01:19.036318 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:19 crc kubenswrapper[5129]: E0314 07:01:19.036468 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:19 crc kubenswrapper[5129]: E0314 07:01:19.036541 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.630095 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.630138 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.630151 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.630166 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.630176 5129 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:01:19Z","lastTransitionTime":"2026-03-14T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:01:19 crc kubenswrapper[5129]: E0314 07:01:19.641100 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.649537 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.649593 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.649631 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.649665 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.649684 5129 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:01:19Z","lastTransitionTime":"2026-03-14T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:01:19 crc kubenswrapper[5129]: E0314 07:01:19.664226 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.668154 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.668215 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.668227 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.668243 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.668253 5129 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:01:19Z","lastTransitionTime":"2026-03-14T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:01:19 crc kubenswrapper[5129]: E0314 07:01:19.679051 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.683385 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.683421 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.683431 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.683450 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.683464 5129 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:01:19Z","lastTransitionTime":"2026-03-14T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:01:19 crc kubenswrapper[5129]: E0314 07:01:19.694349 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.697940 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.697988 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.697999 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.698019 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:01:19 crc kubenswrapper[5129]: I0314 07:01:19.698033 5129 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:01:19Z","lastTransitionTime":"2026-03-14T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 07:01:19 crc kubenswrapper[5129]: E0314 07:01:19.709476 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:19Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:19 crc kubenswrapper[5129]: E0314 07:01:19.709595 5129 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.458718 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovnkube-controller/3.log" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.460325 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovnkube-controller/2.log" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.469044 5129 generic.go:334] "Generic (PLEG): container finished" podID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" 
containerID="2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297" exitCode=1 Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.469110 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerDied","Data":"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297"} Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.469244 5129 scope.go:117] "RemoveContainer" containerID="1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.470587 5129 scope.go:117] "RemoveContainer" containerID="2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297" Mar 14 07:01:20 crc kubenswrapper[5129]: E0314 07:01:20.470932 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.489215 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c
60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.509690 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6623d6f5-a916-41d7-aec4-09da71c1fa91\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4933abe33b7df00e5183eceab4fa4714af0304d9910886fb723e907c75dcd018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d84344093736ed5f4571f4a04217738adc1db9b27b3a238149dfcd03aa6725b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:30Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0314 06:59:00.044096 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0314 06:59:00.046837 1 observer_polling.go:159] Starting file observer\\\\nI0314 06:59:00.083448 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0314 06:59:00.087238 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0314 06:59:30.349556 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0314 06:59:30.349674 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:29Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6c2f9f77fd61eea301a8bb0388b078488c80453835df5e3ca468d99fe97193c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7235ddb1c63b5f61521cf2e92d0070dc5c8d6f9899565129aa993711c1715caf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf7436ce9438c258552ab9d2cdb6fc34a3f0369aa4151691f33d5e9997208bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.531689 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"433c295d-49dc-47f2-b715-eb10c8c16f13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c928fbb323aa809b60e546f9ab99a82ad8ea28ae853a828483412b8a3175f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5e09d308061549f8c3183d7885644cd785cc8769880e0643f2bdc30e8a361c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://579eedbe1341c93457c7fbd71eda29dcd1172371231300b16d9319491196cc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606d6d86bf5050324c43cf977dae10a46e1e8f28346febf07a950544ee4f2ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://606d6d86bf5050324c43cf977dae10a46e1e8f28346febf07a950544ee4f2ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.551078 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.570416 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.589583 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.644838 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:47Z\\\",\\\"message\\\":\\\"rator LB template configs for network=default: []services.lbConfig(nil)\\\\nI0314 07:00:47.811328 7316 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:47.811486 7316 services_controller.go:356] Processing sync for service 
openshift-route-controller-manager/route-controller-manager for network=default\\\\nI0314 07:00:47.811866 7316 services_controller.go:434] Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007031e9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\" 07:01:19.830438 7683 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-lf9lh after 0 failed attempt(s)\\\\nI0314 07:01:19.830443 7683 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-lf9lh\\\\nI0314 07:01:19.830414 7683 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports 
Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0314 07:01:19.830463 7683 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 07:01:19.830524 7683 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\
\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-ov
errides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 
07:01:20.660925 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37076447ecde1
15fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.676514 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.695869 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.719411 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.737388 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.749475 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffc61f17-7577-4872-ad5d-7b33780d3d21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l2tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:20 crc 
kubenswrapper[5129]: I0314 07:01:20.762557 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.774416 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.789050 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d20
84179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.817279 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8519-8ade-4cb4-9d40-ddb5867683d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08240fa12d4a763177663f53078814fc63dfd7b6380b488118f464eb3cef8256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d22d9ee2e77a2eda2714189e26eeafd53bd1c9e78e7a365627c323ff58de99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d124c1940bbbc2625b57188d7eaafc0fa13d4366961c1870f336f9779d968f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b1dcae08e7a810e4cd9309ac9aa20cf694a33bac6ad060d64909c27704c7ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e517883a289ad10a8078720d229aab5e813291e72e307d8e6e837c05da74ff93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.830968 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a958c6cbc92804a3a1c2ba1bec4777732a39f16f1cce025d4ed6f18a8e488f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:01:06Z\\\",\\\"message\\\":\\\"2026-03-14T07:00:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6925a09f-85a6-49ca-98a8-e18b249dfc72\\\\n2026-03-14T07:00:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6925a09f-85a6-49ca-98a8-e18b249dfc72 to /host/opt/cni/bin/\\\\n2026-03-14T07:00:21Z [verbose] multus-daemon started\\\\n2026-03-14T07:00:21Z [verbose] Readiness Indicator file check\\\\n2026-03-14T07:01:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:20 crc kubenswrapper[5129]: I0314 07:01:20.842130 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:20Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:21 crc kubenswrapper[5129]: I0314 07:01:21.035663 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:21 crc kubenswrapper[5129]: I0314 07:01:21.035772 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:21 crc kubenswrapper[5129]: I0314 07:01:21.035804 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:21 crc kubenswrapper[5129]: E0314 07:01:21.035836 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:21 crc kubenswrapper[5129]: I0314 07:01:21.035785 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:21 crc kubenswrapper[5129]: E0314 07:01:21.036051 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:21 crc kubenswrapper[5129]: E0314 07:01:21.036211 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:21 crc kubenswrapper[5129]: E0314 07:01:21.036333 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:21 crc kubenswrapper[5129]: I0314 07:01:21.475184 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovnkube-controller/3.log" Mar 14 07:01:21 crc kubenswrapper[5129]: I0314 07:01:21.706366 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:01:21 crc kubenswrapper[5129]: E0314 07:01:21.706648 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:25.706577455 +0000 UTC m=+208.458492699 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:01:21 crc kubenswrapper[5129]: I0314 07:01:21.706788 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:21 crc kubenswrapper[5129]: I0314 07:01:21.706832 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:21 crc kubenswrapper[5129]: E0314 07:01:21.706995 5129 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:01:21 crc kubenswrapper[5129]: E0314 07:01:21.707026 5129 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:01:21 crc kubenswrapper[5129]: E0314 07:01:21.707077 5129 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:02:25.70706847 +0000 UTC m=+208.458983654 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 07:01:21 crc kubenswrapper[5129]: E0314 07:01:21.707094 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 07:02:25.707085841 +0000 UTC m=+208.459001295 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 07:01:21 crc kubenswrapper[5129]: I0314 07:01:21.808232 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:21 crc kubenswrapper[5129]: I0314 07:01:21.808312 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:21 crc kubenswrapper[5129]: E0314 07:01:21.808486 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:01:21 crc kubenswrapper[5129]: E0314 07:01:21.808532 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:01:21 crc kubenswrapper[5129]: E0314 07:01:21.808552 5129 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:01:21 crc kubenswrapper[5129]: E0314 07:01:21.808650 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 07:02:25.808624464 +0000 UTC m=+208.560539678 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:01:21 crc kubenswrapper[5129]: E0314 07:01:21.808485 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 07:01:21 crc kubenswrapper[5129]: E0314 07:01:21.808689 5129 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 07:01:21 crc kubenswrapper[5129]: E0314 07:01:21.808707 5129 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:01:21 crc kubenswrapper[5129]: E0314 07:01:21.808765 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" 
failed. No retries permitted until 2026-03-14 07:02:25.808745898 +0000 UTC m=+208.560661092 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 07:01:23 crc kubenswrapper[5129]: I0314 07:01:23.036196 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:23 crc kubenswrapper[5129]: I0314 07:01:23.036234 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:23 crc kubenswrapper[5129]: I0314 07:01:23.036215 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:23 crc kubenswrapper[5129]: E0314 07:01:23.036315 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:23 crc kubenswrapper[5129]: I0314 07:01:23.036330 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:23 crc kubenswrapper[5129]: E0314 07:01:23.036385 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:23 crc kubenswrapper[5129]: E0314 07:01:23.036432 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:23 crc kubenswrapper[5129]: E0314 07:01:23.036483 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:23 crc kubenswrapper[5129]: E0314 07:01:23.166499 5129 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 07:01:25 crc kubenswrapper[5129]: I0314 07:01:25.035575 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:25 crc kubenswrapper[5129]: I0314 07:01:25.035687 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:25 crc kubenswrapper[5129]: I0314 07:01:25.035742 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:25 crc kubenswrapper[5129]: I0314 07:01:25.035760 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:25 crc kubenswrapper[5129]: E0314 07:01:25.035934 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:25 crc kubenswrapper[5129]: E0314 07:01:25.036015 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:25 crc kubenswrapper[5129]: E0314 07:01:25.036110 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:25 crc kubenswrapper[5129]: E0314 07:01:25.036189 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:27 crc kubenswrapper[5129]: I0314 07:01:27.035445 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:27 crc kubenswrapper[5129]: E0314 07:01:27.036072 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:27 crc kubenswrapper[5129]: I0314 07:01:27.035589 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:27 crc kubenswrapper[5129]: E0314 07:01:27.036194 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:27 crc kubenswrapper[5129]: I0314 07:01:27.035725 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:27 crc kubenswrapper[5129]: E0314 07:01:27.036285 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:27 crc kubenswrapper[5129]: I0314 07:01:27.035500 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:27 crc kubenswrapper[5129]: E0314 07:01:27.036388 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:28 crc kubenswrapper[5129]: I0314 07:01:28.050446 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 
06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:28 crc kubenswrapper[5129]: I0314 07:01:28.061784 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6623d6f5-a916-41d7-aec4-09da71c1fa91\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4933abe33b7df00e5183eceab4fa4714af0304d9910886fb723e907c75dcd018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d84344093736ed5f4571f4a04217738adc1db9b27b3a238149dfcd03aa6725b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:30Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0314 06:59:00.044096 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0314 06:59:00.046837 1 observer_polling.go:159] Starting file observer\\\\nI0314 06:59:00.083448 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0314 06:59:00.087238 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0314 06:59:30.349556 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0314 06:59:30.349674 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:29Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6c2f9f77fd61eea301a8bb0388b078488c80453835df5e3ca468d99fe97193c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7235ddb1c63b5f61521cf2e92d0070dc5c8d6f9899565129aa993711c1715caf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf7436ce9438c258552ab9d2cdb6fc34a3f0369aa4151691f33d5e9997208bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:28 crc kubenswrapper[5129]: I0314 07:01:28.072753 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"433c295d-49dc-47f2-b715-eb10c8c16f13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c928fbb323aa809b60e546f9ab99a82ad8ea28ae853a828483412b8a3175f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5e09d308061549f8c3183d7885644cd785cc8769880e0643f2bdc30e8a361c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://579eedbe1341c93457c7fbd71eda29dcd1172371231300b16d9319491196cc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606d6d86bf5050324c43cf977dae10a46e1e8f28346febf07a950544ee4f2ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://606d6d86bf5050324c43cf977dae10a46e1e8f28346febf07a950544ee4f2ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:28 crc kubenswrapper[5129]: I0314 07:01:28.085853 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:28 crc kubenswrapper[5129]: I0314 07:01:28.097907 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:28 crc kubenswrapper[5129]: I0314 07:01:28.107418 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:28 crc kubenswrapper[5129]: I0314 07:01:28.131382 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc030fc288abc514ce0c8e923f70646b8a0b43b898877c8267ba7b07cd6ac81\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:00:47Z\\\",\\\"message\\\":\\\"rator LB template configs for network=default: []services.lbConfig(nil)\\\\nI0314 07:00:47.811328 7316 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 07:00:47.811486 7316 services_controller.go:356] Processing sync for service 
openshift-route-controller-manager/route-controller-manager for network=default\\\\nI0314 07:00:47.811866 7316 services_controller.go:434] Service openshift-config-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-config-operator caae15e3-df99-43f2-b37b-d68e01dce8bd 4093 0 2025-02-23 05:12:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:openshift-config-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:config-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007031e9f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\" 07:01:19.830438 7683 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-lf9lh after 0 failed attempt(s)\\\\nI0314 07:01:19.830443 7683 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-lf9lh\\\\nI0314 07:01:19.830414 7683 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports 
Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0314 07:01:19.830463 7683 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 07:01:19.830524 7683 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\
\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-ov
errides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:28 crc kubenswrapper[5129]: I0314 
07:01:28.141289 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37076447ecde1
15fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:28 crc kubenswrapper[5129]: I0314 07:01:28.153890 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:28 crc kubenswrapper[5129]: E0314 07:01:28.167569 5129 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 14 07:01:28 crc kubenswrapper[5129]: I0314 07:01:28.171403 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:28 crc kubenswrapper[5129]: I0314 07:01:28.188928 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:28 crc kubenswrapper[5129]: I0314 07:01:28.200713 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:28 crc kubenswrapper[5129]: I0314 07:01:28.214210 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffc61f17-7577-4872-ad5d-7b33780d3d21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l2tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:28 crc 
kubenswrapper[5129]: I0314 07:01:28.230014 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:28 crc kubenswrapper[5129]: I0314 07:01:28.242528 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:28 crc kubenswrapper[5129]: I0314 07:01:28.258851 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d20
84179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:28 crc kubenswrapper[5129]: I0314 07:01:28.282158 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8519-8ade-4cb4-9d40-ddb5867683d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08240fa12d4a763177663f53078814fc63dfd7b6380b488118f464eb3cef8256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d22d9ee2e77a2eda2714189e26eeafd53bd1c9e78e7a365627c323ff58de99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d124c1940bbbc2625b57188d7eaafc0fa13d4366961c1870f336f9779d968f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b1dcae08e7a810e4cd9309ac9aa20cf694a33bac6ad060d64909c27704c7ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e517883a289ad10a8078720d229aab5e813291e72e307d8e6e837c05da74ff93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:28 crc kubenswrapper[5129]: I0314 07:01:28.298546 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a958c6cbc92804a3a1c2ba1bec4777732a39f16f1cce025d4ed6f18a8e488f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:01:06Z\\\",\\\"message\\\":\\\"2026-03-14T07:00:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6925a09f-85a6-49ca-98a8-e18b249dfc72\\\\n2026-03-14T07:00:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6925a09f-85a6-49ca-98a8-e18b249dfc72 to /host/opt/cni/bin/\\\\n2026-03-14T07:00:21Z [verbose] multus-daemon started\\\\n2026-03-14T07:00:21Z [verbose] Readiness Indicator file check\\\\n2026-03-14T07:01:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:28 crc kubenswrapper[5129]: I0314 07:01:28.311392 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b03897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:28Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:29 crc kubenswrapper[5129]: I0314 07:01:29.035733 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:29 crc kubenswrapper[5129]: I0314 07:01:29.035889 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:29 crc kubenswrapper[5129]: I0314 07:01:29.036372 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:29 crc kubenswrapper[5129]: E0314 07:01:29.036639 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:29 crc kubenswrapper[5129]: I0314 07:01:29.036946 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:29 crc kubenswrapper[5129]: E0314 07:01:29.037078 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:29 crc kubenswrapper[5129]: E0314 07:01:29.037195 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:29 crc kubenswrapper[5129]: E0314 07:01:29.037343 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.040655 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.040716 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.040736 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.040766 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.040789 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:01:30Z","lastTransitionTime":"2026-03-14T07:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:01:30 crc kubenswrapper[5129]: E0314 07:01:30.055783 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.058764 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.058793 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.058801 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.058815 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.058824 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:01:30Z","lastTransitionTime":"2026-03-14T07:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:01:30 crc kubenswrapper[5129]: E0314 07:01:30.072288 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.075759 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.075790 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.075798 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.075808 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.075816 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:01:30Z","lastTransitionTime":"2026-03-14T07:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:01:30 crc kubenswrapper[5129]: E0314 07:01:30.088368 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.092115 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.092330 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.092343 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.092359 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.092372 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:01:30Z","lastTransitionTime":"2026-03-14T07:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:01:30 crc kubenswrapper[5129]: E0314 07:01:30.105331 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.109134 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.109179 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.109196 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.109219 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:01:30 crc kubenswrapper[5129]: I0314 07:01:30.109235 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:01:30Z","lastTransitionTime":"2026-03-14T07:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:01:30 crc kubenswrapper[5129]: E0314 07:01:30.124762 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"56e5df46-9227-44ac-8b48-da482731e804\\\",\\\"systemUUID\\\":\\\"f9bf91bc-f395-4d83-a8e9-849213d9a3dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:30Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:30 crc kubenswrapper[5129]: E0314 07:01:30.124981 5129 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.036089 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:31 crc kubenswrapper[5129]: E0314 07:01:31.036215 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.036382 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.036409 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.036430 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:31 crc kubenswrapper[5129]: E0314 07:01:31.036479 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:31 crc kubenswrapper[5129]: E0314 07:01:31.036590 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:31 crc kubenswrapper[5129]: E0314 07:01:31.036657 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.037634 5129 scope.go:117] "RemoveContainer" containerID="2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297" Mar 14 07:01:31 crc kubenswrapper[5129]: E0314 07:01:31.037841 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.060186 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c8d8519-8ade-4cb4-9d40-ddb5867683d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08240fa12d4a763177663f53078814fc63dfd7b6380b488118f464eb3cef8256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d22d9ee2e77a2eda2714189e26eeafd53bd1c9e78e7a365627c323ff58de99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d124c1940bbbc2625b57188d7eaafc0fa13d4366961c1870f336f9779d968f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e7790
36cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b1dcae08e7a810e4cd9309ac9aa20cf694a33bac6ad060d64909c27704c7ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e517883a289ad10a8078720d229aab5e813291e72e307d8e6e837c05da74ff93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8e187583d0fa43c491526cfa4ca9486bd83083a8a8d95bef6c50ccb7d42f09e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f158883c72df22711f3ac9e39244c707d58860078d87524a515212716613034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea6527
15ce899afc294233a0444686b16799089f2ca01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://306bf1929e3f202f5cfea652715ce899afc294233a0444686b16799089f2ca01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.073196 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-r4btb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a958c6cbc92804a3a1c2ba1bec4777732a39f16f1cce025d4ed6f18a8e488f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:01:06Z\\\",\\\"message\\\":\\\"2026-03-14T07:00:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6925a09f-85a6-49ca-98a8-e18b249dfc72\\\\n2026-03-14T07:00:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6925a09f-85a6-49ca-98a8-e18b249dfc72 to /host/opt/cni/bin/\\\\n2026-03-14T07:00:21Z [verbose] multus-daemon started\\\\n2026-03-14T07:00:21Z [verbose] 
Readiness Indicator file check\\\\n2026-03-14T07:01:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g42rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-r4btb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.084757 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lknrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1202ae8e-98e0-4dc5-99aa-680871888bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a7786f1af3695b0
3897a3a46d3e4a1fd528debb6f16f8c8c57f58a07641882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ngnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lknrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.100197 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84967d01-e382-4c62-98c2-9e8209f31aa0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:01:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:56Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0314 06:59:55.862268 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 06:59:55.862432 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 06:59:55.863426 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2538475225/tls.crt::/tmp/serving-cert-2538475225/tls.key\\\\\\\"\\\\nI0314 06:59:56.085814 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 06:59:56.089097 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 06:59:56.089119 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 06:59:56.089142 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 06:59:56.089147 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 06:59:56.097371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0314 06:59:56.097402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097408 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 06:59:56.097416 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 06:59:56.097421 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 06:59:56.097424 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 06:59:56.097427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0314 06:59:56.097595 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0314 06:59:56.099769 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:59:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba4f8ff6777d4dafe895be9e3e85a57c
60c0e094221fc5cb54b315ed24ac401\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.116732 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6623d6f5-a916-41d7-aec4-09da71c1fa91\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4933abe33b7df00e5183eceab4fa4714af0304d9910886fb723e907c75dcd018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d84344093736ed5f4571f4a04217738adc1db9b27b3a238149dfcd03aa6725b2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T06:59:30Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0314 06:59:00.044096 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0314 06:59:00.046837 1 observer_polling.go:159] Starting file observer\\\\nI0314 06:59:00.083448 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0314 06:59:00.087238 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0314 06:59:30.349556 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0314 06:59:30.349674 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T06:59:29Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6c2f9f77fd61eea301a8bb0388b078488c80453835df5e3ca468d99fe97193c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7235ddb1c63b5f61521cf2e92d0070dc5c8d6f9899565129aa993711c1715caf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf7436ce9438c258552ab9d2cdb6fc34a3f0369aa4151691f33d5e9997208bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.131384 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"433c295d-49dc-47f2-b715-eb10c8c16f13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c928fbb323aa809b60e546f9ab99a82ad8ea28ae853a828483412b8a3175f965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f5e09d308061549f8c3183d7885644cd785cc8769880e0643f2bdc30e8a361c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://579eedbe1341c93457c7fbd71eda29dcd1172371231300b16d9319491196cc0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606d6d86bf5050324c43cf977dae10a46e1e8f28346febf07a950544ee4f2ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://606d6d86bf5050324c43cf977dae10a46e1e8f28346febf07a950544ee4f2ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.145379 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2b421c281e5d34b4f1a3ab38904e543c1e99c043eefd5ce201f61276f05365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.162193 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://802c96ed55945b0bc56d6f86048a9b08498639c0404128ae46cc01ebe58261ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://fca724b371317420024dfa8d0629c997e3769bf65585050a64c8a42cdb21c7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.179720 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.208716 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T07:01:19Z\\\",\\\"message\\\":\\\" 07:01:19.830438 7683 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-lf9lh after 0 failed attempt(s)\\\\nI0314 07:01:19.830443 7683 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-lf9lh\\\\nI0314 07:01:19.830414 7683 
model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0314 07:01:19.830463 7683 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 07:01:19.830524 7683 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T07:01:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0e81c869812420f0
09550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hssn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7hfdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.222694 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3fe0e7-a7ef-4ef5-a52c-d768cfe1f608\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T06:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daab415dacb110dfe1cd60bdfaf10a0791f1d12a10071bd128d6d9732c8783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T06:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37076447ecde115fb2695108c6f7ffb0110df825059791ed835ec0b1aeffdd77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T06:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T06:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T06:58:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.241317 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.262196 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.278817 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h6665" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"138e84c2-72d7-4e5b-9949-879dc02d95ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82522aa18eb548a4a958806c456083dd50efc0c3cf6eaa13282b9124db2bc1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ba872806181e4128526ba39b3004b084f8db5327ed548d926b437c30f26f6ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e70c2865613d9d70ba80ca41e8498e3c0fd48a469ff8763c3cf31f25499556ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04cdb462d3e769c7ca6cf3911d674916951ae5818c2c20346765df301b85d24d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aba45
64e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aba4564e9d4c0a664325f45fa88da3f285d5061812a0ea31402a86cb2e638e56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80548939a79b6ed0b01b77ae3c2408fe4374e85d92a31f04b4d6ae91bf0058b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:24Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2bae909083143d5bff7a490c4bef96ee943ab066660f49a4100d56ee69de4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T07:00:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T07:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s56xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h6665\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.301014 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0fa377b-2382-4ada-aec5-c103d2ca74f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d5c8cb90ba19f234b3dc2ab9df96b6d592417a9d6018f30fa16291809033c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b2749448c6583567088a9c2ae2e692e6e1801ced603ef23dc42e8c113568bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qs4z4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.313124 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffc61f17-7577-4872-ad5d-7b33780d3d21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-87wfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l2tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:31 crc 
kubenswrapper[5129]: I0314 07:01:31.326627 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af5d10977c94e0449d08d9b0582c13f66c80ee66ddbbe4f0b4d0a39a6f6dd953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.341068 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-866b9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25b72bf7-03b0-43ea-be16-8b484c6e018a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0254f2871cc6b13f5ecd1ef56f994d1b800ef7f534a5638110a145e7db790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-866b9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:31 crc kubenswrapper[5129]: I0314 07:01:31.354828 5129 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58bd6165-e663-4c4e-82ae-6009ff348000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T07:00:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa2261cbfecc4b0d4e4262ae6f4b95d007d1613f305344d1da959a19cbb4cfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d20
84179aac9f8e55c7b7de59d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T07:00:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94gxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T07:00:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lf9lh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T07:01:31Z is after 2025-08-24T17:21:41Z" Mar 14 07:01:33 crc kubenswrapper[5129]: I0314 07:01:33.035821 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:33 crc kubenswrapper[5129]: I0314 07:01:33.035836 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:33 crc kubenswrapper[5129]: I0314 07:01:33.035952 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:33 crc kubenswrapper[5129]: E0314 07:01:33.036016 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:33 crc kubenswrapper[5129]: I0314 07:01:33.036031 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:33 crc kubenswrapper[5129]: E0314 07:01:33.036171 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:33 crc kubenswrapper[5129]: E0314 07:01:33.036264 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:33 crc kubenswrapper[5129]: E0314 07:01:33.036318 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:33 crc kubenswrapper[5129]: E0314 07:01:33.168479 5129 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 07:01:35 crc kubenswrapper[5129]: I0314 07:01:35.035347 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:35 crc kubenswrapper[5129]: I0314 07:01:35.035443 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:35 crc kubenswrapper[5129]: E0314 07:01:35.035547 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:35 crc kubenswrapper[5129]: I0314 07:01:35.035347 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:35 crc kubenswrapper[5129]: I0314 07:01:35.035360 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:35 crc kubenswrapper[5129]: E0314 07:01:35.035715 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:35 crc kubenswrapper[5129]: E0314 07:01:35.035862 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:35 crc kubenswrapper[5129]: E0314 07:01:35.036230 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:37 crc kubenswrapper[5129]: I0314 07:01:37.035862 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:37 crc kubenswrapper[5129]: I0314 07:01:37.035947 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:37 crc kubenswrapper[5129]: I0314 07:01:37.035862 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:37 crc kubenswrapper[5129]: I0314 07:01:37.035897 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:37 crc kubenswrapper[5129]: E0314 07:01:37.036082 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:37 crc kubenswrapper[5129]: E0314 07:01:37.036224 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:37 crc kubenswrapper[5129]: E0314 07:01:37.036351 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:37 crc kubenswrapper[5129]: E0314 07:01:37.036545 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:37 crc kubenswrapper[5129]: I0314 07:01:37.277063 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs\") pod \"network-metrics-daemon-l2tzv\" (UID: \"ffc61f17-7577-4872-ad5d-7b33780d3d21\") " pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:37 crc kubenswrapper[5129]: E0314 07:01:37.277399 5129 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:01:37 crc kubenswrapper[5129]: E0314 07:01:37.277674 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs podName:ffc61f17-7577-4872-ad5d-7b33780d3d21 nodeName:}" failed. No retries permitted until 2026-03-14 07:02:41.277510514 +0000 UTC m=+224.029425748 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs") pod "network-metrics-daemon-l2tzv" (UID: "ffc61f17-7577-4872-ad5d-7b33780d3d21") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 07:01:38 crc kubenswrapper[5129]: I0314 07:01:38.107566 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-866b9" podStartSLOduration=111.107542441 podStartE2EDuration="1m51.107542441s" podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:38.089100409 +0000 UTC m=+160.841015653" watchObservedRunningTime="2026-03-14 07:01:38.107542441 +0000 UTC m=+160.859457635" Mar 14 07:01:38 crc kubenswrapper[5129]: I0314 07:01:38.148309 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podStartSLOduration=111.148277291 podStartE2EDuration="1m51.148277291s" podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:38.109676117 +0000 UTC m=+160.861591341" watchObservedRunningTime="2026-03-14 07:01:38.148277291 +0000 UTC m=+160.900192525" Mar 14 07:01:38 crc kubenswrapper[5129]: I0314 07:01:38.148556 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=54.148544 podStartE2EDuration="54.148544s" podCreationTimestamp="2026-03-14 07:00:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:38.143435222 +0000 UTC m=+160.895350486" watchObservedRunningTime="2026-03-14 07:01:38.148544 +0000 UTC 
m=+160.900459224" Mar 14 07:01:38 crc kubenswrapper[5129]: E0314 07:01:38.170451 5129 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 07:01:38 crc kubenswrapper[5129]: I0314 07:01:38.179594 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-r4btb" podStartSLOduration=111.179564261 podStartE2EDuration="1m51.179564261s" podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:38.166009831 +0000 UTC m=+160.917925085" watchObservedRunningTime="2026-03-14 07:01:38.179564261 +0000 UTC m=+160.931479485" Mar 14 07:01:38 crc kubenswrapper[5129]: I0314 07:01:38.180775 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lknrr" podStartSLOduration=111.180758817 podStartE2EDuration="1m51.180758817s" podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:38.179107946 +0000 UTC m=+160.931023180" watchObservedRunningTime="2026-03-14 07:01:38.180758817 +0000 UTC m=+160.932674041" Mar 14 07:01:38 crc kubenswrapper[5129]: I0314 07:01:38.276997 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=77.276975206 podStartE2EDuration="1m17.276975206s" podCreationTimestamp="2026-03-14 07:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:38.25385847 +0000 UTC m=+161.005773714" watchObservedRunningTime="2026-03-14 
07:01:38.276975206 +0000 UTC m=+161.028890400" Mar 14 07:01:38 crc kubenswrapper[5129]: I0314 07:01:38.291548 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=27.291531787 podStartE2EDuration="27.291531787s" podCreationTimestamp="2026-03-14 07:01:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:38.291369252 +0000 UTC m=+161.043284436" watchObservedRunningTime="2026-03-14 07:01:38.291531787 +0000 UTC m=+161.043446971" Mar 14 07:01:38 crc kubenswrapper[5129]: I0314 07:01:38.291910 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=27.291905019 podStartE2EDuration="27.291905019s" podCreationTimestamp="2026-03-14 07:01:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:38.277936956 +0000 UTC m=+161.029852180" watchObservedRunningTime="2026-03-14 07:01:38.291905019 +0000 UTC m=+161.043820203" Mar 14 07:01:38 crc kubenswrapper[5129]: I0314 07:01:38.340865 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=69.340849163 podStartE2EDuration="1m9.340849163s" podCreationTimestamp="2026-03-14 07:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:38.340258635 +0000 UTC m=+161.092173839" watchObservedRunningTime="2026-03-14 07:01:38.340849163 +0000 UTC m=+161.092764347" Mar 14 07:01:38 crc kubenswrapper[5129]: I0314 07:01:38.392489 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-h6665" podStartSLOduration=111.392466252 podStartE2EDuration="1m51.392466252s" podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:38.380846491 +0000 UTC m=+161.132761675" watchObservedRunningTime="2026-03-14 07:01:38.392466252 +0000 UTC m=+161.144381436" Mar 14 07:01:39 crc kubenswrapper[5129]: I0314 07:01:39.036232 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:39 crc kubenswrapper[5129]: I0314 07:01:39.036315 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:39 crc kubenswrapper[5129]: I0314 07:01:39.036401 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:39 crc kubenswrapper[5129]: E0314 07:01:39.036389 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:39 crc kubenswrapper[5129]: E0314 07:01:39.036489 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:39 crc kubenswrapper[5129]: E0314 07:01:39.036579 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:39 crc kubenswrapper[5129]: I0314 07:01:39.036626 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:39 crc kubenswrapper[5129]: E0314 07:01:39.036870 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.393689 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.393772 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.393798 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.393830 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.393872 5129 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T07:01:40Z","lastTransitionTime":"2026-03-14T07:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.461892 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qs4z4" podStartSLOduration=112.461860859 podStartE2EDuration="1m52.461860859s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:38.391906864 +0000 UTC m=+161.143822048" watchObservedRunningTime="2026-03-14 07:01:40.461860859 +0000 UTC m=+163.213776083" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.463134 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc"] Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.463830 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.467514 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.467780 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.468181 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.468822 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.614193 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c8d58bcd-436c-4a6c-b737-8625b1b8fc81-service-ca\") pod \"cluster-version-operator-5c965bbfc6-f2jxc\" (UID: \"c8d58bcd-436c-4a6c-b737-8625b1b8fc81\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.614681 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c8d58bcd-436c-4a6c-b737-8625b1b8fc81-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-f2jxc\" (UID: \"c8d58bcd-436c-4a6c-b737-8625b1b8fc81\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.614727 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c8d58bcd-436c-4a6c-b737-8625b1b8fc81-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-f2jxc\" (UID: \"c8d58bcd-436c-4a6c-b737-8625b1b8fc81\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.614775 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8d58bcd-436c-4a6c-b737-8625b1b8fc81-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-f2jxc\" (UID: \"c8d58bcd-436c-4a6c-b737-8625b1b8fc81\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.614820 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8d58bcd-436c-4a6c-b737-8625b1b8fc81-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-f2jxc\" (UID: \"c8d58bcd-436c-4a6c-b737-8625b1b8fc81\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.716529 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c8d58bcd-436c-4a6c-b737-8625b1b8fc81-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-f2jxc\" (UID: \"c8d58bcd-436c-4a6c-b737-8625b1b8fc81\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.716662 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c8d58bcd-436c-4a6c-b737-8625b1b8fc81-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-f2jxc\" (UID: \"c8d58bcd-436c-4a6c-b737-8625b1b8fc81\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.716690 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c8d58bcd-436c-4a6c-b737-8625b1b8fc81-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-f2jxc\" (UID: \"c8d58bcd-436c-4a6c-b737-8625b1b8fc81\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.716721 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8d58bcd-436c-4a6c-b737-8625b1b8fc81-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-f2jxc\" (UID: \"c8d58bcd-436c-4a6c-b737-8625b1b8fc81\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.716761 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/c8d58bcd-436c-4a6c-b737-8625b1b8fc81-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-f2jxc\" (UID: \"c8d58bcd-436c-4a6c-b737-8625b1b8fc81\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.716782 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8d58bcd-436c-4a6c-b737-8625b1b8fc81-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-f2jxc\" (UID: \"c8d58bcd-436c-4a6c-b737-8625b1b8fc81\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.716862 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c8d58bcd-436c-4a6c-b737-8625b1b8fc81-service-ca\") pod \"cluster-version-operator-5c965bbfc6-f2jxc\" (UID: \"c8d58bcd-436c-4a6c-b737-8625b1b8fc81\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.718550 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c8d58bcd-436c-4a6c-b737-8625b1b8fc81-service-ca\") pod \"cluster-version-operator-5c965bbfc6-f2jxc\" (UID: \"c8d58bcd-436c-4a6c-b737-8625b1b8fc81\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.727075 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8d58bcd-436c-4a6c-b737-8625b1b8fc81-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-f2jxc\" (UID: \"c8d58bcd-436c-4a6c-b737-8625b1b8fc81\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.741646 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8d58bcd-436c-4a6c-b737-8625b1b8fc81-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-f2jxc\" (UID: \"c8d58bcd-436c-4a6c-b737-8625b1b8fc81\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" Mar 14 07:01:40 crc kubenswrapper[5129]: I0314 07:01:40.786040 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" Mar 14 07:01:41 crc kubenswrapper[5129]: I0314 07:01:41.036509 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:41 crc kubenswrapper[5129]: I0314 07:01:41.036534 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:41 crc kubenswrapper[5129]: E0314 07:01:41.036740 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:41 crc kubenswrapper[5129]: I0314 07:01:41.036824 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:41 crc kubenswrapper[5129]: I0314 07:01:41.036835 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:41 crc kubenswrapper[5129]: E0314 07:01:41.036887 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:41 crc kubenswrapper[5129]: E0314 07:01:41.037014 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:41 crc kubenswrapper[5129]: E0314 07:01:41.037160 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:41 crc kubenswrapper[5129]: I0314 07:01:41.189209 5129 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 14 07:01:41 crc kubenswrapper[5129]: I0314 07:01:41.199677 5129 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 14 07:01:41 crc kubenswrapper[5129]: I0314 07:01:41.558218 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" event={"ID":"c8d58bcd-436c-4a6c-b737-8625b1b8fc81","Type":"ContainerStarted","Data":"0c1b33f335e878dbff293720f079be75a8f4bf77033be57eac696554445e47c0"} Mar 14 07:01:41 crc kubenswrapper[5129]: I0314 07:01:41.558302 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" event={"ID":"c8d58bcd-436c-4a6c-b737-8625b1b8fc81","Type":"ContainerStarted","Data":"728879a9023babbc168a9941acca2c2b4786fbd339dbc96566e6e3df57fc41f8"} Mar 14 07:01:42 crc kubenswrapper[5129]: I0314 07:01:42.036717 5129 scope.go:117] "RemoveContainer" containerID="2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297" Mar 14 07:01:42 crc kubenswrapper[5129]: E0314 07:01:42.036905 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" Mar 14 07:01:43 crc kubenswrapper[5129]: I0314 07:01:43.036119 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:43 crc kubenswrapper[5129]: I0314 07:01:43.036304 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:43 crc kubenswrapper[5129]: E0314 07:01:43.036849 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:43 crc kubenswrapper[5129]: I0314 07:01:43.036340 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:43 crc kubenswrapper[5129]: E0314 07:01:43.036973 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:43 crc kubenswrapper[5129]: I0314 07:01:43.036319 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:43 crc kubenswrapper[5129]: E0314 07:01:43.037223 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:43 crc kubenswrapper[5129]: E0314 07:01:43.037375 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:43 crc kubenswrapper[5129]: E0314 07:01:43.172239 5129 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 07:01:45 crc kubenswrapper[5129]: I0314 07:01:45.036079 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:45 crc kubenswrapper[5129]: I0314 07:01:45.036176 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:45 crc kubenswrapper[5129]: I0314 07:01:45.036230 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:45 crc kubenswrapper[5129]: E0314 07:01:45.036308 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:45 crc kubenswrapper[5129]: I0314 07:01:45.036393 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:45 crc kubenswrapper[5129]: E0314 07:01:45.036569 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:45 crc kubenswrapper[5129]: E0314 07:01:45.036741 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:45 crc kubenswrapper[5129]: E0314 07:01:45.037321 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:47 crc kubenswrapper[5129]: I0314 07:01:47.035699 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:47 crc kubenswrapper[5129]: E0314 07:01:47.036740 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:47 crc kubenswrapper[5129]: I0314 07:01:47.035721 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:47 crc kubenswrapper[5129]: E0314 07:01:47.036932 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:47 crc kubenswrapper[5129]: I0314 07:01:47.035701 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:47 crc kubenswrapper[5129]: E0314 07:01:47.037106 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:47 crc kubenswrapper[5129]: I0314 07:01:47.035778 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:47 crc kubenswrapper[5129]: E0314 07:01:47.037290 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:48 crc kubenswrapper[5129]: E0314 07:01:48.172997 5129 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 07:01:49 crc kubenswrapper[5129]: I0314 07:01:49.035363 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:49 crc kubenswrapper[5129]: I0314 07:01:49.035417 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:49 crc kubenswrapper[5129]: I0314 07:01:49.035426 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:49 crc kubenswrapper[5129]: E0314 07:01:49.036238 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:49 crc kubenswrapper[5129]: E0314 07:01:49.035911 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:49 crc kubenswrapper[5129]: I0314 07:01:49.035435 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:49 crc kubenswrapper[5129]: E0314 07:01:49.036320 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:49 crc kubenswrapper[5129]: E0314 07:01:49.036598 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:51 crc kubenswrapper[5129]: I0314 07:01:51.036253 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:51 crc kubenswrapper[5129]: I0314 07:01:51.036325 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:51 crc kubenswrapper[5129]: I0314 07:01:51.036335 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:51 crc kubenswrapper[5129]: E0314 07:01:51.036396 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:51 crc kubenswrapper[5129]: E0314 07:01:51.036495 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:51 crc kubenswrapper[5129]: I0314 07:01:51.036524 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:51 crc kubenswrapper[5129]: E0314 07:01:51.036654 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:51 crc kubenswrapper[5129]: E0314 07:01:51.036740 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:53 crc kubenswrapper[5129]: I0314 07:01:53.036391 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:53 crc kubenswrapper[5129]: I0314 07:01:53.036438 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:53 crc kubenswrapper[5129]: I0314 07:01:53.036474 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:53 crc kubenswrapper[5129]: I0314 07:01:53.037698 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:53 crc kubenswrapper[5129]: E0314 07:01:53.038023 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:53 crc kubenswrapper[5129]: E0314 07:01:53.038424 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:53 crc kubenswrapper[5129]: E0314 07:01:53.038571 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:53 crc kubenswrapper[5129]: E0314 07:01:53.038719 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:53 crc kubenswrapper[5129]: E0314 07:01:53.175707 5129 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 07:01:53 crc kubenswrapper[5129]: I0314 07:01:53.607868 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4btb_e37bb55b-4ace-4d62-9711-88d8a1bb8cd8/kube-multus/1.log" Mar 14 07:01:53 crc kubenswrapper[5129]: I0314 07:01:53.608515 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4btb_e37bb55b-4ace-4d62-9711-88d8a1bb8cd8/kube-multus/0.log" Mar 14 07:01:53 crc kubenswrapper[5129]: I0314 07:01:53.608563 5129 generic.go:334] "Generic (PLEG): container finished" podID="e37bb55b-4ace-4d62-9711-88d8a1bb8cd8" containerID="a958c6cbc92804a3a1c2ba1bec4777732a39f16f1cce025d4ed6f18a8e488f02" exitCode=1 Mar 14 07:01:53 crc kubenswrapper[5129]: I0314 07:01:53.608592 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r4btb" event={"ID":"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8","Type":"ContainerDied","Data":"a958c6cbc92804a3a1c2ba1bec4777732a39f16f1cce025d4ed6f18a8e488f02"} Mar 14 07:01:53 crc kubenswrapper[5129]: I0314 07:01:53.608657 5129 scope.go:117] "RemoveContainer" 
containerID="b79d62cab26256f621842160c639046dbe506dde52056632a98d3237a8393828" Mar 14 07:01:53 crc kubenswrapper[5129]: I0314 07:01:53.609444 5129 scope.go:117] "RemoveContainer" containerID="a958c6cbc92804a3a1c2ba1bec4777732a39f16f1cce025d4ed6f18a8e488f02" Mar 14 07:01:53 crc kubenswrapper[5129]: E0314 07:01:53.609735 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-r4btb_openshift-multus(e37bb55b-4ace-4d62-9711-88d8a1bb8cd8)\"" pod="openshift-multus/multus-r4btb" podUID="e37bb55b-4ace-4d62-9711-88d8a1bb8cd8" Mar 14 07:01:53 crc kubenswrapper[5129]: I0314 07:01:53.632007 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f2jxc" podStartSLOduration=126.631983602 podStartE2EDuration="2m6.631983602s" podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:01:41.580417881 +0000 UTC m=+164.332333105" watchObservedRunningTime="2026-03-14 07:01:53.631983602 +0000 UTC m=+176.383898836" Mar 14 07:01:54 crc kubenswrapper[5129]: I0314 07:01:54.614906 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4btb_e37bb55b-4ace-4d62-9711-88d8a1bb8cd8/kube-multus/1.log" Mar 14 07:01:55 crc kubenswrapper[5129]: I0314 07:01:55.035586 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:55 crc kubenswrapper[5129]: I0314 07:01:55.035698 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:55 crc kubenswrapper[5129]: I0314 07:01:55.035697 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:55 crc kubenswrapper[5129]: I0314 07:01:55.035722 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:55 crc kubenswrapper[5129]: E0314 07:01:55.035894 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:55 crc kubenswrapper[5129]: E0314 07:01:55.036071 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:55 crc kubenswrapper[5129]: E0314 07:01:55.036261 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:55 crc kubenswrapper[5129]: E0314 07:01:55.036436 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:57 crc kubenswrapper[5129]: I0314 07:01:57.035893 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:57 crc kubenswrapper[5129]: I0314 07:01:57.035975 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:57 crc kubenswrapper[5129]: I0314 07:01:57.035930 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:57 crc kubenswrapper[5129]: E0314 07:01:57.036124 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:57 crc kubenswrapper[5129]: E0314 07:01:57.036268 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:01:57 crc kubenswrapper[5129]: I0314 07:01:57.036375 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:57 crc kubenswrapper[5129]: E0314 07:01:57.036465 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:57 crc kubenswrapper[5129]: E0314 07:01:57.037190 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:57 crc kubenswrapper[5129]: I0314 07:01:57.037567 5129 scope.go:117] "RemoveContainer" containerID="2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297" Mar 14 07:01:57 crc kubenswrapper[5129]: E0314 07:01:57.037844 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7hfdh_openshift-ovn-kubernetes(8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" Mar 14 07:01:58 crc kubenswrapper[5129]: E0314 07:01:58.177456 5129 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 07:01:59 crc kubenswrapper[5129]: I0314 07:01:59.035580 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:01:59 crc kubenswrapper[5129]: I0314 07:01:59.035648 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:01:59 crc kubenswrapper[5129]: I0314 07:01:59.035709 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:01:59 crc kubenswrapper[5129]: E0314 07:01:59.035921 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:01:59 crc kubenswrapper[5129]: I0314 07:01:59.036010 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:01:59 crc kubenswrapper[5129]: E0314 07:01:59.036113 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:01:59 crc kubenswrapper[5129]: E0314 07:01:59.036280 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:01:59 crc kubenswrapper[5129]: E0314 07:01:59.036411 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:02:01 crc kubenswrapper[5129]: I0314 07:02:01.035937 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:02:01 crc kubenswrapper[5129]: I0314 07:02:01.036071 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:02:01 crc kubenswrapper[5129]: E0314 07:02:01.036205 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:02:01 crc kubenswrapper[5129]: I0314 07:02:01.036259 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:02:01 crc kubenswrapper[5129]: E0314 07:02:01.036399 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:02:01 crc kubenswrapper[5129]: E0314 07:02:01.036521 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:02:01 crc kubenswrapper[5129]: I0314 07:02:01.036837 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:02:01 crc kubenswrapper[5129]: E0314 07:02:01.037124 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:02:03 crc kubenswrapper[5129]: I0314 07:02:03.035388 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:02:03 crc kubenswrapper[5129]: E0314 07:02:03.035566 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:02:03 crc kubenswrapper[5129]: I0314 07:02:03.035584 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:02:03 crc kubenswrapper[5129]: I0314 07:02:03.035708 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:02:03 crc kubenswrapper[5129]: E0314 07:02:03.036065 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:02:03 crc kubenswrapper[5129]: E0314 07:02:03.036184 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:02:03 crc kubenswrapper[5129]: I0314 07:02:03.036384 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:02:03 crc kubenswrapper[5129]: E0314 07:02:03.036500 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:02:03 crc kubenswrapper[5129]: E0314 07:02:03.179222 5129 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 14 07:02:05 crc kubenswrapper[5129]: I0314 07:02:05.035721 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:02:05 crc kubenswrapper[5129]: I0314 07:02:05.035832 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:02:05 crc kubenswrapper[5129]: E0314 07:02:05.035865 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:02:05 crc kubenswrapper[5129]: I0314 07:02:05.035735 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:02:05 crc kubenswrapper[5129]: E0314 07:02:05.035959 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:02:05 crc kubenswrapper[5129]: I0314 07:02:05.035973 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:02:05 crc kubenswrapper[5129]: E0314 07:02:05.036230 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:02:05 crc kubenswrapper[5129]: I0314 07:02:05.036433 5129 scope.go:117] "RemoveContainer" containerID="a958c6cbc92804a3a1c2ba1bec4777732a39f16f1cce025d4ed6f18a8e488f02" Mar 14 07:02:05 crc kubenswrapper[5129]: E0314 07:02:05.036558 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:02:05 crc kubenswrapper[5129]: I0314 07:02:05.655783 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4btb_e37bb55b-4ace-4d62-9711-88d8a1bb8cd8/kube-multus/1.log" Mar 14 07:02:05 crc kubenswrapper[5129]: I0314 07:02:05.656135 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r4btb" event={"ID":"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8","Type":"ContainerStarted","Data":"f2f3e31f8fa47821dc29e4dfcafdd95ccd4d6ff2f72e48303e4f326b6c1c2ce8"} Mar 14 07:02:07 crc kubenswrapper[5129]: I0314 07:02:07.036188 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:02:07 crc kubenswrapper[5129]: I0314 07:02:07.036251 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:02:07 crc kubenswrapper[5129]: E0314 07:02:07.036373 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:02:07 crc kubenswrapper[5129]: I0314 07:02:07.036404 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:02:07 crc kubenswrapper[5129]: E0314 07:02:07.036784 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:02:07 crc kubenswrapper[5129]: E0314 07:02:07.036860 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:02:07 crc kubenswrapper[5129]: I0314 07:02:07.037288 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:02:07 crc kubenswrapper[5129]: E0314 07:02:07.037467 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:02:08 crc kubenswrapper[5129]: E0314 07:02:08.180681 5129 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 07:02:09 crc kubenswrapper[5129]: I0314 07:02:09.035711 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:02:09 crc kubenswrapper[5129]: I0314 07:02:09.035751 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:02:09 crc kubenswrapper[5129]: I0314 07:02:09.035794 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:02:09 crc kubenswrapper[5129]: E0314 07:02:09.035866 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:02:09 crc kubenswrapper[5129]: E0314 07:02:09.035941 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:02:09 crc kubenswrapper[5129]: E0314 07:02:09.036011 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:02:09 crc kubenswrapper[5129]: I0314 07:02:09.036331 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:02:09 crc kubenswrapper[5129]: E0314 07:02:09.036549 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:02:11 crc kubenswrapper[5129]: I0314 07:02:11.036176 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:02:11 crc kubenswrapper[5129]: I0314 07:02:11.036261 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:02:11 crc kubenswrapper[5129]: E0314 07:02:11.036317 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:02:11 crc kubenswrapper[5129]: I0314 07:02:11.036330 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:02:11 crc kubenswrapper[5129]: I0314 07:02:11.036188 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:02:11 crc kubenswrapper[5129]: E0314 07:02:11.036438 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:02:11 crc kubenswrapper[5129]: E0314 07:02:11.036391 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:02:11 crc kubenswrapper[5129]: E0314 07:02:11.036578 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:02:11 crc kubenswrapper[5129]: I0314 07:02:11.039092 5129 scope.go:117] "RemoveContainer" containerID="2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297" Mar 14 07:02:11 crc kubenswrapper[5129]: I0314 07:02:11.674204 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovnkube-controller/3.log" Mar 14 07:02:11 crc kubenswrapper[5129]: I0314 07:02:11.676071 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerStarted","Data":"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0"} Mar 14 07:02:11 crc kubenswrapper[5129]: I0314 07:02:11.676465 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:02:11 crc kubenswrapper[5129]: I0314 07:02:11.702137 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" podStartSLOduration=144.702122666 podStartE2EDuration="2m24.702122666s" podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:11.701662712 +0000 UTC m=+194.453577896" watchObservedRunningTime="2026-03-14 07:02:11.702122666 +0000 UTC m=+194.454037840" Mar 14 07:02:11 crc kubenswrapper[5129]: I0314 07:02:11.835733 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l2tzv"] Mar 14 07:02:11 crc kubenswrapper[5129]: I0314 07:02:11.835856 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:02:11 crc kubenswrapper[5129]: E0314 07:02:11.835958 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:02:13 crc kubenswrapper[5129]: I0314 07:02:13.036351 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:02:13 crc kubenswrapper[5129]: I0314 07:02:13.036368 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:02:13 crc kubenswrapper[5129]: E0314 07:02:13.037271 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:02:13 crc kubenswrapper[5129]: I0314 07:02:13.036525 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:02:13 crc kubenswrapper[5129]: E0314 07:02:13.037398 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:02:13 crc kubenswrapper[5129]: E0314 07:02:13.037456 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:02:13 crc kubenswrapper[5129]: I0314 07:02:13.036540 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:02:13 crc kubenswrapper[5129]: E0314 07:02:13.037555 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:02:13 crc kubenswrapper[5129]: E0314 07:02:13.182128 5129 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 07:02:15 crc kubenswrapper[5129]: I0314 07:02:15.035312 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:02:15 crc kubenswrapper[5129]: E0314 07:02:15.035456 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:02:15 crc kubenswrapper[5129]: I0314 07:02:15.035756 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:02:15 crc kubenswrapper[5129]: E0314 07:02:15.035821 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:02:15 crc kubenswrapper[5129]: I0314 07:02:15.035885 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:02:15 crc kubenswrapper[5129]: E0314 07:02:15.036012 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:02:15 crc kubenswrapper[5129]: I0314 07:02:15.036021 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:02:15 crc kubenswrapper[5129]: E0314 07:02:15.036100 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:02:17 crc kubenswrapper[5129]: I0314 07:02:17.035341 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:02:17 crc kubenswrapper[5129]: I0314 07:02:17.035387 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:02:17 crc kubenswrapper[5129]: I0314 07:02:17.035361 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:02:17 crc kubenswrapper[5129]: I0314 07:02:17.035344 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:02:17 crc kubenswrapper[5129]: E0314 07:02:17.035484 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 07:02:17 crc kubenswrapper[5129]: E0314 07:02:17.035677 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 07:02:17 crc kubenswrapper[5129]: E0314 07:02:17.035724 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2tzv" podUID="ffc61f17-7577-4872-ad5d-7b33780d3d21" Mar 14 07:02:17 crc kubenswrapper[5129]: E0314 07:02:17.035775 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 07:02:19 crc kubenswrapper[5129]: I0314 07:02:19.035788 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:02:19 crc kubenswrapper[5129]: I0314 07:02:19.035879 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:02:19 crc kubenswrapper[5129]: I0314 07:02:19.035947 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:02:19 crc kubenswrapper[5129]: I0314 07:02:19.035943 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:02:19 crc kubenswrapper[5129]: I0314 07:02:19.039008 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 14 07:02:19 crc kubenswrapper[5129]: I0314 07:02:19.039130 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 14 07:02:19 crc kubenswrapper[5129]: I0314 07:02:19.039196 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 14 07:02:19 crc kubenswrapper[5129]: I0314 07:02:19.039855 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 14 07:02:19 crc kubenswrapper[5129]: I0314 07:02:19.039996 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 14 07:02:19 crc kubenswrapper[5129]: I0314 07:02:19.040397 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 14 07:02:19 crc kubenswrapper[5129]: I0314 07:02:19.574494 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:02:19 
crc kubenswrapper[5129]: I0314 07:02:19.574592 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:02:19 crc kubenswrapper[5129]: I0314 07:02:19.839165 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.012781 5129 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.046777 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.047512 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.047930 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.049015 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.056166 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xgc7f"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.056671 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vtzj7"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.056928 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.057075 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vtzj7" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.086492 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.087093 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.125514 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.125946 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.126345 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.126507 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.127064 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.127178 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.127296 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 07:02:21 crc 
kubenswrapper[5129]: I0314 07:02:21.127297 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.128025 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.128044 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.128089 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.128272 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.128514 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.128692 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.128781 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.129495 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.129511 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.129544 5129 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-rbac-proxy" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.129551 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.129558 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.129592 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.129679 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.130235 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.130382 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.130543 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.130885 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.131162 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.132118 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.132517 5129 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.133021 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-vpw78"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.133748 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.134261 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-k7q7x"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.135048 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.135260 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.135287 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.135422 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.137337 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fmn4k"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.138536 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.146020 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.164140 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9d5pw"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.164521 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9d5pw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.181359 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.181415 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.181448 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b633b0c1-73e2-445d-8f82-0f37854b2fc7-etcd-client\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.181472 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b633b0c1-73e2-445d-8f82-0f37854b2fc7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.181502 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/65a0ac6e-2c3b-4f24-8495-b13aa58017c1-machine-approver-tls\") pod \"machine-approver-56656f9798-gnkjw\" (UID: \"65a0ac6e-2c3b-4f24-8495-b13aa58017c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.181532 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.181561 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e61dec00-008c-454f-a37d-87b8a21d4733-serving-cert\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.181607 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-audit-dir\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.181639 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.181697 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf5zb\" (UniqueName: \"kubernetes.io/projected/095d17bf-b9c3-42e6-b8c9-39b929c52d50-kube-api-access-cf5zb\") pod \"machine-api-operator-5694c8668f-vtzj7\" (UID: \"095d17bf-b9c3-42e6-b8c9-39b929c52d50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vtzj7" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.181729 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8752z\" (UniqueName: \"kubernetes.io/projected/e8915d74-77d5-4d0f-9264-37b6f8167a6d-kube-api-access-8752z\") pod \"controller-manager-879f6c89f-xgc7f\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.181770 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e8915d74-77d5-4d0f-9264-37b6f8167a6d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xgc7f\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.181798 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-service-ca\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.181830 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/095d17bf-b9c3-42e6-b8c9-39b929c52d50-config\") pod \"machine-api-operator-5694c8668f-vtzj7\" (UID: \"095d17bf-b9c3-42e6-b8c9-39b929c52d50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vtzj7" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.181851 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.181898 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.181927 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b633b0c1-73e2-445d-8f82-0f37854b2fc7-audit-policies\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.181948 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65a0ac6e-2c3b-4f24-8495-b13aa58017c1-config\") pod \"machine-approver-56656f9798-gnkjw\" (UID: \"65a0ac6e-2c3b-4f24-8495-b13aa58017c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182238 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dvmr\" (UniqueName: \"kubernetes.io/projected/e61dec00-008c-454f-a37d-87b8a21d4733-kube-api-access-8dvmr\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182266 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182292 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b633b0c1-73e2-445d-8f82-0f37854b2fc7-serving-cert\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182313 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvws6\" (UniqueName: 
\"kubernetes.io/projected/92aed46c-5740-4407-81ed-4ff642a70c54-kube-api-access-dvws6\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182340 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b633b0c1-73e2-445d-8f82-0f37854b2fc7-encryption-config\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182367 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-audit-policies\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182397 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-oauth-serving-cert\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182424 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xft2m\" (UniqueName: \"kubernetes.io/projected/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-kube-api-access-xft2m\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182448 5129 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gm5z\" (UniqueName: \"kubernetes.io/projected/b633b0c1-73e2-445d-8f82-0f37854b2fc7-kube-api-access-7gm5z\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182501 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b81a68d-81ce-4406-ae24-511cda2d8936-client-ca\") pod \"route-controller-manager-6576b87f9c-98rd8\" (UID: \"4b81a68d-81ce-4406-ae24-511cda2d8936\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182530 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e61dec00-008c-454f-a37d-87b8a21d4733-node-pullsecrets\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182556 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182578 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/92aed46c-5740-4407-81ed-4ff642a70c54-console-oauth-config\") pod 
\"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182618 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8915d74-77d5-4d0f-9264-37b6f8167a6d-serving-cert\") pod \"controller-manager-879f6c89f-xgc7f\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182644 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b633b0c1-73e2-445d-8f82-0f37854b2fc7-audit-dir\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182669 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61dec00-008c-454f-a37d-87b8a21d4733-config\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182693 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/92aed46c-5740-4407-81ed-4ff642a70c54-console-serving-cert\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182716 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffbgg\" (UniqueName: 
\"kubernetes.io/projected/89e1dfc8-c715-4bc9-9f72-764254643adc-kube-api-access-ffbgg\") pod \"openshift-apiserver-operator-796bbdcf4f-9d5pw\" (UID: \"89e1dfc8-c715-4bc9-9f72-764254643adc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9d5pw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182754 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-trusted-ca-bundle\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182774 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/095d17bf-b9c3-42e6-b8c9-39b929c52d50-images\") pod \"machine-api-operator-5694c8668f-vtzj7\" (UID: \"095d17bf-b9c3-42e6-b8c9-39b929c52d50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vtzj7" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182795 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e61dec00-008c-454f-a37d-87b8a21d4733-image-import-ca\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182818 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e61dec00-008c-454f-a37d-87b8a21d4733-encryption-config\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 
07:02:21.182841 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8915d74-77d5-4d0f-9264-37b6f8167a6d-client-ca\") pod \"controller-manager-879f6c89f-xgc7f\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182866 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b633b0c1-73e2-445d-8f82-0f37854b2fc7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182888 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8915d74-77d5-4d0f-9264-37b6f8167a6d-config\") pod \"controller-manager-879f6c89f-xgc7f\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182927 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b81a68d-81ce-4406-ae24-511cda2d8936-serving-cert\") pod \"route-controller-manager-6576b87f9c-98rd8\" (UID: \"4b81a68d-81ce-4406-ae24-511cda2d8936\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182949 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/095d17bf-b9c3-42e6-b8c9-39b929c52d50-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-vtzj7\" (UID: \"095d17bf-b9c3-42e6-b8c9-39b929c52d50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vtzj7" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.182973 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e61dec00-008c-454f-a37d-87b8a21d4733-etcd-client\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.183001 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vnfw\" (UniqueName: \"kubernetes.io/projected/65a0ac6e-2c3b-4f24-8495-b13aa58017c1-kube-api-access-7vnfw\") pod \"machine-approver-56656f9798-gnkjw\" (UID: \"65a0ac6e-2c3b-4f24-8495-b13aa58017c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.183038 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e61dec00-008c-454f-a37d-87b8a21d4733-audit\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.183063 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89e1dfc8-c715-4bc9-9f72-764254643adc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9d5pw\" (UID: \"89e1dfc8-c715-4bc9-9f72-764254643adc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9d5pw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.183088 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.183115 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.183137 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.183163 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-console-config\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.183188 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/e61dec00-008c-454f-a37d-87b8a21d4733-etcd-serving-ca\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.183212 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e61dec00-008c-454f-a37d-87b8a21d4733-trusted-ca-bundle\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.183238 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89tfr\" (UniqueName: \"kubernetes.io/projected/4b81a68d-81ce-4406-ae24-511cda2d8936-kube-api-access-89tfr\") pod \"route-controller-manager-6576b87f9c-98rd8\" (UID: \"4b81a68d-81ce-4406-ae24-511cda2d8936\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.183263 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65a0ac6e-2c3b-4f24-8495-b13aa58017c1-auth-proxy-config\") pod \"machine-approver-56656f9798-gnkjw\" (UID: \"65a0ac6e-2c3b-4f24-8495-b13aa58017c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.183290 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e61dec00-008c-454f-a37d-87b8a21d4733-audit-dir\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc 
kubenswrapper[5129]: I0314 07:02:21.183317 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89e1dfc8-c715-4bc9-9f72-764254643adc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9d5pw\" (UID: \"89e1dfc8-c715-4bc9-9f72-764254643adc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9d5pw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.183347 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b81a68d-81ce-4406-ae24-511cda2d8936-config\") pod \"route-controller-manager-6576b87f9c-98rd8\" (UID: \"4b81a68d-81ce-4406-ae24-511cda2d8936\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.195638 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6z96"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.196368 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.196388 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.196510 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.196559 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.196571 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.196787 5129 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hcrvq"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.196999 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6z96" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.197108 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gkdgj"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.197203 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.198167 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gkdgj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.198353 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.198459 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.198663 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.203706 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-trtr5"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.204363 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.204376 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-trtr5" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.204703 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.205578 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-hrz85"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.206144 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-hrz85" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.206744 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.207427 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.209175 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-cqxl7"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.209588 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.209777 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nz9kg"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.210096 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nz9kg" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.211756 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mss56"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.212012 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pggtq"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.212293 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.212463 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mss56" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.216479 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pphjw"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.217236 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-pphjw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.223315 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.223837 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.225025 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.225299 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.225455 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.225640 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.225980 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.226172 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.226344 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.226516 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 
07:02:21.226730 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.226766 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nfv9f"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.227517 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nfv9f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.227710 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lr6qd"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.227900 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.227947 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.228043 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.228202 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lr6qd" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.228412 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.228416 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-gvck8"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.228456 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.228471 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.228554 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.228603 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.229433 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.229519 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.229672 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rsmg2"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.229766 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 14 07:02:21 crc 
kubenswrapper[5129]: I0314 07:02:21.229937 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.229971 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.230008 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rsmg2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.230022 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.230055 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.230138 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.230149 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.230317 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.230398 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.229973 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.229976 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.230667 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.230991 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.231102 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.231184 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.231309 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.231478 
5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.231553 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-m7rjz"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.231898 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.232045 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.232243 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.232376 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.232523 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-m7rjz" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.237257 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.237788 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.239913 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.241382 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.241666 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.241919 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.246501 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.249736 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.249840 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.250051 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.250104 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.250222 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 
07:02:21.250881 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tm2rg"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.255215 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.265770 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.266731 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.266841 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.267261 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.267778 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tm2rg" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.267956 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.268455 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4z5nz"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.269844 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.270156 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.270397 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4z5nz" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.273914 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.271307 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.273017 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.275435 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.276565 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.277519 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.278288 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.280476 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.283282 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557862-lf4bl"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288157 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557862-lf4bl" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.284942 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-audit-dir\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288365 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288410 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cf5zb\" (UniqueName: \"kubernetes.io/projected/095d17bf-b9c3-42e6-b8c9-39b929c52d50-kube-api-access-cf5zb\") pod \"machine-api-operator-5694c8668f-vtzj7\" (UID: \"095d17bf-b9c3-42e6-b8c9-39b929c52d50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vtzj7" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288446 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c116925-b69d-4719-9e7b-7bfdf13ee5f5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hcrvq\" (UID: \"6c116925-b69d-4719-9e7b-7bfdf13ee5f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288475 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8752z\" (UniqueName: \"kubernetes.io/projected/e8915d74-77d5-4d0f-9264-37b6f8167a6d-kube-api-access-8752z\") pod \"controller-manager-879f6c89f-xgc7f\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288503 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97c3ccc-71bd-4f3e-8457-5e023732cb5b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mss56\" (UID: \"a97c3ccc-71bd-4f3e-8457-5e023732cb5b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mss56" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288535 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: 
\"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288563 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcrn8\" (UniqueName: \"kubernetes.io/projected/b99871bf-2660-4b7c-8ba4-e9cdae008681-kube-api-access-xcrn8\") pod \"openshift-config-operator-7777fb866f-trtr5\" (UID: \"b99871bf-2660-4b7c-8ba4-e9cdae008681\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-trtr5" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288589 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ws92\" (UniqueName: \"kubernetes.io/projected/285a43ca-3a99-4b85-96b3-6b8619bb5853-kube-api-access-5ws92\") pod \"ingress-operator-5b745b69d9-dgls2\" (UID: \"285a43ca-3a99-4b85-96b3-6b8619bb5853\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288615 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fa4b0c05-3ff2-4e52-b028-603d2eb22adc-images\") pod \"machine-config-operator-74547568cd-d5bdj\" (UID: \"fa4b0c05-3ff2-4e52-b028-603d2eb22adc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288668 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncv72\" (UniqueName: \"kubernetes.io/projected/e16d3836-813b-4a25-8dbb-8f5330008ec7-kube-api-access-ncv72\") pod \"openshift-controller-manager-operator-756b6f6bc6-lr6qd\" (UID: \"e16d3836-813b-4a25-8dbb-8f5330008ec7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lr6qd" Mar 14 07:02:21 crc 
kubenswrapper[5129]: I0314 07:02:21.288704 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41165ac7-1905-437e-b313-2917b3f80168-webhook-cert\") pod \"packageserver-d55dfcdfc-kmpng\" (UID: \"41165ac7-1905-437e-b313-2917b3f80168\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288730 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65a0ac6e-2c3b-4f24-8495-b13aa58017c1-config\") pod \"machine-approver-56656f9798-gnkjw\" (UID: \"65a0ac6e-2c3b-4f24-8495-b13aa58017c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288753 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fa4b0c05-3ff2-4e52-b028-603d2eb22adc-proxy-tls\") pod \"machine-config-operator-74547568cd-d5bdj\" (UID: \"fa4b0c05-3ff2-4e52-b028-603d2eb22adc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288777 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/29a50e5b-e484-4916-be70-893887f8405e-stats-auth\") pod \"router-default-5444994796-gvck8\" (UID: \"29a50e5b-e484-4916-be70-893887f8405e\") " pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288795 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c116925-b69d-4719-9e7b-7bfdf13ee5f5-config\") pod \"authentication-operator-69f744f599-hcrvq\" (UID: 
\"6c116925-b69d-4719-9e7b-7bfdf13ee5f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288816 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nfv9f\" (UID: \"2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nfv9f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288840 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b633b0c1-73e2-445d-8f82-0f37854b2fc7-serving-cert\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288861 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b99871bf-2660-4b7c-8ba4-e9cdae008681-available-featuregates\") pod \"openshift-config-operator-7777fb866f-trtr5\" (UID: \"b99871bf-2660-4b7c-8ba4-e9cdae008681\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-trtr5" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288882 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a97c3ccc-71bd-4f3e-8457-5e023732cb5b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mss56\" (UID: \"a97c3ccc-71bd-4f3e-8457-5e023732cb5b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mss56" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288920 5129 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29a50e5b-e484-4916-be70-893887f8405e-service-ca-bundle\") pod \"router-default-5444994796-gvck8\" (UID: \"29a50e5b-e484-4916-be70-893887f8405e\") " pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288945 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76rpd\" (UniqueName: \"kubernetes.io/projected/41165ac7-1905-437e-b313-2917b3f80168-kube-api-access-76rpd\") pod \"packageserver-d55dfcdfc-kmpng\" (UID: \"41165ac7-1905-437e-b313-2917b3f80168\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288966 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrvzs\" (UniqueName: \"kubernetes.io/projected/ce155720-ca35-4e8d-8b93-1d20fe7368e6-kube-api-access-hrvzs\") pod \"package-server-manager-789f6589d5-tm2rg\" (UID: \"ce155720-ca35-4e8d-8b93-1d20fe7368e6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tm2rg" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.288993 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b633b0c1-73e2-445d-8f82-0f37854b2fc7-encryption-config\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289016 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-audit-policies\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: 
\"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289045 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/285a43ca-3a99-4b85-96b3-6b8619bb5853-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dgls2\" (UID: \"285a43ca-3a99-4b85-96b3-6b8619bb5853\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289074 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdvcw\" (UniqueName: \"kubernetes.io/projected/14a59e71-3fa0-4957-b79e-927190083c0d-kube-api-access-kdvcw\") pod \"multus-admission-controller-857f4d67dd-pphjw\" (UID: \"14a59e71-3fa0-4957-b79e-927190083c0d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pphjw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289106 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/92aed46c-5740-4407-81ed-4ff642a70c54-console-oauth-config\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289130 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/92aed46c-5740-4407-81ed-4ff642a70c54-console-serving-cert\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289152 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/599bd4dc-4ab7-403b-9ede-30ed9dbca320-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rsmg2\" (UID: \"599bd4dc-4ab7-403b-9ede-30ed9dbca320\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rsmg2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289172 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/29a50e5b-e484-4916-be70-893887f8405e-default-certificate\") pod \"router-default-5444994796-gvck8\" (UID: \"29a50e5b-e484-4916-be70-893887f8405e\") " pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289192 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfbd40ee-070a-48b0-a67b-deb66288082b-trusted-ca\") pod \"console-operator-58897d9998-gkdgj\" (UID: \"dfbd40ee-070a-48b0-a67b-deb66288082b\") " pod="openshift-console-operator/console-operator-58897d9998-gkdgj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289214 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/405015c6-fd32-4d6f-86ba-7d44197a6a23-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s4fr9\" (UID: \"405015c6-fd32-4d6f-86ba-7d44197a6a23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289234 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fa4b0c05-3ff2-4e52-b028-603d2eb22adc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d5bdj\" (UID: 
\"fa4b0c05-3ff2-4e52-b028-603d2eb22adc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289266 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-trusted-ca-bundle\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289285 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/095d17bf-b9c3-42e6-b8c9-39b929c52d50-images\") pod \"machine-api-operator-5694c8668f-vtzj7\" (UID: \"095d17bf-b9c3-42e6-b8c9-39b929c52d50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vtzj7" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289305 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8915d74-77d5-4d0f-9264-37b6f8167a6d-config\") pod \"controller-manager-879f6c89f-xgc7f\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289323 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/405015c6-fd32-4d6f-86ba-7d44197a6a23-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s4fr9\" (UID: \"405015c6-fd32-4d6f-86ba-7d44197a6a23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289343 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/0d03542f-7483-4aea-9651-4ab9a4ff1378-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-j6z96\" (UID: \"0d03542f-7483-4aea-9651-4ab9a4ff1378\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6z96" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289368 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e61dec00-008c-454f-a37d-87b8a21d4733-etcd-client\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289385 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfbd40ee-070a-48b0-a67b-deb66288082b-config\") pod \"console-operator-58897d9998-gkdgj\" (UID: \"dfbd40ee-070a-48b0-a67b-deb66288082b\") " pod="openshift-console-operator/console-operator-58897d9998-gkdgj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289405 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89e1dfc8-c715-4bc9-9f72-764254643adc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9d5pw\" (UID: \"89e1dfc8-c715-4bc9-9f72-764254643adc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9d5pw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289426 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc 
kubenswrapper[5129]: I0314 07:02:21.289449 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vnfw\" (UniqueName: \"kubernetes.io/projected/65a0ac6e-2c3b-4f24-8495-b13aa58017c1-kube-api-access-7vnfw\") pod \"machine-approver-56656f9798-gnkjw\" (UID: \"65a0ac6e-2c3b-4f24-8495-b13aa58017c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289465 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e61dec00-008c-454f-a37d-87b8a21d4733-audit\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289483 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289501 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289519 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-console-config\") pod \"console-f9d7485db-vpw78\" (UID: 
\"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289538 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6de214af-00b4-4c8f-8f7d-633419b87e45-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nz9kg\" (UID: \"6de214af-00b4-4c8f-8f7d-633419b87e45\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nz9kg" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289559 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65a0ac6e-2c3b-4f24-8495-b13aa58017c1-auth-proxy-config\") pod \"machine-approver-56656f9798-gnkjw\" (UID: \"65a0ac6e-2c3b-4f24-8495-b13aa58017c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289585 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e61dec00-008c-454f-a37d-87b8a21d4733-audit-dir\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289608 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89e1dfc8-c715-4bc9-9f72-764254643adc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9d5pw\" (UID: \"89e1dfc8-c715-4bc9-9f72-764254643adc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9d5pw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289629 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289666 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289689 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzqgg\" (UniqueName: \"kubernetes.io/projected/dfbd40ee-070a-48b0-a67b-deb66288082b-kube-api-access-zzqgg\") pod \"console-operator-58897d9998-gkdgj\" (UID: \"dfbd40ee-070a-48b0-a67b-deb66288082b\") " pod="openshift-console-operator/console-operator-58897d9998-gkdgj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289707 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b633b0c1-73e2-445d-8f82-0f37854b2fc7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289727 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/65a0ac6e-2c3b-4f24-8495-b13aa58017c1-machine-approver-tls\") pod \"machine-approver-56656f9798-gnkjw\" (UID: \"65a0ac6e-2c3b-4f24-8495-b13aa58017c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw" Mar 14 
07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289750 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289768 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhqqj\" (UniqueName: \"kubernetes.io/projected/599bd4dc-4ab7-403b-9ede-30ed9dbca320-kube-api-access-qhqqj\") pod \"kube-storage-version-migrator-operator-b67b599dd-rsmg2\" (UID: \"599bd4dc-4ab7-403b-9ede-30ed9dbca320\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rsmg2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289790 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e61dec00-008c-454f-a37d-87b8a21d4733-serving-cert\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289807 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfbd40ee-070a-48b0-a67b-deb66288082b-serving-cert\") pod \"console-operator-58897d9998-gkdgj\" (UID: \"dfbd40ee-070a-48b0-a67b-deb66288082b\") " pod="openshift-console-operator/console-operator-58897d9998-gkdgj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289827 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/285a43ca-3a99-4b85-96b3-6b8619bb5853-metrics-tls\") pod \"ingress-operator-5b745b69d9-dgls2\" (UID: \"285a43ca-3a99-4b85-96b3-6b8619bb5853\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289844 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b81a1cd4-9e3a-411a-921f-3cd1139be7e3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4z5nz\" (UID: \"b81a1cd4-9e3a-411a-921f-3cd1139be7e3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4z5nz" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289861 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/599bd4dc-4ab7-403b-9ede-30ed9dbca320-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rsmg2\" (UID: \"599bd4dc-4ab7-403b-9ede-30ed9dbca320\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rsmg2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289877 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glz5g\" (UniqueName: \"kubernetes.io/projected/29a50e5b-e484-4916-be70-893887f8405e-kube-api-access-glz5g\") pod \"router-default-5444994796-gvck8\" (UID: \"29a50e5b-e484-4916-be70-893887f8405e\") " pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289906 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e8915d74-77d5-4d0f-9264-37b6f8167a6d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xgc7f\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289922 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-service-ca\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289939 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/095d17bf-b9c3-42e6-b8c9-39b929c52d50-config\") pod \"machine-api-operator-5694c8668f-vtzj7\" (UID: \"095d17bf-b9c3-42e6-b8c9-39b929c52d50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vtzj7" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289961 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/42d74e97-1c27-4447-afe2-d10bb4b3a1b6-signing-cabundle\") pod \"service-ca-9c57cc56f-m7rjz\" (UID: \"42d74e97-1c27-4447-afe2-d10bb4b3a1b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-m7rjz" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.289998 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.290024 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b633b0c1-73e2-445d-8f82-0f37854b2fc7-audit-policies\") pod 
\"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.290048 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dvmr\" (UniqueName: \"kubernetes.io/projected/e61dec00-008c-454f-a37d-87b8a21d4733-kube-api-access-8dvmr\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.290071 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nfv9f\" (UID: \"2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nfv9f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.290097 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.290120 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b81a1cd4-9e3a-411a-921f-3cd1139be7e3-proxy-tls\") pod \"machine-config-controller-84d6567774-4z5nz\" (UID: \"b81a1cd4-9e3a-411a-921f-3cd1139be7e3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4z5nz" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.290147 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c116925-b69d-4719-9e7b-7bfdf13ee5f5-service-ca-bundle\") pod \"authentication-operator-69f744f599-hcrvq\" (UID: \"6c116925-b69d-4719-9e7b-7bfdf13ee5f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.290445 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.290787 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6de214af-00b4-4c8f-8f7d-633419b87e45-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nz9kg\" (UID: \"6de214af-00b4-4c8f-8f7d-633419b87e45\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nz9kg" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.290820 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvws6\" (UniqueName: \"kubernetes.io/projected/92aed46c-5740-4407-81ed-4ff642a70c54-kube-api-access-dvws6\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.290840 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-oauth-serving-cert\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.290857 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/41165ac7-1905-437e-b313-2917b3f80168-tmpfs\") pod \"packageserver-d55dfcdfc-kmpng\" (UID: \"41165ac7-1905-437e-b313-2917b3f80168\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.290878 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xft2m\" (UniqueName: \"kubernetes.io/projected/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-kube-api-access-xft2m\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.290899 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gm5z\" (UniqueName: \"kubernetes.io/projected/b633b0c1-73e2-445d-8f82-0f37854b2fc7-kube-api-access-7gm5z\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.290921 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b81a68d-81ce-4406-ae24-511cda2d8936-client-ca\") pod \"route-controller-manager-6576b87f9c-98rd8\" (UID: \"4b81a68d-81ce-4406-ae24-511cda2d8936\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.290941 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41165ac7-1905-437e-b313-2917b3f80168-apiservice-cert\") pod \"packageserver-d55dfcdfc-kmpng\" (UID: \"41165ac7-1905-437e-b313-2917b3f80168\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" Mar 14 07:02:21 crc 
kubenswrapper[5129]: I0314 07:02:21.290984 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e16d3836-813b-4a25-8dbb-8f5330008ec7-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lr6qd\" (UID: \"e16d3836-813b-4a25-8dbb-8f5330008ec7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lr6qd" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.291021 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e61dec00-008c-454f-a37d-87b8a21d4733-node-pullsecrets\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.291071 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.291102 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8915d74-77d5-4d0f-9264-37b6f8167a6d-serving-cert\") pod \"controller-manager-879f6c89f-xgc7f\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.291122 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b633b0c1-73e2-445d-8f82-0f37854b2fc7-audit-dir\") pod \"apiserver-7bbb656c7d-6l8xj\" 
(UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.291141 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61dec00-008c-454f-a37d-87b8a21d4733-config\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.291161 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffbgg\" (UniqueName: \"kubernetes.io/projected/89e1dfc8-c715-4bc9-9f72-764254643adc-kube-api-access-ffbgg\") pod \"openshift-apiserver-operator-796bbdcf4f-9d5pw\" (UID: \"89e1dfc8-c715-4bc9-9f72-764254643adc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9d5pw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.291182 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29a50e5b-e484-4916-be70-893887f8405e-metrics-certs\") pod \"router-default-5444994796-gvck8\" (UID: \"29a50e5b-e484-4916-be70-893887f8405e\") " pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.284189 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.285012 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-audit-dir\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.291632 5129 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.292853 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/095d17bf-b9c3-42e6-b8c9-39b929c52d50-config\") pod \"machine-api-operator-5694c8668f-vtzj7\" (UID: \"095d17bf-b9c3-42e6-b8c9-39b929c52d50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vtzj7" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.293902 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65a0ac6e-2c3b-4f24-8495-b13aa58017c1-config\") pod \"machine-approver-56656f9798-gnkjw\" (UID: \"65a0ac6e-2c3b-4f24-8495-b13aa58017c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.294233 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b81a68d-81ce-4406-ae24-511cda2d8936-client-ca\") pod \"route-controller-manager-6576b87f9c-98rd8\" (UID: \"4b81a68d-81ce-4406-ae24-511cda2d8936\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.294228 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e61dec00-008c-454f-a37d-87b8a21d4733-audit\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.294308 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e61dec00-008c-454f-a37d-87b8a21d4733-node-pullsecrets\") pod 
\"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.294643 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b633b0c1-73e2-445d-8f82-0f37854b2fc7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.294559 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.295397 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-audit-policies\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.296840 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.297974 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.298635 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.301819 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b633b0c1-73e2-445d-8f82-0f37854b2fc7-audit-dir\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.302463 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-console-config\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.303176 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b633b0c1-73e2-445d-8f82-0f37854b2fc7-audit-policies\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.303466 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e61dec00-008c-454f-a37d-87b8a21d4733-audit-dir\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.304031 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-oauth-serving-cert\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " 
pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.304136 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.305319 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89e1dfc8-c715-4bc9-9f72-764254643adc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9d5pw\" (UID: \"89e1dfc8-c715-4bc9-9f72-764254643adc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9d5pw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.305270 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-service-ca\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.305424 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/65a0ac6e-2c3b-4f24-8495-b13aa58017c1-machine-approver-tls\") pod \"machine-approver-56656f9798-gnkjw\" (UID: \"65a0ac6e-2c3b-4f24-8495-b13aa58017c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.305493 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65a0ac6e-2c3b-4f24-8495-b13aa58017c1-auth-proxy-config\") pod \"machine-approver-56656f9798-gnkjw\" (UID: 
\"65a0ac6e-2c3b-4f24-8495-b13aa58017c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.305817 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.306289 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.291202 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klptf\" (UniqueName: \"kubernetes.io/projected/50cfd6b1-bf1f-4a8b-bd6d-9799281f1359-kube-api-access-klptf\") pod \"downloads-7954f5f757-hrz85\" (UID: \"50cfd6b1-bf1f-4a8b-bd6d-9799281f1359\") " pod="openshift-console/downloads-7954f5f757-hrz85" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.317795 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-trusted-ca-bundle\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.319850 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e8915d74-77d5-4d0f-9264-37b6f8167a6d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xgc7f\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:21 crc 
kubenswrapper[5129]: I0314 07:02:21.320100 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61dec00-008c-454f-a37d-87b8a21d4733-config\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.320208 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b633b0c1-73e2-445d-8f82-0f37854b2fc7-encryption-config\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.320474 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b633b0c1-73e2-445d-8f82-0f37854b2fc7-serving-cert\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.320873 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/92aed46c-5740-4407-81ed-4ff642a70c54-console-oauth-config\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.321046 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/92aed46c-5740-4407-81ed-4ff642a70c54-console-serving-cert\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.321126 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8915d74-77d5-4d0f-9264-37b6f8167a6d-serving-cert\") pod \"controller-manager-879f6c89f-xgc7f\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.322193 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8915d74-77d5-4d0f-9264-37b6f8167a6d-config\") pod \"controller-manager-879f6c89f-xgc7f\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.322282 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2b5g4"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.322878 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbgc2\" (UniqueName: \"kubernetes.io/projected/0d03542f-7483-4aea-9651-4ab9a4ff1378-kube-api-access-wbgc2\") pod \"cluster-samples-operator-665b6dd947-j6z96\" (UID: \"0d03542f-7483-4aea-9651-4ab9a4ff1378\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6z96" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.322923 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e61dec00-008c-454f-a37d-87b8a21d4733-image-import-ca\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.322961 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.322976 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/285a43ca-3a99-4b85-96b3-6b8619bb5853-trusted-ca\") pod \"ingress-operator-5b745b69d9-dgls2\" (UID: \"285a43ca-3a99-4b85-96b3-6b8619bb5853\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.323025 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.323224 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e61dec00-008c-454f-a37d-87b8a21d4733-serving-cert\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.290589 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.323564 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.323727 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-w6sj2\" (UniqueName: \"kubernetes.io/projected/fa4b0c05-3ff2-4e52-b028-603d2eb22adc-kube-api-access-w6sj2\") pod \"machine-config-operator-74547568cd-d5bdj\" (UID: \"fa4b0c05-3ff2-4e52-b028-603d2eb22adc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.323779 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e61dec00-008c-454f-a37d-87b8a21d4733-encryption-config\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.323879 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c116925-b69d-4719-9e7b-7bfdf13ee5f5-serving-cert\") pod \"authentication-operator-69f744f599-hcrvq\" (UID: \"6c116925-b69d-4719-9e7b-7bfdf13ee5f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.323917 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/405015c6-fd32-4d6f-86ba-7d44197a6a23-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s4fr9\" (UID: \"405015c6-fd32-4d6f-86ba-7d44197a6a23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.324011 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8915d74-77d5-4d0f-9264-37b6f8167a6d-client-ca\") pod \"controller-manager-879f6c89f-xgc7f\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.324149 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b633b0c1-73e2-445d-8f82-0f37854b2fc7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.324183 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-2b5g4" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.324544 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e61dec00-008c-454f-a37d-87b8a21d4733-image-import-ca\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.324623 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cb8c5"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.325065 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx4pb\" (UniqueName: \"kubernetes.io/projected/42d74e97-1c27-4447-afe2-d10bb4b3a1b6-kube-api-access-cx4pb\") pod \"service-ca-9c57cc56f-m7rjz\" (UID: \"42d74e97-1c27-4447-afe2-d10bb4b3a1b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-m7rjz" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.326062 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a97c3ccc-71bd-4f3e-8457-5e023732cb5b-config\") pod \"kube-apiserver-operator-766d6c64bb-mss56\" (UID: 
\"a97c3ccc-71bd-4f3e-8457-5e023732cb5b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mss56" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.326135 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/095d17bf-b9c3-42e6-b8c9-39b929c52d50-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vtzj7\" (UID: \"095d17bf-b9c3-42e6-b8c9-39b929c52d50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vtzj7" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.326140 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/095d17bf-b9c3-42e6-b8c9-39b929c52d50-images\") pod \"machine-api-operator-5694c8668f-vtzj7\" (UID: \"095d17bf-b9c3-42e6-b8c9-39b929c52d50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vtzj7" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.326482 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89e1dfc8-c715-4bc9-9f72-764254643adc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9d5pw\" (UID: \"89e1dfc8-c715-4bc9-9f72-764254643adc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9d5pw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.326712 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b633b0c1-73e2-445d-8f82-0f37854b2fc7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.326721 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.326962 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/42d74e97-1c27-4447-afe2-d10bb4b3a1b6-signing-key\") pod \"service-ca-9c57cc56f-m7rjz\" (UID: \"42d74e97-1c27-4447-afe2-d10bb4b3a1b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-m7rjz" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.326974 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.326981 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.327006 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce155720-ca35-4e8d-8b93-1d20fe7368e6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tm2rg\" (UID: \"ce155720-ca35-4e8d-8b93-1d20fe7368e6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tm2rg" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.327061 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de214af-00b4-4c8f-8f7d-633419b87e45-config\") pod \"kube-controller-manager-operator-78b949d7b-nz9kg\" (UID: \"6de214af-00b4-4c8f-8f7d-633419b87e45\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nz9kg" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.327335 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8915d74-77d5-4d0f-9264-37b6f8167a6d-client-ca\") pod \"controller-manager-879f6c89f-xgc7f\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.327566 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b81a68d-81ce-4406-ae24-511cda2d8936-serving-cert\") pod \"route-controller-manager-6576b87f9c-98rd8\" (UID: \"4b81a68d-81ce-4406-ae24-511cda2d8936\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.327694 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.329356 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: 
\"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.330239 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e61dec00-008c-454f-a37d-87b8a21d4733-etcd-serving-ca\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.330339 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e61dec00-008c-454f-a37d-87b8a21d4733-trusted-ca-bundle\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.330371 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9mbb\" (UniqueName: \"kubernetes.io/projected/405015c6-fd32-4d6f-86ba-7d44197a6a23-kube-api-access-g9mbb\") pod \"cluster-image-registry-operator-dc59b4c8b-s4fr9\" (UID: \"405015c6-fd32-4d6f-86ba-7d44197a6a23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.330436 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b81a68d-81ce-4406-ae24-511cda2d8936-config\") pod \"route-controller-manager-6576b87f9c-98rd8\" (UID: \"4b81a68d-81ce-4406-ae24-511cda2d8936\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.330479 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89tfr\" (UniqueName: 
\"kubernetes.io/projected/4b81a68d-81ce-4406-ae24-511cda2d8936-kube-api-access-89tfr\") pod \"route-controller-manager-6576b87f9c-98rd8\" (UID: \"4b81a68d-81ce-4406-ae24-511cda2d8936\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.331233 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e61dec00-008c-454f-a37d-87b8a21d4733-etcd-serving-ca\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.331285 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.331309 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/14a59e71-3fa0-4957-b79e-927190083c0d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pphjw\" (UID: \"14a59e71-3fa0-4957-b79e-927190083c0d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pphjw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.331340 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9nn4\" (UniqueName: \"kubernetes.io/projected/6c116925-b69d-4719-9e7b-7bfdf13ee5f5-kube-api-access-m9nn4\") pod \"authentication-operator-69f744f599-hcrvq\" (UID: \"6c116925-b69d-4719-9e7b-7bfdf13ee5f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.331404 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e16d3836-813b-4a25-8dbb-8f5330008ec7-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lr6qd\" (UID: \"e16d3836-813b-4a25-8dbb-8f5330008ec7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lr6qd" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.331428 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nfv9f\" (UID: \"2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nfv9f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.331463 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b633b0c1-73e2-445d-8f82-0f37854b2fc7-etcd-client\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.331530 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b99871bf-2660-4b7c-8ba4-e9cdae008681-serving-cert\") pod \"openshift-config-operator-7777fb866f-trtr5\" (UID: \"b99871bf-2660-4b7c-8ba4-e9cdae008681\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-trtr5" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.331541 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.331558 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkwwp\" (UniqueName: \"kubernetes.io/projected/b81a1cd4-9e3a-411a-921f-3cd1139be7e3-kube-api-access-qkwwp\") pod \"machine-config-controller-84d6567774-4z5nz\" (UID: \"b81a1cd4-9e3a-411a-921f-3cd1139be7e3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4z5nz" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.331557 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rrt62"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.333015 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rrt62" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.333386 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.334198 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b81a68d-81ce-4406-ae24-511cda2d8936-config\") pod \"route-controller-manager-6576b87f9c-98rd8\" (UID: \"4b81a68d-81ce-4406-ae24-511cda2d8936\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.334924 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/e61dec00-008c-454f-a37d-87b8a21d4733-etcd-client\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.335539 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e61dec00-008c-454f-a37d-87b8a21d4733-trusted-ca-bundle\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.335912 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e61dec00-008c-454f-a37d-87b8a21d4733-encryption-config\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.336228 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xgc7f"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.340226 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/095d17bf-b9c3-42e6-b8c9-39b929c52d50-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vtzj7\" (UID: \"095d17bf-b9c3-42e6-b8c9-39b929c52d50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vtzj7" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.341859 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vpw78"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.342558 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g"] Mar 14 07:02:21 crc 
kubenswrapper[5129]: I0314 07:02:21.343621 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.343874 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b633b0c1-73e2-445d-8f82-0f37854b2fc7-etcd-client\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.344035 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-msrqq"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.344673 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-msrqq" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.346536 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qjqs7"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.348525 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b81a68d-81ce-4406-ae24-511cda2d8936-serving-cert\") pod \"route-controller-manager-6576b87f9c-98rd8\" (UID: \"4b81a68d-81ce-4406-ae24-511cda2d8936\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.348805 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-k7q7x"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.348884 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qjqs7" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.350776 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.350908 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.352151 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vtzj7"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.354320 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fmn4k"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.355743 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9d5pw"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.357413 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gkdgj"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.358749 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rc492"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.359892 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.364716 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-cqxl7"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.364793 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nfv9f"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.368995 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lr6qd"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.370420 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.371256 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.372804 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-trtr5"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.376408 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6z96"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.379677 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rsmg2"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.384019 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hcrvq"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.385916 
5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.389319 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.390785 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.392552 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mss56"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.393880 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hrz85"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.395270 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-m7rjz"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.396309 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nz9kg"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.397414 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.398452 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pphjw"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.399462 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-t9ngb"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.400581 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-t9ngb" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.400679 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.402252 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rc492"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.403326 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.404296 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557862-lf4bl"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.405355 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4z5nz"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.406487 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.407484 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-msrqq"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.408684 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rrt62"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.409435 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.409797 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pggtq"] Mar 14 07:02:21 crc 
kubenswrapper[5129]: I0314 07:02:21.410942 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tm2rg"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.411947 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.412960 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2b5g4"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.413945 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cb8c5"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.414900 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qjqs7"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.415898 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-dsrgt"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.416657 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dsrgt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.416942 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-f2lk5"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.417682 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-f2lk5" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.418001 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t9ngb"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.419049 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-f2lk5"] Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.429791 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.433370 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6de214af-00b4-4c8f-8f7d-633419b87e45-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nz9kg\" (UID: \"6de214af-00b4-4c8f-8f7d-633419b87e45\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nz9kg" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.433551 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzqgg\" (UniqueName: \"kubernetes.io/projected/dfbd40ee-070a-48b0-a67b-deb66288082b-kube-api-access-zzqgg\") pod \"console-operator-58897d9998-gkdgj\" (UID: \"dfbd40ee-070a-48b0-a67b-deb66288082b\") " pod="openshift-console-operator/console-operator-58897d9998-gkdgj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.433732 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhqqj\" (UniqueName: \"kubernetes.io/projected/599bd4dc-4ab7-403b-9ede-30ed9dbca320-kube-api-access-qhqqj\") pod \"kube-storage-version-migrator-operator-b67b599dd-rsmg2\" (UID: \"599bd4dc-4ab7-403b-9ede-30ed9dbca320\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rsmg2" Mar 14 07:02:21 crc 
kubenswrapper[5129]: I0314 07:02:21.433904 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfbd40ee-070a-48b0-a67b-deb66288082b-serving-cert\") pod \"console-operator-58897d9998-gkdgj\" (UID: \"dfbd40ee-070a-48b0-a67b-deb66288082b\") " pod="openshift-console-operator/console-operator-58897d9998-gkdgj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.434020 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/285a43ca-3a99-4b85-96b3-6b8619bb5853-metrics-tls\") pod \"ingress-operator-5b745b69d9-dgls2\" (UID: \"285a43ca-3a99-4b85-96b3-6b8619bb5853\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.434133 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b81a1cd4-9e3a-411a-921f-3cd1139be7e3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4z5nz\" (UID: \"b81a1cd4-9e3a-411a-921f-3cd1139be7e3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4z5nz" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.434253 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/599bd4dc-4ab7-403b-9ede-30ed9dbca320-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rsmg2\" (UID: \"599bd4dc-4ab7-403b-9ede-30ed9dbca320\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rsmg2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.434361 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glz5g\" (UniqueName: \"kubernetes.io/projected/29a50e5b-e484-4916-be70-893887f8405e-kube-api-access-glz5g\") pod 
\"router-default-5444994796-gvck8\" (UID: \"29a50e5b-e484-4916-be70-893887f8405e\") " pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.434762 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/42d74e97-1c27-4447-afe2-d10bb4b3a1b6-signing-cabundle\") pod \"service-ca-9c57cc56f-m7rjz\" (UID: \"42d74e97-1c27-4447-afe2-d10bb4b3a1b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-m7rjz" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.434912 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b81a1cd4-9e3a-411a-921f-3cd1139be7e3-proxy-tls\") pod \"machine-config-controller-84d6567774-4z5nz\" (UID: \"b81a1cd4-9e3a-411a-921f-3cd1139be7e3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4z5nz" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.435017 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nfv9f\" (UID: \"2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nfv9f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.435119 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c116925-b69d-4719-9e7b-7bfdf13ee5f5-service-ca-bundle\") pod \"authentication-operator-69f744f599-hcrvq\" (UID: \"6c116925-b69d-4719-9e7b-7bfdf13ee5f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.435217 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6de214af-00b4-4c8f-8f7d-633419b87e45-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nz9kg\" (UID: \"6de214af-00b4-4c8f-8f7d-633419b87e45\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nz9kg" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.435384 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/41165ac7-1905-437e-b313-2917b3f80168-tmpfs\") pod \"packageserver-d55dfcdfc-kmpng\" (UID: \"41165ac7-1905-437e-b313-2917b3f80168\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.435522 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41165ac7-1905-437e-b313-2917b3f80168-apiservice-cert\") pod \"packageserver-d55dfcdfc-kmpng\" (UID: \"41165ac7-1905-437e-b313-2917b3f80168\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.435664 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e16d3836-813b-4a25-8dbb-8f5330008ec7-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lr6qd\" (UID: \"e16d3836-813b-4a25-8dbb-8f5330008ec7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lr6qd" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.435790 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c116925-b69d-4719-9e7b-7bfdf13ee5f5-service-ca-bundle\") pod \"authentication-operator-69f744f599-hcrvq\" (UID: \"6c116925-b69d-4719-9e7b-7bfdf13ee5f5\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.435797 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29a50e5b-e484-4916-be70-893887f8405e-metrics-certs\") pod \"router-default-5444994796-gvck8\" (UID: \"29a50e5b-e484-4916-be70-893887f8405e\") " pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.435855 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klptf\" (UniqueName: \"kubernetes.io/projected/50cfd6b1-bf1f-4a8b-bd6d-9799281f1359-kube-api-access-klptf\") pod \"downloads-7954f5f757-hrz85\" (UID: \"50cfd6b1-bf1f-4a8b-bd6d-9799281f1359\") " pod="openshift-console/downloads-7954f5f757-hrz85" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.435877 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbgc2\" (UniqueName: \"kubernetes.io/projected/0d03542f-7483-4aea-9651-4ab9a4ff1378-kube-api-access-wbgc2\") pod \"cluster-samples-operator-665b6dd947-j6z96\" (UID: \"0d03542f-7483-4aea-9651-4ab9a4ff1378\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6z96" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.435890 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/41165ac7-1905-437e-b313-2917b3f80168-tmpfs\") pod \"packageserver-d55dfcdfc-kmpng\" (UID: \"41165ac7-1905-437e-b313-2917b3f80168\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.435902 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/285a43ca-3a99-4b85-96b3-6b8619bb5853-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-dgls2\" (UID: \"285a43ca-3a99-4b85-96b3-6b8619bb5853\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.435921 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6sj2\" (UniqueName: \"kubernetes.io/projected/fa4b0c05-3ff2-4e52-b028-603d2eb22adc-kube-api-access-w6sj2\") pod \"machine-config-operator-74547568cd-d5bdj\" (UID: \"fa4b0c05-3ff2-4e52-b028-603d2eb22adc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.435943 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c116925-b69d-4719-9e7b-7bfdf13ee5f5-serving-cert\") pod \"authentication-operator-69f744f599-hcrvq\" (UID: \"6c116925-b69d-4719-9e7b-7bfdf13ee5f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.435959 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/405015c6-fd32-4d6f-86ba-7d44197a6a23-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s4fr9\" (UID: \"405015c6-fd32-4d6f-86ba-7d44197a6a23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.435976 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx4pb\" (UniqueName: \"kubernetes.io/projected/42d74e97-1c27-4447-afe2-d10bb4b3a1b6-kube-api-access-cx4pb\") pod \"service-ca-9c57cc56f-m7rjz\" (UID: \"42d74e97-1c27-4447-afe2-d10bb4b3a1b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-m7rjz" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436006 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a97c3ccc-71bd-4f3e-8457-5e023732cb5b-config\") pod \"kube-apiserver-operator-766d6c64bb-mss56\" (UID: \"a97c3ccc-71bd-4f3e-8457-5e023732cb5b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mss56" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436024 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/42d74e97-1c27-4447-afe2-d10bb4b3a1b6-signing-key\") pod \"service-ca-9c57cc56f-m7rjz\" (UID: \"42d74e97-1c27-4447-afe2-d10bb4b3a1b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-m7rjz" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436041 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce155720-ca35-4e8d-8b93-1d20fe7368e6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tm2rg\" (UID: \"ce155720-ca35-4e8d-8b93-1d20fe7368e6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tm2rg" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436060 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de214af-00b4-4c8f-8f7d-633419b87e45-config\") pod \"kube-controller-manager-operator-78b949d7b-nz9kg\" (UID: \"6de214af-00b4-4c8f-8f7d-633419b87e45\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nz9kg" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436089 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9mbb\" (UniqueName: \"kubernetes.io/projected/405015c6-fd32-4d6f-86ba-7d44197a6a23-kube-api-access-g9mbb\") pod \"cluster-image-registry-operator-dc59b4c8b-s4fr9\" (UID: 
\"405015c6-fd32-4d6f-86ba-7d44197a6a23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436113 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/14a59e71-3fa0-4957-b79e-927190083c0d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pphjw\" (UID: \"14a59e71-3fa0-4957-b79e-927190083c0d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pphjw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436131 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9nn4\" (UniqueName: \"kubernetes.io/projected/6c116925-b69d-4719-9e7b-7bfdf13ee5f5-kube-api-access-m9nn4\") pod \"authentication-operator-69f744f599-hcrvq\" (UID: \"6c116925-b69d-4719-9e7b-7bfdf13ee5f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436149 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b99871bf-2660-4b7c-8ba4-e9cdae008681-serving-cert\") pod \"openshift-config-operator-7777fb866f-trtr5\" (UID: \"b99871bf-2660-4b7c-8ba4-e9cdae008681\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-trtr5" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436166 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16d3836-813b-4a25-8dbb-8f5330008ec7-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lr6qd\" (UID: \"e16d3836-813b-4a25-8dbb-8f5330008ec7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lr6qd" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436183 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nfv9f\" (UID: \"2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nfv9f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436201 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkwwp\" (UniqueName: \"kubernetes.io/projected/b81a1cd4-9e3a-411a-921f-3cd1139be7e3-kube-api-access-qkwwp\") pod \"machine-config-controller-84d6567774-4z5nz\" (UID: \"b81a1cd4-9e3a-411a-921f-3cd1139be7e3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4z5nz" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436228 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c116925-b69d-4719-9e7b-7bfdf13ee5f5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hcrvq\" (UID: \"6c116925-b69d-4719-9e7b-7bfdf13ee5f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436244 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97c3ccc-71bd-4f3e-8457-5e023732cb5b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mss56\" (UID: \"a97c3ccc-71bd-4f3e-8457-5e023732cb5b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mss56" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436261 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcrn8\" (UniqueName: \"kubernetes.io/projected/b99871bf-2660-4b7c-8ba4-e9cdae008681-kube-api-access-xcrn8\") pod 
\"openshift-config-operator-7777fb866f-trtr5\" (UID: \"b99871bf-2660-4b7c-8ba4-e9cdae008681\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-trtr5" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436277 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ws92\" (UniqueName: \"kubernetes.io/projected/285a43ca-3a99-4b85-96b3-6b8619bb5853-kube-api-access-5ws92\") pod \"ingress-operator-5b745b69d9-dgls2\" (UID: \"285a43ca-3a99-4b85-96b3-6b8619bb5853\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436292 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fa4b0c05-3ff2-4e52-b028-603d2eb22adc-images\") pod \"machine-config-operator-74547568cd-d5bdj\" (UID: \"fa4b0c05-3ff2-4e52-b028-603d2eb22adc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436308 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncv72\" (UniqueName: \"kubernetes.io/projected/e16d3836-813b-4a25-8dbb-8f5330008ec7-kube-api-access-ncv72\") pod \"openshift-controller-manager-operator-756b6f6bc6-lr6qd\" (UID: \"e16d3836-813b-4a25-8dbb-8f5330008ec7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lr6qd" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436324 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41165ac7-1905-437e-b313-2917b3f80168-webhook-cert\") pod \"packageserver-d55dfcdfc-kmpng\" (UID: \"41165ac7-1905-437e-b313-2917b3f80168\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436516 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fa4b0c05-3ff2-4e52-b028-603d2eb22adc-proxy-tls\") pod \"machine-config-operator-74547568cd-d5bdj\" (UID: \"fa4b0c05-3ff2-4e52-b028-603d2eb22adc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436540 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b99871bf-2660-4b7c-8ba4-e9cdae008681-available-featuregates\") pod \"openshift-config-operator-7777fb866f-trtr5\" (UID: \"b99871bf-2660-4b7c-8ba4-e9cdae008681\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-trtr5" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436666 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b81a1cd4-9e3a-411a-921f-3cd1139be7e3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4z5nz\" (UID: \"b81a1cd4-9e3a-411a-921f-3cd1139be7e3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4z5nz" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436733 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6de214af-00b4-4c8f-8f7d-633419b87e45-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nz9kg\" (UID: \"6de214af-00b4-4c8f-8f7d-633419b87e45\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nz9kg" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436819 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a97c3ccc-71bd-4f3e-8457-5e023732cb5b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mss56\" 
(UID: \"a97c3ccc-71bd-4f3e-8457-5e023732cb5b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mss56" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436869 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/29a50e5b-e484-4916-be70-893887f8405e-stats-auth\") pod \"router-default-5444994796-gvck8\" (UID: \"29a50e5b-e484-4916-be70-893887f8405e\") " pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436885 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c116925-b69d-4719-9e7b-7bfdf13ee5f5-config\") pod \"authentication-operator-69f744f599-hcrvq\" (UID: \"6c116925-b69d-4719-9e7b-7bfdf13ee5f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436912 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nfv9f\" (UID: \"2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nfv9f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436965 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29a50e5b-e484-4916-be70-893887f8405e-service-ca-bundle\") pod \"router-default-5444994796-gvck8\" (UID: \"29a50e5b-e484-4916-be70-893887f8405e\") " pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.436990 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76rpd\" (UniqueName: 
\"kubernetes.io/projected/41165ac7-1905-437e-b313-2917b3f80168-kube-api-access-76rpd\") pod \"packageserver-d55dfcdfc-kmpng\" (UID: \"41165ac7-1905-437e-b313-2917b3f80168\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.437011 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrvzs\" (UniqueName: \"kubernetes.io/projected/ce155720-ca35-4e8d-8b93-1d20fe7368e6-kube-api-access-hrvzs\") pod \"package-server-manager-789f6589d5-tm2rg\" (UID: \"ce155720-ca35-4e8d-8b93-1d20fe7368e6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tm2rg" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.437032 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/285a43ca-3a99-4b85-96b3-6b8619bb5853-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dgls2\" (UID: \"285a43ca-3a99-4b85-96b3-6b8619bb5853\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.437048 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdvcw\" (UniqueName: \"kubernetes.io/projected/14a59e71-3fa0-4957-b79e-927190083c0d-kube-api-access-kdvcw\") pod \"multus-admission-controller-857f4d67dd-pphjw\" (UID: \"14a59e71-3fa0-4957-b79e-927190083c0d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pphjw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.437066 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/599bd4dc-4ab7-403b-9ede-30ed9dbca320-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rsmg2\" (UID: \"599bd4dc-4ab7-403b-9ede-30ed9dbca320\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rsmg2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.437091 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/29a50e5b-e484-4916-be70-893887f8405e-default-certificate\") pod \"router-default-5444994796-gvck8\" (UID: \"29a50e5b-e484-4916-be70-893887f8405e\") " pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.437106 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfbd40ee-070a-48b0-a67b-deb66288082b-trusted-ca\") pod \"console-operator-58897d9998-gkdgj\" (UID: \"dfbd40ee-070a-48b0-a67b-deb66288082b\") " pod="openshift-console-operator/console-operator-58897d9998-gkdgj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.437122 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/405015c6-fd32-4d6f-86ba-7d44197a6a23-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s4fr9\" (UID: \"405015c6-fd32-4d6f-86ba-7d44197a6a23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.437137 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fa4b0c05-3ff2-4e52-b028-603d2eb22adc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d5bdj\" (UID: \"fa4b0c05-3ff2-4e52-b028-603d2eb22adc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.437153 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/405015c6-fd32-4d6f-86ba-7d44197a6a23-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s4fr9\" (UID: \"405015c6-fd32-4d6f-86ba-7d44197a6a23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.437169 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0d03542f-7483-4aea-9651-4ab9a4ff1378-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-j6z96\" (UID: \"0d03542f-7483-4aea-9651-4ab9a4ff1378\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6z96" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.437260 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b99871bf-2660-4b7c-8ba4-e9cdae008681-available-featuregates\") pod \"openshift-config-operator-7777fb866f-trtr5\" (UID: \"b99871bf-2660-4b7c-8ba4-e9cdae008681\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-trtr5" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.437365 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfbd40ee-070a-48b0-a67b-deb66288082b-config\") pod \"console-operator-58897d9998-gkdgj\" (UID: \"dfbd40ee-070a-48b0-a67b-deb66288082b\") " pod="openshift-console-operator/console-operator-58897d9998-gkdgj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.437496 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c116925-b69d-4719-9e7b-7bfdf13ee5f5-config\") pod \"authentication-operator-69f744f599-hcrvq\" (UID: \"6c116925-b69d-4719-9e7b-7bfdf13ee5f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" Mar 14 07:02:21 crc 
kubenswrapper[5129]: I0314 07:02:21.438135 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de214af-00b4-4c8f-8f7d-633419b87e45-config\") pod \"kube-controller-manager-operator-78b949d7b-nz9kg\" (UID: \"6de214af-00b4-4c8f-8f7d-633419b87e45\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nz9kg" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.438299 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfbd40ee-070a-48b0-a67b-deb66288082b-config\") pod \"console-operator-58897d9998-gkdgj\" (UID: \"dfbd40ee-070a-48b0-a67b-deb66288082b\") " pod="openshift-console-operator/console-operator-58897d9998-gkdgj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.438448 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fa4b0c05-3ff2-4e52-b028-603d2eb22adc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d5bdj\" (UID: \"fa4b0c05-3ff2-4e52-b028-603d2eb22adc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.438470 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c116925-b69d-4719-9e7b-7bfdf13ee5f5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hcrvq\" (UID: \"6c116925-b69d-4719-9e7b-7bfdf13ee5f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.439184 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c116925-b69d-4719-9e7b-7bfdf13ee5f5-serving-cert\") pod \"authentication-operator-69f744f599-hcrvq\" (UID: 
\"6c116925-b69d-4719-9e7b-7bfdf13ee5f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.439720 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/405015c6-fd32-4d6f-86ba-7d44197a6a23-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s4fr9\" (UID: \"405015c6-fd32-4d6f-86ba-7d44197a6a23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.440139 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/285a43ca-3a99-4b85-96b3-6b8619bb5853-trusted-ca\") pod \"ingress-operator-5b745b69d9-dgls2\" (UID: \"285a43ca-3a99-4b85-96b3-6b8619bb5853\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.440147 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfbd40ee-070a-48b0-a67b-deb66288082b-trusted-ca\") pod \"console-operator-58897d9998-gkdgj\" (UID: \"dfbd40ee-070a-48b0-a67b-deb66288082b\") " pod="openshift-console-operator/console-operator-58897d9998-gkdgj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.440630 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/285a43ca-3a99-4b85-96b3-6b8619bb5853-metrics-tls\") pod \"ingress-operator-5b745b69d9-dgls2\" (UID: \"285a43ca-3a99-4b85-96b3-6b8619bb5853\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.441214 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/0d03542f-7483-4aea-9651-4ab9a4ff1378-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-j6z96\" (UID: \"0d03542f-7483-4aea-9651-4ab9a4ff1378\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6z96" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.442485 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/405015c6-fd32-4d6f-86ba-7d44197a6a23-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s4fr9\" (UID: \"405015c6-fd32-4d6f-86ba-7d44197a6a23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.443725 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b99871bf-2660-4b7c-8ba4-e9cdae008681-serving-cert\") pod \"openshift-config-operator-7777fb866f-trtr5\" (UID: \"b99871bf-2660-4b7c-8ba4-e9cdae008681\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-trtr5" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.448527 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfbd40ee-070a-48b0-a67b-deb66288082b-serving-cert\") pod \"console-operator-58897d9998-gkdgj\" (UID: \"dfbd40ee-070a-48b0-a67b-deb66288082b\") " pod="openshift-console-operator/console-operator-58897d9998-gkdgj" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.449697 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.457979 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a97c3ccc-71bd-4f3e-8457-5e023732cb5b-config\") pod 
\"kube-apiserver-operator-766d6c64bb-mss56\" (UID: \"a97c3ccc-71bd-4f3e-8457-5e023732cb5b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mss56" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.470510 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.489868 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.514016 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.530288 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.540460 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a97c3ccc-71bd-4f3e-8457-5e023732cb5b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mss56\" (UID: \"a97c3ccc-71bd-4f3e-8457-5e023732cb5b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mss56" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.549162 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.569091 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.579475 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/14a59e71-3fa0-4957-b79e-927190083c0d-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-pphjw\" (UID: \"14a59e71-3fa0-4957-b79e-927190083c0d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pphjw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.589781 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.609298 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.629700 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.640085 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nfv9f\" (UID: \"2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nfv9f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.649903 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.655771 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nfv9f\" (UID: \"2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nfv9f" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.671063 5129 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.696700 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.710381 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.730439 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.739535 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e16d3836-813b-4a25-8dbb-8f5330008ec7-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lr6qd\" (UID: \"e16d3836-813b-4a25-8dbb-8f5330008ec7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lr6qd" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.750730 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.758494 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16d3836-813b-4a25-8dbb-8f5330008ec7-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lr6qd\" (UID: \"e16d3836-813b-4a25-8dbb-8f5330008ec7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lr6qd" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.770905 5129 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.810325 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.828937 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.836202 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/599bd4dc-4ab7-403b-9ede-30ed9dbca320-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rsmg2\" (UID: \"599bd4dc-4ab7-403b-9ede-30ed9dbca320\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rsmg2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.849945 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.862406 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/599bd4dc-4ab7-403b-9ede-30ed9dbca320-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rsmg2\" (UID: \"599bd4dc-4ab7-403b-9ede-30ed9dbca320\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rsmg2" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.871702 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.891756 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 14 
07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.898160 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29a50e5b-e484-4916-be70-893887f8405e-service-ca-bundle\") pod \"router-default-5444994796-gvck8\" (UID: \"29a50e5b-e484-4916-be70-893887f8405e\") " pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.910556 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.930207 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.941343 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/29a50e5b-e484-4916-be70-893887f8405e-stats-auth\") pod \"router-default-5444994796-gvck8\" (UID: \"29a50e5b-e484-4916-be70-893887f8405e\") " pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.950665 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.971050 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.982445 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/29a50e5b-e484-4916-be70-893887f8405e-default-certificate\") pod \"router-default-5444994796-gvck8\" (UID: \"29a50e5b-e484-4916-be70-893887f8405e\") " pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:21 crc kubenswrapper[5129]: I0314 07:02:21.990180 5129 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.011179 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.021977 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29a50e5b-e484-4916-be70-893887f8405e-metrics-certs\") pod \"router-default-5444994796-gvck8\" (UID: \"29a50e5b-e484-4916-be70-893887f8405e\") " pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.029978 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.050382 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.070949 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.089051 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.101787 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/42d74e97-1c27-4447-afe2-d10bb4b3a1b6-signing-key\") pod \"service-ca-9c57cc56f-m7rjz\" (UID: \"42d74e97-1c27-4447-afe2-d10bb4b3a1b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-m7rjz" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.109529 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.117248 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/42d74e97-1c27-4447-afe2-d10bb4b3a1b6-signing-cabundle\") pod \"service-ca-9c57cc56f-m7rjz\" (UID: \"42d74e97-1c27-4447-afe2-d10bb4b3a1b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-m7rjz" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.129788 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.150131 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.172565 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.196955 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.210510 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.221051 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41165ac7-1905-437e-b313-2917b3f80168-webhook-cert\") pod \"packageserver-d55dfcdfc-kmpng\" (UID: \"41165ac7-1905-437e-b313-2917b3f80168\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.222027 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41165ac7-1905-437e-b313-2917b3f80168-apiservice-cert\") pod \"packageserver-d55dfcdfc-kmpng\" (UID: 
\"41165ac7-1905-437e-b313-2917b3f80168\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.231166 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.240051 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce155720-ca35-4e8d-8b93-1d20fe7368e6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tm2rg\" (UID: \"ce155720-ca35-4e8d-8b93-1d20fe7368e6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tm2rg" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.250034 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.259017 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b81a1cd4-9e3a-411a-921f-3cd1139be7e3-proxy-tls\") pod \"machine-config-controller-84d6567774-4z5nz\" (UID: \"b81a1cd4-9e3a-411a-921f-3cd1139be7e3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4z5nz" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.270526 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.287848 5129 request.go:700] Waited for 1.01103504s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/configmaps?fieldSelector=metadata.name%3Dmachine-config-operator-images&limit=500&resourceVersion=0 Mar 14 07:02:22 crc 
kubenswrapper[5129]: I0314 07:02:22.289771 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.298498 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fa4b0c05-3ff2-4e52-b028-603d2eb22adc-images\") pod \"machine-config-operator-74547568cd-d5bdj\" (UID: \"fa4b0c05-3ff2-4e52-b028-603d2eb22adc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.311030 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.332085 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.340930 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fa4b0c05-3ff2-4e52-b028-603d2eb22adc-proxy-tls\") pod \"machine-config-operator-74547568cd-d5bdj\" (UID: \"fa4b0c05-3ff2-4e52-b028-603d2eb22adc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.371482 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.390772 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.425086 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vnfw\" (UniqueName: 
\"kubernetes.io/projected/65a0ac6e-2c3b-4f24-8495-b13aa58017c1-kube-api-access-7vnfw\") pod \"machine-approver-56656f9798-gnkjw\" (UID: \"65a0ac6e-2c3b-4f24-8495-b13aa58017c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.430867 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.450185 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.488112 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xft2m\" (UniqueName: \"kubernetes.io/projected/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-kube-api-access-xft2m\") pod \"oauth-openshift-558db77b4-fmn4k\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.508259 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gm5z\" (UniqueName: \"kubernetes.io/projected/b633b0c1-73e2-445d-8f82-0f37854b2fc7-kube-api-access-7gm5z\") pod \"apiserver-7bbb656c7d-6l8xj\" (UID: \"b633b0c1-73e2-445d-8f82-0f37854b2fc7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.528465 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf5zb\" (UniqueName: \"kubernetes.io/projected/095d17bf-b9c3-42e6-b8c9-39b929c52d50-kube-api-access-cf5zb\") pod \"machine-api-operator-5694c8668f-vtzj7\" (UID: \"095d17bf-b9c3-42e6-b8c9-39b929c52d50\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vtzj7" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.550253 5129 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.556015 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8752z\" (UniqueName: \"kubernetes.io/projected/e8915d74-77d5-4d0f-9264-37b6f8167a6d-kube-api-access-8752z\") pod \"controller-manager-879f6c89f-xgc7f\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.567814 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.589440 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.595195 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dvmr\" (UniqueName: \"kubernetes.io/projected/e61dec00-008c-454f-a37d-87b8a21d4733-kube-api-access-8dvmr\") pod \"apiserver-76f77b778f-k7q7x\" (UID: \"e61dec00-008c-454f-a37d-87b8a21d4733\") " pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.603421 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vtzj7" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.609723 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.614271 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvws6\" (UniqueName: \"kubernetes.io/projected/92aed46c-5740-4407-81ed-4ff642a70c54-kube-api-access-dvws6\") pod \"console-f9d7485db-vpw78\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.631150 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.633488 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffbgg\" (UniqueName: \"kubernetes.io/projected/89e1dfc8-c715-4bc9-9f72-764254643adc-kube-api-access-ffbgg\") pod \"openshift-apiserver-operator-796bbdcf4f-9d5pw\" (UID: \"89e1dfc8-c715-4bc9-9f72-764254643adc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9d5pw" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.645444 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.650057 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.655476 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.661214 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.669987 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.690280 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.727145 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9d5pw" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.737093 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.737472 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.750944 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw" event={"ID":"65a0ac6e-2c3b-4f24-8495-b13aa58017c1","Type":"ContainerStarted","Data":"18f4c6b24e7864dbd581b559b03ee685dee4cf514788cc7bd7131c017bf1782d"} Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.756989 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.771625 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.797473 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.810043 5129 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.849617 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.850357 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89tfr\" (UniqueName: \"kubernetes.io/projected/4b81a68d-81ce-4406-ae24-511cda2d8936-kube-api-access-89tfr\") pod \"route-controller-manager-6576b87f9c-98rd8\" (UID: \"4b81a68d-81ce-4406-ae24-511cda2d8936\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.852205 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj"] Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.870687 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.880279 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.890769 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.912461 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.930353 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.950373 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.970261 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vpw78"] Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.970297 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 14 07:02:22 crc kubenswrapper[5129]: I0314 07:02:22.990331 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 14 07:02:22 crc kubenswrapper[5129]: W0314 07:02:22.994024 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92aed46c_5740_4407_81ed_4ff642a70c54.slice/crio-c68908b8746452e572666602e51cba1b4bc426bb35f3796601aa3f4490720b20 WatchSource:0}: Error finding container c68908b8746452e572666602e51cba1b4bc426bb35f3796601aa3f4490720b20: Status 404 returned error can't find the container with id c68908b8746452e572666602e51cba1b4bc426bb35f3796601aa3f4490720b20 Mar 14 07:02:23 crc kubenswrapper[5129]: 
I0314 07:02:23.012175 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.030245 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.045494 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-k7q7x"] Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.049958 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.058239 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9d5pw"] Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.070756 5129 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 14 07:02:23 crc kubenswrapper[5129]: W0314 07:02:23.084927 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89e1dfc8_c715_4bc9_9f72_764254643adc.slice/crio-74c200ebddfb7f139974380f0f7c4eae56171c8bbe41e1c4aa3fb712b1a016b1 WatchSource:0}: Error finding container 74c200ebddfb7f139974380f0f7c4eae56171c8bbe41e1c4aa3fb712b1a016b1: Status 404 returned error can't find the container with id 74c200ebddfb7f139974380f0f7c4eae56171c8bbe41e1c4aa3fb712b1a016b1 Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.089781 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.092746 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/machine-api-operator-5694c8668f-vtzj7"] Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.109546 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fmn4k"] Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.109617 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xgc7f"] Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.113176 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.130291 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.151384 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.164233 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8"] Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.169943 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.194059 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.209540 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.230315 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.250037 5129 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.269792 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.288466 5129 request.go:700] Waited for 1.870454775s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.290968 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.309946 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.351559 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzqgg\" (UniqueName: \"kubernetes.io/projected/dfbd40ee-070a-48b0-a67b-deb66288082b-kube-api-access-zzqgg\") pod \"console-operator-58897d9998-gkdgj\" (UID: \"dfbd40ee-070a-48b0-a67b-deb66288082b\") " pod="openshift-console-operator/console-operator-58897d9998-gkdgj" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.374523 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhqqj\" (UniqueName: \"kubernetes.io/projected/599bd4dc-4ab7-403b-9ede-30ed9dbca320-kube-api-access-qhqqj\") pod \"kube-storage-version-migrator-operator-b67b599dd-rsmg2\" (UID: \"599bd4dc-4ab7-403b-9ede-30ed9dbca320\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rsmg2" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.384096 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-glz5g\" (UniqueName: \"kubernetes.io/projected/29a50e5b-e484-4916-be70-893887f8405e-kube-api-access-glz5g\") pod \"router-default-5444994796-gvck8\" (UID: \"29a50e5b-e484-4916-be70-893887f8405e\") " pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.395115 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gkdgj" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.402289 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6de214af-00b4-4c8f-8f7d-633419b87e45-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nz9kg\" (UID: \"6de214af-00b4-4c8f-8f7d-633419b87e45\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nz9kg" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.425183 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klptf\" (UniqueName: \"kubernetes.io/projected/50cfd6b1-bf1f-4a8b-bd6d-9799281f1359-kube-api-access-klptf\") pod \"downloads-7954f5f757-hrz85\" (UID: \"50cfd6b1-bf1f-4a8b-bd6d-9799281f1359\") " pod="openshift-console/downloads-7954f5f757-hrz85" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.448125 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6sj2\" (UniqueName: \"kubernetes.io/projected/fa4b0c05-3ff2-4e52-b028-603d2eb22adc-kube-api-access-w6sj2\") pod \"machine-config-operator-74547568cd-d5bdj\" (UID: \"fa4b0c05-3ff2-4e52-b028-603d2eb22adc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.466112 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbgc2\" (UniqueName: 
\"kubernetes.io/projected/0d03542f-7483-4aea-9651-4ab9a4ff1378-kube-api-access-wbgc2\") pod \"cluster-samples-operator-665b6dd947-j6z96\" (UID: \"0d03542f-7483-4aea-9651-4ab9a4ff1378\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6z96" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.471364 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-hrz85" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.488144 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/405015c6-fd32-4d6f-86ba-7d44197a6a23-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s4fr9\" (UID: \"405015c6-fd32-4d6f-86ba-7d44197a6a23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.499088 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nz9kg" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.505206 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ws92\" (UniqueName: \"kubernetes.io/projected/285a43ca-3a99-4b85-96b3-6b8619bb5853-kube-api-access-5ws92\") pod \"ingress-operator-5b745b69d9-dgls2\" (UID: \"285a43ca-3a99-4b85-96b3-6b8619bb5853\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.524425 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkwwp\" (UniqueName: \"kubernetes.io/projected/b81a1cd4-9e3a-411a-921f-3cd1139be7e3-kube-api-access-qkwwp\") pod \"machine-config-controller-84d6567774-4z5nz\" (UID: \"b81a1cd4-9e3a-411a-921f-3cd1139be7e3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4z5nz" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.540131 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rsmg2" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.545839 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.558143 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97c3ccc-71bd-4f3e-8457-5e023732cb5b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mss56\" (UID: \"a97c3ccc-71bd-4f3e-8457-5e023732cb5b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mss56" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.570879 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcrn8\" (UniqueName: \"kubernetes.io/projected/b99871bf-2660-4b7c-8ba4-e9cdae008681-kube-api-access-xcrn8\") pod \"openshift-config-operator-7777fb866f-trtr5\" (UID: \"b99871bf-2660-4b7c-8ba4-e9cdae008681\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-trtr5" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.586525 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4z5nz" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.596355 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gkdgj"] Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.596561 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.599899 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncv72\" (UniqueName: \"kubernetes.io/projected/e16d3836-813b-4a25-8dbb-8f5330008ec7-kube-api-access-ncv72\") pod \"openshift-controller-manager-operator-756b6f6bc6-lr6qd\" (UID: \"e16d3836-813b-4a25-8dbb-8f5330008ec7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lr6qd" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.606356 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9nn4\" (UniqueName: \"kubernetes.io/projected/6c116925-b69d-4719-9e7b-7bfdf13ee5f5-kube-api-access-m9nn4\") pod \"authentication-operator-69f744f599-hcrvq\" (UID: \"6c116925-b69d-4719-9e7b-7bfdf13ee5f5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.624464 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6z96" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.631908 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.639960 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9mbb\" (UniqueName: \"kubernetes.io/projected/405015c6-fd32-4d6f-86ba-7d44197a6a23-kube-api-access-g9mbb\") pod \"cluster-image-registry-operator-dc59b4c8b-s4fr9\" (UID: \"405015c6-fd32-4d6f-86ba-7d44197a6a23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.647525 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nfv9f\" (UID: \"2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nfv9f" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.674081 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76rpd\" (UniqueName: \"kubernetes.io/projected/41165ac7-1905-437e-b313-2917b3f80168-kube-api-access-76rpd\") pod \"packageserver-d55dfcdfc-kmpng\" (UID: \"41165ac7-1905-437e-b313-2917b3f80168\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.691051 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrvzs\" (UniqueName: \"kubernetes.io/projected/ce155720-ca35-4e8d-8b93-1d20fe7368e6-kube-api-access-hrvzs\") pod \"package-server-manager-789f6589d5-tm2rg\" (UID: \"ce155720-ca35-4e8d-8b93-1d20fe7368e6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tm2rg" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.691260 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-trtr5" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.710235 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx4pb\" (UniqueName: \"kubernetes.io/projected/42d74e97-1c27-4447-afe2-d10bb4b3a1b6-kube-api-access-cx4pb\") pod \"service-ca-9c57cc56f-m7rjz\" (UID: \"42d74e97-1c27-4447-afe2-d10bb4b3a1b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-m7rjz" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.729793 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hrz85"] Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.741289 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdvcw\" (UniqueName: \"kubernetes.io/projected/14a59e71-3fa0-4957-b79e-927190083c0d-kube-api-access-kdvcw\") pod \"multus-admission-controller-857f4d67dd-pphjw\" (UID: \"14a59e71-3fa0-4957-b79e-927190083c0d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pphjw" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.756958 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/285a43ca-3a99-4b85-96b3-6b8619bb5853-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dgls2\" (UID: \"285a43ca-3a99-4b85-96b3-6b8619bb5853\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.762118 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nz9kg"] Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.762373 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.765796 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hrz85" event={"ID":"50cfd6b1-bf1f-4a8b-bd6d-9799281f1359","Type":"ContainerStarted","Data":"5e488e4785ca80a158750cca2cf4917f8fc979fc0661ca064c64453924018d95"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.777341 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/7d8d49c8-334d-45ba-bd88-2bd200200093-etcd-ca\") pod \"etcd-operator-b45778765-cqxl7\" (UID: \"7d8d49c8-334d-45ba-bd88-2bd200200093\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.777402 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2a7f356-6278-409f-9047-efece8492b78-bound-sa-token\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.777422 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7d8d49c8-334d-45ba-bd88-2bd200200093-etcd-client\") pod \"etcd-operator-b45778765-cqxl7\" (UID: \"7d8d49c8-334d-45ba-bd88-2bd200200093\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.777511 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d8d49c8-334d-45ba-bd88-2bd200200093-config\") pod \"etcd-operator-b45778765-cqxl7\" (UID: 
\"7d8d49c8-334d-45ba-bd88-2bd200200093\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.777529 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f2a7f356-6278-409f-9047-efece8492b78-installation-pull-secrets\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.777545 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d8d49c8-334d-45ba-bd88-2bd200200093-serving-cert\") pod \"etcd-operator-b45778765-cqxl7\" (UID: \"7d8d49c8-334d-45ba-bd88-2bd200200093\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.777559 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2a7f356-6278-409f-9047-efece8492b78-registry-tls\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.777576 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d8d49c8-334d-45ba-bd88-2bd200200093-etcd-service-ca\") pod \"etcd-operator-b45778765-cqxl7\" (UID: \"7d8d49c8-334d-45ba-bd88-2bd200200093\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.777596 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.777630 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74wbs\" (UniqueName: \"kubernetes.io/projected/7d8d49c8-334d-45ba-bd88-2bd200200093-kube-api-access-74wbs\") pod \"etcd-operator-b45778765-cqxl7\" (UID: \"7d8d49c8-334d-45ba-bd88-2bd200200093\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.777648 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2a7f356-6278-409f-9047-efece8492b78-trusted-ca\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.777666 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f2a7f356-6278-409f-9047-efece8492b78-registry-certificates\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.777704 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5hm2\" (UniqueName: \"kubernetes.io/projected/f2a7f356-6278-409f-9047-efece8492b78-kube-api-access-g5hm2\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.777724 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2a7f356-6278-409f-9047-efece8492b78-ca-trust-extracted\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: E0314 07:02:23.778022 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:24.278009984 +0000 UTC m=+207.029925168 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.779290 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.795678 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" event={"ID":"4b81a68d-81ce-4406-ae24-511cda2d8936","Type":"ContainerStarted","Data":"10da8edb1e01125740eea3eff8feaaf82de83bbeebcbf27a69dd5a26adc68098"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.795710 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" event={"ID":"4b81a68d-81ce-4406-ae24-511cda2d8936","Type":"ContainerStarted","Data":"33711b805c5e25cd55e40c989fc60eff86b9de4bbf083bd87a875f6f1d9cb626"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.796526 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.798872 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rsmg2"] Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.801759 5129 generic.go:334] "Generic (PLEG): container finished" podID="e61dec00-008c-454f-a37d-87b8a21d4733" containerID="14e1c1306db615ab787f13512432209aa69ad70d2a1ddfdd6334e86381ccc2fd" exitCode=0 Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.801811 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" event={"ID":"e61dec00-008c-454f-a37d-87b8a21d4733","Type":"ContainerDied","Data":"14e1c1306db615ab787f13512432209aa69ad70d2a1ddfdd6334e86381ccc2fd"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.801832 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" 
event={"ID":"e61dec00-008c-454f-a37d-87b8a21d4733","Type":"ContainerStarted","Data":"6945f0ec07e2c1ea4967d7d05e214dc91e80405c6fcca474c819cdb076c36197"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.812064 5129 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-98rd8 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.812103 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" podUID="4b81a68d-81ce-4406-ae24-511cda2d8936" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.815612 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mss56" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.824853 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nfv9f" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.831177 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vpw78" event={"ID":"92aed46c-5740-4407-81ed-4ff642a70c54","Type":"ContainerStarted","Data":"2c36c6c958b1570e5ca98640d10a3c2833d85fc9b0f3d98768338aa9eaba79db"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.831210 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vpw78" event={"ID":"92aed46c-5740-4407-81ed-4ff642a70c54","Type":"ContainerStarted","Data":"c68908b8746452e572666602e51cba1b4bc426bb35f3796601aa3f4490720b20"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.831621 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lr6qd" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.844810 5129 generic.go:334] "Generic (PLEG): container finished" podID="b633b0c1-73e2-445d-8f82-0f37854b2fc7" containerID="86684eacbc3b072dc10e76e319134fe97031799f83b613e08e23a5b799405f97" exitCode=0 Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.844917 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" event={"ID":"b633b0c1-73e2-445d-8f82-0f37854b2fc7","Type":"ContainerDied","Data":"86684eacbc3b072dc10e76e319134fe97031799f83b613e08e23a5b799405f97"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.844966 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" event={"ID":"b633b0c1-73e2-445d-8f82-0f37854b2fc7","Type":"ContainerStarted","Data":"431a8b8869e8f11cab09101a6745fa9a73bdd0e0b50c9e7add3c4156e8dae522"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.852157 5129 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gkdgj" event={"ID":"dfbd40ee-070a-48b0-a67b-deb66288082b","Type":"ContainerStarted","Data":"5f221ede9f6f2d0cb30c41ad883fbbfc29758f819e6a79fdf40c69015730f251"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.859326 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gvck8" event={"ID":"29a50e5b-e484-4916-be70-893887f8405e","Type":"ContainerStarted","Data":"6aa9163eb7fcc5162930d59dea6b0e18ce4a4c85fc632073abc2ba14a03d8d63"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.863676 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-m7rjz" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.866778 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9d5pw" event={"ID":"89e1dfc8-c715-4bc9-9f72-764254643adc","Type":"ContainerStarted","Data":"324acd8ed3e761346b08fac19167a45fe408922548ba49e0352c92203bff371d"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.866830 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9d5pw" event={"ID":"89e1dfc8-c715-4bc9-9f72-764254643adc","Type":"ContainerStarted","Data":"74c200ebddfb7f139974380f0f7c4eae56171c8bbe41e1c4aa3fb712b1a016b1"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.871500 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.878407 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.878503 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tm2rg" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.878686 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7hdk\" (UniqueName: \"kubernetes.io/projected/581cb821-3eab-416b-bb1e-013db618ed67-kube-api-access-c7hdk\") pod \"service-ca-operator-777779d784-msrqq\" (UID: \"581cb821-3eab-416b-bb1e-013db618ed67\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-msrqq" Mar 14 07:02:23 crc kubenswrapper[5129]: E0314 07:02:23.878800 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:24.378747701 +0000 UTC m=+207.130662955 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.878893 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0986b18b-4eea-4f98-b246-017c17cbe895-plugins-dir\") pod \"csi-hostpathplugin-rc492\" (UID: \"0986b18b-4eea-4f98-b246-017c17cbe895\") " pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.878939 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/67f3717d-a52d-46b9-9132-044239d564c3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rrt62\" (UID: \"67f3717d-a52d-46b9-9132-044239d564c3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rrt62" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.879016 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pht5r\" (UniqueName: \"kubernetes.io/projected/deb9ddae-7954-4db6-9864-bae7e7c25744-kube-api-access-pht5r\") pod \"ingress-canary-f2lk5\" (UID: \"deb9ddae-7954-4db6-9864-bae7e7c25744\") " pod="openshift-ingress-canary/ingress-canary-f2lk5" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.879286 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/021f1f3b-8ee9-424f-9c04-c56631332e92-config-volume\") pod \"collect-profiles-29557860-zdvj5\" (UID: \"021f1f3b-8ee9-424f-9c04-c56631332e92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.879313 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5516ae48-72a7-4f9e-9dd8-30618817f0c8-metrics-tls\") pod \"dns-operator-744455d44c-2b5g4\" (UID: \"5516ae48-72a7-4f9e-9dd8-30618817f0c8\") " pod="openshift-dns-operator/dns-operator-744455d44c-2b5g4" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.879875 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0986b18b-4eea-4f98-b246-017c17cbe895-csi-data-dir\") pod \"csi-hostpathplugin-rc492\" (UID: \"0986b18b-4eea-4f98-b246-017c17cbe895\") " pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.879911 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lfsd\" (UniqueName: \"kubernetes.io/projected/e9e49a0a-8f9f-4e78-8098-d195fe3297bd-kube-api-access-7lfsd\") pod \"auto-csr-approver-29557862-lf4bl\" (UID: \"e9e49a0a-8f9f-4e78-8098-d195fe3297bd\") " pod="openshift-infra/auto-csr-approver-29557862-lf4bl" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.879929 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deb9ddae-7954-4db6-9864-bae7e7c25744-cert\") pod \"ingress-canary-f2lk5\" (UID: \"deb9ddae-7954-4db6-9864-bae7e7c25744\") " pod="openshift-ingress-canary/ingress-canary-f2lk5" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.880230 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/50a96d01-d66b-4110-a608-aacd554c5111-profile-collector-cert\") pod \"catalog-operator-68c6474976-2snfc\" (UID: \"50a96d01-d66b-4110-a608-aacd554c5111\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.880324 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2a7f356-6278-409f-9047-efece8492b78-bound-sa-token\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.880719 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7d8d49c8-334d-45ba-bd88-2bd200200093-etcd-client\") pod \"etcd-operator-b45778765-cqxl7\" (UID: \"7d8d49c8-334d-45ba-bd88-2bd200200093\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.880813 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npnj9\" (UniqueName: \"kubernetes.io/projected/4d51a3db-e1b0-4b4d-91ad-5ef4217890d2-kube-api-access-npnj9\") pod \"machine-config-server-dsrgt\" (UID: \"4d51a3db-e1b0-4b4d-91ad-5ef4217890d2\") " pod="openshift-machine-config-operator/machine-config-server-dsrgt" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881003 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/581cb821-3eab-416b-bb1e-013db618ed67-serving-cert\") pod \"service-ca-operator-777779d784-msrqq\" (UID: \"581cb821-3eab-416b-bb1e-013db618ed67\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-msrqq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881028 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/581cb821-3eab-416b-bb1e-013db618ed67-config\") pod \"service-ca-operator-777779d784-msrqq\" (UID: \"581cb821-3eab-416b-bb1e-013db618ed67\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-msrqq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881074 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d8d49c8-334d-45ba-bd88-2bd200200093-config\") pod \"etcd-operator-b45778765-cqxl7\" (UID: \"7d8d49c8-334d-45ba-bd88-2bd200200093\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881090 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ct8n\" (UniqueName: \"kubernetes.io/projected/021f1f3b-8ee9-424f-9c04-c56631332e92-kube-api-access-4ct8n\") pod \"collect-profiles-29557860-zdvj5\" (UID: \"021f1f3b-8ee9-424f-9c04-c56631332e92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881105 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5facc307-168d-4313-868a-8bf0db5e65ca-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lf26g\" (UID: \"5facc307-168d-4313-868a-8bf0db5e65ca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881152 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/021f1f3b-8ee9-424f-9c04-c56631332e92-secret-volume\") pod \"collect-profiles-29557860-zdvj5\" (UID: \"021f1f3b-8ee9-424f-9c04-c56631332e92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881178 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f2a7f356-6278-409f-9047-efece8492b78-installation-pull-secrets\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881209 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d8d49c8-334d-45ba-bd88-2bd200200093-serving-cert\") pod \"etcd-operator-b45778765-cqxl7\" (UID: \"7d8d49c8-334d-45ba-bd88-2bd200200093\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881224 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f7hp\" (UniqueName: \"kubernetes.io/projected/72b1d134-f49d-4b86-9630-a7d851ac1c40-kube-api-access-7f7hp\") pod \"dns-default-t9ngb\" (UID: \"72b1d134-f49d-4b86-9630-a7d851ac1c40\") " pod="openshift-dns/dns-default-t9ngb" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881267 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2a7f356-6278-409f-9047-efece8492b78-registry-tls\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881284 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a298896-ef40-44ff-a9ba-45fba603014b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cb8c5\" (UID: \"7a298896-ef40-44ff-a9ba-45fba603014b\") " pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881309 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d8d49c8-334d-45ba-bd88-2bd200200093-etcd-service-ca\") pod \"etcd-operator-b45778765-cqxl7\" (UID: \"7d8d49c8-334d-45ba-bd88-2bd200200093\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881337 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881352 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/50a96d01-d66b-4110-a608-aacd554c5111-srv-cert\") pod \"catalog-operator-68c6474976-2snfc\" (UID: \"50a96d01-d66b-4110-a608-aacd554c5111\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881369 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmv6t\" (UniqueName: \"kubernetes.io/projected/5516ae48-72a7-4f9e-9dd8-30618817f0c8-kube-api-access-fmv6t\") pod \"dns-operator-744455d44c-2b5g4\" (UID: 
\"5516ae48-72a7-4f9e-9dd8-30618817f0c8\") " pod="openshift-dns-operator/dns-operator-744455d44c-2b5g4" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881384 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74wbs\" (UniqueName: \"kubernetes.io/projected/7d8d49c8-334d-45ba-bd88-2bd200200093-kube-api-access-74wbs\") pod \"etcd-operator-b45778765-cqxl7\" (UID: \"7d8d49c8-334d-45ba-bd88-2bd200200093\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881425 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2a7f356-6278-409f-9047-efece8492b78-trusted-ca\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881473 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f2a7f356-6278-409f-9047-efece8492b78-registry-certificates\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881494 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p25l\" (UniqueName: \"kubernetes.io/projected/67f3717d-a52d-46b9-9132-044239d564c3-kube-api-access-7p25l\") pod \"control-plane-machine-set-operator-78cbb6b69f-rrt62\" (UID: \"67f3717d-a52d-46b9-9132-044239d564c3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rrt62" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881552 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0986b18b-4eea-4f98-b246-017c17cbe895-registration-dir\") pod \"csi-hostpathplugin-rc492\" (UID: \"0986b18b-4eea-4f98-b246-017c17cbe895\") " pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881566 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5facc307-168d-4313-868a-8bf0db5e65ca-srv-cert\") pod \"olm-operator-6b444d44fb-lf26g\" (UID: \"5facc307-168d-4313-868a-8bf0db5e65ca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881662 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt5w5\" (UniqueName: \"kubernetes.io/projected/0986b18b-4eea-4f98-b246-017c17cbe895-kube-api-access-kt5w5\") pod \"csi-hostpathplugin-rc492\" (UID: \"0986b18b-4eea-4f98-b246-017c17cbe895\") " pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881685 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4d51a3db-e1b0-4b4d-91ad-5ef4217890d2-certs\") pod \"machine-config-server-dsrgt\" (UID: \"4d51a3db-e1b0-4b4d-91ad-5ef4217890d2\") " pod="openshift-machine-config-operator/machine-config-server-dsrgt" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881701 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fxlw\" (UniqueName: \"kubernetes.io/projected/50a96d01-d66b-4110-a608-aacd554c5111-kube-api-access-7fxlw\") pod \"catalog-operator-68c6474976-2snfc\" (UID: \"50a96d01-d66b-4110-a608-aacd554c5111\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc" Mar 14 07:02:23 crc 
kubenswrapper[5129]: I0314 07:02:23.881726 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5hm2\" (UniqueName: \"kubernetes.io/projected/f2a7f356-6278-409f-9047-efece8492b78-kube-api-access-g5hm2\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881815 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72b1d134-f49d-4b86-9630-a7d851ac1c40-config-volume\") pod \"dns-default-t9ngb\" (UID: \"72b1d134-f49d-4b86-9630-a7d851ac1c40\") " pod="openshift-dns/dns-default-t9ngb" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881848 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4d51a3db-e1b0-4b4d-91ad-5ef4217890d2-node-bootstrap-token\") pod \"machine-config-server-dsrgt\" (UID: \"4d51a3db-e1b0-4b4d-91ad-5ef4217890d2\") " pod="openshift-machine-config-operator/machine-config-server-dsrgt" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881874 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2a7f356-6278-409f-9047-efece8492b78-ca-trust-extracted\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.881911 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9t7f\" (UniqueName: \"kubernetes.io/projected/5facc307-168d-4313-868a-8bf0db5e65ca-kube-api-access-f9t7f\") pod \"olm-operator-6b444d44fb-lf26g\" (UID: 
\"5facc307-168d-4313-868a-8bf0db5e65ca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.882003 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7a298896-ef40-44ff-a9ba-45fba603014b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cb8c5\" (UID: \"7a298896-ef40-44ff-a9ba-45fba603014b\") " pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.882050 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0986b18b-4eea-4f98-b246-017c17cbe895-mountpoint-dir\") pod \"csi-hostpathplugin-rc492\" (UID: \"0986b18b-4eea-4f98-b246-017c17cbe895\") " pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.882651 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d8d49c8-334d-45ba-bd88-2bd200200093-config\") pod \"etcd-operator-b45778765-cqxl7\" (UID: \"7d8d49c8-334d-45ba-bd88-2bd200200093\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.883638 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/7d8d49c8-334d-45ba-bd88-2bd200200093-etcd-ca\") pod \"etcd-operator-b45778765-cqxl7\" (UID: \"7d8d49c8-334d-45ba-bd88-2bd200200093\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.883665 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-474v5\" (UniqueName: 
\"kubernetes.io/projected/7a298896-ef40-44ff-a9ba-45fba603014b-kube-api-access-474v5\") pod \"marketplace-operator-79b997595-cb8c5\" (UID: \"7a298896-ef40-44ff-a9ba-45fba603014b\") " pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.883683 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhvtz\" (UniqueName: \"kubernetes.io/projected/a54d1367-e54c-4ebd-99f2-2ec9e8a449ec-kube-api-access-dhvtz\") pod \"migrator-59844c95c7-qjqs7\" (UID: \"a54d1367-e54c-4ebd-99f2-2ec9e8a449ec\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qjqs7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.883715 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0986b18b-4eea-4f98-b246-017c17cbe895-socket-dir\") pod \"csi-hostpathplugin-rc492\" (UID: \"0986b18b-4eea-4f98-b246-017c17cbe895\") " pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.883758 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/72b1d134-f49d-4b86-9630-a7d851ac1c40-metrics-tls\") pod \"dns-default-t9ngb\" (UID: \"72b1d134-f49d-4b86-9630-a7d851ac1c40\") " pod="openshift-dns/dns-default-t9ngb" Mar 14 07:02:23 crc kubenswrapper[5129]: E0314 07:02:23.883823 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:24.383802426 +0000 UTC m=+207.135717610 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.884070 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2a7f356-6278-409f-9047-efece8492b78-ca-trust-extracted\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.889314 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f2a7f356-6278-409f-9047-efece8492b78-registry-certificates\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.889904 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" event={"ID":"e82ae62e-63ea-4a75-9c2f-7a02ade5768a","Type":"ContainerStarted","Data":"cd5193f234faf95115832625127473cd03dcbc3d7080252ecfef586530392c9a"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.889937 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" event={"ID":"e82ae62e-63ea-4a75-9c2f-7a02ade5768a","Type":"ContainerStarted","Data":"d2eb097f3ed68dcdc4629c7da07456a1e9bf848d25c7f4abd42dcb5e01661282"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 
07:02:23.890772 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.890827 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d8d49c8-334d-45ba-bd88-2bd200200093-etcd-service-ca\") pod \"etcd-operator-b45778765-cqxl7\" (UID: \"7d8d49c8-334d-45ba-bd88-2bd200200093\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.891301 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7d8d49c8-334d-45ba-bd88-2bd200200093-etcd-client\") pod \"etcd-operator-b45778765-cqxl7\" (UID: \"7d8d49c8-334d-45ba-bd88-2bd200200093\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.891735 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2a7f356-6278-409f-9047-efece8492b78-trusted-ca\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.892376 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d8d49c8-334d-45ba-bd88-2bd200200093-serving-cert\") pod \"etcd-operator-b45778765-cqxl7\" (UID: \"7d8d49c8-334d-45ba-bd88-2bd200200093\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.896446 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/7d8d49c8-334d-45ba-bd88-2bd200200093-etcd-ca\") pod \"etcd-operator-b45778765-cqxl7\" (UID: 
\"7d8d49c8-334d-45ba-bd88-2bd200200093\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.901648 5129 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-fmn4k container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.901693 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" podUID="e82ae62e-63ea-4a75-9c2f-7a02ade5768a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.904046 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f2a7f356-6278-409f-9047-efece8492b78-installation-pull-secrets\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.905006 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2a7f356-6278-409f-9047-efece8492b78-registry-tls\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.930081 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" 
event={"ID":"e8915d74-77d5-4d0f-9264-37b6f8167a6d","Type":"ContainerStarted","Data":"8542d355adf73f69f07f43de1576c808ff498e7de823e53e95d3e3374dcb57e6"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.930127 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" event={"ID":"e8915d74-77d5-4d0f-9264-37b6f8167a6d","Type":"ContainerStarted","Data":"f0040a5ad862ec92b613ac2596511147b20f77d1c6cc5c9856c4fdf45095bc38"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.931044 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.933798 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2a7f356-6278-409f-9047-efece8492b78-bound-sa-token\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.941141 5129 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xgc7f container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.941187 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" podUID="e8915d74-77d5-4d0f-9264-37b6f8167a6d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.941684 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-vtzj7" event={"ID":"095d17bf-b9c3-42e6-b8c9-39b929c52d50","Type":"ContainerStarted","Data":"62fdb3b0e6388a85df7033e09ba365bdb7f76d099a90917624c9cc39a0166861"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.941718 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vtzj7" event={"ID":"095d17bf-b9c3-42e6-b8c9-39b929c52d50","Type":"ContainerStarted","Data":"a56ab1ef32c18a034203cfa43d5c99e7a5d1fcbd7c08cf78b63ec939e782b95b"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.941733 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vtzj7" event={"ID":"095d17bf-b9c3-42e6-b8c9-39b929c52d50","Type":"ContainerStarted","Data":"274e242d13eb949982f5ddb24c55168695bf50b6de52d12e8d6217dc6746b883"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.948503 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-pphjw" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.953568 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5hm2\" (UniqueName: \"kubernetes.io/projected/f2a7f356-6278-409f-9047-efece8492b78-kube-api-access-g5hm2\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.956123 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw" event={"ID":"65a0ac6e-2c3b-4f24-8495-b13aa58017c1","Type":"ContainerStarted","Data":"8e03c4b0713b79cb3d429d1dc2c1b83bbf2f44b8c8bc0d555cab553769e6eba4"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.956155 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw" event={"ID":"65a0ac6e-2c3b-4f24-8495-b13aa58017c1","Type":"ContainerStarted","Data":"7c49c55acdd33b8f1f7b8e6350045a4381191fa5d73e7f19e6dbbeda4bf3375d"} Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.966243 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4z5nz"] Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.979848 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74wbs\" (UniqueName: \"kubernetes.io/projected/7d8d49c8-334d-45ba-bd88-2bd200200093-kube-api-access-74wbs\") pod \"etcd-operator-b45778765-cqxl7\" (UID: \"7d8d49c8-334d-45ba-bd88-2bd200200093\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.984183 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.984399 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhvtz\" (UniqueName: \"kubernetes.io/projected/a54d1367-e54c-4ebd-99f2-2ec9e8a449ec-kube-api-access-dhvtz\") pod \"migrator-59844c95c7-qjqs7\" (UID: \"a54d1367-e54c-4ebd-99f2-2ec9e8a449ec\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qjqs7" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.984425 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/72b1d134-f49d-4b86-9630-a7d851ac1c40-metrics-tls\") pod \"dns-default-t9ngb\" (UID: \"72b1d134-f49d-4b86-9630-a7d851ac1c40\") " pod="openshift-dns/dns-default-t9ngb" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.984452 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0986b18b-4eea-4f98-b246-017c17cbe895-socket-dir\") pod \"csi-hostpathplugin-rc492\" (UID: \"0986b18b-4eea-4f98-b246-017c17cbe895\") " pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.984522 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7hdk\" (UniqueName: \"kubernetes.io/projected/581cb821-3eab-416b-bb1e-013db618ed67-kube-api-access-c7hdk\") pod \"service-ca-operator-777779d784-msrqq\" (UID: \"581cb821-3eab-416b-bb1e-013db618ed67\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-msrqq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.984542 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0986b18b-4eea-4f98-b246-017c17cbe895-plugins-dir\") pod \"csi-hostpathplugin-rc492\" (UID: \"0986b18b-4eea-4f98-b246-017c17cbe895\") " pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.984564 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/67f3717d-a52d-46b9-9132-044239d564c3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rrt62\" (UID: \"67f3717d-a52d-46b9-9132-044239d564c3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rrt62" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.984587 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pht5r\" (UniqueName: \"kubernetes.io/projected/deb9ddae-7954-4db6-9864-bae7e7c25744-kube-api-access-pht5r\") pod \"ingress-canary-f2lk5\" (UID: \"deb9ddae-7954-4db6-9864-bae7e7c25744\") " pod="openshift-ingress-canary/ingress-canary-f2lk5" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.984660 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/021f1f3b-8ee9-424f-9c04-c56631332e92-config-volume\") pod \"collect-profiles-29557860-zdvj5\" (UID: \"021f1f3b-8ee9-424f-9c04-c56631332e92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.984690 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5516ae48-72a7-4f9e-9dd8-30618817f0c8-metrics-tls\") pod \"dns-operator-744455d44c-2b5g4\" (UID: \"5516ae48-72a7-4f9e-9dd8-30618817f0c8\") " pod="openshift-dns-operator/dns-operator-744455d44c-2b5g4" Mar 14 07:02:23 crc kubenswrapper[5129]: 
I0314 07:02:23.984722 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0986b18b-4eea-4f98-b246-017c17cbe895-csi-data-dir\") pod \"csi-hostpathplugin-rc492\" (UID: \"0986b18b-4eea-4f98-b246-017c17cbe895\") " pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.984738 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lfsd\" (UniqueName: \"kubernetes.io/projected/e9e49a0a-8f9f-4e78-8098-d195fe3297bd-kube-api-access-7lfsd\") pod \"auto-csr-approver-29557862-lf4bl\" (UID: \"e9e49a0a-8f9f-4e78-8098-d195fe3297bd\") " pod="openshift-infra/auto-csr-approver-29557862-lf4bl" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.984754 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deb9ddae-7954-4db6-9864-bae7e7c25744-cert\") pod \"ingress-canary-f2lk5\" (UID: \"deb9ddae-7954-4db6-9864-bae7e7c25744\") " pod="openshift-ingress-canary/ingress-canary-f2lk5" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.984821 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/50a96d01-d66b-4110-a608-aacd554c5111-profile-collector-cert\") pod \"catalog-operator-68c6474976-2snfc\" (UID: \"50a96d01-d66b-4110-a608-aacd554c5111\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.984843 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npnj9\" (UniqueName: \"kubernetes.io/projected/4d51a3db-e1b0-4b4d-91ad-5ef4217890d2-kube-api-access-npnj9\") pod \"machine-config-server-dsrgt\" (UID: \"4d51a3db-e1b0-4b4d-91ad-5ef4217890d2\") " pod="openshift-machine-config-operator/machine-config-server-dsrgt" Mar 14 
07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.984876 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/581cb821-3eab-416b-bb1e-013db618ed67-serving-cert\") pod \"service-ca-operator-777779d784-msrqq\" (UID: \"581cb821-3eab-416b-bb1e-013db618ed67\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-msrqq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.984893 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/581cb821-3eab-416b-bb1e-013db618ed67-config\") pod \"service-ca-operator-777779d784-msrqq\" (UID: \"581cb821-3eab-416b-bb1e-013db618ed67\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-msrqq" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.984978 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ct8n\" (UniqueName: \"kubernetes.io/projected/021f1f3b-8ee9-424f-9c04-c56631332e92-kube-api-access-4ct8n\") pod \"collect-profiles-29557860-zdvj5\" (UID: \"021f1f3b-8ee9-424f-9c04-c56631332e92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.984999 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5facc307-168d-4313-868a-8bf0db5e65ca-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lf26g\" (UID: \"5facc307-168d-4313-868a-8bf0db5e65ca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.985025 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/021f1f3b-8ee9-424f-9c04-c56631332e92-secret-volume\") pod \"collect-profiles-29557860-zdvj5\" (UID: 
\"021f1f3b-8ee9-424f-9c04-c56631332e92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.985052 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f7hp\" (UniqueName: \"kubernetes.io/projected/72b1d134-f49d-4b86-9630-a7d851ac1c40-kube-api-access-7f7hp\") pod \"dns-default-t9ngb\" (UID: \"72b1d134-f49d-4b86-9630-a7d851ac1c40\") " pod="openshift-dns/dns-default-t9ngb" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.985076 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a298896-ef40-44ff-a9ba-45fba603014b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cb8c5\" (UID: \"7a298896-ef40-44ff-a9ba-45fba603014b\") " pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.985121 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/50a96d01-d66b-4110-a608-aacd554c5111-srv-cert\") pod \"catalog-operator-68c6474976-2snfc\" (UID: \"50a96d01-d66b-4110-a608-aacd554c5111\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.985141 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmv6t\" (UniqueName: \"kubernetes.io/projected/5516ae48-72a7-4f9e-9dd8-30618817f0c8-kube-api-access-fmv6t\") pod \"dns-operator-744455d44c-2b5g4\" (UID: \"5516ae48-72a7-4f9e-9dd8-30618817f0c8\") " pod="openshift-dns-operator/dns-operator-744455d44c-2b5g4" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.985167 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p25l\" (UniqueName: 
\"kubernetes.io/projected/67f3717d-a52d-46b9-9132-044239d564c3-kube-api-access-7p25l\") pod \"control-plane-machine-set-operator-78cbb6b69f-rrt62\" (UID: \"67f3717d-a52d-46b9-9132-044239d564c3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rrt62" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.985195 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0986b18b-4eea-4f98-b246-017c17cbe895-registration-dir\") pod \"csi-hostpathplugin-rc492\" (UID: \"0986b18b-4eea-4f98-b246-017c17cbe895\") " pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.985210 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5facc307-168d-4313-868a-8bf0db5e65ca-srv-cert\") pod \"olm-operator-6b444d44fb-lf26g\" (UID: \"5facc307-168d-4313-868a-8bf0db5e65ca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.985255 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt5w5\" (UniqueName: \"kubernetes.io/projected/0986b18b-4eea-4f98-b246-017c17cbe895-kube-api-access-kt5w5\") pod \"csi-hostpathplugin-rc492\" (UID: \"0986b18b-4eea-4f98-b246-017c17cbe895\") " pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.985279 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4d51a3db-e1b0-4b4d-91ad-5ef4217890d2-certs\") pod \"machine-config-server-dsrgt\" (UID: \"4d51a3db-e1b0-4b4d-91ad-5ef4217890d2\") " pod="openshift-machine-config-operator/machine-config-server-dsrgt" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.985293 5129 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-7fxlw\" (UniqueName: \"kubernetes.io/projected/50a96d01-d66b-4110-a608-aacd554c5111-kube-api-access-7fxlw\") pod \"catalog-operator-68c6474976-2snfc\" (UID: \"50a96d01-d66b-4110-a608-aacd554c5111\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.985318 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72b1d134-f49d-4b86-9630-a7d851ac1c40-config-volume\") pod \"dns-default-t9ngb\" (UID: \"72b1d134-f49d-4b86-9630-a7d851ac1c40\") " pod="openshift-dns/dns-default-t9ngb" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.985336 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4d51a3db-e1b0-4b4d-91ad-5ef4217890d2-node-bootstrap-token\") pod \"machine-config-server-dsrgt\" (UID: \"4d51a3db-e1b0-4b4d-91ad-5ef4217890d2\") " pod="openshift-machine-config-operator/machine-config-server-dsrgt" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.985389 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9t7f\" (UniqueName: \"kubernetes.io/projected/5facc307-168d-4313-868a-8bf0db5e65ca-kube-api-access-f9t7f\") pod \"olm-operator-6b444d44fb-lf26g\" (UID: \"5facc307-168d-4313-868a-8bf0db5e65ca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.985425 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7a298896-ef40-44ff-a9ba-45fba603014b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cb8c5\" (UID: \"7a298896-ef40-44ff-a9ba-45fba603014b\") " pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" Mar 14 07:02:23 crc 
kubenswrapper[5129]: I0314 07:02:23.985450 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0986b18b-4eea-4f98-b246-017c17cbe895-mountpoint-dir\") pod \"csi-hostpathplugin-rc492\" (UID: \"0986b18b-4eea-4f98-b246-017c17cbe895\") " pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.985467 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-474v5\" (UniqueName: \"kubernetes.io/projected/7a298896-ef40-44ff-a9ba-45fba603014b-kube-api-access-474v5\") pod \"marketplace-operator-79b997595-cb8c5\" (UID: \"7a298896-ef40-44ff-a9ba-45fba603014b\") " pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" Mar 14 07:02:23 crc kubenswrapper[5129]: E0314 07:02:23.986787 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:24.486769107 +0000 UTC m=+207.238684291 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.993909 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/72b1d134-f49d-4b86-9630-a7d851ac1c40-metrics-tls\") pod \"dns-default-t9ngb\" (UID: \"72b1d134-f49d-4b86-9630-a7d851ac1c40\") " pod="openshift-dns/dns-default-t9ngb" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.993980 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0986b18b-4eea-4f98-b246-017c17cbe895-socket-dir\") pod \"csi-hostpathplugin-rc492\" (UID: \"0986b18b-4eea-4f98-b246-017c17cbe895\") " pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.994220 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0986b18b-4eea-4f98-b246-017c17cbe895-plugins-dir\") pod \"csi-hostpathplugin-rc492\" (UID: \"0986b18b-4eea-4f98-b246-017c17cbe895\") " pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.995830 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a298896-ef40-44ff-a9ba-45fba603014b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cb8c5\" (UID: \"7a298896-ef40-44ff-a9ba-45fba603014b\") " pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" Mar 14 07:02:23 
crc kubenswrapper[5129]: I0314 07:02:23.996894 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0986b18b-4eea-4f98-b246-017c17cbe895-csi-data-dir\") pod \"csi-hostpathplugin-rc492\" (UID: \"0986b18b-4eea-4f98-b246-017c17cbe895\") " pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.996907 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/021f1f3b-8ee9-424f-9c04-c56631332e92-config-volume\") pod \"collect-profiles-29557860-zdvj5\" (UID: \"021f1f3b-8ee9-424f-9c04-c56631332e92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.998749 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0986b18b-4eea-4f98-b246-017c17cbe895-registration-dir\") pod \"csi-hostpathplugin-rc492\" (UID: \"0986b18b-4eea-4f98-b246-017c17cbe895\") " pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.999255 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/67f3717d-a52d-46b9-9132-044239d564c3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rrt62\" (UID: \"67f3717d-a52d-46b9-9132-044239d564c3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rrt62" Mar 14 07:02:23 crc kubenswrapper[5129]: I0314 07:02:23.999538 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/581cb821-3eab-416b-bb1e-013db618ed67-config\") pod \"service-ca-operator-777779d784-msrqq\" (UID: \"581cb821-3eab-416b-bb1e-013db618ed67\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-msrqq" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.000837 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72b1d134-f49d-4b86-9630-a7d851ac1c40-config-volume\") pod \"dns-default-t9ngb\" (UID: \"72b1d134-f49d-4b86-9630-a7d851ac1c40\") " pod="openshift-dns/dns-default-t9ngb" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.000929 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0986b18b-4eea-4f98-b246-017c17cbe895-mountpoint-dir\") pod \"csi-hostpathplugin-rc492\" (UID: \"0986b18b-4eea-4f98-b246-017c17cbe895\") " pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.002611 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/021f1f3b-8ee9-424f-9c04-c56631332e92-secret-volume\") pod \"collect-profiles-29557860-zdvj5\" (UID: \"021f1f3b-8ee9-424f-9c04-c56631332e92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.003580 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deb9ddae-7954-4db6-9864-bae7e7c25744-cert\") pod \"ingress-canary-f2lk5\" (UID: \"deb9ddae-7954-4db6-9864-bae7e7c25744\") " pod="openshift-ingress-canary/ingress-canary-f2lk5" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.017201 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/50a96d01-d66b-4110-a608-aacd554c5111-profile-collector-cert\") pod \"catalog-operator-68c6474976-2snfc\" (UID: \"50a96d01-d66b-4110-a608-aacd554c5111\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.017755 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5facc307-168d-4313-868a-8bf0db5e65ca-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lf26g\" (UID: \"5facc307-168d-4313-868a-8bf0db5e65ca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.019006 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4d51a3db-e1b0-4b4d-91ad-5ef4217890d2-node-bootstrap-token\") pod \"machine-config-server-dsrgt\" (UID: \"4d51a3db-e1b0-4b4d-91ad-5ef4217890d2\") " pod="openshift-machine-config-operator/machine-config-server-dsrgt" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.020483 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/50a96d01-d66b-4110-a608-aacd554c5111-srv-cert\") pod \"catalog-operator-68c6474976-2snfc\" (UID: \"50a96d01-d66b-4110-a608-aacd554c5111\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.020908 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/581cb821-3eab-416b-bb1e-013db618ed67-serving-cert\") pod \"service-ca-operator-777779d784-msrqq\" (UID: \"581cb821-3eab-416b-bb1e-013db618ed67\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-msrqq" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.021001 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4d51a3db-e1b0-4b4d-91ad-5ef4217890d2-certs\") pod \"machine-config-server-dsrgt\" (UID: 
\"4d51a3db-e1b0-4b4d-91ad-5ef4217890d2\") " pod="openshift-machine-config-operator/machine-config-server-dsrgt" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.021299 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5516ae48-72a7-4f9e-9dd8-30618817f0c8-metrics-tls\") pod \"dns-operator-744455d44c-2b5g4\" (UID: \"5516ae48-72a7-4f9e-9dd8-30618817f0c8\") " pod="openshift-dns-operator/dns-operator-744455d44c-2b5g4" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.028650 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7a298896-ef40-44ff-a9ba-45fba603014b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cb8c5\" (UID: \"7a298896-ef40-44ff-a9ba-45fba603014b\") " pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.038566 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5facc307-168d-4313-868a-8bf0db5e65ca-srv-cert\") pod \"olm-operator-6b444d44fb-lf26g\" (UID: \"5facc307-168d-4313-868a-8bf0db5e65ca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.044085 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-474v5\" (UniqueName: \"kubernetes.io/projected/7a298896-ef40-44ff-a9ba-45fba603014b-kube-api-access-474v5\") pod \"marketplace-operator-79b997595-cb8c5\" (UID: \"7a298896-ef40-44ff-a9ba-45fba603014b\") " pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.058822 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ct8n\" (UniqueName: 
\"kubernetes.io/projected/021f1f3b-8ee9-424f-9c04-c56631332e92-kube-api-access-4ct8n\") pod \"collect-profiles-29557860-zdvj5\" (UID: \"021f1f3b-8ee9-424f-9c04-c56631332e92\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.068324 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhvtz\" (UniqueName: \"kubernetes.io/projected/a54d1367-e54c-4ebd-99f2-2ec9e8a449ec-kube-api-access-dhvtz\") pod \"migrator-59844c95c7-qjqs7\" (UID: \"a54d1367-e54c-4ebd-99f2-2ec9e8a449ec\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qjqs7" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.087365 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:24 crc kubenswrapper[5129]: E0314 07:02:24.087703 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:24.587690208 +0000 UTC m=+207.339605392 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.087793 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6z96"] Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.088183 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.101856 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt5w5\" (UniqueName: \"kubernetes.io/projected/0986b18b-4eea-4f98-b246-017c17cbe895-kube-api-access-kt5w5\") pod \"csi-hostpathplugin-rc492\" (UID: \"0986b18b-4eea-4f98-b246-017c17cbe895\") " pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.110305 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pht5r\" (UniqueName: \"kubernetes.io/projected/deb9ddae-7954-4db6-9864-bae7e7c25744-kube-api-access-pht5r\") pod \"ingress-canary-f2lk5\" (UID: \"deb9ddae-7954-4db6-9864-bae7e7c25744\") " pod="openshift-ingress-canary/ingress-canary-f2lk5" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.130151 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7hdk\" (UniqueName: \"kubernetes.io/projected/581cb821-3eab-416b-bb1e-013db618ed67-kube-api-access-c7hdk\") pod \"service-ca-operator-777779d784-msrqq\" (UID: 
\"581cb821-3eab-416b-bb1e-013db618ed67\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-msrqq" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.159720 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f7hp\" (UniqueName: \"kubernetes.io/projected/72b1d134-f49d-4b86-9630-a7d851ac1c40-kube-api-access-7f7hp\") pod \"dns-default-t9ngb\" (UID: \"72b1d134-f49d-4b86-9630-a7d851ac1c40\") " pod="openshift-dns/dns-default-t9ngb" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.169031 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmv6t\" (UniqueName: \"kubernetes.io/projected/5516ae48-72a7-4f9e-9dd8-30618817f0c8-kube-api-access-fmv6t\") pod \"dns-operator-744455d44c-2b5g4\" (UID: \"5516ae48-72a7-4f9e-9dd8-30618817f0c8\") " pod="openshift-dns-operator/dns-operator-744455d44c-2b5g4" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.188782 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:24 crc kubenswrapper[5129]: E0314 07:02:24.189021 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:24.688982351 +0000 UTC m=+207.440897535 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.189518 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:24 crc kubenswrapper[5129]: E0314 07:02:24.189890 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:24.689876077 +0000 UTC m=+207.441791261 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.193992 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p25l\" (UniqueName: \"kubernetes.io/projected/67f3717d-a52d-46b9-9132-044239d564c3-kube-api-access-7p25l\") pod \"control-plane-machine-set-operator-78cbb6b69f-rrt62\" (UID: \"67f3717d-a52d-46b9-9132-044239d564c3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rrt62" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.198696 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hcrvq"] Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.200273 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj"] Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.219948 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fxlw\" (UniqueName: \"kubernetes.io/projected/50a96d01-d66b-4110-a608-aacd554c5111-kube-api-access-7fxlw\") pod \"catalog-operator-68c6474976-2snfc\" (UID: \"50a96d01-d66b-4110-a608-aacd554c5111\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.228232 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9t7f\" (UniqueName: 
\"kubernetes.io/projected/5facc307-168d-4313-868a-8bf0db5e65ca-kube-api-access-f9t7f\") pod \"olm-operator-6b444d44fb-lf26g\" (UID: \"5facc307-168d-4313-868a-8bf0db5e65ca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.243408 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.243484 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.246275 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mss56"] Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.251227 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lfsd\" (UniqueName: \"kubernetes.io/projected/e9e49a0a-8f9f-4e78-8098-d195fe3297bd-kube-api-access-7lfsd\") pod \"auto-csr-approver-29557862-lf4bl\" (UID: \"e9e49a0a-8f9f-4e78-8098-d195fe3297bd\") " pod="openshift-infra/auto-csr-approver-29557862-lf4bl" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.251501 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-2b5g4" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.259863 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" Mar 14 07:02:24 crc kubenswrapper[5129]: W0314 07:02:24.265106 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa4b0c05_3ff2_4e52_b028_603d2eb22adc.slice/crio-6c2ce7db23f89acff828bf52443b02ba1257c2437561403064d4028b7a7ca1ad WatchSource:0}: Error finding container 6c2ce7db23f89acff828bf52443b02ba1257c2437561403064d4028b7a7ca1ad: Status 404 returned error can't find the container with id 6c2ce7db23f89acff828bf52443b02ba1257c2437561403064d4028b7a7ca1ad Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.266876 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rrt62" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.272589 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npnj9\" (UniqueName: \"kubernetes.io/projected/4d51a3db-e1b0-4b4d-91ad-5ef4217890d2-kube-api-access-npnj9\") pod \"machine-config-server-dsrgt\" (UID: \"4d51a3db-e1b0-4b4d-91ad-5ef4217890d2\") " pod="openshift-machine-config-operator/machine-config-server-dsrgt" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.278534 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.286161 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-msrqq" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.290809 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:24 crc kubenswrapper[5129]: E0314 07:02:24.291110 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:24.791095187 +0000 UTC m=+207.543010371 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.297494 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qjqs7" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.301287 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rc492" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.339363 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9"] Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.348034 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dsrgt" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.348189 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t9ngb" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.351673 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-f2lk5" Mar 14 07:02:24 crc kubenswrapper[5129]: W0314 07:02:24.353283 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c116925_b69d_4719_9e7b_7bfdf13ee5f5.slice/crio-90d0a935fa7654229e09276e45e80289722218ed1b2b09c779b5710f934a5316 WatchSource:0}: Error finding container 90d0a935fa7654229e09276e45e80289722218ed1b2b09c779b5710f934a5316: Status 404 returned error can't find the container with id 90d0a935fa7654229e09276e45e80289722218ed1b2b09c779b5710f934a5316 Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.395511 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-trtr5"] Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.405165 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:24 crc kubenswrapper[5129]: E0314 07:02:24.405460 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:24.905446727 +0000 UTC m=+207.657361911 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.506315 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:24 crc kubenswrapper[5129]: E0314 07:02:24.506430 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:25.006409929 +0000 UTC m=+207.758325123 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.509759 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:24 crc kubenswrapper[5129]: E0314 07:02:24.510270 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:25.01025709 +0000 UTC m=+207.762172274 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.524796 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nfv9f"] Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.525017 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557862-lf4bl" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.527501 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lr6qd"] Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.537984 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2"] Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.611125 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:24 crc kubenswrapper[5129]: E0314 07:02:24.611462 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-14 07:02:25.111447689 +0000 UTC m=+207.863362873 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.622255 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pphjw"] Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.648939 5129 ???:1] "http: TLS handshake error from 192.168.126.11:35636: no serving certificate available for the kubelet" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.694827 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng"] Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.713214 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:24 crc kubenswrapper[5129]: E0314 07:02:24.713481 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:25.213469563 +0000 UTC m=+207.965384747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.752702 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-m7rjz"] Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.756037 5129 ???:1] "http: TLS handshake error from 192.168.126.11:35652: no serving certificate available for the kubelet" Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.814602 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:24 crc kubenswrapper[5129]: E0314 07:02:24.815433 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:25.315415774 +0000 UTC m=+208.067330958 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.903076 5129 ???:1] "http: TLS handshake error from 192.168.126.11:35656: no serving certificate available for the kubelet" Mar 14 07:02:24 crc kubenswrapper[5129]: W0314 07:02:24.907098 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14a59e71_3fa0_4957_b79e_927190083c0d.slice/crio-54a9df6c13494351e04b92f68a919ace401ca913d2775e834e52b2e9b9c94ab5 WatchSource:0}: Error finding container 54a9df6c13494351e04b92f68a919ace401ca913d2775e834e52b2e9b9c94ab5: Status 404 returned error can't find the container with id 54a9df6c13494351e04b92f68a919ace401ca913d2775e834e52b2e9b9c94ab5 Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.918005 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:24 crc kubenswrapper[5129]: E0314 07:02:24.918560 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:25.41854361 +0000 UTC m=+208.170458804 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:24 crc kubenswrapper[5129]: W0314 07:02:24.933778 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41165ac7_1905_437e_b313_2917b3f80168.slice/crio-8a5335daa3e650f1027323e4c23f1eed2f759e38cd0d9c7bdccd7edc7469d995 WatchSource:0}: Error finding container 8a5335daa3e650f1027323e4c23f1eed2f759e38cd0d9c7bdccd7edc7469d995: Status 404 returned error can't find the container with id 8a5335daa3e650f1027323e4c23f1eed2f759e38cd0d9c7bdccd7edc7469d995 Mar 14 07:02:24 crc kubenswrapper[5129]: I0314 07:02:24.959021 5129 ???:1] "http: TLS handshake error from 192.168.126.11:35670: no serving certificate available for the kubelet" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.013651 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lr6qd" event={"ID":"e16d3836-813b-4a25-8dbb-8f5330008ec7","Type":"ContainerStarted","Data":"5208de7f827220d809a86582940703ec428448a69ef4ac0d1a06a0a14acf2e87"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.017717 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rsmg2" event={"ID":"599bd4dc-4ab7-403b-9ede-30ed9dbca320","Type":"ContainerStarted","Data":"5190fa508a30e27168ef31ad78667c228c06b088405c52cbff0a98256466c1b2"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.017783 5129 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rsmg2" event={"ID":"599bd4dc-4ab7-403b-9ede-30ed9dbca320","Type":"ContainerStarted","Data":"c645ccdc055116a61bd7045e9ee92e8362be7cfc919f3f6746f5023a60adef1b"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.019085 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:25 crc kubenswrapper[5129]: E0314 07:02:25.019354 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:25.519337538 +0000 UTC m=+208.271252722 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.028672 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4z5nz" event={"ID":"b81a1cd4-9e3a-411a-921f-3cd1139be7e3","Type":"ContainerStarted","Data":"ad8f9ee40a0b97e1c8f5d6a357ab57321b992152ff3f651abf185727cdb4f1bd"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.028716 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4z5nz" event={"ID":"b81a1cd4-9e3a-411a-921f-3cd1139be7e3","Type":"ContainerStarted","Data":"1f9b093530165b79b3b11a64f30abb3acc7eb2cbdb4d3964556dd28d1d55bf10"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.032921 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nfv9f" event={"ID":"2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a","Type":"ContainerStarted","Data":"3ac3c1985d9757bf587683e8007340456a039342f69f1440848270ff631f63e7"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.057009 5129 ???:1] "http: TLS handshake error from 192.168.126.11:35680: no serving certificate available for the kubelet" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.059844 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" 
event={"ID":"41165ac7-1905-437e-b313-2917b3f80168","Type":"ContainerStarted","Data":"8a5335daa3e650f1027323e4c23f1eed2f759e38cd0d9c7bdccd7edc7469d995"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.070578 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-vpw78" podStartSLOduration=158.070558915 podStartE2EDuration="2m38.070558915s" podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:25.068900637 +0000 UTC m=+207.820815841" watchObservedRunningTime="2026-03-14 07:02:25.070558915 +0000 UTC m=+207.822474089" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.078383 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gvck8" event={"ID":"29a50e5b-e484-4916-be70-893887f8405e","Type":"ContainerStarted","Data":"9059784796de44043fb0837c28527fc12cb1e327827e224ad4349b93f9dc32ef"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.082416 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" event={"ID":"6c116925-b69d-4719-9e7b-7bfdf13ee5f5","Type":"ContainerStarted","Data":"90d0a935fa7654229e09276e45e80289722218ed1b2b09c779b5710f934a5316"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.101898 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9" event={"ID":"405015c6-fd32-4d6f-86ba-7d44197a6a23","Type":"ContainerStarted","Data":"9b50fdce14430cde5ec6f58b70431bcd8d5ad805cc9186c3949e0f0e313bd98e"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.114131 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mss56" 
event={"ID":"a97c3ccc-71bd-4f3e-8457-5e023732cb5b","Type":"ContainerStarted","Data":"402b9afe1607203155deaf53e6a3f87cfa0afa79a6481ec2bbb208be6b4009f2"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.132612 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:25 crc kubenswrapper[5129]: E0314 07:02:25.134166 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:25.63414731 +0000 UTC m=+208.386062594 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.138794 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2" event={"ID":"285a43ca-3a99-4b85-96b3-6b8619bb5853","Type":"ContainerStarted","Data":"6d2388e632009cf31c8cb6575c77dc374dced34caca309cf3796a045a79400a3"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.171498 5129 ???:1] "http: TLS handshake error from 192.168.126.11:35682: no serving certificate available for the kubelet" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.172250 
5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nz9kg" event={"ID":"6de214af-00b4-4c8f-8f7d-633419b87e45","Type":"ContainerStarted","Data":"944029f93458423974fbf606bd61f57be0956b53960a796fdfe839ef652bc3f8"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.172301 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nz9kg" event={"ID":"6de214af-00b4-4c8f-8f7d-633419b87e45","Type":"ContainerStarted","Data":"2754b528ea39d1a6f7d2f559fddecf05aa6c63dba32276bf45ac89cac60b675d"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.185365 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pphjw" event={"ID":"14a59e71-3fa0-4957-b79e-927190083c0d","Type":"ContainerStarted","Data":"54a9df6c13494351e04b92f68a919ace401ca913d2775e834e52b2e9b9c94ab5"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.235694 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" event={"ID":"e61dec00-008c-454f-a37d-87b8a21d4733","Type":"ContainerStarted","Data":"4803ec91293021dceb48e9e93d80ab56756b886adecdcef7a892f0d809d664fa"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.236212 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:25 crc kubenswrapper[5129]: E0314 07:02:25.239798 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-03-14 07:02:25.739759916 +0000 UTC m=+208.491675100 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.240133 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:25 crc kubenswrapper[5129]: E0314 07:02:25.241152 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:25.741132536 +0000 UTC m=+208.493047720 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.250916 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj" event={"ID":"fa4b0c05-3ff2-4e52-b028-603d2eb22adc","Type":"ContainerStarted","Data":"6c2ce7db23f89acff828bf52443b02ba1257c2437561403064d4028b7a7ca1ad"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.257333 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-trtr5" event={"ID":"b99871bf-2660-4b7c-8ba4-e9cdae008681","Type":"ContainerStarted","Data":"790bc1c3c80f38eeaad42cdf3b19be8fd51fde15ceb9700bc694e88465ebcd44"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.277325 5129 ???:1] "http: TLS handshake error from 192.168.126.11:35694: no serving certificate available for the kubelet" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.290021 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hrz85" event={"ID":"50cfd6b1-bf1f-4a8b-bd6d-9799281f1359","Type":"ContainerStarted","Data":"12152df38e84f4c844235a18b1ab37126c2245821a9bd1b51aacf5a06010d974"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.290916 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-hrz85" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.310899 5129 patch_prober.go:28] interesting pod/downloads-7954f5f757-hrz85 container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.310995 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hrz85" podUID="50cfd6b1-bf1f-4a8b-bd6d-9799281f1359" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.321102 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-cqxl7"] Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.328394 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6z96" event={"ID":"0d03542f-7483-4aea-9651-4ab9a4ff1378","Type":"ContainerStarted","Data":"3c02b2eaa566d0ba708cec1369e4b3e21527e2718831470da55569333e5be22b"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.338274 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" podStartSLOduration=158.338249378 podStartE2EDuration="2m38.338249378s" podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:25.313293638 +0000 UTC m=+208.065208822" watchObservedRunningTime="2026-03-14 07:02:25.338249378 +0000 UTC m=+208.090164562" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.342202 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:25 crc kubenswrapper[5129]: E0314 07:02:25.342720 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:25.842699956 +0000 UTC m=+208.594615140 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.357055 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2b5g4"] Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.357820 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9d5pw" podStartSLOduration=158.357790702 podStartE2EDuration="2m38.357790702s" podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:25.347073443 +0000 UTC m=+208.098988637" watchObservedRunningTime="2026-03-14 07:02:25.357790702 +0000 UTC m=+208.109705886" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.381677 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gkdgj" 
event={"ID":"dfbd40ee-070a-48b0-a67b-deb66288082b","Type":"ContainerStarted","Data":"e8231ca7633dc857580374bd2b91ce1ae46645237b2db600631f1c4b6e283ff1"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.382305 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-gkdgj" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.386787 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" event={"ID":"b633b0c1-73e2-445d-8f82-0f37854b2fc7","Type":"ContainerStarted","Data":"b01158579157e2c67e7f14dcab067f1b7c8717f4810120f8d4c21f060fa3550c"} Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.392760 5129 patch_prober.go:28] interesting pod/console-operator-58897d9998-gkdgj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.392828 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gkdgj" podUID="dfbd40ee-070a-48b0-a67b-deb66288082b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.397221 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.414930 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.428055 5129 ???:1] "http: TLS handshake error from 192.168.126.11:35700: no serving 
certificate available for the kubelet" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.487137 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:25 crc kubenswrapper[5129]: E0314 07:02:25.489373 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:25.989339358 +0000 UTC m=+208.741254732 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.495704 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tm2rg"] Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.546834 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.587590 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gnkjw" podStartSLOduration=158.587575192 podStartE2EDuration="2m38.587575192s" 
podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:25.549857703 +0000 UTC m=+208.301772887" watchObservedRunningTime="2026-03-14 07:02:25.587575192 +0000 UTC m=+208.339490376" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.588254 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" podStartSLOduration=157.588248041 podStartE2EDuration="2m37.588248041s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:25.586754877 +0000 UTC m=+208.338670071" watchObservedRunningTime="2026-03-14 07:02:25.588248041 +0000 UTC m=+208.340163225" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.589022 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:25 crc kubenswrapper[5129]: E0314 07:02:25.590880 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:26.090857086 +0000 UTC m=+208.842772300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.690802 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:25 crc kubenswrapper[5129]: E0314 07:02:25.691160 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:26.19114647 +0000 UTC m=+208.943061654 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.702266 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-vtzj7" podStartSLOduration=157.70225085 podStartE2EDuration="2m37.70225085s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:25.699946734 +0000 UTC m=+208.451861918" watchObservedRunningTime="2026-03-14 07:02:25.70225085 +0000 UTC m=+208.454166034" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.793009 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:25 crc kubenswrapper[5129]: E0314 07:02:25.793650 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:26.293604305 +0000 UTC m=+209.045519489 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.799957 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.800124 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.800248 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:02:25 crc kubenswrapper[5129]: E0314 07:02:25.802868 5129 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:26.302852122 +0000 UTC m=+209.054767306 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.816566 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.822883 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.823162 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:25 crc kubenswrapper[5129]: [-]has-synced failed: reason withheld Mar 14 07:02:25 crc 
kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:25 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.823199 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.902429 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.902967 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.903062 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:02:25 crc kubenswrapper[5129]: E0314 07:02:25.904808 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-14 07:02:26.404783083 +0000 UTC m=+209.156698267 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.913595 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.915501 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:02:25 crc kubenswrapper[5129]: I0314 07:02:25.977148 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:25.998182 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:25.999239 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.005488 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:26 crc kubenswrapper[5129]: E0314 07:02:26.010984 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:26.510964747 +0000 UTC m=+209.262879931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.108160 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:26 crc kubenswrapper[5129]: E0314 07:02:26.108305 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:26.608279514 +0000 UTC m=+209.360194698 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.108699 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:26 crc kubenswrapper[5129]: E0314 07:02:26.109028 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:26.609013975 +0000 UTC m=+209.360929159 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.113450 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.142080 5129 ???:1] "http: TLS handshake error from 192.168.126.11:35708: no serving certificate available for the kubelet" Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.150219 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5"] Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.154661 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" podStartSLOduration=158.154632081 podStartE2EDuration="2m38.154632081s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:26.143582042 +0000 UTC m=+208.895497226" watchObservedRunningTime="2026-03-14 07:02:26.154632081 +0000 UTC m=+208.906547275" Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.186486 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-gvck8" podStartSLOduration=158.186465919 podStartE2EDuration="2m38.186465919s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:26.184519624 +0000 UTC m=+208.936434818" watchObservedRunningTime="2026-03-14 07:02:26.186465919 +0000 UTC m=+208.938381103" Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.211146 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:26 crc kubenswrapper[5129]: E0314 07:02:26.211499 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:26.711483431 +0000 UTC m=+209.463398615 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.231893 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-gkdgj" podStartSLOduration=159.23187334 podStartE2EDuration="2m39.23187334s" podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:26.226351551 +0000 UTC m=+208.978266745" watchObservedRunningTime="2026-03-14 07:02:26.23187334 +0000 UTC m=+208.983788524" Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.269839 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-hrz85" podStartSLOduration=159.269822005 podStartE2EDuration="2m39.269822005s" podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:26.265297985 +0000 UTC m=+209.017213169" watchObservedRunningTime="2026-03-14 07:02:26.269822005 +0000 UTC m=+209.021737189" Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.313170 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: 
\"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:26 crc kubenswrapper[5129]: E0314 07:02:26.313715 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:26.813689891 +0000 UTC m=+209.565605075 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.366112 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rsmg2" podStartSLOduration=158.366095373 podStartE2EDuration="2m38.366095373s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:26.365153295 +0000 UTC m=+209.117068489" watchObservedRunningTime="2026-03-14 07:02:26.366095373 +0000 UTC m=+209.118010557" Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.381713 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qjqs7"] Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.418338 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:26 crc kubenswrapper[5129]: E0314 07:02:26.418931 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:26.918897925 +0000 UTC m=+209.670813109 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.478951 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" podStartSLOduration=158.478935638 podStartE2EDuration="2m38.478935638s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:26.448093828 +0000 UTC m=+209.200009012" watchObservedRunningTime="2026-03-14 07:02:26.478935638 +0000 UTC m=+209.230850822" Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.490067 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc"] Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.490589 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-cb8c5"] Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.493984 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tm2rg" event={"ID":"ce155720-ca35-4e8d-8b93-1d20fe7368e6","Type":"ContainerStarted","Data":"e55d9c7c86c7847ded704b4c85969ac22b55527d1bef3c00d5899de490cad1f2"} Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.506729 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rrt62"] Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.523638 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:26 crc kubenswrapper[5129]: E0314 07:02:26.524004 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:27.023991518 +0000 UTC m=+209.775906702 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.524399 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5" event={"ID":"021f1f3b-8ee9-424f-9c04-c56631332e92","Type":"ContainerStarted","Data":"b45af925f36a74687a11f4b4606bb2a96911004e32869078765e08b1b11cfcea"} Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.550055 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-msrqq"] Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.558014 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nz9kg" podStartSLOduration=158.557992799 podStartE2EDuration="2m38.557992799s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:26.514326698 +0000 UTC m=+209.266241882" watchObservedRunningTime="2026-03-14 07:02:26.557992799 +0000 UTC m=+209.309907983" Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.567313 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:26 crc kubenswrapper[5129]: [-]has-synced failed: 
reason withheld Mar 14 07:02:26 crc kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:26 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.567358 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.624558 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9" event={"ID":"405015c6-fd32-4d6f-86ba-7d44197a6a23","Type":"ContainerStarted","Data":"203efdf293670bbaedf2fb0cc5c1424b71d78619027b84d0f99dc53f22aaf79f"} Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.625635 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:26 crc kubenswrapper[5129]: E0314 07:02:26.625780 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:27.125750183 +0000 UTC m=+209.877665367 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.625891 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:26 crc kubenswrapper[5129]: E0314 07:02:26.626165 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:27.126155855 +0000 UTC m=+209.878071039 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.626632 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t9ngb"] Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.660897 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2b5g4" event={"ID":"5516ae48-72a7-4f9e-9dd8-30618817f0c8","Type":"ContainerStarted","Data":"315b6f83a2aeecf4acdaafc174a40c69d4816d04f1a6ec779f915cafd23dc9d5"} Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.675451 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s4fr9" podStartSLOduration=159.675423437 podStartE2EDuration="2m39.675423437s" podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:26.664145381 +0000 UTC m=+209.416060585" watchObservedRunningTime="2026-03-14 07:02:26.675423437 +0000 UTC m=+209.427338621" Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.685177 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dsrgt" event={"ID":"4d51a3db-e1b0-4b4d-91ad-5ef4217890d2","Type":"ContainerStarted","Data":"1a2d4081df85a5ede0e331b763e4ae2e95982af5433e13c4c22270f5c1ee2e94"} Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.685425 5129 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-server-dsrgt" event={"ID":"4d51a3db-e1b0-4b4d-91ad-5ef4217890d2","Type":"ContainerStarted","Data":"a4d50d629a2f6757a63e38f3acef134f0a7f3be3082c2ffa5da739200b06e974"} Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.706096 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g"] Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.733342 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:26 crc kubenswrapper[5129]: E0314 07:02:26.735305 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:27.235280253 +0000 UTC m=+209.987195437 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.759213 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4z5nz" event={"ID":"b81a1cd4-9e3a-411a-921f-3cd1139be7e3","Type":"ContainerStarted","Data":"87dde8eb2321f8ce9abbe0842cf727e706a67130c93ec0546cd86f7548c6cc9b"} Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.782941 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-dsrgt" podStartSLOduration=5.782887517 podStartE2EDuration="5.782887517s" podCreationTimestamp="2026-03-14 07:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:26.780198579 +0000 UTC m=+209.532113763" watchObservedRunningTime="2026-03-14 07:02:26.782887517 +0000 UTC m=+209.534802701" Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.824232 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4z5nz" podStartSLOduration=158.824213469 podStartE2EDuration="2m38.824213469s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:26.823224391 +0000 UTC m=+209.575139575" watchObservedRunningTime="2026-03-14 07:02:26.824213469 +0000 UTC 
m=+209.576128653" Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.840947 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:26 crc kubenswrapper[5129]: E0314 07:02:26.841357 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:27.341319743 +0000 UTC m=+210.093234927 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.871477 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6z96" event={"ID":"0d03542f-7483-4aea-9651-4ab9a4ff1378","Type":"ContainerStarted","Data":"1af5d303d6998ff24c2164427e50219b8eabc8b8a1a3b6aaaf7f37ffc62c2a2e"} Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.895631 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" event={"ID":"7d8d49c8-334d-45ba-bd88-2bd200200093","Type":"ContainerStarted","Data":"4f77baf7a0d243bb528b586c8abe3738f2cbcd14941c8ee0a88558c24818c9a1"} Mar 14 07:02:26 crc 
kubenswrapper[5129]: I0314 07:02:26.911722 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-m7rjz" event={"ID":"42d74e97-1c27-4447-afe2-d10bb4b3a1b6","Type":"ContainerStarted","Data":"b9b15e7d2ab858b749ea6bff4d231f041a13788dcc485aa3aec61a4b8bb7c9e3"} Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.925586 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rc492"] Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.935850 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" event={"ID":"6c116925-b69d-4719-9e7b-7bfdf13ee5f5","Type":"ContainerStarted","Data":"34757632450f11dfdccb88463e993301325f1c6280e74ab83b5f54be43a641dc"} Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.942569 5129 patch_prober.go:28] interesting pod/downloads-7954f5f757-hrz85 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.942957 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hrz85" podUID="50cfd6b1-bf1f-4a8b-bd6d-9799281f1359" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.943994 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:26 crc kubenswrapper[5129]: E0314 07:02:26.945117 5129 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:27.445099217 +0000 UTC m=+210.197014401 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.955433 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6z96" podStartSLOduration=159.955408235 podStartE2EDuration="2m39.955408235s" podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:26.951826771 +0000 UTC m=+209.703741955" watchObservedRunningTime="2026-03-14 07:02:26.955408235 +0000 UTC m=+209.707323419" Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.991553 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-f2lk5"] Mar 14 07:02:26 crc kubenswrapper[5129]: I0314 07:02:26.991738 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-gkdgj" Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.005555 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-m7rjz" podStartSLOduration=159.005538091 
podStartE2EDuration="2m39.005538091s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:27.000927778 +0000 UTC m=+209.752842972" watchObservedRunningTime="2026-03-14 07:02:27.005538091 +0000 UTC m=+209.757453275" Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.007749 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557862-lf4bl"] Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.043217 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-hcrvq" podStartSLOduration=160.043200357 podStartE2EDuration="2m40.043200357s" podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:27.040910641 +0000 UTC m=+209.792825835" watchObservedRunningTime="2026-03-14 07:02:27.043200357 +0000 UTC m=+209.795115541" Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.045884 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:27 crc kubenswrapper[5129]: E0314 07:02:27.054409 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:27.55439357 +0000 UTC m=+210.306308754 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.148487 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:27 crc kubenswrapper[5129]: E0314 07:02:27.148852 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:27.648823755 +0000 UTC m=+210.400738939 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.149224 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:27 crc kubenswrapper[5129]: E0314 07:02:27.152168 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:27.65215337 +0000 UTC m=+210.404068554 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.253350 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:27 crc kubenswrapper[5129]: E0314 07:02:27.253922 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:27.753896616 +0000 UTC m=+210.505811800 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.352705 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.354894 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:27 crc kubenswrapper[5129]: E0314 07:02:27.355255 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:27.85524497 +0000 UTC m=+210.607160154 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:27 crc kubenswrapper[5129]: W0314 07:02:27.393126 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeb9ddae_7954_4db6_9864_bae7e7c25744.slice/crio-c49f8aad446e3b178601599180b7399d46f9415518cf369e4d6b404af8eb4c1b WatchSource:0}: Error finding container c49f8aad446e3b178601599180b7399d46f9415518cf369e4d6b404af8eb4c1b: Status 404 returned error can't find the container with id c49f8aad446e3b178601599180b7399d46f9415518cf369e4d6b404af8eb4c1b Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.457495 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:27 crc kubenswrapper[5129]: E0314 07:02:27.457684 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:27.957661525 +0000 UTC m=+210.709576709 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.457943 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:27 crc kubenswrapper[5129]: E0314 07:02:27.458389 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:27.958377405 +0000 UTC m=+210.710292579 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.501143 5129 ???:1] "http: TLS handshake error from 192.168.126.11:35710: no serving certificate available for the kubelet" Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.559087 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:27 crc kubenswrapper[5129]: E0314 07:02:27.559689 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:28.059635357 +0000 UTC m=+210.811550541 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.562581 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:27 crc kubenswrapper[5129]: [-]has-synced failed: reason withheld Mar 14 07:02:27 crc kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:27 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.562722 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.568375 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.582095 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.649757 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.660427 5129 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:27 crc kubenswrapper[5129]: E0314 07:02:27.660838 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:28.160822266 +0000 UTC m=+210.912737450 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.762138 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:27 crc kubenswrapper[5129]: E0314 07:02:27.762792 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:28.262775677 +0000 UTC m=+211.014690861 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:27 crc kubenswrapper[5129]: W0314 07:02:27.765895 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-6b3427002c4950e0d8e18ef576a5f837c3eaef32ee6dfd8dbe1578d3746ed7eb WatchSource:0}: Error finding container 6b3427002c4950e0d8e18ef576a5f837c3eaef32ee6dfd8dbe1578d3746ed7eb: Status 404 returned error can't find the container with id 6b3427002c4950e0d8e18ef576a5f837c3eaef32ee6dfd8dbe1578d3746ed7eb Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.873764 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:27 crc kubenswrapper[5129]: E0314 07:02:27.874687 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:28.374667616 +0000 UTC m=+211.126582800 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.976210 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:27 crc kubenswrapper[5129]: E0314 07:02:27.976524 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:28.476425971 +0000 UTC m=+211.228341155 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.976972 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:27 crc kubenswrapper[5129]: E0314 07:02:27.977640 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:28.477629076 +0000 UTC m=+211.229544260 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:27 crc kubenswrapper[5129]: I0314 07:02:27.988247 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" event={"ID":"7d8d49c8-334d-45ba-bd88-2bd200200093","Type":"ContainerStarted","Data":"3cb8df160d43fb2059302e8184e32b268f41c6326746ba24fe063c5a61ee49a6"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.000161 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5" event={"ID":"021f1f3b-8ee9-424f-9c04-c56631332e92","Type":"ContainerStarted","Data":"d78d5a7174eb10dce09651d1b84244d39b7ad36c4f9c6d9f01a997b9baba6566"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.017757 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-f2lk5" event={"ID":"deb9ddae-7954-4db6-9864-bae7e7c25744","Type":"ContainerStarted","Data":"c49f8aad446e3b178601599180b7399d46f9415518cf369e4d6b404af8eb4c1b"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.034229 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-cqxl7" podStartSLOduration=160.034209129 podStartE2EDuration="2m40.034209129s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:28.032225291 +0000 UTC m=+210.784140475" watchObservedRunningTime="2026-03-14 
07:02:28.034209129 +0000 UTC m=+210.786124313" Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.063895 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2b5g4" event={"ID":"5516ae48-72a7-4f9e-9dd8-30618817f0c8","Type":"ContainerStarted","Data":"f5d8c1e9602ce5d1bca6005fd4ba5f72564fa72506e71d13d030d0db176dd060"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.079165 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.079952 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5" podStartSLOduration=148.079927857 podStartE2EDuration="2m28.079927857s" podCreationTimestamp="2026-03-14 07:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:28.059056206 +0000 UTC m=+210.810971390" watchObservedRunningTime="2026-03-14 07:02:28.079927857 +0000 UTC m=+210.831843051" Mar 14 07:02:28 crc kubenswrapper[5129]: E0314 07:02:28.081358 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:28.581339869 +0000 UTC m=+211.333255213 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.096917 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557862-lf4bl" event={"ID":"e9e49a0a-8f9f-4e78-8098-d195fe3297bd","Type":"ContainerStarted","Data":"350d95a6e97105a727ca14d7a07083a8a5dd2015966c62146aa5a1ab93259216"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.116589 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-m7rjz" event={"ID":"42d74e97-1c27-4447-afe2-d10bb4b3a1b6","Type":"ContainerStarted","Data":"1b0b5360ed04c8e9a1e80de1f88428cbfbacce7896f75b3094b41706c34741e3"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.182231 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.183314 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lr6qd" event={"ID":"e16d3836-813b-4a25-8dbb-8f5330008ec7","Type":"ContainerStarted","Data":"33cffa74de3ffffa2cf03153517cf21b9b719f3f109a5cfc798ee7c7bd8e7eb8"} Mar 14 07:02:28 crc kubenswrapper[5129]: E0314 07:02:28.185058 5129 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:28.68504004 +0000 UTC m=+211.436955414 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.223285 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a2bdf093ab123602112279fc36843fba53650099a3ee308f9d3c18d10acdd0c0"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.241833 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6b3427002c4950e0d8e18ef576a5f837c3eaef32ee6dfd8dbe1578d3746ed7eb"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.270330 5129 generic.go:334] "Generic (PLEG): container finished" podID="b99871bf-2660-4b7c-8ba4-e9cdae008681" containerID="5a70010d8333a1e53072fcb40971d999488aa575e0b8ed6623ad9a06f40ebcae" exitCode=0 Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.271093 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-trtr5" 
event={"ID":"b99871bf-2660-4b7c-8ba4-e9cdae008681","Type":"ContainerDied","Data":"5a70010d8333a1e53072fcb40971d999488aa575e0b8ed6623ad9a06f40ebcae"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.284937 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:28 crc kubenswrapper[5129]: E0314 07:02:28.286090 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:28.786073095 +0000 UTC m=+211.537988279 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.310533 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"887c7c4f6b7c1a5358cd72797d6ae51db4f5e99a949328430002b05723acee82"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.336798 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mss56" 
event={"ID":"a97c3ccc-71bd-4f3e-8457-5e023732cb5b","Type":"ContainerStarted","Data":"db488d8a70565c81d25791f544d27e51b6b85fb0ace5da8b217d54b2d9f1ef24"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.380860 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" event={"ID":"41165ac7-1905-437e-b313-2917b3f80168","Type":"ContainerStarted","Data":"15c458bb22e4db570157a1ce39fd51e402aa605606bf9898e2662ac597b0b469"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.382839 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.390730 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:28 crc kubenswrapper[5129]: E0314 07:02:28.392271 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:28.892247138 +0000 UTC m=+211.644162322 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.406741 5129 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-kmpng container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused" start-of-body= Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.406811 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" podUID="41165ac7-1905-437e-b313-2917b3f80168" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused" Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.426893 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6z96" event={"ID":"0d03542f-7483-4aea-9651-4ab9a4ff1378","Type":"ContainerStarted","Data":"a73448c3519a6e5fbe5f1d76448c389f86819ced5f33104ca96cd92deadd118d"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.495748 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:28 crc 
kubenswrapper[5129]: E0314 07:02:28.497653 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:28.997614488 +0000 UTC m=+211.749529672 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.504817 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" event={"ID":"e61dec00-008c-454f-a37d-87b8a21d4733","Type":"ContainerStarted","Data":"8dc0f863d53834d7706d20b11213f0b4d27673106cf413f54d38843f0828eacd"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.537877 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj" event={"ID":"fa4b0c05-3ff2-4e52-b028-603d2eb22adc","Type":"ContainerStarted","Data":"548c0ea9103d6a1a716b4c3563eaba4dbddacd3e061735e0f6b704ede1672228"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.544093 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-msrqq" event={"ID":"581cb821-3eab-416b-bb1e-013db618ed67","Type":"ContainerStarted","Data":"8eebe540f787be169fc2bc3d47756668001626f85253e269c8c97e4ba0208aab"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.544143 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-msrqq" event={"ID":"581cb821-3eab-416b-bb1e-013db618ed67","Type":"ContainerStarted","Data":"6b925a57d3f966eb9709eb9949453d993d0065773b55bf439ce87928a56bf141"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.565395 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:28 crc kubenswrapper[5129]: [-]has-synced failed: reason withheld Mar 14 07:02:28 crc kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:28 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.565439 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.570437 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nfv9f" event={"ID":"2adf8fa0-4f99-4e35-b7d5-b1f7d6aa392a","Type":"ContainerStarted","Data":"7e5da12be67de550c442d5a78688a78a3882f8559be0d8b3d2019cc9a5dfc5c8"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.588142 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tm2rg" event={"ID":"ce155720-ca35-4e8d-8b93-1d20fe7368e6","Type":"ContainerStarted","Data":"a931c02a72154a770190ead40e6fdad24928044f853b50182659eeb5e92c7d8b"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.589170 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tm2rg" Mar 14 07:02:28 
crc kubenswrapper[5129]: I0314 07:02:28.590856 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rc492" event={"ID":"0986b18b-4eea-4f98-b246-017c17cbe895","Type":"ContainerStarted","Data":"e11fbf19c680d6885e4bb507504a5a96ba7d4a2e15eb1223fb507bc5c7d3163d"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.599740 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.603211 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g" event={"ID":"5facc307-168d-4313-868a-8bf0db5e65ca","Type":"ContainerStarted","Data":"75f0093ac32e055bb21d119c6eadd39303b09ea641b9c78c6d625e74ad777e94"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.603257 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g" event={"ID":"5facc307-168d-4313-868a-8bf0db5e65ca","Type":"ContainerStarted","Data":"43adcf023b222cb28cee4624aacf222455e003bf4956cbdf634dac77ed6ba659"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.604102 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g" Mar 14 07:02:28 crc kubenswrapper[5129]: E0314 07:02:28.611313 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-14 07:02:29.111284258 +0000 UTC m=+211.863199442 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.622533 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" event={"ID":"7a298896-ef40-44ff-a9ba-45fba603014b","Type":"ContainerStarted","Data":"4c4b76bcb68a3ceacafa71db66403bba08494ef61e1651bf7b91e60e19f36504"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.622575 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" event={"ID":"7a298896-ef40-44ff-a9ba-45fba603014b","Type":"ContainerStarted","Data":"28fc0e8cf5aa88ada0759ce3b09f653b36f6e8df2b133f6c1b614d4358079722"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.623400 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.629771 5129 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lf26g container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.629831 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g" 
podUID="5facc307-168d-4313-868a-8bf0db5e65ca" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.645124 5129 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cb8c5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.645541 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" podUID="7a298896-ef40-44ff-a9ba-45fba603014b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.650436 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc" event={"ID":"50a96d01-d66b-4110-a608-aacd554c5111","Type":"ContainerStarted","Data":"503c33b0a01aa023d9104fb36714c5bc80d8f72eab7e6d359a91f51fc2250ae5"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.650485 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc" event={"ID":"50a96d01-d66b-4110-a608-aacd554c5111","Type":"ContainerStarted","Data":"14edc9f4287f19dd60c9339b1220922868ce0580ed740bf100895a7912cd0b1e"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.651334 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc" Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.674961 5129 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-2snfc 
container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.675022 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc" podUID="50a96d01-d66b-4110-a608-aacd554c5111" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.693521 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pphjw" event={"ID":"14a59e71-3fa0-4957-b79e-927190083c0d","Type":"ContainerStarted","Data":"268014eb6d15ce718dbf5467fc325ceaebba693d4c61d0302b44eb8738e674a1"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.701044 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:28 crc kubenswrapper[5129]: E0314 07:02:28.702811 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:29.202790838 +0000 UTC m=+211.954706022 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.733017 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2" event={"ID":"285a43ca-3a99-4b85-96b3-6b8619bb5853","Type":"ContainerStarted","Data":"283a369fbea9238030c2274d0e51c5a83d106dfdef02d537e613acf192ad52b8"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.771796 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qjqs7" event={"ID":"a54d1367-e54c-4ebd-99f2-2ec9e8a449ec","Type":"ContainerStarted","Data":"04f95e32a6dcf5da88ee909e216665ce174fbd2bb6b36f97cd44a7ea930518c6"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.771857 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qjqs7" event={"ID":"a54d1367-e54c-4ebd-99f2-2ec9e8a449ec","Type":"ContainerStarted","Data":"a01b728a0eedd2c2e614b90cf5e7593a0984ef4ccae9fbcdaba3d93698d4a26c"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.816744 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.817981 5129 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rrt62" event={"ID":"67f3717d-a52d-46b9-9132-044239d564c3","Type":"ContainerStarted","Data":"58cd3cec30aa955400c8cdf4cb92e53820fd61a95faa408091986dcc904701dc"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.818022 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rrt62" event={"ID":"67f3717d-a52d-46b9-9132-044239d564c3","Type":"ContainerStarted","Data":"8b084c85fccc929a51d70322e42580bd812a044fb63d682fff18d39922a2232b"} Mar 14 07:02:28 crc kubenswrapper[5129]: E0314 07:02:28.818439 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:29.318424664 +0000 UTC m=+212.070339928 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.868368 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t9ngb" event={"ID":"72b1d134-f49d-4b86-9630-a7d851ac1c40","Type":"ContainerStarted","Data":"2fde2ef4583fe6ed1b4e116aa337dfe9535e2d995acd6310795ae335939a41e7"} Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.888844 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6l8xj" Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 
07:02:28.917398 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:28 crc kubenswrapper[5129]: E0314 07:02:28.918910 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:29.418896003 +0000 UTC m=+212.170811177 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.950001 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" podStartSLOduration=160.949981009 podStartE2EDuration="2m40.949981009s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:28.899827242 +0000 UTC m=+211.651742426" watchObservedRunningTime="2026-03-14 07:02:28.949981009 +0000 UTC m=+211.701896193" Mar 14 07:02:28 crc kubenswrapper[5129]: I0314 07:02:28.950526 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tm2rg" podStartSLOduration=160.950520975 podStartE2EDuration="2m40.950520975s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:28.949232548 +0000 UTC m=+211.701147732" watchObservedRunningTime="2026-03-14 07:02:28.950520975 +0000 UTC m=+211.702436159" Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.018289 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2" podStartSLOduration=161.018257679 podStartE2EDuration="2m41.018257679s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:29.016747506 +0000 UTC m=+211.768662690" watchObservedRunningTime="2026-03-14 07:02:29.018257679 +0000 UTC m=+211.770172863" Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.019846 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:29 crc kubenswrapper[5129]: E0314 07:02:29.020441 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:29.520423732 +0000 UTC m=+212.272338916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.076216 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qjqs7" podStartSLOduration=161.076192711 podStartE2EDuration="2m41.076192711s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:29.073235496 +0000 UTC m=+211.825150700" watchObservedRunningTime="2026-03-14 07:02:29.076192711 +0000 UTC m=+211.828107905" Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.123028 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:29 crc kubenswrapper[5129]: E0314 07:02:29.123404 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:29.623382922 +0000 UTC m=+212.375298106 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.200159 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lr6qd" podStartSLOduration=161.200137656 podStartE2EDuration="2m41.200137656s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:29.157019443 +0000 UTC m=+211.908934627" watchObservedRunningTime="2026-03-14 07:02:29.200137656 +0000 UTC m=+211.952052840" Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.220249 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g" podStartSLOduration=161.220229966 podStartE2EDuration="2m41.220229966s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:29.200069014 +0000 UTC m=+211.951984198" watchObservedRunningTime="2026-03-14 07:02:29.220229966 +0000 UTC m=+211.972145150" Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.227039 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:29 crc kubenswrapper[5129]: E0314 07:02:29.227455 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:29.727445624 +0000 UTC m=+212.479360808 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.259486 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nfv9f" podStartSLOduration=161.259465738 podStartE2EDuration="2m41.259465738s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:29.226352273 +0000 UTC m=+211.978267467" watchObservedRunningTime="2026-03-14 07:02:29.259465738 +0000 UTC m=+212.011380922" Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.268894 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc" podStartSLOduration=161.26888094 podStartE2EDuration="2m41.26888094s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:29.268424627 +0000 UTC m=+212.020339821" watchObservedRunningTime="2026-03-14 07:02:29.26888094 +0000 UTC m=+212.020796124" Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.303025 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" podStartSLOduration=162.303008535 podStartE2EDuration="2m42.303008535s" podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:29.302105299 +0000 UTC m=+212.054020473" watchObservedRunningTime="2026-03-14 07:02:29.303008535 +0000 UTC m=+212.054923719" Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.328383 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:29 crc kubenswrapper[5129]: E0314 07:02:29.328781 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:29.828754077 +0000 UTC m=+212.580669261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.328987 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:29 crc kubenswrapper[5129]: E0314 07:02:29.329278 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:29.829266912 +0000 UTC m=+212.581182096 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.337943 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mss56" podStartSLOduration=161.337924641 podStartE2EDuration="2m41.337924641s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:29.33406504 +0000 UTC m=+212.085980224" watchObservedRunningTime="2026-03-14 07:02:29.337924641 +0000 UTC m=+212.089839825" Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.412800 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj" podStartSLOduration=161.412777621 podStartE2EDuration="2m41.412777621s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:29.383332382 +0000 UTC m=+212.135247566" watchObservedRunningTime="2026-03-14 07:02:29.412777621 +0000 UTC m=+212.164692815" Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.413644 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-msrqq" podStartSLOduration=161.413635276 podStartE2EDuration="2m41.413635276s" podCreationTimestamp="2026-03-14 06:59:48 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:29.410279079 +0000 UTC m=+212.162194263" watchObservedRunningTime="2026-03-14 07:02:29.413635276 +0000 UTC m=+212.165550460" Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.430193 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:29 crc kubenswrapper[5129]: E0314 07:02:29.430374 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:29.930348938 +0000 UTC m=+212.682264122 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.430498 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:29 crc kubenswrapper[5129]: E0314 07:02:29.430904 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:29.930895095 +0000 UTC m=+212.682810279 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.434579 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" podStartSLOduration=161.43456057 podStartE2EDuration="2m41.43456057s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:29.43420055 +0000 UTC m=+212.186115734" watchObservedRunningTime="2026-03-14 07:02:29.43456057 +0000 UTC m=+212.186475754" Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.456277 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rrt62" podStartSLOduration=161.456255226 podStartE2EDuration="2m41.456255226s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:29.453467696 +0000 UTC m=+212.205382880" watchObservedRunningTime="2026-03-14 07:02:29.456255226 +0000 UTC m=+212.208170410" Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.531265 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:29 crc kubenswrapper[5129]: E0314 07:02:29.531672 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:30.031636041 +0000 UTC m=+212.783551225 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.531796 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:29 crc kubenswrapper[5129]: E0314 07:02:29.532201 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:30.032185667 +0000 UTC m=+212.784100851 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.559149 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:29 crc kubenswrapper[5129]: [-]has-synced failed: reason withheld Mar 14 07:02:29 crc kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:29 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.559216 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.632793 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:29 crc kubenswrapper[5129]: E0314 07:02:29.633030 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-14 07:02:30.133004285 +0000 UTC m=+212.884919469 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.633082 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:29 crc kubenswrapper[5129]: E0314 07:02:29.633388 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:30.133376716 +0000 UTC m=+212.885291900 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.734639 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:29 crc kubenswrapper[5129]: E0314 07:02:29.734959 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:30.234944976 +0000 UTC m=+212.986860160 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.835505 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:29 crc kubenswrapper[5129]: E0314 07:02:29.835848 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:30.335836747 +0000 UTC m=+213.087751931 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.881771 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-f2lk5" event={"ID":"deb9ddae-7954-4db6-9864-bae7e7c25744","Type":"ContainerStarted","Data":"2dda06ace08aa970fcbbd08cbd4dae4b65f3563d66de44bf0b175cfddd1b9b89"} Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.892835 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a734e723116d4db3e8a7ffd8ff86508f3f85bd99e6f68908306d0d552dcedf3d"} Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.899527 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"87e536ec1792acd2631a39ca5968c55ec41871d80b29f67d14425dc3203073d3"} Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.900116 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.908081 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d5bdj" 
event={"ID":"fa4b0c05-3ff2-4e52-b028-603d2eb22adc","Type":"ContainerStarted","Data":"42a173c8b1d11aaffd283e1775cff680ea2ba0042f1d0c22c5e2e8ee316855c5"} Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.916849 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qjqs7" event={"ID":"a54d1367-e54c-4ebd-99f2-2ec9e8a449ec","Type":"ContainerStarted","Data":"5488d2457d93bc7c47290a5e85382b3bc8495db246740fed9f3bca6e071662d1"} Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.937514 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:29 crc kubenswrapper[5129]: E0314 07:02:29.937710 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:30.437684115 +0000 UTC m=+213.189599299 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.937774 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:29 crc kubenswrapper[5129]: E0314 07:02:29.938268 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:30.438247732 +0000 UTC m=+213.190162926 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.957785 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tm2rg" event={"ID":"ce155720-ca35-4e8d-8b93-1d20fe7368e6","Type":"ContainerStarted","Data":"69ce3c03085d6d2ec4ee1d104cc4b61eab428b911815c8eb240952e6109e5941"} Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.958847 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-f2lk5" podStartSLOduration=8.958826705 podStartE2EDuration="8.958826705s" podCreationTimestamp="2026-03-14 07:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:29.923839486 +0000 UTC m=+212.675754670" watchObservedRunningTime="2026-03-14 07:02:29.958826705 +0000 UTC m=+212.710741889" Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.961257 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"50d98527730c308122316026be08b104dcbcb51382e8e558343d9fa1edb433bc"} Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.964144 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pphjw" 
event={"ID":"14a59e71-3fa0-4957-b79e-927190083c0d","Type":"ContainerStarted","Data":"ab76aa3d1efffe7f97950d833c6fcb478d9513c9f863316fd1714068f200ca64"} Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.966428 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t9ngb" event={"ID":"72b1d134-f49d-4b86-9630-a7d851ac1c40","Type":"ContainerStarted","Data":"ace293abc0830d5a2905a23b8ea37a2aabc8d1c79722e257a7737dea4360ef96"} Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.966455 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t9ngb" event={"ID":"72b1d134-f49d-4b86-9630-a7d851ac1c40","Type":"ContainerStarted","Data":"cb0ac15b8740a3f40ac8248cb56fd0ce36f5d22eba2566920b2174179787530f"} Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.966678 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-t9ngb" Mar 14 07:02:29 crc kubenswrapper[5129]: I0314 07:02:29.993792 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dgls2" event={"ID":"285a43ca-3a99-4b85-96b3-6b8619bb5853","Type":"ContainerStarted","Data":"a4091cc87942674e54e364bdc65279cd04328d2ece860fc3302fbc96f373aa83"} Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.002392 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2b5g4" event={"ID":"5516ae48-72a7-4f9e-9dd8-30618817f0c8","Type":"ContainerStarted","Data":"f99e294424b17e4be6b03b04f4237a293038a1da17f651341c9fd12713535237"} Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.010316 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-trtr5" event={"ID":"b99871bf-2660-4b7c-8ba4-e9cdae008681","Type":"ContainerStarted","Data":"ab32cf9897dcb62fadb4f562ac1234d0e610e14d8f83565a5ebf1fbd80c595e2"} Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 
07:02:30.010353 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-trtr5" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.012514 5129 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cb8c5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.012564 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" podUID="7a298896-ef40-44ff-a9ba-45fba603014b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.012649 5129 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lf26g container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.012711 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g" podUID="5facc307-168d-4313-868a-8bf0db5e65ca" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.033248 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kmpng" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.040041 5129 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:30 crc kubenswrapper[5129]: E0314 07:02:30.042123 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:30.542104068 +0000 UTC m=+213.294019302 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.072529 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2snfc" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.076846 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-t9ngb" podStartSLOduration=9.07683415 podStartE2EDuration="9.07683415s" podCreationTimestamp="2026-03-14 07:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:30.051924481 +0000 UTC m=+212.803839665" watchObservedRunningTime="2026-03-14 07:02:30.07683415 +0000 UTC m=+212.828749334" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.106585 5129 ???:1] 
"http: TLS handshake error from 192.168.126.11:35720: no serving certificate available for the kubelet" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.115212 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-pphjw" podStartSLOduration=162.115190857 podStartE2EDuration="2m42.115190857s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:30.078990872 +0000 UTC m=+212.830906056" watchObservedRunningTime="2026-03-14 07:02:30.115190857 +0000 UTC m=+212.867106041" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.142029 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:30 crc kubenswrapper[5129]: E0314 07:02:30.163724 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:30.663706046 +0000 UTC m=+213.415621230 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.244657 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:30 crc kubenswrapper[5129]: E0314 07:02:30.245797 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:30.745771594 +0000 UTC m=+213.497686778 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.277673 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-2b5g4" podStartSLOduration=162.277653184 podStartE2EDuration="2m42.277653184s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:30.214194783 +0000 UTC m=+212.966109967" watchObservedRunningTime="2026-03-14 07:02:30.277653184 +0000 UTC m=+213.029568388" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.278585 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-trtr5" podStartSLOduration=163.27858011 podStartE2EDuration="2m43.27858011s" podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:30.263711462 +0000 UTC m=+213.015626666" watchObservedRunningTime="2026-03-14 07:02:30.27858011 +0000 UTC m=+213.030495294" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.348698 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: 
\"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:30 crc kubenswrapper[5129]: E0314 07:02:30.349058 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:30.849046933 +0000 UTC m=+213.600962117 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.450290 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:30 crc kubenswrapper[5129]: E0314 07:02:30.450523 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:30.950496241 +0000 UTC m=+213.702411425 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.551947 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:30 crc kubenswrapper[5129]: E0314 07:02:30.552290 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:31.052278017 +0000 UTC m=+213.804193201 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.553269 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:30 crc kubenswrapper[5129]: [-]has-synced failed: reason withheld Mar 14 07:02:30 crc kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:30 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.553323 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.577816 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.578432 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.584043 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.584421 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.621443 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.661316 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:30 crc kubenswrapper[5129]: E0314 07:02:30.661491 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:31.161462187 +0000 UTC m=+213.913377381 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.661540 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28af9c23-fb3d-4781-bc04-501f7573558a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"28af9c23-fb3d-4781-bc04-501f7573558a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.661641 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.661830 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28af9c23-fb3d-4781-bc04-501f7573558a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"28af9c23-fb3d-4781-bc04-501f7573558a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:02:30 crc kubenswrapper[5129]: E0314 07:02:30.661958 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-14 07:02:31.16192568 +0000 UTC m=+213.913840864 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.762998 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.763191 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28af9c23-fb3d-4781-bc04-501f7573558a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"28af9c23-fb3d-4781-bc04-501f7573558a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.763264 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28af9c23-fb3d-4781-bc04-501f7573558a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"28af9c23-fb3d-4781-bc04-501f7573558a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.763532 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/28af9c23-fb3d-4781-bc04-501f7573558a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"28af9c23-fb3d-4781-bc04-501f7573558a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:02:30 crc kubenswrapper[5129]: E0314 07:02:30.763539 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:31.263508921 +0000 UTC m=+214.015424105 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.805228 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28af9c23-fb3d-4781-bc04-501f7573558a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"28af9c23-fb3d-4781-bc04-501f7573558a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.865125 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:30 crc kubenswrapper[5129]: E0314 07:02:30.865863 5129 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:31.365845134 +0000 UTC m=+214.117760308 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.896339 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.966483 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:30 crc kubenswrapper[5129]: E0314 07:02:30.966668 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:31.466644682 +0000 UTC m=+214.218559866 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:30 crc kubenswrapper[5129]: I0314 07:02:30.966702 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:30 crc kubenswrapper[5129]: E0314 07:02:30.966990 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:31.466981411 +0000 UTC m=+214.218896595 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.018018 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rc492" event={"ID":"0986b18b-4eea-4f98-b246-017c17cbe895","Type":"ContainerStarted","Data":"db7bf47132ecd1eef8ee45eeca6e0e13f6b415d89098466582ebcc6709bac8e8"} Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.020096 5129 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cb8c5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.020141 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" podUID="7a298896-ef40-44ff-a9ba-45fba603014b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.020472 5129 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-trtr5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.020500 5129 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-trtr5" podUID="b99871bf-2660-4b7c-8ba4-e9cdae008681" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.067273 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:31 crc kubenswrapper[5129]: E0314 07:02:31.068454 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:31.568435918 +0000 UTC m=+214.320351102 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.099878 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xgc7f"] Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.100461 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" podUID="e8915d74-77d5-4d0f-9264-37b6f8167a6d" containerName="controller-manager" containerID="cri-o://8542d355adf73f69f07f43de1576c808ff498e7de823e53e95d3e3374dcb57e6" gracePeriod=30 Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.158314 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8"] Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.158558 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" podUID="4b81a68d-81ce-4406-ae24-511cda2d8936" containerName="route-controller-manager" containerID="cri-o://10da8edb1e01125740eea3eff8feaaf82de83bbeebcbf27a69dd5a26adc68098" gracePeriod=30 Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.171227 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: 
\"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:31 crc kubenswrapper[5129]: E0314 07:02:31.172036 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:31.672019277 +0000 UTC m=+214.423934461 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.265357 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.275184 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:31 crc kubenswrapper[5129]: E0314 07:02:31.275676 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:31.775654857 +0000 UTC m=+214.527570041 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.379392 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:31 crc kubenswrapper[5129]: E0314 07:02:31.379827 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:31.879810611 +0000 UTC m=+214.631725795 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.482439 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:31 crc kubenswrapper[5129]: E0314 07:02:31.482936 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:31.982921646 +0000 UTC m=+214.734836830 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.552801 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:31 crc kubenswrapper[5129]: [-]has-synced failed: reason withheld Mar 14 07:02:31 crc kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:31 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.552867 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.585663 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:31 crc kubenswrapper[5129]: E0314 07:02:31.585966 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-14 07:02:32.085954739 +0000 UTC m=+214.837869923 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.606781 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.686105 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.686145 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89tfr\" (UniqueName: \"kubernetes.io/projected/4b81a68d-81ce-4406-ae24-511cda2d8936-kube-api-access-89tfr\") pod \"4b81a68d-81ce-4406-ae24-511cda2d8936\" (UID: \"4b81a68d-81ce-4406-ae24-511cda2d8936\") " Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.686189 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b81a68d-81ce-4406-ae24-511cda2d8936-serving-cert\") pod \"4b81a68d-81ce-4406-ae24-511cda2d8936\" (UID: \"4b81a68d-81ce-4406-ae24-511cda2d8936\") " Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.686219 5129 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b81a68d-81ce-4406-ae24-511cda2d8936-config\") pod \"4b81a68d-81ce-4406-ae24-511cda2d8936\" (UID: \"4b81a68d-81ce-4406-ae24-511cda2d8936\") " Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.686239 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b81a68d-81ce-4406-ae24-511cda2d8936-client-ca\") pod \"4b81a68d-81ce-4406-ae24-511cda2d8936\" (UID: \"4b81a68d-81ce-4406-ae24-511cda2d8936\") " Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.687198 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b81a68d-81ce-4406-ae24-511cda2d8936-client-ca" (OuterVolumeSpecName: "client-ca") pod "4b81a68d-81ce-4406-ae24-511cda2d8936" (UID: "4b81a68d-81ce-4406-ae24-511cda2d8936"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:02:31 crc kubenswrapper[5129]: E0314 07:02:31.687277 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:32.187260542 +0000 UTC m=+214.939175726 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.688172 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b81a68d-81ce-4406-ae24-511cda2d8936-config" (OuterVolumeSpecName: "config") pod "4b81a68d-81ce-4406-ae24-511cda2d8936" (UID: "4b81a68d-81ce-4406-ae24-511cda2d8936"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.699361 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b81a68d-81ce-4406-ae24-511cda2d8936-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4b81a68d-81ce-4406-ae24-511cda2d8936" (UID: "4b81a68d-81ce-4406-ae24-511cda2d8936"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.701844 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b81a68d-81ce-4406-ae24-511cda2d8936-kube-api-access-89tfr" (OuterVolumeSpecName: "kube-api-access-89tfr") pod "4b81a68d-81ce-4406-ae24-511cda2d8936" (UID: "4b81a68d-81ce-4406-ae24-511cda2d8936"). InnerVolumeSpecName "kube-api-access-89tfr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.788156 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.788582 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b81a68d-81ce-4406-ae24-511cda2d8936-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.788601 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b81a68d-81ce-4406-ae24-511cda2d8936-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.788624 5129 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b81a68d-81ce-4406-ae24-511cda2d8936-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.788636 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89tfr\" (UniqueName: \"kubernetes.io/projected/4b81a68d-81ce-4406-ae24-511cda2d8936-kube-api-access-89tfr\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:31 crc kubenswrapper[5129]: E0314 07:02:31.788936 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:32.288920454 +0000 UTC m=+215.040835638 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.796967 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82"] Mar 14 07:02:31 crc kubenswrapper[5129]: E0314 07:02:31.797172 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b81a68d-81ce-4406-ae24-511cda2d8936" containerName="route-controller-manager" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.797184 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b81a68d-81ce-4406-ae24-511cda2d8936" containerName="route-controller-manager" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.797304 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b81a68d-81ce-4406-ae24-511cda2d8936" containerName="route-controller-manager" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.797690 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.812735 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82"] Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.848365 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.889641 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8915d74-77d5-4d0f-9264-37b6f8167a6d-config\") pod \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.889785 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.889820 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8915d74-77d5-4d0f-9264-37b6f8167a6d-client-ca\") pod \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.889848 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8752z\" (UniqueName: \"kubernetes.io/projected/e8915d74-77d5-4d0f-9264-37b6f8167a6d-kube-api-access-8752z\") pod \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.889869 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e8915d74-77d5-4d0f-9264-37b6f8167a6d-proxy-ca-bundles\") pod \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.889947 5129 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8915d74-77d5-4d0f-9264-37b6f8167a6d-serving-cert\") pod \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\" (UID: \"e8915d74-77d5-4d0f-9264-37b6f8167a6d\") " Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.890066 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dc5638-0119-4e96-aa52-e6a61f6aa25e-serving-cert\") pod \"route-controller-manager-6bdb4cfc77-xxf82\" (UID: \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\") " pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.890091 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dc5638-0119-4e96-aa52-e6a61f6aa25e-client-ca\") pod \"route-controller-manager-6bdb4cfc77-xxf82\" (UID: \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\") " pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.890120 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hvr4\" (UniqueName: \"kubernetes.io/projected/16dc5638-0119-4e96-aa52-e6a61f6aa25e-kube-api-access-4hvr4\") pod \"route-controller-manager-6bdb4cfc77-xxf82\" (UID: \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\") " pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.890192 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dc5638-0119-4e96-aa52-e6a61f6aa25e-config\") pod \"route-controller-manager-6bdb4cfc77-xxf82\" (UID: \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\") " 
pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.891094 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8915d74-77d5-4d0f-9264-37b6f8167a6d-config" (OuterVolumeSpecName: "config") pod "e8915d74-77d5-4d0f-9264-37b6f8167a6d" (UID: "e8915d74-77d5-4d0f-9264-37b6f8167a6d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:02:31 crc kubenswrapper[5129]: E0314 07:02:31.891159 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:32.391144954 +0000 UTC m=+215.143060138 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.891572 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8915d74-77d5-4d0f-9264-37b6f8167a6d-client-ca" (OuterVolumeSpecName: "client-ca") pod "e8915d74-77d5-4d0f-9264-37b6f8167a6d" (UID: "e8915d74-77d5-4d0f-9264-37b6f8167a6d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.892142 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8915d74-77d5-4d0f-9264-37b6f8167a6d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e8915d74-77d5-4d0f-9264-37b6f8167a6d" (UID: "e8915d74-77d5-4d0f-9264-37b6f8167a6d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.897279 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8915d74-77d5-4d0f-9264-37b6f8167a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e8915d74-77d5-4d0f-9264-37b6f8167a6d" (UID: "e8915d74-77d5-4d0f-9264-37b6f8167a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.898258 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8915d74-77d5-4d0f-9264-37b6f8167a6d-kube-api-access-8752z" (OuterVolumeSpecName: "kube-api-access-8752z") pod "e8915d74-77d5-4d0f-9264-37b6f8167a6d" (UID: "e8915d74-77d5-4d0f-9264-37b6f8167a6d"). InnerVolumeSpecName "kube-api-access-8752z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.898615 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6t6l8"] Mar 14 07:02:31 crc kubenswrapper[5129]: E0314 07:02:31.898862 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8915d74-77d5-4d0f-9264-37b6f8167a6d" containerName="controller-manager" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.898886 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8915d74-77d5-4d0f-9264-37b6f8167a6d" containerName="controller-manager" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.899009 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8915d74-77d5-4d0f-9264-37b6f8167a6d" containerName="controller-manager" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.900007 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6t6l8" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.902129 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.913139 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6t6l8"] Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.965818 5129 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.991443 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dc5638-0119-4e96-aa52-e6a61f6aa25e-config\") pod \"route-controller-manager-6bdb4cfc77-xxf82\" (UID: \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\") " 
pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.991507 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a848f19c-da50-403e-b620-5425b51fab9a-catalog-content\") pod \"certified-operators-6t6l8\" (UID: \"a848f19c-da50-403e-b620-5425b51fab9a\") " pod="openshift-marketplace/certified-operators-6t6l8" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.991537 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dc5638-0119-4e96-aa52-e6a61f6aa25e-serving-cert\") pod \"route-controller-manager-6bdb4cfc77-xxf82\" (UID: \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\") " pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.991564 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dc5638-0119-4e96-aa52-e6a61f6aa25e-client-ca\") pod \"route-controller-manager-6bdb4cfc77-xxf82\" (UID: \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\") " pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.991596 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a848f19c-da50-403e-b620-5425b51fab9a-utilities\") pod \"certified-operators-6t6l8\" (UID: \"a848f19c-da50-403e-b620-5425b51fab9a\") " pod="openshift-marketplace/certified-operators-6t6l8" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.991705 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hvr4\" (UniqueName: 
\"kubernetes.io/projected/16dc5638-0119-4e96-aa52-e6a61f6aa25e-kube-api-access-4hvr4\") pod \"route-controller-manager-6bdb4cfc77-xxf82\" (UID: \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\") " pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.991763 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwfmb\" (UniqueName: \"kubernetes.io/projected/a848f19c-da50-403e-b620-5425b51fab9a-kube-api-access-nwfmb\") pod \"certified-operators-6t6l8\" (UID: \"a848f19c-da50-403e-b620-5425b51fab9a\") " pod="openshift-marketplace/certified-operators-6t6l8" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.991806 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.991862 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8915d74-77d5-4d0f-9264-37b6f8167a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.991875 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8915d74-77d5-4d0f-9264-37b6f8167a6d-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.991888 5129 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8915d74-77d5-4d0f-9264-37b6f8167a6d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.991898 5129 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8752z\" (UniqueName: \"kubernetes.io/projected/e8915d74-77d5-4d0f-9264-37b6f8167a6d-kube-api-access-8752z\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.991910 5129 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e8915d74-77d5-4d0f-9264-37b6f8167a6d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:31 crc kubenswrapper[5129]: E0314 07:02:31.992241 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:32.49222542 +0000 UTC m=+215.244140664 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.992904 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dc5638-0119-4e96-aa52-e6a61f6aa25e-config\") pod \"route-controller-manager-6bdb4cfc77-xxf82\" (UID: \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\") " pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" Mar 14 07:02:31 crc kubenswrapper[5129]: I0314 07:02:31.992968 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dc5638-0119-4e96-aa52-e6a61f6aa25e-client-ca\") pod 
\"route-controller-manager-6bdb4cfc77-xxf82\" (UID: \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\") " pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.007931 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dc5638-0119-4e96-aa52-e6a61f6aa25e-serving-cert\") pod \"route-controller-manager-6bdb4cfc77-xxf82\" (UID: \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\") " pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.009077 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hvr4\" (UniqueName: \"kubernetes.io/projected/16dc5638-0119-4e96-aa52-e6a61f6aa25e-kube-api-access-4hvr4\") pod \"route-controller-manager-6bdb4cfc77-xxf82\" (UID: \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\") " pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.042412 5129 generic.go:334] "Generic (PLEG): container finished" podID="4b81a68d-81ce-4406-ae24-511cda2d8936" containerID="10da8edb1e01125740eea3eff8feaaf82de83bbeebcbf27a69dd5a26adc68098" exitCode=0 Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.042558 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.047341 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" event={"ID":"4b81a68d-81ce-4406-ae24-511cda2d8936","Type":"ContainerDied","Data":"10da8edb1e01125740eea3eff8feaaf82de83bbeebcbf27a69dd5a26adc68098"} Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.047392 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8" event={"ID":"4b81a68d-81ce-4406-ae24-511cda2d8936","Type":"ContainerDied","Data":"33711b805c5e25cd55e40c989fc60eff86b9de4bbf083bd87a875f6f1d9cb626"} Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.047411 5129 scope.go:117] "RemoveContainer" containerID="10da8edb1e01125740eea3eff8feaaf82de83bbeebcbf27a69dd5a26adc68098" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.047428 5129 generic.go:334] "Generic (PLEG): container finished" podID="021f1f3b-8ee9-424f-9c04-c56631332e92" containerID="d78d5a7174eb10dce09651d1b84244d39b7ad36c4f9c6d9f01a997b9baba6566" exitCode=0 Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.047535 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5" event={"ID":"021f1f3b-8ee9-424f-9c04-c56631332e92","Type":"ContainerDied","Data":"d78d5a7174eb10dce09651d1b84244d39b7ad36c4f9c6d9f01a997b9baba6566"} Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.050558 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rc492" event={"ID":"0986b18b-4eea-4f98-b246-017c17cbe895","Type":"ContainerStarted","Data":"2aaa6cc59413076d1856e914d9bc6d2e0977cebe60ab6540b7f28ae10033e783"} Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.052561 5129 generic.go:334] "Generic 
(PLEG): container finished" podID="e8915d74-77d5-4d0f-9264-37b6f8167a6d" containerID="8542d355adf73f69f07f43de1576c808ff498e7de823e53e95d3e3374dcb57e6" exitCode=0 Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.052643 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.052643 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" event={"ID":"e8915d74-77d5-4d0f-9264-37b6f8167a6d","Type":"ContainerDied","Data":"8542d355adf73f69f07f43de1576c808ff498e7de823e53e95d3e3374dcb57e6"} Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.052694 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xgc7f" event={"ID":"e8915d74-77d5-4d0f-9264-37b6f8167a6d","Type":"ContainerDied","Data":"f0040a5ad862ec92b613ac2596511147b20f77d1c6cc5c9856c4fdf45095bc38"} Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.054101 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"28af9c23-fb3d-4781-bc04-501f7573558a","Type":"ContainerStarted","Data":"e0f64e92051cfcc1bfae7c0ea2c48ab9f835e2ed0b0c5342d08560400cd846b2"} Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.054147 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"28af9c23-fb3d-4781-bc04-501f7573558a","Type":"ContainerStarted","Data":"180bec5ce2f95cc97cd1e7f336d22b3629f54c274334d9b4599d0286ada96fb2"} Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.089200 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gw4hz"] Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.090111 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gw4hz" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.093040 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.093303 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a848f19c-da50-403e-b620-5425b51fab9a-catalog-content\") pod \"certified-operators-6t6l8\" (UID: \"a848f19c-da50-403e-b620-5425b51fab9a\") " pod="openshift-marketplace/certified-operators-6t6l8" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.093357 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a848f19c-da50-403e-b620-5425b51fab9a-utilities\") pod \"certified-operators-6t6l8\" (UID: \"a848f19c-da50-403e-b620-5425b51fab9a\") " pod="openshift-marketplace/certified-operators-6t6l8" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.093433 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwfmb\" (UniqueName: \"kubernetes.io/projected/a848f19c-da50-403e-b620-5425b51fab9a-kube-api-access-nwfmb\") pod \"certified-operators-6t6l8\" (UID: \"a848f19c-da50-403e-b620-5425b51fab9a\") " pod="openshift-marketplace/certified-operators-6t6l8" Mar 14 07:02:32 crc kubenswrapper[5129]: E0314 07:02:32.094590 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-14 07:02:32.594567193 +0000 UTC m=+215.346482407 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.094947 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a848f19c-da50-403e-b620-5425b51fab9a-catalog-content\") pod \"certified-operators-6t6l8\" (UID: \"a848f19c-da50-403e-b620-5425b51fab9a\") " pod="openshift-marketplace/certified-operators-6t6l8" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.095051 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a848f19c-da50-403e-b620-5425b51fab9a-utilities\") pod \"certified-operators-6t6l8\" (UID: \"a848f19c-da50-403e-b620-5425b51fab9a\") " pod="openshift-marketplace/certified-operators-6t6l8" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.096288 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.102708 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.102689427 podStartE2EDuration="2.102689427s" podCreationTimestamp="2026-03-14 07:02:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:32.10140448 +0000 
UTC m=+214.853319664" watchObservedRunningTime="2026-03-14 07:02:32.102689427 +0000 UTC m=+214.854604611" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.114787 5129 scope.go:117] "RemoveContainer" containerID="10da8edb1e01125740eea3eff8feaaf82de83bbeebcbf27a69dd5a26adc68098" Mar 14 07:02:32 crc kubenswrapper[5129]: E0314 07:02:32.115281 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10da8edb1e01125740eea3eff8feaaf82de83bbeebcbf27a69dd5a26adc68098\": container with ID starting with 10da8edb1e01125740eea3eff8feaaf82de83bbeebcbf27a69dd5a26adc68098 not found: ID does not exist" containerID="10da8edb1e01125740eea3eff8feaaf82de83bbeebcbf27a69dd5a26adc68098" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.115314 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10da8edb1e01125740eea3eff8feaaf82de83bbeebcbf27a69dd5a26adc68098"} err="failed to get container status \"10da8edb1e01125740eea3eff8feaaf82de83bbeebcbf27a69dd5a26adc68098\": rpc error: code = NotFound desc = could not find container \"10da8edb1e01125740eea3eff8feaaf82de83bbeebcbf27a69dd5a26adc68098\": container with ID starting with 10da8edb1e01125740eea3eff8feaaf82de83bbeebcbf27a69dd5a26adc68098 not found: ID does not exist" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.115340 5129 scope.go:117] "RemoveContainer" containerID="8542d355adf73f69f07f43de1576c808ff498e7de823e53e95d3e3374dcb57e6" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.115644 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gw4hz"] Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.129138 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwfmb\" (UniqueName: \"kubernetes.io/projected/a848f19c-da50-403e-b620-5425b51fab9a-kube-api-access-nwfmb\") pod \"certified-operators-6t6l8\" (UID: 
\"a848f19c-da50-403e-b620-5425b51fab9a\") " pod="openshift-marketplace/certified-operators-6t6l8" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.145509 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.147041 5129 scope.go:117] "RemoveContainer" containerID="8542d355adf73f69f07f43de1576c808ff498e7de823e53e95d3e3374dcb57e6" Mar 14 07:02:32 crc kubenswrapper[5129]: E0314 07:02:32.147508 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8542d355adf73f69f07f43de1576c808ff498e7de823e53e95d3e3374dcb57e6\": container with ID starting with 8542d355adf73f69f07f43de1576c808ff498e7de823e53e95d3e3374dcb57e6 not found: ID does not exist" containerID="8542d355adf73f69f07f43de1576c808ff498e7de823e53e95d3e3374dcb57e6" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.147535 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8542d355adf73f69f07f43de1576c808ff498e7de823e53e95d3e3374dcb57e6"} err="failed to get container status \"8542d355adf73f69f07f43de1576c808ff498e7de823e53e95d3e3374dcb57e6\": rpc error: code = NotFound desc = could not find container \"8542d355adf73f69f07f43de1576c808ff498e7de823e53e95d3e3374dcb57e6\": container with ID starting with 8542d355adf73f69f07f43de1576c808ff498e7de823e53e95d3e3374dcb57e6 not found: ID does not exist" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.148847 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8"] Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.150967 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-98rd8"] Mar 14 07:02:32 crc 
kubenswrapper[5129]: I0314 07:02:32.159268 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xgc7f"] Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.162573 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xgc7f"] Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.194627 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsxdf\" (UniqueName: \"kubernetes.io/projected/ff0704b1-3b5e-4e02-9f82-c1d74ad03387-kube-api-access-vsxdf\") pod \"community-operators-gw4hz\" (UID: \"ff0704b1-3b5e-4e02-9f82-c1d74ad03387\") " pod="openshift-marketplace/community-operators-gw4hz" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.195268 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff0704b1-3b5e-4e02-9f82-c1d74ad03387-catalog-content\") pod \"community-operators-gw4hz\" (UID: \"ff0704b1-3b5e-4e02-9f82-c1d74ad03387\") " pod="openshift-marketplace/community-operators-gw4hz" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.195320 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.195355 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff0704b1-3b5e-4e02-9f82-c1d74ad03387-utilities\") pod \"community-operators-gw4hz\" (UID: 
\"ff0704b1-3b5e-4e02-9f82-c1d74ad03387\") " pod="openshift-marketplace/community-operators-gw4hz" Mar 14 07:02:32 crc kubenswrapper[5129]: E0314 07:02:32.195703 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 07:02:32.69568319 +0000 UTC m=+215.447598434 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pggtq" (UID: "f2a7f356-6278-409f-9047-efece8492b78") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.232296 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6t6l8" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.289685 5129 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-14T07:02:31.965849319Z","Handler":null,"Name":""} Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.293758 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8mmjl"] Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.296215 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.296497 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff0704b1-3b5e-4e02-9f82-c1d74ad03387-catalog-content\") pod \"community-operators-gw4hz\" (UID: \"ff0704b1-3b5e-4e02-9f82-c1d74ad03387\") " pod="openshift-marketplace/community-operators-gw4hz" Mar 14 07:02:32 crc kubenswrapper[5129]: E0314 07:02:32.296638 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 07:02:32.796589272 +0000 UTC m=+215.548504456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.296706 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff0704b1-3b5e-4e02-9f82-c1d74ad03387-utilities\") pod \"community-operators-gw4hz\" (UID: \"ff0704b1-3b5e-4e02-9f82-c1d74ad03387\") " pod="openshift-marketplace/community-operators-gw4hz" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.296791 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsxdf\" (UniqueName: \"kubernetes.io/projected/ff0704b1-3b5e-4e02-9f82-c1d74ad03387-kube-api-access-vsxdf\") pod \"community-operators-gw4hz\" (UID: \"ff0704b1-3b5e-4e02-9f82-c1d74ad03387\") " pod="openshift-marketplace/community-operators-gw4hz" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.297007 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff0704b1-3b5e-4e02-9f82-c1d74ad03387-catalog-content\") pod \"community-operators-gw4hz\" (UID: \"ff0704b1-3b5e-4e02-9f82-c1d74ad03387\") " pod="openshift-marketplace/community-operators-gw4hz" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.297259 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff0704b1-3b5e-4e02-9f82-c1d74ad03387-utilities\") pod \"community-operators-gw4hz\" (UID: \"ff0704b1-3b5e-4e02-9f82-c1d74ad03387\") " 
pod="openshift-marketplace/community-operators-gw4hz" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.297480 5129 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.297508 5129 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.297603 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8mmjl" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.310540 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8mmjl"] Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.331720 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsxdf\" (UniqueName: \"kubernetes.io/projected/ff0704b1-3b5e-4e02-9f82-c1d74ad03387-kube-api-access-vsxdf\") pod \"community-operators-gw4hz\" (UID: \"ff0704b1-3b5e-4e02-9f82-c1d74ad03387\") " pod="openshift-marketplace/community-operators-gw4hz" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.387557 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82"] Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.398427 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md9sm\" (UniqueName: \"kubernetes.io/projected/7470c785-0846-4e7f-9d1b-aad9a52e5a5b-kube-api-access-md9sm\") pod \"certified-operators-8mmjl\" (UID: \"7470c785-0846-4e7f-9d1b-aad9a52e5a5b\") " pod="openshift-marketplace/certified-operators-8mmjl" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 
07:02:32.398504 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.398562 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7470c785-0846-4e7f-9d1b-aad9a52e5a5b-utilities\") pod \"certified-operators-8mmjl\" (UID: \"7470c785-0846-4e7f-9d1b-aad9a52e5a5b\") " pod="openshift-marketplace/certified-operators-8mmjl" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.398584 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7470c785-0846-4e7f-9d1b-aad9a52e5a5b-catalog-content\") pod \"certified-operators-8mmjl\" (UID: \"7470c785-0846-4e7f-9d1b-aad9a52e5a5b\") " pod="openshift-marketplace/certified-operators-8mmjl" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.406057 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.406089 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.428357 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gw4hz" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.439283 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pggtq\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.489808 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gqcxv"] Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.490933 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gqcxv" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.500108 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.500326 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md9sm\" (UniqueName: \"kubernetes.io/projected/7470c785-0846-4e7f-9d1b-aad9a52e5a5b-kube-api-access-md9sm\") pod \"certified-operators-8mmjl\" (UID: \"7470c785-0846-4e7f-9d1b-aad9a52e5a5b\") " pod="openshift-marketplace/certified-operators-8mmjl" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.500409 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7470c785-0846-4e7f-9d1b-aad9a52e5a5b-utilities\") pod \"certified-operators-8mmjl\" (UID: \"7470c785-0846-4e7f-9d1b-aad9a52e5a5b\") " pod="openshift-marketplace/certified-operators-8mmjl" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.500431 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7470c785-0846-4e7f-9d1b-aad9a52e5a5b-catalog-content\") pod \"certified-operators-8mmjl\" (UID: \"7470c785-0846-4e7f-9d1b-aad9a52e5a5b\") " pod="openshift-marketplace/certified-operators-8mmjl" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.500844 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7470c785-0846-4e7f-9d1b-aad9a52e5a5b-catalog-content\") pod \"certified-operators-8mmjl\" (UID: \"7470c785-0846-4e7f-9d1b-aad9a52e5a5b\") 
" pod="openshift-marketplace/certified-operators-8mmjl" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.501371 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7470c785-0846-4e7f-9d1b-aad9a52e5a5b-utilities\") pod \"certified-operators-8mmjl\" (UID: \"7470c785-0846-4e7f-9d1b-aad9a52e5a5b\") " pod="openshift-marketplace/certified-operators-8mmjl" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.514076 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.514815 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gqcxv"] Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.532163 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.536309 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md9sm\" (UniqueName: \"kubernetes.io/projected/7470c785-0846-4e7f-9d1b-aad9a52e5a5b-kube-api-access-md9sm\") pod \"certified-operators-8mmjl\" (UID: \"7470c785-0846-4e7f-9d1b-aad9a52e5a5b\") " pod="openshift-marketplace/certified-operators-8mmjl" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.536382 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6t6l8"] Mar 14 07:02:32 crc kubenswrapper[5129]: W0314 07:02:32.547291 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda848f19c_da50_403e_b620_5425b51fab9a.slice/crio-b6b59c573404920b8eb8b1aabb297ba85372767af447631c8558ffd76070117f WatchSource:0}: Error finding container b6b59c573404920b8eb8b1aabb297ba85372767af447631c8558ffd76070117f: Status 404 returned error can't find the container with id b6b59c573404920b8eb8b1aabb297ba85372767af447631c8558ffd76070117f Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.550109 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:32 crc kubenswrapper[5129]: [-]has-synced failed: reason withheld Mar 14 07:02:32 crc kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:32 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.550159 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 
07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.601892 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee9d7764-4db5-49d9-9b08-7b2317ec41ca-catalog-content\") pod \"community-operators-gqcxv\" (UID: \"ee9d7764-4db5-49d9-9b08-7b2317ec41ca\") " pod="openshift-marketplace/community-operators-gqcxv" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.602270 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee9d7764-4db5-49d9-9b08-7b2317ec41ca-utilities\") pod \"community-operators-gqcxv\" (UID: \"ee9d7764-4db5-49d9-9b08-7b2317ec41ca\") " pod="openshift-marketplace/community-operators-gqcxv" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.602356 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d764\" (UniqueName: \"kubernetes.io/projected/ee9d7764-4db5-49d9-9b08-7b2317ec41ca-kube-api-access-6d764\") pod \"community-operators-gqcxv\" (UID: \"ee9d7764-4db5-49d9-9b08-7b2317ec41ca\") " pod="openshift-marketplace/community-operators-gqcxv" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.646524 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.647113 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.655943 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.659211 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 
07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.664794 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8mmjl" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.682904 5129 patch_prober.go:28] interesting pod/console-f9d7485db-vpw78 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.682986 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-vpw78" podUID="92aed46c-5740-4407-81ed-4ff642a70c54" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.696196 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.705961 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee9d7764-4db5-49d9-9b08-7b2317ec41ca-catalog-content\") pod \"community-operators-gqcxv\" (UID: \"ee9d7764-4db5-49d9-9b08-7b2317ec41ca\") " pod="openshift-marketplace/community-operators-gqcxv" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.706050 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee9d7764-4db5-49d9-9b08-7b2317ec41ca-utilities\") pod \"community-operators-gqcxv\" (UID: \"ee9d7764-4db5-49d9-9b08-7b2317ec41ca\") " pod="openshift-marketplace/community-operators-gqcxv" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.706126 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6d764\" (UniqueName: \"kubernetes.io/projected/ee9d7764-4db5-49d9-9b08-7b2317ec41ca-kube-api-access-6d764\") pod \"community-operators-gqcxv\" (UID: \"ee9d7764-4db5-49d9-9b08-7b2317ec41ca\") " pod="openshift-marketplace/community-operators-gqcxv" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.708162 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee9d7764-4db5-49d9-9b08-7b2317ec41ca-catalog-content\") pod \"community-operators-gqcxv\" (UID: \"ee9d7764-4db5-49d9-9b08-7b2317ec41ca\") " pod="openshift-marketplace/community-operators-gqcxv" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.709390 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee9d7764-4db5-49d9-9b08-7b2317ec41ca-utilities\") pod \"community-operators-gqcxv\" (UID: \"ee9d7764-4db5-49d9-9b08-7b2317ec41ca\") " pod="openshift-marketplace/community-operators-gqcxv" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.725106 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-trtr5" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.754501 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d764\" (UniqueName: \"kubernetes.io/projected/ee9d7764-4db5-49d9-9b08-7b2317ec41ca-kube-api-access-6d764\") pod \"community-operators-gqcxv\" (UID: \"ee9d7764-4db5-49d9-9b08-7b2317ec41ca\") " pod="openshift-marketplace/community-operators-gqcxv" Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.822856 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gw4hz"] Mar 14 07:02:32 crc kubenswrapper[5129]: I0314 07:02:32.829014 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gqcxv" Mar 14 07:02:32 crc kubenswrapper[5129]: W0314 07:02:32.834544 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff0704b1_3b5e_4e02_9f82_c1d74ad03387.slice/crio-4ff447d4501e4c0b9bd80c0475fca4fb1bc6745f5f322d5dc5932cf4d14afeb8 WatchSource:0}: Error finding container 4ff447d4501e4c0b9bd80c0475fca4fb1bc6745f5f322d5dc5932cf4d14afeb8: Status 404 returned error can't find the container with id 4ff447d4501e4c0b9bd80c0475fca4fb1bc6745f5f322d5dc5932cf4d14afeb8 Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.019729 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pggtq"] Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.094892 5129 generic.go:334] "Generic (PLEG): container finished" podID="28af9c23-fb3d-4781-bc04-501f7573558a" containerID="e0f64e92051cfcc1bfae7c0ea2c48ab9f835e2ed0b0c5342d08560400cd846b2" exitCode=0 Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.095007 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"28af9c23-fb3d-4781-bc04-501f7573558a","Type":"ContainerDied","Data":"e0f64e92051cfcc1bfae7c0ea2c48ab9f835e2ed0b0c5342d08560400cd846b2"} Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.099496 5129 generic.go:334] "Generic (PLEG): container finished" podID="ff0704b1-3b5e-4e02-9f82-c1d74ad03387" containerID="8825e5de0ec40557a273c02c3eeaca432046c8ae235cec0bc10458d619a39117" exitCode=0 Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.099557 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gw4hz" event={"ID":"ff0704b1-3b5e-4e02-9f82-c1d74ad03387","Type":"ContainerDied","Data":"8825e5de0ec40557a273c02c3eeaca432046c8ae235cec0bc10458d619a39117"} Mar 14 07:02:33 crc 
kubenswrapper[5129]: I0314 07:02:33.099578 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gw4hz" event={"ID":"ff0704b1-3b5e-4e02-9f82-c1d74ad03387","Type":"ContainerStarted","Data":"4ff447d4501e4c0b9bd80c0475fca4fb1bc6745f5f322d5dc5932cf4d14afeb8"} Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.110474 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" event={"ID":"16dc5638-0119-4e96-aa52-e6a61f6aa25e","Type":"ContainerStarted","Data":"caf709c2a94fc19a4261f726bebf0a7b0b68621ec0490bb5a5f7a39d22ecdfb1"} Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.110526 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" event={"ID":"16dc5638-0119-4e96-aa52-e6a61f6aa25e","Type":"ContainerStarted","Data":"f989e9b688031c74f8d33332db8d2d5838476a7bfec2b309fb5446a2508c3f81"} Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.111645 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.123145 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" event={"ID":"f2a7f356-6278-409f-9047-efece8492b78","Type":"ContainerStarted","Data":"f45387b275c6d90c9e116ea809f7d9dc61917156838c02d6e66cf19e84efe4f1"} Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.128775 5129 generic.go:334] "Generic (PLEG): container finished" podID="a848f19c-da50-403e-b620-5425b51fab9a" containerID="b90b47b68c2587d6749664ae94416e4120c02f08e5be7051331040d949e2086b" exitCode=0 Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.129745 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8mmjl"] Mar 14 
07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.129770 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t6l8" event={"ID":"a848f19c-da50-403e-b620-5425b51fab9a","Type":"ContainerDied","Data":"b90b47b68c2587d6749664ae94416e4120c02f08e5be7051331040d949e2086b"} Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.129785 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t6l8" event={"ID":"a848f19c-da50-403e-b620-5425b51fab9a","Type":"ContainerStarted","Data":"b6b59c573404920b8eb8b1aabb297ba85372767af447631c8558ffd76070117f"} Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.144398 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rc492" event={"ID":"0986b18b-4eea-4f98-b246-017c17cbe895","Type":"ContainerStarted","Data":"417af67b9451aba366e1b405eaab022d20f05ed43fd0f7da31ec6cc23581e880"} Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.144451 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rc492" event={"ID":"0986b18b-4eea-4f98-b246-017c17cbe895","Type":"ContainerStarted","Data":"37c4e2396132dd6b148ef2a076bd437085e554f7a2d940a9b943d51a7053a431"} Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.158113 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.171194 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-k7q7x" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.180236 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gqcxv"] Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.184842 5129 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" podStartSLOduration=2.1848182879999998 podStartE2EDuration="2.184818288s" podCreationTimestamp="2026-03-14 07:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:33.173675336 +0000 UTC m=+215.925590530" watchObservedRunningTime="2026-03-14 07:02:33.184818288 +0000 UTC m=+215.936733472" Mar 14 07:02:33 crc kubenswrapper[5129]: W0314 07:02:33.208942 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee9d7764_4db5_49d9_9b08_7b2317ec41ca.slice/crio-9917e4c91fe034a05ae9d77e77167039050c3524e4574dcb64d3c57e9af65e74 WatchSource:0}: Error finding container 9917e4c91fe034a05ae9d77e77167039050c3524e4574dcb64d3c57e9af65e74: Status 404 returned error can't find the container with id 9917e4c91fe034a05ae9d77e77167039050c3524e4574dcb64d3c57e9af65e74 Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.247660 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-rc492" podStartSLOduration=12.247603199 podStartE2EDuration="12.247603199s" podCreationTimestamp="2026-03-14 07:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:33.237194238 +0000 UTC m=+215.989109432" watchObservedRunningTime="2026-03-14 07:02:33.247603199 +0000 UTC m=+215.999518383" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.355765 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77fdf47f59-rcbzn"] Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.356394 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.364208 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.365266 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.365426 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.365556 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.367024 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.373952 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.375886 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77fdf47f59-rcbzn"] Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.416990 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.431871 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5386c646-51eb-4c9a-9aea-530aaed9f5c0-config\") pod \"controller-manager-77fdf47f59-rcbzn\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " 
pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.431946 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5386c646-51eb-4c9a-9aea-530aaed9f5c0-proxy-ca-bundles\") pod \"controller-manager-77fdf47f59-rcbzn\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.432052 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5386c646-51eb-4c9a-9aea-530aaed9f5c0-serving-cert\") pod \"controller-manager-77fdf47f59-rcbzn\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.432093 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5386c646-51eb-4c9a-9aea-530aaed9f5c0-client-ca\") pod \"controller-manager-77fdf47f59-rcbzn\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.432142 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld4j9\" (UniqueName: \"kubernetes.io/projected/5386c646-51eb-4c9a-9aea-530aaed9f5c0-kube-api-access-ld4j9\") pod \"controller-manager-77fdf47f59-rcbzn\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.474264 5129 patch_prober.go:28] interesting pod/downloads-7954f5f757-hrz85 container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.474319 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hrz85" podUID="50cfd6b1-bf1f-4a8b-bd6d-9799281f1359" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.476589 5129 patch_prober.go:28] interesting pod/downloads-7954f5f757-hrz85 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.476644 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hrz85" podUID="50cfd6b1-bf1f-4a8b-bd6d-9799281f1359" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.521776 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.522426 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.527918 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.528196 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.533176 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5386c646-51eb-4c9a-9aea-530aaed9f5c0-serving-cert\") pod \"controller-manager-77fdf47f59-rcbzn\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.533240 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5386c646-51eb-4c9a-9aea-530aaed9f5c0-client-ca\") pod \"controller-manager-77fdf47f59-rcbzn\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.533288 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld4j9\" (UniqueName: \"kubernetes.io/projected/5386c646-51eb-4c9a-9aea-530aaed9f5c0-kube-api-access-ld4j9\") pod \"controller-manager-77fdf47f59-rcbzn\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.533358 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5386c646-51eb-4c9a-9aea-530aaed9f5c0-config\") pod \"controller-manager-77fdf47f59-rcbzn\" (UID: 
\"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.533376 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5386c646-51eb-4c9a-9aea-530aaed9f5c0-proxy-ca-bundles\") pod \"controller-manager-77fdf47f59-rcbzn\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.534795 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5386c646-51eb-4c9a-9aea-530aaed9f5c0-proxy-ca-bundles\") pod \"controller-manager-77fdf47f59-rcbzn\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.535379 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5386c646-51eb-4c9a-9aea-530aaed9f5c0-client-ca\") pod \"controller-manager-77fdf47f59-rcbzn\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.536453 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5386c646-51eb-4c9a-9aea-530aaed9f5c0-config\") pod \"controller-manager-77fdf47f59-rcbzn\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.543466 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.546580 
5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.563284 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:33 crc kubenswrapper[5129]: [-]has-synced failed: reason withheld Mar 14 07:02:33 crc kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:33 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.563369 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.564028 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5386c646-51eb-4c9a-9aea-530aaed9f5c0-serving-cert\") pod \"controller-manager-77fdf47f59-rcbzn\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.574208 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld4j9\" (UniqueName: \"kubernetes.io/projected/5386c646-51eb-4c9a-9aea-530aaed9f5c0-kube-api-access-ld4j9\") pod \"controller-manager-77fdf47f59-rcbzn\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.635989 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/1cf17a99-0755-477f-9be5-50de08b6c814-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1cf17a99-0755-477f-9be5-50de08b6c814\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.636134 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cf17a99-0755-477f-9be5-50de08b6c814-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1cf17a99-0755-477f-9be5-50de08b6c814\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.679034 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.736634 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/021f1f3b-8ee9-424f-9c04-c56631332e92-secret-volume\") pod \"021f1f3b-8ee9-424f-9c04-c56631332e92\" (UID: \"021f1f3b-8ee9-424f-9c04-c56631332e92\") " Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.736702 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/021f1f3b-8ee9-424f-9c04-c56631332e92-config-volume\") pod \"021f1f3b-8ee9-424f-9c04-c56631332e92\" (UID: \"021f1f3b-8ee9-424f-9c04-c56631332e92\") " Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.736730 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ct8n\" (UniqueName: \"kubernetes.io/projected/021f1f3b-8ee9-424f-9c04-c56631332e92-kube-api-access-4ct8n\") pod \"021f1f3b-8ee9-424f-9c04-c56631332e92\" (UID: \"021f1f3b-8ee9-424f-9c04-c56631332e92\") " Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.736912 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cf17a99-0755-477f-9be5-50de08b6c814-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1cf17a99-0755-477f-9be5-50de08b6c814\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.736944 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cf17a99-0755-477f-9be5-50de08b6c814-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1cf17a99-0755-477f-9be5-50de08b6c814\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.737046 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cf17a99-0755-477f-9be5-50de08b6c814-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1cf17a99-0755-477f-9be5-50de08b6c814\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.737145 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.738360 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/021f1f3b-8ee9-424f-9c04-c56631332e92-config-volume" (OuterVolumeSpecName: "config-volume") pod "021f1f3b-8ee9-424f-9c04-c56631332e92" (UID: "021f1f3b-8ee9-424f-9c04-c56631332e92"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.742442 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/021f1f3b-8ee9-424f-9c04-c56631332e92-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "021f1f3b-8ee9-424f-9c04-c56631332e92" (UID: "021f1f3b-8ee9-424f-9c04-c56631332e92"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.742566 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/021f1f3b-8ee9-424f-9c04-c56631332e92-kube-api-access-4ct8n" (OuterVolumeSpecName: "kube-api-access-4ct8n") pod "021f1f3b-8ee9-424f-9c04-c56631332e92" (UID: "021f1f3b-8ee9-424f-9c04-c56631332e92"). InnerVolumeSpecName "kube-api-access-4ct8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.753196 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cf17a99-0755-477f-9be5-50de08b6c814-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1cf17a99-0755-477f-9be5-50de08b6c814\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.838896 5129 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/021f1f3b-8ee9-424f-9c04-c56631332e92-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.838930 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ct8n\" (UniqueName: \"kubernetes.io/projected/021f1f3b-8ee9-424f-9c04-c56631332e92-kube-api-access-4ct8n\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.838944 5129 reconciler_common.go:293] "Volume detached for 
volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/021f1f3b-8ee9-424f-9c04-c56631332e92-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.893258 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqck"] Mar 14 07:02:33 crc kubenswrapper[5129]: E0314 07:02:33.894809 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="021f1f3b-8ee9-424f-9c04-c56631332e92" containerName="collect-profiles" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.894830 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="021f1f3b-8ee9-424f-9c04-c56631332e92" containerName="collect-profiles" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.894963 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="021f1f3b-8ee9-424f-9c04-c56631332e92" containerName="collect-profiles" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.896792 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdqck" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.900471 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.904403 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqck"] Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.941282 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft5dc\" (UniqueName: \"kubernetes.io/projected/2cdca220-d63a-45e2-ad4e-d2b920554116-kube-api-access-ft5dc\") pod \"redhat-marketplace-mdqck\" (UID: \"2cdca220-d63a-45e2-ad4e-d2b920554116\") " pod="openshift-marketplace/redhat-marketplace-mdqck" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.941341 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cdca220-d63a-45e2-ad4e-d2b920554116-utilities\") pod \"redhat-marketplace-mdqck\" (UID: \"2cdca220-d63a-45e2-ad4e-d2b920554116\") " pod="openshift-marketplace/redhat-marketplace-mdqck" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.941433 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cdca220-d63a-45e2-ad4e-d2b920554116-catalog-content\") pod \"redhat-marketplace-mdqck\" (UID: \"2cdca220-d63a-45e2-ad4e-d2b920554116\") " pod="openshift-marketplace/redhat-marketplace-mdqck" Mar 14 07:02:33 crc kubenswrapper[5129]: I0314 07:02:33.979673 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.049197 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft5dc\" (UniqueName: \"kubernetes.io/projected/2cdca220-d63a-45e2-ad4e-d2b920554116-kube-api-access-ft5dc\") pod \"redhat-marketplace-mdqck\" (UID: \"2cdca220-d63a-45e2-ad4e-d2b920554116\") " pod="openshift-marketplace/redhat-marketplace-mdqck" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.049245 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cdca220-d63a-45e2-ad4e-d2b920554116-utilities\") pod \"redhat-marketplace-mdqck\" (UID: \"2cdca220-d63a-45e2-ad4e-d2b920554116\") " pod="openshift-marketplace/redhat-marketplace-mdqck" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.049290 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cdca220-d63a-45e2-ad4e-d2b920554116-catalog-content\") pod \"redhat-marketplace-mdqck\" (UID: \"2cdca220-d63a-45e2-ad4e-d2b920554116\") " pod="openshift-marketplace/redhat-marketplace-mdqck" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.049956 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cdca220-d63a-45e2-ad4e-d2b920554116-catalog-content\") pod \"redhat-marketplace-mdqck\" (UID: \"2cdca220-d63a-45e2-ad4e-d2b920554116\") " pod="openshift-marketplace/redhat-marketplace-mdqck" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.050429 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cdca220-d63a-45e2-ad4e-d2b920554116-utilities\") pod \"redhat-marketplace-mdqck\" (UID: \"2cdca220-d63a-45e2-ad4e-d2b920554116\") " 
pod="openshift-marketplace/redhat-marketplace-mdqck" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.088310 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft5dc\" (UniqueName: \"kubernetes.io/projected/2cdca220-d63a-45e2-ad4e-d2b920554116-kube-api-access-ft5dc\") pod \"redhat-marketplace-mdqck\" (UID: \"2cdca220-d63a-45e2-ad4e-d2b920554116\") " pod="openshift-marketplace/redhat-marketplace-mdqck" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.095149 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b81a68d-81ce-4406-ae24-511cda2d8936" path="/var/lib/kubelet/pods/4b81a68d-81ce-4406-ae24-511cda2d8936/volumes" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.096145 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.096768 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8915d74-77d5-4d0f-9264-37b6f8167a6d" path="/var/lib/kubelet/pods/e8915d74-77d5-4d0f-9264-37b6f8167a6d/volumes" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.176805 5129 generic.go:334] "Generic (PLEG): container finished" podID="7470c785-0846-4e7f-9d1b-aad9a52e5a5b" containerID="e6d3622c77e3bf8db13f8e1ec0ef42879cd9acdab50723f3dfa2d30972c82aae" exitCode=0 Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.176873 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mmjl" event={"ID":"7470c785-0846-4e7f-9d1b-aad9a52e5a5b","Type":"ContainerDied","Data":"e6d3622c77e3bf8db13f8e1ec0ef42879cd9acdab50723f3dfa2d30972c82aae"} Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.176901 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mmjl" 
event={"ID":"7470c785-0846-4e7f-9d1b-aad9a52e5a5b","Type":"ContainerStarted","Data":"62d857225125c5d6ab1b9d2c1a6377e297b521d29093a79bd2aaf63176f11769"} Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.189383 5129 generic.go:334] "Generic (PLEG): container finished" podID="ee9d7764-4db5-49d9-9b08-7b2317ec41ca" containerID="bb28288488aadc810ffcb5b901e3b72844431d3ef6a47842aaa2cba5854778cb" exitCode=0 Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.189501 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqcxv" event={"ID":"ee9d7764-4db5-49d9-9b08-7b2317ec41ca","Type":"ContainerDied","Data":"bb28288488aadc810ffcb5b901e3b72844431d3ef6a47842aaa2cba5854778cb"} Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.189535 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqcxv" event={"ID":"ee9d7764-4db5-49d9-9b08-7b2317ec41ca","Type":"ContainerStarted","Data":"9917e4c91fe034a05ae9d77e77167039050c3524e4574dcb64d3c57e9af65e74"} Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.192048 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77fdf47f59-rcbzn"] Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.202667 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" event={"ID":"f2a7f356-6278-409f-9047-efece8492b78","Type":"ContainerStarted","Data":"c9f3d23a6bb019521ce5292a5588e9d5ea51da810bfbad23c817ffde6d64edcb"} Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.202893 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.212049 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.213174 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5" event={"ID":"021f1f3b-8ee9-424f-9c04-c56631332e92","Type":"ContainerDied","Data":"b45af925f36a74687a11f4b4606bb2a96911004e32869078765e08b1b11cfcea"} Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.213196 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b45af925f36a74687a11f4b4606bb2a96911004e32869078765e08b1b11cfcea" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.230289 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdqck" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.268819 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" podStartSLOduration=166.268797162 podStartE2EDuration="2m46.268797162s" podCreationTimestamp="2026-03-14 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:34.263567261 +0000 UTC m=+217.015482445" watchObservedRunningTime="2026-03-14 07:02:34.268797162 +0000 UTC m=+217.020712346" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.270935 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.290825 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-glrkk"] Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.292716 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lf26g" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.293815 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-glrkk" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.313641 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-glrkk"] Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.369432 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcj68\" (UniqueName: \"kubernetes.io/projected/35d61c74-41cf-4ed5-a0c0-ffae19f82be0-kube-api-access-rcj68\") pod \"redhat-marketplace-glrkk\" (UID: \"35d61c74-41cf-4ed5-a0c0-ffae19f82be0\") " pod="openshift-marketplace/redhat-marketplace-glrkk" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.369561 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d61c74-41cf-4ed5-a0c0-ffae19f82be0-catalog-content\") pod \"redhat-marketplace-glrkk\" (UID: \"35d61c74-41cf-4ed5-a0c0-ffae19f82be0\") " pod="openshift-marketplace/redhat-marketplace-glrkk" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.369590 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d61c74-41cf-4ed5-a0c0-ffae19f82be0-utilities\") pod \"redhat-marketplace-glrkk\" (UID: \"35d61c74-41cf-4ed5-a0c0-ffae19f82be0\") " pod="openshift-marketplace/redhat-marketplace-glrkk" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.473590 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcj68\" (UniqueName: \"kubernetes.io/projected/35d61c74-41cf-4ed5-a0c0-ffae19f82be0-kube-api-access-rcj68\") pod \"redhat-marketplace-glrkk\" (UID: 
\"35d61c74-41cf-4ed5-a0c0-ffae19f82be0\") " pod="openshift-marketplace/redhat-marketplace-glrkk" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.474449 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d61c74-41cf-4ed5-a0c0-ffae19f82be0-catalog-content\") pod \"redhat-marketplace-glrkk\" (UID: \"35d61c74-41cf-4ed5-a0c0-ffae19f82be0\") " pod="openshift-marketplace/redhat-marketplace-glrkk" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.474474 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d61c74-41cf-4ed5-a0c0-ffae19f82be0-utilities\") pod \"redhat-marketplace-glrkk\" (UID: \"35d61c74-41cf-4ed5-a0c0-ffae19f82be0\") " pod="openshift-marketplace/redhat-marketplace-glrkk" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.475286 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d61c74-41cf-4ed5-a0c0-ffae19f82be0-catalog-content\") pod \"redhat-marketplace-glrkk\" (UID: \"35d61c74-41cf-4ed5-a0c0-ffae19f82be0\") " pod="openshift-marketplace/redhat-marketplace-glrkk" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.475562 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d61c74-41cf-4ed5-a0c0-ffae19f82be0-utilities\") pod \"redhat-marketplace-glrkk\" (UID: \"35d61c74-41cf-4ed5-a0c0-ffae19f82be0\") " pod="openshift-marketplace/redhat-marketplace-glrkk" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.502090 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcj68\" (UniqueName: \"kubernetes.io/projected/35d61c74-41cf-4ed5-a0c0-ffae19f82be0-kube-api-access-rcj68\") pod \"redhat-marketplace-glrkk\" (UID: \"35d61c74-41cf-4ed5-a0c0-ffae19f82be0\") " 
pod="openshift-marketplace/redhat-marketplace-glrkk" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.553055 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:34 crc kubenswrapper[5129]: [-]has-synced failed: reason withheld Mar 14 07:02:34 crc kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:34 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.553120 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.568789 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqck"] Mar 14 07:02:34 crc kubenswrapper[5129]: W0314 07:02:34.577196 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cdca220_d63a_45e2_ad4e_d2b920554116.slice/crio-29b37f2af887874493fcff382ffac82f3321a7f8d31726eda8aa0d7dd3480f84 WatchSource:0}: Error finding container 29b37f2af887874493fcff382ffac82f3321a7f8d31726eda8aa0d7dd3480f84: Status 404 returned error can't find the container with id 29b37f2af887874493fcff382ffac82f3321a7f8d31726eda8aa0d7dd3480f84 Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.593937 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.616246 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:02:34 crc kubenswrapper[5129]: W0314 07:02:34.631562 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1cf17a99_0755_477f_9be5_50de08b6c814.slice/crio-a37d6de59891afaf8ee49f7869e461dbc1bc5b2c7b7f3e09acf33ce1fb6d5efc WatchSource:0}: Error finding container a37d6de59891afaf8ee49f7869e461dbc1bc5b2c7b7f3e09acf33ce1fb6d5efc: Status 404 returned error can't find the container with id a37d6de59891afaf8ee49f7869e461dbc1bc5b2c7b7f3e09acf33ce1fb6d5efc Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.664773 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-glrkk" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.681821 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28af9c23-fb3d-4781-bc04-501f7573558a-kube-api-access\") pod \"28af9c23-fb3d-4781-bc04-501f7573558a\" (UID: \"28af9c23-fb3d-4781-bc04-501f7573558a\") " Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.681867 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28af9c23-fb3d-4781-bc04-501f7573558a-kubelet-dir\") pod \"28af9c23-fb3d-4781-bc04-501f7573558a\" (UID: \"28af9c23-fb3d-4781-bc04-501f7573558a\") " Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.682194 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28af9c23-fb3d-4781-bc04-501f7573558a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "28af9c23-fb3d-4781-bc04-501f7573558a" (UID: "28af9c23-fb3d-4781-bc04-501f7573558a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.694800 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28af9c23-fb3d-4781-bc04-501f7573558a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "28af9c23-fb3d-4781-bc04-501f7573558a" (UID: "28af9c23-fb3d-4781-bc04-501f7573558a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.787473 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28af9c23-fb3d-4781-bc04-501f7573558a-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:34 crc kubenswrapper[5129]: I0314 07:02:34.787916 5129 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28af9c23-fb3d-4781-bc04-501f7573558a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.105355 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-glrkk"] Mar 14 07:02:35 crc kubenswrapper[5129]: W0314 07:02:35.133905 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35d61c74_41cf_4ed5_a0c0_ffae19f82be0.slice/crio-e70f09050edc1f744c08ea351abda29f62ffeb1d185280de54f22b3a3689a54d WatchSource:0}: Error finding container e70f09050edc1f744c08ea351abda29f62ffeb1d185280de54f22b3a3689a54d: Status 404 returned error can't find the container with id e70f09050edc1f744c08ea351abda29f62ffeb1d185280de54f22b3a3689a54d Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.219101 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" 
event={"ID":"5386c646-51eb-4c9a-9aea-530aaed9f5c0","Type":"ContainerStarted","Data":"05d0c08a52f41697219d175c2b2fefcbe9e9aa33fec1defa4a4e67697c8e632f"} Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.219141 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" event={"ID":"5386c646-51eb-4c9a-9aea-530aaed9f5c0","Type":"ContainerStarted","Data":"912075cda309e43fca7f54b2464a99264310c69538cd34b6719d3d47f041db1e"} Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.219501 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.228407 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqck" event={"ID":"2cdca220-d63a-45e2-ad4e-d2b920554116","Type":"ContainerStarted","Data":"29b37f2af887874493fcff382ffac82f3321a7f8d31726eda8aa0d7dd3480f84"} Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.229581 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-glrkk" event={"ID":"35d61c74-41cf-4ed5-a0c0-ffae19f82be0","Type":"ContainerStarted","Data":"e70f09050edc1f744c08ea351abda29f62ffeb1d185280de54f22b3a3689a54d"} Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.231078 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.232172 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.232347 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"28af9c23-fb3d-4781-bc04-501f7573558a","Type":"ContainerDied","Data":"180bec5ce2f95cc97cd1e7f336d22b3629f54c274334d9b4599d0286ada96fb2"} Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.232367 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="180bec5ce2f95cc97cd1e7f336d22b3629f54c274334d9b4599d0286ada96fb2" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.236590 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1cf17a99-0755-477f-9be5-50de08b6c814","Type":"ContainerStarted","Data":"a37d6de59891afaf8ee49f7869e461dbc1bc5b2c7b7f3e09acf33ce1fb6d5efc"} Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.257471 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" podStartSLOduration=4.257443254 podStartE2EDuration="4.257443254s" podCreationTimestamp="2026-03-14 07:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:35.244915653 +0000 UTC m=+217.996830837" watchObservedRunningTime="2026-03-14 07:02:35.257443254 +0000 UTC m=+218.009358438" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.281023 5129 ???:1] "http: TLS handshake error from 192.168.126.11:42648: no serving certificate available for the kubelet" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.299033 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fg5m2"] Mar 14 07:02:35 crc kubenswrapper[5129]: E0314 07:02:35.299707 5129 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="28af9c23-fb3d-4781-bc04-501f7573558a" containerName="pruner" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.299732 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="28af9c23-fb3d-4781-bc04-501f7573558a" containerName="pruner" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.299945 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="28af9c23-fb3d-4781-bc04-501f7573558a" containerName="pruner" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.301312 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fg5m2" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.307187 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.309639 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fg5m2"] Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.403706 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/647966c7-67bc-4945-a281-477f0f83496e-utilities\") pod \"redhat-operators-fg5m2\" (UID: \"647966c7-67bc-4945-a281-477f0f83496e\") " pod="openshift-marketplace/redhat-operators-fg5m2" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.403793 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/647966c7-67bc-4945-a281-477f0f83496e-catalog-content\") pod \"redhat-operators-fg5m2\" (UID: \"647966c7-67bc-4945-a281-477f0f83496e\") " pod="openshift-marketplace/redhat-operators-fg5m2" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.403851 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-kvbvl\" (UniqueName: \"kubernetes.io/projected/647966c7-67bc-4945-a281-477f0f83496e-kube-api-access-kvbvl\") pod \"redhat-operators-fg5m2\" (UID: \"647966c7-67bc-4945-a281-477f0f83496e\") " pod="openshift-marketplace/redhat-operators-fg5m2" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.505333 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvbvl\" (UniqueName: \"kubernetes.io/projected/647966c7-67bc-4945-a281-477f0f83496e-kube-api-access-kvbvl\") pod \"redhat-operators-fg5m2\" (UID: \"647966c7-67bc-4945-a281-477f0f83496e\") " pod="openshift-marketplace/redhat-operators-fg5m2" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.505416 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/647966c7-67bc-4945-a281-477f0f83496e-utilities\") pod \"redhat-operators-fg5m2\" (UID: \"647966c7-67bc-4945-a281-477f0f83496e\") " pod="openshift-marketplace/redhat-operators-fg5m2" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.505464 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/647966c7-67bc-4945-a281-477f0f83496e-catalog-content\") pod \"redhat-operators-fg5m2\" (UID: \"647966c7-67bc-4945-a281-477f0f83496e\") " pod="openshift-marketplace/redhat-operators-fg5m2" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.505922 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/647966c7-67bc-4945-a281-477f0f83496e-catalog-content\") pod \"redhat-operators-fg5m2\" (UID: \"647966c7-67bc-4945-a281-477f0f83496e\") " pod="openshift-marketplace/redhat-operators-fg5m2" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.505976 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/647966c7-67bc-4945-a281-477f0f83496e-utilities\") pod \"redhat-operators-fg5m2\" (UID: \"647966c7-67bc-4945-a281-477f0f83496e\") " pod="openshift-marketplace/redhat-operators-fg5m2" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.526177 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvbvl\" (UniqueName: \"kubernetes.io/projected/647966c7-67bc-4945-a281-477f0f83496e-kube-api-access-kvbvl\") pod \"redhat-operators-fg5m2\" (UID: \"647966c7-67bc-4945-a281-477f0f83496e\") " pod="openshift-marketplace/redhat-operators-fg5m2" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.550378 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:35 crc kubenswrapper[5129]: [-]has-synced failed: reason withheld Mar 14 07:02:35 crc kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:35 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.550440 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.628980 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fg5m2" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.688910 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rxbtj"] Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.693236 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rxbtj" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.701391 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rxbtj"] Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.813369 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e8571cc-ed81-4074-a9e7-24f81fa725f0-utilities\") pod \"redhat-operators-rxbtj\" (UID: \"8e8571cc-ed81-4074-a9e7-24f81fa725f0\") " pod="openshift-marketplace/redhat-operators-rxbtj" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.813874 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vt7x\" (UniqueName: \"kubernetes.io/projected/8e8571cc-ed81-4074-a9e7-24f81fa725f0-kube-api-access-5vt7x\") pod \"redhat-operators-rxbtj\" (UID: \"8e8571cc-ed81-4074-a9e7-24f81fa725f0\") " pod="openshift-marketplace/redhat-operators-rxbtj" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.814018 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e8571cc-ed81-4074-a9e7-24f81fa725f0-catalog-content\") pod \"redhat-operators-rxbtj\" (UID: \"8e8571cc-ed81-4074-a9e7-24f81fa725f0\") " pod="openshift-marketplace/redhat-operators-rxbtj" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.892162 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fg5m2"] Mar 14 07:02:35 crc kubenswrapper[5129]: W0314 07:02:35.898638 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod647966c7_67bc_4945_a281_477f0f83496e.slice/crio-dff040fc114756b7df333a9840a152c0b48a03145eb2c50db65eb733f96802ad WatchSource:0}: Error finding container 
dff040fc114756b7df333a9840a152c0b48a03145eb2c50db65eb733f96802ad: Status 404 returned error can't find the container with id dff040fc114756b7df333a9840a152c0b48a03145eb2c50db65eb733f96802ad Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.915481 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e8571cc-ed81-4074-a9e7-24f81fa725f0-catalog-content\") pod \"redhat-operators-rxbtj\" (UID: \"8e8571cc-ed81-4074-a9e7-24f81fa725f0\") " pod="openshift-marketplace/redhat-operators-rxbtj" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.915654 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e8571cc-ed81-4074-a9e7-24f81fa725f0-utilities\") pod \"redhat-operators-rxbtj\" (UID: \"8e8571cc-ed81-4074-a9e7-24f81fa725f0\") " pod="openshift-marketplace/redhat-operators-rxbtj" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.915730 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vt7x\" (UniqueName: \"kubernetes.io/projected/8e8571cc-ed81-4074-a9e7-24f81fa725f0-kube-api-access-5vt7x\") pod \"redhat-operators-rxbtj\" (UID: \"8e8571cc-ed81-4074-a9e7-24f81fa725f0\") " pod="openshift-marketplace/redhat-operators-rxbtj" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.916376 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e8571cc-ed81-4074-a9e7-24f81fa725f0-utilities\") pod \"redhat-operators-rxbtj\" (UID: \"8e8571cc-ed81-4074-a9e7-24f81fa725f0\") " pod="openshift-marketplace/redhat-operators-rxbtj" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.916716 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e8571cc-ed81-4074-a9e7-24f81fa725f0-catalog-content\") pod 
\"redhat-operators-rxbtj\" (UID: \"8e8571cc-ed81-4074-a9e7-24f81fa725f0\") " pod="openshift-marketplace/redhat-operators-rxbtj" Mar 14 07:02:35 crc kubenswrapper[5129]: I0314 07:02:35.943151 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vt7x\" (UniqueName: \"kubernetes.io/projected/8e8571cc-ed81-4074-a9e7-24f81fa725f0-kube-api-access-5vt7x\") pod \"redhat-operators-rxbtj\" (UID: \"8e8571cc-ed81-4074-a9e7-24f81fa725f0\") " pod="openshift-marketplace/redhat-operators-rxbtj" Mar 14 07:02:36 crc kubenswrapper[5129]: I0314 07:02:36.014212 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rxbtj" Mar 14 07:02:36 crc kubenswrapper[5129]: I0314 07:02:36.249919 5129 generic.go:334] "Generic (PLEG): container finished" podID="2cdca220-d63a-45e2-ad4e-d2b920554116" containerID="94298492eb0225c9e930006963d29648a3c2ea84c076689d23a10bed0c19ef10" exitCode=0 Mar 14 07:02:36 crc kubenswrapper[5129]: I0314 07:02:36.250241 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqck" event={"ID":"2cdca220-d63a-45e2-ad4e-d2b920554116","Type":"ContainerDied","Data":"94298492eb0225c9e930006963d29648a3c2ea84c076689d23a10bed0c19ef10"} Mar 14 07:02:36 crc kubenswrapper[5129]: I0314 07:02:36.253233 5129 generic.go:334] "Generic (PLEG): container finished" podID="35d61c74-41cf-4ed5-a0c0-ffae19f82be0" containerID="abd13f73f65561cfc277f56c99f4d486c544f632ba1fbefb44c166af4b4cad85" exitCode=0 Mar 14 07:02:36 crc kubenswrapper[5129]: I0314 07:02:36.253299 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-glrkk" event={"ID":"35d61c74-41cf-4ed5-a0c0-ffae19f82be0","Type":"ContainerDied","Data":"abd13f73f65561cfc277f56c99f4d486c544f632ba1fbefb44c166af4b4cad85"} Mar 14 07:02:36 crc kubenswrapper[5129]: I0314 07:02:36.259620 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1cf17a99-0755-477f-9be5-50de08b6c814","Type":"ContainerStarted","Data":"d039caa3843351817ac552cc82cb472f213e685cf33cd713e798c8753569424c"} Mar 14 07:02:36 crc kubenswrapper[5129]: I0314 07:02:36.262221 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg5m2" event={"ID":"647966c7-67bc-4945-a281-477f0f83496e","Type":"ContainerStarted","Data":"dff040fc114756b7df333a9840a152c0b48a03145eb2c50db65eb733f96802ad"} Mar 14 07:02:36 crc kubenswrapper[5129]: I0314 07:02:36.318627 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.318575998 podStartE2EDuration="3.318575998s" podCreationTimestamp="2026-03-14 07:02:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:02:36.298215001 +0000 UTC m=+219.050130195" watchObservedRunningTime="2026-03-14 07:02:36.318575998 +0000 UTC m=+219.070491182" Mar 14 07:02:36 crc kubenswrapper[5129]: I0314 07:02:36.361676 5129 ???:1] "http: TLS handshake error from 192.168.126.11:42660: no serving certificate available for the kubelet" Mar 14 07:02:36 crc kubenswrapper[5129]: I0314 07:02:36.431452 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rxbtj"] Mar 14 07:02:36 crc kubenswrapper[5129]: I0314 07:02:36.549189 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:36 crc kubenswrapper[5129]: [-]has-synced failed: reason withheld Mar 14 07:02:36 crc kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:36 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:36 crc 
kubenswrapper[5129]: I0314 07:02:36.549555 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:02:37 crc kubenswrapper[5129]: I0314 07:02:37.277928 5129 generic.go:334] "Generic (PLEG): container finished" podID="647966c7-67bc-4945-a281-477f0f83496e" containerID="cd801e5cc3681e7ea8d6e92752d0b2a37dc53b6316fcecd7259b9c0eea27e300" exitCode=0 Mar 14 07:02:37 crc kubenswrapper[5129]: I0314 07:02:37.278289 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg5m2" event={"ID":"647966c7-67bc-4945-a281-477f0f83496e","Type":"ContainerDied","Data":"cd801e5cc3681e7ea8d6e92752d0b2a37dc53b6316fcecd7259b9c0eea27e300"} Mar 14 07:02:37 crc kubenswrapper[5129]: I0314 07:02:37.283500 5129 generic.go:334] "Generic (PLEG): container finished" podID="8e8571cc-ed81-4074-a9e7-24f81fa725f0" containerID="87266b4f7ae6ab6042ccacf2ec3620a6d8f14e014765fb67e0613aa06516218f" exitCode=0 Mar 14 07:02:37 crc kubenswrapper[5129]: I0314 07:02:37.283563 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxbtj" event={"ID":"8e8571cc-ed81-4074-a9e7-24f81fa725f0","Type":"ContainerDied","Data":"87266b4f7ae6ab6042ccacf2ec3620a6d8f14e014765fb67e0613aa06516218f"} Mar 14 07:02:37 crc kubenswrapper[5129]: I0314 07:02:37.283585 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxbtj" event={"ID":"8e8571cc-ed81-4074-a9e7-24f81fa725f0","Type":"ContainerStarted","Data":"c8a6af68962a7794f7cd77491514e81f6b536cc278d1a6553dc804ca2755e264"} Mar 14 07:02:37 crc kubenswrapper[5129]: I0314 07:02:37.289829 5129 generic.go:334] "Generic (PLEG): container finished" podID="1cf17a99-0755-477f-9be5-50de08b6c814" 
containerID="d039caa3843351817ac552cc82cb472f213e685cf33cd713e798c8753569424c" exitCode=0 Mar 14 07:02:37 crc kubenswrapper[5129]: I0314 07:02:37.290561 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1cf17a99-0755-477f-9be5-50de08b6c814","Type":"ContainerDied","Data":"d039caa3843351817ac552cc82cb472f213e685cf33cd713e798c8753569424c"} Mar 14 07:02:37 crc kubenswrapper[5129]: I0314 07:02:37.550065 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:37 crc kubenswrapper[5129]: [-]has-synced failed: reason withheld Mar 14 07:02:37 crc kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:37 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:37 crc kubenswrapper[5129]: I0314 07:02:37.550127 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:02:38 crc kubenswrapper[5129]: I0314 07:02:38.549625 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:38 crc kubenswrapper[5129]: [-]has-synced failed: reason withheld Mar 14 07:02:38 crc kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:38 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:38 crc kubenswrapper[5129]: I0314 07:02:38.549776 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:02:39 crc kubenswrapper[5129]: I0314 07:02:39.357777 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-t9ngb" Mar 14 07:02:39 crc kubenswrapper[5129]: I0314 07:02:39.548285 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:39 crc kubenswrapper[5129]: [-]has-synced failed: reason withheld Mar 14 07:02:39 crc kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:39 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:39 crc kubenswrapper[5129]: I0314 07:02:39.548376 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:02:40 crc kubenswrapper[5129]: I0314 07:02:40.549126 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:40 crc kubenswrapper[5129]: [-]has-synced failed: reason withheld Mar 14 07:02:40 crc kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:40 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:40 crc kubenswrapper[5129]: I0314 07:02:40.549532 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:02:41 crc kubenswrapper[5129]: I0314 07:02:41.302936 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs\") pod \"network-metrics-daemon-l2tzv\" (UID: \"ffc61f17-7577-4872-ad5d-7b33780d3d21\") " pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:02:41 crc kubenswrapper[5129]: I0314 07:02:41.309786 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffc61f17-7577-4872-ad5d-7b33780d3d21-metrics-certs\") pod \"network-metrics-daemon-l2tzv\" (UID: \"ffc61f17-7577-4872-ad5d-7b33780d3d21\") " pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:02:41 crc kubenswrapper[5129]: I0314 07:02:41.549761 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:41 crc kubenswrapper[5129]: [-]has-synced failed: reason withheld Mar 14 07:02:41 crc kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:41 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:41 crc kubenswrapper[5129]: I0314 07:02:41.549898 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:02:41 crc kubenswrapper[5129]: I0314 07:02:41.561146 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2tzv" Mar 14 07:02:42 crc kubenswrapper[5129]: I0314 07:02:42.548941 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:42 crc kubenswrapper[5129]: [-]has-synced failed: reason withheld Mar 14 07:02:42 crc kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:42 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:42 crc kubenswrapper[5129]: I0314 07:02:42.549491 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:02:42 crc kubenswrapper[5129]: I0314 07:02:42.646822 5129 patch_prober.go:28] interesting pod/console-f9d7485db-vpw78 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 14 07:02:42 crc kubenswrapper[5129]: I0314 07:02:42.646950 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-vpw78" podUID="92aed46c-5740-4407-81ed-4ff642a70c54" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 14 07:02:42 crc kubenswrapper[5129]: I0314 07:02:42.670205 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:02:42 crc kubenswrapper[5129]: I0314 07:02:42.721869 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cf17a99-0755-477f-9be5-50de08b6c814-kube-api-access\") pod \"1cf17a99-0755-477f-9be5-50de08b6c814\" (UID: \"1cf17a99-0755-477f-9be5-50de08b6c814\") " Mar 14 07:02:42 crc kubenswrapper[5129]: I0314 07:02:42.721937 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cf17a99-0755-477f-9be5-50de08b6c814-kubelet-dir\") pod \"1cf17a99-0755-477f-9be5-50de08b6c814\" (UID: \"1cf17a99-0755-477f-9be5-50de08b6c814\") " Mar 14 07:02:42 crc kubenswrapper[5129]: I0314 07:02:42.722105 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1cf17a99-0755-477f-9be5-50de08b6c814-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1cf17a99-0755-477f-9be5-50de08b6c814" (UID: "1cf17a99-0755-477f-9be5-50de08b6c814"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:02:42 crc kubenswrapper[5129]: I0314 07:02:42.722295 5129 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cf17a99-0755-477f-9be5-50de08b6c814-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:42 crc kubenswrapper[5129]: I0314 07:02:42.725158 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cf17a99-0755-477f-9be5-50de08b6c814-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1cf17a99-0755-477f-9be5-50de08b6c814" (UID: "1cf17a99-0755-477f-9be5-50de08b6c814"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:02:42 crc kubenswrapper[5129]: I0314 07:02:42.823588 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cf17a99-0755-477f-9be5-50de08b6c814-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 07:02:43 crc kubenswrapper[5129]: I0314 07:02:43.333459 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1cf17a99-0755-477f-9be5-50de08b6c814","Type":"ContainerDied","Data":"a37d6de59891afaf8ee49f7869e461dbc1bc5b2c7b7f3e09acf33ce1fb6d5efc"} Mar 14 07:02:43 crc kubenswrapper[5129]: I0314 07:02:43.333497 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a37d6de59891afaf8ee49f7869e461dbc1bc5b2c7b7f3e09acf33ce1fb6d5efc" Mar 14 07:02:43 crc kubenswrapper[5129]: I0314 07:02:43.333505 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 07:02:43 crc kubenswrapper[5129]: I0314 07:02:43.478814 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-hrz85" Mar 14 07:02:43 crc kubenswrapper[5129]: I0314 07:02:43.550128 5129 patch_prober.go:28] interesting pod/router-default-5444994796-gvck8 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 07:02:43 crc kubenswrapper[5129]: [-]has-synced failed: reason withheld Mar 14 07:02:43 crc kubenswrapper[5129]: [+]process-running ok Mar 14 07:02:43 crc kubenswrapper[5129]: healthz check failed Mar 14 07:02:43 crc kubenswrapper[5129]: I0314 07:02:43.550189 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gvck8" podUID="29a50e5b-e484-4916-be70-893887f8405e" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 07:02:44 crc kubenswrapper[5129]: I0314 07:02:44.549488 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:44 crc kubenswrapper[5129]: I0314 07:02:44.559136 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-gvck8" Mar 14 07:02:45 crc kubenswrapper[5129]: I0314 07:02:45.555742 5129 ???:1] "http: TLS handshake error from 192.168.126.11:46996: no serving certificate available for the kubelet" Mar 14 07:02:47 crc kubenswrapper[5129]: E0314 07:02:47.845721 5129 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: parsing image configuration: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/19/19fd7fac1842265ebe5bfe519c61d184c5b2e02945b2510b490cf4200a1fc049?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260314%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260314T070237Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=aa9d9524ef1cb8c716091bed26e66f6114e33cf1909b71ee8c7bfe51ab60e277®ion=us-east-1&namespace=redhat&username=redhat+registry_proxy&repo_name=redhat----redhat-marketplace-index&akamai_signature=exp=1773472657~hmac=111c4ff2d5eb786bb8bf3b17739b2408fdc334e8254f11c4fb56f953fecaf0a9\": net/http: TLS handshake timeout" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 14 07:02:47 crc kubenswrapper[5129]: E0314 07:02:47.846151 5129 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rcj68,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-glrkk_openshift-marketplace(35d61c74-41cf-4ed5-a0c0-ffae19f82be0): ErrImagePull: copying system image from manifest list: parsing image configuration: Get 
\"https://cdn01.quay.io/quayio-production-s3/sha256/19/19fd7fac1842265ebe5bfe519c61d184c5b2e02945b2510b490cf4200a1fc049?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260314%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260314T070237Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=aa9d9524ef1cb8c716091bed26e66f6114e33cf1909b71ee8c7bfe51ab60e277®ion=us-east-1&namespace=redhat&username=redhat+registry_proxy&repo_name=redhat----redhat-marketplace-index&akamai_signature=exp=1773472657~hmac=111c4ff2d5eb786bb8bf3b17739b2408fdc334e8254f11c4fb56f953fecaf0a9\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 14 07:02:47 crc kubenswrapper[5129]: E0314 07:02:47.847395 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: parsing image configuration: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/19/19fd7fac1842265ebe5bfe519c61d184c5b2e02945b2510b490cf4200a1fc049?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260314%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260314T070237Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=aa9d9524ef1cb8c716091bed26e66f6114e33cf1909b71ee8c7bfe51ab60e277®ion=us-east-1&namespace=redhat&username=redhat+registry_proxy&repo_name=redhat----redhat-marketplace-index&akamai_signature=exp=1773472657~hmac=111c4ff2d5eb786bb8bf3b17739b2408fdc334e8254f11c4fb56f953fecaf0a9\\\": net/http: TLS handshake timeout\"" pod="openshift-marketplace/redhat-marketplace-glrkk" podUID="35d61c74-41cf-4ed5-a0c0-ffae19f82be0" Mar 14 07:02:49 crc kubenswrapper[5129]: I0314 07:02:49.574571 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Mar 14 07:02:49 crc kubenswrapper[5129]: I0314 07:02:49.574888 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:02:50 crc kubenswrapper[5129]: I0314 07:02:50.654343 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77fdf47f59-rcbzn"] Mar 14 07:02:50 crc kubenswrapper[5129]: I0314 07:02:50.654549 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" podUID="5386c646-51eb-4c9a-9aea-530aaed9f5c0" containerName="controller-manager" containerID="cri-o://05d0c08a52f41697219d175c2b2fefcbe9e9aa33fec1defa4a4e67697c8e632f" gracePeriod=30 Mar 14 07:02:50 crc kubenswrapper[5129]: I0314 07:02:50.679990 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82"] Mar 14 07:02:50 crc kubenswrapper[5129]: I0314 07:02:50.680218 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" podUID="16dc5638-0119-4e96-aa52-e6a61f6aa25e" containerName="route-controller-manager" containerID="cri-o://caf709c2a94fc19a4261f726bebf0a7b0b68621ec0490bb5a5f7a39d22ecdfb1" gracePeriod=30 Mar 14 07:02:50 crc kubenswrapper[5129]: E0314 07:02:50.852841 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-glrkk" 
podUID="35d61c74-41cf-4ed5-a0c0-ffae19f82be0" Mar 14 07:02:51 crc kubenswrapper[5129]: I0314 07:02:51.383084 5129 generic.go:334] "Generic (PLEG): container finished" podID="5386c646-51eb-4c9a-9aea-530aaed9f5c0" containerID="05d0c08a52f41697219d175c2b2fefcbe9e9aa33fec1defa4a4e67697c8e632f" exitCode=0 Mar 14 07:02:51 crc kubenswrapper[5129]: I0314 07:02:51.383150 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" event={"ID":"5386c646-51eb-4c9a-9aea-530aaed9f5c0","Type":"ContainerDied","Data":"05d0c08a52f41697219d175c2b2fefcbe9e9aa33fec1defa4a4e67697c8e632f"} Mar 14 07:02:51 crc kubenswrapper[5129]: I0314 07:02:51.384841 5129 generic.go:334] "Generic (PLEG): container finished" podID="16dc5638-0119-4e96-aa52-e6a61f6aa25e" containerID="caf709c2a94fc19a4261f726bebf0a7b0b68621ec0490bb5a5f7a39d22ecdfb1" exitCode=0 Mar 14 07:02:51 crc kubenswrapper[5129]: I0314 07:02:51.384883 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" event={"ID":"16dc5638-0119-4e96-aa52-e6a61f6aa25e","Type":"ContainerDied","Data":"caf709c2a94fc19a4261f726bebf0a7b0b68621ec0490bb5a5f7a39d22ecdfb1"} Mar 14 07:02:52 crc kubenswrapper[5129]: I0314 07:02:52.147162 5129 patch_prober.go:28] interesting pod/route-controller-manager-6bdb4cfc77-xxf82 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" start-of-body= Mar 14 07:02:52 crc kubenswrapper[5129]: I0314 07:02:52.147326 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" podUID="16dc5638-0119-4e96-aa52-e6a61f6aa25e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 
10.217.0.46:8443: connect: connection refused" Mar 14 07:02:52 crc kubenswrapper[5129]: I0314 07:02:52.520594 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:02:52 crc kubenswrapper[5129]: I0314 07:02:52.654289 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:52 crc kubenswrapper[5129]: I0314 07:02:52.658350 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:02:53 crc kubenswrapper[5129]: I0314 07:02:53.425099 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l2tzv"] Mar 14 07:02:53 crc kubenswrapper[5129]: I0314 07:02:53.738735 5129 patch_prober.go:28] interesting pod/controller-manager-77fdf47f59-rcbzn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 14 07:02:53 crc kubenswrapper[5129]: I0314 07:02:53.738793 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" podUID="5386c646-51eb-4c9a-9aea-530aaed9f5c0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 14 07:02:56 crc kubenswrapper[5129]: E0314 07:02:56.802793 5129 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 14 07:02:56 crc kubenswrapper[5129]: E0314 07:02:56.803264 5129 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 07:02:56 crc kubenswrapper[5129]: 
container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 14 07:02:56 crc kubenswrapper[5129]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7lfsd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29557862-lf4bl_openshift-infra(e9e49a0a-8f9f-4e78-8098-d195fe3297bd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 14 07:02:56 crc kubenswrapper[5129]: > logger="UnhandledError" Mar 14 07:02:56 crc kubenswrapper[5129]: E0314 07:02:56.804765 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29557862-lf4bl" podUID="e9e49a0a-8f9f-4e78-8098-d195fe3297bd" Mar 14 07:02:57 crc kubenswrapper[5129]: E0314 07:02:57.425014 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" 
pod="openshift-infra/auto-csr-approver-29557862-lf4bl" podUID="e9e49a0a-8f9f-4e78-8098-d195fe3297bd" Mar 14 07:03:01 crc kubenswrapper[5129]: W0314 07:03:01.671124 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffc61f17_7577_4872_ad5d_7b33780d3d21.slice/crio-db3843b696f5d3166e1521e447b4ef228227857a78d3c073fc85b87e992296f9 WatchSource:0}: Error finding container db3843b696f5d3166e1521e447b4ef228227857a78d3c073fc85b87e992296f9: Status 404 returned error can't find the container with id db3843b696f5d3166e1521e447b4ef228227857a78d3c073fc85b87e992296f9 Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.710771 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.736092 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf"] Mar 14 07:03:01 crc kubenswrapper[5129]: E0314 07:03:01.736343 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16dc5638-0119-4e96-aa52-e6a61f6aa25e" containerName="route-controller-manager" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.736357 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="16dc5638-0119-4e96-aa52-e6a61f6aa25e" containerName="route-controller-manager" Mar 14 07:03:01 crc kubenswrapper[5129]: E0314 07:03:01.736375 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cf17a99-0755-477f-9be5-50de08b6c814" containerName="pruner" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.736382 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cf17a99-0755-477f-9be5-50de08b6c814" containerName="pruner" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.736504 5129 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1cf17a99-0755-477f-9be5-50de08b6c814" containerName="pruner" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.736525 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="16dc5638-0119-4e96-aa52-e6a61f6aa25e" containerName="route-controller-manager" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.736950 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.752853 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf"] Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.887901 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dc5638-0119-4e96-aa52-e6a61f6aa25e-client-ca\") pod \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\" (UID: \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\") " Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.888023 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dc5638-0119-4e96-aa52-e6a61f6aa25e-serving-cert\") pod \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\" (UID: \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\") " Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.888084 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hvr4\" (UniqueName: \"kubernetes.io/projected/16dc5638-0119-4e96-aa52-e6a61f6aa25e-kube-api-access-4hvr4\") pod \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\" (UID: \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\") " Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.888114 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/16dc5638-0119-4e96-aa52-e6a61f6aa25e-config\") pod \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\" (UID: \"16dc5638-0119-4e96-aa52-e6a61f6aa25e\") " Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.888318 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a59825e2-b54f-4438-9a12-be640d8baf2b-serving-cert\") pod \"route-controller-manager-ff9f59bdb-6vgpf\" (UID: \"a59825e2-b54f-4438-9a12-be640d8baf2b\") " pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.888361 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rh7c\" (UniqueName: \"kubernetes.io/projected/a59825e2-b54f-4438-9a12-be640d8baf2b-kube-api-access-5rh7c\") pod \"route-controller-manager-ff9f59bdb-6vgpf\" (UID: \"a59825e2-b54f-4438-9a12-be640d8baf2b\") " pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.888412 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a59825e2-b54f-4438-9a12-be640d8baf2b-client-ca\") pod \"route-controller-manager-ff9f59bdb-6vgpf\" (UID: \"a59825e2-b54f-4438-9a12-be640d8baf2b\") " pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.888838 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16dc5638-0119-4e96-aa52-e6a61f6aa25e-config" (OuterVolumeSpecName: "config") pod "16dc5638-0119-4e96-aa52-e6a61f6aa25e" (UID: "16dc5638-0119-4e96-aa52-e6a61f6aa25e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.889025 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59825e2-b54f-4438-9a12-be640d8baf2b-config\") pod \"route-controller-manager-ff9f59bdb-6vgpf\" (UID: \"a59825e2-b54f-4438-9a12-be640d8baf2b\") " pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.889233 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dc5638-0119-4e96-aa52-e6a61f6aa25e-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.889327 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16dc5638-0119-4e96-aa52-e6a61f6aa25e-client-ca" (OuterVolumeSpecName: "client-ca") pod "16dc5638-0119-4e96-aa52-e6a61f6aa25e" (UID: "16dc5638-0119-4e96-aa52-e6a61f6aa25e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.894256 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16dc5638-0119-4e96-aa52-e6a61f6aa25e-kube-api-access-4hvr4" (OuterVolumeSpecName: "kube-api-access-4hvr4") pod "16dc5638-0119-4e96-aa52-e6a61f6aa25e" (UID: "16dc5638-0119-4e96-aa52-e6a61f6aa25e"). InnerVolumeSpecName "kube-api-access-4hvr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.896547 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16dc5638-0119-4e96-aa52-e6a61f6aa25e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "16dc5638-0119-4e96-aa52-e6a61f6aa25e" (UID: "16dc5638-0119-4e96-aa52-e6a61f6aa25e"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.990557 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a59825e2-b54f-4438-9a12-be640d8baf2b-serving-cert\") pod \"route-controller-manager-ff9f59bdb-6vgpf\" (UID: \"a59825e2-b54f-4438-9a12-be640d8baf2b\") " pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.990899 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rh7c\" (UniqueName: \"kubernetes.io/projected/a59825e2-b54f-4438-9a12-be640d8baf2b-kube-api-access-5rh7c\") pod \"route-controller-manager-ff9f59bdb-6vgpf\" (UID: \"a59825e2-b54f-4438-9a12-be640d8baf2b\") " pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.991075 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a59825e2-b54f-4438-9a12-be640d8baf2b-client-ca\") pod \"route-controller-manager-ff9f59bdb-6vgpf\" (UID: \"a59825e2-b54f-4438-9a12-be640d8baf2b\") " pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.991225 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59825e2-b54f-4438-9a12-be640d8baf2b-config\") pod \"route-controller-manager-ff9f59bdb-6vgpf\" (UID: \"a59825e2-b54f-4438-9a12-be640d8baf2b\") " pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.991363 5129 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/16dc5638-0119-4e96-aa52-e6a61f6aa25e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.991495 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dc5638-0119-4e96-aa52-e6a61f6aa25e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:01 crc kubenswrapper[5129]: I0314 07:03:01.991580 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hvr4\" (UniqueName: \"kubernetes.io/projected/16dc5638-0119-4e96-aa52-e6a61f6aa25e-kube-api-access-4hvr4\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:02 crc kubenswrapper[5129]: I0314 07:03:02.013272 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a59825e2-b54f-4438-9a12-be640d8baf2b-client-ca\") pod \"route-controller-manager-ff9f59bdb-6vgpf\" (UID: \"a59825e2-b54f-4438-9a12-be640d8baf2b\") " pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" Mar 14 07:03:02 crc kubenswrapper[5129]: I0314 07:03:02.013533 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a59825e2-b54f-4438-9a12-be640d8baf2b-serving-cert\") pod \"route-controller-manager-ff9f59bdb-6vgpf\" (UID: \"a59825e2-b54f-4438-9a12-be640d8baf2b\") " pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" Mar 14 07:03:02 crc kubenswrapper[5129]: I0314 07:03:02.014502 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59825e2-b54f-4438-9a12-be640d8baf2b-config\") pod \"route-controller-manager-ff9f59bdb-6vgpf\" (UID: \"a59825e2-b54f-4438-9a12-be640d8baf2b\") " pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" Mar 14 07:03:02 crc kubenswrapper[5129]: I0314 07:03:02.031117 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rh7c\" (UniqueName: \"kubernetes.io/projected/a59825e2-b54f-4438-9a12-be640d8baf2b-kube-api-access-5rh7c\") pod \"route-controller-manager-ff9f59bdb-6vgpf\" (UID: \"a59825e2-b54f-4438-9a12-be640d8baf2b\") " pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" Mar 14 07:03:02 crc kubenswrapper[5129]: I0314 07:03:02.061197 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" Mar 14 07:03:02 crc kubenswrapper[5129]: I0314 07:03:02.445802 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" event={"ID":"ffc61f17-7577-4872-ad5d-7b33780d3d21","Type":"ContainerStarted","Data":"db3843b696f5d3166e1521e447b4ef228227857a78d3c073fc85b87e992296f9"} Mar 14 07:03:02 crc kubenswrapper[5129]: I0314 07:03:02.447572 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" event={"ID":"16dc5638-0119-4e96-aa52-e6a61f6aa25e","Type":"ContainerDied","Data":"f989e9b688031c74f8d33332db8d2d5838476a7bfec2b309fb5446a2508c3f81"} Mar 14 07:03:02 crc kubenswrapper[5129]: I0314 07:03:02.447765 5129 scope.go:117] "RemoveContainer" containerID="caf709c2a94fc19a4261f726bebf0a7b0b68621ec0490bb5a5f7a39d22ecdfb1" Mar 14 07:03:02 crc kubenswrapper[5129]: I0314 07:03:02.448047 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82" Mar 14 07:03:02 crc kubenswrapper[5129]: I0314 07:03:02.468634 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82"] Mar 14 07:03:02 crc kubenswrapper[5129]: I0314 07:03:02.472463 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bdb4cfc77-xxf82"] Mar 14 07:03:03 crc kubenswrapper[5129]: I0314 07:03:03.889769 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tm2rg" Mar 14 07:03:04 crc kubenswrapper[5129]: I0314 07:03:04.043155 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16dc5638-0119-4e96-aa52-e6a61f6aa25e" path="/var/lib/kubelet/pods/16dc5638-0119-4e96-aa52-e6a61f6aa25e/volumes" Mar 14 07:03:04 crc kubenswrapper[5129]: I0314 07:03:04.739271 5129 patch_prober.go:28] interesting pod/controller-manager-77fdf47f59-rcbzn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 07:03:04 crc kubenswrapper[5129]: I0314 07:03:04.739359 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" podUID="5386c646-51eb-4c9a-9aea-530aaed9f5c0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.184529 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.225293 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx"] Mar 14 07:03:05 crc kubenswrapper[5129]: E0314 07:03:05.226082 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5386c646-51eb-4c9a-9aea-530aaed9f5c0" containerName="controller-manager" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.226111 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="5386c646-51eb-4c9a-9aea-530aaed9f5c0" containerName="controller-manager" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.226276 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="5386c646-51eb-4c9a-9aea-530aaed9f5c0" containerName="controller-manager" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.227130 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.229217 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld4j9\" (UniqueName: \"kubernetes.io/projected/5386c646-51eb-4c9a-9aea-530aaed9f5c0-kube-api-access-ld4j9\") pod \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.229295 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5386c646-51eb-4c9a-9aea-530aaed9f5c0-config\") pod \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.229498 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c307bc2f-3f41-4103-8a67-6947b51a2dcf-proxy-ca-bundles\") pod \"controller-manager-7bb57d5c57-m9pzx\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.229529 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c307bc2f-3f41-4103-8a67-6947b51a2dcf-client-ca\") pod \"controller-manager-7bb57d5c57-m9pzx\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.229557 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx572\" (UniqueName: \"kubernetes.io/projected/c307bc2f-3f41-4103-8a67-6947b51a2dcf-kube-api-access-kx572\") pod 
\"controller-manager-7bb57d5c57-m9pzx\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.229586 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c307bc2f-3f41-4103-8a67-6947b51a2dcf-serving-cert\") pod \"controller-manager-7bb57d5c57-m9pzx\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.229641 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c307bc2f-3f41-4103-8a67-6947b51a2dcf-config\") pod \"controller-manager-7bb57d5c57-m9pzx\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.231402 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx"] Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.232158 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5386c646-51eb-4c9a-9aea-530aaed9f5c0-config" (OuterVolumeSpecName: "config") pod "5386c646-51eb-4c9a-9aea-530aaed9f5c0" (UID: "5386c646-51eb-4c9a-9aea-530aaed9f5c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.236592 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5386c646-51eb-4c9a-9aea-530aaed9f5c0-kube-api-access-ld4j9" (OuterVolumeSpecName: "kube-api-access-ld4j9") pod "5386c646-51eb-4c9a-9aea-530aaed9f5c0" (UID: "5386c646-51eb-4c9a-9aea-530aaed9f5c0"). 
InnerVolumeSpecName "kube-api-access-ld4j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.330318 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5386c646-51eb-4c9a-9aea-530aaed9f5c0-proxy-ca-bundles\") pod \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.330412 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5386c646-51eb-4c9a-9aea-530aaed9f5c0-client-ca\") pod \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.330474 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5386c646-51eb-4c9a-9aea-530aaed9f5c0-serving-cert\") pod \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\" (UID: \"5386c646-51eb-4c9a-9aea-530aaed9f5c0\") " Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.330651 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c307bc2f-3f41-4103-8a67-6947b51a2dcf-proxy-ca-bundles\") pod \"controller-manager-7bb57d5c57-m9pzx\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.330695 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c307bc2f-3f41-4103-8a67-6947b51a2dcf-client-ca\") pod \"controller-manager-7bb57d5c57-m9pzx\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 
07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.330726 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx572\" (UniqueName: \"kubernetes.io/projected/c307bc2f-3f41-4103-8a67-6947b51a2dcf-kube-api-access-kx572\") pod \"controller-manager-7bb57d5c57-m9pzx\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.331118 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c307bc2f-3f41-4103-8a67-6947b51a2dcf-serving-cert\") pod \"controller-manager-7bb57d5c57-m9pzx\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.331274 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c307bc2f-3f41-4103-8a67-6947b51a2dcf-config\") pod \"controller-manager-7bb57d5c57-m9pzx\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.331337 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld4j9\" (UniqueName: \"kubernetes.io/projected/5386c646-51eb-4c9a-9aea-530aaed9f5c0-kube-api-access-ld4j9\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.331360 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5386c646-51eb-4c9a-9aea-530aaed9f5c0-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.332110 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c307bc2f-3f41-4103-8a67-6947b51a2dcf-client-ca\") pod \"controller-manager-7bb57d5c57-m9pzx\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.332616 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c307bc2f-3f41-4103-8a67-6947b51a2dcf-config\") pod \"controller-manager-7bb57d5c57-m9pzx\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.333665 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c307bc2f-3f41-4103-8a67-6947b51a2dcf-proxy-ca-bundles\") pod \"controller-manager-7bb57d5c57-m9pzx\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.333989 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5386c646-51eb-4c9a-9aea-530aaed9f5c0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5386c646-51eb-4c9a-9aea-530aaed9f5c0" (UID: "5386c646-51eb-4c9a-9aea-530aaed9f5c0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.334259 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5386c646-51eb-4c9a-9aea-530aaed9f5c0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5386c646-51eb-4c9a-9aea-530aaed9f5c0" (UID: "5386c646-51eb-4c9a-9aea-530aaed9f5c0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.334380 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5386c646-51eb-4c9a-9aea-530aaed9f5c0-client-ca" (OuterVolumeSpecName: "client-ca") pod "5386c646-51eb-4c9a-9aea-530aaed9f5c0" (UID: "5386c646-51eb-4c9a-9aea-530aaed9f5c0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.335465 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c307bc2f-3f41-4103-8a67-6947b51a2dcf-serving-cert\") pod \"controller-manager-7bb57d5c57-m9pzx\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.346504 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx572\" (UniqueName: \"kubernetes.io/projected/c307bc2f-3f41-4103-8a67-6947b51a2dcf-kube-api-access-kx572\") pod \"controller-manager-7bb57d5c57-m9pzx\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.432869 5129 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5386c646-51eb-4c9a-9aea-530aaed9f5c0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.432900 5129 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5386c646-51eb-4c9a-9aea-530aaed9f5c0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.432908 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5386c646-51eb-4c9a-9aea-530aaed9f5c0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.464507 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" event={"ID":"5386c646-51eb-4c9a-9aea-530aaed9f5c0","Type":"ContainerDied","Data":"912075cda309e43fca7f54b2464a99264310c69538cd34b6719d3d47f041db1e"} Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.464662 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77fdf47f59-rcbzn" Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.491553 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77fdf47f59-rcbzn"] Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.494198 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77fdf47f59-rcbzn"] Mar 14 07:03:05 crc kubenswrapper[5129]: I0314 07:03:05.570353 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:06 crc kubenswrapper[5129]: I0314 07:03:06.011591 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 07:03:06 crc kubenswrapper[5129]: I0314 07:03:06.046995 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5386c646-51eb-4c9a-9aea-530aaed9f5c0" path="/var/lib/kubelet/pods/5386c646-51eb-4c9a-9aea-530aaed9f5c0/volumes" Mar 14 07:03:06 crc kubenswrapper[5129]: I0314 07:03:06.518189 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 14 07:03:06 crc kubenswrapper[5129]: I0314 07:03:06.518902 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:03:06 crc kubenswrapper[5129]: I0314 07:03:06.522488 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 14 07:03:06 crc kubenswrapper[5129]: I0314 07:03:06.522556 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 14 07:03:06 crc kubenswrapper[5129]: I0314 07:03:06.525751 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 14 07:03:06 crc kubenswrapper[5129]: I0314 07:03:06.650904 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d9029912-be8a-48be-8881-79b83ec8b2e9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d9029912-be8a-48be-8881-79b83ec8b2e9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:03:06 crc kubenswrapper[5129]: I0314 07:03:06.651014 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9029912-be8a-48be-8881-79b83ec8b2e9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d9029912-be8a-48be-8881-79b83ec8b2e9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:03:06 crc kubenswrapper[5129]: I0314 07:03:06.751712 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9029912-be8a-48be-8881-79b83ec8b2e9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d9029912-be8a-48be-8881-79b83ec8b2e9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:03:06 crc kubenswrapper[5129]: I0314 07:03:06.751788 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/d9029912-be8a-48be-8881-79b83ec8b2e9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d9029912-be8a-48be-8881-79b83ec8b2e9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:03:06 crc kubenswrapper[5129]: I0314 07:03:06.751884 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d9029912-be8a-48be-8881-79b83ec8b2e9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d9029912-be8a-48be-8881-79b83ec8b2e9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:03:06 crc kubenswrapper[5129]: I0314 07:03:06.767985 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9029912-be8a-48be-8881-79b83ec8b2e9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d9029912-be8a-48be-8881-79b83ec8b2e9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:03:06 crc kubenswrapper[5129]: I0314 07:03:06.837857 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:03:10 crc kubenswrapper[5129]: E0314 07:03:10.197283 5129 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 14 07:03:10 crc kubenswrapper[5129]: E0314 07:03:10.197697 5129 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nwfmb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-6t6l8_openshift-marketplace(a848f19c-da50-403e-b620-5425b51fab9a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 07:03:10 crc kubenswrapper[5129]: E0314 07:03:10.199669 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-6t6l8" podUID="a848f19c-da50-403e-b620-5425b51fab9a" Mar 14 07:03:10 crc kubenswrapper[5129]: I0314 07:03:10.659889 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx"] Mar 14 07:03:10 crc kubenswrapper[5129]: I0314 07:03:10.751425 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf"] Mar 14 07:03:12 crc kubenswrapper[5129]: E0314 07:03:12.036867 5129 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 14 07:03:12 crc kubenswrapper[5129]: E0314 07:03:12.037029 5129 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vsxdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-gw4hz_openshift-marketplace(ff0704b1-3b5e-4e02-9f82-c1d74ad03387): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 07:03:12 crc kubenswrapper[5129]: E0314 07:03:12.038179 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-gw4hz" podUID="ff0704b1-3b5e-4e02-9f82-c1d74ad03387" Mar 14 07:03:12 crc 
kubenswrapper[5129]: I0314 07:03:12.124038 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 14 07:03:12 crc kubenswrapper[5129]: I0314 07:03:12.125375 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:03:12 crc kubenswrapper[5129]: I0314 07:03:12.130411 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 14 07:03:12 crc kubenswrapper[5129]: I0314 07:03:12.226845 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7526b6c-47da-49a7-8751-fb1a037a3082-var-lock\") pod \"installer-9-crc\" (UID: \"b7526b6c-47da-49a7-8751-fb1a037a3082\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:03:12 crc kubenswrapper[5129]: I0314 07:03:12.226947 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7526b6c-47da-49a7-8751-fb1a037a3082-kube-api-access\") pod \"installer-9-crc\" (UID: \"b7526b6c-47da-49a7-8751-fb1a037a3082\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:03:12 crc kubenswrapper[5129]: I0314 07:03:12.226990 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7526b6c-47da-49a7-8751-fb1a037a3082-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b7526b6c-47da-49a7-8751-fb1a037a3082\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:03:12 crc kubenswrapper[5129]: I0314 07:03:12.328314 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7526b6c-47da-49a7-8751-fb1a037a3082-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"b7526b6c-47da-49a7-8751-fb1a037a3082\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:03:12 crc kubenswrapper[5129]: I0314 07:03:12.328364 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7526b6c-47da-49a7-8751-fb1a037a3082-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b7526b6c-47da-49a7-8751-fb1a037a3082\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:03:12 crc kubenswrapper[5129]: I0314 07:03:12.328403 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7526b6c-47da-49a7-8751-fb1a037a3082-var-lock\") pod \"installer-9-crc\" (UID: \"b7526b6c-47da-49a7-8751-fb1a037a3082\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:03:12 crc kubenswrapper[5129]: I0314 07:03:12.328471 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7526b6c-47da-49a7-8751-fb1a037a3082-var-lock\") pod \"installer-9-crc\" (UID: \"b7526b6c-47da-49a7-8751-fb1a037a3082\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:03:12 crc kubenswrapper[5129]: I0314 07:03:12.328483 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7526b6c-47da-49a7-8751-fb1a037a3082-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b7526b6c-47da-49a7-8751-fb1a037a3082\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:03:12 crc kubenswrapper[5129]: I0314 07:03:12.345970 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7526b6c-47da-49a7-8751-fb1a037a3082-kube-api-access\") pod \"installer-9-crc\" (UID: \"b7526b6c-47da-49a7-8751-fb1a037a3082\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:03:12 crc kubenswrapper[5129]: I0314 07:03:12.496661 5129 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:03:14 crc kubenswrapper[5129]: E0314 07:03:14.359971 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-gw4hz" podUID="ff0704b1-3b5e-4e02-9f82-c1d74ad03387" Mar 14 07:03:14 crc kubenswrapper[5129]: E0314 07:03:14.360140 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-6t6l8" podUID="a848f19c-da50-403e-b620-5425b51fab9a" Mar 14 07:03:14 crc kubenswrapper[5129]: E0314 07:03:14.388093 5129 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 14 07:03:14 crc kubenswrapper[5129]: E0314 07:03:14.388400 5129 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5vt7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-rxbtj_openshift-marketplace(8e8571cc-ed81-4074-a9e7-24f81fa725f0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 07:03:14 crc kubenswrapper[5129]: E0314 07:03:14.389785 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-rxbtj" podUID="8e8571cc-ed81-4074-a9e7-24f81fa725f0" Mar 14 07:03:14 crc 
kubenswrapper[5129]: E0314 07:03:14.476713 5129 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 14 07:03:14 crc kubenswrapper[5129]: E0314 07:03:14.476910 5129 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6d764,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-gqcxv_openshift-marketplace(ee9d7764-4db5-49d9-9b08-7b2317ec41ca): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 07:03:14 crc kubenswrapper[5129]: E0314 07:03:14.478056 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-gqcxv" podUID="ee9d7764-4db5-49d9-9b08-7b2317ec41ca" Mar 14 07:03:15 crc kubenswrapper[5129]: E0314 07:03:15.852876 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-gqcxv" podUID="ee9d7764-4db5-49d9-9b08-7b2317ec41ca" Mar 14 07:03:15 crc kubenswrapper[5129]: E0314 07:03:15.853175 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-rxbtj" podUID="8e8571cc-ed81-4074-a9e7-24f81fa725f0" Mar 14 07:03:15 crc kubenswrapper[5129]: I0314 07:03:15.867239 5129 scope.go:117] "RemoveContainer" containerID="05d0c08a52f41697219d175c2b2fefcbe9e9aa33fec1defa4a4e67697c8e632f" Mar 14 07:03:15 crc kubenswrapper[5129]: E0314 07:03:15.925269 5129 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 14 07:03:15 crc kubenswrapper[5129]: E0314 07:03:15.925426 5129 kuberuntime_manager.go:1274] 
"Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ft5dc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mdqck_openshift-marketplace(2cdca220-d63a-45e2-ad4e-d2b920554116): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 07:03:15 crc kubenswrapper[5129]: E0314 07:03:15.926776 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mdqck" podUID="2cdca220-d63a-45e2-ad4e-d2b920554116" Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.336215 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf"] Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.431338 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.449735 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.452314 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx"] Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.542831 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" event={"ID":"c307bc2f-3f41-4103-8a67-6947b51a2dcf","Type":"ContainerStarted","Data":"398d7be71c4835b3175f356aada6ee5dec73222fadb7756c2964ee7c7c2938f0"} Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.545176 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" event={"ID":"ffc61f17-7577-4872-ad5d-7b33780d3d21","Type":"ContainerStarted","Data":"4a0dfeea9165e577739a16a9bad80473fa7bc91beb6028e76bd59e2d03d15366"} Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.545209 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l2tzv" event={"ID":"ffc61f17-7577-4872-ad5d-7b33780d3d21","Type":"ContainerStarted","Data":"eaafffea2a378b25b481708349e83e950eb1104d42b862f0d2ec097d07303781"} Mar 14 07:03:16 crc 
kubenswrapper[5129]: I0314 07:03:16.549545 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg5m2" event={"ID":"647966c7-67bc-4945-a281-477f0f83496e","Type":"ContainerStarted","Data":"f9df24927cdb20da53feb58f0a5c808ef05ba7dfb79907d29aa6d9181f0cdeff"} Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.562522 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-l2tzv" podStartSLOduration=209.562505886 podStartE2EDuration="3m29.562505886s" podCreationTimestamp="2026-03-14 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:03:16.560384564 +0000 UTC m=+259.312299758" watchObservedRunningTime="2026-03-14 07:03:16.562505886 +0000 UTC m=+259.314421070" Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.564847 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d9029912-be8a-48be-8881-79b83ec8b2e9","Type":"ContainerStarted","Data":"0a549d9b9a260923e2e8db25b54772a8aa46e44fa860c8bfffccd9d199f8a536"} Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.582141 5129 generic.go:334] "Generic (PLEG): container finished" podID="7470c785-0846-4e7f-9d1b-aad9a52e5a5b" containerID="ea9694764b338ca7e1d75541f494c63af2ba5ec628d9c5e5583bcdf0d960fc2b" exitCode=0 Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.582522 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mmjl" event={"ID":"7470c785-0846-4e7f-9d1b-aad9a52e5a5b","Type":"ContainerDied","Data":"ea9694764b338ca7e1d75541f494c63af2ba5ec628d9c5e5583bcdf0d960fc2b"} Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.608316 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" 
event={"ID":"a59825e2-b54f-4438-9a12-be640d8baf2b","Type":"ContainerStarted","Data":"ecb2b116753bc313f5a7de4ffd7975c21d472746777cd474d32a30093621834c"} Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.608362 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" event={"ID":"a59825e2-b54f-4438-9a12-be640d8baf2b","Type":"ContainerStarted","Data":"c735e5c03b57f6226cfd5acd15fc24583ae4d5a53ca094626b2e44e319e846c0"} Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.608485 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" podUID="a59825e2-b54f-4438-9a12-be640d8baf2b" containerName="route-controller-manager" containerID="cri-o://ecb2b116753bc313f5a7de4ffd7975c21d472746777cd474d32a30093621834c" gracePeriod=30 Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.609122 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.617988 5129 patch_prober.go:28] interesting pod/route-controller-manager-ff9f59bdb-6vgpf container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body= Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.618040 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" podUID="a59825e2-b54f-4438-9a12-be640d8baf2b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.623230 5129 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557862-lf4bl" event={"ID":"e9e49a0a-8f9f-4e78-8098-d195fe3297bd","Type":"ContainerStarted","Data":"39befd089a33e2ffb58292b85b60ae481731877a002c54267b4a001c86e775ab"} Mar 14 07:03:16 crc kubenswrapper[5129]: E0314 07:03:16.647791 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mdqck" podUID="2cdca220-d63a-45e2-ad4e-d2b920554116" Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.651655 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b7526b6c-47da-49a7-8751-fb1a037a3082","Type":"ContainerStarted","Data":"e04861ed52d15214a2454d2fd113731e67a83588963d524363e908d2ecc48f8f"} Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.672377 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" podStartSLOduration=26.672360875 podStartE2EDuration="26.672360875s" podCreationTimestamp="2026-03-14 07:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:03:16.668497574 +0000 UTC m=+259.420412758" watchObservedRunningTime="2026-03-14 07:03:16.672360875 +0000 UTC m=+259.424276049" Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.693325 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557862-lf4bl" podStartSLOduration=27.948530728 podStartE2EDuration="1m16.69330298s" podCreationTimestamp="2026-03-14 07:02:00 +0000 UTC" firstStartedPulling="2026-03-14 07:02:27.352641335 +0000 UTC m=+210.104556519" lastFinishedPulling="2026-03-14 
07:03:16.097413587 +0000 UTC m=+258.849328771" observedRunningTime="2026-03-14 07:03:16.690334744 +0000 UTC m=+259.442249938" watchObservedRunningTime="2026-03-14 07:03:16.69330298 +0000 UTC m=+259.445218164" Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.980504 5129 csr.go:261] certificate signing request csr-rrkzx is approved, waiting to be issued Mar 14 07:03:16 crc kubenswrapper[5129]: I0314 07:03:16.994579 5129 csr.go:257] certificate signing request csr-rrkzx is issued Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.000260 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-ff9f59bdb-6vgpf_a59825e2-b54f-4438-9a12-be640d8baf2b/route-controller-manager/0.log" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.000349 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.029617 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh"] Mar 14 07:03:17 crc kubenswrapper[5129]: E0314 07:03:17.029862 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59825e2-b54f-4438-9a12-be640d8baf2b" containerName="route-controller-manager" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.029874 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59825e2-b54f-4438-9a12-be640d8baf2b" containerName="route-controller-manager" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.029957 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="a59825e2-b54f-4438-9a12-be640d8baf2b" containerName="route-controller-manager" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.030356 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.043660 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh"] Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.187518 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a59825e2-b54f-4438-9a12-be640d8baf2b-client-ca\") pod \"a59825e2-b54f-4438-9a12-be640d8baf2b\" (UID: \"a59825e2-b54f-4438-9a12-be640d8baf2b\") " Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.187577 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a59825e2-b54f-4438-9a12-be640d8baf2b-serving-cert\") pod \"a59825e2-b54f-4438-9a12-be640d8baf2b\" (UID: \"a59825e2-b54f-4438-9a12-be640d8baf2b\") " Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.187668 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59825e2-b54f-4438-9a12-be640d8baf2b-config\") pod \"a59825e2-b54f-4438-9a12-be640d8baf2b\" (UID: \"a59825e2-b54f-4438-9a12-be640d8baf2b\") " Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.187712 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rh7c\" (UniqueName: \"kubernetes.io/projected/a59825e2-b54f-4438-9a12-be640d8baf2b-kube-api-access-5rh7c\") pod \"a59825e2-b54f-4438-9a12-be640d8baf2b\" (UID: \"a59825e2-b54f-4438-9a12-be640d8baf2b\") " Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.187858 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjkkh\" (UniqueName: \"kubernetes.io/projected/4b1e1421-7470-47c1-bb48-966ef694890e-kube-api-access-jjkkh\") 
pod \"route-controller-manager-6fd9dcb4c5-hg6sh\" (UID: \"4b1e1421-7470-47c1-bb48-966ef694890e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.187918 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b1e1421-7470-47c1-bb48-966ef694890e-config\") pod \"route-controller-manager-6fd9dcb4c5-hg6sh\" (UID: \"4b1e1421-7470-47c1-bb48-966ef694890e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.188091 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b1e1421-7470-47c1-bb48-966ef694890e-serving-cert\") pod \"route-controller-manager-6fd9dcb4c5-hg6sh\" (UID: \"4b1e1421-7470-47c1-bb48-966ef694890e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.188152 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b1e1421-7470-47c1-bb48-966ef694890e-client-ca\") pod \"route-controller-manager-6fd9dcb4c5-hg6sh\" (UID: \"4b1e1421-7470-47c1-bb48-966ef694890e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.188663 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a59825e2-b54f-4438-9a12-be640d8baf2b-client-ca" (OuterVolumeSpecName: "client-ca") pod "a59825e2-b54f-4438-9a12-be640d8baf2b" (UID: "a59825e2-b54f-4438-9a12-be640d8baf2b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.189151 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a59825e2-b54f-4438-9a12-be640d8baf2b-config" (OuterVolumeSpecName: "config") pod "a59825e2-b54f-4438-9a12-be640d8baf2b" (UID: "a59825e2-b54f-4438-9a12-be640d8baf2b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.193282 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a59825e2-b54f-4438-9a12-be640d8baf2b-kube-api-access-5rh7c" (OuterVolumeSpecName: "kube-api-access-5rh7c") pod "a59825e2-b54f-4438-9a12-be640d8baf2b" (UID: "a59825e2-b54f-4438-9a12-be640d8baf2b"). InnerVolumeSpecName "kube-api-access-5rh7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.193738 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a59825e2-b54f-4438-9a12-be640d8baf2b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a59825e2-b54f-4438-9a12-be640d8baf2b" (UID: "a59825e2-b54f-4438-9a12-be640d8baf2b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.289433 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjkkh\" (UniqueName: \"kubernetes.io/projected/4b1e1421-7470-47c1-bb48-966ef694890e-kube-api-access-jjkkh\") pod \"route-controller-manager-6fd9dcb4c5-hg6sh\" (UID: \"4b1e1421-7470-47c1-bb48-966ef694890e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.289799 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b1e1421-7470-47c1-bb48-966ef694890e-config\") pod \"route-controller-manager-6fd9dcb4c5-hg6sh\" (UID: \"4b1e1421-7470-47c1-bb48-966ef694890e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.289833 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b1e1421-7470-47c1-bb48-966ef694890e-serving-cert\") pod \"route-controller-manager-6fd9dcb4c5-hg6sh\" (UID: \"4b1e1421-7470-47c1-bb48-966ef694890e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.289856 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b1e1421-7470-47c1-bb48-966ef694890e-client-ca\") pod \"route-controller-manager-6fd9dcb4c5-hg6sh\" (UID: \"4b1e1421-7470-47c1-bb48-966ef694890e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.289945 5129 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a59825e2-b54f-4438-9a12-be640d8baf2b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.289960 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a59825e2-b54f-4438-9a12-be640d8baf2b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.289971 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59825e2-b54f-4438-9a12-be640d8baf2b-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.289983 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rh7c\" (UniqueName: \"kubernetes.io/projected/a59825e2-b54f-4438-9a12-be640d8baf2b-kube-api-access-5rh7c\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.290909 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b1e1421-7470-47c1-bb48-966ef694890e-client-ca\") pod \"route-controller-manager-6fd9dcb4c5-hg6sh\" (UID: \"4b1e1421-7470-47c1-bb48-966ef694890e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.290955 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b1e1421-7470-47c1-bb48-966ef694890e-config\") pod \"route-controller-manager-6fd9dcb4c5-hg6sh\" (UID: \"4b1e1421-7470-47c1-bb48-966ef694890e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.297010 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b1e1421-7470-47c1-bb48-966ef694890e-serving-cert\") pod 
\"route-controller-manager-6fd9dcb4c5-hg6sh\" (UID: \"4b1e1421-7470-47c1-bb48-966ef694890e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.309571 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjkkh\" (UniqueName: \"kubernetes.io/projected/4b1e1421-7470-47c1-bb48-966ef694890e-kube-api-access-jjkkh\") pod \"route-controller-manager-6fd9dcb4c5-hg6sh\" (UID: \"4b1e1421-7470-47c1-bb48-966ef694890e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.462094 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.650275 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mmjl" event={"ID":"7470c785-0846-4e7f-9d1b-aad9a52e5a5b","Type":"ContainerStarted","Data":"d61fc1270fd328952b9b5573afda891dce6cb6377ccebdc5aa25bb0b5c95a364"} Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.653913 5129 generic.go:334] "Generic (PLEG): container finished" podID="35d61c74-41cf-4ed5-a0c0-ffae19f82be0" containerID="aa76f3ccc9471298dfc4584ee792962bf18e5bf8bd102dec1499950705b5adb5" exitCode=0 Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.653979 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-glrkk" event={"ID":"35d61c74-41cf-4ed5-a0c0-ffae19f82be0","Type":"ContainerDied","Data":"aa76f3ccc9471298dfc4584ee792962bf18e5bf8bd102dec1499950705b5adb5"} Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.656760 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-ff9f59bdb-6vgpf_a59825e2-b54f-4438-9a12-be640d8baf2b/route-controller-manager/0.log" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.656806 5129 generic.go:334] "Generic (PLEG): container finished" podID="a59825e2-b54f-4438-9a12-be640d8baf2b" containerID="ecb2b116753bc313f5a7de4ffd7975c21d472746777cd474d32a30093621834c" exitCode=2 Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.656888 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.656900 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" event={"ID":"a59825e2-b54f-4438-9a12-be640d8baf2b","Type":"ContainerDied","Data":"ecb2b116753bc313f5a7de4ffd7975c21d472746777cd474d32a30093621834c"} Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.656923 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf" event={"ID":"a59825e2-b54f-4438-9a12-be640d8baf2b","Type":"ContainerDied","Data":"c735e5c03b57f6226cfd5acd15fc24583ae4d5a53ca094626b2e44e319e846c0"} Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.656936 5129 scope.go:117] "RemoveContainer" containerID="ecb2b116753bc313f5a7de4ffd7975c21d472746777cd474d32a30093621834c" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.668294 5129 generic.go:334] "Generic (PLEG): container finished" podID="647966c7-67bc-4945-a281-477f0f83496e" containerID="f9df24927cdb20da53feb58f0a5c808ef05ba7dfb79907d29aa6d9181f0cdeff" exitCode=0 Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.668393 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg5m2" 
event={"ID":"647966c7-67bc-4945-a281-477f0f83496e","Type":"ContainerDied","Data":"f9df24927cdb20da53feb58f0a5c808ef05ba7dfb79907d29aa6d9181f0cdeff"} Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.675002 5129 generic.go:334] "Generic (PLEG): container finished" podID="d9029912-be8a-48be-8881-79b83ec8b2e9" containerID="d09d5ee1e4c8a2d87de6a59f301cd00c592b2540279040af2f413b5e681865c0" exitCode=0 Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.675140 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d9029912-be8a-48be-8881-79b83ec8b2e9","Type":"ContainerDied","Data":"d09d5ee1e4c8a2d87de6a59f301cd00c592b2540279040af2f413b5e681865c0"} Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.680358 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8mmjl" podStartSLOduration=2.702478554 podStartE2EDuration="45.680329926s" podCreationTimestamp="2026-03-14 07:02:32 +0000 UTC" firstStartedPulling="2026-03-14 07:02:34.184821459 +0000 UTC m=+216.936736643" lastFinishedPulling="2026-03-14 07:03:17.162672831 +0000 UTC m=+259.914588015" observedRunningTime="2026-03-14 07:03:17.667844075 +0000 UTC m=+260.419759259" watchObservedRunningTime="2026-03-14 07:03:17.680329926 +0000 UTC m=+260.432245120" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.683305 5129 generic.go:334] "Generic (PLEG): container finished" podID="e9e49a0a-8f9f-4e78-8098-d195fe3297bd" containerID="39befd089a33e2ffb58292b85b60ae481731877a002c54267b4a001c86e775ab" exitCode=0 Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.683400 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557862-lf4bl" event={"ID":"e9e49a0a-8f9f-4e78-8098-d195fe3297bd","Type":"ContainerDied","Data":"39befd089a33e2ffb58292b85b60ae481731877a002c54267b4a001c86e775ab"} Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.685254 
5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b7526b6c-47da-49a7-8751-fb1a037a3082","Type":"ContainerStarted","Data":"e66a306547d2d502fbec60b974e1970c8d3c039b87bc4b1b6821208e39fad78c"} Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.687658 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" podUID="c307bc2f-3f41-4103-8a67-6947b51a2dcf" containerName="controller-manager" containerID="cri-o://be6326cd71cd99bb2038d060f997c6476b014008e498a014f6a2aa02a1aefd91" gracePeriod=30 Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.689931 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" event={"ID":"c307bc2f-3f41-4103-8a67-6947b51a2dcf","Type":"ContainerStarted","Data":"be6326cd71cd99bb2038d060f997c6476b014008e498a014f6a2aa02a1aefd91"} Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.689972 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.691497 5129 scope.go:117] "RemoveContainer" containerID="ecb2b116753bc313f5a7de4ffd7975c21d472746777cd474d32a30093621834c" Mar 14 07:03:17 crc kubenswrapper[5129]: E0314 07:03:17.692580 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecb2b116753bc313f5a7de4ffd7975c21d472746777cd474d32a30093621834c\": container with ID starting with ecb2b116753bc313f5a7de4ffd7975c21d472746777cd474d32a30093621834c not found: ID does not exist" containerID="ecb2b116753bc313f5a7de4ffd7975c21d472746777cd474d32a30093621834c" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.692640 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ecb2b116753bc313f5a7de4ffd7975c21d472746777cd474d32a30093621834c"} err="failed to get container status \"ecb2b116753bc313f5a7de4ffd7975c21d472746777cd474d32a30093621834c\": rpc error: code = NotFound desc = could not find container \"ecb2b116753bc313f5a7de4ffd7975c21d472746777cd474d32a30093621834c\": container with ID starting with ecb2b116753bc313f5a7de4ffd7975c21d472746777cd474d32a30093621834c not found: ID does not exist" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.710389 5129 patch_prober.go:28] interesting pod/controller-manager-7bb57d5c57-m9pzx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": read tcp 10.217.0.2:39374->10.217.0.58:8443: read: connection reset by peer" start-of-body= Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.710458 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" podUID="c307bc2f-3f41-4103-8a67-6947b51a2dcf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": read tcp 10.217.0.2:39374->10.217.0.58:8443: read: connection reset by peer" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.729182 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=5.729159164 podStartE2EDuration="5.729159164s" podCreationTimestamp="2026-03-14 07:03:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:03:17.719082714 +0000 UTC m=+260.470997898" watchObservedRunningTime="2026-03-14 07:03:17.729159164 +0000 UTC m=+260.481074348" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.738739 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" podStartSLOduration=27.738724001 podStartE2EDuration="27.738724001s" podCreationTimestamp="2026-03-14 07:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:03:17.734521279 +0000 UTC m=+260.486436483" watchObservedRunningTime="2026-03-14 07:03:17.738724001 +0000 UTC m=+260.490639185" Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.776739 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf"] Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.779526 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff9f59bdb-6vgpf"] Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.866155 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh"] Mar 14 07:03:17 crc kubenswrapper[5129]: W0314 07:03:17.898284 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b1e1421_7470_47c1_bb48_966ef694890e.slice/crio-83852ee339292f7c710c30b5ad4f1becb86248514c172814431495d0b57f1a01 WatchSource:0}: Error finding container 83852ee339292f7c710c30b5ad4f1becb86248514c172814431495d0b57f1a01: Status 404 returned error can't find the container with id 83852ee339292f7c710c30b5ad4f1becb86248514c172814431495d0b57f1a01 Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.995383 5129 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-13 04:36:59.412220961 +0000 UTC Mar 14 07:03:17 crc kubenswrapper[5129]: I0314 07:03:17.996792 5129 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 
6573h33m41.4154347s for next certificate rotation Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.052694 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a59825e2-b54f-4438-9a12-be640d8baf2b" path="/var/lib/kubelet/pods/a59825e2-b54f-4438-9a12-be640d8baf2b/volumes" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.580222 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.692931 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" event={"ID":"4b1e1421-7470-47c1-bb48-966ef694890e","Type":"ContainerStarted","Data":"c9224607510b7a04a9fb45643a9142e9dbf56b8783da5470274d0ab036e9c53c"} Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.692964 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" event={"ID":"4b1e1421-7470-47c1-bb48-966ef694890e","Type":"ContainerStarted","Data":"83852ee339292f7c710c30b5ad4f1becb86248514c172814431495d0b57f1a01"} Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.693285 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.696953 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg5m2" event={"ID":"647966c7-67bc-4945-a281-477f0f83496e","Type":"ContainerStarted","Data":"9a7c6f281770d9548b8e524a5a1504fb4a0316b6c4cd16b401285f80898ed820"} Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.697806 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" Mar 14 07:03:18 crc 
kubenswrapper[5129]: I0314 07:03:18.698418 5129 generic.go:334] "Generic (PLEG): container finished" podID="c307bc2f-3f41-4103-8a67-6947b51a2dcf" containerID="be6326cd71cd99bb2038d060f997c6476b014008e498a014f6a2aa02a1aefd91" exitCode=0 Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.698461 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" event={"ID":"c307bc2f-3f41-4103-8a67-6947b51a2dcf","Type":"ContainerDied","Data":"be6326cd71cd99bb2038d060f997c6476b014008e498a014f6a2aa02a1aefd91"} Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.698480 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.698507 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx" event={"ID":"c307bc2f-3f41-4103-8a67-6947b51a2dcf","Type":"ContainerDied","Data":"398d7be71c4835b3175f356aada6ee5dec73222fadb7756c2964ee7c7c2938f0"} Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.698529 5129 scope.go:117] "RemoveContainer" containerID="be6326cd71cd99bb2038d060f997c6476b014008e498a014f6a2aa02a1aefd91" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.700800 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-glrkk" event={"ID":"35d61c74-41cf-4ed5-a0c0-ffae19f82be0","Type":"ContainerStarted","Data":"26ef3c6e45865d98cda2a9023b06f0311ee8ac6cdd7038d89c3ee84168b8947a"} Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.708965 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c307bc2f-3f41-4103-8a67-6947b51a2dcf-config\") pod \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " Mar 14 07:03:18 crc 
kubenswrapper[5129]: I0314 07:03:18.709018 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c307bc2f-3f41-4103-8a67-6947b51a2dcf-serving-cert\") pod \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.709096 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c307bc2f-3f41-4103-8a67-6947b51a2dcf-client-ca\") pod \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.709215 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c307bc2f-3f41-4103-8a67-6947b51a2dcf-proxy-ca-bundles\") pod \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.709250 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx572\" (UniqueName: \"kubernetes.io/projected/c307bc2f-3f41-4103-8a67-6947b51a2dcf-kube-api-access-kx572\") pod \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\" (UID: \"c307bc2f-3f41-4103-8a67-6947b51a2dcf\") " Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.709991 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c307bc2f-3f41-4103-8a67-6947b51a2dcf-config" (OuterVolumeSpecName: "config") pod "c307bc2f-3f41-4103-8a67-6947b51a2dcf" (UID: "c307bc2f-3f41-4103-8a67-6947b51a2dcf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.710130 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c307bc2f-3f41-4103-8a67-6947b51a2dcf-client-ca" (OuterVolumeSpecName: "client-ca") pod "c307bc2f-3f41-4103-8a67-6947b51a2dcf" (UID: "c307bc2f-3f41-4103-8a67-6947b51a2dcf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.710519 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c307bc2f-3f41-4103-8a67-6947b51a2dcf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c307bc2f-3f41-4103-8a67-6947b51a2dcf" (UID: "c307bc2f-3f41-4103-8a67-6947b51a2dcf"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.715088 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c307bc2f-3f41-4103-8a67-6947b51a2dcf-kube-api-access-kx572" (OuterVolumeSpecName: "kube-api-access-kx572") pod "c307bc2f-3f41-4103-8a67-6947b51a2dcf" (UID: "c307bc2f-3f41-4103-8a67-6947b51a2dcf"). InnerVolumeSpecName "kube-api-access-kx572". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.722116 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" podStartSLOduration=8.722096609 podStartE2EDuration="8.722096609s" podCreationTimestamp="2026-03-14 07:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:03:18.720243015 +0000 UTC m=+261.472158209" watchObservedRunningTime="2026-03-14 07:03:18.722096609 +0000 UTC m=+261.474011803" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.729414 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c307bc2f-3f41-4103-8a67-6947b51a2dcf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c307bc2f-3f41-4103-8a67-6947b51a2dcf" (UID: "c307bc2f-3f41-4103-8a67-6947b51a2dcf"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.729545 5129 scope.go:117] "RemoveContainer" containerID="be6326cd71cd99bb2038d060f997c6476b014008e498a014f6a2aa02a1aefd91" Mar 14 07:03:18 crc kubenswrapper[5129]: E0314 07:03:18.729922 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be6326cd71cd99bb2038d060f997c6476b014008e498a014f6a2aa02a1aefd91\": container with ID starting with be6326cd71cd99bb2038d060f997c6476b014008e498a014f6a2aa02a1aefd91 not found: ID does not exist" containerID="be6326cd71cd99bb2038d060f997c6476b014008e498a014f6a2aa02a1aefd91" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.729943 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be6326cd71cd99bb2038d060f997c6476b014008e498a014f6a2aa02a1aefd91"} err="failed to get container status \"be6326cd71cd99bb2038d060f997c6476b014008e498a014f6a2aa02a1aefd91\": rpc error: code = NotFound desc = could not find container \"be6326cd71cd99bb2038d060f997c6476b014008e498a014f6a2aa02a1aefd91\": container with ID starting with be6326cd71cd99bb2038d060f997c6476b014008e498a014f6a2aa02a1aefd91 not found: ID does not exist" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.756478 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-glrkk" podStartSLOduration=2.884615472 podStartE2EDuration="44.756434285s" podCreationTimestamp="2026-03-14 07:02:34 +0000 UTC" firstStartedPulling="2026-03-14 07:02:36.255008635 +0000 UTC m=+219.006923819" lastFinishedPulling="2026-03-14 07:03:18.126827448 +0000 UTC m=+260.878742632" observedRunningTime="2026-03-14 07:03:18.737135101 +0000 UTC m=+261.489050285" watchObservedRunningTime="2026-03-14 07:03:18.756434285 +0000 UTC m=+261.508349469" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.792981 5129 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fg5m2" podStartSLOduration=8.275555152 podStartE2EDuration="43.792961445s" podCreationTimestamp="2026-03-14 07:02:35 +0000 UTC" firstStartedPulling="2026-03-14 07:02:42.626770915 +0000 UTC m=+225.378686099" lastFinishedPulling="2026-03-14 07:03:18.144177198 +0000 UTC m=+260.896092392" observedRunningTime="2026-03-14 07:03:18.789934718 +0000 UTC m=+261.541849902" watchObservedRunningTime="2026-03-14 07:03:18.792961445 +0000 UTC m=+261.544876629" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.810467 5129 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c307bc2f-3f41-4103-8a67-6947b51a2dcf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.810498 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx572\" (UniqueName: \"kubernetes.io/projected/c307bc2f-3f41-4103-8a67-6947b51a2dcf-kube-api-access-kx572\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.810508 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c307bc2f-3f41-4103-8a67-6947b51a2dcf-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.810517 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c307bc2f-3f41-4103-8a67-6947b51a2dcf-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.810526 5129 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c307bc2f-3f41-4103-8a67-6947b51a2dcf-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.997232 5129 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557862-lf4bl" Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.997274 5129 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-10 01:47:37.298075859 +0000 UTC Mar 14 07:03:18 crc kubenswrapper[5129]: I0314 07:03:18.997306 5129 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6498h44m18.30077203s for next certificate rotation Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.002567 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.037861 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx"] Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.044945 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7bb57d5c57-m9pzx"] Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.112371 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9029912-be8a-48be-8881-79b83ec8b2e9-kube-api-access\") pod \"d9029912-be8a-48be-8881-79b83ec8b2e9\" (UID: \"d9029912-be8a-48be-8881-79b83ec8b2e9\") " Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.112533 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lfsd\" (UniqueName: \"kubernetes.io/projected/e9e49a0a-8f9f-4e78-8098-d195fe3297bd-kube-api-access-7lfsd\") pod \"e9e49a0a-8f9f-4e78-8098-d195fe3297bd\" (UID: \"e9e49a0a-8f9f-4e78-8098-d195fe3297bd\") " Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.112628 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/d9029912-be8a-48be-8881-79b83ec8b2e9-kubelet-dir\") pod \"d9029912-be8a-48be-8881-79b83ec8b2e9\" (UID: \"d9029912-be8a-48be-8881-79b83ec8b2e9\") " Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.112743 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9029912-be8a-48be-8881-79b83ec8b2e9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d9029912-be8a-48be-8881-79b83ec8b2e9" (UID: "d9029912-be8a-48be-8881-79b83ec8b2e9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.113108 5129 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d9029912-be8a-48be-8881-79b83ec8b2e9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.115685 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9029912-be8a-48be-8881-79b83ec8b2e9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d9029912-be8a-48be-8881-79b83ec8b2e9" (UID: "d9029912-be8a-48be-8881-79b83ec8b2e9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.116974 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9e49a0a-8f9f-4e78-8098-d195fe3297bd-kube-api-access-7lfsd" (OuterVolumeSpecName: "kube-api-access-7lfsd") pod "e9e49a0a-8f9f-4e78-8098-d195fe3297bd" (UID: "e9e49a0a-8f9f-4e78-8098-d195fe3297bd"). InnerVolumeSpecName "kube-api-access-7lfsd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.214558 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lfsd\" (UniqueName: \"kubernetes.io/projected/e9e49a0a-8f9f-4e78-8098-d195fe3297bd-kube-api-access-7lfsd\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.214613 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9029912-be8a-48be-8881-79b83ec8b2e9-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.397177 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr"] Mar 14 07:03:19 crc kubenswrapper[5129]: E0314 07:03:19.397371 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9029912-be8a-48be-8881-79b83ec8b2e9" containerName="pruner" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.397382 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9029912-be8a-48be-8881-79b83ec8b2e9" containerName="pruner" Mar 14 07:03:19 crc kubenswrapper[5129]: E0314 07:03:19.397398 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c307bc2f-3f41-4103-8a67-6947b51a2dcf" containerName="controller-manager" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.397405 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c307bc2f-3f41-4103-8a67-6947b51a2dcf" containerName="controller-manager" Mar 14 07:03:19 crc kubenswrapper[5129]: E0314 07:03:19.397418 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9e49a0a-8f9f-4e78-8098-d195fe3297bd" containerName="oc" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.397423 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e49a0a-8f9f-4e78-8098-d195fe3297bd" containerName="oc" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 
07:03:19.397537 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="c307bc2f-3f41-4103-8a67-6947b51a2dcf" containerName="controller-manager" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.397562 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9e49a0a-8f9f-4e78-8098-d195fe3297bd" containerName="oc" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.397573 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9029912-be8a-48be-8881-79b83ec8b2e9" containerName="pruner" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.397913 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.399369 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.399389 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.400142 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.400169 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.400297 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.400591 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.409803 5129 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.415421 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr"] Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.518161 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37a43573-36b3-4d7a-8608-b3a017f86968-serving-cert\") pod \"controller-manager-5b9bfb5c4f-xwwmr\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.518216 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjtp8\" (UniqueName: \"kubernetes.io/projected/37a43573-36b3-4d7a-8608-b3a017f86968-kube-api-access-vjtp8\") pod \"controller-manager-5b9bfb5c4f-xwwmr\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.518303 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37a43573-36b3-4d7a-8608-b3a017f86968-proxy-ca-bundles\") pod \"controller-manager-5b9bfb5c4f-xwwmr\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.518325 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37a43573-36b3-4d7a-8608-b3a017f86968-client-ca\") pod \"controller-manager-5b9bfb5c4f-xwwmr\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " 
pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.518365 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37a43573-36b3-4d7a-8608-b3a017f86968-config\") pod \"controller-manager-5b9bfb5c4f-xwwmr\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.574828 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.574881 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.574928 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.575478 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.575542 5129 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2" gracePeriod=600 Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.619407 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37a43573-36b3-4d7a-8608-b3a017f86968-proxy-ca-bundles\") pod \"controller-manager-5b9bfb5c4f-xwwmr\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.619476 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37a43573-36b3-4d7a-8608-b3a017f86968-client-ca\") pod \"controller-manager-5b9bfb5c4f-xwwmr\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.619505 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37a43573-36b3-4d7a-8608-b3a017f86968-config\") pod \"controller-manager-5b9bfb5c4f-xwwmr\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.619571 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37a43573-36b3-4d7a-8608-b3a017f86968-serving-cert\") pod \"controller-manager-5b9bfb5c4f-xwwmr\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 
07:03:19.619628 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjtp8\" (UniqueName: \"kubernetes.io/projected/37a43573-36b3-4d7a-8608-b3a017f86968-kube-api-access-vjtp8\") pod \"controller-manager-5b9bfb5c4f-xwwmr\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.620587 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37a43573-36b3-4d7a-8608-b3a017f86968-proxy-ca-bundles\") pod \"controller-manager-5b9bfb5c4f-xwwmr\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.620591 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37a43573-36b3-4d7a-8608-b3a017f86968-client-ca\") pod \"controller-manager-5b9bfb5c4f-xwwmr\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.620978 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37a43573-36b3-4d7a-8608-b3a017f86968-config\") pod \"controller-manager-5b9bfb5c4f-xwwmr\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.623664 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37a43573-36b3-4d7a-8608-b3a017f86968-serving-cert\") pod \"controller-manager-5b9bfb5c4f-xwwmr\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " 
pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.641126 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjtp8\" (UniqueName: \"kubernetes.io/projected/37a43573-36b3-4d7a-8608-b3a017f86968-kube-api-access-vjtp8\") pod \"controller-manager-5b9bfb5c4f-xwwmr\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.724505 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.737569 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d9029912-be8a-48be-8881-79b83ec8b2e9","Type":"ContainerDied","Data":"0a549d9b9a260923e2e8db25b54772a8aa46e44fa860c8bfffccd9d199f8a536"} Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.737792 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a549d9b9a260923e2e8db25b54772a8aa46e44fa860c8bfffccd9d199f8a536" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.737693 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.740933 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557862-lf4bl" event={"ID":"e9e49a0a-8f9f-4e78-8098-d195fe3297bd","Type":"ContainerDied","Data":"350d95a6e97105a727ca14d7a07083a8a5dd2015966c62146aa5a1ab93259216"} Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.740963 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="350d95a6e97105a727ca14d7a07083a8a5dd2015966c62146aa5a1ab93259216" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.741014 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557862-lf4bl" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.745439 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2" exitCode=0 Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.745789 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2"} Mar 14 07:03:19 crc kubenswrapper[5129]: E0314 07:03:19.825968 5129 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9e49a0a_8f9f_4e78_8098_d195fe3297bd.slice\": RecentStats: unable to find data in memory cache]" Mar 14 07:03:19 crc kubenswrapper[5129]: I0314 07:03:19.942333 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr"] Mar 14 07:03:19 crc kubenswrapper[5129]: W0314 
07:03:19.951514 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a43573_36b3_4d7a_8608_b3a017f86968.slice/crio-4fe18e2d7a6e0d8b47b61381f699edff245176abe91175f287745404a79494a9 WatchSource:0}: Error finding container 4fe18e2d7a6e0d8b47b61381f699edff245176abe91175f287745404a79494a9: Status 404 returned error can't find the container with id 4fe18e2d7a6e0d8b47b61381f699edff245176abe91175f287745404a79494a9 Mar 14 07:03:20 crc kubenswrapper[5129]: I0314 07:03:20.043283 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c307bc2f-3f41-4103-8a67-6947b51a2dcf" path="/var/lib/kubelet/pods/c307bc2f-3f41-4103-8a67-6947b51a2dcf/volumes" Mar 14 07:03:20 crc kubenswrapper[5129]: I0314 07:03:20.753193 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" event={"ID":"37a43573-36b3-4d7a-8608-b3a017f86968","Type":"ContainerStarted","Data":"10499c0e570584e900565ebd10747e19861b1638ec9435fa59a167072dfa25fa"} Mar 14 07:03:20 crc kubenswrapper[5129]: I0314 07:03:20.753717 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:20 crc kubenswrapper[5129]: I0314 07:03:20.753729 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" event={"ID":"37a43573-36b3-4d7a-8608-b3a017f86968","Type":"ContainerStarted","Data":"4fe18e2d7a6e0d8b47b61381f699edff245176abe91175f287745404a79494a9"} Mar 14 07:03:20 crc kubenswrapper[5129]: I0314 07:03:20.755175 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"530282bf4b5fced32ff8a4f896195aed87fef0330a342177990023776d5d6856"} Mar 14 07:03:20 crc 
kubenswrapper[5129]: I0314 07:03:20.758581 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:20 crc kubenswrapper[5129]: I0314 07:03:20.772637 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" podStartSLOduration=10.772591666 podStartE2EDuration="10.772591666s" podCreationTimestamp="2026-03-14 07:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:03:20.770210188 +0000 UTC m=+263.522125402" watchObservedRunningTime="2026-03-14 07:03:20.772591666 +0000 UTC m=+263.524506850" Mar 14 07:03:22 crc kubenswrapper[5129]: I0314 07:03:22.665034 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8mmjl" Mar 14 07:03:22 crc kubenswrapper[5129]: I0314 07:03:22.665363 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8mmjl" Mar 14 07:03:22 crc kubenswrapper[5129]: I0314 07:03:22.875073 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8mmjl" Mar 14 07:03:22 crc kubenswrapper[5129]: I0314 07:03:22.964443 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8mmjl" Mar 14 07:03:23 crc kubenswrapper[5129]: I0314 07:03:23.802948 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8mmjl"] Mar 14 07:03:24 crc kubenswrapper[5129]: I0314 07:03:24.665747 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-glrkk" Mar 14 07:03:24 crc kubenswrapper[5129]: I0314 07:03:24.666110 5129 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-glrkk" Mar 14 07:03:24 crc kubenswrapper[5129]: I0314 07:03:24.708471 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-glrkk" Mar 14 07:03:24 crc kubenswrapper[5129]: I0314 07:03:24.772825 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8mmjl" podUID="7470c785-0846-4e7f-9d1b-aad9a52e5a5b" containerName="registry-server" containerID="cri-o://d61fc1270fd328952b9b5573afda891dce6cb6377ccebdc5aa25bb0b5c95a364" gracePeriod=2 Mar 14 07:03:24 crc kubenswrapper[5129]: I0314 07:03:24.812629 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-glrkk" Mar 14 07:03:25 crc kubenswrapper[5129]: I0314 07:03:25.629593 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fg5m2" Mar 14 07:03:25 crc kubenswrapper[5129]: I0314 07:03:25.629956 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fg5m2" Mar 14 07:03:25 crc kubenswrapper[5129]: I0314 07:03:25.671105 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fg5m2" Mar 14 07:03:25 crc kubenswrapper[5129]: I0314 07:03:25.782990 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t6l8" event={"ID":"a848f19c-da50-403e-b620-5425b51fab9a","Type":"ContainerStarted","Data":"c67d27b7fea88d43560bb8c501366312262ffcfe8710eb9aadad7262db75a441"} Mar 14 07:03:25 crc kubenswrapper[5129]: I0314 07:03:25.787099 5129 generic.go:334] "Generic (PLEG): container finished" podID="7470c785-0846-4e7f-9d1b-aad9a52e5a5b" containerID="d61fc1270fd328952b9b5573afda891dce6cb6377ccebdc5aa25bb0b5c95a364" exitCode=0 Mar 14 
07:03:25 crc kubenswrapper[5129]: I0314 07:03:25.787777 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mmjl" event={"ID":"7470c785-0846-4e7f-9d1b-aad9a52e5a5b","Type":"ContainerDied","Data":"d61fc1270fd328952b9b5573afda891dce6cb6377ccebdc5aa25bb0b5c95a364"} Mar 14 07:03:25 crc kubenswrapper[5129]: I0314 07:03:25.829318 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fg5m2" Mar 14 07:03:26 crc kubenswrapper[5129]: I0314 07:03:26.258479 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8mmjl" Mar 14 07:03:26 crc kubenswrapper[5129]: I0314 07:03:26.409378 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7470c785-0846-4e7f-9d1b-aad9a52e5a5b-catalog-content\") pod \"7470c785-0846-4e7f-9d1b-aad9a52e5a5b\" (UID: \"7470c785-0846-4e7f-9d1b-aad9a52e5a5b\") " Mar 14 07:03:26 crc kubenswrapper[5129]: I0314 07:03:26.409442 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md9sm\" (UniqueName: \"kubernetes.io/projected/7470c785-0846-4e7f-9d1b-aad9a52e5a5b-kube-api-access-md9sm\") pod \"7470c785-0846-4e7f-9d1b-aad9a52e5a5b\" (UID: \"7470c785-0846-4e7f-9d1b-aad9a52e5a5b\") " Mar 14 07:03:26 crc kubenswrapper[5129]: I0314 07:03:26.410569 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7470c785-0846-4e7f-9d1b-aad9a52e5a5b-utilities\") pod \"7470c785-0846-4e7f-9d1b-aad9a52e5a5b\" (UID: \"7470c785-0846-4e7f-9d1b-aad9a52e5a5b\") " Mar 14 07:03:26 crc kubenswrapper[5129]: I0314 07:03:26.411163 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7470c785-0846-4e7f-9d1b-aad9a52e5a5b-utilities" 
(OuterVolumeSpecName: "utilities") pod "7470c785-0846-4e7f-9d1b-aad9a52e5a5b" (UID: "7470c785-0846-4e7f-9d1b-aad9a52e5a5b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:03:26 crc kubenswrapper[5129]: I0314 07:03:26.415053 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7470c785-0846-4e7f-9d1b-aad9a52e5a5b-kube-api-access-md9sm" (OuterVolumeSpecName: "kube-api-access-md9sm") pod "7470c785-0846-4e7f-9d1b-aad9a52e5a5b" (UID: "7470c785-0846-4e7f-9d1b-aad9a52e5a5b"). InnerVolumeSpecName "kube-api-access-md9sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:26 crc kubenswrapper[5129]: I0314 07:03:26.511950 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md9sm\" (UniqueName: \"kubernetes.io/projected/7470c785-0846-4e7f-9d1b-aad9a52e5a5b-kube-api-access-md9sm\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:26 crc kubenswrapper[5129]: I0314 07:03:26.511983 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7470c785-0846-4e7f-9d1b-aad9a52e5a5b-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:26 crc kubenswrapper[5129]: I0314 07:03:26.513067 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7470c785-0846-4e7f-9d1b-aad9a52e5a5b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7470c785-0846-4e7f-9d1b-aad9a52e5a5b" (UID: "7470c785-0846-4e7f-9d1b-aad9a52e5a5b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:03:26 crc kubenswrapper[5129]: I0314 07:03:26.612642 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7470c785-0846-4e7f-9d1b-aad9a52e5a5b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:26 crc kubenswrapper[5129]: I0314 07:03:26.794137 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mmjl" event={"ID":"7470c785-0846-4e7f-9d1b-aad9a52e5a5b","Type":"ContainerDied","Data":"62d857225125c5d6ab1b9d2c1a6377e297b521d29093a79bd2aaf63176f11769"} Mar 14 07:03:26 crc kubenswrapper[5129]: I0314 07:03:26.794187 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8mmjl" Mar 14 07:03:26 crc kubenswrapper[5129]: I0314 07:03:26.794431 5129 scope.go:117] "RemoveContainer" containerID="d61fc1270fd328952b9b5573afda891dce6cb6377ccebdc5aa25bb0b5c95a364" Mar 14 07:03:26 crc kubenswrapper[5129]: I0314 07:03:26.797491 5129 generic.go:334] "Generic (PLEG): container finished" podID="a848f19c-da50-403e-b620-5425b51fab9a" containerID="c67d27b7fea88d43560bb8c501366312262ffcfe8710eb9aadad7262db75a441" exitCode=0 Mar 14 07:03:26 crc kubenswrapper[5129]: I0314 07:03:26.797530 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t6l8" event={"ID":"a848f19c-da50-403e-b620-5425b51fab9a","Type":"ContainerDied","Data":"c67d27b7fea88d43560bb8c501366312262ffcfe8710eb9aadad7262db75a441"} Mar 14 07:03:26 crc kubenswrapper[5129]: I0314 07:03:26.816371 5129 scope.go:117] "RemoveContainer" containerID="ea9694764b338ca7e1d75541f494c63af2ba5ec628d9c5e5583bcdf0d960fc2b" Mar 14 07:03:26 crc kubenswrapper[5129]: I0314 07:03:26.831510 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8mmjl"] Mar 14 07:03:26 crc kubenswrapper[5129]: 
I0314 07:03:26.835931 5129 scope.go:117] "RemoveContainer" containerID="e6d3622c77e3bf8db13f8e1ec0ef42879cd9acdab50723f3dfa2d30972c82aae" Mar 14 07:03:26 crc kubenswrapper[5129]: I0314 07:03:26.836200 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8mmjl"] Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.002993 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-glrkk"] Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.003240 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-glrkk" podUID="35d61c74-41cf-4ed5-a0c0-ffae19f82be0" containerName="registry-server" containerID="cri-o://26ef3c6e45865d98cda2a9023b06f0311ee8ac6cdd7038d89c3ee84168b8947a" gracePeriod=2 Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.429031 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-glrkk" Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.627046 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d61c74-41cf-4ed5-a0c0-ffae19f82be0-catalog-content\") pod \"35d61c74-41cf-4ed5-a0c0-ffae19f82be0\" (UID: \"35d61c74-41cf-4ed5-a0c0-ffae19f82be0\") " Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.627152 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d61c74-41cf-4ed5-a0c0-ffae19f82be0-utilities\") pod \"35d61c74-41cf-4ed5-a0c0-ffae19f82be0\" (UID: \"35d61c74-41cf-4ed5-a0c0-ffae19f82be0\") " Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.627198 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcj68\" (UniqueName: 
\"kubernetes.io/projected/35d61c74-41cf-4ed5-a0c0-ffae19f82be0-kube-api-access-rcj68\") pod \"35d61c74-41cf-4ed5-a0c0-ffae19f82be0\" (UID: \"35d61c74-41cf-4ed5-a0c0-ffae19f82be0\") " Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.628966 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35d61c74-41cf-4ed5-a0c0-ffae19f82be0-utilities" (OuterVolumeSpecName: "utilities") pod "35d61c74-41cf-4ed5-a0c0-ffae19f82be0" (UID: "35d61c74-41cf-4ed5-a0c0-ffae19f82be0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.634237 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35d61c74-41cf-4ed5-a0c0-ffae19f82be0-kube-api-access-rcj68" (OuterVolumeSpecName: "kube-api-access-rcj68") pod "35d61c74-41cf-4ed5-a0c0-ffae19f82be0" (UID: "35d61c74-41cf-4ed5-a0c0-ffae19f82be0"). InnerVolumeSpecName "kube-api-access-rcj68". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.666717 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35d61c74-41cf-4ed5-a0c0-ffae19f82be0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35d61c74-41cf-4ed5-a0c0-ffae19f82be0" (UID: "35d61c74-41cf-4ed5-a0c0-ffae19f82be0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.728641 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcj68\" (UniqueName: \"kubernetes.io/projected/35d61c74-41cf-4ed5-a0c0-ffae19f82be0-kube-api-access-rcj68\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.728678 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d61c74-41cf-4ed5-a0c0-ffae19f82be0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.728690 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d61c74-41cf-4ed5-a0c0-ffae19f82be0-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.804808 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t6l8" event={"ID":"a848f19c-da50-403e-b620-5425b51fab9a","Type":"ContainerStarted","Data":"c5fda5f254298e5b858f8568449e0f20230cc86c9afe9a1eca8b8f730108ce30"} Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.808841 5129 generic.go:334] "Generic (PLEG): container finished" podID="35d61c74-41cf-4ed5-a0c0-ffae19f82be0" containerID="26ef3c6e45865d98cda2a9023b06f0311ee8ac6cdd7038d89c3ee84168b8947a" exitCode=0 Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.808892 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-glrkk" event={"ID":"35d61c74-41cf-4ed5-a0c0-ffae19f82be0","Type":"ContainerDied","Data":"26ef3c6e45865d98cda2a9023b06f0311ee8ac6cdd7038d89c3ee84168b8947a"} Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.808943 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-glrkk" 
event={"ID":"35d61c74-41cf-4ed5-a0c0-ffae19f82be0","Type":"ContainerDied","Data":"e70f09050edc1f744c08ea351abda29f62ffeb1d185280de54f22b3a3689a54d"} Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.808964 5129 scope.go:117] "RemoveContainer" containerID="26ef3c6e45865d98cda2a9023b06f0311ee8ac6cdd7038d89c3ee84168b8947a" Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.808901 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-glrkk" Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.825410 5129 scope.go:117] "RemoveContainer" containerID="aa76f3ccc9471298dfc4584ee792962bf18e5bf8bd102dec1499950705b5adb5" Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.837314 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6t6l8" podStartSLOduration=2.706618753 podStartE2EDuration="56.837293459s" podCreationTimestamp="2026-03-14 07:02:31 +0000 UTC" firstStartedPulling="2026-03-14 07:02:33.131460849 +0000 UTC m=+215.883376033" lastFinishedPulling="2026-03-14 07:03:27.262135555 +0000 UTC m=+270.014050739" observedRunningTime="2026-03-14 07:03:27.827707503 +0000 UTC m=+270.579622697" watchObservedRunningTime="2026-03-14 07:03:27.837293459 +0000 UTC m=+270.589208643" Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.838816 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-glrkk"] Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.842233 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-glrkk"] Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.857708 5129 scope.go:117] "RemoveContainer" containerID="abd13f73f65561cfc277f56c99f4d486c544f632ba1fbefb44c166af4b4cad85" Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.878843 5129 scope.go:117] "RemoveContainer" 
containerID="26ef3c6e45865d98cda2a9023b06f0311ee8ac6cdd7038d89c3ee84168b8947a" Mar 14 07:03:27 crc kubenswrapper[5129]: E0314 07:03:27.879287 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26ef3c6e45865d98cda2a9023b06f0311ee8ac6cdd7038d89c3ee84168b8947a\": container with ID starting with 26ef3c6e45865d98cda2a9023b06f0311ee8ac6cdd7038d89c3ee84168b8947a not found: ID does not exist" containerID="26ef3c6e45865d98cda2a9023b06f0311ee8ac6cdd7038d89c3ee84168b8947a" Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.879334 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26ef3c6e45865d98cda2a9023b06f0311ee8ac6cdd7038d89c3ee84168b8947a"} err="failed to get container status \"26ef3c6e45865d98cda2a9023b06f0311ee8ac6cdd7038d89c3ee84168b8947a\": rpc error: code = NotFound desc = could not find container \"26ef3c6e45865d98cda2a9023b06f0311ee8ac6cdd7038d89c3ee84168b8947a\": container with ID starting with 26ef3c6e45865d98cda2a9023b06f0311ee8ac6cdd7038d89c3ee84168b8947a not found: ID does not exist" Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.879363 5129 scope.go:117] "RemoveContainer" containerID="aa76f3ccc9471298dfc4584ee792962bf18e5bf8bd102dec1499950705b5adb5" Mar 14 07:03:27 crc kubenswrapper[5129]: E0314 07:03:27.879847 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa76f3ccc9471298dfc4584ee792962bf18e5bf8bd102dec1499950705b5adb5\": container with ID starting with aa76f3ccc9471298dfc4584ee792962bf18e5bf8bd102dec1499950705b5adb5 not found: ID does not exist" containerID="aa76f3ccc9471298dfc4584ee792962bf18e5bf8bd102dec1499950705b5adb5" Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.879873 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aa76f3ccc9471298dfc4584ee792962bf18e5bf8bd102dec1499950705b5adb5"} err="failed to get container status \"aa76f3ccc9471298dfc4584ee792962bf18e5bf8bd102dec1499950705b5adb5\": rpc error: code = NotFound desc = could not find container \"aa76f3ccc9471298dfc4584ee792962bf18e5bf8bd102dec1499950705b5adb5\": container with ID starting with aa76f3ccc9471298dfc4584ee792962bf18e5bf8bd102dec1499950705b5adb5 not found: ID does not exist" Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.879901 5129 scope.go:117] "RemoveContainer" containerID="abd13f73f65561cfc277f56c99f4d486c544f632ba1fbefb44c166af4b4cad85" Mar 14 07:03:27 crc kubenswrapper[5129]: E0314 07:03:27.880244 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abd13f73f65561cfc277f56c99f4d486c544f632ba1fbefb44c166af4b4cad85\": container with ID starting with abd13f73f65561cfc277f56c99f4d486c544f632ba1fbefb44c166af4b4cad85 not found: ID does not exist" containerID="abd13f73f65561cfc277f56c99f4d486c544f632ba1fbefb44c166af4b4cad85" Mar 14 07:03:27 crc kubenswrapper[5129]: I0314 07:03:27.880273 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abd13f73f65561cfc277f56c99f4d486c544f632ba1fbefb44c166af4b4cad85"} err="failed to get container status \"abd13f73f65561cfc277f56c99f4d486c544f632ba1fbefb44c166af4b4cad85\": rpc error: code = NotFound desc = could not find container \"abd13f73f65561cfc277f56c99f4d486c544f632ba1fbefb44c166af4b4cad85\": container with ID starting with abd13f73f65561cfc277f56c99f4d486c544f632ba1fbefb44c166af4b4cad85 not found: ID does not exist" Mar 14 07:03:28 crc kubenswrapper[5129]: I0314 07:03:28.045133 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35d61c74-41cf-4ed5-a0c0-ffae19f82be0" path="/var/lib/kubelet/pods/35d61c74-41cf-4ed5-a0c0-ffae19f82be0/volumes" Mar 14 07:03:28 crc kubenswrapper[5129]: I0314 
07:03:28.046008 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7470c785-0846-4e7f-9d1b-aad9a52e5a5b" path="/var/lib/kubelet/pods/7470c785-0846-4e7f-9d1b-aad9a52e5a5b/volumes" Mar 14 07:03:30 crc kubenswrapper[5129]: I0314 07:03:30.665693 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr"] Mar 14 07:03:30 crc kubenswrapper[5129]: I0314 07:03:30.666385 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" podUID="37a43573-36b3-4d7a-8608-b3a017f86968" containerName="controller-manager" containerID="cri-o://10499c0e570584e900565ebd10747e19861b1638ec9435fa59a167072dfa25fa" gracePeriod=30 Mar 14 07:03:30 crc kubenswrapper[5129]: I0314 07:03:30.681068 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh"] Mar 14 07:03:30 crc kubenswrapper[5129]: I0314 07:03:30.681275 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" podUID="4b1e1421-7470-47c1-bb48-966ef694890e" containerName="route-controller-manager" containerID="cri-o://c9224607510b7a04a9fb45643a9142e9dbf56b8783da5470274d0ab036e9c53c" gracePeriod=30 Mar 14 07:03:30 crc kubenswrapper[5129]: I0314 07:03:30.827675 5129 generic.go:334] "Generic (PLEG): container finished" podID="ff0704b1-3b5e-4e02-9f82-c1d74ad03387" containerID="1346a0f9987b675f39ff7d6a9789cfbca306fe0ca919c5e2c4f83dc3afd3c37e" exitCode=0 Mar 14 07:03:30 crc kubenswrapper[5129]: I0314 07:03:30.827715 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gw4hz" event={"ID":"ff0704b1-3b5e-4e02-9f82-c1d74ad03387","Type":"ContainerDied","Data":"1346a0f9987b675f39ff7d6a9789cfbca306fe0ca919c5e2c4f83dc3afd3c37e"} Mar 14 07:03:30 crc 
kubenswrapper[5129]: I0314 07:03:30.834849 5129 generic.go:334] "Generic (PLEG): container finished" podID="4b1e1421-7470-47c1-bb48-966ef694890e" containerID="c9224607510b7a04a9fb45643a9142e9dbf56b8783da5470274d0ab036e9c53c" exitCode=0 Mar 14 07:03:30 crc kubenswrapper[5129]: I0314 07:03:30.834890 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" event={"ID":"4b1e1421-7470-47c1-bb48-966ef694890e","Type":"ContainerDied","Data":"c9224607510b7a04a9fb45643a9142e9dbf56b8783da5470274d0ab036e9c53c"} Mar 14 07:03:30 crc kubenswrapper[5129]: I0314 07:03:30.837252 5129 generic.go:334] "Generic (PLEG): container finished" podID="37a43573-36b3-4d7a-8608-b3a017f86968" containerID="10499c0e570584e900565ebd10747e19861b1638ec9435fa59a167072dfa25fa" exitCode=0 Mar 14 07:03:30 crc kubenswrapper[5129]: I0314 07:03:30.837301 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" event={"ID":"37a43573-36b3-4d7a-8608-b3a017f86968","Type":"ContainerDied","Data":"10499c0e570584e900565ebd10747e19861b1638ec9435fa59a167072dfa25fa"} Mar 14 07:03:30 crc kubenswrapper[5129]: I0314 07:03:30.842200 5129 generic.go:334] "Generic (PLEG): container finished" podID="2cdca220-d63a-45e2-ad4e-d2b920554116" containerID="e666cbc843910b19e70ba5eb743eb0dbf3da67aaacac0fd04845e9331ec18950" exitCode=0 Mar 14 07:03:30 crc kubenswrapper[5129]: I0314 07:03:30.842249 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqck" event={"ID":"2cdca220-d63a-45e2-ad4e-d2b920554116","Type":"ContainerDied","Data":"e666cbc843910b19e70ba5eb743eb0dbf3da67aaacac0fd04845e9331ec18950"} Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.150725 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.202090 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.274220 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37a43573-36b3-4d7a-8608-b3a017f86968-config\") pod \"37a43573-36b3-4d7a-8608-b3a017f86968\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.274299 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjtp8\" (UniqueName: \"kubernetes.io/projected/37a43573-36b3-4d7a-8608-b3a017f86968-kube-api-access-vjtp8\") pod \"37a43573-36b3-4d7a-8608-b3a017f86968\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.274320 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b1e1421-7470-47c1-bb48-966ef694890e-client-ca\") pod \"4b1e1421-7470-47c1-bb48-966ef694890e\" (UID: \"4b1e1421-7470-47c1-bb48-966ef694890e\") " Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.274347 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37a43573-36b3-4d7a-8608-b3a017f86968-proxy-ca-bundles\") pod \"37a43573-36b3-4d7a-8608-b3a017f86968\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.274390 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b1e1421-7470-47c1-bb48-966ef694890e-serving-cert\") pod 
\"4b1e1421-7470-47c1-bb48-966ef694890e\" (UID: \"4b1e1421-7470-47c1-bb48-966ef694890e\") " Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.274425 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37a43573-36b3-4d7a-8608-b3a017f86968-serving-cert\") pod \"37a43573-36b3-4d7a-8608-b3a017f86968\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.274439 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b1e1421-7470-47c1-bb48-966ef694890e-config\") pod \"4b1e1421-7470-47c1-bb48-966ef694890e\" (UID: \"4b1e1421-7470-47c1-bb48-966ef694890e\") " Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.274466 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37a43573-36b3-4d7a-8608-b3a017f86968-client-ca\") pod \"37a43573-36b3-4d7a-8608-b3a017f86968\" (UID: \"37a43573-36b3-4d7a-8608-b3a017f86968\") " Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.274489 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjkkh\" (UniqueName: \"kubernetes.io/projected/4b1e1421-7470-47c1-bb48-966ef694890e-kube-api-access-jjkkh\") pod \"4b1e1421-7470-47c1-bb48-966ef694890e\" (UID: \"4b1e1421-7470-47c1-bb48-966ef694890e\") " Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.275263 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37a43573-36b3-4d7a-8608-b3a017f86968-config" (OuterVolumeSpecName: "config") pod "37a43573-36b3-4d7a-8608-b3a017f86968" (UID: "37a43573-36b3-4d7a-8608-b3a017f86968"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.276483 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b1e1421-7470-47c1-bb48-966ef694890e-client-ca" (OuterVolumeSpecName: "client-ca") pod "4b1e1421-7470-47c1-bb48-966ef694890e" (UID: "4b1e1421-7470-47c1-bb48-966ef694890e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.277078 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b1e1421-7470-47c1-bb48-966ef694890e-config" (OuterVolumeSpecName: "config") pod "4b1e1421-7470-47c1-bb48-966ef694890e" (UID: "4b1e1421-7470-47c1-bb48-966ef694890e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.278043 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37a43573-36b3-4d7a-8608-b3a017f86968-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "37a43573-36b3-4d7a-8608-b3a017f86968" (UID: "37a43573-36b3-4d7a-8608-b3a017f86968"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.278924 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37a43573-36b3-4d7a-8608-b3a017f86968-client-ca" (OuterVolumeSpecName: "client-ca") pod "37a43573-36b3-4d7a-8608-b3a017f86968" (UID: "37a43573-36b3-4d7a-8608-b3a017f86968"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.279357 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b1e1421-7470-47c1-bb48-966ef694890e-kube-api-access-jjkkh" (OuterVolumeSpecName: "kube-api-access-jjkkh") pod "4b1e1421-7470-47c1-bb48-966ef694890e" (UID: "4b1e1421-7470-47c1-bb48-966ef694890e"). InnerVolumeSpecName "kube-api-access-jjkkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.279948 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a43573-36b3-4d7a-8608-b3a017f86968-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "37a43573-36b3-4d7a-8608-b3a017f86968" (UID: "37a43573-36b3-4d7a-8608-b3a017f86968"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.280218 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b1e1421-7470-47c1-bb48-966ef694890e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4b1e1421-7470-47c1-bb48-966ef694890e" (UID: "4b1e1421-7470-47c1-bb48-966ef694890e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.280426 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37a43573-36b3-4d7a-8608-b3a017f86968-kube-api-access-vjtp8" (OuterVolumeSpecName: "kube-api-access-vjtp8") pod "37a43573-36b3-4d7a-8608-b3a017f86968" (UID: "37a43573-36b3-4d7a-8608-b3a017f86968"). InnerVolumeSpecName "kube-api-access-vjtp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.375427 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjtp8\" (UniqueName: \"kubernetes.io/projected/37a43573-36b3-4d7a-8608-b3a017f86968-kube-api-access-vjtp8\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.375481 5129 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b1e1421-7470-47c1-bb48-966ef694890e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.375491 5129 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37a43573-36b3-4d7a-8608-b3a017f86968-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.375499 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b1e1421-7470-47c1-bb48-966ef694890e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.375508 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37a43573-36b3-4d7a-8608-b3a017f86968-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.375516 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b1e1421-7470-47c1-bb48-966ef694890e-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.375525 5129 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37a43573-36b3-4d7a-8608-b3a017f86968-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.375533 5129 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-jjkkh\" (UniqueName: \"kubernetes.io/projected/4b1e1421-7470-47c1-bb48-966ef694890e-kube-api-access-jjkkh\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.375541 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37a43573-36b3-4d7a-8608-b3a017f86968-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.848663 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.848678 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr" event={"ID":"37a43573-36b3-4d7a-8608-b3a017f86968","Type":"ContainerDied","Data":"4fe18e2d7a6e0d8b47b61381f699edff245176abe91175f287745404a79494a9"} Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.849144 5129 scope.go:117] "RemoveContainer" containerID="10499c0e570584e900565ebd10747e19861b1638ec9435fa59a167072dfa25fa" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.851150 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqcxv" event={"ID":"ee9d7764-4db5-49d9-9b08-7b2317ec41ca","Type":"ContainerStarted","Data":"d7f3cc88801371e4f5888211f1a798c8e33ed97d2cf20e6d3f1abb490ec74dd0"} Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.853296 5129 generic.go:334] "Generic (PLEG): container finished" podID="8e8571cc-ed81-4074-a9e7-24f81fa725f0" containerID="de8341d36f6ccaf1ef2a567fe07185ce78a2d44f28b03de376f6436e1c9387ad" exitCode=0 Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.853338 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxbtj" 
event={"ID":"8e8571cc-ed81-4074-a9e7-24f81fa725f0","Type":"ContainerDied","Data":"de8341d36f6ccaf1ef2a567fe07185ce78a2d44f28b03de376f6436e1c9387ad"} Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.857344 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gw4hz" event={"ID":"ff0704b1-3b5e-4e02-9f82-c1d74ad03387","Type":"ContainerStarted","Data":"b56c67ed916dbfb9522d4eb33b4cb80614cf5dc13278fd879d13a84ed48ddf40"} Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.859929 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" event={"ID":"4b1e1421-7470-47c1-bb48-966ef694890e","Type":"ContainerDied","Data":"83852ee339292f7c710c30b5ad4f1becb86248514c172814431495d0b57f1a01"} Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.859991 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.890654 5129 scope.go:117] "RemoveContainer" containerID="c9224607510b7a04a9fb45643a9142e9dbf56b8783da5470274d0ab036e9c53c" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.909364 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gw4hz" podStartSLOduration=1.542585831 podStartE2EDuration="59.909348002s" podCreationTimestamp="2026-03-14 07:02:32 +0000 UTC" firstStartedPulling="2026-03-14 07:02:33.101299848 +0000 UTC m=+215.853215032" lastFinishedPulling="2026-03-14 07:03:31.468062019 +0000 UTC m=+274.219977203" observedRunningTime="2026-03-14 07:03:31.906973084 +0000 UTC m=+274.658888268" watchObservedRunningTime="2026-03-14 07:03:31.909348002 +0000 UTC m=+274.661263186" Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.921916 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr"] Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.925284 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5b9bfb5c4f-xwwmr"] Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.936021 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh"] Mar 14 07:03:31 crc kubenswrapper[5129]: I0314 07:03:31.938472 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fd9dcb4c5-hg6sh"] Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.042572 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37a43573-36b3-4d7a-8608-b3a017f86968" path="/var/lib/kubelet/pods/37a43573-36b3-4d7a-8608-b3a017f86968/volumes" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.043196 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b1e1421-7470-47c1-bb48-966ef694890e" path="/var/lib/kubelet/pods/4b1e1421-7470-47c1-bb48-966ef694890e/volumes" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.232587 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6t6l8" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.232681 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6t6l8" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.276784 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6t6l8" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.408239 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-988f88596-6zfsw"] Mar 14 07:03:32 crc kubenswrapper[5129]: E0314 07:03:32.408484 5129 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a43573-36b3-4d7a-8608-b3a017f86968" containerName="controller-manager" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.408495 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a43573-36b3-4d7a-8608-b3a017f86968" containerName="controller-manager" Mar 14 07:03:32 crc kubenswrapper[5129]: E0314 07:03:32.408507 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7470c785-0846-4e7f-9d1b-aad9a52e5a5b" containerName="extract-content" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.408513 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7470c785-0846-4e7f-9d1b-aad9a52e5a5b" containerName="extract-content" Mar 14 07:03:32 crc kubenswrapper[5129]: E0314 07:03:32.408523 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7470c785-0846-4e7f-9d1b-aad9a52e5a5b" containerName="extract-utilities" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.408529 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7470c785-0846-4e7f-9d1b-aad9a52e5a5b" containerName="extract-utilities" Mar 14 07:03:32 crc kubenswrapper[5129]: E0314 07:03:32.408538 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d61c74-41cf-4ed5-a0c0-ffae19f82be0" containerName="extract-utilities" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.408544 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d61c74-41cf-4ed5-a0c0-ffae19f82be0" containerName="extract-utilities" Mar 14 07:03:32 crc kubenswrapper[5129]: E0314 07:03:32.408554 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7470c785-0846-4e7f-9d1b-aad9a52e5a5b" containerName="registry-server" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.408559 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7470c785-0846-4e7f-9d1b-aad9a52e5a5b" containerName="registry-server" Mar 14 07:03:32 crc kubenswrapper[5129]: E0314 07:03:32.408566 5129 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d61c74-41cf-4ed5-a0c0-ffae19f82be0" containerName="registry-server" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.408571 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d61c74-41cf-4ed5-a0c0-ffae19f82be0" containerName="registry-server" Mar 14 07:03:32 crc kubenswrapper[5129]: E0314 07:03:32.408579 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d61c74-41cf-4ed5-a0c0-ffae19f82be0" containerName="extract-content" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.408586 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d61c74-41cf-4ed5-a0c0-ffae19f82be0" containerName="extract-content" Mar 14 07:03:32 crc kubenswrapper[5129]: E0314 07:03:32.408614 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b1e1421-7470-47c1-bb48-966ef694890e" containerName="route-controller-manager" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.408620 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b1e1421-7470-47c1-bb48-966ef694890e" containerName="route-controller-manager" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.408703 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="7470c785-0846-4e7f-9d1b-aad9a52e5a5b" containerName="registry-server" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.408715 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d61c74-41cf-4ed5-a0c0-ffae19f82be0" containerName="registry-server" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.408728 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="37a43573-36b3-4d7a-8608-b3a017f86968" containerName="controller-manager" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.408739 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b1e1421-7470-47c1-bb48-966ef694890e" containerName="route-controller-manager" Mar 14 07:03:32 crc 
kubenswrapper[5129]: I0314 07:03:32.409078 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.410875 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.410939 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.411226 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg"] Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.411652 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.411948 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.414312 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.414572 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.415040 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.415151 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.415181 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.415299 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.415421 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.415575 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.415898 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.423341 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 
14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.426847 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg"] Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.429402 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gw4hz" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.429441 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gw4hz" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.469085 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-988f88596-6zfsw"] Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.492274 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plnm2\" (UniqueName: \"kubernetes.io/projected/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-kube-api-access-plnm2\") pod \"route-controller-manager-7c665b84c9-v9bqg\" (UID: \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\") " pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.492346 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-serving-cert\") pod \"route-controller-manager-7c665b84c9-v9bqg\" (UID: \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\") " pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.492372 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/611e6147-8e37-41ba-8f13-d944070e4ed8-proxy-ca-bundles\") pod 
\"controller-manager-988f88596-6zfsw\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.492415 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45fg5\" (UniqueName: \"kubernetes.io/projected/611e6147-8e37-41ba-8f13-d944070e4ed8-kube-api-access-45fg5\") pod \"controller-manager-988f88596-6zfsw\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.492454 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/611e6147-8e37-41ba-8f13-d944070e4ed8-serving-cert\") pod \"controller-manager-988f88596-6zfsw\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.492494 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/611e6147-8e37-41ba-8f13-d944070e4ed8-client-ca\") pod \"controller-manager-988f88596-6zfsw\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.492527 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/611e6147-8e37-41ba-8f13-d944070e4ed8-config\") pod \"controller-manager-988f88596-6zfsw\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.492559 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-client-ca\") pod \"route-controller-manager-7c665b84c9-v9bqg\" (UID: \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\") " pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.492589 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-config\") pod \"route-controller-manager-7c665b84c9-v9bqg\" (UID: \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\") " pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.593822 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/611e6147-8e37-41ba-8f13-d944070e4ed8-serving-cert\") pod \"controller-manager-988f88596-6zfsw\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.593923 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/611e6147-8e37-41ba-8f13-d944070e4ed8-client-ca\") pod \"controller-manager-988f88596-6zfsw\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.593989 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/611e6147-8e37-41ba-8f13-d944070e4ed8-config\") pod \"controller-manager-988f88596-6zfsw\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " 
pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.594043 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-client-ca\") pod \"route-controller-manager-7c665b84c9-v9bqg\" (UID: \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\") " pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.594074 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-config\") pod \"route-controller-manager-7c665b84c9-v9bqg\" (UID: \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\") " pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.594098 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plnm2\" (UniqueName: \"kubernetes.io/projected/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-kube-api-access-plnm2\") pod \"route-controller-manager-7c665b84c9-v9bqg\" (UID: \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\") " pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.594157 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-serving-cert\") pod \"route-controller-manager-7c665b84c9-v9bqg\" (UID: \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\") " pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.594179 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/611e6147-8e37-41ba-8f13-d944070e4ed8-proxy-ca-bundles\") pod \"controller-manager-988f88596-6zfsw\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.594249 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45fg5\" (UniqueName: \"kubernetes.io/projected/611e6147-8e37-41ba-8f13-d944070e4ed8-kube-api-access-45fg5\") pod \"controller-manager-988f88596-6zfsw\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.595091 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/611e6147-8e37-41ba-8f13-d944070e4ed8-client-ca\") pod \"controller-manager-988f88596-6zfsw\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.596154 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-config\") pod \"route-controller-manager-7c665b84c9-v9bqg\" (UID: \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\") " pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.596484 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-client-ca\") pod \"route-controller-manager-7c665b84c9-v9bqg\" (UID: \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\") " pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.597742 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/611e6147-8e37-41ba-8f13-d944070e4ed8-config\") pod \"controller-manager-988f88596-6zfsw\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.598065 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-serving-cert\") pod \"route-controller-manager-7c665b84c9-v9bqg\" (UID: \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\") " pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.598408 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/611e6147-8e37-41ba-8f13-d944070e4ed8-serving-cert\") pod \"controller-manager-988f88596-6zfsw\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.598869 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/611e6147-8e37-41ba-8f13-d944070e4ed8-proxy-ca-bundles\") pod \"controller-manager-988f88596-6zfsw\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.614184 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plnm2\" (UniqueName: \"kubernetes.io/projected/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-kube-api-access-plnm2\") pod \"route-controller-manager-7c665b84c9-v9bqg\" (UID: \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\") " 
pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.617726 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45fg5\" (UniqueName: \"kubernetes.io/projected/611e6147-8e37-41ba-8f13-d944070e4ed8-kube-api-access-45fg5\") pod \"controller-manager-988f88596-6zfsw\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.783476 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.792901 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.869163 5129 generic.go:334] "Generic (PLEG): container finished" podID="ee9d7764-4db5-49d9-9b08-7b2317ec41ca" containerID="d7f3cc88801371e4f5888211f1a798c8e33ed97d2cf20e6d3f1abb490ec74dd0" exitCode=0 Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.869246 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqcxv" event={"ID":"ee9d7764-4db5-49d9-9b08-7b2317ec41ca","Type":"ContainerDied","Data":"d7f3cc88801371e4f5888211f1a798c8e33ed97d2cf20e6d3f1abb490ec74dd0"} Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.902663 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqck" event={"ID":"2cdca220-d63a-45e2-ad4e-d2b920554116","Type":"ContainerStarted","Data":"ab7286767d8073deb30416644d38e366872aeccd9803f0a8008ab41b26a08609"} Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.924858 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-mdqck" podStartSLOduration=3.5185919610000003 podStartE2EDuration="59.924841311s" podCreationTimestamp="2026-03-14 07:02:33 +0000 UTC" firstStartedPulling="2026-03-14 07:02:36.255147899 +0000 UTC m=+219.007063083" lastFinishedPulling="2026-03-14 07:03:32.661397249 +0000 UTC m=+275.413312433" observedRunningTime="2026-03-14 07:03:32.923561504 +0000 UTC m=+275.675476698" watchObservedRunningTime="2026-03-14 07:03:32.924841311 +0000 UTC m=+275.676756495" Mar 14 07:03:32 crc kubenswrapper[5129]: I0314 07:03:32.963360 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6t6l8" Mar 14 07:03:33 crc kubenswrapper[5129]: I0314 07:03:33.026275 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg"] Mar 14 07:03:33 crc kubenswrapper[5129]: W0314 07:03:33.039435 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e9ffa51_5f5e_4f7e_9e89_64e91fb8034a.slice/crio-5614201981c33ed676f3d3f75c3bc215412175b212239908441da79eb7ab0d97 WatchSource:0}: Error finding container 5614201981c33ed676f3d3f75c3bc215412175b212239908441da79eb7ab0d97: Status 404 returned error can't find the container with id 5614201981c33ed676f3d3f75c3bc215412175b212239908441da79eb7ab0d97 Mar 14 07:03:33 crc kubenswrapper[5129]: I0314 07:03:33.292300 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-988f88596-6zfsw"] Mar 14 07:03:33 crc kubenswrapper[5129]: I0314 07:03:33.507816 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-gw4hz" podUID="ff0704b1-3b5e-4e02-9f82-c1d74ad03387" containerName="registry-server" probeResult="failure" output=< Mar 14 07:03:33 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s 
Mar 14 07:03:33 crc kubenswrapper[5129]: > Mar 14 07:03:33 crc kubenswrapper[5129]: I0314 07:03:33.920750 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" event={"ID":"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a","Type":"ContainerStarted","Data":"267c7c5b213a97c6b86b7359e2ca22d4b1a7ec91ed6e44eb647f03a0618a23f1"} Mar 14 07:03:33 crc kubenswrapper[5129]: I0314 07:03:33.920789 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" event={"ID":"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a","Type":"ContainerStarted","Data":"5614201981c33ed676f3d3f75c3bc215412175b212239908441da79eb7ab0d97"} Mar 14 07:03:33 crc kubenswrapper[5129]: I0314 07:03:33.921804 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" Mar 14 07:03:33 crc kubenswrapper[5129]: I0314 07:03:33.926011 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxbtj" event={"ID":"8e8571cc-ed81-4074-a9e7-24f81fa725f0","Type":"ContainerStarted","Data":"434b988f32830f00e4165efb189d6ae323a7c22c17bf328f3a88da6c9585491d"} Mar 14 07:03:33 crc kubenswrapper[5129]: I0314 07:03:33.928762 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" Mar 14 07:03:33 crc kubenswrapper[5129]: I0314 07:03:33.933117 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqcxv" event={"ID":"ee9d7764-4db5-49d9-9b08-7b2317ec41ca","Type":"ContainerStarted","Data":"fbee49c51893858f7669d9ead8502aaeb04841c08c469d27f8f8bdb769181758"} Mar 14 07:03:33 crc kubenswrapper[5129]: I0314 07:03:33.936855 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" event={"ID":"611e6147-8e37-41ba-8f13-d944070e4ed8","Type":"ContainerStarted","Data":"d7ec99cd26e02be357ead74338dfac629eec60336c25299ce4cbe1508b85b3f2"} Mar 14 07:03:33 crc kubenswrapper[5129]: I0314 07:03:33.936894 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" event={"ID":"611e6147-8e37-41ba-8f13-d944070e4ed8","Type":"ContainerStarted","Data":"cb666a76dd77333495d22d5f3ff537f0feb57d53c39c1db268f410d2da0832f0"} Mar 14 07:03:33 crc kubenswrapper[5129]: I0314 07:03:33.936908 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:33 crc kubenswrapper[5129]: I0314 07:03:33.944917 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:33 crc kubenswrapper[5129]: I0314 07:03:33.945614 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" podStartSLOduration=3.94558106 podStartE2EDuration="3.94558106s" podCreationTimestamp="2026-03-14 07:03:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:03:33.940166764 +0000 UTC m=+276.692081948" watchObservedRunningTime="2026-03-14 07:03:33.94558106 +0000 UTC m=+276.697496244" Mar 14 07:03:33 crc kubenswrapper[5129]: I0314 07:03:33.982907 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gqcxv" podStartSLOduration=2.821057866 podStartE2EDuration="1m1.982889272s" podCreationTimestamp="2026-03-14 07:02:32 +0000 UTC" firstStartedPulling="2026-03-14 07:02:34.200884471 +0000 UTC m=+216.952799655" lastFinishedPulling="2026-03-14 
07:03:33.362715877 +0000 UTC m=+276.114631061" observedRunningTime="2026-03-14 07:03:33.982685826 +0000 UTC m=+276.734601010" watchObservedRunningTime="2026-03-14 07:03:33.982889272 +0000 UTC m=+276.734804446" Mar 14 07:03:34 crc kubenswrapper[5129]: I0314 07:03:34.019251 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rxbtj" podStartSLOduration=8.367952199 podStartE2EDuration="59.019236647s" podCreationTimestamp="2026-03-14 07:02:35 +0000 UTC" firstStartedPulling="2026-03-14 07:02:42.626768825 +0000 UTC m=+225.378684009" lastFinishedPulling="2026-03-14 07:03:33.278053283 +0000 UTC m=+276.029968457" observedRunningTime="2026-03-14 07:03:34.018418373 +0000 UTC m=+276.770333557" watchObservedRunningTime="2026-03-14 07:03:34.019236647 +0000 UTC m=+276.771151831" Mar 14 07:03:34 crc kubenswrapper[5129]: I0314 07:03:34.061794 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" podStartSLOduration=4.06177357 podStartE2EDuration="4.06177357s" podCreationTimestamp="2026-03-14 07:03:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:03:34.044213585 +0000 UTC m=+276.796128769" watchObservedRunningTime="2026-03-14 07:03:34.06177357 +0000 UTC m=+276.813688764" Mar 14 07:03:34 crc kubenswrapper[5129]: I0314 07:03:34.231197 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mdqck" Mar 14 07:03:34 crc kubenswrapper[5129]: I0314 07:03:34.231240 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mdqck" Mar 14 07:03:35 crc kubenswrapper[5129]: I0314 07:03:35.282319 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-mdqck" 
podUID="2cdca220-d63a-45e2-ad4e-d2b920554116" containerName="registry-server" probeResult="failure" output=< Mar 14 07:03:35 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 07:03:35 crc kubenswrapper[5129]: > Mar 14 07:03:36 crc kubenswrapper[5129]: I0314 07:03:36.015339 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rxbtj" Mar 14 07:03:36 crc kubenswrapper[5129]: I0314 07:03:36.015381 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rxbtj" Mar 14 07:03:37 crc kubenswrapper[5129]: I0314 07:03:37.058281 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rxbtj" podUID="8e8571cc-ed81-4074-a9e7-24f81fa725f0" containerName="registry-server" probeResult="failure" output=< Mar 14 07:03:37 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 07:03:37 crc kubenswrapper[5129]: > Mar 14 07:03:43 crc kubenswrapper[5129]: I0314 07:03:42.480058 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gw4hz" Mar 14 07:03:43 crc kubenswrapper[5129]: I0314 07:03:42.527988 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gw4hz" Mar 14 07:03:43 crc kubenswrapper[5129]: I0314 07:03:42.832909 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gqcxv" Mar 14 07:03:43 crc kubenswrapper[5129]: I0314 07:03:42.832959 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gqcxv" Mar 14 07:03:43 crc kubenswrapper[5129]: I0314 07:03:42.869767 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gqcxv" Mar 14 
07:03:43 crc kubenswrapper[5129]: I0314 07:03:43.047028 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gqcxv" Mar 14 07:03:43 crc kubenswrapper[5129]: I0314 07:03:43.709631 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gqcxv"] Mar 14 07:03:44 crc kubenswrapper[5129]: I0314 07:03:44.280095 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mdqck" Mar 14 07:03:44 crc kubenswrapper[5129]: I0314 07:03:44.323351 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mdqck" Mar 14 07:03:45 crc kubenswrapper[5129]: I0314 07:03:45.005058 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gqcxv" podUID="ee9d7764-4db5-49d9-9b08-7b2317ec41ca" containerName="registry-server" containerID="cri-o://fbee49c51893858f7669d9ead8502aaeb04841c08c469d27f8f8bdb769181758" gracePeriod=2 Mar 14 07:03:45 crc kubenswrapper[5129]: I0314 07:03:45.961796 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gqcxv" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.018147 5129 generic.go:334] "Generic (PLEG): container finished" podID="ee9d7764-4db5-49d9-9b08-7b2317ec41ca" containerID="fbee49c51893858f7669d9ead8502aaeb04841c08c469d27f8f8bdb769181758" exitCode=0 Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.018190 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqcxv" event={"ID":"ee9d7764-4db5-49d9-9b08-7b2317ec41ca","Type":"ContainerDied","Data":"fbee49c51893858f7669d9ead8502aaeb04841c08c469d27f8f8bdb769181758"} Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.018480 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqcxv" event={"ID":"ee9d7764-4db5-49d9-9b08-7b2317ec41ca","Type":"ContainerDied","Data":"9917e4c91fe034a05ae9d77e77167039050c3524e4574dcb64d3c57e9af65e74"} Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.018503 5129 scope.go:117] "RemoveContainer" containerID="fbee49c51893858f7669d9ead8502aaeb04841c08c469d27f8f8bdb769181758" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.018649 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gqcxv" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.034654 5129 scope.go:117] "RemoveContainer" containerID="d7f3cc88801371e4f5888211f1a798c8e33ed97d2cf20e6d3f1abb490ec74dd0" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.053494 5129 scope.go:117] "RemoveContainer" containerID="bb28288488aadc810ffcb5b901e3b72844431d3ef6a47842aaa2cba5854778cb" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.063121 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rxbtj" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.088385 5129 scope.go:117] "RemoveContainer" containerID="fbee49c51893858f7669d9ead8502aaeb04841c08c469d27f8f8bdb769181758" Mar 14 07:03:46 crc kubenswrapper[5129]: E0314 07:03:46.090543 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbee49c51893858f7669d9ead8502aaeb04841c08c469d27f8f8bdb769181758\": container with ID starting with fbee49c51893858f7669d9ead8502aaeb04841c08c469d27f8f8bdb769181758 not found: ID does not exist" containerID="fbee49c51893858f7669d9ead8502aaeb04841c08c469d27f8f8bdb769181758" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.090572 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbee49c51893858f7669d9ead8502aaeb04841c08c469d27f8f8bdb769181758"} err="failed to get container status \"fbee49c51893858f7669d9ead8502aaeb04841c08c469d27f8f8bdb769181758\": rpc error: code = NotFound desc = could not find container \"fbee49c51893858f7669d9ead8502aaeb04841c08c469d27f8f8bdb769181758\": container with ID starting with fbee49c51893858f7669d9ead8502aaeb04841c08c469d27f8f8bdb769181758 not found: ID does not exist" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.090591 5129 scope.go:117] "RemoveContainer" 
containerID="d7f3cc88801371e4f5888211f1a798c8e33ed97d2cf20e6d3f1abb490ec74dd0" Mar 14 07:03:46 crc kubenswrapper[5129]: E0314 07:03:46.091035 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7f3cc88801371e4f5888211f1a798c8e33ed97d2cf20e6d3f1abb490ec74dd0\": container with ID starting with d7f3cc88801371e4f5888211f1a798c8e33ed97d2cf20e6d3f1abb490ec74dd0 not found: ID does not exist" containerID="d7f3cc88801371e4f5888211f1a798c8e33ed97d2cf20e6d3f1abb490ec74dd0" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.091075 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7f3cc88801371e4f5888211f1a798c8e33ed97d2cf20e6d3f1abb490ec74dd0"} err="failed to get container status \"d7f3cc88801371e4f5888211f1a798c8e33ed97d2cf20e6d3f1abb490ec74dd0\": rpc error: code = NotFound desc = could not find container \"d7f3cc88801371e4f5888211f1a798c8e33ed97d2cf20e6d3f1abb490ec74dd0\": container with ID starting with d7f3cc88801371e4f5888211f1a798c8e33ed97d2cf20e6d3f1abb490ec74dd0 not found: ID does not exist" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.091101 5129 scope.go:117] "RemoveContainer" containerID="bb28288488aadc810ffcb5b901e3b72844431d3ef6a47842aaa2cba5854778cb" Mar 14 07:03:46 crc kubenswrapper[5129]: E0314 07:03:46.091434 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb28288488aadc810ffcb5b901e3b72844431d3ef6a47842aaa2cba5854778cb\": container with ID starting with bb28288488aadc810ffcb5b901e3b72844431d3ef6a47842aaa2cba5854778cb not found: ID does not exist" containerID="bb28288488aadc810ffcb5b901e3b72844431d3ef6a47842aaa2cba5854778cb" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.091484 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bb28288488aadc810ffcb5b901e3b72844431d3ef6a47842aaa2cba5854778cb"} err="failed to get container status \"bb28288488aadc810ffcb5b901e3b72844431d3ef6a47842aaa2cba5854778cb\": rpc error: code = NotFound desc = could not find container \"bb28288488aadc810ffcb5b901e3b72844431d3ef6a47842aaa2cba5854778cb\": container with ID starting with bb28288488aadc810ffcb5b901e3b72844431d3ef6a47842aaa2cba5854778cb not found: ID does not exist" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.107135 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee9d7764-4db5-49d9-9b08-7b2317ec41ca-catalog-content\") pod \"ee9d7764-4db5-49d9-9b08-7b2317ec41ca\" (UID: \"ee9d7764-4db5-49d9-9b08-7b2317ec41ca\") " Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.107232 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee9d7764-4db5-49d9-9b08-7b2317ec41ca-utilities\") pod \"ee9d7764-4db5-49d9-9b08-7b2317ec41ca\" (UID: \"ee9d7764-4db5-49d9-9b08-7b2317ec41ca\") " Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.107287 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d764\" (UniqueName: \"kubernetes.io/projected/ee9d7764-4db5-49d9-9b08-7b2317ec41ca-kube-api-access-6d764\") pod \"ee9d7764-4db5-49d9-9b08-7b2317ec41ca\" (UID: \"ee9d7764-4db5-49d9-9b08-7b2317ec41ca\") " Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.119319 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee9d7764-4db5-49d9-9b08-7b2317ec41ca-utilities" (OuterVolumeSpecName: "utilities") pod "ee9d7764-4db5-49d9-9b08-7b2317ec41ca" (UID: "ee9d7764-4db5-49d9-9b08-7b2317ec41ca"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.120532 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee9d7764-4db5-49d9-9b08-7b2317ec41ca-kube-api-access-6d764" (OuterVolumeSpecName: "kube-api-access-6d764") pod "ee9d7764-4db5-49d9-9b08-7b2317ec41ca" (UID: "ee9d7764-4db5-49d9-9b08-7b2317ec41ca"). InnerVolumeSpecName "kube-api-access-6d764". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.137932 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rxbtj" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.172146 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee9d7764-4db5-49d9-9b08-7b2317ec41ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee9d7764-4db5-49d9-9b08-7b2317ec41ca" (UID: "ee9d7764-4db5-49d9-9b08-7b2317ec41ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.208633 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee9d7764-4db5-49d9-9b08-7b2317ec41ca-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.208675 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d764\" (UniqueName: \"kubernetes.io/projected/ee9d7764-4db5-49d9-9b08-7b2317ec41ca-kube-api-access-6d764\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.208691 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee9d7764-4db5-49d9-9b08-7b2317ec41ca-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.351891 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gqcxv"] Mar 14 07:03:46 crc kubenswrapper[5129]: I0314 07:03:46.354643 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gqcxv"] Mar 14 07:03:48 crc kubenswrapper[5129]: I0314 07:03:48.049773 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee9d7764-4db5-49d9-9b08-7b2317ec41ca" path="/var/lib/kubelet/pods/ee9d7764-4db5-49d9-9b08-7b2317ec41ca/volumes" Mar 14 07:03:48 crc kubenswrapper[5129]: I0314 07:03:48.511220 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rxbtj"] Mar 14 07:03:48 crc kubenswrapper[5129]: I0314 07:03:48.511713 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rxbtj" podUID="8e8571cc-ed81-4074-a9e7-24f81fa725f0" containerName="registry-server" containerID="cri-o://434b988f32830f00e4165efb189d6ae323a7c22c17bf328f3a88da6c9585491d" gracePeriod=2 
Mar 14 07:03:49 crc kubenswrapper[5129]: I0314 07:03:49.047937 5129 generic.go:334] "Generic (PLEG): container finished" podID="8e8571cc-ed81-4074-a9e7-24f81fa725f0" containerID="434b988f32830f00e4165efb189d6ae323a7c22c17bf328f3a88da6c9585491d" exitCode=0 Mar 14 07:03:49 crc kubenswrapper[5129]: I0314 07:03:49.047976 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxbtj" event={"ID":"8e8571cc-ed81-4074-a9e7-24f81fa725f0","Type":"ContainerDied","Data":"434b988f32830f00e4165efb189d6ae323a7c22c17bf328f3a88da6c9585491d"} Mar 14 07:03:49 crc kubenswrapper[5129]: I0314 07:03:49.048000 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxbtj" event={"ID":"8e8571cc-ed81-4074-a9e7-24f81fa725f0","Type":"ContainerDied","Data":"c8a6af68962a7794f7cd77491514e81f6b536cc278d1a6553dc804ca2755e264"} Mar 14 07:03:49 crc kubenswrapper[5129]: I0314 07:03:49.048012 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8a6af68962a7794f7cd77491514e81f6b536cc278d1a6553dc804ca2755e264" Mar 14 07:03:49 crc kubenswrapper[5129]: I0314 07:03:49.063790 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rxbtj" Mar 14 07:03:49 crc kubenswrapper[5129]: I0314 07:03:49.246027 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e8571cc-ed81-4074-a9e7-24f81fa725f0-catalog-content\") pod \"8e8571cc-ed81-4074-a9e7-24f81fa725f0\" (UID: \"8e8571cc-ed81-4074-a9e7-24f81fa725f0\") " Mar 14 07:03:49 crc kubenswrapper[5129]: I0314 07:03:49.246096 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e8571cc-ed81-4074-a9e7-24f81fa725f0-utilities\") pod \"8e8571cc-ed81-4074-a9e7-24f81fa725f0\" (UID: \"8e8571cc-ed81-4074-a9e7-24f81fa725f0\") " Mar 14 07:03:49 crc kubenswrapper[5129]: I0314 07:03:49.246137 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vt7x\" (UniqueName: \"kubernetes.io/projected/8e8571cc-ed81-4074-a9e7-24f81fa725f0-kube-api-access-5vt7x\") pod \"8e8571cc-ed81-4074-a9e7-24f81fa725f0\" (UID: \"8e8571cc-ed81-4074-a9e7-24f81fa725f0\") " Mar 14 07:03:49 crc kubenswrapper[5129]: I0314 07:03:49.247693 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e8571cc-ed81-4074-a9e7-24f81fa725f0-utilities" (OuterVolumeSpecName: "utilities") pod "8e8571cc-ed81-4074-a9e7-24f81fa725f0" (UID: "8e8571cc-ed81-4074-a9e7-24f81fa725f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:03:49 crc kubenswrapper[5129]: I0314 07:03:49.250745 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e8571cc-ed81-4074-a9e7-24f81fa725f0-kube-api-access-5vt7x" (OuterVolumeSpecName: "kube-api-access-5vt7x") pod "8e8571cc-ed81-4074-a9e7-24f81fa725f0" (UID: "8e8571cc-ed81-4074-a9e7-24f81fa725f0"). InnerVolumeSpecName "kube-api-access-5vt7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:49 crc kubenswrapper[5129]: I0314 07:03:49.347014 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e8571cc-ed81-4074-a9e7-24f81fa725f0-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:49 crc kubenswrapper[5129]: I0314 07:03:49.347301 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vt7x\" (UniqueName: \"kubernetes.io/projected/8e8571cc-ed81-4074-a9e7-24f81fa725f0-kube-api-access-5vt7x\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:49 crc kubenswrapper[5129]: I0314 07:03:49.368506 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e8571cc-ed81-4074-a9e7-24f81fa725f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e8571cc-ed81-4074-a9e7-24f81fa725f0" (UID: "8e8571cc-ed81-4074-a9e7-24f81fa725f0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:03:49 crc kubenswrapper[5129]: I0314 07:03:49.448483 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e8571cc-ed81-4074-a9e7-24f81fa725f0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:50 crc kubenswrapper[5129]: I0314 07:03:50.052743 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rxbtj" Mar 14 07:03:50 crc kubenswrapper[5129]: I0314 07:03:50.077500 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rxbtj"] Mar 14 07:03:50 crc kubenswrapper[5129]: I0314 07:03:50.086976 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rxbtj"] Mar 14 07:03:50 crc kubenswrapper[5129]: I0314 07:03:50.661395 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-988f88596-6zfsw"] Mar 14 07:03:50 crc kubenswrapper[5129]: I0314 07:03:50.661761 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" podUID="611e6147-8e37-41ba-8f13-d944070e4ed8" containerName="controller-manager" containerID="cri-o://d7ec99cd26e02be357ead74338dfac629eec60336c25299ce4cbe1508b85b3f2" gracePeriod=30 Mar 14 07:03:50 crc kubenswrapper[5129]: I0314 07:03:50.756297 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg"] Mar 14 07:03:50 crc kubenswrapper[5129]: I0314 07:03:50.756491 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" podUID="7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a" containerName="route-controller-manager" containerID="cri-o://267c7c5b213a97c6b86b7359e2ca22d4b1a7ec91ed6e44eb647f03a0618a23f1" gracePeriod=30 Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.058966 5129 generic.go:334] "Generic (PLEG): container finished" podID="7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a" containerID="267c7c5b213a97c6b86b7359e2ca22d4b1a7ec91ed6e44eb647f03a0618a23f1" exitCode=0 Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.059026 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" event={"ID":"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a","Type":"ContainerDied","Data":"267c7c5b213a97c6b86b7359e2ca22d4b1a7ec91ed6e44eb647f03a0618a23f1"} Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.060049 5129 generic.go:334] "Generic (PLEG): container finished" podID="611e6147-8e37-41ba-8f13-d944070e4ed8" containerID="d7ec99cd26e02be357ead74338dfac629eec60336c25299ce4cbe1508b85b3f2" exitCode=0 Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.060070 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" event={"ID":"611e6147-8e37-41ba-8f13-d944070e4ed8","Type":"ContainerDied","Data":"d7ec99cd26e02be357ead74338dfac629eec60336c25299ce4cbe1508b85b3f2"} Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.192521 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.196288 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.374367 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/611e6147-8e37-41ba-8f13-d944070e4ed8-config\") pod \"611e6147-8e37-41ba-8f13-d944070e4ed8\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.374402 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-client-ca\") pod \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\" (UID: \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\") " Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.374433 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-serving-cert\") pod \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\" (UID: \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\") " Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.374453 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45fg5\" (UniqueName: \"kubernetes.io/projected/611e6147-8e37-41ba-8f13-d944070e4ed8-kube-api-access-45fg5\") pod \"611e6147-8e37-41ba-8f13-d944070e4ed8\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.374472 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/611e6147-8e37-41ba-8f13-d944070e4ed8-proxy-ca-bundles\") pod \"611e6147-8e37-41ba-8f13-d944070e4ed8\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.375208 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/611e6147-8e37-41ba-8f13-d944070e4ed8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "611e6147-8e37-41ba-8f13-d944070e4ed8" (UID: "611e6147-8e37-41ba-8f13-d944070e4ed8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.375325 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/611e6147-8e37-41ba-8f13-d944070e4ed8-config" (OuterVolumeSpecName: "config") pod "611e6147-8e37-41ba-8f13-d944070e4ed8" (UID: "611e6147-8e37-41ba-8f13-d944070e4ed8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.375382 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plnm2\" (UniqueName: \"kubernetes.io/projected/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-kube-api-access-plnm2\") pod \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\" (UID: \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\") " Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.375405 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/611e6147-8e37-41ba-8f13-d944070e4ed8-serving-cert\") pod \"611e6147-8e37-41ba-8f13-d944070e4ed8\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.375425 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/611e6147-8e37-41ba-8f13-d944070e4ed8-client-ca\") pod \"611e6147-8e37-41ba-8f13-d944070e4ed8\" (UID: \"611e6147-8e37-41ba-8f13-d944070e4ed8\") " Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.375450 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-config\") pod \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\" (UID: \"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a\") " Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.375689 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/611e6147-8e37-41ba-8f13-d944070e4ed8-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.375706 5129 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/611e6147-8e37-41ba-8f13-d944070e4ed8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.375791 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-client-ca" (OuterVolumeSpecName: "client-ca") pod "7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a" (UID: "7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.375960 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/611e6147-8e37-41ba-8f13-d944070e4ed8-client-ca" (OuterVolumeSpecName: "client-ca") pod "611e6147-8e37-41ba-8f13-d944070e4ed8" (UID: "611e6147-8e37-41ba-8f13-d944070e4ed8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.376053 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-config" (OuterVolumeSpecName: "config") pod "7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a" (UID: "7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.379284 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611e6147-8e37-41ba-8f13-d944070e4ed8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "611e6147-8e37-41ba-8f13-d944070e4ed8" (UID: "611e6147-8e37-41ba-8f13-d944070e4ed8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.379452 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a" (UID: "7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.379886 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-kube-api-access-plnm2" (OuterVolumeSpecName: "kube-api-access-plnm2") pod "7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a" (UID: "7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a"). InnerVolumeSpecName "kube-api-access-plnm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.380116 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/611e6147-8e37-41ba-8f13-d944070e4ed8-kube-api-access-45fg5" (OuterVolumeSpecName: "kube-api-access-45fg5") pod "611e6147-8e37-41ba-8f13-d944070e4ed8" (UID: "611e6147-8e37-41ba-8f13-d944070e4ed8"). InnerVolumeSpecName "kube-api-access-45fg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.477381 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.477426 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45fg5\" (UniqueName: \"kubernetes.io/projected/611e6147-8e37-41ba-8f13-d944070e4ed8-kube-api-access-45fg5\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.477436 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plnm2\" (UniqueName: \"kubernetes.io/projected/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-kube-api-access-plnm2\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.477444 5129 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/611e6147-8e37-41ba-8f13-d944070e4ed8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.477451 5129 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/611e6147-8e37-41ba-8f13-d944070e4ed8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.477460 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:51 crc kubenswrapper[5129]: I0314 07:03:51.477469 5129 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.043111 5129 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e8571cc-ed81-4074-a9e7-24f81fa725f0" path="/var/lib/kubelet/pods/8e8571cc-ed81-4074-a9e7-24f81fa725f0/volumes" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.065076 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.065092 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-988f88596-6zfsw" event={"ID":"611e6147-8e37-41ba-8f13-d944070e4ed8","Type":"ContainerDied","Data":"cb666a76dd77333495d22d5f3ff537f0feb57d53c39c1db268f410d2da0832f0"} Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.065143 5129 scope.go:117] "RemoveContainer" containerID="d7ec99cd26e02be357ead74338dfac629eec60336c25299ce4cbe1508b85b3f2" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.066678 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" event={"ID":"7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a","Type":"ContainerDied","Data":"5614201981c33ed676f3d3f75c3bc215412175b212239908441da79eb7ab0d97"} Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.066752 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.101499 5129 scope.go:117] "RemoveContainer" containerID="267c7c5b213a97c6b86b7359e2ca22d4b1a7ec91ed6e44eb647f03a0618a23f1" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.104240 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-988f88596-6zfsw"] Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.109843 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-988f88596-6zfsw"] Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.120541 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg"] Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.124684 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c665b84c9-v9bqg"] Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.422849 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-647bfb886c-k9jtw"] Mar 14 07:03:52 crc kubenswrapper[5129]: E0314 07:03:52.423034 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8571cc-ed81-4074-a9e7-24f81fa725f0" containerName="extract-content" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.423045 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8571cc-ed81-4074-a9e7-24f81fa725f0" containerName="extract-content" Mar 14 07:03:52 crc kubenswrapper[5129]: E0314 07:03:52.423057 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee9d7764-4db5-49d9-9b08-7b2317ec41ca" containerName="extract-utilities" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.423064 5129 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ee9d7764-4db5-49d9-9b08-7b2317ec41ca" containerName="extract-utilities" Mar 14 07:03:52 crc kubenswrapper[5129]: E0314 07:03:52.423075 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee9d7764-4db5-49d9-9b08-7b2317ec41ca" containerName="registry-server" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.423081 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee9d7764-4db5-49d9-9b08-7b2317ec41ca" containerName="registry-server" Mar 14 07:03:52 crc kubenswrapper[5129]: E0314 07:03:52.423088 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8571cc-ed81-4074-a9e7-24f81fa725f0" containerName="extract-utilities" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.423094 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8571cc-ed81-4074-a9e7-24f81fa725f0" containerName="extract-utilities" Mar 14 07:03:52 crc kubenswrapper[5129]: E0314 07:03:52.423101 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a" containerName="route-controller-manager" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.423108 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a" containerName="route-controller-manager" Mar 14 07:03:52 crc kubenswrapper[5129]: E0314 07:03:52.423115 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee9d7764-4db5-49d9-9b08-7b2317ec41ca" containerName="extract-content" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.423120 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee9d7764-4db5-49d9-9b08-7b2317ec41ca" containerName="extract-content" Mar 14 07:03:52 crc kubenswrapper[5129]: E0314 07:03:52.423126 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8571cc-ed81-4074-a9e7-24f81fa725f0" containerName="registry-server" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.423131 5129 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="8e8571cc-ed81-4074-a9e7-24f81fa725f0" containerName="registry-server" Mar 14 07:03:52 crc kubenswrapper[5129]: E0314 07:03:52.423142 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611e6147-8e37-41ba-8f13-d944070e4ed8" containerName="controller-manager" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.423147 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="611e6147-8e37-41ba-8f13-d944070e4ed8" containerName="controller-manager" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.423240 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="611e6147-8e37-41ba-8f13-d944070e4ed8" containerName="controller-manager" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.423252 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e8571cc-ed81-4074-a9e7-24f81fa725f0" containerName="registry-server" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.423267 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a" containerName="route-controller-manager" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.423277 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee9d7764-4db5-49d9-9b08-7b2317ec41ca" containerName="registry-server" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.423638 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8"] Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.424219 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.424577 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.427044 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.427431 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.427548 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.427742 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.427901 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.428082 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.430026 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.430166 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.430341 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.430523 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 07:03:52 crc 
kubenswrapper[5129]: I0314 07:03:52.430708 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.432023 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.434184 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-647bfb886c-k9jtw"] Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.442526 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.447026 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8"] Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.505162 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/614c8c14-d2a8-46aa-82b6-999a04e05e4f-config\") pod \"route-controller-manager-5df76b77f5-tt9b8\" (UID: \"614c8c14-d2a8-46aa-82b6-999a04e05e4f\") " pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.505220 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/614c8c14-d2a8-46aa-82b6-999a04e05e4f-serving-cert\") pod \"route-controller-manager-5df76b77f5-tt9b8\" (UID: \"614c8c14-d2a8-46aa-82b6-999a04e05e4f\") " pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.505272 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fa1b39d-3713-4f51-8289-52f38a7427e0-proxy-ca-bundles\") pod \"controller-manager-647bfb886c-k9jtw\" (UID: \"1fa1b39d-3713-4f51-8289-52f38a7427e0\") " pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.505294 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fa1b39d-3713-4f51-8289-52f38a7427e0-serving-cert\") pod \"controller-manager-647bfb886c-k9jtw\" (UID: \"1fa1b39d-3713-4f51-8289-52f38a7427e0\") " pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.505325 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1fa1b39d-3713-4f51-8289-52f38a7427e0-client-ca\") pod \"controller-manager-647bfb886c-k9jtw\" (UID: \"1fa1b39d-3713-4f51-8289-52f38a7427e0\") " pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.505345 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98bnt\" (UniqueName: \"kubernetes.io/projected/1fa1b39d-3713-4f51-8289-52f38a7427e0-kube-api-access-98bnt\") pod \"controller-manager-647bfb886c-k9jtw\" (UID: \"1fa1b39d-3713-4f51-8289-52f38a7427e0\") " pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.505385 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa1b39d-3713-4f51-8289-52f38a7427e0-config\") pod \"controller-manager-647bfb886c-k9jtw\" (UID: \"1fa1b39d-3713-4f51-8289-52f38a7427e0\") " 
pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.505426 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/614c8c14-d2a8-46aa-82b6-999a04e05e4f-client-ca\") pod \"route-controller-manager-5df76b77f5-tt9b8\" (UID: \"614c8c14-d2a8-46aa-82b6-999a04e05e4f\") " pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.505445 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hntqt\" (UniqueName: \"kubernetes.io/projected/614c8c14-d2a8-46aa-82b6-999a04e05e4f-kube-api-access-hntqt\") pod \"route-controller-manager-5df76b77f5-tt9b8\" (UID: \"614c8c14-d2a8-46aa-82b6-999a04e05e4f\") " pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.606150 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa1b39d-3713-4f51-8289-52f38a7427e0-config\") pod \"controller-manager-647bfb886c-k9jtw\" (UID: \"1fa1b39d-3713-4f51-8289-52f38a7427e0\") " pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.606209 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/614c8c14-d2a8-46aa-82b6-999a04e05e4f-client-ca\") pod \"route-controller-manager-5df76b77f5-tt9b8\" (UID: \"614c8c14-d2a8-46aa-82b6-999a04e05e4f\") " pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.606229 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hntqt\" 
(UniqueName: \"kubernetes.io/projected/614c8c14-d2a8-46aa-82b6-999a04e05e4f-kube-api-access-hntqt\") pod \"route-controller-manager-5df76b77f5-tt9b8\" (UID: \"614c8c14-d2a8-46aa-82b6-999a04e05e4f\") " pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.606272 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/614c8c14-d2a8-46aa-82b6-999a04e05e4f-config\") pod \"route-controller-manager-5df76b77f5-tt9b8\" (UID: \"614c8c14-d2a8-46aa-82b6-999a04e05e4f\") " pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.606299 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/614c8c14-d2a8-46aa-82b6-999a04e05e4f-serving-cert\") pod \"route-controller-manager-5df76b77f5-tt9b8\" (UID: \"614c8c14-d2a8-46aa-82b6-999a04e05e4f\") " pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.606318 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fa1b39d-3713-4f51-8289-52f38a7427e0-proxy-ca-bundles\") pod \"controller-manager-647bfb886c-k9jtw\" (UID: \"1fa1b39d-3713-4f51-8289-52f38a7427e0\") " pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.606348 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fa1b39d-3713-4f51-8289-52f38a7427e0-serving-cert\") pod \"controller-manager-647bfb886c-k9jtw\" (UID: \"1fa1b39d-3713-4f51-8289-52f38a7427e0\") " pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:52 
crc kubenswrapper[5129]: I0314 07:03:52.606374 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1fa1b39d-3713-4f51-8289-52f38a7427e0-client-ca\") pod \"controller-manager-647bfb886c-k9jtw\" (UID: \"1fa1b39d-3713-4f51-8289-52f38a7427e0\") " pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.606389 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98bnt\" (UniqueName: \"kubernetes.io/projected/1fa1b39d-3713-4f51-8289-52f38a7427e0-kube-api-access-98bnt\") pod \"controller-manager-647bfb886c-k9jtw\" (UID: \"1fa1b39d-3713-4f51-8289-52f38a7427e0\") " pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.607756 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1fa1b39d-3713-4f51-8289-52f38a7427e0-client-ca\") pod \"controller-manager-647bfb886c-k9jtw\" (UID: \"1fa1b39d-3713-4f51-8289-52f38a7427e0\") " pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.608023 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1fa1b39d-3713-4f51-8289-52f38a7427e0-proxy-ca-bundles\") pod \"controller-manager-647bfb886c-k9jtw\" (UID: \"1fa1b39d-3713-4f51-8289-52f38a7427e0\") " pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.608172 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/614c8c14-d2a8-46aa-82b6-999a04e05e4f-config\") pod \"route-controller-manager-5df76b77f5-tt9b8\" (UID: \"614c8c14-d2a8-46aa-82b6-999a04e05e4f\") " 
pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.608290 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/614c8c14-d2a8-46aa-82b6-999a04e05e4f-client-ca\") pod \"route-controller-manager-5df76b77f5-tt9b8\" (UID: \"614c8c14-d2a8-46aa-82b6-999a04e05e4f\") " pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.608475 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa1b39d-3713-4f51-8289-52f38a7427e0-config\") pod \"controller-manager-647bfb886c-k9jtw\" (UID: \"1fa1b39d-3713-4f51-8289-52f38a7427e0\") " pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.611663 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/614c8c14-d2a8-46aa-82b6-999a04e05e4f-serving-cert\") pod \"route-controller-manager-5df76b77f5-tt9b8\" (UID: \"614c8c14-d2a8-46aa-82b6-999a04e05e4f\") " pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.615737 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fa1b39d-3713-4f51-8289-52f38a7427e0-serving-cert\") pod \"controller-manager-647bfb886c-k9jtw\" (UID: \"1fa1b39d-3713-4f51-8289-52f38a7427e0\") " pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.623246 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hntqt\" (UniqueName: \"kubernetes.io/projected/614c8c14-d2a8-46aa-82b6-999a04e05e4f-kube-api-access-hntqt\") 
pod \"route-controller-manager-5df76b77f5-tt9b8\" (UID: \"614c8c14-d2a8-46aa-82b6-999a04e05e4f\") " pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.623659 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98bnt\" (UniqueName: \"kubernetes.io/projected/1fa1b39d-3713-4f51-8289-52f38a7427e0-kube-api-access-98bnt\") pod \"controller-manager-647bfb886c-k9jtw\" (UID: \"1fa1b39d-3713-4f51-8289-52f38a7427e0\") " pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.756302 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.773349 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:52 crc kubenswrapper[5129]: I0314 07:03:52.992388 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-647bfb886c-k9jtw"] Mar 14 07:03:53 crc kubenswrapper[5129]: W0314 07:03:53.001637 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fa1b39d_3713_4f51_8289_52f38a7427e0.slice/crio-6a691793de7e1f65342d93d4a009d06d09c049b19e99ecd505a4d0e8f5b830f5 WatchSource:0}: Error finding container 6a691793de7e1f65342d93d4a009d06d09c049b19e99ecd505a4d0e8f5b830f5: Status 404 returned error can't find the container with id 6a691793de7e1f65342d93d4a009d06d09c049b19e99ecd505a4d0e8f5b830f5 Mar 14 07:03:53 crc kubenswrapper[5129]: I0314 07:03:53.072499 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" 
event={"ID":"1fa1b39d-3713-4f51-8289-52f38a7427e0","Type":"ContainerStarted","Data":"6a691793de7e1f65342d93d4a009d06d09c049b19e99ecd505a4d0e8f5b830f5"} Mar 14 07:03:53 crc kubenswrapper[5129]: I0314 07:03:53.154437 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8"] Mar 14 07:03:53 crc kubenswrapper[5129]: W0314 07:03:53.160151 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod614c8c14_d2a8_46aa_82b6_999a04e05e4f.slice/crio-3f0ea8d00877076b476b04cdc08655929fa2d1a4548791f4d20607e173ab658b WatchSource:0}: Error finding container 3f0ea8d00877076b476b04cdc08655929fa2d1a4548791f4d20607e173ab658b: Status 404 returned error can't find the container with id 3f0ea8d00877076b476b04cdc08655929fa2d1a4548791f4d20607e173ab658b Mar 14 07:03:53 crc kubenswrapper[5129]: I0314 07:03:53.529515 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fmn4k"] Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.045788 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="611e6147-8e37-41ba-8f13-d944070e4ed8" path="/var/lib/kubelet/pods/611e6147-8e37-41ba-8f13-d944070e4ed8/volumes" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.046964 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a" path="/var/lib/kubelet/pods/7e9ffa51-5f5e-4f7e-9e89-64e91fb8034a/volumes" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.082151 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" event={"ID":"1fa1b39d-3713-4f51-8289-52f38a7427e0","Type":"ContainerStarted","Data":"46252413c18c01a08f1e1453ed83f9c124e9f2891b1c3ed73363750fe9a2a7ed"} Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.082453 5129 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.083547 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" event={"ID":"614c8c14-d2a8-46aa-82b6-999a04e05e4f","Type":"ContainerStarted","Data":"ed1aa81672ac103c9934a8bdc7609742594fe7d08f995cb521233c9c08b84439"} Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.083576 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" event={"ID":"614c8c14-d2a8-46aa-82b6-999a04e05e4f","Type":"ContainerStarted","Data":"3f0ea8d00877076b476b04cdc08655929fa2d1a4548791f4d20607e173ab658b"} Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.083732 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.088576 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.091101 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.104731 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-647bfb886c-k9jtw" podStartSLOduration=4.104711537 podStartE2EDuration="4.104711537s" podCreationTimestamp="2026-03-14 07:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:03:54.100849196 +0000 UTC m=+296.852764400" 
watchObservedRunningTime="2026-03-14 07:03:54.104711537 +0000 UTC m=+296.856626731" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.136127 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5df76b77f5-tt9b8" podStartSLOduration=4.136109499 podStartE2EDuration="4.136109499s" podCreationTimestamp="2026-03-14 07:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:03:54.135591434 +0000 UTC m=+296.887506618" watchObservedRunningTime="2026-03-14 07:03:54.136109499 +0000 UTC m=+296.888024683" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.440469 5129 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.440784 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0" gracePeriod=15 Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.440834 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee" gracePeriod=15 Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.440901 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115" gracePeriod=15 Mar 14 07:03:54 
crc kubenswrapper[5129]: I0314 07:03:54.440842 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780" gracePeriod=15 Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.440853 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1" gracePeriod=15 Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.441987 5129 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 07:03:54 crc kubenswrapper[5129]: E0314 07:03:54.443335 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.443360 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 14 07:03:54 crc kubenswrapper[5129]: E0314 07:03:54.443385 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.443392 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:03:54 crc kubenswrapper[5129]: E0314 07:03:54.443403 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 14 07:03:54 crc 
kubenswrapper[5129]: I0314 07:03:54.443410 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 14 07:03:54 crc kubenswrapper[5129]: E0314 07:03:54.443420 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.443427 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:03:54 crc kubenswrapper[5129]: E0314 07:03:54.443446 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.443453 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 14 07:03:54 crc kubenswrapper[5129]: E0314 07:03:54.443462 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.443471 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:03:54 crc kubenswrapper[5129]: E0314 07:03:54.443490 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.443497 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 14 07:03:54 crc kubenswrapper[5129]: E0314 07:03:54.443509 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.443517 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.443686 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.443703 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.443713 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.443721 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.443740 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.443750 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.443758 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.443769 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 14 07:03:54 crc kubenswrapper[5129]: E0314 
07:03:54.443891 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.443901 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:03:54 crc kubenswrapper[5129]: E0314 07:03:54.443910 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.443918 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.444018 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.445154 5129 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.445705 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.453223 5129 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.481167 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.631344 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.631527 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.631643 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.631704 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.631786 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.631835 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.631946 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.632011 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.733638 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.733691 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.733725 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.733742 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.733759 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.733785 5129 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.733800 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.733791 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.733837 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.733840 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.733881 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.733852 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.733885 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.733882 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.733852 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.733954 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: I0314 07:03:54.779896 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:03:54 crc kubenswrapper[5129]: W0314 07:03:54.796570 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-8319ee9d7c465864e13ed987005242caca726ae7f83b44709acd33efb93607f7 WatchSource:0}: Error finding container 8319ee9d7c465864e13ed987005242caca726ae7f83b44709acd33efb93607f7: Status 404 returned error can't find the container with id 8319ee9d7c465864e13ed987005242caca726ae7f83b44709acd33efb93607f7 Mar 14 07:03:54 crc kubenswrapper[5129]: E0314 07:03:54.799076 5129 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.113:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ca33ccc4f791e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 07:03:54.798258462 +0000 UTC m=+297.550173686,LastTimestamp:2026-03-14 07:03:54.798258462 +0000 UTC m=+297.550173686,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 07:03:55 crc kubenswrapper[5129]: I0314 07:03:55.094787 5129 generic.go:334] "Generic (PLEG): container finished" podID="b7526b6c-47da-49a7-8751-fb1a037a3082" containerID="e66a306547d2d502fbec60b974e1970c8d3c039b87bc4b1b6821208e39fad78c" exitCode=0 Mar 14 07:03:55 crc kubenswrapper[5129]: I0314 07:03:55.094928 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b7526b6c-47da-49a7-8751-fb1a037a3082","Type":"ContainerDied","Data":"e66a306547d2d502fbec60b974e1970c8d3c039b87bc4b1b6821208e39fad78c"} Mar 14 07:03:55 crc kubenswrapper[5129]: I0314 07:03:55.096117 5129 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:03:55 crc kubenswrapper[5129]: I0314 07:03:55.097466 5129 status_manager.go:851] "Failed to get status for pod" podUID="b7526b6c-47da-49a7-8751-fb1a037a3082" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:03:55 crc kubenswrapper[5129]: I0314 07:03:55.097970 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 07:03:55 crc kubenswrapper[5129]: I0314 07:03:55.100453 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 07:03:55 crc kubenswrapper[5129]: I0314 
07:03:55.102593 5129 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee" exitCode=0 Mar 14 07:03:55 crc kubenswrapper[5129]: I0314 07:03:55.102666 5129 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115" exitCode=0 Mar 14 07:03:55 crc kubenswrapper[5129]: I0314 07:03:55.102681 5129 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1" exitCode=0 Mar 14 07:03:55 crc kubenswrapper[5129]: I0314 07:03:55.102644 5129 scope.go:117] "RemoveContainer" containerID="846de44c98b508e75e624aee04f1c490dfa91353f12cc2c5a0fdfb265ffda9ee" Mar 14 07:03:55 crc kubenswrapper[5129]: I0314 07:03:55.102695 5129 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780" exitCode=2 Mar 14 07:03:55 crc kubenswrapper[5129]: I0314 07:03:55.109097 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2e668872f4a33ca7a59e85fdf1d5a7125b31c2e030c03b944eb127123befd9c0"} Mar 14 07:03:55 crc kubenswrapper[5129]: I0314 07:03:55.109136 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"8319ee9d7c465864e13ed987005242caca726ae7f83b44709acd33efb93607f7"} Mar 14 07:03:55 crc kubenswrapper[5129]: I0314 07:03:55.109585 5129 status_manager.go:851] "Failed to get status for pod" podUID="b7526b6c-47da-49a7-8751-fb1a037a3082" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:03:55 crc kubenswrapper[5129]: I0314 07:03:55.109838 5129 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.117348 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.428095 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.428935 5129 status_manager.go:851] "Failed to get status for pod" podUID="b7526b6c-47da-49a7-8751-fb1a037a3082" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.429303 5129 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.556113 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7526b6c-47da-49a7-8751-fb1a037a3082-var-lock\") pod \"b7526b6c-47da-49a7-8751-fb1a037a3082\" (UID: \"b7526b6c-47da-49a7-8751-fb1a037a3082\") " Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.556238 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7526b6c-47da-49a7-8751-fb1a037a3082-var-lock" (OuterVolumeSpecName: "var-lock") pod "b7526b6c-47da-49a7-8751-fb1a037a3082" (UID: "b7526b6c-47da-49a7-8751-fb1a037a3082"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.556253 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7526b6c-47da-49a7-8751-fb1a037a3082-kube-api-access\") pod \"b7526b6c-47da-49a7-8751-fb1a037a3082\" (UID: \"b7526b6c-47da-49a7-8751-fb1a037a3082\") " Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.556368 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7526b6c-47da-49a7-8751-fb1a037a3082-kubelet-dir\") pod \"b7526b6c-47da-49a7-8751-fb1a037a3082\" (UID: \"b7526b6c-47da-49a7-8751-fb1a037a3082\") " Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.556839 5129 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7526b6c-47da-49a7-8751-fb1a037a3082-var-lock\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.556875 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7526b6c-47da-49a7-8751-fb1a037a3082-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b7526b6c-47da-49a7-8751-fb1a037a3082" (UID: "b7526b6c-47da-49a7-8751-fb1a037a3082"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.564920 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7526b6c-47da-49a7-8751-fb1a037a3082-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b7526b6c-47da-49a7-8751-fb1a037a3082" (UID: "b7526b6c-47da-49a7-8751-fb1a037a3082"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.657739 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7526b6c-47da-49a7-8751-fb1a037a3082-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.657771 5129 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7526b6c-47da-49a7-8751-fb1a037a3082-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.805988 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.808411 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.809016 5129 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.809524 5129 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.809919 5129 status_manager.go:851] "Failed to get status for pod" podUID="b7526b6c-47da-49a7-8751-fb1a037a3082" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.961912 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.962030 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.962122 5129 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.962320 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.962377 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:03:56 crc kubenswrapper[5129]: I0314 07:03:56.962427 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.063447 5129 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.063496 5129 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.063519 5129 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.127009 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.127031 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b7526b6c-47da-49a7-8751-fb1a037a3082","Type":"ContainerDied","Data":"e04861ed52d15214a2454d2fd113731e67a83588963d524363e908d2ecc48f8f"} Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.127082 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e04861ed52d15214a2454d2fd113731e67a83588963d524363e908d2ecc48f8f" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.143118 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.143910 5129 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0" exitCode=0 Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.143965 5129 scope.go:117] "RemoveContainer" containerID="87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.144092 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.150513 5129 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.151200 5129 status_manager.go:851] "Failed to get status for pod" podUID="b7526b6c-47da-49a7-8751-fb1a037a3082" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.151670 5129 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.159877 5129 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.160364 5129 status_manager.go:851] "Failed to get status for pod" podUID="b7526b6c-47da-49a7-8751-fb1a037a3082" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.160658 5129 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.171450 5129 scope.go:117] "RemoveContainer" containerID="e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.187897 5129 scope.go:117] "RemoveContainer" containerID="525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.202090 5129 scope.go:117] "RemoveContainer" containerID="87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.216680 5129 scope.go:117] "RemoveContainer" containerID="f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.233429 5129 scope.go:117] "RemoveContainer" containerID="4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.250798 5129 scope.go:117] "RemoveContainer" 
containerID="87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee" Mar 14 07:03:57 crc kubenswrapper[5129]: E0314 07:03:57.251551 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee\": container with ID starting with 87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee not found: ID does not exist" containerID="87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.251646 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee"} err="failed to get container status \"87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee\": rpc error: code = NotFound desc = could not find container \"87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee\": container with ID starting with 87210633a768900256b11fe343640a81d7da3d2d871dc0bed7a98f91783293ee not found: ID does not exist" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.251685 5129 scope.go:117] "RemoveContainer" containerID="e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115" Mar 14 07:03:57 crc kubenswrapper[5129]: E0314 07:03:57.252249 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\": container with ID starting with e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115 not found: ID does not exist" containerID="e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.252317 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115"} err="failed to get container status \"e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\": rpc error: code = NotFound desc = could not find container \"e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115\": container with ID starting with e66b1279d12730c4d4ab30d5a1d2218a99b520fb51f077b7fb9b258690d75115 not found: ID does not exist" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.252369 5129 scope.go:117] "RemoveContainer" containerID="525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1" Mar 14 07:03:57 crc kubenswrapper[5129]: E0314 07:03:57.252809 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\": container with ID starting with 525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1 not found: ID does not exist" containerID="525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.252844 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1"} err="failed to get container status \"525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\": rpc error: code = NotFound desc = could not find container \"525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1\": container with ID starting with 525b4f61f38ec6348c1e68195770cb9562bca2c2565e84e0a4aa54987ea1a3b1 not found: ID does not exist" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.252869 5129 scope.go:117] "RemoveContainer" containerID="87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780" Mar 14 07:03:57 crc kubenswrapper[5129]: E0314 07:03:57.253462 5129 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\": container with ID starting with 87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780 not found: ID does not exist" containerID="87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.253510 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780"} err="failed to get container status \"87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\": rpc error: code = NotFound desc = could not find container \"87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780\": container with ID starting with 87b0bffb3c44ef0ef8cc3e279a9f592841006f9078a1648256626c0f6714e780 not found: ID does not exist" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.253540 5129 scope.go:117] "RemoveContainer" containerID="f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0" Mar 14 07:03:57 crc kubenswrapper[5129]: E0314 07:03:57.254031 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\": container with ID starting with f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0 not found: ID does not exist" containerID="f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.254057 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0"} err="failed to get container status \"f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\": rpc error: code = NotFound desc = could not find container 
\"f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0\": container with ID starting with f6e03f8c702401ed768708820bd50557a3018a11b0453c1604d6a0318f7ab5a0 not found: ID does not exist" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.254077 5129 scope.go:117] "RemoveContainer" containerID="4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401" Mar 14 07:03:57 crc kubenswrapper[5129]: E0314 07:03:57.254504 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\": container with ID starting with 4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401 not found: ID does not exist" containerID="4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401" Mar 14 07:03:57 crc kubenswrapper[5129]: I0314 07:03:57.254554 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401"} err="failed to get container status \"4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\": rpc error: code = NotFound desc = could not find container \"4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401\": container with ID starting with 4ba4f8ff6777d4dafe895be9e3e85a57c60c0e094221fc5cb54b315ed24ac401 not found: ID does not exist" Mar 14 07:03:58 crc kubenswrapper[5129]: I0314 07:03:58.038981 5129 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:03:58 crc kubenswrapper[5129]: I0314 07:03:58.039387 5129 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:03:58 crc kubenswrapper[5129]: I0314 07:03:58.039823 5129 status_manager.go:851] "Failed to get status for pod" podUID="b7526b6c-47da-49a7-8751-fb1a037a3082" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:03:58 crc kubenswrapper[5129]: I0314 07:03:58.045847 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 14 07:03:59 crc kubenswrapper[5129]: E0314 07:03:59.167140 5129 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.113:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ca33ccc4f791e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 07:03:54.798258462 +0000 UTC m=+297.550173686,LastTimestamp:2026-03-14 07:03:54.798258462 +0000 UTC m=+297.550173686,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 07:04:02 crc kubenswrapper[5129]: E0314 07:04:02.450587 5129 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:02 crc kubenswrapper[5129]: E0314 07:04:02.451386 5129 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:02 crc kubenswrapper[5129]: E0314 07:04:02.452164 5129 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:02 crc kubenswrapper[5129]: E0314 07:04:02.452552 5129 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:02 crc kubenswrapper[5129]: E0314 07:04:02.452757 5129 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:02 crc kubenswrapper[5129]: I0314 07:04:02.452784 5129 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 14 07:04:02 crc kubenswrapper[5129]: E0314 07:04:02.452939 5129 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="200ms" Mar 14 07:04:02 crc kubenswrapper[5129]: E0314 07:04:02.653506 5129 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="400ms" Mar 14 07:04:03 crc kubenswrapper[5129]: E0314 07:04:03.055257 5129 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="800ms" Mar 14 07:04:03 crc kubenswrapper[5129]: E0314 07:04:03.524087 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:04:03Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:04:03Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:04:03Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T07:04:03Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch 
\"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:03 crc kubenswrapper[5129]: E0314 07:04:03.524641 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:03 crc kubenswrapper[5129]: E0314 07:04:03.527007 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:03 crc kubenswrapper[5129]: E0314 07:04:03.527237 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:03 crc kubenswrapper[5129]: E0314 07:04:03.527444 5129 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:03 crc kubenswrapper[5129]: E0314 07:04:03.527465 5129 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 07:04:03 crc kubenswrapper[5129]: E0314 07:04:03.856248 5129 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="1.6s" Mar 14 07:04:05 crc kubenswrapper[5129]: E0314 07:04:05.457127 5129 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="3.2s" Mar 14 07:04:08 crc kubenswrapper[5129]: I0314 07:04:08.035743 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:04:08 crc kubenswrapper[5129]: I0314 07:04:08.050785 5129 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:08 crc kubenswrapper[5129]: I0314 07:04:08.051352 5129 status_manager.go:851] "Failed to get status for pod" podUID="b7526b6c-47da-49a7-8751-fb1a037a3082" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:08 crc kubenswrapper[5129]: I0314 07:04:08.052007 5129 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:08 crc kubenswrapper[5129]: I0314 07:04:08.052353 5129 status_manager.go:851] "Failed to get status for pod" podUID="b7526b6c-47da-49a7-8751-fb1a037a3082" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:08 
crc kubenswrapper[5129]: I0314 07:04:08.069516 5129 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84967d01-e382-4c62-98c2-9e8209f31aa0" Mar 14 07:04:08 crc kubenswrapper[5129]: I0314 07:04:08.069552 5129 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84967d01-e382-4c62-98c2-9e8209f31aa0" Mar 14 07:04:08 crc kubenswrapper[5129]: E0314 07:04:08.070143 5129 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:04:08 crc kubenswrapper[5129]: I0314 07:04:08.070917 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:04:08 crc kubenswrapper[5129]: W0314 07:04:08.095145 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-cd9739632941f8d0f8a4bb7f7545431321949ed45276b61f3933201c24416284 WatchSource:0}: Error finding container cd9739632941f8d0f8a4bb7f7545431321949ed45276b61f3933201c24416284: Status 404 returned error can't find the container with id cd9739632941f8d0f8a4bb7f7545431321949ed45276b61f3933201c24416284 Mar 14 07:04:08 crc kubenswrapper[5129]: I0314 07:04:08.209750 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cd9739632941f8d0f8a4bb7f7545431321949ed45276b61f3933201c24416284"} Mar 14 07:04:08 crc kubenswrapper[5129]: E0314 07:04:08.658936 5129 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="6.4s" Mar 14 07:04:09 crc kubenswrapper[5129]: E0314 07:04:09.169304 5129 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.113:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ca33ccc4f791e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 07:03:54.798258462 +0000 UTC m=+297.550173686,LastTimestamp:2026-03-14 07:03:54.798258462 +0000 UTC m=+297.550173686,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 07:04:09 crc kubenswrapper[5129]: I0314 07:04:09.220332 5129 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="b7193222f5e0a4931e686e474a3d4d99f8987e3f5ba5ac30b83fb7de61857538" exitCode=0 Mar 14 07:04:09 crc kubenswrapper[5129]: I0314 07:04:09.220419 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"b7193222f5e0a4931e686e474a3d4d99f8987e3f5ba5ac30b83fb7de61857538"} Mar 14 07:04:09 crc 
kubenswrapper[5129]: I0314 07:04:09.221023 5129 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84967d01-e382-4c62-98c2-9e8209f31aa0" Mar 14 07:04:09 crc kubenswrapper[5129]: I0314 07:04:09.221104 5129 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84967d01-e382-4c62-98c2-9e8209f31aa0" Mar 14 07:04:09 crc kubenswrapper[5129]: I0314 07:04:09.221263 5129 status_manager.go:851] "Failed to get status for pod" podUID="b7526b6c-47da-49a7-8751-fb1a037a3082" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:09 crc kubenswrapper[5129]: E0314 07:04:09.221672 5129 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:04:09 crc kubenswrapper[5129]: I0314 07:04:09.222039 5129 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:09 crc kubenswrapper[5129]: I0314 07:04:09.224131 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 14 07:04:09 crc kubenswrapper[5129]: I0314 07:04:09.224882 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 14 07:04:09 crc kubenswrapper[5129]: I0314 07:04:09.224922 5129 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a6c2f9f77fd61eea301a8bb0388b078488c80453835df5e3ca468d99fe97193c" exitCode=1 Mar 14 07:04:09 crc kubenswrapper[5129]: I0314 07:04:09.224942 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a6c2f9f77fd61eea301a8bb0388b078488c80453835df5e3ca468d99fe97193c"} Mar 14 07:04:09 crc kubenswrapper[5129]: I0314 07:04:09.225285 5129 scope.go:117] "RemoveContainer" containerID="a6c2f9f77fd61eea301a8bb0388b078488c80453835df5e3ca468d99fe97193c" Mar 14 07:04:09 crc kubenswrapper[5129]: I0314 07:04:09.225926 5129 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:09 crc kubenswrapper[5129]: I0314 07:04:09.226307 5129 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:09 crc kubenswrapper[5129]: I0314 07:04:09.226692 5129 status_manager.go:851] "Failed to get status for pod" podUID="b7526b6c-47da-49a7-8751-fb1a037a3082" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.113:6443: connect: connection refused" Mar 14 07:04:10 crc kubenswrapper[5129]: I0314 07:04:10.241997 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"745c5d00d033c714b71fce40ef199ebb466a3c089e30ed1b34f06486a414f5bc"} Mar 14 07:04:10 crc kubenswrapper[5129]: I0314 07:04:10.242406 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0c6e9a256bfe01ff817cb16e7aba8b94c9b1c534d575a597968ef2fb816838b6"} Mar 14 07:04:10 crc kubenswrapper[5129]: I0314 07:04:10.242421 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"74df43652a33afbac60bfc14c020a70cd261eb725080d7fae550a4584325b0a9"} Mar 14 07:04:10 crc kubenswrapper[5129]: I0314 07:04:10.242430 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4e8e48d03d19ce10f941ce978925872ba8da362e79243efcc7857cba678be6e7"} Mar 14 07:04:10 crc kubenswrapper[5129]: I0314 07:04:10.250143 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 14 07:04:10 crc kubenswrapper[5129]: I0314 07:04:10.251126 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 14 07:04:10 crc kubenswrapper[5129]: I0314 07:04:10.251205 5129 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b98dd51eee838a8720e399715a3fa6f325fdc2d50be37d09cdeaeadeb825b0dc"} Mar 14 07:04:11 crc kubenswrapper[5129]: I0314 07:04:11.259216 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7b3cbd4cb6ae3cb85fa17be9c66f9c02a1aa970901985b1179c18952695ede68"} Mar 14 07:04:11 crc kubenswrapper[5129]: I0314 07:04:11.259405 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:04:11 crc kubenswrapper[5129]: I0314 07:04:11.259547 5129 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84967d01-e382-4c62-98c2-9e8209f31aa0" Mar 14 07:04:11 crc kubenswrapper[5129]: I0314 07:04:11.259571 5129 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84967d01-e382-4c62-98c2-9e8209f31aa0" Mar 14 07:04:11 crc kubenswrapper[5129]: I0314 07:04:11.401785 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 07:04:11 crc kubenswrapper[5129]: I0314 07:04:11.405578 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 07:04:12 crc kubenswrapper[5129]: I0314 07:04:12.265562 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 07:04:13 crc kubenswrapper[5129]: I0314 07:04:13.071085 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:04:13 crc 
kubenswrapper[5129]: I0314 07:04:13.071151 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:04:13 crc kubenswrapper[5129]: I0314 07:04:13.080233 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:04:16 crc kubenswrapper[5129]: I0314 07:04:16.275791 5129 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:04:17 crc kubenswrapper[5129]: I0314 07:04:17.299661 5129 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84967d01-e382-4c62-98c2-9e8209f31aa0" Mar 14 07:04:17 crc kubenswrapper[5129]: I0314 07:04:17.300235 5129 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84967d01-e382-4c62-98c2-9e8209f31aa0" Mar 14 07:04:17 crc kubenswrapper[5129]: I0314 07:04:17.306759 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:04:18 crc kubenswrapper[5129]: I0314 07:04:18.053277 5129 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="808ec3c1-a0e0-4dcf-9c23-08158e65b617" Mar 14 07:04:18 crc kubenswrapper[5129]: I0314 07:04:18.304214 5129 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84967d01-e382-4c62-98c2-9e8209f31aa0" Mar 14 07:04:18 crc kubenswrapper[5129]: I0314 07:04:18.304278 5129 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84967d01-e382-4c62-98c2-9e8209f31aa0" Mar 14 07:04:18 crc kubenswrapper[5129]: I0314 07:04:18.307663 5129 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" 
pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="808ec3c1-a0e0-4dcf-9c23-08158e65b617" Mar 14 07:04:18 crc kubenswrapper[5129]: I0314 07:04:18.572466 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" podUID="e82ae62e-63ea-4a75-9c2f-7a02ade5768a" containerName="oauth-openshift" containerID="cri-o://cd5193f234faf95115832625127473cd03dcbc3d7080252ecfef586530392c9a" gracePeriod=15 Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.153721 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.302024 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-router-certs\") pod \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.302094 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-cliconfig\") pod \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.302129 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-service-ca\") pod \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.302259 5129 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-ocp-branding-template\") pod \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.302296 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-idp-0-file-data\") pod \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.302332 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-trusted-ca-bundle\") pod \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.302364 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xft2m\" (UniqueName: \"kubernetes.io/projected/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-kube-api-access-xft2m\") pod \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.302403 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-serving-cert\") pod \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.302456 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-session\") pod \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.302508 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-audit-policies\") pod \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.302554 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-audit-dir\") pod \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.302586 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-template-login\") pod \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.302666 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-template-error\") pod \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.302657 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "e82ae62e-63ea-4a75-9c2f-7a02ade5768a" (UID: 
"e82ae62e-63ea-4a75-9c2f-7a02ade5768a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.302718 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-template-provider-selection\") pod \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\" (UID: \"e82ae62e-63ea-4a75-9c2f-7a02ade5768a\") " Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.303011 5129 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.303293 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "e82ae62e-63ea-4a75-9c2f-7a02ade5768a" (UID: "e82ae62e-63ea-4a75-9c2f-7a02ade5768a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.303434 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "e82ae62e-63ea-4a75-9c2f-7a02ade5768a" (UID: "e82ae62e-63ea-4a75-9c2f-7a02ade5768a"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.304296 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "e82ae62e-63ea-4a75-9c2f-7a02ade5768a" (UID: "e82ae62e-63ea-4a75-9c2f-7a02ade5768a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.304400 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "e82ae62e-63ea-4a75-9c2f-7a02ade5768a" (UID: "e82ae62e-63ea-4a75-9c2f-7a02ade5768a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.311326 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "e82ae62e-63ea-4a75-9c2f-7a02ade5768a" (UID: "e82ae62e-63ea-4a75-9c2f-7a02ade5768a"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.314695 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "e82ae62e-63ea-4a75-9c2f-7a02ade5768a" (UID: "e82ae62e-63ea-4a75-9c2f-7a02ade5768a"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.314956 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-kube-api-access-xft2m" (OuterVolumeSpecName: "kube-api-access-xft2m") pod "e82ae62e-63ea-4a75-9c2f-7a02ade5768a" (UID: "e82ae62e-63ea-4a75-9c2f-7a02ade5768a"). InnerVolumeSpecName "kube-api-access-xft2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.315064 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "e82ae62e-63ea-4a75-9c2f-7a02ade5768a" (UID: "e82ae62e-63ea-4a75-9c2f-7a02ade5768a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.315750 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "e82ae62e-63ea-4a75-9c2f-7a02ade5768a" (UID: "e82ae62e-63ea-4a75-9c2f-7a02ade5768a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.316034 5129 generic.go:334] "Generic (PLEG): container finished" podID="e82ae62e-63ea-4a75-9c2f-7a02ade5768a" containerID="cd5193f234faf95115832625127473cd03dcbc3d7080252ecfef586530392c9a" exitCode=0 Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.316094 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" event={"ID":"e82ae62e-63ea-4a75-9c2f-7a02ade5768a","Type":"ContainerDied","Data":"cd5193f234faf95115832625127473cd03dcbc3d7080252ecfef586530392c9a"} Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.316121 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.316150 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fmn4k" event={"ID":"e82ae62e-63ea-4a75-9c2f-7a02ade5768a","Type":"ContainerDied","Data":"d2eb097f3ed68dcdc4629c7da07456a1e9bf848d25c7f4abd42dcb5e01661282"} Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.316314 5129 scope.go:117] "RemoveContainer" containerID="cd5193f234faf95115832625127473cd03dcbc3d7080252ecfef586530392c9a" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.320165 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "e82ae62e-63ea-4a75-9c2f-7a02ade5768a" (UID: "e82ae62e-63ea-4a75-9c2f-7a02ade5768a"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.321542 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "e82ae62e-63ea-4a75-9c2f-7a02ade5768a" (UID: "e82ae62e-63ea-4a75-9c2f-7a02ade5768a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.323093 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "e82ae62e-63ea-4a75-9c2f-7a02ade5768a" (UID: "e82ae62e-63ea-4a75-9c2f-7a02ade5768a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.323287 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "e82ae62e-63ea-4a75-9c2f-7a02ade5768a" (UID: "e82ae62e-63ea-4a75-9c2f-7a02ade5768a"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.367866 5129 scope.go:117] "RemoveContainer" containerID="cd5193f234faf95115832625127473cd03dcbc3d7080252ecfef586530392c9a" Mar 14 07:04:19 crc kubenswrapper[5129]: E0314 07:04:19.370315 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd5193f234faf95115832625127473cd03dcbc3d7080252ecfef586530392c9a\": container with ID starting with cd5193f234faf95115832625127473cd03dcbc3d7080252ecfef586530392c9a not found: ID does not exist" containerID="cd5193f234faf95115832625127473cd03dcbc3d7080252ecfef586530392c9a" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.370354 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd5193f234faf95115832625127473cd03dcbc3d7080252ecfef586530392c9a"} err="failed to get container status \"cd5193f234faf95115832625127473cd03dcbc3d7080252ecfef586530392c9a\": rpc error: code = NotFound desc = could not find container \"cd5193f234faf95115832625127473cd03dcbc3d7080252ecfef586530392c9a\": container with ID starting with cd5193f234faf95115832625127473cd03dcbc3d7080252ecfef586530392c9a not found: ID does not exist" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.403750 5129 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.403794 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.403811 5129 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.403826 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.403843 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.403855 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.403867 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.403881 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.403892 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-user-idp-0-file-data\") on node \"crc\" 
DevicePath \"\"" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.403903 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.403916 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xft2m\" (UniqueName: \"kubernetes.io/projected/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-kube-api-access-xft2m\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.403931 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:19 crc kubenswrapper[5129]: I0314 07:04:19.403946 5129 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e82ae62e-63ea-4a75-9c2f-7a02ade5768a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:25 crc kubenswrapper[5129]: I0314 07:04:25.134793 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 07:04:25 crc kubenswrapper[5129]: I0314 07:04:25.735080 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 07:04:26 crc kubenswrapper[5129]: I0314 07:04:26.039481 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 14 07:04:26 crc kubenswrapper[5129]: I0314 07:04:26.311330 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 14 07:04:26 crc kubenswrapper[5129]: I0314 
07:04:26.352055 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 07:04:26 crc kubenswrapper[5129]: I0314 07:04:26.515030 5129 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 14 07:04:26 crc kubenswrapper[5129]: I0314 07:04:26.542137 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 14 07:04:26 crc kubenswrapper[5129]: I0314 07:04:26.688377 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 14 07:04:27 crc kubenswrapper[5129]: I0314 07:04:27.155876 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 14 07:04:27 crc kubenswrapper[5129]: I0314 07:04:27.300926 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 14 07:04:27 crc kubenswrapper[5129]: I0314 07:04:27.605272 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 14 07:04:27 crc kubenswrapper[5129]: I0314 07:04:27.718366 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 14 07:04:28 crc kubenswrapper[5129]: I0314 07:04:28.034154 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 14 07:04:28 crc kubenswrapper[5129]: I0314 07:04:28.044169 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 14 07:04:28 crc kubenswrapper[5129]: I0314 07:04:28.232289 5129 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 14 07:04:28 crc kubenswrapper[5129]: I0314 07:04:28.313668 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 14 07:04:28 crc kubenswrapper[5129]: I0314 07:04:28.379814 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 14 07:04:28 crc kubenswrapper[5129]: I0314 07:04:28.580006 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 14 07:04:28 crc kubenswrapper[5129]: I0314 07:04:28.682042 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 14 07:04:28 crc kubenswrapper[5129]: I0314 07:04:28.682924 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 14 07:04:28 crc kubenswrapper[5129]: I0314 07:04:28.731493 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 14 07:04:28 crc kubenswrapper[5129]: I0314 07:04:28.849577 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 14 07:04:28 crc kubenswrapper[5129]: I0314 07:04:28.873504 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 14 07:04:28 crc kubenswrapper[5129]: I0314 07:04:28.926962 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 14 07:04:29 crc kubenswrapper[5129]: I0314 07:04:29.177165 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 14 07:04:29 crc kubenswrapper[5129]: I0314 
07:04:29.178511 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 14 07:04:29 crc kubenswrapper[5129]: I0314 07:04:29.228704 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 14 07:04:29 crc kubenswrapper[5129]: I0314 07:04:29.272994 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 14 07:04:29 crc kubenswrapper[5129]: I0314 07:04:29.495669 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 14 07:04:29 crc kubenswrapper[5129]: I0314 07:04:29.611381 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 14 07:04:29 crc kubenswrapper[5129]: I0314 07:04:29.834963 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 14 07:04:29 crc kubenswrapper[5129]: I0314 07:04:29.882293 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 14 07:04:29 crc kubenswrapper[5129]: I0314 07:04:29.970661 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.032755 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.063251 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.114246 5129 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"serving-cert" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.128982 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.181289 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.265748 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.270060 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.343290 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.358535 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.493994 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.506232 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.543420 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.562637 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 14 
07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.594825 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.745297 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.756422 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.803785 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.858348 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.907585 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 14 07:04:30 crc kubenswrapper[5129]: I0314 07:04:30.995993 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 07:04:31 crc kubenswrapper[5129]: I0314 07:04:31.023018 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 14 07:04:31 crc kubenswrapper[5129]: I0314 07:04:31.063788 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 14 07:04:31 crc kubenswrapper[5129]: I0314 07:04:31.373905 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 14 07:04:31 crc kubenswrapper[5129]: I0314 07:04:31.409211 5129 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 14 07:04:31 crc kubenswrapper[5129]: I0314 07:04:31.544962 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 14 07:04:31 crc kubenswrapper[5129]: I0314 07:04:31.652487 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 14 07:04:31 crc kubenswrapper[5129]: I0314 07:04:31.766398 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 14 07:04:31 crc kubenswrapper[5129]: I0314 07:04:31.771449 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 14 07:04:31 crc kubenswrapper[5129]: I0314 07:04:31.810749 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 14 07:04:31 crc kubenswrapper[5129]: I0314 07:04:31.839416 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 14 07:04:31 crc kubenswrapper[5129]: I0314 07:04:31.846252 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 14 07:04:31 crc kubenswrapper[5129]: I0314 07:04:31.871491 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 14 07:04:31 crc kubenswrapper[5129]: I0314 07:04:31.914489 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 14 07:04:31 crc kubenswrapper[5129]: I0314 07:04:31.991258 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 14 07:04:32 crc kubenswrapper[5129]: I0314 
07:04:32.000756 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 14 07:04:32 crc kubenswrapper[5129]: I0314 07:04:32.086696 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 14 07:04:32 crc kubenswrapper[5129]: I0314 07:04:32.123881 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 14 07:04:32 crc kubenswrapper[5129]: I0314 07:04:32.182785 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 07:04:32 crc kubenswrapper[5129]: I0314 07:04:32.219301 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 14 07:04:32 crc kubenswrapper[5129]: I0314 07:04:32.394793 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 14 07:04:32 crc kubenswrapper[5129]: I0314 07:04:32.477156 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 14 07:04:32 crc kubenswrapper[5129]: I0314 07:04:32.510018 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 14 07:04:32 crc kubenswrapper[5129]: I0314 07:04:32.513536 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 14 07:04:32 crc kubenswrapper[5129]: I0314 07:04:32.518301 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 14 07:04:32 crc kubenswrapper[5129]: I0314 07:04:32.554554 5129 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" 
Mar 14 07:04:32 crc kubenswrapper[5129]: I0314 07:04:32.663176 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 14 07:04:32 crc kubenswrapper[5129]: I0314 07:04:32.665400 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 14 07:04:32 crc kubenswrapper[5129]: I0314 07:04:32.682456 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 14 07:04:32 crc kubenswrapper[5129]: I0314 07:04:32.733657 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 14 07:04:32 crc kubenswrapper[5129]: I0314 07:04:32.767037 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 14 07:04:32 crc kubenswrapper[5129]: I0314 07:04:32.804114 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 14 07:04:32 crc kubenswrapper[5129]: I0314 07:04:32.848829 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.110943 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.127634 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.185720 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.186054 5129 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.217047 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.250083 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.256504 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.310649 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.330265 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.339719 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.340116 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.347877 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.371293 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.463101 5129 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-tls" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.483470 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.610938 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.629812 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.641425 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.660278 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.773051 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.891553 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.909071 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.914717 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 14 07:04:33 crc kubenswrapper[5129]: I0314 07:04:33.951301 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 
07:04:34.044247 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.066932 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.097200 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.149683 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.161995 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.191747 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.204360 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.255855 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.330132 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.421829 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.481997 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 14 07:04:34 crc 
kubenswrapper[5129]: I0314 07:04:34.496729 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.578466 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.629658 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.811552 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.843424 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.921926 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.960218 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.986319 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.991250 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 14 07:04:34 crc kubenswrapper[5129]: I0314 07:04:34.995519 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 14 07:04:35 crc kubenswrapper[5129]: I0314 07:04:35.029340 5129 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Mar 14 07:04:35 crc kubenswrapper[5129]: I0314 07:04:35.044699 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 14 07:04:35 crc kubenswrapper[5129]: I0314 07:04:35.216084 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 14 07:04:35 crc kubenswrapper[5129]: I0314 07:04:35.216434 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 14 07:04:35 crc kubenswrapper[5129]: I0314 07:04:35.245463 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 14 07:04:35 crc kubenswrapper[5129]: I0314 07:04:35.432129 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 14 07:04:35 crc kubenswrapper[5129]: I0314 07:04:35.445340 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 14 07:04:35 crc kubenswrapper[5129]: I0314 07:04:35.446540 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 14 07:04:35 crc kubenswrapper[5129]: I0314 07:04:35.603498 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 14 07:04:35 crc kubenswrapper[5129]: I0314 07:04:35.924434 5129 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 14 07:04:35 crc kubenswrapper[5129]: I0314 07:04:35.944425 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 14 07:04:35 crc kubenswrapper[5129]: I0314 07:04:35.984682 5129 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.098717 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.111233 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.151924 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.198010 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.227022 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.309047 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.347136 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.387130 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.604884 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.673491 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.673914 5129 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.760159 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.792231 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.869912 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.879364 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.928840 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.966005 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.973523 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 14 07:04:36 crc kubenswrapper[5129]: I0314 07:04:36.980388 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.160918 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.192380 5129 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"service-ca-bundle" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.201778 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.236165 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.321762 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.329926 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.363832 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.475940 5129 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.631980 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.662553 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.719595 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.723996 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.735057 5129 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.768513 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.769127 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.772200 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.806235 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.827843 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.899063 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.944850 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.955067 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 14 07:04:37 crc kubenswrapper[5129]: I0314 07:04:37.977856 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 14 07:04:38 crc kubenswrapper[5129]: I0314 07:04:38.003818 5129 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 14 07:04:38 crc kubenswrapper[5129]: I0314 07:04:38.090467 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 14 07:04:38 crc kubenswrapper[5129]: I0314 07:04:38.099726 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 14 07:04:38 crc kubenswrapper[5129]: I0314 07:04:38.187511 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 14 07:04:38 crc kubenswrapper[5129]: I0314 07:04:38.197296 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 14 07:04:38 crc kubenswrapper[5129]: I0314 07:04:38.241432 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 07:04:38 crc kubenswrapper[5129]: I0314 07:04:38.488784 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 14 07:04:38 crc kubenswrapper[5129]: I0314 07:04:38.502564 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 14 07:04:38 crc kubenswrapper[5129]: I0314 07:04:38.503259 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 14 07:04:38 crc kubenswrapper[5129]: I0314 07:04:38.733989 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 14 07:04:38 crc kubenswrapper[5129]: I0314 07:04:38.942506 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 14 07:04:38 crc kubenswrapper[5129]: I0314 07:04:38.992011 5129 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.021962 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.025398 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.045173 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.306414 5129 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.308783 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=45.308761112 podStartE2EDuration="45.308761112s" podCreationTimestamp="2026-03-14 07:03:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:04:16.088928583 +0000 UTC m=+318.840843807" watchObservedRunningTime="2026-03-14 07:04:39.308761112 +0000 UTC m=+342.060676296" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.312181 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-fmn4k"] Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.312381 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-infra/auto-csr-approver-29557864-r9tzv","openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw"] Mar 14 07:04:39 crc kubenswrapper[5129]: E0314 
07:04:39.312709 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7526b6c-47da-49a7-8751-fb1a037a3082" containerName="installer" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.312811 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7526b6c-47da-49a7-8751-fb1a037a3082" containerName="installer" Mar 14 07:04:39 crc kubenswrapper[5129]: E0314 07:04:39.313024 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e82ae62e-63ea-4a75-9c2f-7a02ade5768a" containerName="oauth-openshift" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.313125 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82ae62e-63ea-4a75-9c2f-7a02ade5768a" containerName="oauth-openshift" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.312855 5129 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84967d01-e382-4c62-98c2-9e8209f31aa0" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.313349 5129 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84967d01-e382-4c62-98c2-9e8209f31aa0" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.313451 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e82ae62e-63ea-4a75-9c2f-7a02ade5768a" containerName="oauth-openshift" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.313530 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7526b6c-47da-49a7-8751-fb1a037a3082" containerName="installer" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.314079 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.314530 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557864-r9tzv" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.315691 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.317421 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.317463 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.317424 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.317595 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.323788 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.323966 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.324123 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.324359 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.326786 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 
07:04:39.328697 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.328756 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.328785 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.328867 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.329155 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.329202 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.332696 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.337500 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.342526 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.349916 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.349753097 podStartE2EDuration="23.349753097s" podCreationTimestamp="2026-03-14 07:04:16 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:04:39.344466598 +0000 UTC m=+342.096381782" watchObservedRunningTime="2026-03-14 07:04:39.349753097 +0000 UTC m=+342.101668281" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.357784 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-user-template-login\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.357837 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.357889 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-service-ca\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.357922 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91e5e202-0a96-46be-b241-a97d49eb2619-audit-policies\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " 
pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.357952 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.357973 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knqkx\" (UniqueName: \"kubernetes.io/projected/91e5e202-0a96-46be-b241-a97d49eb2619-kube-api-access-knqkx\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.357998 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-cliconfig\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.358019 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-session\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.358050 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.358079 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.358105 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnbd8\" (UniqueName: \"kubernetes.io/projected/6849ef7d-3a7f-4c40-8375-c7d651a1985a-kube-api-access-vnbd8\") pod \"auto-csr-approver-29557864-r9tzv\" (UID: \"6849ef7d-3a7f-4c40-8375-c7d651a1985a\") " pod="openshift-infra/auto-csr-approver-29557864-r9tzv" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.358137 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-serving-cert\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.358162 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-user-template-error\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.358189 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91e5e202-0a96-46be-b241-a97d49eb2619-audit-dir\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.358228 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-router-certs\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.370803 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.446196 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.458950 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-service-ca\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc 
kubenswrapper[5129]: I0314 07:04:39.458988 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91e5e202-0a96-46be-b241-a97d49eb2619-audit-policies\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.459017 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.459038 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knqkx\" (UniqueName: \"kubernetes.io/projected/91e5e202-0a96-46be-b241-a97d49eb2619-kube-api-access-knqkx\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.459057 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-cliconfig\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.459077 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-session\") pod 
\"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.459095 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.459114 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.459133 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnbd8\" (UniqueName: \"kubernetes.io/projected/6849ef7d-3a7f-4c40-8375-c7d651a1985a-kube-api-access-vnbd8\") pod \"auto-csr-approver-29557864-r9tzv\" (UID: \"6849ef7d-3a7f-4c40-8375-c7d651a1985a\") " pod="openshift-infra/auto-csr-approver-29557864-r9tzv" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.459155 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-serving-cert\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 
07:04:39.459171 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-user-template-error\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.459187 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91e5e202-0a96-46be-b241-a97d49eb2619-audit-dir\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.459205 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-router-certs\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.459221 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-user-template-login\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.459239 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.459797 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-service-ca\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.459857 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91e5e202-0a96-46be-b241-a97d49eb2619-audit-dir\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.460055 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.460683 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91e5e202-0a96-46be-b241-a97d49eb2619-audit-policies\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.460761 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-cliconfig\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.465018 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.465323 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.465645 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-serving-cert\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.466087 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-user-template-login\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " 
pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.466169 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-user-template-error\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.474021 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-session\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.477018 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-system-router-certs\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.480445 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91e5e202-0a96-46be-b241-a97d49eb2619-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.483555 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knqkx\" (UniqueName: 
\"kubernetes.io/projected/91e5e202-0a96-46be-b241-a97d49eb2619-kube-api-access-knqkx\") pod \"oauth-openshift-675f5cc7c5-j7vnw\" (UID: \"91e5e202-0a96-46be-b241-a97d49eb2619\") " pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.493443 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnbd8\" (UniqueName: \"kubernetes.io/projected/6849ef7d-3a7f-4c40-8375-c7d651a1985a-kube-api-access-vnbd8\") pod \"auto-csr-approver-29557864-r9tzv\" (UID: \"6849ef7d-3a7f-4c40-8375-c7d651a1985a\") " pod="openshift-infra/auto-csr-approver-29557864-r9tzv" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.508092 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.516410 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.538495 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.547970 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.639192 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.644753 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.654255 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557864-r9tzv" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.807836 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.822179 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.830701 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.833976 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.840554 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.914837 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.953275 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 14 07:04:39 crc kubenswrapper[5129]: I0314 07:04:39.987983 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.044712 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e82ae62e-63ea-4a75-9c2f-7a02ade5768a" path="/var/lib/kubelet/pods/e82ae62e-63ea-4a75-9c2f-7a02ade5768a/volumes" Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.087397 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw"] Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.098443 5129 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.136102 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557864-r9tzv"] Mar 14 07:04:40 crc kubenswrapper[5129]: W0314 07:04:40.144151 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6849ef7d_3a7f_4c40_8375_c7d651a1985a.slice/crio-3b34f227c1ef129064984592fb5c53280fdad34950c9e297665f91cd8bf6eeb0 WatchSource:0}: Error finding container 3b34f227c1ef129064984592fb5c53280fdad34950c9e297665f91cd8bf6eeb0: Status 404 returned error can't find the container with id 3b34f227c1ef129064984592fb5c53280fdad34950c9e297665f91cd8bf6eeb0 Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.164380 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.335154 5129 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.420407 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.451003 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" event={"ID":"91e5e202-0a96-46be-b241-a97d49eb2619","Type":"ContainerStarted","Data":"475e514a10aaa2a66289ed96d6f5319378a60e05969a5c55f4d26bad3bcda069"} Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.451075 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" 
event={"ID":"91e5e202-0a96-46be-b241-a97d49eb2619","Type":"ContainerStarted","Data":"546a884de81887bb6251a315049e4132a971fbb984ec718e5ba9604e3078ef94"} Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.451169 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.452053 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557864-r9tzv" event={"ID":"6849ef7d-3a7f-4c40-8375-c7d651a1985a","Type":"ContainerStarted","Data":"3b34f227c1ef129064984592fb5c53280fdad34950c9e297665f91cd8bf6eeb0"} Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.649769 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.739091 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.839373 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.843988 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.865542 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.871565 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.898362 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" 
podStartSLOduration=47.898344978 podStartE2EDuration="47.898344978s" podCreationTimestamp="2026-03-14 07:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:04:40.469496194 +0000 UTC m=+343.221411398" watchObservedRunningTime="2026-03-14 07:04:40.898344978 +0000 UTC m=+343.650260162" Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.978346 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 14 07:04:40 crc kubenswrapper[5129]: I0314 07:04:40.992833 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 14 07:04:41 crc kubenswrapper[5129]: I0314 07:04:41.024874 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 14 07:04:41 crc kubenswrapper[5129]: I0314 07:04:41.382804 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 14 07:04:41 crc kubenswrapper[5129]: I0314 07:04:41.414878 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 14 07:04:41 crc kubenswrapper[5129]: I0314 07:04:41.420141 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 14 07:04:41 crc kubenswrapper[5129]: I0314 07:04:41.626575 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 14 07:04:41 crc kubenswrapper[5129]: I0314 07:04:41.693641 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 14 07:04:41 crc kubenswrapper[5129]: I0314 07:04:41.786243 
5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 14 07:04:41 crc kubenswrapper[5129]: I0314 07:04:41.926670 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 14 07:04:42 crc kubenswrapper[5129]: I0314 07:04:42.423574 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 14 07:04:42 crc kubenswrapper[5129]: I0314 07:04:42.466927 5129 generic.go:334] "Generic (PLEG): container finished" podID="6849ef7d-3a7f-4c40-8375-c7d651a1985a" containerID="726bfdfc7fa9f776dc2d8cc6428beb7695bd1cabf010b53e28d67885a8ff10c5" exitCode=0 Mar 14 07:04:42 crc kubenswrapper[5129]: I0314 07:04:42.467061 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557864-r9tzv" event={"ID":"6849ef7d-3a7f-4c40-8375-c7d651a1985a","Type":"ContainerDied","Data":"726bfdfc7fa9f776dc2d8cc6428beb7695bd1cabf010b53e28d67885a8ff10c5"} Mar 14 07:04:42 crc kubenswrapper[5129]: I0314 07:04:42.650644 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 14 07:04:42 crc kubenswrapper[5129]: I0314 07:04:42.783116 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 14 07:04:43 crc kubenswrapper[5129]: I0314 07:04:43.341643 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 14 07:04:43 crc kubenswrapper[5129]: I0314 07:04:43.793819 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557864-r9tzv" Mar 14 07:04:43 crc kubenswrapper[5129]: I0314 07:04:43.813922 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnbd8\" (UniqueName: \"kubernetes.io/projected/6849ef7d-3a7f-4c40-8375-c7d651a1985a-kube-api-access-vnbd8\") pod \"6849ef7d-3a7f-4c40-8375-c7d651a1985a\" (UID: \"6849ef7d-3a7f-4c40-8375-c7d651a1985a\") " Mar 14 07:04:43 crc kubenswrapper[5129]: I0314 07:04:43.819317 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6849ef7d-3a7f-4c40-8375-c7d651a1985a-kube-api-access-vnbd8" (OuterVolumeSpecName: "kube-api-access-vnbd8") pod "6849ef7d-3a7f-4c40-8375-c7d651a1985a" (UID: "6849ef7d-3a7f-4c40-8375-c7d651a1985a"). InnerVolumeSpecName "kube-api-access-vnbd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:04:43 crc kubenswrapper[5129]: I0314 07:04:43.915360 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnbd8\" (UniqueName: \"kubernetes.io/projected/6849ef7d-3a7f-4c40-8375-c7d651a1985a-kube-api-access-vnbd8\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:44 crc kubenswrapper[5129]: I0314 07:04:44.486948 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557864-r9tzv" event={"ID":"6849ef7d-3a7f-4c40-8375-c7d651a1985a","Type":"ContainerDied","Data":"3b34f227c1ef129064984592fb5c53280fdad34950c9e297665f91cd8bf6eeb0"} Mar 14 07:04:44 crc kubenswrapper[5129]: I0314 07:04:44.487012 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b34f227c1ef129064984592fb5c53280fdad34950c9e297665f91cd8bf6eeb0" Mar 14 07:04:44 crc kubenswrapper[5129]: I0314 07:04:44.487015 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557864-r9tzv" Mar 14 07:04:50 crc kubenswrapper[5129]: I0314 07:04:50.101839 5129 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 14 07:04:50 crc kubenswrapper[5129]: I0314 07:04:50.102386 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://2e668872f4a33ca7a59e85fdf1d5a7125b31c2e030c03b944eb127123befd9c0" gracePeriod=5 Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.258454 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.259831 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.354097 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.354214 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.354232 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.354256 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.354286 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.354295 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.354309 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.354350 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.354448 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.354654 5129 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.354669 5129 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.354681 5129 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.354695 5129 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.365707 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.455923 5129 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.555149 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.555248 5129 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="2e668872f4a33ca7a59e85fdf1d5a7125b31c2e030c03b944eb127123befd9c0" exitCode=137 Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.555291 5129 scope.go:117] "RemoveContainer" containerID="2e668872f4a33ca7a59e85fdf1d5a7125b31c2e030c03b944eb127123befd9c0" Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.555358 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.581293 5129 scope.go:117] "RemoveContainer" containerID="2e668872f4a33ca7a59e85fdf1d5a7125b31c2e030c03b944eb127123befd9c0" Mar 14 07:04:55 crc kubenswrapper[5129]: E0314 07:04:55.581884 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e668872f4a33ca7a59e85fdf1d5a7125b31c2e030c03b944eb127123befd9c0\": container with ID starting with 2e668872f4a33ca7a59e85fdf1d5a7125b31c2e030c03b944eb127123befd9c0 not found: ID does not exist" containerID="2e668872f4a33ca7a59e85fdf1d5a7125b31c2e030c03b944eb127123befd9c0" Mar 14 07:04:55 crc kubenswrapper[5129]: I0314 07:04:55.581927 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e668872f4a33ca7a59e85fdf1d5a7125b31c2e030c03b944eb127123befd9c0"} err="failed to get container status \"2e668872f4a33ca7a59e85fdf1d5a7125b31c2e030c03b944eb127123befd9c0\": rpc error: code = NotFound desc = could not find container \"2e668872f4a33ca7a59e85fdf1d5a7125b31c2e030c03b944eb127123befd9c0\": container with ID starting with 2e668872f4a33ca7a59e85fdf1d5a7125b31c2e030c03b944eb127123befd9c0 not found: ID does not exist" Mar 14 07:04:56 crc kubenswrapper[5129]: I0314 07:04:56.048568 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 14 07:04:56 crc kubenswrapper[5129]: I0314 07:04:56.049337 5129 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 14 07:04:56 crc kubenswrapper[5129]: I0314 07:04:56.059553 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 14 07:04:56 crc kubenswrapper[5129]: I0314 
07:04:56.059630 5129 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="444a07c9-1a1c-4965-80c7-4ff479796671" Mar 14 07:04:56 crc kubenswrapper[5129]: I0314 07:04:56.063675 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 14 07:04:56 crc kubenswrapper[5129]: I0314 07:04:56.063724 5129 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="444a07c9-1a1c-4965-80c7-4ff479796671" Mar 14 07:05:01 crc kubenswrapper[5129]: I0314 07:05:01.081650 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 14 07:05:01 crc kubenswrapper[5129]: I0314 07:05:01.610684 5129 generic.go:334] "Generic (PLEG): container finished" podID="7a298896-ef40-44ff-a9ba-45fba603014b" containerID="4c4b76bcb68a3ceacafa71db66403bba08494ef61e1651bf7b91e60e19f36504" exitCode=0 Mar 14 07:05:01 crc kubenswrapper[5129]: I0314 07:05:01.610758 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" event={"ID":"7a298896-ef40-44ff-a9ba-45fba603014b","Type":"ContainerDied","Data":"4c4b76bcb68a3ceacafa71db66403bba08494ef61e1651bf7b91e60e19f36504"} Mar 14 07:05:01 crc kubenswrapper[5129]: I0314 07:05:01.611534 5129 scope.go:117] "RemoveContainer" containerID="4c4b76bcb68a3ceacafa71db66403bba08494ef61e1651bf7b91e60e19f36504" Mar 14 07:05:02 crc kubenswrapper[5129]: I0314 07:05:02.618876 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" event={"ID":"7a298896-ef40-44ff-a9ba-45fba603014b","Type":"ContainerStarted","Data":"5d511a9bb9f7d75284acc71525b28c408c0b269fe9f13b9f244a5410ef988af2"} Mar 14 07:05:02 crc kubenswrapper[5129]: I0314 07:05:02.619710 5129 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" Mar 14 07:05:02 crc kubenswrapper[5129]: I0314 07:05:02.622401 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" Mar 14 07:05:19 crc kubenswrapper[5129]: I0314 07:05:19.574184 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:05:19 crc kubenswrapper[5129]: I0314 07:05:19.575003 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:05:49 crc kubenswrapper[5129]: I0314 07:05:49.625158 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:05:49 crc kubenswrapper[5129]: I0314 07:05:49.625871 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.276791 5129 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-66df7c8f76-lfrnd"] Mar 14 07:05:53 crc kubenswrapper[5129]: E0314 07:05:53.277567 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6849ef7d-3a7f-4c40-8375-c7d651a1985a" containerName="oc" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.277587 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6849ef7d-3a7f-4c40-8375-c7d651a1985a" containerName="oc" Mar 14 07:05:53 crc kubenswrapper[5129]: E0314 07:05:53.277618 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.277627 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.277919 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.277936 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6849ef7d-3a7f-4c40-8375-c7d651a1985a" containerName="oc" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.278534 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.293301 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lfrnd"] Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.403140 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-registry-certificates\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.403193 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.403216 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhx5w\" (UniqueName: \"kubernetes.io/projected/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-kube-api-access-hhx5w\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.403301 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.403344 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-trusted-ca\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.403376 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-bound-sa-token\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.403400 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.403422 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-registry-tls\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.425723 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.504567 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-registry-certificates\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.504682 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.504713 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhx5w\" (UniqueName: \"kubernetes.io/projected/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-kube-api-access-hhx5w\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.504798 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-trusted-ca\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.504834 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-bound-sa-token\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.504866 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.504897 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-registry-tls\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.505139 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.505694 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-registry-certificates\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.507789 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-trusted-ca\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.512782 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.515161 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-registry-tls\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.522637 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-bound-sa-token\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: \"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.524812 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhx5w\" (UniqueName: \"kubernetes.io/projected/d644f7a9-e9ab-4136-a6c8-d22521f63dcf-kube-api-access-hhx5w\") pod \"image-registry-66df7c8f76-lfrnd\" (UID: 
\"d644f7a9-e9ab-4136-a6c8-d22521f63dcf\") " pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:53 crc kubenswrapper[5129]: I0314 07:05:53.602282 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:54 crc kubenswrapper[5129]: I0314 07:05:54.078891 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lfrnd"] Mar 14 07:05:54 crc kubenswrapper[5129]: I0314 07:05:54.947258 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" event={"ID":"d644f7a9-e9ab-4136-a6c8-d22521f63dcf","Type":"ContainerStarted","Data":"d93e7d9bfaa1f4552abe2e75c5bd850a3da75dc6f6710562d34e27c515869113"} Mar 14 07:05:54 crc kubenswrapper[5129]: I0314 07:05:54.947767 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" event={"ID":"d644f7a9-e9ab-4136-a6c8-d22521f63dcf","Type":"ContainerStarted","Data":"6c5db3cc3b158952eb8fc34e22733ccc097bad6cf68e9e509aec43881fbd5176"} Mar 14 07:05:54 crc kubenswrapper[5129]: I0314 07:05:54.947966 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 07:05:54 crc kubenswrapper[5129]: I0314 07:05:54.967283 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" podStartSLOduration=1.967262338 podStartE2EDuration="1.967262338s" podCreationTimestamp="2026-03-14 07:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:05:54.964772097 +0000 UTC m=+417.716687291" watchObservedRunningTime="2026-03-14 07:05:54.967262338 +0000 UTC m=+417.719177522" Mar 14 07:06:00 crc kubenswrapper[5129]: I0314 
07:06:00.148805 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557866-9nbbm"] Mar 14 07:06:00 crc kubenswrapper[5129]: I0314 07:06:00.150282 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557866-9nbbm" Mar 14 07:06:00 crc kubenswrapper[5129]: I0314 07:06:00.154442 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:06:00 crc kubenswrapper[5129]: I0314 07:06:00.154705 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:06:00 crc kubenswrapper[5129]: I0314 07:06:00.155745 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:06:00 crc kubenswrapper[5129]: I0314 07:06:00.163397 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557866-9nbbm"] Mar 14 07:06:00 crc kubenswrapper[5129]: I0314 07:06:00.319985 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hcsh\" (UniqueName: \"kubernetes.io/projected/70ee55e8-0691-4e28-ba66-cefd04b1a8f2-kube-api-access-8hcsh\") pod \"auto-csr-approver-29557866-9nbbm\" (UID: \"70ee55e8-0691-4e28-ba66-cefd04b1a8f2\") " pod="openshift-infra/auto-csr-approver-29557866-9nbbm" Mar 14 07:06:00 crc kubenswrapper[5129]: I0314 07:06:00.422401 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hcsh\" (UniqueName: \"kubernetes.io/projected/70ee55e8-0691-4e28-ba66-cefd04b1a8f2-kube-api-access-8hcsh\") pod \"auto-csr-approver-29557866-9nbbm\" (UID: \"70ee55e8-0691-4e28-ba66-cefd04b1a8f2\") " pod="openshift-infra/auto-csr-approver-29557866-9nbbm" Mar 14 07:06:00 crc kubenswrapper[5129]: I0314 07:06:00.444431 5129 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-8hcsh\" (UniqueName: \"kubernetes.io/projected/70ee55e8-0691-4e28-ba66-cefd04b1a8f2-kube-api-access-8hcsh\") pod \"auto-csr-approver-29557866-9nbbm\" (UID: \"70ee55e8-0691-4e28-ba66-cefd04b1a8f2\") " pod="openshift-infra/auto-csr-approver-29557866-9nbbm" Mar 14 07:06:00 crc kubenswrapper[5129]: I0314 07:06:00.481466 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557866-9nbbm" Mar 14 07:06:00 crc kubenswrapper[5129]: I0314 07:06:00.725423 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557866-9nbbm"] Mar 14 07:06:00 crc kubenswrapper[5129]: I0314 07:06:00.982820 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557866-9nbbm" event={"ID":"70ee55e8-0691-4e28-ba66-cefd04b1a8f2","Type":"ContainerStarted","Data":"beefb9098c6b87bf384af4e28ed365d90a39bff8f0158cf69727067f758642fa"} Mar 14 07:06:01 crc kubenswrapper[5129]: I0314 07:06:01.990436 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557866-9nbbm" event={"ID":"70ee55e8-0691-4e28-ba66-cefd04b1a8f2","Type":"ContainerStarted","Data":"fd4462ed591782bf9325564d88930212ab0327e76305e17016fb6f082e5d439e"} Mar 14 07:06:02 crc kubenswrapper[5129]: I0314 07:06:02.999149 5129 generic.go:334] "Generic (PLEG): container finished" podID="70ee55e8-0691-4e28-ba66-cefd04b1a8f2" containerID="fd4462ed591782bf9325564d88930212ab0327e76305e17016fb6f082e5d439e" exitCode=0 Mar 14 07:06:02 crc kubenswrapper[5129]: I0314 07:06:02.999186 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557866-9nbbm" event={"ID":"70ee55e8-0691-4e28-ba66-cefd04b1a8f2","Type":"ContainerDied","Data":"fd4462ed591782bf9325564d88930212ab0327e76305e17016fb6f082e5d439e"} Mar 14 07:06:03 crc kubenswrapper[5129]: I0314 07:06:03.329761 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557866-9nbbm" Mar 14 07:06:03 crc kubenswrapper[5129]: I0314 07:06:03.374751 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hcsh\" (UniqueName: \"kubernetes.io/projected/70ee55e8-0691-4e28-ba66-cefd04b1a8f2-kube-api-access-8hcsh\") pod \"70ee55e8-0691-4e28-ba66-cefd04b1a8f2\" (UID: \"70ee55e8-0691-4e28-ba66-cefd04b1a8f2\") " Mar 14 07:06:03 crc kubenswrapper[5129]: I0314 07:06:03.382347 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70ee55e8-0691-4e28-ba66-cefd04b1a8f2-kube-api-access-8hcsh" (OuterVolumeSpecName: "kube-api-access-8hcsh") pod "70ee55e8-0691-4e28-ba66-cefd04b1a8f2" (UID: "70ee55e8-0691-4e28-ba66-cefd04b1a8f2"). InnerVolumeSpecName "kube-api-access-8hcsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:06:03 crc kubenswrapper[5129]: I0314 07:06:03.476854 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hcsh\" (UniqueName: \"kubernetes.io/projected/70ee55e8-0691-4e28-ba66-cefd04b1a8f2-kube-api-access-8hcsh\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:04 crc kubenswrapper[5129]: I0314 07:06:04.006638 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557866-9nbbm" event={"ID":"70ee55e8-0691-4e28-ba66-cefd04b1a8f2","Type":"ContainerDied","Data":"beefb9098c6b87bf384af4e28ed365d90a39bff8f0158cf69727067f758642fa"} Mar 14 07:06:04 crc kubenswrapper[5129]: I0314 07:06:04.006672 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beefb9098c6b87bf384af4e28ed365d90a39bff8f0158cf69727067f758642fa" Mar 14 07:06:04 crc kubenswrapper[5129]: I0314 07:06:04.006748 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557866-9nbbm" Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.725053 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6t6l8"] Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.725975 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6t6l8" podUID="a848f19c-da50-403e-b620-5425b51fab9a" containerName="registry-server" containerID="cri-o://c5fda5f254298e5b858f8568449e0f20230cc86c9afe9a1eca8b8f730108ce30" gracePeriod=30 Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.744277 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gw4hz"] Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.744532 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gw4hz" podUID="ff0704b1-3b5e-4e02-9f82-c1d74ad03387" containerName="registry-server" containerID="cri-o://b56c67ed916dbfb9522d4eb33b4cb80614cf5dc13278fd879d13a84ed48ddf40" gracePeriod=30 Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.757551 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cb8c5"] Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.757966 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" podUID="7a298896-ef40-44ff-a9ba-45fba603014b" containerName="marketplace-operator" containerID="cri-o://5d511a9bb9f7d75284acc71525b28c408c0b269fe9f13b9f244a5410ef988af2" gracePeriod=30 Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.762099 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqck"] Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.762432 5129 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mdqck" podUID="2cdca220-d63a-45e2-ad4e-d2b920554116" containerName="registry-server" containerID="cri-o://ab7286767d8073deb30416644d38e366872aeccd9803f0a8008ab41b26a08609" gracePeriod=30 Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.767841 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j8zqt"] Mar 14 07:06:10 crc kubenswrapper[5129]: E0314 07:06:10.768134 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ee55e8-0691-4e28-ba66-cefd04b1a8f2" containerName="oc" Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.768156 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ee55e8-0691-4e28-ba66-cefd04b1a8f2" containerName="oc" Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.768283 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="70ee55e8-0691-4e28-ba66-cefd04b1a8f2" containerName="oc" Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.768768 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j8zqt" Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.779132 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fg5m2"] Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.779471 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fg5m2" podUID="647966c7-67bc-4945-a281-477f0f83496e" containerName="registry-server" containerID="cri-o://9a7c6f281770d9548b8e524a5a1504fb4a0316b6c4cd16b401285f80898ed820" gracePeriod=30 Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.784375 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j8zqt"] Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.879938 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wj7p\" (UniqueName: \"kubernetes.io/projected/dc3dfa46-c5d8-40d0-8f3b-b0522341edd6-kube-api-access-7wj7p\") pod \"marketplace-operator-79b997595-j8zqt\" (UID: \"dc3dfa46-c5d8-40d0-8f3b-b0522341edd6\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8zqt" Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.880023 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc3dfa46-c5d8-40d0-8f3b-b0522341edd6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j8zqt\" (UID: \"dc3dfa46-c5d8-40d0-8f3b-b0522341edd6\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8zqt" Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.880089 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/dc3dfa46-c5d8-40d0-8f3b-b0522341edd6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j8zqt\" (UID: \"dc3dfa46-c5d8-40d0-8f3b-b0522341edd6\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8zqt" Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.983194 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wj7p\" (UniqueName: \"kubernetes.io/projected/dc3dfa46-c5d8-40d0-8f3b-b0522341edd6-kube-api-access-7wj7p\") pod \"marketplace-operator-79b997595-j8zqt\" (UID: \"dc3dfa46-c5d8-40d0-8f3b-b0522341edd6\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8zqt" Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.983462 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc3dfa46-c5d8-40d0-8f3b-b0522341edd6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j8zqt\" (UID: \"dc3dfa46-c5d8-40d0-8f3b-b0522341edd6\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8zqt" Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.983515 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dc3dfa46-c5d8-40d0-8f3b-b0522341edd6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j8zqt\" (UID: \"dc3dfa46-c5d8-40d0-8f3b-b0522341edd6\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8zqt" Mar 14 07:06:10 crc kubenswrapper[5129]: I0314 07:06:10.987453 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc3dfa46-c5d8-40d0-8f3b-b0522341edd6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j8zqt\" (UID: \"dc3dfa46-c5d8-40d0-8f3b-b0522341edd6\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8zqt" Mar 14 07:06:10 
crc kubenswrapper[5129]: I0314 07:06:10.989045 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dc3dfa46-c5d8-40d0-8f3b-b0522341edd6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j8zqt\" (UID: \"dc3dfa46-c5d8-40d0-8f3b-b0522341edd6\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8zqt" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.001654 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wj7p\" (UniqueName: \"kubernetes.io/projected/dc3dfa46-c5d8-40d0-8f3b-b0522341edd6-kube-api-access-7wj7p\") pod \"marketplace-operator-79b997595-j8zqt\" (UID: \"dc3dfa46-c5d8-40d0-8f3b-b0522341edd6\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8zqt" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.054189 5129 generic.go:334] "Generic (PLEG): container finished" podID="7a298896-ef40-44ff-a9ba-45fba603014b" containerID="5d511a9bb9f7d75284acc71525b28c408c0b269fe9f13b9f244a5410ef988af2" exitCode=0 Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.054256 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" event={"ID":"7a298896-ef40-44ff-a9ba-45fba603014b","Type":"ContainerDied","Data":"5d511a9bb9f7d75284acc71525b28c408c0b269fe9f13b9f244a5410ef988af2"} Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.054288 5129 scope.go:117] "RemoveContainer" containerID="4c4b76bcb68a3ceacafa71db66403bba08494ef61e1651bf7b91e60e19f36504" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.056867 5129 generic.go:334] "Generic (PLEG): container finished" podID="ff0704b1-3b5e-4e02-9f82-c1d74ad03387" containerID="b56c67ed916dbfb9522d4eb33b4cb80614cf5dc13278fd879d13a84ed48ddf40" exitCode=0 Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.056916 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-gw4hz" event={"ID":"ff0704b1-3b5e-4e02-9f82-c1d74ad03387","Type":"ContainerDied","Data":"b56c67ed916dbfb9522d4eb33b4cb80614cf5dc13278fd879d13a84ed48ddf40"} Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.060120 5129 generic.go:334] "Generic (PLEG): container finished" podID="647966c7-67bc-4945-a281-477f0f83496e" containerID="9a7c6f281770d9548b8e524a5a1504fb4a0316b6c4cd16b401285f80898ed820" exitCode=0 Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.060139 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg5m2" event={"ID":"647966c7-67bc-4945-a281-477f0f83496e","Type":"ContainerDied","Data":"9a7c6f281770d9548b8e524a5a1504fb4a0316b6c4cd16b401285f80898ed820"} Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.062834 5129 generic.go:334] "Generic (PLEG): container finished" podID="a848f19c-da50-403e-b620-5425b51fab9a" containerID="c5fda5f254298e5b858f8568449e0f20230cc86c9afe9a1eca8b8f730108ce30" exitCode=0 Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.062899 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t6l8" event={"ID":"a848f19c-da50-403e-b620-5425b51fab9a","Type":"ContainerDied","Data":"c5fda5f254298e5b858f8568449e0f20230cc86c9afe9a1eca8b8f730108ce30"} Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.062930 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t6l8" event={"ID":"a848f19c-da50-403e-b620-5425b51fab9a","Type":"ContainerDied","Data":"b6b59c573404920b8eb8b1aabb297ba85372767af447631c8558ffd76070117f"} Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.062944 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6b59c573404920b8eb8b1aabb297ba85372767af447631c8558ffd76070117f" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.065292 5129 generic.go:334] "Generic (PLEG): 
container finished" podID="2cdca220-d63a-45e2-ad4e-d2b920554116" containerID="ab7286767d8073deb30416644d38e366872aeccd9803f0a8008ab41b26a08609" exitCode=0 Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.065326 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqck" event={"ID":"2cdca220-d63a-45e2-ad4e-d2b920554116","Type":"ContainerDied","Data":"ab7286767d8073deb30416644d38e366872aeccd9803f0a8008ab41b26a08609"} Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.087873 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j8zqt" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.181314 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6t6l8" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.188383 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gw4hz" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.207735 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdqck" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.214682 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.256277 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fg5m2" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.289052 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsxdf\" (UniqueName: \"kubernetes.io/projected/ff0704b1-3b5e-4e02-9f82-c1d74ad03387-kube-api-access-vsxdf\") pod \"ff0704b1-3b5e-4e02-9f82-c1d74ad03387\" (UID: \"ff0704b1-3b5e-4e02-9f82-c1d74ad03387\") " Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.289107 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a848f19c-da50-403e-b620-5425b51fab9a-utilities\") pod \"a848f19c-da50-403e-b620-5425b51fab9a\" (UID: \"a848f19c-da50-403e-b620-5425b51fab9a\") " Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.289134 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a848f19c-da50-403e-b620-5425b51fab9a-catalog-content\") pod \"a848f19c-da50-403e-b620-5425b51fab9a\" (UID: \"a848f19c-da50-403e-b620-5425b51fab9a\") " Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.289155 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a298896-ef40-44ff-a9ba-45fba603014b-marketplace-trusted-ca\") pod \"7a298896-ef40-44ff-a9ba-45fba603014b\" (UID: \"7a298896-ef40-44ff-a9ba-45fba603014b\") " Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.289184 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff0704b1-3b5e-4e02-9f82-c1d74ad03387-utilities\") pod \"ff0704b1-3b5e-4e02-9f82-c1d74ad03387\" (UID: \"ff0704b1-3b5e-4e02-9f82-c1d74ad03387\") " Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.289206 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cdca220-d63a-45e2-ad4e-d2b920554116-utilities\") pod \"2cdca220-d63a-45e2-ad4e-d2b920554116\" (UID: \"2cdca220-d63a-45e2-ad4e-d2b920554116\") " Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.289264 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-474v5\" (UniqueName: \"kubernetes.io/projected/7a298896-ef40-44ff-a9ba-45fba603014b-kube-api-access-474v5\") pod \"7a298896-ef40-44ff-a9ba-45fba603014b\" (UID: \"7a298896-ef40-44ff-a9ba-45fba603014b\") " Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.289288 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff0704b1-3b5e-4e02-9f82-c1d74ad03387-catalog-content\") pod \"ff0704b1-3b5e-4e02-9f82-c1d74ad03387\" (UID: \"ff0704b1-3b5e-4e02-9f82-c1d74ad03387\") " Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.289303 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft5dc\" (UniqueName: \"kubernetes.io/projected/2cdca220-d63a-45e2-ad4e-d2b920554116-kube-api-access-ft5dc\") pod \"2cdca220-d63a-45e2-ad4e-d2b920554116\" (UID: \"2cdca220-d63a-45e2-ad4e-d2b920554116\") " Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.289324 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7a298896-ef40-44ff-a9ba-45fba603014b-marketplace-operator-metrics\") pod \"7a298896-ef40-44ff-a9ba-45fba603014b\" (UID: \"7a298896-ef40-44ff-a9ba-45fba603014b\") " Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.289345 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwfmb\" (UniqueName: \"kubernetes.io/projected/a848f19c-da50-403e-b620-5425b51fab9a-kube-api-access-nwfmb\") pod \"a848f19c-da50-403e-b620-5425b51fab9a\" (UID: 
\"a848f19c-da50-403e-b620-5425b51fab9a\") " Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.289359 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cdca220-d63a-45e2-ad4e-d2b920554116-catalog-content\") pod \"2cdca220-d63a-45e2-ad4e-d2b920554116\" (UID: \"2cdca220-d63a-45e2-ad4e-d2b920554116\") " Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.290542 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cdca220-d63a-45e2-ad4e-d2b920554116-utilities" (OuterVolumeSpecName: "utilities") pod "2cdca220-d63a-45e2-ad4e-d2b920554116" (UID: "2cdca220-d63a-45e2-ad4e-d2b920554116"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.291131 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a298896-ef40-44ff-a9ba-45fba603014b-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "7a298896-ef40-44ff-a9ba-45fba603014b" (UID: "7a298896-ef40-44ff-a9ba-45fba603014b"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.291418 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff0704b1-3b5e-4e02-9f82-c1d74ad03387-utilities" (OuterVolumeSpecName: "utilities") pod "ff0704b1-3b5e-4e02-9f82-c1d74ad03387" (UID: "ff0704b1-3b5e-4e02-9f82-c1d74ad03387"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.291966 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a848f19c-da50-403e-b620-5425b51fab9a-utilities" (OuterVolumeSpecName: "utilities") pod "a848f19c-da50-403e-b620-5425b51fab9a" (UID: "a848f19c-da50-403e-b620-5425b51fab9a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.294955 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a298896-ef40-44ff-a9ba-45fba603014b-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "7a298896-ef40-44ff-a9ba-45fba603014b" (UID: "7a298896-ef40-44ff-a9ba-45fba603014b"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.295152 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a848f19c-da50-403e-b620-5425b51fab9a-kube-api-access-nwfmb" (OuterVolumeSpecName: "kube-api-access-nwfmb") pod "a848f19c-da50-403e-b620-5425b51fab9a" (UID: "a848f19c-da50-403e-b620-5425b51fab9a"). InnerVolumeSpecName "kube-api-access-nwfmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.295354 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff0704b1-3b5e-4e02-9f82-c1d74ad03387-kube-api-access-vsxdf" (OuterVolumeSpecName: "kube-api-access-vsxdf") pod "ff0704b1-3b5e-4e02-9f82-c1d74ad03387" (UID: "ff0704b1-3b5e-4e02-9f82-c1d74ad03387"). InnerVolumeSpecName "kube-api-access-vsxdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.295429 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a298896-ef40-44ff-a9ba-45fba603014b-kube-api-access-474v5" (OuterVolumeSpecName: "kube-api-access-474v5") pod "7a298896-ef40-44ff-a9ba-45fba603014b" (UID: "7a298896-ef40-44ff-a9ba-45fba603014b"). InnerVolumeSpecName "kube-api-access-474v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.295549 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cdca220-d63a-45e2-ad4e-d2b920554116-kube-api-access-ft5dc" (OuterVolumeSpecName: "kube-api-access-ft5dc") pod "2cdca220-d63a-45e2-ad4e-d2b920554116" (UID: "2cdca220-d63a-45e2-ad4e-d2b920554116"). InnerVolumeSpecName "kube-api-access-ft5dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.327468 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cdca220-d63a-45e2-ad4e-d2b920554116-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2cdca220-d63a-45e2-ad4e-d2b920554116" (UID: "2cdca220-d63a-45e2-ad4e-d2b920554116"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.359924 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff0704b1-3b5e-4e02-9f82-c1d74ad03387-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff0704b1-3b5e-4e02-9f82-c1d74ad03387" (UID: "ff0704b1-3b5e-4e02-9f82-c1d74ad03387"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.366338 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a848f19c-da50-403e-b620-5425b51fab9a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a848f19c-da50-403e-b620-5425b51fab9a" (UID: "a848f19c-da50-403e-b620-5425b51fab9a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.390740 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvbvl\" (UniqueName: \"kubernetes.io/projected/647966c7-67bc-4945-a281-477f0f83496e-kube-api-access-kvbvl\") pod \"647966c7-67bc-4945-a281-477f0f83496e\" (UID: \"647966c7-67bc-4945-a281-477f0f83496e\") " Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.390781 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/647966c7-67bc-4945-a281-477f0f83496e-utilities\") pod \"647966c7-67bc-4945-a281-477f0f83496e\" (UID: \"647966c7-67bc-4945-a281-477f0f83496e\") " Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.390824 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/647966c7-67bc-4945-a281-477f0f83496e-catalog-content\") pod \"647966c7-67bc-4945-a281-477f0f83496e\" (UID: \"647966c7-67bc-4945-a281-477f0f83496e\") " Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.391114 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a848f19c-da50-403e-b620-5425b51fab9a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.391133 5129 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/7a298896-ef40-44ff-a9ba-45fba603014b-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.391144 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff0704b1-3b5e-4e02-9f82-c1d74ad03387-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.391152 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cdca220-d63a-45e2-ad4e-d2b920554116-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.391161 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-474v5\" (UniqueName: \"kubernetes.io/projected/7a298896-ef40-44ff-a9ba-45fba603014b-kube-api-access-474v5\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.391169 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff0704b1-3b5e-4e02-9f82-c1d74ad03387-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.391177 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft5dc\" (UniqueName: \"kubernetes.io/projected/2cdca220-d63a-45e2-ad4e-d2b920554116-kube-api-access-ft5dc\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.391228 5129 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7a298896-ef40-44ff-a9ba-45fba603014b-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.391239 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwfmb\" (UniqueName: 
\"kubernetes.io/projected/a848f19c-da50-403e-b620-5425b51fab9a-kube-api-access-nwfmb\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.391247 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cdca220-d63a-45e2-ad4e-d2b920554116-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.391255 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsxdf\" (UniqueName: \"kubernetes.io/projected/ff0704b1-3b5e-4e02-9f82-c1d74ad03387-kube-api-access-vsxdf\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.391263 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a848f19c-da50-403e-b620-5425b51fab9a-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.391478 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/647966c7-67bc-4945-a281-477f0f83496e-utilities" (OuterVolumeSpecName: "utilities") pod "647966c7-67bc-4945-a281-477f0f83496e" (UID: "647966c7-67bc-4945-a281-477f0f83496e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.393834 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/647966c7-67bc-4945-a281-477f0f83496e-kube-api-access-kvbvl" (OuterVolumeSpecName: "kube-api-access-kvbvl") pod "647966c7-67bc-4945-a281-477f0f83496e" (UID: "647966c7-67bc-4945-a281-477f0f83496e"). InnerVolumeSpecName "kube-api-access-kvbvl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.492563 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvbvl\" (UniqueName: \"kubernetes.io/projected/647966c7-67bc-4945-a281-477f0f83496e-kube-api-access-kvbvl\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.492591 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/647966c7-67bc-4945-a281-477f0f83496e-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.533892 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/647966c7-67bc-4945-a281-477f0f83496e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "647966c7-67bc-4945-a281-477f0f83496e" (UID: "647966c7-67bc-4945-a281-477f0f83496e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.582671 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j8zqt"] Mar 14 07:06:11 crc kubenswrapper[5129]: I0314 07:06:11.593426 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/647966c7-67bc-4945-a281-477f0f83496e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.074166 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdqck" event={"ID":"2cdca220-d63a-45e2-ad4e-d2b920554116","Type":"ContainerDied","Data":"29b37f2af887874493fcff382ffac82f3321a7f8d31726eda8aa0d7dd3480f84"} Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.074181 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdqck" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.074716 5129 scope.go:117] "RemoveContainer" containerID="ab7286767d8073deb30416644d38e366872aeccd9803f0a8008ab41b26a08609" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.076172 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" event={"ID":"7a298896-ef40-44ff-a9ba-45fba603014b","Type":"ContainerDied","Data":"28fc0e8cf5aa88ada0759ce3b09f653b36f6e8df2b133f6c1b614d4358079722"} Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.076187 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cb8c5" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.084385 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gw4hz" event={"ID":"ff0704b1-3b5e-4e02-9f82-c1d74ad03387","Type":"ContainerDied","Data":"4ff447d4501e4c0b9bd80c0475fca4fb1bc6745f5f322d5dc5932cf4d14afeb8"} Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.084411 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gw4hz" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.089111 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fg5m2" event={"ID":"647966c7-67bc-4945-a281-477f0f83496e","Type":"ContainerDied","Data":"dff040fc114756b7df333a9840a152c0b48a03145eb2c50db65eb733f96802ad"} Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.089132 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fg5m2" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.095221 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6t6l8" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.095958 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j8zqt" event={"ID":"dc3dfa46-c5d8-40d0-8f3b-b0522341edd6","Type":"ContainerStarted","Data":"4a539cedf1feecc4eda0631cd3ece2b64780ee4483d5ccff3b98d6e50f8cda46"} Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.096005 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j8zqt" event={"ID":"dc3dfa46-c5d8-40d0-8f3b-b0522341edd6","Type":"ContainerStarted","Data":"e06b81996d4d7c5d0b3a905aeb0540ceb32bafad116e141273d1431c0ab8ba1f"} Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.096363 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-j8zqt" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.100634 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cb8c5"] Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.100920 5129 scope.go:117] "RemoveContainer" containerID="e666cbc843910b19e70ba5eb743eb0dbf3da67aaacac0fd04845e9331ec18950" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.103875 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-j8zqt" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.105593 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cb8c5"] Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.110559 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gw4hz"] Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.117538 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-gw4hz"] Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.129799 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqck"] Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.137271 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdqck"] Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.145869 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6t6l8"] Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.148852 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6t6l8"] Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.148936 5129 scope.go:117] "RemoveContainer" containerID="94298492eb0225c9e930006963d29648a3c2ea84c076689d23a10bed0c19ef10" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.158536 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fg5m2"] Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.160325 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fg5m2"] Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.181082 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-j8zqt" podStartSLOduration=2.181059435 podStartE2EDuration="2.181059435s" podCreationTimestamp="2026-03-14 07:06:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:06:12.176484096 +0000 UTC m=+434.928399280" watchObservedRunningTime="2026-03-14 07:06:12.181059435 +0000 UTC m=+434.932974629" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.182548 5129 scope.go:117] "RemoveContainer" 
containerID="5d511a9bb9f7d75284acc71525b28c408c0b269fe9f13b9f244a5410ef988af2" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.206872 5129 scope.go:117] "RemoveContainer" containerID="b56c67ed916dbfb9522d4eb33b4cb80614cf5dc13278fd879d13a84ed48ddf40" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.219514 5129 scope.go:117] "RemoveContainer" containerID="1346a0f9987b675f39ff7d6a9789cfbca306fe0ca919c5e2c4f83dc3afd3c37e" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.239142 5129 scope.go:117] "RemoveContainer" containerID="8825e5de0ec40557a273c02c3eeaca432046c8ae235cec0bc10458d619a39117" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.256338 5129 scope.go:117] "RemoveContainer" containerID="9a7c6f281770d9548b8e524a5a1504fb4a0316b6c4cd16b401285f80898ed820" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.268408 5129 scope.go:117] "RemoveContainer" containerID="f9df24927cdb20da53feb58f0a5c808ef05ba7dfb79907d29aa6d9181f0cdeff" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.282161 5129 scope.go:117] "RemoveContainer" containerID="cd801e5cc3681e7ea8d6e92752d0b2a37dc53b6316fcecd7259b9c0eea27e300" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.940996 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-strv7"] Mar 14 07:06:12 crc kubenswrapper[5129]: E0314 07:06:12.941183 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cdca220-d63a-45e2-ad4e-d2b920554116" containerName="extract-content" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941195 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cdca220-d63a-45e2-ad4e-d2b920554116" containerName="extract-content" Mar 14 07:06:12 crc kubenswrapper[5129]: E0314 07:06:12.941207 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0704b1-3b5e-4e02-9f82-c1d74ad03387" containerName="registry-server" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941212 5129 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0704b1-3b5e-4e02-9f82-c1d74ad03387" containerName="registry-server" Mar 14 07:06:12 crc kubenswrapper[5129]: E0314 07:06:12.941220 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a848f19c-da50-403e-b620-5425b51fab9a" containerName="extract-utilities" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941226 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="a848f19c-da50-403e-b620-5425b51fab9a" containerName="extract-utilities" Mar 14 07:06:12 crc kubenswrapper[5129]: E0314 07:06:12.941233 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a298896-ef40-44ff-a9ba-45fba603014b" containerName="marketplace-operator" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941239 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a298896-ef40-44ff-a9ba-45fba603014b" containerName="marketplace-operator" Mar 14 07:06:12 crc kubenswrapper[5129]: E0314 07:06:12.941246 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0704b1-3b5e-4e02-9f82-c1d74ad03387" containerName="extract-utilities" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941252 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0704b1-3b5e-4e02-9f82-c1d74ad03387" containerName="extract-utilities" Mar 14 07:06:12 crc kubenswrapper[5129]: E0314 07:06:12.941260 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647966c7-67bc-4945-a281-477f0f83496e" containerName="extract-content" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941265 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="647966c7-67bc-4945-a281-477f0f83496e" containerName="extract-content" Mar 14 07:06:12 crc kubenswrapper[5129]: E0314 07:06:12.941274 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0704b1-3b5e-4e02-9f82-c1d74ad03387" containerName="extract-content" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941279 
5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0704b1-3b5e-4e02-9f82-c1d74ad03387" containerName="extract-content" Mar 14 07:06:12 crc kubenswrapper[5129]: E0314 07:06:12.941287 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a848f19c-da50-403e-b620-5425b51fab9a" containerName="registry-server" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941295 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="a848f19c-da50-403e-b620-5425b51fab9a" containerName="registry-server" Mar 14 07:06:12 crc kubenswrapper[5129]: E0314 07:06:12.941305 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647966c7-67bc-4945-a281-477f0f83496e" containerName="registry-server" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941310 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="647966c7-67bc-4945-a281-477f0f83496e" containerName="registry-server" Mar 14 07:06:12 crc kubenswrapper[5129]: E0314 07:06:12.941320 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a298896-ef40-44ff-a9ba-45fba603014b" containerName="marketplace-operator" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941326 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a298896-ef40-44ff-a9ba-45fba603014b" containerName="marketplace-operator" Mar 14 07:06:12 crc kubenswrapper[5129]: E0314 07:06:12.941337 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a848f19c-da50-403e-b620-5425b51fab9a" containerName="extract-content" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941343 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="a848f19c-da50-403e-b620-5425b51fab9a" containerName="extract-content" Mar 14 07:06:12 crc kubenswrapper[5129]: E0314 07:06:12.941351 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cdca220-d63a-45e2-ad4e-d2b920554116" containerName="extract-utilities" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941357 
5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cdca220-d63a-45e2-ad4e-d2b920554116" containerName="extract-utilities" Mar 14 07:06:12 crc kubenswrapper[5129]: E0314 07:06:12.941362 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647966c7-67bc-4945-a281-477f0f83496e" containerName="extract-utilities" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941368 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="647966c7-67bc-4945-a281-477f0f83496e" containerName="extract-utilities" Mar 14 07:06:12 crc kubenswrapper[5129]: E0314 07:06:12.941378 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cdca220-d63a-45e2-ad4e-d2b920554116" containerName="registry-server" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941383 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cdca220-d63a-45e2-ad4e-d2b920554116" containerName="registry-server" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941457 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a298896-ef40-44ff-a9ba-45fba603014b" containerName="marketplace-operator" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941466 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cdca220-d63a-45e2-ad4e-d2b920554116" containerName="registry-server" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941475 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0704b1-3b5e-4e02-9f82-c1d74ad03387" containerName="registry-server" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941483 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="a848f19c-da50-403e-b620-5425b51fab9a" containerName="registry-server" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.941494 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a298896-ef40-44ff-a9ba-45fba603014b" containerName="marketplace-operator" Mar 14 07:06:12 crc kubenswrapper[5129]: 
I0314 07:06:12.941501 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="647966c7-67bc-4945-a281-477f0f83496e" containerName="registry-server" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.942185 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-strv7" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.944289 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 14 07:06:12 crc kubenswrapper[5129]: I0314 07:06:12.953848 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-strv7"] Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.012533 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6-catalog-content\") pod \"certified-operators-strv7\" (UID: \"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6\") " pod="openshift-marketplace/certified-operators-strv7" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.012620 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6-utilities\") pod \"certified-operators-strv7\" (UID: \"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6\") " pod="openshift-marketplace/certified-operators-strv7" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.012655 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkvkp\" (UniqueName: \"kubernetes.io/projected/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6-kube-api-access-jkvkp\") pod \"certified-operators-strv7\" (UID: \"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6\") " pod="openshift-marketplace/certified-operators-strv7" Mar 14 07:06:13 crc 
kubenswrapper[5129]: I0314 07:06:13.113947 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6-utilities\") pod \"certified-operators-strv7\" (UID: \"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6\") " pod="openshift-marketplace/certified-operators-strv7" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.114024 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkvkp\" (UniqueName: \"kubernetes.io/projected/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6-kube-api-access-jkvkp\") pod \"certified-operators-strv7\" (UID: \"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6\") " pod="openshift-marketplace/certified-operators-strv7" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.114169 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6-catalog-content\") pod \"certified-operators-strv7\" (UID: \"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6\") " pod="openshift-marketplace/certified-operators-strv7" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.114650 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6-catalog-content\") pod \"certified-operators-strv7\" (UID: \"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6\") " pod="openshift-marketplace/certified-operators-strv7" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.115150 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6-utilities\") pod \"certified-operators-strv7\" (UID: \"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6\") " pod="openshift-marketplace/certified-operators-strv7" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.141125 
5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gkxcz"] Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.143948 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gkxcz" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.150095 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.159571 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gkxcz"] Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.160744 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkvkp\" (UniqueName: \"kubernetes.io/projected/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6-kube-api-access-jkvkp\") pod \"certified-operators-strv7\" (UID: \"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6\") " pod="openshift-marketplace/certified-operators-strv7" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.216134 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/312716da-06d0-4ad7-9edf-f659d31db550-utilities\") pod \"redhat-marketplace-gkxcz\" (UID: \"312716da-06d0-4ad7-9edf-f659d31db550\") " pod="openshift-marketplace/redhat-marketplace-gkxcz" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.216475 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6qz7\" (UniqueName: \"kubernetes.io/projected/312716da-06d0-4ad7-9edf-f659d31db550-kube-api-access-n6qz7\") pod \"redhat-marketplace-gkxcz\" (UID: \"312716da-06d0-4ad7-9edf-f659d31db550\") " pod="openshift-marketplace/redhat-marketplace-gkxcz" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.216593 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/312716da-06d0-4ad7-9edf-f659d31db550-catalog-content\") pod \"redhat-marketplace-gkxcz\" (UID: \"312716da-06d0-4ad7-9edf-f659d31db550\") " pod="openshift-marketplace/redhat-marketplace-gkxcz" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.275245 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-strv7" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.322165 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/312716da-06d0-4ad7-9edf-f659d31db550-catalog-content\") pod \"redhat-marketplace-gkxcz\" (UID: \"312716da-06d0-4ad7-9edf-f659d31db550\") " pod="openshift-marketplace/redhat-marketplace-gkxcz" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.322288 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/312716da-06d0-4ad7-9edf-f659d31db550-utilities\") pod \"redhat-marketplace-gkxcz\" (UID: \"312716da-06d0-4ad7-9edf-f659d31db550\") " pod="openshift-marketplace/redhat-marketplace-gkxcz" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.322338 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6qz7\" (UniqueName: \"kubernetes.io/projected/312716da-06d0-4ad7-9edf-f659d31db550-kube-api-access-n6qz7\") pod \"redhat-marketplace-gkxcz\" (UID: \"312716da-06d0-4ad7-9edf-f659d31db550\") " pod="openshift-marketplace/redhat-marketplace-gkxcz" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.323154 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/312716da-06d0-4ad7-9edf-f659d31db550-utilities\") pod \"redhat-marketplace-gkxcz\" (UID: 
\"312716da-06d0-4ad7-9edf-f659d31db550\") " pod="openshift-marketplace/redhat-marketplace-gkxcz" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.327458 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/312716da-06d0-4ad7-9edf-f659d31db550-catalog-content\") pod \"redhat-marketplace-gkxcz\" (UID: \"312716da-06d0-4ad7-9edf-f659d31db550\") " pod="openshift-marketplace/redhat-marketplace-gkxcz" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.343939 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6qz7\" (UniqueName: \"kubernetes.io/projected/312716da-06d0-4ad7-9edf-f659d31db550-kube-api-access-n6qz7\") pod \"redhat-marketplace-gkxcz\" (UID: \"312716da-06d0-4ad7-9edf-f659d31db550\") " pod="openshift-marketplace/redhat-marketplace-gkxcz" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.487481 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gkxcz" Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.586693 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-strv7"] Mar 14 07:06:13 crc kubenswrapper[5129]: W0314 07:06:13.605936 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod215cccb9_4c4f_4dc4_9c7a_607a4d8079b6.slice/crio-13ad314c505d7e6a5c5f2a4b3ffb72245cc636c14ba1292d090a5f2e9b616843 WatchSource:0}: Error finding container 13ad314c505d7e6a5c5f2a4b3ffb72245cc636c14ba1292d090a5f2e9b616843: Status 404 returned error can't find the container with id 13ad314c505d7e6a5c5f2a4b3ffb72245cc636c14ba1292d090a5f2e9b616843 Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.606771 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-lfrnd" Mar 14 
07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.653990 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pggtq"] Mar 14 07:06:13 crc kubenswrapper[5129]: I0314 07:06:13.724095 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gkxcz"] Mar 14 07:06:13 crc kubenswrapper[5129]: W0314 07:06:13.732098 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod312716da_06d0_4ad7_9edf_f659d31db550.slice/crio-573cfffb419c6a7292570f8e0888d72689a3d41de0ddc022643fa67a4f9eb527 WatchSource:0}: Error finding container 573cfffb419c6a7292570f8e0888d72689a3d41de0ddc022643fa67a4f9eb527: Status 404 returned error can't find the container with id 573cfffb419c6a7292570f8e0888d72689a3d41de0ddc022643fa67a4f9eb527 Mar 14 07:06:14 crc kubenswrapper[5129]: I0314 07:06:14.042001 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cdca220-d63a-45e2-ad4e-d2b920554116" path="/var/lib/kubelet/pods/2cdca220-d63a-45e2-ad4e-d2b920554116/volumes" Mar 14 07:06:14 crc kubenswrapper[5129]: I0314 07:06:14.043343 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="647966c7-67bc-4945-a281-477f0f83496e" path="/var/lib/kubelet/pods/647966c7-67bc-4945-a281-477f0f83496e/volumes" Mar 14 07:06:14 crc kubenswrapper[5129]: I0314 07:06:14.044281 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a298896-ef40-44ff-a9ba-45fba603014b" path="/var/lib/kubelet/pods/7a298896-ef40-44ff-a9ba-45fba603014b/volumes" Mar 14 07:06:14 crc kubenswrapper[5129]: I0314 07:06:14.045352 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a848f19c-da50-403e-b620-5425b51fab9a" path="/var/lib/kubelet/pods/a848f19c-da50-403e-b620-5425b51fab9a/volumes" Mar 14 07:06:14 crc kubenswrapper[5129]: I0314 07:06:14.046484 5129 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="ff0704b1-3b5e-4e02-9f82-c1d74ad03387" path="/var/lib/kubelet/pods/ff0704b1-3b5e-4e02-9f82-c1d74ad03387/volumes" Mar 14 07:06:14 crc kubenswrapper[5129]: I0314 07:06:14.111946 5129 generic.go:334] "Generic (PLEG): container finished" podID="215cccb9-4c4f-4dc4-9c7a-607a4d8079b6" containerID="12a6be0b93ad7fcc33126034ae9eb6e1fdf358c4b571963506c4b0e9b556e847" exitCode=0 Mar 14 07:06:14 crc kubenswrapper[5129]: I0314 07:06:14.111978 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-strv7" event={"ID":"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6","Type":"ContainerDied","Data":"12a6be0b93ad7fcc33126034ae9eb6e1fdf358c4b571963506c4b0e9b556e847"} Mar 14 07:06:14 crc kubenswrapper[5129]: I0314 07:06:14.112010 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-strv7" event={"ID":"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6","Type":"ContainerStarted","Data":"13ad314c505d7e6a5c5f2a4b3ffb72245cc636c14ba1292d090a5f2e9b616843"} Mar 14 07:06:14 crc kubenswrapper[5129]: I0314 07:06:14.113358 5129 generic.go:334] "Generic (PLEG): container finished" podID="312716da-06d0-4ad7-9edf-f659d31db550" containerID="0b9a7364738933763edeff21805d281115febb53fddf36b5d329f43283df407c" exitCode=0 Mar 14 07:06:14 crc kubenswrapper[5129]: I0314 07:06:14.113391 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gkxcz" event={"ID":"312716da-06d0-4ad7-9edf-f659d31db550","Type":"ContainerDied","Data":"0b9a7364738933763edeff21805d281115febb53fddf36b5d329f43283df407c"} Mar 14 07:06:14 crc kubenswrapper[5129]: I0314 07:06:14.113415 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gkxcz" event={"ID":"312716da-06d0-4ad7-9edf-f659d31db550","Type":"ContainerStarted","Data":"573cfffb419c6a7292570f8e0888d72689a3d41de0ddc022643fa67a4f9eb527"} Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.119010 
5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-strv7" event={"ID":"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6","Type":"ContainerStarted","Data":"ab8da096f25973d88d30b9c7cd076e230953e47ab295242f1f1613b05a251df4"} Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.120902 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gkxcz" event={"ID":"312716da-06d0-4ad7-9edf-f659d31db550","Type":"ContainerStarted","Data":"88779b2f644768eaaf3ac45aea801b7ef168460458b5d9bf628182df7b335431"} Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.543222 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mc684"] Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.544227 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mc684" Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.547875 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.562712 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mc684"] Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.720578 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de8a905c-2cbf-426e-8272-fa1897f95c39-utilities\") pod \"community-operators-mc684\" (UID: \"de8a905c-2cbf-426e-8272-fa1897f95c39\") " pod="openshift-marketplace/community-operators-mc684" Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.720667 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj77k\" (UniqueName: 
\"kubernetes.io/projected/de8a905c-2cbf-426e-8272-fa1897f95c39-kube-api-access-vj77k\") pod \"community-operators-mc684\" (UID: \"de8a905c-2cbf-426e-8272-fa1897f95c39\") " pod="openshift-marketplace/community-operators-mc684" Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.720714 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de8a905c-2cbf-426e-8272-fa1897f95c39-catalog-content\") pod \"community-operators-mc684\" (UID: \"de8a905c-2cbf-426e-8272-fa1897f95c39\") " pod="openshift-marketplace/community-operators-mc684" Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.741427 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-86hcf"] Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.742457 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86hcf" Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.745897 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.752739 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-86hcf"] Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.821628 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de8a905c-2cbf-426e-8272-fa1897f95c39-utilities\") pod \"community-operators-mc684\" (UID: \"de8a905c-2cbf-426e-8272-fa1897f95c39\") " pod="openshift-marketplace/community-operators-mc684" Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.821670 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj77k\" (UniqueName: 
\"kubernetes.io/projected/de8a905c-2cbf-426e-8272-fa1897f95c39-kube-api-access-vj77k\") pod \"community-operators-mc684\" (UID: \"de8a905c-2cbf-426e-8272-fa1897f95c39\") " pod="openshift-marketplace/community-operators-mc684" Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.821696 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de8a905c-2cbf-426e-8272-fa1897f95c39-catalog-content\") pod \"community-operators-mc684\" (UID: \"de8a905c-2cbf-426e-8272-fa1897f95c39\") " pod="openshift-marketplace/community-operators-mc684" Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.822083 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de8a905c-2cbf-426e-8272-fa1897f95c39-utilities\") pod \"community-operators-mc684\" (UID: \"de8a905c-2cbf-426e-8272-fa1897f95c39\") " pod="openshift-marketplace/community-operators-mc684" Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.822138 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de8a905c-2cbf-426e-8272-fa1897f95c39-catalog-content\") pod \"community-operators-mc684\" (UID: \"de8a905c-2cbf-426e-8272-fa1897f95c39\") " pod="openshift-marketplace/community-operators-mc684" Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.840718 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj77k\" (UniqueName: \"kubernetes.io/projected/de8a905c-2cbf-426e-8272-fa1897f95c39-kube-api-access-vj77k\") pod \"community-operators-mc684\" (UID: \"de8a905c-2cbf-426e-8272-fa1897f95c39\") " pod="openshift-marketplace/community-operators-mc684" Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.872573 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mc684" Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.922893 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmsq6\" (UniqueName: \"kubernetes.io/projected/65592ebe-e824-4d7d-9385-9073d54404e0-kube-api-access-nmsq6\") pod \"redhat-operators-86hcf\" (UID: \"65592ebe-e824-4d7d-9385-9073d54404e0\") " pod="openshift-marketplace/redhat-operators-86hcf" Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.923025 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65592ebe-e824-4d7d-9385-9073d54404e0-catalog-content\") pod \"redhat-operators-86hcf\" (UID: \"65592ebe-e824-4d7d-9385-9073d54404e0\") " pod="openshift-marketplace/redhat-operators-86hcf" Mar 14 07:06:15 crc kubenswrapper[5129]: I0314 07:06:15.923104 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65592ebe-e824-4d7d-9385-9073d54404e0-utilities\") pod \"redhat-operators-86hcf\" (UID: \"65592ebe-e824-4d7d-9385-9073d54404e0\") " pod="openshift-marketplace/redhat-operators-86hcf" Mar 14 07:06:16 crc kubenswrapper[5129]: I0314 07:06:16.025880 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmsq6\" (UniqueName: \"kubernetes.io/projected/65592ebe-e824-4d7d-9385-9073d54404e0-kube-api-access-nmsq6\") pod \"redhat-operators-86hcf\" (UID: \"65592ebe-e824-4d7d-9385-9073d54404e0\") " pod="openshift-marketplace/redhat-operators-86hcf" Mar 14 07:06:16 crc kubenswrapper[5129]: I0314 07:06:16.025970 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65592ebe-e824-4d7d-9385-9073d54404e0-catalog-content\") pod 
\"redhat-operators-86hcf\" (UID: \"65592ebe-e824-4d7d-9385-9073d54404e0\") " pod="openshift-marketplace/redhat-operators-86hcf" Mar 14 07:06:16 crc kubenswrapper[5129]: I0314 07:06:16.026016 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65592ebe-e824-4d7d-9385-9073d54404e0-utilities\") pod \"redhat-operators-86hcf\" (UID: \"65592ebe-e824-4d7d-9385-9073d54404e0\") " pod="openshift-marketplace/redhat-operators-86hcf" Mar 14 07:06:16 crc kubenswrapper[5129]: I0314 07:06:16.026965 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65592ebe-e824-4d7d-9385-9073d54404e0-utilities\") pod \"redhat-operators-86hcf\" (UID: \"65592ebe-e824-4d7d-9385-9073d54404e0\") " pod="openshift-marketplace/redhat-operators-86hcf" Mar 14 07:06:16 crc kubenswrapper[5129]: I0314 07:06:16.026971 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65592ebe-e824-4d7d-9385-9073d54404e0-catalog-content\") pod \"redhat-operators-86hcf\" (UID: \"65592ebe-e824-4d7d-9385-9073d54404e0\") " pod="openshift-marketplace/redhat-operators-86hcf" Mar 14 07:06:16 crc kubenswrapper[5129]: I0314 07:06:16.048512 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmsq6\" (UniqueName: \"kubernetes.io/projected/65592ebe-e824-4d7d-9385-9073d54404e0-kube-api-access-nmsq6\") pod \"redhat-operators-86hcf\" (UID: \"65592ebe-e824-4d7d-9385-9073d54404e0\") " pod="openshift-marketplace/redhat-operators-86hcf" Mar 14 07:06:16 crc kubenswrapper[5129]: I0314 07:06:16.050196 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mc684"] Mar 14 07:06:16 crc kubenswrapper[5129]: W0314 07:06:16.050685 5129 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde8a905c_2cbf_426e_8272_fa1897f95c39.slice/crio-214156216dc86dec446259ca979f291cf6ce569a8efb2644054840269e1bc6b7 WatchSource:0}: Error finding container 214156216dc86dec446259ca979f291cf6ce569a8efb2644054840269e1bc6b7: Status 404 returned error can't find the container with id 214156216dc86dec446259ca979f291cf6ce569a8efb2644054840269e1bc6b7 Mar 14 07:06:16 crc kubenswrapper[5129]: I0314 07:06:16.102260 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86hcf" Mar 14 07:06:16 crc kubenswrapper[5129]: I0314 07:06:16.134369 5129 generic.go:334] "Generic (PLEG): container finished" podID="215cccb9-4c4f-4dc4-9c7a-607a4d8079b6" containerID="ab8da096f25973d88d30b9c7cd076e230953e47ab295242f1f1613b05a251df4" exitCode=0 Mar 14 07:06:16 crc kubenswrapper[5129]: I0314 07:06:16.134455 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-strv7" event={"ID":"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6","Type":"ContainerDied","Data":"ab8da096f25973d88d30b9c7cd076e230953e47ab295242f1f1613b05a251df4"} Mar 14 07:06:16 crc kubenswrapper[5129]: I0314 07:06:16.138451 5129 generic.go:334] "Generic (PLEG): container finished" podID="312716da-06d0-4ad7-9edf-f659d31db550" containerID="88779b2f644768eaaf3ac45aea801b7ef168460458b5d9bf628182df7b335431" exitCode=0 Mar 14 07:06:16 crc kubenswrapper[5129]: I0314 07:06:16.138548 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gkxcz" event={"ID":"312716da-06d0-4ad7-9edf-f659d31db550","Type":"ContainerDied","Data":"88779b2f644768eaaf3ac45aea801b7ef168460458b5d9bf628182df7b335431"} Mar 14 07:06:16 crc kubenswrapper[5129]: I0314 07:06:16.143888 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc684" 
event={"ID":"de8a905c-2cbf-426e-8272-fa1897f95c39","Type":"ContainerStarted","Data":"214156216dc86dec446259ca979f291cf6ce569a8efb2644054840269e1bc6b7"} Mar 14 07:06:16 crc kubenswrapper[5129]: I0314 07:06:16.286185 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-86hcf"] Mar 14 07:06:16 crc kubenswrapper[5129]: W0314 07:06:16.292773 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65592ebe_e824_4d7d_9385_9073d54404e0.slice/crio-fc9dbe2db3e8dc903ad1e368bdfbc92fcc46668ce9ce63cdeae3d32feded1244 WatchSource:0}: Error finding container fc9dbe2db3e8dc903ad1e368bdfbc92fcc46668ce9ce63cdeae3d32feded1244: Status 404 returned error can't find the container with id fc9dbe2db3e8dc903ad1e368bdfbc92fcc46668ce9ce63cdeae3d32feded1244 Mar 14 07:06:17 crc kubenswrapper[5129]: I0314 07:06:17.149591 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-strv7" event={"ID":"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6","Type":"ContainerStarted","Data":"ec152d5d802793035ccc850a9e49efc93814f47cd39385fb1e9fb174a720eb2f"} Mar 14 07:06:17 crc kubenswrapper[5129]: I0314 07:06:17.150879 5129 generic.go:334] "Generic (PLEG): container finished" podID="65592ebe-e824-4d7d-9385-9073d54404e0" containerID="23c937adb368ead8f803c405a40e4a6a1a5ec33da561c3dce321cabdfecf0dec" exitCode=0 Mar 14 07:06:17 crc kubenswrapper[5129]: I0314 07:06:17.150935 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86hcf" event={"ID":"65592ebe-e824-4d7d-9385-9073d54404e0","Type":"ContainerDied","Data":"23c937adb368ead8f803c405a40e4a6a1a5ec33da561c3dce321cabdfecf0dec"} Mar 14 07:06:17 crc kubenswrapper[5129]: I0314 07:06:17.150954 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86hcf" 
event={"ID":"65592ebe-e824-4d7d-9385-9073d54404e0","Type":"ContainerStarted","Data":"fc9dbe2db3e8dc903ad1e368bdfbc92fcc46668ce9ce63cdeae3d32feded1244"} Mar 14 07:06:17 crc kubenswrapper[5129]: I0314 07:06:17.156426 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gkxcz" event={"ID":"312716da-06d0-4ad7-9edf-f659d31db550","Type":"ContainerStarted","Data":"9962412eed2ec4573fe3a6876fa061debc6192a9a84f947b25c6116acaf9e711"} Mar 14 07:06:17 crc kubenswrapper[5129]: I0314 07:06:17.159481 5129 generic.go:334] "Generic (PLEG): container finished" podID="de8a905c-2cbf-426e-8272-fa1897f95c39" containerID="65e1caa371f1b5a059a7c758d882bd8f051ad55860e8a1b494cf9cc1ee226f8c" exitCode=0 Mar 14 07:06:17 crc kubenswrapper[5129]: I0314 07:06:17.159519 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc684" event={"ID":"de8a905c-2cbf-426e-8272-fa1897f95c39","Type":"ContainerDied","Data":"65e1caa371f1b5a059a7c758d882bd8f051ad55860e8a1b494cf9cc1ee226f8c"} Mar 14 07:06:17 crc kubenswrapper[5129]: I0314 07:06:17.169139 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-strv7" podStartSLOduration=2.640212107 podStartE2EDuration="5.16912564s" podCreationTimestamp="2026-03-14 07:06:12 +0000 UTC" firstStartedPulling="2026-03-14 07:06:14.113296323 +0000 UTC m=+436.865211497" lastFinishedPulling="2026-03-14 07:06:16.642209806 +0000 UTC m=+439.394125030" observedRunningTime="2026-03-14 07:06:17.168626026 +0000 UTC m=+439.920541210" watchObservedRunningTime="2026-03-14 07:06:17.16912564 +0000 UTC m=+439.921040824" Mar 14 07:06:17 crc kubenswrapper[5129]: I0314 07:06:17.221205 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gkxcz" podStartSLOduration=1.635905476 podStartE2EDuration="4.221190461s" podCreationTimestamp="2026-03-14 07:06:13 +0000 UTC" 
firstStartedPulling="2026-03-14 07:06:14.114549579 +0000 UTC m=+436.866464773" lastFinishedPulling="2026-03-14 07:06:16.699834564 +0000 UTC m=+439.451749758" observedRunningTime="2026-03-14 07:06:17.21939809 +0000 UTC m=+439.971313284" watchObservedRunningTime="2026-03-14 07:06:17.221190461 +0000 UTC m=+439.973105645" Mar 14 07:06:18 crc kubenswrapper[5129]: I0314 07:06:18.167016 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86hcf" event={"ID":"65592ebe-e824-4d7d-9385-9073d54404e0","Type":"ContainerStarted","Data":"fd7ae524da7356119611c5966d0d30f223b69e54699e5cebf390bea18e1589a7"} Mar 14 07:06:19 crc kubenswrapper[5129]: I0314 07:06:19.177249 5129 generic.go:334] "Generic (PLEG): container finished" podID="de8a905c-2cbf-426e-8272-fa1897f95c39" containerID="1fcadc9bffb4c85ce81e6004b70904db66a7cdf2eba5a3e397abf0d3f3c78540" exitCode=0 Mar 14 07:06:19 crc kubenswrapper[5129]: I0314 07:06:19.178931 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc684" event={"ID":"de8a905c-2cbf-426e-8272-fa1897f95c39","Type":"ContainerDied","Data":"1fcadc9bffb4c85ce81e6004b70904db66a7cdf2eba5a3e397abf0d3f3c78540"} Mar 14 07:06:19 crc kubenswrapper[5129]: I0314 07:06:19.191049 5129 generic.go:334] "Generic (PLEG): container finished" podID="65592ebe-e824-4d7d-9385-9073d54404e0" containerID="fd7ae524da7356119611c5966d0d30f223b69e54699e5cebf390bea18e1589a7" exitCode=0 Mar 14 07:06:19 crc kubenswrapper[5129]: I0314 07:06:19.191176 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86hcf" event={"ID":"65592ebe-e824-4d7d-9385-9073d54404e0","Type":"ContainerDied","Data":"fd7ae524da7356119611c5966d0d30f223b69e54699e5cebf390bea18e1589a7"} Mar 14 07:06:19 crc kubenswrapper[5129]: I0314 07:06:19.574809 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:06:19 crc kubenswrapper[5129]: I0314 07:06:19.575119 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:06:19 crc kubenswrapper[5129]: I0314 07:06:19.575307 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:06:19 crc kubenswrapper[5129]: I0314 07:06:19.576104 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"530282bf4b5fced32ff8a4f896195aed87fef0330a342177990023776d5d6856"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:06:19 crc kubenswrapper[5129]: I0314 07:06:19.576405 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://530282bf4b5fced32ff8a4f896195aed87fef0330a342177990023776d5d6856" gracePeriod=600 Mar 14 07:06:20 crc kubenswrapper[5129]: I0314 07:06:20.196708 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc684" event={"ID":"de8a905c-2cbf-426e-8272-fa1897f95c39","Type":"ContainerStarted","Data":"759440f4fa219c00e39d4322715d8260cf42be47cb6771beeb2dd0415460ddff"} Mar 14 07:06:20 crc kubenswrapper[5129]: I0314 
07:06:20.198967 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86hcf" event={"ID":"65592ebe-e824-4d7d-9385-9073d54404e0","Type":"ContainerStarted","Data":"28957d14751d33ea6a47bc7dfe7c29cf43505a90dc5c5d68937387956a1b2287"} Mar 14 07:06:20 crc kubenswrapper[5129]: I0314 07:06:20.200893 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="530282bf4b5fced32ff8a4f896195aed87fef0330a342177990023776d5d6856" exitCode=0 Mar 14 07:06:20 crc kubenswrapper[5129]: I0314 07:06:20.200916 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"530282bf4b5fced32ff8a4f896195aed87fef0330a342177990023776d5d6856"} Mar 14 07:06:20 crc kubenswrapper[5129]: I0314 07:06:20.200931 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"ad9a3d8e10cfbb5f940ec24c74e985983e64c59e08753d0434f8b007988ceac6"} Mar 14 07:06:20 crc kubenswrapper[5129]: I0314 07:06:20.200946 5129 scope.go:117] "RemoveContainer" containerID="74f3f6bc39d369e18bc86f5d19b16adaf6821d2084179aac9f8e55c7b7de59d2" Mar 14 07:06:20 crc kubenswrapper[5129]: I0314 07:06:20.219444 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mc684" podStartSLOduration=2.823428891 podStartE2EDuration="5.21942561s" podCreationTimestamp="2026-03-14 07:06:15 +0000 UTC" firstStartedPulling="2026-03-14 07:06:17.160475796 +0000 UTC m=+439.912390980" lastFinishedPulling="2026-03-14 07:06:19.556472475 +0000 UTC m=+442.308387699" observedRunningTime="2026-03-14 07:06:20.215011026 +0000 UTC m=+442.966926220" watchObservedRunningTime="2026-03-14 07:06:20.21942561 +0000 UTC 
m=+442.971340804" Mar 14 07:06:20 crc kubenswrapper[5129]: I0314 07:06:20.255515 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-86hcf" podStartSLOduration=2.746481018 podStartE2EDuration="5.255499699s" podCreationTimestamp="2026-03-14 07:06:15 +0000 UTC" firstStartedPulling="2026-03-14 07:06:17.152088959 +0000 UTC m=+439.904004143" lastFinishedPulling="2026-03-14 07:06:19.66110763 +0000 UTC m=+442.413022824" observedRunningTime="2026-03-14 07:06:20.254626565 +0000 UTC m=+443.006541749" watchObservedRunningTime="2026-03-14 07:06:20.255499699 +0000 UTC m=+443.007414883" Mar 14 07:06:23 crc kubenswrapper[5129]: I0314 07:06:23.275564 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-strv7" Mar 14 07:06:23 crc kubenswrapper[5129]: I0314 07:06:23.276161 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-strv7" Mar 14 07:06:23 crc kubenswrapper[5129]: I0314 07:06:23.325432 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-strv7" Mar 14 07:06:23 crc kubenswrapper[5129]: I0314 07:06:23.488204 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gkxcz" Mar 14 07:06:23 crc kubenswrapper[5129]: I0314 07:06:23.488727 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gkxcz" Mar 14 07:06:23 crc kubenswrapper[5129]: I0314 07:06:23.539990 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gkxcz" Mar 14 07:06:24 crc kubenswrapper[5129]: I0314 07:06:24.273718 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-strv7" Mar 14 07:06:24 crc 
kubenswrapper[5129]: I0314 07:06:24.281202 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gkxcz" Mar 14 07:06:25 crc kubenswrapper[5129]: I0314 07:06:25.873228 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mc684" Mar 14 07:06:25 crc kubenswrapper[5129]: I0314 07:06:25.873308 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mc684" Mar 14 07:06:25 crc kubenswrapper[5129]: I0314 07:06:25.934133 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mc684" Mar 14 07:06:26 crc kubenswrapper[5129]: I0314 07:06:26.102616 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-86hcf" Mar 14 07:06:26 crc kubenswrapper[5129]: I0314 07:06:26.102699 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-86hcf" Mar 14 07:06:26 crc kubenswrapper[5129]: I0314 07:06:26.283878 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mc684" Mar 14 07:06:27 crc kubenswrapper[5129]: I0314 07:06:27.162339 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-86hcf" podUID="65592ebe-e824-4d7d-9385-9073d54404e0" containerName="registry-server" probeResult="failure" output=< Mar 14 07:06:27 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 07:06:27 crc kubenswrapper[5129]: > Mar 14 07:06:36 crc kubenswrapper[5129]: I0314 07:06:36.157229 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-86hcf" Mar 14 07:06:36 crc kubenswrapper[5129]: I0314 07:06:36.227721 5129 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-86hcf" Mar 14 07:06:38 crc kubenswrapper[5129]: I0314 07:06:38.702816 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" podUID="f2a7f356-6278-409f-9047-efece8492b78" containerName="registry" containerID="cri-o://c9f3d23a6bb019521ce5292a5588e9d5ea51da810bfbad23c817ffde6d64edcb" gracePeriod=30 Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.035920 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.136338 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2a7f356-6278-409f-9047-efece8492b78-registry-tls\") pod \"f2a7f356-6278-409f-9047-efece8492b78\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.136523 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f2a7f356-6278-409f-9047-efece8492b78\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.136567 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2a7f356-6278-409f-9047-efece8492b78-ca-trust-extracted\") pod \"f2a7f356-6278-409f-9047-efece8492b78\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.136967 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/f2a7f356-6278-409f-9047-efece8492b78-registry-certificates\") pod \"f2a7f356-6278-409f-9047-efece8492b78\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.138873 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a7f356-6278-409f-9047-efece8492b78-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f2a7f356-6278-409f-9047-efece8492b78" (UID: "f2a7f356-6278-409f-9047-efece8492b78"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.145793 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2a7f356-6278-409f-9047-efece8492b78-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f2a7f356-6278-409f-9047-efece8492b78" (UID: "f2a7f356-6278-409f-9047-efece8492b78"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.147785 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f2a7f356-6278-409f-9047-efece8492b78-installation-pull-secrets\") pod \"f2a7f356-6278-409f-9047-efece8492b78\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.147903 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2a7f356-6278-409f-9047-efece8492b78-bound-sa-token\") pod \"f2a7f356-6278-409f-9047-efece8492b78\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.148027 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5hm2\" (UniqueName: \"kubernetes.io/projected/f2a7f356-6278-409f-9047-efece8492b78-kube-api-access-g5hm2\") pod \"f2a7f356-6278-409f-9047-efece8492b78\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.148090 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2a7f356-6278-409f-9047-efece8492b78-trusted-ca\") pod \"f2a7f356-6278-409f-9047-efece8492b78\" (UID: \"f2a7f356-6278-409f-9047-efece8492b78\") " Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.149131 5129 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f2a7f356-6278-409f-9047-efece8492b78-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.149163 5129 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/f2a7f356-6278-409f-9047-efece8492b78-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.149751 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a7f356-6278-409f-9047-efece8492b78-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f2a7f356-6278-409f-9047-efece8492b78" (UID: "f2a7f356-6278-409f-9047-efece8492b78"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.153445 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2a7f356-6278-409f-9047-efece8492b78-kube-api-access-g5hm2" (OuterVolumeSpecName: "kube-api-access-g5hm2") pod "f2a7f356-6278-409f-9047-efece8492b78" (UID: "f2a7f356-6278-409f-9047-efece8492b78"). InnerVolumeSpecName "kube-api-access-g5hm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.154160 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2a7f356-6278-409f-9047-efece8492b78-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f2a7f356-6278-409f-9047-efece8492b78" (UID: "f2a7f356-6278-409f-9047-efece8492b78"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.154267 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2a7f356-6278-409f-9047-efece8492b78-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f2a7f356-6278-409f-9047-efece8492b78" (UID: "f2a7f356-6278-409f-9047-efece8492b78"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.162324 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f2a7f356-6278-409f-9047-efece8492b78" (UID: "f2a7f356-6278-409f-9047-efece8492b78"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.177360 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2a7f356-6278-409f-9047-efece8492b78-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f2a7f356-6278-409f-9047-efece8492b78" (UID: "f2a7f356-6278-409f-9047-efece8492b78"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.250568 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5hm2\" (UniqueName: \"kubernetes.io/projected/f2a7f356-6278-409f-9047-efece8492b78-kube-api-access-g5hm2\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.250621 5129 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2a7f356-6278-409f-9047-efece8492b78-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.250633 5129 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2a7f356-6278-409f-9047-efece8492b78-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.250642 5129 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/f2a7f356-6278-409f-9047-efece8492b78-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.250650 5129 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2a7f356-6278-409f-9047-efece8492b78-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.313665 5129 generic.go:334] "Generic (PLEG): container finished" podID="f2a7f356-6278-409f-9047-efece8492b78" containerID="c9f3d23a6bb019521ce5292a5588e9d5ea51da810bfbad23c817ffde6d64edcb" exitCode=0 Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.313705 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" event={"ID":"f2a7f356-6278-409f-9047-efece8492b78","Type":"ContainerDied","Data":"c9f3d23a6bb019521ce5292a5588e9d5ea51da810bfbad23c817ffde6d64edcb"} Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.313729 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" event={"ID":"f2a7f356-6278-409f-9047-efece8492b78","Type":"ContainerDied","Data":"f45387b275c6d90c9e116ea809f7d9dc61917156838c02d6e66cf19e84efe4f1"} Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.313744 5129 scope.go:117] "RemoveContainer" containerID="c9f3d23a6bb019521ce5292a5588e9d5ea51da810bfbad23c817ffde6d64edcb" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.313836 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pggtq" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.342014 5129 scope.go:117] "RemoveContainer" containerID="c9f3d23a6bb019521ce5292a5588e9d5ea51da810bfbad23c817ffde6d64edcb" Mar 14 07:06:39 crc kubenswrapper[5129]: E0314 07:06:39.343085 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9f3d23a6bb019521ce5292a5588e9d5ea51da810bfbad23c817ffde6d64edcb\": container with ID starting with c9f3d23a6bb019521ce5292a5588e9d5ea51da810bfbad23c817ffde6d64edcb not found: ID does not exist" containerID="c9f3d23a6bb019521ce5292a5588e9d5ea51da810bfbad23c817ffde6d64edcb" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.343118 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9f3d23a6bb019521ce5292a5588e9d5ea51da810bfbad23c817ffde6d64edcb"} err="failed to get container status \"c9f3d23a6bb019521ce5292a5588e9d5ea51da810bfbad23c817ffde6d64edcb\": rpc error: code = NotFound desc = could not find container \"c9f3d23a6bb019521ce5292a5588e9d5ea51da810bfbad23c817ffde6d64edcb\": container with ID starting with c9f3d23a6bb019521ce5292a5588e9d5ea51da810bfbad23c817ffde6d64edcb not found: ID does not exist" Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.355679 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pggtq"] Mar 14 07:06:39 crc kubenswrapper[5129]: I0314 07:06:39.361373 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pggtq"] Mar 14 07:06:40 crc kubenswrapper[5129]: I0314 07:06:40.045252 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2a7f356-6278-409f-9047-efece8492b78" path="/var/lib/kubelet/pods/f2a7f356-6278-409f-9047-efece8492b78/volumes" Mar 14 07:08:00 crc kubenswrapper[5129]: I0314 
07:08:00.152245 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557868-2v7s2"] Mar 14 07:08:00 crc kubenswrapper[5129]: E0314 07:08:00.153554 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a7f356-6278-409f-9047-efece8492b78" containerName="registry" Mar 14 07:08:00 crc kubenswrapper[5129]: I0314 07:08:00.153578 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a7f356-6278-409f-9047-efece8492b78" containerName="registry" Mar 14 07:08:00 crc kubenswrapper[5129]: I0314 07:08:00.154017 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a7f356-6278-409f-9047-efece8492b78" containerName="registry" Mar 14 07:08:00 crc kubenswrapper[5129]: I0314 07:08:00.155403 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557868-2v7s2" Mar 14 07:08:00 crc kubenswrapper[5129]: I0314 07:08:00.157958 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:08:00 crc kubenswrapper[5129]: I0314 07:08:00.158631 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:08:00 crc kubenswrapper[5129]: I0314 07:08:00.161720 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:08:00 crc kubenswrapper[5129]: I0314 07:08:00.163745 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557868-2v7s2"] Mar 14 07:08:00 crc kubenswrapper[5129]: I0314 07:08:00.333553 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shw5p\" (UniqueName: \"kubernetes.io/projected/24fb32a0-e29c-4208-996e-e31ca881707c-kube-api-access-shw5p\") pod \"auto-csr-approver-29557868-2v7s2\" (UID: \"24fb32a0-e29c-4208-996e-e31ca881707c\") " 
pod="openshift-infra/auto-csr-approver-29557868-2v7s2" Mar 14 07:08:00 crc kubenswrapper[5129]: I0314 07:08:00.435683 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shw5p\" (UniqueName: \"kubernetes.io/projected/24fb32a0-e29c-4208-996e-e31ca881707c-kube-api-access-shw5p\") pod \"auto-csr-approver-29557868-2v7s2\" (UID: \"24fb32a0-e29c-4208-996e-e31ca881707c\") " pod="openshift-infra/auto-csr-approver-29557868-2v7s2" Mar 14 07:08:00 crc kubenswrapper[5129]: I0314 07:08:00.456108 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shw5p\" (UniqueName: \"kubernetes.io/projected/24fb32a0-e29c-4208-996e-e31ca881707c-kube-api-access-shw5p\") pod \"auto-csr-approver-29557868-2v7s2\" (UID: \"24fb32a0-e29c-4208-996e-e31ca881707c\") " pod="openshift-infra/auto-csr-approver-29557868-2v7s2" Mar 14 07:08:00 crc kubenswrapper[5129]: I0314 07:08:00.492201 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557868-2v7s2" Mar 14 07:08:00 crc kubenswrapper[5129]: I0314 07:08:00.721237 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557868-2v7s2"] Mar 14 07:08:00 crc kubenswrapper[5129]: I0314 07:08:00.725061 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:08:00 crc kubenswrapper[5129]: I0314 07:08:00.828217 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557868-2v7s2" event={"ID":"24fb32a0-e29c-4208-996e-e31ca881707c","Type":"ContainerStarted","Data":"a7ca8bab8a003eecce41ec2e5a2578705f8030546bd4bb052ecc4f690ee8e404"} Mar 14 07:08:02 crc kubenswrapper[5129]: I0314 07:08:02.844716 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557868-2v7s2" 
event={"ID":"24fb32a0-e29c-4208-996e-e31ca881707c","Type":"ContainerStarted","Data":"1a2c100ad6dabd2265e35ecd96f4d1279521b360349c2dfa6b8e21ae92a0a931"} Mar 14 07:08:02 crc kubenswrapper[5129]: I0314 07:08:02.864476 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557868-2v7s2" podStartSLOduration=1.210104952 podStartE2EDuration="2.864446087s" podCreationTimestamp="2026-03-14 07:08:00 +0000 UTC" firstStartedPulling="2026-03-14 07:08:00.724813381 +0000 UTC m=+543.476728575" lastFinishedPulling="2026-03-14 07:08:02.379154526 +0000 UTC m=+545.131069710" observedRunningTime="2026-03-14 07:08:02.859530282 +0000 UTC m=+545.611445486" watchObservedRunningTime="2026-03-14 07:08:02.864446087 +0000 UTC m=+545.616361311" Mar 14 07:08:03 crc kubenswrapper[5129]: I0314 07:08:03.851534 5129 generic.go:334] "Generic (PLEG): container finished" podID="24fb32a0-e29c-4208-996e-e31ca881707c" containerID="1a2c100ad6dabd2265e35ecd96f4d1279521b360349c2dfa6b8e21ae92a0a931" exitCode=0 Mar 14 07:08:03 crc kubenswrapper[5129]: I0314 07:08:03.851590 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557868-2v7s2" event={"ID":"24fb32a0-e29c-4208-996e-e31ca881707c","Type":"ContainerDied","Data":"1a2c100ad6dabd2265e35ecd96f4d1279521b360349c2dfa6b8e21ae92a0a931"} Mar 14 07:08:05 crc kubenswrapper[5129]: I0314 07:08:05.108376 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557868-2v7s2" Mar 14 07:08:05 crc kubenswrapper[5129]: I0314 07:08:05.301534 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shw5p\" (UniqueName: \"kubernetes.io/projected/24fb32a0-e29c-4208-996e-e31ca881707c-kube-api-access-shw5p\") pod \"24fb32a0-e29c-4208-996e-e31ca881707c\" (UID: \"24fb32a0-e29c-4208-996e-e31ca881707c\") " Mar 14 07:08:05 crc kubenswrapper[5129]: I0314 07:08:05.307141 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24fb32a0-e29c-4208-996e-e31ca881707c-kube-api-access-shw5p" (OuterVolumeSpecName: "kube-api-access-shw5p") pod "24fb32a0-e29c-4208-996e-e31ca881707c" (UID: "24fb32a0-e29c-4208-996e-e31ca881707c"). InnerVolumeSpecName "kube-api-access-shw5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:08:05 crc kubenswrapper[5129]: I0314 07:08:05.403321 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shw5p\" (UniqueName: \"kubernetes.io/projected/24fb32a0-e29c-4208-996e-e31ca881707c-kube-api-access-shw5p\") on node \"crc\" DevicePath \"\"" Mar 14 07:08:05 crc kubenswrapper[5129]: I0314 07:08:05.865697 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557868-2v7s2" event={"ID":"24fb32a0-e29c-4208-996e-e31ca881707c","Type":"ContainerDied","Data":"a7ca8bab8a003eecce41ec2e5a2578705f8030546bd4bb052ecc4f690ee8e404"} Mar 14 07:08:05 crc kubenswrapper[5129]: I0314 07:08:05.866125 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7ca8bab8a003eecce41ec2e5a2578705f8030546bd4bb052ecc4f690ee8e404" Mar 14 07:08:05 crc kubenswrapper[5129]: I0314 07:08:05.866215 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557868-2v7s2" Mar 14 07:08:05 crc kubenswrapper[5129]: I0314 07:08:05.924549 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557862-lf4bl"] Mar 14 07:08:05 crc kubenswrapper[5129]: I0314 07:08:05.928039 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557862-lf4bl"] Mar 14 07:08:06 crc kubenswrapper[5129]: I0314 07:08:06.042935 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9e49a0a-8f9f-4e78-8098-d195fe3297bd" path="/var/lib/kubelet/pods/e9e49a0a-8f9f-4e78-8098-d195fe3297bd/volumes" Mar 14 07:08:19 crc kubenswrapper[5129]: I0314 07:08:19.574888 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:08:19 crc kubenswrapper[5129]: I0314 07:08:19.575741 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:08:49 crc kubenswrapper[5129]: I0314 07:08:49.574819 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:08:49 crc kubenswrapper[5129]: I0314 07:08:49.575413 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:09:16 crc kubenswrapper[5129]: I0314 07:09:16.293418 5129 scope.go:117] "RemoveContainer" containerID="39befd089a33e2ffb58292b85b60ae481731877a002c54267b4a001c86e775ab" Mar 14 07:09:16 crc kubenswrapper[5129]: I0314 07:09:16.350169 5129 scope.go:117] "RemoveContainer" containerID="87266b4f7ae6ab6042ccacf2ec3620a6d8f14e014765fb67e0613aa06516218f" Mar 14 07:09:16 crc kubenswrapper[5129]: I0314 07:09:16.378569 5129 scope.go:117] "RemoveContainer" containerID="b90b47b68c2587d6749664ae94416e4120c02f08e5be7051331040d949e2086b" Mar 14 07:09:19 crc kubenswrapper[5129]: I0314 07:09:19.574192 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:09:19 crc kubenswrapper[5129]: I0314 07:09:19.574692 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:09:19 crc kubenswrapper[5129]: I0314 07:09:19.574788 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:09:19 crc kubenswrapper[5129]: I0314 07:09:19.575877 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad9a3d8e10cfbb5f940ec24c74e985983e64c59e08753d0434f8b007988ceac6"} 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:09:19 crc kubenswrapper[5129]: I0314 07:09:19.576048 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://ad9a3d8e10cfbb5f940ec24c74e985983e64c59e08753d0434f8b007988ceac6" gracePeriod=600 Mar 14 07:09:20 crc kubenswrapper[5129]: I0314 07:09:20.345211 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="ad9a3d8e10cfbb5f940ec24c74e985983e64c59e08753d0434f8b007988ceac6" exitCode=0 Mar 14 07:09:20 crc kubenswrapper[5129]: I0314 07:09:20.345337 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"ad9a3d8e10cfbb5f940ec24c74e985983e64c59e08753d0434f8b007988ceac6"} Mar 14 07:09:20 crc kubenswrapper[5129]: I0314 07:09:20.345658 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"f77555a687b65978c9c2c4dd7bac28e7a1c8bc7c2438e2e0e0556ac3023a77b8"} Mar 14 07:09:20 crc kubenswrapper[5129]: I0314 07:09:20.345695 5129 scope.go:117] "RemoveContainer" containerID="530282bf4b5fced32ff8a4f896195aed87fef0330a342177990023776d5d6856" Mar 14 07:10:00 crc kubenswrapper[5129]: I0314 07:10:00.144555 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557870-d86vd"] Mar 14 07:10:00 crc kubenswrapper[5129]: E0314 07:10:00.145448 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="24fb32a0-e29c-4208-996e-e31ca881707c" containerName="oc" Mar 14 07:10:00 crc kubenswrapper[5129]: I0314 07:10:00.145465 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="24fb32a0-e29c-4208-996e-e31ca881707c" containerName="oc" Mar 14 07:10:00 crc kubenswrapper[5129]: I0314 07:10:00.145685 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="24fb32a0-e29c-4208-996e-e31ca881707c" containerName="oc" Mar 14 07:10:00 crc kubenswrapper[5129]: I0314 07:10:00.146097 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557870-d86vd" Mar 14 07:10:00 crc kubenswrapper[5129]: I0314 07:10:00.149577 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:10:00 crc kubenswrapper[5129]: I0314 07:10:00.149837 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:10:00 crc kubenswrapper[5129]: I0314 07:10:00.150974 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:10:00 crc kubenswrapper[5129]: I0314 07:10:00.154181 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557870-d86vd"] Mar 14 07:10:00 crc kubenswrapper[5129]: I0314 07:10:00.283156 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw656\" (UniqueName: \"kubernetes.io/projected/1b7ec990-df65-411a-8d53-345e03dd77e1-kube-api-access-zw656\") pod \"auto-csr-approver-29557870-d86vd\" (UID: \"1b7ec990-df65-411a-8d53-345e03dd77e1\") " pod="openshift-infra/auto-csr-approver-29557870-d86vd" Mar 14 07:10:00 crc kubenswrapper[5129]: I0314 07:10:00.384630 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw656\" (UniqueName: 
\"kubernetes.io/projected/1b7ec990-df65-411a-8d53-345e03dd77e1-kube-api-access-zw656\") pod \"auto-csr-approver-29557870-d86vd\" (UID: \"1b7ec990-df65-411a-8d53-345e03dd77e1\") " pod="openshift-infra/auto-csr-approver-29557870-d86vd" Mar 14 07:10:00 crc kubenswrapper[5129]: I0314 07:10:00.405659 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw656\" (UniqueName: \"kubernetes.io/projected/1b7ec990-df65-411a-8d53-345e03dd77e1-kube-api-access-zw656\") pod \"auto-csr-approver-29557870-d86vd\" (UID: \"1b7ec990-df65-411a-8d53-345e03dd77e1\") " pod="openshift-infra/auto-csr-approver-29557870-d86vd" Mar 14 07:10:00 crc kubenswrapper[5129]: I0314 07:10:00.473695 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557870-d86vd" Mar 14 07:10:00 crc kubenswrapper[5129]: W0314 07:10:00.673077 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b7ec990_df65_411a_8d53_345e03dd77e1.slice/crio-b73348e1d03deaf1090011de6d3e6d21be8688492f463f244a61dc4f2d9490ce WatchSource:0}: Error finding container b73348e1d03deaf1090011de6d3e6d21be8688492f463f244a61dc4f2d9490ce: Status 404 returned error can't find the container with id b73348e1d03deaf1090011de6d3e6d21be8688492f463f244a61dc4f2d9490ce Mar 14 07:10:00 crc kubenswrapper[5129]: I0314 07:10:00.674059 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557870-d86vd"] Mar 14 07:10:01 crc kubenswrapper[5129]: I0314 07:10:01.678317 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557870-d86vd" event={"ID":"1b7ec990-df65-411a-8d53-345e03dd77e1","Type":"ContainerStarted","Data":"b73348e1d03deaf1090011de6d3e6d21be8688492f463f244a61dc4f2d9490ce"} Mar 14 07:10:02 crc kubenswrapper[5129]: I0314 07:10:02.686472 5129 generic.go:334] "Generic (PLEG): container finished" 
podID="1b7ec990-df65-411a-8d53-345e03dd77e1" containerID="844a61e51b46cb765cedf8df29da5e9c329099b479aa6c82d2579d7d9734574d" exitCode=0 Mar 14 07:10:02 crc kubenswrapper[5129]: I0314 07:10:02.686531 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557870-d86vd" event={"ID":"1b7ec990-df65-411a-8d53-345e03dd77e1","Type":"ContainerDied","Data":"844a61e51b46cb765cedf8df29da5e9c329099b479aa6c82d2579d7d9734574d"} Mar 14 07:10:03 crc kubenswrapper[5129]: I0314 07:10:03.922010 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557870-d86vd" Mar 14 07:10:04 crc kubenswrapper[5129]: I0314 07:10:04.032037 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw656\" (UniqueName: \"kubernetes.io/projected/1b7ec990-df65-411a-8d53-345e03dd77e1-kube-api-access-zw656\") pod \"1b7ec990-df65-411a-8d53-345e03dd77e1\" (UID: \"1b7ec990-df65-411a-8d53-345e03dd77e1\") " Mar 14 07:10:04 crc kubenswrapper[5129]: I0314 07:10:04.041577 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b7ec990-df65-411a-8d53-345e03dd77e1-kube-api-access-zw656" (OuterVolumeSpecName: "kube-api-access-zw656") pod "1b7ec990-df65-411a-8d53-345e03dd77e1" (UID: "1b7ec990-df65-411a-8d53-345e03dd77e1"). InnerVolumeSpecName "kube-api-access-zw656". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:10:04 crc kubenswrapper[5129]: I0314 07:10:04.133593 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw656\" (UniqueName: \"kubernetes.io/projected/1b7ec990-df65-411a-8d53-345e03dd77e1-kube-api-access-zw656\") on node \"crc\" DevicePath \"\"" Mar 14 07:10:04 crc kubenswrapper[5129]: I0314 07:10:04.715506 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557870-d86vd" event={"ID":"1b7ec990-df65-411a-8d53-345e03dd77e1","Type":"ContainerDied","Data":"b73348e1d03deaf1090011de6d3e6d21be8688492f463f244a61dc4f2d9490ce"} Mar 14 07:10:04 crc kubenswrapper[5129]: I0314 07:10:04.715543 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b73348e1d03deaf1090011de6d3e6d21be8688492f463f244a61dc4f2d9490ce" Mar 14 07:10:04 crc kubenswrapper[5129]: I0314 07:10:04.715550 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557870-d86vd" Mar 14 07:10:04 crc kubenswrapper[5129]: I0314 07:10:04.986379 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557864-r9tzv"] Mar 14 07:10:04 crc kubenswrapper[5129]: I0314 07:10:04.992712 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557864-r9tzv"] Mar 14 07:10:06 crc kubenswrapper[5129]: I0314 07:10:06.048722 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6849ef7d-3a7f-4c40-8375-c7d651a1985a" path="/var/lib/kubelet/pods/6849ef7d-3a7f-4c40-8375-c7d651a1985a/volumes" Mar 14 07:10:16 crc kubenswrapper[5129]: I0314 07:10:16.432257 5129 scope.go:117] "RemoveContainer" containerID="434b988f32830f00e4165efb189d6ae323a7c22c17bf328f3a88da6c9585491d" Mar 14 07:10:16 crc kubenswrapper[5129]: I0314 07:10:16.454281 5129 scope.go:117] "RemoveContainer" 
containerID="de8341d36f6ccaf1ef2a567fe07185ce78a2d44f28b03de376f6436e1c9387ad" Mar 14 07:10:16 crc kubenswrapper[5129]: I0314 07:10:16.476215 5129 scope.go:117] "RemoveContainer" containerID="c67d27b7fea88d43560bb8c501366312262ffcfe8710eb9aadad7262db75a441" Mar 14 07:10:16 crc kubenswrapper[5129]: I0314 07:10:16.513138 5129 scope.go:117] "RemoveContainer" containerID="c5fda5f254298e5b858f8568449e0f20230cc86c9afe9a1eca8b8f730108ce30" Mar 14 07:11:16 crc kubenswrapper[5129]: I0314 07:11:16.578631 5129 scope.go:117] "RemoveContainer" containerID="726bfdfc7fa9f776dc2d8cc6428beb7695bd1cabf010b53e28d67885a8ff10c5" Mar 14 07:11:19 crc kubenswrapper[5129]: I0314 07:11:19.574748 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:11:19 crc kubenswrapper[5129]: I0314 07:11:19.575165 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:11:49 crc kubenswrapper[5129]: I0314 07:11:49.574352 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:11:50 crc kubenswrapper[5129]: I0314 07:11:49.575075 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:12:00 crc kubenswrapper[5129]: I0314 07:12:00.147957 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557872-xpltw"] Mar 14 07:12:00 crc kubenswrapper[5129]: E0314 07:12:00.149368 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b7ec990-df65-411a-8d53-345e03dd77e1" containerName="oc" Mar 14 07:12:00 crc kubenswrapper[5129]: I0314 07:12:00.149396 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b7ec990-df65-411a-8d53-345e03dd77e1" containerName="oc" Mar 14 07:12:00 crc kubenswrapper[5129]: I0314 07:12:00.149702 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b7ec990-df65-411a-8d53-345e03dd77e1" containerName="oc" Mar 14 07:12:00 crc kubenswrapper[5129]: I0314 07:12:00.150260 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557872-xpltw" Mar 14 07:12:00 crc kubenswrapper[5129]: I0314 07:12:00.153400 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:12:00 crc kubenswrapper[5129]: I0314 07:12:00.158703 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:12:00 crc kubenswrapper[5129]: I0314 07:12:00.161075 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:12:00 crc kubenswrapper[5129]: I0314 07:12:00.166452 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557872-xpltw"] Mar 14 07:12:00 crc kubenswrapper[5129]: I0314 07:12:00.190537 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qh8n\" (UniqueName: 
\"kubernetes.io/projected/f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f-kube-api-access-2qh8n\") pod \"auto-csr-approver-29557872-xpltw\" (UID: \"f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f\") " pod="openshift-infra/auto-csr-approver-29557872-xpltw" Mar 14 07:12:00 crc kubenswrapper[5129]: I0314 07:12:00.291357 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qh8n\" (UniqueName: \"kubernetes.io/projected/f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f-kube-api-access-2qh8n\") pod \"auto-csr-approver-29557872-xpltw\" (UID: \"f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f\") " pod="openshift-infra/auto-csr-approver-29557872-xpltw" Mar 14 07:12:00 crc kubenswrapper[5129]: I0314 07:12:00.314515 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qh8n\" (UniqueName: \"kubernetes.io/projected/f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f-kube-api-access-2qh8n\") pod \"auto-csr-approver-29557872-xpltw\" (UID: \"f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f\") " pod="openshift-infra/auto-csr-approver-29557872-xpltw" Mar 14 07:12:00 crc kubenswrapper[5129]: I0314 07:12:00.490824 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557872-xpltw" Mar 14 07:12:00 crc kubenswrapper[5129]: I0314 07:12:00.686555 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557872-xpltw"] Mar 14 07:12:01 crc kubenswrapper[5129]: I0314 07:12:01.441545 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557872-xpltw" event={"ID":"f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f","Type":"ContainerStarted","Data":"72c1f4b24755c6c35ea1319cfb44f710afcaac2e197506f5f1f476c3252ac542"} Mar 14 07:12:02 crc kubenswrapper[5129]: I0314 07:12:02.453140 5129 generic.go:334] "Generic (PLEG): container finished" podID="f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f" containerID="aa2ba740349323917bb8b1a9dd0ddf369ac27b9f3c1945d011313c578b1e009f" exitCode=0 Mar 14 07:12:02 crc kubenswrapper[5129]: I0314 07:12:02.453508 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557872-xpltw" event={"ID":"f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f","Type":"ContainerDied","Data":"aa2ba740349323917bb8b1a9dd0ddf369ac27b9f3c1945d011313c578b1e009f"} Mar 14 07:12:03 crc kubenswrapper[5129]: I0314 07:12:03.726561 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557872-xpltw" Mar 14 07:12:03 crc kubenswrapper[5129]: I0314 07:12:03.837064 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qh8n\" (UniqueName: \"kubernetes.io/projected/f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f-kube-api-access-2qh8n\") pod \"f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f\" (UID: \"f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f\") " Mar 14 07:12:03 crc kubenswrapper[5129]: I0314 07:12:03.845105 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f-kube-api-access-2qh8n" (OuterVolumeSpecName: "kube-api-access-2qh8n") pod "f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f" (UID: "f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f"). InnerVolumeSpecName "kube-api-access-2qh8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:12:03 crc kubenswrapper[5129]: I0314 07:12:03.938551 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qh8n\" (UniqueName: \"kubernetes.io/projected/f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f-kube-api-access-2qh8n\") on node \"crc\" DevicePath \"\"" Mar 14 07:12:04 crc kubenswrapper[5129]: I0314 07:12:04.469191 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557872-xpltw" event={"ID":"f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f","Type":"ContainerDied","Data":"72c1f4b24755c6c35ea1319cfb44f710afcaac2e197506f5f1f476c3252ac542"} Mar 14 07:12:04 crc kubenswrapper[5129]: I0314 07:12:04.469245 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72c1f4b24755c6c35ea1319cfb44f710afcaac2e197506f5f1f476c3252ac542" Mar 14 07:12:04 crc kubenswrapper[5129]: I0314 07:12:04.469279 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557872-xpltw" Mar 14 07:12:04 crc kubenswrapper[5129]: I0314 07:12:04.810639 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557866-9nbbm"] Mar 14 07:12:04 crc kubenswrapper[5129]: I0314 07:12:04.816723 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557866-9nbbm"] Mar 14 07:12:06 crc kubenswrapper[5129]: I0314 07:12:06.048543 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70ee55e8-0691-4e28-ba66-cefd04b1a8f2" path="/var/lib/kubelet/pods/70ee55e8-0691-4e28-ba66-cefd04b1a8f2/volumes" Mar 14 07:12:16 crc kubenswrapper[5129]: I0314 07:12:16.649708 5129 scope.go:117] "RemoveContainer" containerID="fd4462ed591782bf9325564d88930212ab0327e76305e17016fb6f082e5d439e" Mar 14 07:12:19 crc kubenswrapper[5129]: I0314 07:12:19.574229 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:12:19 crc kubenswrapper[5129]: I0314 07:12:19.574750 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:12:19 crc kubenswrapper[5129]: I0314 07:12:19.574819 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:12:19 crc kubenswrapper[5129]: I0314 07:12:19.575694 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"f77555a687b65978c9c2c4dd7bac28e7a1c8bc7c2438e2e0e0556ac3023a77b8"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:12:19 crc kubenswrapper[5129]: I0314 07:12:19.575829 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://f77555a687b65978c9c2c4dd7bac28e7a1c8bc7c2438e2e0e0556ac3023a77b8" gracePeriod=600 Mar 14 07:12:20 crc kubenswrapper[5129]: I0314 07:12:20.579594 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="f77555a687b65978c9c2c4dd7bac28e7a1c8bc7c2438e2e0e0556ac3023a77b8" exitCode=0 Mar 14 07:12:20 crc kubenswrapper[5129]: I0314 07:12:20.579684 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"f77555a687b65978c9c2c4dd7bac28e7a1c8bc7c2438e2e0e0556ac3023a77b8"} Mar 14 07:12:20 crc kubenswrapper[5129]: I0314 07:12:20.580323 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"8b4a31888cd52c24930717574880110e2ffda0608e5edc1431b7914f81c314fb"} Mar 14 07:12:20 crc kubenswrapper[5129]: I0314 07:12:20.580378 5129 scope.go:117] "RemoveContainer" containerID="ad9a3d8e10cfbb5f940ec24c74e985983e64c59e08753d0434f8b007988ceac6" Mar 14 07:12:37 crc kubenswrapper[5129]: I0314 07:12:37.669896 5129 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 14 07:14:00 crc 
kubenswrapper[5129]: I0314 07:14:00.135537 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557874-pdjsp"] Mar 14 07:14:00 crc kubenswrapper[5129]: E0314 07:14:00.136506 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f" containerName="oc" Mar 14 07:14:00 crc kubenswrapper[5129]: I0314 07:14:00.136528 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f" containerName="oc" Mar 14 07:14:00 crc kubenswrapper[5129]: I0314 07:14:00.136740 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f" containerName="oc" Mar 14 07:14:00 crc kubenswrapper[5129]: I0314 07:14:00.137308 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557874-pdjsp" Mar 14 07:14:00 crc kubenswrapper[5129]: I0314 07:14:00.139781 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:14:00 crc kubenswrapper[5129]: I0314 07:14:00.139827 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:14:00 crc kubenswrapper[5129]: I0314 07:14:00.143860 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:14:00 crc kubenswrapper[5129]: I0314 07:14:00.145262 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557874-pdjsp"] Mar 14 07:14:00 crc kubenswrapper[5129]: I0314 07:14:00.192246 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch84s\" (UniqueName: \"kubernetes.io/projected/0b5741c7-590a-44e3-bdf6-b71f5f7aec49-kube-api-access-ch84s\") pod \"auto-csr-approver-29557874-pdjsp\" (UID: \"0b5741c7-590a-44e3-bdf6-b71f5f7aec49\") " 
pod="openshift-infra/auto-csr-approver-29557874-pdjsp" Mar 14 07:14:00 crc kubenswrapper[5129]: I0314 07:14:00.294662 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch84s\" (UniqueName: \"kubernetes.io/projected/0b5741c7-590a-44e3-bdf6-b71f5f7aec49-kube-api-access-ch84s\") pod \"auto-csr-approver-29557874-pdjsp\" (UID: \"0b5741c7-590a-44e3-bdf6-b71f5f7aec49\") " pod="openshift-infra/auto-csr-approver-29557874-pdjsp" Mar 14 07:14:00 crc kubenswrapper[5129]: I0314 07:14:00.320277 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch84s\" (UniqueName: \"kubernetes.io/projected/0b5741c7-590a-44e3-bdf6-b71f5f7aec49-kube-api-access-ch84s\") pod \"auto-csr-approver-29557874-pdjsp\" (UID: \"0b5741c7-590a-44e3-bdf6-b71f5f7aec49\") " pod="openshift-infra/auto-csr-approver-29557874-pdjsp" Mar 14 07:14:00 crc kubenswrapper[5129]: I0314 07:14:00.458302 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557874-pdjsp" Mar 14 07:14:00 crc kubenswrapper[5129]: I0314 07:14:00.644301 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557874-pdjsp"] Mar 14 07:14:00 crc kubenswrapper[5129]: I0314 07:14:00.657592 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:14:01 crc kubenswrapper[5129]: I0314 07:14:01.155525 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557874-pdjsp" event={"ID":"0b5741c7-590a-44e3-bdf6-b71f5f7aec49","Type":"ContainerStarted","Data":"90f56c2a7c9752b3485539d3bd4f6b48f1fb06b915745b3c32c18d08121199b9"} Mar 14 07:14:02 crc kubenswrapper[5129]: I0314 07:14:02.163347 5129 generic.go:334] "Generic (PLEG): container finished" podID="0b5741c7-590a-44e3-bdf6-b71f5f7aec49" containerID="2095f9a78fa9f5125a06244efe3139c7f3d1b406945f084663cb839e0e9cdadc" exitCode=0 Mar 
14 07:14:02 crc kubenswrapper[5129]: I0314 07:14:02.163587 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557874-pdjsp" event={"ID":"0b5741c7-590a-44e3-bdf6-b71f5f7aec49","Type":"ContainerDied","Data":"2095f9a78fa9f5125a06244efe3139c7f3d1b406945f084663cb839e0e9cdadc"} Mar 14 07:14:03 crc kubenswrapper[5129]: I0314 07:14:03.383858 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557874-pdjsp" Mar 14 07:14:03 crc kubenswrapper[5129]: I0314 07:14:03.428265 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch84s\" (UniqueName: \"kubernetes.io/projected/0b5741c7-590a-44e3-bdf6-b71f5f7aec49-kube-api-access-ch84s\") pod \"0b5741c7-590a-44e3-bdf6-b71f5f7aec49\" (UID: \"0b5741c7-590a-44e3-bdf6-b71f5f7aec49\") " Mar 14 07:14:03 crc kubenswrapper[5129]: I0314 07:14:03.435159 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b5741c7-590a-44e3-bdf6-b71f5f7aec49-kube-api-access-ch84s" (OuterVolumeSpecName: "kube-api-access-ch84s") pod "0b5741c7-590a-44e3-bdf6-b71f5f7aec49" (UID: "0b5741c7-590a-44e3-bdf6-b71f5f7aec49"). InnerVolumeSpecName "kube-api-access-ch84s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:14:03 crc kubenswrapper[5129]: I0314 07:14:03.530260 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch84s\" (UniqueName: \"kubernetes.io/projected/0b5741c7-590a-44e3-bdf6-b71f5f7aec49-kube-api-access-ch84s\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:04 crc kubenswrapper[5129]: I0314 07:14:04.174011 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557874-pdjsp" event={"ID":"0b5741c7-590a-44e3-bdf6-b71f5f7aec49","Type":"ContainerDied","Data":"90f56c2a7c9752b3485539d3bd4f6b48f1fb06b915745b3c32c18d08121199b9"} Mar 14 07:14:04 crc kubenswrapper[5129]: I0314 07:14:04.174361 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90f56c2a7c9752b3485539d3bd4f6b48f1fb06b915745b3c32c18d08121199b9" Mar 14 07:14:04 crc kubenswrapper[5129]: I0314 07:14:04.174068 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557874-pdjsp" Mar 14 07:14:04 crc kubenswrapper[5129]: I0314 07:14:04.444460 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557868-2v7s2"] Mar 14 07:14:04 crc kubenswrapper[5129]: I0314 07:14:04.447490 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557868-2v7s2"] Mar 14 07:14:06 crc kubenswrapper[5129]: I0314 07:14:06.050861 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24fb32a0-e29c-4208-996e-e31ca881707c" path="/var/lib/kubelet/pods/24fb32a0-e29c-4208-996e-e31ca881707c/volumes" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.354793 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7hfdh"] Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.355882 5129 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovn-controller" containerID="cri-o://8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743" gracePeriod=30 Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.355992 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="sbdb" containerID="cri-o://060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7" gracePeriod=30 Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.356029 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="northd" containerID="cri-o://5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48" gracePeriod=30 Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.356078 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="kube-rbac-proxy-node" containerID="cri-o://0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026" gracePeriod=30 Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.355964 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="nbdb" containerID="cri-o://bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2" gracePeriod=30 Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.356087 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovn-acl-logging" 
containerID="cri-o://71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d" gracePeriod=30 Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.356222 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97" gracePeriod=30 Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.422399 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovnkube-controller" containerID="cri-o://b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0" gracePeriod=30 Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.723896 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovnkube-controller/3.log" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.727352 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovn-acl-logging/0.log" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.728245 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovn-controller/0.log" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.728761 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.786744 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-69gmr"] Mar 14 07:14:10 crc kubenswrapper[5129]: E0314 07:14:10.787119 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovnkube-controller" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.787142 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovnkube-controller" Mar 14 07:14:10 crc kubenswrapper[5129]: E0314 07:14:10.787156 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b5741c7-590a-44e3-bdf6-b71f5f7aec49" containerName="oc" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.787165 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b5741c7-590a-44e3-bdf6-b71f5f7aec49" containerName="oc" Mar 14 07:14:10 crc kubenswrapper[5129]: E0314 07:14:10.787178 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovnkube-controller" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.787188 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovnkube-controller" Mar 14 07:14:10 crc kubenswrapper[5129]: E0314 07:14:10.787205 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="kube-rbac-proxy-node" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.787214 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="kube-rbac-proxy-node" Mar 14 07:14:10 crc kubenswrapper[5129]: E0314 07:14:10.787228 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" 
containerName="nbdb" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.787240 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="nbdb" Mar 14 07:14:10 crc kubenswrapper[5129]: E0314 07:14:10.787254 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="sbdb" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.787263 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="sbdb" Mar 14 07:14:10 crc kubenswrapper[5129]: E0314 07:14:10.787275 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="kubecfg-setup" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.787284 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="kubecfg-setup" Mar 14 07:14:10 crc kubenswrapper[5129]: E0314 07:14:10.787293 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovn-acl-logging" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.787304 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovn-acl-logging" Mar 14 07:14:10 crc kubenswrapper[5129]: E0314 07:14:10.787318 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="northd" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.787328 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="northd" Mar 14 07:14:10 crc kubenswrapper[5129]: E0314 07:14:10.787344 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovnkube-controller" Mar 14 07:14:10 crc kubenswrapper[5129]: 
I0314 07:14:10.787354 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovnkube-controller" Mar 14 07:14:10 crc kubenswrapper[5129]: E0314 07:14:10.787366 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovnkube-controller" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.787374 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovnkube-controller" Mar 14 07:14:10 crc kubenswrapper[5129]: E0314 07:14:10.787385 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="kube-rbac-proxy-ovn-metrics" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.787393 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="kube-rbac-proxy-ovn-metrics" Mar 14 07:14:10 crc kubenswrapper[5129]: E0314 07:14:10.787405 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovn-controller" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.787415 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovn-controller" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.787566 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovnkube-controller" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.787588 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovn-acl-logging" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.788795 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="kube-rbac-proxy-ovn-metrics" 
Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.788840 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovnkube-controller" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.788856 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovnkube-controller" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.788881 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovn-controller" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.788903 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="nbdb" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.788916 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="kube-rbac-proxy-node" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.788927 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="sbdb" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.788943 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="northd" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.788954 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b5741c7-590a-44e3-bdf6-b71f5f7aec49" containerName="oc" Mar 14 07:14:10 crc kubenswrapper[5129]: E0314 07:14:10.789268 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovnkube-controller" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.789285 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovnkube-controller" Mar 14 
07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.789453 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovnkube-controller" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.789466 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerName="ovnkube-controller" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.791997 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.917498 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-run-openvswitch\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.917587 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-cni-netd\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.917649 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-kubelet\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.917690 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-ovnkube-config\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: 
\"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.917724 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-run-netns\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.917760 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-systemd-units\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.917850 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-log-socket\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.917766 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.917782 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.917806 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.917912 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-slash\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.917975 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-run-ovn-kubernetes\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.917823 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.917692 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.917935 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-slash" (OuterVolumeSpecName: "host-slash") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.917939 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-log-socket" (OuterVolumeSpecName: "log-socket") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918046 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-var-lib-openvswitch\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918086 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-run-ovn\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918116 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-ovn-node-metrics-cert\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918125 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918139 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918152 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918165 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918191 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-cni-bin\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918220 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918260 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918264 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-ovnkube-script-lib\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918297 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918340 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918313 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-etc-openvswitch\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918402 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-node-log\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918437 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-run-systemd\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918472 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hssn4\" (UniqueName: \"kubernetes.io/projected/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-kube-api-access-hssn4\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918492 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-node-log" (OuterVolumeSpecName: "node-log") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918510 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-env-overrides\") pod \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\" (UID: \"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1\") " Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918579 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918777 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-env-overrides\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918810 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918835 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-systemd-units\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918914 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-node-log\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918935 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-run-systemd\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.918968 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-run-netns\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.919045 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-cni-bin\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.919135 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d7rw\" (UniqueName: \"kubernetes.io/projected/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-kube-api-access-7d7rw\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.919185 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.919232 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-cni-netd\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.919271 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-var-lib-openvswitch\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.919321 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-ovn-node-metrics-cert\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.919441 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-etc-openvswitch\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.919548 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-log-socket\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.919595 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-run-ovn-kubernetes\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.919669 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-slash\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.919692 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-run-openvswitch\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.919717 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-ovnkube-script-lib\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.919773 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-run-ovn\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.919840 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-kubelet\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.919874 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-ovnkube-config\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.919990 5129 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.920025 5129 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.920044 5129 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.920063 5129 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.920083 5129 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.920100 5129 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.920115 5129 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-node-log\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.920133 5129 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.920183 5129 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.920203 5129 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.920220 5129 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.920236 5129 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.920251 5129 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.920262 5129 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.920274 5129 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-log-socket\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:10 crc 
kubenswrapper[5129]: I0314 07:14:10.920285 5129 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-slash\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.920299 5129 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.925243 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-kube-api-access-hssn4" (OuterVolumeSpecName: "kube-api-access-hssn4") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "kube-api-access-hssn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.926434 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:14:10 crc kubenswrapper[5129]: I0314 07:14:10.933292 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" (UID: "8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.021308 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-node-log\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.021226 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-node-log\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.021414 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-run-systemd\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.021438 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-run-netns\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.021486 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-run-systemd\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.021528 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-cni-bin\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.021621 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-cni-bin\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.021676 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-run-netns\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.021689 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d7rw\" (UniqueName: \"kubernetes.io/projected/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-kube-api-access-7d7rw\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.021784 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.021832 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-cni-netd\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.021828 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.021862 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-var-lib-openvswitch\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.021882 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-ovn-node-metrics-cert\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.021889 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-cni-netd\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.021907 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-etc-openvswitch\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.021935 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-var-lib-openvswitch\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.021998 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-log-socket\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022015 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-run-ovn-kubernetes\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022052 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-slash\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022067 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-run-openvswitch\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022083 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-ovnkube-script-lib\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022101 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-run-ovn\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022180 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-kubelet\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022204 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-ovnkube-config\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022249 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-env-overrides\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022284 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-systemd-units\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022334 5129 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022346 5129 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022356 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hssn4\" (UniqueName: \"kubernetes.io/projected/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1-kube-api-access-hssn4\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022384 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-systemd-units\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022409 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-etc-openvswitch\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022430 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-log-socket\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022449 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-run-ovn-kubernetes\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022469 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-slash\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022487 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-run-openvswitch\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022635 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-host-kubelet\") pod \"ovnkube-node-69gmr\" (UID: 
\"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.022831 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-run-ovn\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.023307 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-ovnkube-script-lib\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.023699 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-ovnkube-config\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.023719 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-env-overrides\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.026123 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-ovn-node-metrics-cert\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc 
kubenswrapper[5129]: I0314 07:14:11.044450 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d7rw\" (UniqueName: \"kubernetes.io/projected/f9d053a2-9ad4-45f4-becf-6f78b71dd45d-kube-api-access-7d7rw\") pod \"ovnkube-node-69gmr\" (UID: \"f9d053a2-9ad4-45f4-becf-6f78b71dd45d\") " pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.108412 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.215456 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovnkube-controller/3.log" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.217650 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovn-acl-logging/0.log" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218136 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hfdh_8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/ovn-controller/0.log" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218481 5129 generic.go:334] "Generic (PLEG): container finished" podID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerID="b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0" exitCode=0 Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218502 5129 generic.go:334] "Generic (PLEG): container finished" podID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerID="060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7" exitCode=0 Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218511 5129 generic.go:334] "Generic (PLEG): container finished" podID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" 
containerID="bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2" exitCode=0 Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218507 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerDied","Data":"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218542 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerDied","Data":"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218556 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerDied","Data":"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218568 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerDied","Data":"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218518 5129 generic.go:334] "Generic (PLEG): container finished" podID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerID="5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48" exitCode=0 Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218583 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218589 5129 generic.go:334] "Generic (PLEG): container finished" podID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerID="0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97" exitCode=0 Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218618 5129 generic.go:334] "Generic (PLEG): container finished" podID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerID="0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026" exitCode=0 Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218627 5129 generic.go:334] "Generic (PLEG): container finished" podID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerID="71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d" exitCode=143 Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218636 5129 generic.go:334] "Generic (PLEG): container finished" podID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" containerID="8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743" exitCode=143 Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218645 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerDied","Data":"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218627 5129 scope.go:117] "RemoveContainer" containerID="b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218683 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerDied","Data":"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218699 5129 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218712 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218719 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218726 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218732 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218739 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218745 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218751 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218758 5129 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218767 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerDied","Data":"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218779 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218789 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218795 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218801 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218808 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218815 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97"} Mar 14 
07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218822 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218829 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218835 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218841 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218852 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerDied","Data":"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218864 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218873 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218880 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218886 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218892 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218898 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218904 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218911 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218917 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218923 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218932 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-7hfdh" event={"ID":"8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1","Type":"ContainerDied","Data":"4c9f9bb5f86d206984cc6834ec5a637cd228953370d0b89a6cb62d56cf3cb0ca"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218945 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218953 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218960 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218966 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218972 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218978 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218984 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 
07:14:11.218991 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.218998 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.219004 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.219759 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" event={"ID":"f9d053a2-9ad4-45f4-becf-6f78b71dd45d","Type":"ContainerStarted","Data":"e485fec9f7895dbabae1e4180cefec6de991b9618852c6080ad841298c0c255e"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.221745 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4btb_e37bb55b-4ace-4d62-9711-88d8a1bb8cd8/kube-multus/2.log" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.222412 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4btb_e37bb55b-4ace-4d62-9711-88d8a1bb8cd8/kube-multus/1.log" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.222458 5129 generic.go:334] "Generic (PLEG): container finished" podID="e37bb55b-4ace-4d62-9711-88d8a1bb8cd8" containerID="f2f3e31f8fa47821dc29e4dfcafdd95ccd4d6ff2f72e48303e4f326b6c1c2ce8" exitCode=2 Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.222495 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r4btb" 
event={"ID":"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8","Type":"ContainerDied","Data":"f2f3e31f8fa47821dc29e4dfcafdd95ccd4d6ff2f72e48303e4f326b6c1c2ce8"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.222524 5129 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a958c6cbc92804a3a1c2ba1bec4777732a39f16f1cce025d4ed6f18a8e488f02"} Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.223023 5129 scope.go:117] "RemoveContainer" containerID="f2f3e31f8fa47821dc29e4dfcafdd95ccd4d6ff2f72e48303e4f326b6c1c2ce8" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.256084 5129 scope.go:117] "RemoveContainer" containerID="2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.279309 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7hfdh"] Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.282851 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7hfdh"] Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.331653 5129 scope.go:117] "RemoveContainer" containerID="060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.352612 5129 scope.go:117] "RemoveContainer" containerID="bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.370496 5129 scope.go:117] "RemoveContainer" containerID="5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.386138 5129 scope.go:117] "RemoveContainer" containerID="0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.401389 5129 scope.go:117] "RemoveContainer" containerID="0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026" Mar 14 
07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.420548 5129 scope.go:117] "RemoveContainer" containerID="71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.434464 5129 scope.go:117] "RemoveContainer" containerID="8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.449488 5129 scope.go:117] "RemoveContainer" containerID="3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.462921 5129 scope.go:117] "RemoveContainer" containerID="b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0" Mar 14 07:14:11 crc kubenswrapper[5129]: E0314 07:14:11.463293 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0\": container with ID starting with b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0 not found: ID does not exist" containerID="b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.463343 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0"} err="failed to get container status \"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0\": rpc error: code = NotFound desc = could not find container \"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0\": container with ID starting with b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.463385 5129 scope.go:117] "RemoveContainer" containerID="2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297" Mar 14 07:14:11 crc kubenswrapper[5129]: E0314 
07:14:11.463687 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297\": container with ID starting with 2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297 not found: ID does not exist" containerID="2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.463720 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297"} err="failed to get container status \"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297\": rpc error: code = NotFound desc = could not find container \"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297\": container with ID starting with 2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.463740 5129 scope.go:117] "RemoveContainer" containerID="060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7" Mar 14 07:14:11 crc kubenswrapper[5129]: E0314 07:14:11.464031 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\": container with ID starting with 060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7 not found: ID does not exist" containerID="060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.464090 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7"} err="failed to get container status \"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\": rpc 
error: code = NotFound desc = could not find container \"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\": container with ID starting with 060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.464124 5129 scope.go:117] "RemoveContainer" containerID="bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2" Mar 14 07:14:11 crc kubenswrapper[5129]: E0314 07:14:11.464398 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\": container with ID starting with bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2 not found: ID does not exist" containerID="bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.464419 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2"} err="failed to get container status \"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\": rpc error: code = NotFound desc = could not find container \"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\": container with ID starting with bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.464454 5129 scope.go:117] "RemoveContainer" containerID="5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48" Mar 14 07:14:11 crc kubenswrapper[5129]: E0314 07:14:11.464710 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\": container with ID starting with 
5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48 not found: ID does not exist" containerID="5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.464741 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48"} err="failed to get container status \"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\": rpc error: code = NotFound desc = could not find container \"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\": container with ID starting with 5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.464760 5129 scope.go:117] "RemoveContainer" containerID="0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97" Mar 14 07:14:11 crc kubenswrapper[5129]: E0314 07:14:11.464961 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\": container with ID starting with 0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97 not found: ID does not exist" containerID="0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.464985 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97"} err="failed to get container status \"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\": rpc error: code = NotFound desc = could not find container \"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\": container with ID starting with 0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97 not found: ID does not 
exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.465001 5129 scope.go:117] "RemoveContainer" containerID="0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026" Mar 14 07:14:11 crc kubenswrapper[5129]: E0314 07:14:11.465184 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\": container with ID starting with 0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026 not found: ID does not exist" containerID="0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.465212 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026"} err="failed to get container status \"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\": rpc error: code = NotFound desc = could not find container \"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\": container with ID starting with 0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.465227 5129 scope.go:117] "RemoveContainer" containerID="71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d" Mar 14 07:14:11 crc kubenswrapper[5129]: E0314 07:14:11.465420 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\": container with ID starting with 71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d not found: ID does not exist" containerID="71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.465443 5129 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d"} err="failed to get container status \"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\": rpc error: code = NotFound desc = could not find container \"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\": container with ID starting with 71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.465460 5129 scope.go:117] "RemoveContainer" containerID="8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743" Mar 14 07:14:11 crc kubenswrapper[5129]: E0314 07:14:11.465684 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\": container with ID starting with 8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743 not found: ID does not exist" containerID="8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.465709 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743"} err="failed to get container status \"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\": rpc error: code = NotFound desc = could not find container \"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\": container with ID starting with 8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.465725 5129 scope.go:117] "RemoveContainer" containerID="3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b" Mar 14 07:14:11 crc kubenswrapper[5129]: E0314 07:14:11.465920 5129 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\": container with ID starting with 3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b not found: ID does not exist" containerID="3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.465944 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b"} err="failed to get container status \"3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\": rpc error: code = NotFound desc = could not find container \"3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\": container with ID starting with 3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.465962 5129 scope.go:117] "RemoveContainer" containerID="b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.466154 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0"} err="failed to get container status \"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0\": rpc error: code = NotFound desc = could not find container \"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0\": container with ID starting with b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.466176 5129 scope.go:117] "RemoveContainer" containerID="2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.466376 5129 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297"} err="failed to get container status \"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297\": rpc error: code = NotFound desc = could not find container \"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297\": container with ID starting with 2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.466400 5129 scope.go:117] "RemoveContainer" containerID="060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.466658 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7"} err="failed to get container status \"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\": rpc error: code = NotFound desc = could not find container \"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\": container with ID starting with 060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.466702 5129 scope.go:117] "RemoveContainer" containerID="bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.466948 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2"} err="failed to get container status \"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\": rpc error: code = NotFound desc = could not find container \"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\": container with ID starting with 
bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.466972 5129 scope.go:117] "RemoveContainer" containerID="5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.467196 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48"} err="failed to get container status \"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\": rpc error: code = NotFound desc = could not find container \"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\": container with ID starting with 5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.467223 5129 scope.go:117] "RemoveContainer" containerID="0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.467474 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97"} err="failed to get container status \"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\": rpc error: code = NotFound desc = could not find container \"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\": container with ID starting with 0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.467507 5129 scope.go:117] "RemoveContainer" containerID="0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.468009 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026"} err="failed to get container status \"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\": rpc error: code = NotFound desc = could not find container \"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\": container with ID starting with 0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.468031 5129 scope.go:117] "RemoveContainer" containerID="71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.468232 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d"} err="failed to get container status \"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\": rpc error: code = NotFound desc = could not find container \"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\": container with ID starting with 71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.468253 5129 scope.go:117] "RemoveContainer" containerID="8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.468464 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743"} err="failed to get container status \"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\": rpc error: code = NotFound desc = could not find container \"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\": container with ID starting with 8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743 not found: ID does not 
exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.468492 5129 scope.go:117] "RemoveContainer" containerID="3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.468706 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b"} err="failed to get container status \"3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\": rpc error: code = NotFound desc = could not find container \"3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\": container with ID starting with 3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.468724 5129 scope.go:117] "RemoveContainer" containerID="b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.468947 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0"} err="failed to get container status \"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0\": rpc error: code = NotFound desc = could not find container \"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0\": container with ID starting with b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.468987 5129 scope.go:117] "RemoveContainer" containerID="2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.469200 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297"} err="failed to get container status 
\"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297\": rpc error: code = NotFound desc = could not find container \"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297\": container with ID starting with 2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.469225 5129 scope.go:117] "RemoveContainer" containerID="060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.469455 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7"} err="failed to get container status \"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\": rpc error: code = NotFound desc = could not find container \"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\": container with ID starting with 060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.469482 5129 scope.go:117] "RemoveContainer" containerID="bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.469719 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2"} err="failed to get container status \"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\": rpc error: code = NotFound desc = could not find container \"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\": container with ID starting with bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.469744 5129 scope.go:117] "RemoveContainer" 
containerID="5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.469958 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48"} err="failed to get container status \"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\": rpc error: code = NotFound desc = could not find container \"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\": container with ID starting with 5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.469982 5129 scope.go:117] "RemoveContainer" containerID="0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.470192 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97"} err="failed to get container status \"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\": rpc error: code = NotFound desc = could not find container \"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\": container with ID starting with 0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.470217 5129 scope.go:117] "RemoveContainer" containerID="0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.470435 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026"} err="failed to get container status \"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\": rpc error: code = NotFound desc = could 
not find container \"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\": container with ID starting with 0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.470460 5129 scope.go:117] "RemoveContainer" containerID="71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.470673 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d"} err="failed to get container status \"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\": rpc error: code = NotFound desc = could not find container \"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\": container with ID starting with 71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.470699 5129 scope.go:117] "RemoveContainer" containerID="8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.470896 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743"} err="failed to get container status \"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\": rpc error: code = NotFound desc = could not find container \"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\": container with ID starting with 8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.470920 5129 scope.go:117] "RemoveContainer" containerID="3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 
07:14:11.471125 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b"} err="failed to get container status \"3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\": rpc error: code = NotFound desc = could not find container \"3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\": container with ID starting with 3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.471151 5129 scope.go:117] "RemoveContainer" containerID="b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.471385 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0"} err="failed to get container status \"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0\": rpc error: code = NotFound desc = could not find container \"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0\": container with ID starting with b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.471407 5129 scope.go:117] "RemoveContainer" containerID="2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.471618 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297"} err="failed to get container status \"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297\": rpc error: code = NotFound desc = could not find container \"2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297\": container with ID starting with 
2d7ff6fb97077e2f9a0cd16bf448d29c83402f29e8c3de85e623243a159be297 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.471652 5129 scope.go:117] "RemoveContainer" containerID="060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.471859 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7"} err="failed to get container status \"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\": rpc error: code = NotFound desc = could not find container \"060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7\": container with ID starting with 060195bdf28a9b9c83de4247cdeb3262eedaa16a47f670a76f28565b023288c7 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.471877 5129 scope.go:117] "RemoveContainer" containerID="bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.472067 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2"} err="failed to get container status \"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\": rpc error: code = NotFound desc = could not find container \"bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2\": container with ID starting with bcc8110bfe74433101b906b440f6fa758665cb10e3329e1243d1f357a85927e2 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.472089 5129 scope.go:117] "RemoveContainer" containerID="5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.472276 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48"} err="failed to get container status \"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\": rpc error: code = NotFound desc = could not find container \"5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48\": container with ID starting with 5046a61ff1018a7df77c9c3b63e66e88e2e49aaf70619f8d977b2a3047f06b48 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.472302 5129 scope.go:117] "RemoveContainer" containerID="0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.472537 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97"} err="failed to get container status \"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\": rpc error: code = NotFound desc = could not find container \"0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97\": container with ID starting with 0ecff26a13008a2e0cf72f197ead2fd433b8bd5d32bae30a5fbf6c5d07314b97 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.472563 5129 scope.go:117] "RemoveContainer" containerID="0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.473074 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026"} err="failed to get container status \"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\": rpc error: code = NotFound desc = could not find container \"0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026\": container with ID starting with 0c5160dabd77cc4333fe2286b29ca230d93422647a2a10981497195d1ee95026 not found: ID does not 
exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.473101 5129 scope.go:117] "RemoveContainer" containerID="71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.473522 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d"} err="failed to get container status \"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\": rpc error: code = NotFound desc = could not find container \"71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d\": container with ID starting with 71520308f0304d8e08c09af1e5addc253ff2b94c0914f63842957a46c18acb5d not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.473543 5129 scope.go:117] "RemoveContainer" containerID="8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.473871 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743"} err="failed to get container status \"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\": rpc error: code = NotFound desc = could not find container \"8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743\": container with ID starting with 8095450f08906939fec50406c70e3a22a5100808e8a9453fdd8a931f8e9f5743 not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.473896 5129 scope.go:117] "RemoveContainer" containerID="3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.474139 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b"} err="failed to get container status 
\"3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\": rpc error: code = NotFound desc = could not find container \"3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b\": container with ID starting with 3f0e81c869812420f009550ea5e9a63fca4f94620ba1669ac9cce82ae663d25b not found: ID does not exist" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.474196 5129 scope.go:117] "RemoveContainer" containerID="b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0" Mar 14 07:14:11 crc kubenswrapper[5129]: I0314 07:14:11.474446 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0"} err="failed to get container status \"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0\": rpc error: code = NotFound desc = could not find container \"b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0\": container with ID starting with b800b94704b0fbb86668a71442279a6c457dcca907e27fc59cd371ec68ba27e0 not found: ID does not exist" Mar 14 07:14:12 crc kubenswrapper[5129]: I0314 07:14:12.045995 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1" path="/var/lib/kubelet/pods/8b20f87a-bc2f-4bc6-b3a8-506caa0f16d1/volumes" Mar 14 07:14:12 crc kubenswrapper[5129]: I0314 07:14:12.246362 5129 generic.go:334] "Generic (PLEG): container finished" podID="f9d053a2-9ad4-45f4-becf-6f78b71dd45d" containerID="17fc2beab5b67d5a7340fcb6d127b0f64863d29e6671a5ac8aba86b56e10f36d" exitCode=0 Mar 14 07:14:12 crc kubenswrapper[5129]: I0314 07:14:12.246952 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" event={"ID":"f9d053a2-9ad4-45f4-becf-6f78b71dd45d","Type":"ContainerDied","Data":"17fc2beab5b67d5a7340fcb6d127b0f64863d29e6671a5ac8aba86b56e10f36d"} Mar 14 07:14:12 crc kubenswrapper[5129]: I0314 07:14:12.249168 5129 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4btb_e37bb55b-4ace-4d62-9711-88d8a1bb8cd8/kube-multus/2.log" Mar 14 07:14:12 crc kubenswrapper[5129]: I0314 07:14:12.249990 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4btb_e37bb55b-4ace-4d62-9711-88d8a1bb8cd8/kube-multus/1.log" Mar 14 07:14:12 crc kubenswrapper[5129]: I0314 07:14:12.250068 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r4btb" event={"ID":"e37bb55b-4ace-4d62-9711-88d8a1bb8cd8","Type":"ContainerStarted","Data":"b856790703000fd8750b5725235b961299f58ab914a2871ec3daeeae0edd7b37"} Mar 14 07:14:13 crc kubenswrapper[5129]: I0314 07:14:13.259439 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" event={"ID":"f9d053a2-9ad4-45f4-becf-6f78b71dd45d","Type":"ContainerStarted","Data":"c5ef1deed35bef4b105eb9700a8ae0db87b5405cb2251df4829a92231b091ee7"} Mar 14 07:14:13 crc kubenswrapper[5129]: I0314 07:14:13.260030 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" event={"ID":"f9d053a2-9ad4-45f4-becf-6f78b71dd45d","Type":"ContainerStarted","Data":"22f6ae193c1ecee8d9b84769f44e4e50d570eb31d413d296b66f5cf11d959782"} Mar 14 07:14:13 crc kubenswrapper[5129]: I0314 07:14:13.260049 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" event={"ID":"f9d053a2-9ad4-45f4-becf-6f78b71dd45d","Type":"ContainerStarted","Data":"e8b770ef82018eddb2dfca88b5ad929c8d8827b00102d66a4d8d868052c1ebb0"} Mar 14 07:14:13 crc kubenswrapper[5129]: I0314 07:14:13.260062 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" event={"ID":"f9d053a2-9ad4-45f4-becf-6f78b71dd45d","Type":"ContainerStarted","Data":"a11bfa0606962c0a38f767d0a95dd61f77154d06529596caa2f12954c6b217dc"} Mar 14 07:14:13 crc kubenswrapper[5129]: I0314 07:14:13.260093 
5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" event={"ID":"f9d053a2-9ad4-45f4-becf-6f78b71dd45d","Type":"ContainerStarted","Data":"508f11bf59d6069b911e93f778ab120c3e4c0351a283f21bf2987ecd467a8a84"} Mar 14 07:14:13 crc kubenswrapper[5129]: I0314 07:14:13.260106 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" event={"ID":"f9d053a2-9ad4-45f4-becf-6f78b71dd45d","Type":"ContainerStarted","Data":"7abb0e729bd12214c9acbfb6167ccaabe5352d208f2c6bd80e6d18aa5da7424e"} Mar 14 07:14:15 crc kubenswrapper[5129]: I0314 07:14:15.278368 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" event={"ID":"f9d053a2-9ad4-45f4-becf-6f78b71dd45d","Type":"ContainerStarted","Data":"84a244dd2e1b14d012f57ece95c2feb087092d16ddea2bd2989df9006309e6c5"} Mar 14 07:14:16 crc kubenswrapper[5129]: I0314 07:14:16.704771 5129 scope.go:117] "RemoveContainer" containerID="1a2c100ad6dabd2265e35ecd96f4d1279521b360349c2dfa6b8e21ae92a0a931" Mar 14 07:14:16 crc kubenswrapper[5129]: I0314 07:14:16.744042 5129 scope.go:117] "RemoveContainer" containerID="a958c6cbc92804a3a1c2ba1bec4777732a39f16f1cce025d4ed6f18a8e488f02" Mar 14 07:14:17 crc kubenswrapper[5129]: I0314 07:14:17.290974 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r4btb_e37bb55b-4ace-4d62-9711-88d8a1bb8cd8/kube-multus/2.log" Mar 14 07:14:18 crc kubenswrapper[5129]: I0314 07:14:18.301936 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" event={"ID":"f9d053a2-9ad4-45f4-becf-6f78b71dd45d","Type":"ContainerStarted","Data":"67080af1c9ec1ce360998e46fc9cb6245bd68ef2a539fda936079945309f34bf"} Mar 14 07:14:18 crc kubenswrapper[5129]: I0314 07:14:18.302276 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:18 crc 
kubenswrapper[5129]: I0314 07:14:18.302290 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:18 crc kubenswrapper[5129]: I0314 07:14:18.329206 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" podStartSLOduration=8.32918411 podStartE2EDuration="8.32918411s" podCreationTimestamp="2026-03-14 07:14:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:14:18.32772539 +0000 UTC m=+921.079640584" watchObservedRunningTime="2026-03-14 07:14:18.32918411 +0000 UTC m=+921.081099294" Mar 14 07:14:18 crc kubenswrapper[5129]: I0314 07:14:18.333855 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:19 crc kubenswrapper[5129]: I0314 07:14:19.308083 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:19 crc kubenswrapper[5129]: I0314 07:14:19.334094 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:19 crc kubenswrapper[5129]: I0314 07:14:19.574379 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:14:19 crc kubenswrapper[5129]: I0314 07:14:19.574485 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 14 07:14:20 crc kubenswrapper[5129]: I0314 07:14:20.054416 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-wvhn6"] Mar 14 07:14:20 crc kubenswrapper[5129]: I0314 07:14:20.055692 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-wvhn6"] Mar 14 07:14:20 crc kubenswrapper[5129]: I0314 07:14:20.055836 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-wvhn6" Mar 14 07:14:20 crc kubenswrapper[5129]: I0314 07:14:20.058472 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 14 07:14:20 crc kubenswrapper[5129]: I0314 07:14:20.063083 5129 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-dl4r7" Mar 14 07:14:20 crc kubenswrapper[5129]: I0314 07:14:20.063419 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 14 07:14:20 crc kubenswrapper[5129]: I0314 07:14:20.063425 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 14 07:14:20 crc kubenswrapper[5129]: I0314 07:14:20.136227 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/26c62665-6770-4c8e-a229-aa9f971a7db1-crc-storage\") pod \"crc-storage-crc-wvhn6\" (UID: \"26c62665-6770-4c8e-a229-aa9f971a7db1\") " pod="crc-storage/crc-storage-crc-wvhn6" Mar 14 07:14:20 crc kubenswrapper[5129]: I0314 07:14:20.136284 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q85t6\" (UniqueName: \"kubernetes.io/projected/26c62665-6770-4c8e-a229-aa9f971a7db1-kube-api-access-q85t6\") pod \"crc-storage-crc-wvhn6\" (UID: \"26c62665-6770-4c8e-a229-aa9f971a7db1\") " 
pod="crc-storage/crc-storage-crc-wvhn6" Mar 14 07:14:20 crc kubenswrapper[5129]: I0314 07:14:20.136313 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/26c62665-6770-4c8e-a229-aa9f971a7db1-node-mnt\") pod \"crc-storage-crc-wvhn6\" (UID: \"26c62665-6770-4c8e-a229-aa9f971a7db1\") " pod="crc-storage/crc-storage-crc-wvhn6" Mar 14 07:14:20 crc kubenswrapper[5129]: I0314 07:14:20.237152 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/26c62665-6770-4c8e-a229-aa9f971a7db1-crc-storage\") pod \"crc-storage-crc-wvhn6\" (UID: \"26c62665-6770-4c8e-a229-aa9f971a7db1\") " pod="crc-storage/crc-storage-crc-wvhn6" Mar 14 07:14:20 crc kubenswrapper[5129]: I0314 07:14:20.237238 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q85t6\" (UniqueName: \"kubernetes.io/projected/26c62665-6770-4c8e-a229-aa9f971a7db1-kube-api-access-q85t6\") pod \"crc-storage-crc-wvhn6\" (UID: \"26c62665-6770-4c8e-a229-aa9f971a7db1\") " pod="crc-storage/crc-storage-crc-wvhn6" Mar 14 07:14:20 crc kubenswrapper[5129]: I0314 07:14:20.237296 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/26c62665-6770-4c8e-a229-aa9f971a7db1-node-mnt\") pod \"crc-storage-crc-wvhn6\" (UID: \"26c62665-6770-4c8e-a229-aa9f971a7db1\") " pod="crc-storage/crc-storage-crc-wvhn6" Mar 14 07:14:20 crc kubenswrapper[5129]: I0314 07:14:20.237766 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/26c62665-6770-4c8e-a229-aa9f971a7db1-node-mnt\") pod \"crc-storage-crc-wvhn6\" (UID: \"26c62665-6770-4c8e-a229-aa9f971a7db1\") " pod="crc-storage/crc-storage-crc-wvhn6" Mar 14 07:14:20 crc kubenswrapper[5129]: I0314 07:14:20.238663 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/26c62665-6770-4c8e-a229-aa9f971a7db1-crc-storage\") pod \"crc-storage-crc-wvhn6\" (UID: \"26c62665-6770-4c8e-a229-aa9f971a7db1\") " pod="crc-storage/crc-storage-crc-wvhn6" Mar 14 07:14:20 crc kubenswrapper[5129]: I0314 07:14:20.258373 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q85t6\" (UniqueName: \"kubernetes.io/projected/26c62665-6770-4c8e-a229-aa9f971a7db1-kube-api-access-q85t6\") pod \"crc-storage-crc-wvhn6\" (UID: \"26c62665-6770-4c8e-a229-aa9f971a7db1\") " pod="crc-storage/crc-storage-crc-wvhn6" Mar 14 07:14:20 crc kubenswrapper[5129]: I0314 07:14:20.391341 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-wvhn6" Mar 14 07:14:20 crc kubenswrapper[5129]: E0314 07:14:20.420736 5129 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-wvhn6_crc-storage_26c62665-6770-4c8e-a229-aa9f971a7db1_0(30618af3ec557eeadb7ea913cc908b8f1160a21c3220414a0b67e098977e4293): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 07:14:20 crc kubenswrapper[5129]: E0314 07:14:20.420875 5129 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-wvhn6_crc-storage_26c62665-6770-4c8e-a229-aa9f971a7db1_0(30618af3ec557eeadb7ea913cc908b8f1160a21c3220414a0b67e098977e4293): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-wvhn6" Mar 14 07:14:20 crc kubenswrapper[5129]: E0314 07:14:20.420912 5129 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-wvhn6_crc-storage_26c62665-6770-4c8e-a229-aa9f971a7db1_0(30618af3ec557eeadb7ea913cc908b8f1160a21c3220414a0b67e098977e4293): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-wvhn6" Mar 14 07:14:20 crc kubenswrapper[5129]: E0314 07:14:20.420988 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-wvhn6_crc-storage(26c62665-6770-4c8e-a229-aa9f971a7db1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-wvhn6_crc-storage(26c62665-6770-4c8e-a229-aa9f971a7db1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-wvhn6_crc-storage_26c62665-6770-4c8e-a229-aa9f971a7db1_0(30618af3ec557eeadb7ea913cc908b8f1160a21c3220414a0b67e098977e4293): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-wvhn6" podUID="26c62665-6770-4c8e-a229-aa9f971a7db1" Mar 14 07:14:21 crc kubenswrapper[5129]: I0314 07:14:21.931650 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n6pmk"] Mar 14 07:14:21 crc kubenswrapper[5129]: I0314 07:14:21.933414 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n6pmk" Mar 14 07:14:21 crc kubenswrapper[5129]: I0314 07:14:21.949808 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n6pmk"] Mar 14 07:14:21 crc kubenswrapper[5129]: I0314 07:14:21.958069 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8767dd-5f70-4fe5-ba97-c82b72de2287-catalog-content\") pod \"community-operators-n6pmk\" (UID: \"ec8767dd-5f70-4fe5-ba97-c82b72de2287\") " pod="openshift-marketplace/community-operators-n6pmk" Mar 14 07:14:21 crc kubenswrapper[5129]: I0314 07:14:21.958133 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmqw9\" (UniqueName: \"kubernetes.io/projected/ec8767dd-5f70-4fe5-ba97-c82b72de2287-kube-api-access-bmqw9\") pod \"community-operators-n6pmk\" (UID: \"ec8767dd-5f70-4fe5-ba97-c82b72de2287\") " pod="openshift-marketplace/community-operators-n6pmk" Mar 14 07:14:21 crc kubenswrapper[5129]: I0314 07:14:21.958210 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8767dd-5f70-4fe5-ba97-c82b72de2287-utilities\") pod \"community-operators-n6pmk\" (UID: \"ec8767dd-5f70-4fe5-ba97-c82b72de2287\") " pod="openshift-marketplace/community-operators-n6pmk" Mar 14 07:14:22 crc kubenswrapper[5129]: I0314 07:14:22.059216 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8767dd-5f70-4fe5-ba97-c82b72de2287-utilities\") pod \"community-operators-n6pmk\" (UID: \"ec8767dd-5f70-4fe5-ba97-c82b72de2287\") " pod="openshift-marketplace/community-operators-n6pmk" Mar 14 07:14:22 crc kubenswrapper[5129]: I0314 07:14:22.059317 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8767dd-5f70-4fe5-ba97-c82b72de2287-catalog-content\") pod \"community-operators-n6pmk\" (UID: \"ec8767dd-5f70-4fe5-ba97-c82b72de2287\") " pod="openshift-marketplace/community-operators-n6pmk" Mar 14 07:14:22 crc kubenswrapper[5129]: I0314 07:14:22.059348 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmqw9\" (UniqueName: \"kubernetes.io/projected/ec8767dd-5f70-4fe5-ba97-c82b72de2287-kube-api-access-bmqw9\") pod \"community-operators-n6pmk\" (UID: \"ec8767dd-5f70-4fe5-ba97-c82b72de2287\") " pod="openshift-marketplace/community-operators-n6pmk" Mar 14 07:14:22 crc kubenswrapper[5129]: I0314 07:14:22.060212 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8767dd-5f70-4fe5-ba97-c82b72de2287-utilities\") pod \"community-operators-n6pmk\" (UID: \"ec8767dd-5f70-4fe5-ba97-c82b72de2287\") " pod="openshift-marketplace/community-operators-n6pmk" Mar 14 07:14:22 crc kubenswrapper[5129]: I0314 07:14:22.060482 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8767dd-5f70-4fe5-ba97-c82b72de2287-catalog-content\") pod \"community-operators-n6pmk\" (UID: \"ec8767dd-5f70-4fe5-ba97-c82b72de2287\") " pod="openshift-marketplace/community-operators-n6pmk" Mar 14 07:14:22 crc kubenswrapper[5129]: I0314 07:14:22.079402 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmqw9\" (UniqueName: \"kubernetes.io/projected/ec8767dd-5f70-4fe5-ba97-c82b72de2287-kube-api-access-bmqw9\") pod \"community-operators-n6pmk\" (UID: \"ec8767dd-5f70-4fe5-ba97-c82b72de2287\") " pod="openshift-marketplace/community-operators-n6pmk" Mar 14 07:14:22 crc kubenswrapper[5129]: I0314 07:14:22.291989 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n6pmk" Mar 14 07:14:22 crc kubenswrapper[5129]: I0314 07:14:22.559813 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n6pmk"] Mar 14 07:14:23 crc kubenswrapper[5129]: I0314 07:14:23.332301 5129 generic.go:334] "Generic (PLEG): container finished" podID="ec8767dd-5f70-4fe5-ba97-c82b72de2287" containerID="f02d2b23fcb7a22617627b4ca2cc9c73129fffceaf946818ae90e8f4e146be9f" exitCode=0 Mar 14 07:14:23 crc kubenswrapper[5129]: I0314 07:14:23.332338 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6pmk" event={"ID":"ec8767dd-5f70-4fe5-ba97-c82b72de2287","Type":"ContainerDied","Data":"f02d2b23fcb7a22617627b4ca2cc9c73129fffceaf946818ae90e8f4e146be9f"} Mar 14 07:14:23 crc kubenswrapper[5129]: I0314 07:14:23.332640 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6pmk" event={"ID":"ec8767dd-5f70-4fe5-ba97-c82b72de2287","Type":"ContainerStarted","Data":"ab349c2cb33d47486199702c74cb3a15ad9e20450efbec4052e24169606e7cfb"} Mar 14 07:14:24 crc kubenswrapper[5129]: I0314 07:14:24.342758 5129 generic.go:334] "Generic (PLEG): container finished" podID="ec8767dd-5f70-4fe5-ba97-c82b72de2287" containerID="df05572f167dd0db0462397a8eba5fb7916b3b48671c60facd58d43d588b4bfe" exitCode=0 Mar 14 07:14:24 crc kubenswrapper[5129]: I0314 07:14:24.342843 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6pmk" event={"ID":"ec8767dd-5f70-4fe5-ba97-c82b72de2287","Type":"ContainerDied","Data":"df05572f167dd0db0462397a8eba5fb7916b3b48671c60facd58d43d588b4bfe"} Mar 14 07:14:25 crc kubenswrapper[5129]: I0314 07:14:25.352676 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6pmk" 
event={"ID":"ec8767dd-5f70-4fe5-ba97-c82b72de2287","Type":"ContainerStarted","Data":"d49c8ae6a1cdbe2ccadfed7bec7d3594c2234bce042a3654e9a262e7371e19fd"} Mar 14 07:14:25 crc kubenswrapper[5129]: I0314 07:14:25.383637 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n6pmk" podStartSLOduration=2.974755853 podStartE2EDuration="4.383591932s" podCreationTimestamp="2026-03-14 07:14:21 +0000 UTC" firstStartedPulling="2026-03-14 07:14:23.336380797 +0000 UTC m=+926.088296011" lastFinishedPulling="2026-03-14 07:14:24.745216906 +0000 UTC m=+927.497132090" observedRunningTime="2026-03-14 07:14:25.378002287 +0000 UTC m=+928.129917481" watchObservedRunningTime="2026-03-14 07:14:25.383591932 +0000 UTC m=+928.135507126" Mar 14 07:14:32 crc kubenswrapper[5129]: I0314 07:14:32.292501 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n6pmk" Mar 14 07:14:32 crc kubenswrapper[5129]: I0314 07:14:32.293458 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n6pmk" Mar 14 07:14:32 crc kubenswrapper[5129]: I0314 07:14:32.355025 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n6pmk" Mar 14 07:14:32 crc kubenswrapper[5129]: I0314 07:14:32.445195 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n6pmk" Mar 14 07:14:32 crc kubenswrapper[5129]: I0314 07:14:32.592293 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n6pmk"] Mar 14 07:14:33 crc kubenswrapper[5129]: I0314 07:14:33.035975 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-wvhn6" Mar 14 07:14:33 crc kubenswrapper[5129]: I0314 07:14:33.036442 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-wvhn6" Mar 14 07:14:33 crc kubenswrapper[5129]: I0314 07:14:33.312395 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-wvhn6"] Mar 14 07:14:33 crc kubenswrapper[5129]: I0314 07:14:33.409894 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wvhn6" event={"ID":"26c62665-6770-4c8e-a229-aa9f971a7db1","Type":"ContainerStarted","Data":"cdb81d40c78ae534805e071778534b4ccc7d35571ba7f360c83444fba4ab48d0"} Mar 14 07:14:34 crc kubenswrapper[5129]: I0314 07:14:34.417393 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n6pmk" podUID="ec8767dd-5f70-4fe5-ba97-c82b72de2287" containerName="registry-server" containerID="cri-o://d49c8ae6a1cdbe2ccadfed7bec7d3594c2234bce042a3654e9a262e7371e19fd" gracePeriod=2 Mar 14 07:14:34 crc kubenswrapper[5129]: I0314 07:14:34.790161 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n6pmk" Mar 14 07:14:34 crc kubenswrapper[5129]: I0314 07:14:34.881795 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8767dd-5f70-4fe5-ba97-c82b72de2287-utilities\") pod \"ec8767dd-5f70-4fe5-ba97-c82b72de2287\" (UID: \"ec8767dd-5f70-4fe5-ba97-c82b72de2287\") " Mar 14 07:14:34 crc kubenswrapper[5129]: I0314 07:14:34.882724 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec8767dd-5f70-4fe5-ba97-c82b72de2287-utilities" (OuterVolumeSpecName: "utilities") pod "ec8767dd-5f70-4fe5-ba97-c82b72de2287" (UID: "ec8767dd-5f70-4fe5-ba97-c82b72de2287"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:14:34 crc kubenswrapper[5129]: I0314 07:14:34.882804 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmqw9\" (UniqueName: \"kubernetes.io/projected/ec8767dd-5f70-4fe5-ba97-c82b72de2287-kube-api-access-bmqw9\") pod \"ec8767dd-5f70-4fe5-ba97-c82b72de2287\" (UID: \"ec8767dd-5f70-4fe5-ba97-c82b72de2287\") " Mar 14 07:14:34 crc kubenswrapper[5129]: I0314 07:14:34.882874 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8767dd-5f70-4fe5-ba97-c82b72de2287-catalog-content\") pod \"ec8767dd-5f70-4fe5-ba97-c82b72de2287\" (UID: \"ec8767dd-5f70-4fe5-ba97-c82b72de2287\") " Mar 14 07:14:34 crc kubenswrapper[5129]: I0314 07:14:34.883245 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8767dd-5f70-4fe5-ba97-c82b72de2287-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:34 crc kubenswrapper[5129]: I0314 07:14:34.887810 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec8767dd-5f70-4fe5-ba97-c82b72de2287-kube-api-access-bmqw9" (OuterVolumeSpecName: "kube-api-access-bmqw9") pod "ec8767dd-5f70-4fe5-ba97-c82b72de2287" (UID: "ec8767dd-5f70-4fe5-ba97-c82b72de2287"). InnerVolumeSpecName "kube-api-access-bmqw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:14:34 crc kubenswrapper[5129]: I0314 07:14:34.932703 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec8767dd-5f70-4fe5-ba97-c82b72de2287-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec8767dd-5f70-4fe5-ba97-c82b72de2287" (UID: "ec8767dd-5f70-4fe5-ba97-c82b72de2287"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:14:34 crc kubenswrapper[5129]: I0314 07:14:34.984421 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmqw9\" (UniqueName: \"kubernetes.io/projected/ec8767dd-5f70-4fe5-ba97-c82b72de2287-kube-api-access-bmqw9\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:34 crc kubenswrapper[5129]: I0314 07:14:34.984491 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8767dd-5f70-4fe5-ba97-c82b72de2287-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:35 crc kubenswrapper[5129]: I0314 07:14:35.425344 5129 generic.go:334] "Generic (PLEG): container finished" podID="26c62665-6770-4c8e-a229-aa9f971a7db1" containerID="d1c2e8eb6ec0d2e29ae70e4296da04982a751b1adfd4cb0459bb26e93499a540" exitCode=0 Mar 14 07:14:35 crc kubenswrapper[5129]: I0314 07:14:35.425413 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wvhn6" event={"ID":"26c62665-6770-4c8e-a229-aa9f971a7db1","Type":"ContainerDied","Data":"d1c2e8eb6ec0d2e29ae70e4296da04982a751b1adfd4cb0459bb26e93499a540"} Mar 14 07:14:35 crc kubenswrapper[5129]: I0314 07:14:35.429041 5129 generic.go:334] "Generic (PLEG): container finished" podID="ec8767dd-5f70-4fe5-ba97-c82b72de2287" containerID="d49c8ae6a1cdbe2ccadfed7bec7d3594c2234bce042a3654e9a262e7371e19fd" exitCode=0 Mar 14 07:14:35 crc kubenswrapper[5129]: I0314 07:14:35.429114 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6pmk" event={"ID":"ec8767dd-5f70-4fe5-ba97-c82b72de2287","Type":"ContainerDied","Data":"d49c8ae6a1cdbe2ccadfed7bec7d3594c2234bce042a3654e9a262e7371e19fd"} Mar 14 07:14:35 crc kubenswrapper[5129]: I0314 07:14:35.429166 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6pmk" 
event={"ID":"ec8767dd-5f70-4fe5-ba97-c82b72de2287","Type":"ContainerDied","Data":"ab349c2cb33d47486199702c74cb3a15ad9e20450efbec4052e24169606e7cfb"} Mar 14 07:14:35 crc kubenswrapper[5129]: I0314 07:14:35.429164 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n6pmk" Mar 14 07:14:35 crc kubenswrapper[5129]: I0314 07:14:35.429192 5129 scope.go:117] "RemoveContainer" containerID="d49c8ae6a1cdbe2ccadfed7bec7d3594c2234bce042a3654e9a262e7371e19fd" Mar 14 07:14:35 crc kubenswrapper[5129]: I0314 07:14:35.448011 5129 scope.go:117] "RemoveContainer" containerID="df05572f167dd0db0462397a8eba5fb7916b3b48671c60facd58d43d588b4bfe" Mar 14 07:14:35 crc kubenswrapper[5129]: I0314 07:14:35.516070 5129 scope.go:117] "RemoveContainer" containerID="f02d2b23fcb7a22617627b4ca2cc9c73129fffceaf946818ae90e8f4e146be9f" Mar 14 07:14:35 crc kubenswrapper[5129]: I0314 07:14:35.517652 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n6pmk"] Mar 14 07:14:35 crc kubenswrapper[5129]: I0314 07:14:35.521516 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n6pmk"] Mar 14 07:14:35 crc kubenswrapper[5129]: I0314 07:14:35.544508 5129 scope.go:117] "RemoveContainer" containerID="d49c8ae6a1cdbe2ccadfed7bec7d3594c2234bce042a3654e9a262e7371e19fd" Mar 14 07:14:35 crc kubenswrapper[5129]: E0314 07:14:35.545018 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d49c8ae6a1cdbe2ccadfed7bec7d3594c2234bce042a3654e9a262e7371e19fd\": container with ID starting with d49c8ae6a1cdbe2ccadfed7bec7d3594c2234bce042a3654e9a262e7371e19fd not found: ID does not exist" containerID="d49c8ae6a1cdbe2ccadfed7bec7d3594c2234bce042a3654e9a262e7371e19fd" Mar 14 07:14:35 crc kubenswrapper[5129]: I0314 07:14:35.545063 5129 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"d49c8ae6a1cdbe2ccadfed7bec7d3594c2234bce042a3654e9a262e7371e19fd"} err="failed to get container status \"d49c8ae6a1cdbe2ccadfed7bec7d3594c2234bce042a3654e9a262e7371e19fd\": rpc error: code = NotFound desc = could not find container \"d49c8ae6a1cdbe2ccadfed7bec7d3594c2234bce042a3654e9a262e7371e19fd\": container with ID starting with d49c8ae6a1cdbe2ccadfed7bec7d3594c2234bce042a3654e9a262e7371e19fd not found: ID does not exist" Mar 14 07:14:35 crc kubenswrapper[5129]: I0314 07:14:35.545121 5129 scope.go:117] "RemoveContainer" containerID="df05572f167dd0db0462397a8eba5fb7916b3b48671c60facd58d43d588b4bfe" Mar 14 07:14:35 crc kubenswrapper[5129]: E0314 07:14:35.545549 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df05572f167dd0db0462397a8eba5fb7916b3b48671c60facd58d43d588b4bfe\": container with ID starting with df05572f167dd0db0462397a8eba5fb7916b3b48671c60facd58d43d588b4bfe not found: ID does not exist" containerID="df05572f167dd0db0462397a8eba5fb7916b3b48671c60facd58d43d588b4bfe" Mar 14 07:14:35 crc kubenswrapper[5129]: I0314 07:14:35.545586 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df05572f167dd0db0462397a8eba5fb7916b3b48671c60facd58d43d588b4bfe"} err="failed to get container status \"df05572f167dd0db0462397a8eba5fb7916b3b48671c60facd58d43d588b4bfe\": rpc error: code = NotFound desc = could not find container \"df05572f167dd0db0462397a8eba5fb7916b3b48671c60facd58d43d588b4bfe\": container with ID starting with df05572f167dd0db0462397a8eba5fb7916b3b48671c60facd58d43d588b4bfe not found: ID does not exist" Mar 14 07:14:35 crc kubenswrapper[5129]: I0314 07:14:35.545632 5129 scope.go:117] "RemoveContainer" containerID="f02d2b23fcb7a22617627b4ca2cc9c73129fffceaf946818ae90e8f4e146be9f" Mar 14 07:14:35 crc kubenswrapper[5129]: E0314 07:14:35.545932 5129 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"f02d2b23fcb7a22617627b4ca2cc9c73129fffceaf946818ae90e8f4e146be9f\": container with ID starting with f02d2b23fcb7a22617627b4ca2cc9c73129fffceaf946818ae90e8f4e146be9f not found: ID does not exist" containerID="f02d2b23fcb7a22617627b4ca2cc9c73129fffceaf946818ae90e8f4e146be9f" Mar 14 07:14:35 crc kubenswrapper[5129]: I0314 07:14:35.545969 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f02d2b23fcb7a22617627b4ca2cc9c73129fffceaf946818ae90e8f4e146be9f"} err="failed to get container status \"f02d2b23fcb7a22617627b4ca2cc9c73129fffceaf946818ae90e8f4e146be9f\": rpc error: code = NotFound desc = could not find container \"f02d2b23fcb7a22617627b4ca2cc9c73129fffceaf946818ae90e8f4e146be9f\": container with ID starting with f02d2b23fcb7a22617627b4ca2cc9c73129fffceaf946818ae90e8f4e146be9f not found: ID does not exist" Mar 14 07:14:36 crc kubenswrapper[5129]: I0314 07:14:36.041560 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec8767dd-5f70-4fe5-ba97-c82b72de2287" path="/var/lib/kubelet/pods/ec8767dd-5f70-4fe5-ba97-c82b72de2287/volumes" Mar 14 07:14:36 crc kubenswrapper[5129]: I0314 07:14:36.666028 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-wvhn6" Mar 14 07:14:36 crc kubenswrapper[5129]: I0314 07:14:36.805005 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q85t6\" (UniqueName: \"kubernetes.io/projected/26c62665-6770-4c8e-a229-aa9f971a7db1-kube-api-access-q85t6\") pod \"26c62665-6770-4c8e-a229-aa9f971a7db1\" (UID: \"26c62665-6770-4c8e-a229-aa9f971a7db1\") " Mar 14 07:14:36 crc kubenswrapper[5129]: I0314 07:14:36.805063 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/26c62665-6770-4c8e-a229-aa9f971a7db1-node-mnt\") pod \"26c62665-6770-4c8e-a229-aa9f971a7db1\" (UID: \"26c62665-6770-4c8e-a229-aa9f971a7db1\") " Mar 14 07:14:36 crc kubenswrapper[5129]: I0314 07:14:36.805117 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/26c62665-6770-4c8e-a229-aa9f971a7db1-crc-storage\") pod \"26c62665-6770-4c8e-a229-aa9f971a7db1\" (UID: \"26c62665-6770-4c8e-a229-aa9f971a7db1\") " Mar 14 07:14:36 crc kubenswrapper[5129]: I0314 07:14:36.805227 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26c62665-6770-4c8e-a229-aa9f971a7db1-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "26c62665-6770-4c8e-a229-aa9f971a7db1" (UID: "26c62665-6770-4c8e-a229-aa9f971a7db1"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:14:36 crc kubenswrapper[5129]: I0314 07:14:36.805563 5129 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/26c62665-6770-4c8e-a229-aa9f971a7db1-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:36 crc kubenswrapper[5129]: I0314 07:14:36.810296 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26c62665-6770-4c8e-a229-aa9f971a7db1-kube-api-access-q85t6" (OuterVolumeSpecName: "kube-api-access-q85t6") pod "26c62665-6770-4c8e-a229-aa9f971a7db1" (UID: "26c62665-6770-4c8e-a229-aa9f971a7db1"). InnerVolumeSpecName "kube-api-access-q85t6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:14:36 crc kubenswrapper[5129]: I0314 07:14:36.821194 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26c62665-6770-4c8e-a229-aa9f971a7db1-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "26c62665-6770-4c8e-a229-aa9f971a7db1" (UID: "26c62665-6770-4c8e-a229-aa9f971a7db1"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:14:36 crc kubenswrapper[5129]: I0314 07:14:36.906426 5129 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/26c62665-6770-4c8e-a229-aa9f971a7db1-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:36 crc kubenswrapper[5129]: I0314 07:14:36.906458 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q85t6\" (UniqueName: \"kubernetes.io/projected/26c62665-6770-4c8e-a229-aa9f971a7db1-kube-api-access-q85t6\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:37 crc kubenswrapper[5129]: I0314 07:14:37.442695 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wvhn6" event={"ID":"26c62665-6770-4c8e-a229-aa9f971a7db1","Type":"ContainerDied","Data":"cdb81d40c78ae534805e071778534b4ccc7d35571ba7f360c83444fba4ab48d0"} Mar 14 07:14:37 crc kubenswrapper[5129]: I0314 07:14:37.442735 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdb81d40c78ae534805e071778534b4ccc7d35571ba7f360c83444fba4ab48d0" Mar 14 07:14:37 crc kubenswrapper[5129]: I0314 07:14:37.442780 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-wvhn6" Mar 14 07:14:41 crc kubenswrapper[5129]: I0314 07:14:41.141009 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-69gmr" Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.520475 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b"] Mar 14 07:14:44 crc kubenswrapper[5129]: E0314 07:14:44.521921 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c62665-6770-4c8e-a229-aa9f971a7db1" containerName="storage" Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.521947 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c62665-6770-4c8e-a229-aa9f971a7db1" containerName="storage" Mar 14 07:14:44 crc kubenswrapper[5129]: E0314 07:14:44.521960 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8767dd-5f70-4fe5-ba97-c82b72de2287" containerName="extract-utilities" Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.521969 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8767dd-5f70-4fe5-ba97-c82b72de2287" containerName="extract-utilities" Mar 14 07:14:44 crc kubenswrapper[5129]: E0314 07:14:44.521991 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8767dd-5f70-4fe5-ba97-c82b72de2287" containerName="registry-server" Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.521999 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8767dd-5f70-4fe5-ba97-c82b72de2287" containerName="registry-server" Mar 14 07:14:44 crc kubenswrapper[5129]: E0314 07:14:44.522011 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8767dd-5f70-4fe5-ba97-c82b72de2287" containerName="extract-content" Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.522019 5129 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ec8767dd-5f70-4fe5-ba97-c82b72de2287" containerName="extract-content" Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.522130 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="26c62665-6770-4c8e-a229-aa9f971a7db1" containerName="storage" Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.522145 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec8767dd-5f70-4fe5-ba97-c82b72de2287" containerName="registry-server" Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.523049 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b" Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.529269 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.530516 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b"] Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.617179 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/196abf24-e768-4d07-ba20-6df2e2c3ec9f-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b\" (UID: \"196abf24-e768-4d07-ba20-6df2e2c3ec9f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b" Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.617824 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r547h\" (UniqueName: \"kubernetes.io/projected/196abf24-e768-4d07-ba20-6df2e2c3ec9f-kube-api-access-r547h\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b\" (UID: \"196abf24-e768-4d07-ba20-6df2e2c3ec9f\") " 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b" Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.617969 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/196abf24-e768-4d07-ba20-6df2e2c3ec9f-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b\" (UID: \"196abf24-e768-4d07-ba20-6df2e2c3ec9f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b" Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.719339 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r547h\" (UniqueName: \"kubernetes.io/projected/196abf24-e768-4d07-ba20-6df2e2c3ec9f-kube-api-access-r547h\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b\" (UID: \"196abf24-e768-4d07-ba20-6df2e2c3ec9f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b" Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.719576 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/196abf24-e768-4d07-ba20-6df2e2c3ec9f-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b\" (UID: \"196abf24-e768-4d07-ba20-6df2e2c3ec9f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b" Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.719852 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/196abf24-e768-4d07-ba20-6df2e2c3ec9f-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b\" (UID: \"196abf24-e768-4d07-ba20-6df2e2c3ec9f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b" Mar 14 07:14:44 crc kubenswrapper[5129]: 
I0314 07:14:44.720113 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/196abf24-e768-4d07-ba20-6df2e2c3ec9f-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b\" (UID: \"196abf24-e768-4d07-ba20-6df2e2c3ec9f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b" Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.720457 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/196abf24-e768-4d07-ba20-6df2e2c3ec9f-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b\" (UID: \"196abf24-e768-4d07-ba20-6df2e2c3ec9f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b" Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.741701 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r547h\" (UniqueName: \"kubernetes.io/projected/196abf24-e768-4d07-ba20-6df2e2c3ec9f-kube-api-access-r547h\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b\" (UID: \"196abf24-e768-4d07-ba20-6df2e2c3ec9f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b" Mar 14 07:14:44 crc kubenswrapper[5129]: I0314 07:14:44.838854 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b" Mar 14 07:14:45 crc kubenswrapper[5129]: I0314 07:14:45.049696 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b"] Mar 14 07:14:45 crc kubenswrapper[5129]: I0314 07:14:45.489219 5129 generic.go:334] "Generic (PLEG): container finished" podID="196abf24-e768-4d07-ba20-6df2e2c3ec9f" containerID="c34505af1198feb928998b672575600458d6b1e95d743fb9e3da2f8d38ad8f30" exitCode=0 Mar 14 07:14:45 crc kubenswrapper[5129]: I0314 07:14:45.489323 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b" event={"ID":"196abf24-e768-4d07-ba20-6df2e2c3ec9f","Type":"ContainerDied","Data":"c34505af1198feb928998b672575600458d6b1e95d743fb9e3da2f8d38ad8f30"} Mar 14 07:14:45 crc kubenswrapper[5129]: I0314 07:14:45.489553 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b" event={"ID":"196abf24-e768-4d07-ba20-6df2e2c3ec9f","Type":"ContainerStarted","Data":"eda7b92070ed74a2899b21349dc1c9e83d6d80f8343749f057435c88ab2550b4"} Mar 14 07:14:46 crc kubenswrapper[5129]: I0314 07:14:46.873959 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wkzfm"] Mar 14 07:14:46 crc kubenswrapper[5129]: I0314 07:14:46.876081 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wkzfm" Mar 14 07:14:46 crc kubenswrapper[5129]: I0314 07:14:46.884161 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wkzfm"] Mar 14 07:14:46 crc kubenswrapper[5129]: I0314 07:14:46.964968 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/affec354-64b8-4672-9b9d-80a51b04ada9-utilities\") pod \"redhat-operators-wkzfm\" (UID: \"affec354-64b8-4672-9b9d-80a51b04ada9\") " pod="openshift-marketplace/redhat-operators-wkzfm" Mar 14 07:14:46 crc kubenswrapper[5129]: I0314 07:14:46.965058 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bth9h\" (UniqueName: \"kubernetes.io/projected/affec354-64b8-4672-9b9d-80a51b04ada9-kube-api-access-bth9h\") pod \"redhat-operators-wkzfm\" (UID: \"affec354-64b8-4672-9b9d-80a51b04ada9\") " pod="openshift-marketplace/redhat-operators-wkzfm" Mar 14 07:14:46 crc kubenswrapper[5129]: I0314 07:14:46.965117 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/affec354-64b8-4672-9b9d-80a51b04ada9-catalog-content\") pod \"redhat-operators-wkzfm\" (UID: \"affec354-64b8-4672-9b9d-80a51b04ada9\") " pod="openshift-marketplace/redhat-operators-wkzfm" Mar 14 07:14:47 crc kubenswrapper[5129]: I0314 07:14:47.066617 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bth9h\" (UniqueName: \"kubernetes.io/projected/affec354-64b8-4672-9b9d-80a51b04ada9-kube-api-access-bth9h\") pod \"redhat-operators-wkzfm\" (UID: \"affec354-64b8-4672-9b9d-80a51b04ada9\") " pod="openshift-marketplace/redhat-operators-wkzfm" Mar 14 07:14:47 crc kubenswrapper[5129]: I0314 07:14:47.066678 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/affec354-64b8-4672-9b9d-80a51b04ada9-catalog-content\") pod \"redhat-operators-wkzfm\" (UID: \"affec354-64b8-4672-9b9d-80a51b04ada9\") " pod="openshift-marketplace/redhat-operators-wkzfm" Mar 14 07:14:47 crc kubenswrapper[5129]: I0314 07:14:47.066733 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/affec354-64b8-4672-9b9d-80a51b04ada9-utilities\") pod \"redhat-operators-wkzfm\" (UID: \"affec354-64b8-4672-9b9d-80a51b04ada9\") " pod="openshift-marketplace/redhat-operators-wkzfm" Mar 14 07:14:47 crc kubenswrapper[5129]: I0314 07:14:47.067206 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/affec354-64b8-4672-9b9d-80a51b04ada9-utilities\") pod \"redhat-operators-wkzfm\" (UID: \"affec354-64b8-4672-9b9d-80a51b04ada9\") " pod="openshift-marketplace/redhat-operators-wkzfm" Mar 14 07:14:47 crc kubenswrapper[5129]: I0314 07:14:47.067277 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/affec354-64b8-4672-9b9d-80a51b04ada9-catalog-content\") pod \"redhat-operators-wkzfm\" (UID: \"affec354-64b8-4672-9b9d-80a51b04ada9\") " pod="openshift-marketplace/redhat-operators-wkzfm" Mar 14 07:14:47 crc kubenswrapper[5129]: I0314 07:14:47.085510 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bth9h\" (UniqueName: \"kubernetes.io/projected/affec354-64b8-4672-9b9d-80a51b04ada9-kube-api-access-bth9h\") pod \"redhat-operators-wkzfm\" (UID: \"affec354-64b8-4672-9b9d-80a51b04ada9\") " pod="openshift-marketplace/redhat-operators-wkzfm" Mar 14 07:14:47 crc kubenswrapper[5129]: I0314 07:14:47.211524 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wkzfm" Mar 14 07:14:47 crc kubenswrapper[5129]: I0314 07:14:47.405123 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wkzfm"] Mar 14 07:14:47 crc kubenswrapper[5129]: W0314 07:14:47.414744 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaffec354_64b8_4672_9b9d_80a51b04ada9.slice/crio-52a8a73ce87d9dec3523291dac5189da2856a5c4eedb80e7cb832fd56cde1e67 WatchSource:0}: Error finding container 52a8a73ce87d9dec3523291dac5189da2856a5c4eedb80e7cb832fd56cde1e67: Status 404 returned error can't find the container with id 52a8a73ce87d9dec3523291dac5189da2856a5c4eedb80e7cb832fd56cde1e67 Mar 14 07:14:47 crc kubenswrapper[5129]: I0314 07:14:47.503919 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkzfm" event={"ID":"affec354-64b8-4672-9b9d-80a51b04ada9","Type":"ContainerStarted","Data":"52a8a73ce87d9dec3523291dac5189da2856a5c4eedb80e7cb832fd56cde1e67"} Mar 14 07:14:47 crc kubenswrapper[5129]: I0314 07:14:47.506152 5129 generic.go:334] "Generic (PLEG): container finished" podID="196abf24-e768-4d07-ba20-6df2e2c3ec9f" containerID="e27754f5e0edb520175e52a5b4a3e62f3be7ac60c92eb9d072843c9fd2ad98c8" exitCode=0 Mar 14 07:14:47 crc kubenswrapper[5129]: I0314 07:14:47.506192 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b" event={"ID":"196abf24-e768-4d07-ba20-6df2e2c3ec9f","Type":"ContainerDied","Data":"e27754f5e0edb520175e52a5b4a3e62f3be7ac60c92eb9d072843c9fd2ad98c8"} Mar 14 07:14:48 crc kubenswrapper[5129]: I0314 07:14:48.511878 5129 generic.go:334] "Generic (PLEG): container finished" podID="affec354-64b8-4672-9b9d-80a51b04ada9" containerID="c173a525f0af61c0bb5db92126d23089591b393cbb2aa74927f8c990b619ff15" exitCode=0 Mar 14 07:14:48 crc 
kubenswrapper[5129]: I0314 07:14:48.511925 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkzfm" event={"ID":"affec354-64b8-4672-9b9d-80a51b04ada9","Type":"ContainerDied","Data":"c173a525f0af61c0bb5db92126d23089591b393cbb2aa74927f8c990b619ff15"} Mar 14 07:14:48 crc kubenswrapper[5129]: I0314 07:14:48.516864 5129 generic.go:334] "Generic (PLEG): container finished" podID="196abf24-e768-4d07-ba20-6df2e2c3ec9f" containerID="1216b740571142ac3c70f53ceb96615aea2b34b5726e2f81455efa07007fa9ca" exitCode=0 Mar 14 07:14:48 crc kubenswrapper[5129]: I0314 07:14:48.516929 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b" event={"ID":"196abf24-e768-4d07-ba20-6df2e2c3ec9f","Type":"ContainerDied","Data":"1216b740571142ac3c70f53ceb96615aea2b34b5726e2f81455efa07007fa9ca"} Mar 14 07:14:49 crc kubenswrapper[5129]: I0314 07:14:49.527277 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkzfm" event={"ID":"affec354-64b8-4672-9b9d-80a51b04ada9","Type":"ContainerStarted","Data":"85e0f9d59e1ca6db38ae0e5ca65f8e980f4d6e50d01adca7b9997617ecefb3ae"} Mar 14 07:14:49 crc kubenswrapper[5129]: I0314 07:14:49.574761 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:14:49 crc kubenswrapper[5129]: I0314 07:14:49.574861 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:14:49 
crc kubenswrapper[5129]: I0314 07:14:49.836817 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b" Mar 14 07:14:50 crc kubenswrapper[5129]: I0314 07:14:50.003732 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/196abf24-e768-4d07-ba20-6df2e2c3ec9f-util\") pod \"196abf24-e768-4d07-ba20-6df2e2c3ec9f\" (UID: \"196abf24-e768-4d07-ba20-6df2e2c3ec9f\") " Mar 14 07:14:50 crc kubenswrapper[5129]: I0314 07:14:50.003805 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/196abf24-e768-4d07-ba20-6df2e2c3ec9f-bundle\") pod \"196abf24-e768-4d07-ba20-6df2e2c3ec9f\" (UID: \"196abf24-e768-4d07-ba20-6df2e2c3ec9f\") " Mar 14 07:14:50 crc kubenswrapper[5129]: I0314 07:14:50.003827 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r547h\" (UniqueName: \"kubernetes.io/projected/196abf24-e768-4d07-ba20-6df2e2c3ec9f-kube-api-access-r547h\") pod \"196abf24-e768-4d07-ba20-6df2e2c3ec9f\" (UID: \"196abf24-e768-4d07-ba20-6df2e2c3ec9f\") " Mar 14 07:14:50 crc kubenswrapper[5129]: I0314 07:14:50.004351 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/196abf24-e768-4d07-ba20-6df2e2c3ec9f-bundle" (OuterVolumeSpecName: "bundle") pod "196abf24-e768-4d07-ba20-6df2e2c3ec9f" (UID: "196abf24-e768-4d07-ba20-6df2e2c3ec9f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:14:50 crc kubenswrapper[5129]: I0314 07:14:50.010846 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/196abf24-e768-4d07-ba20-6df2e2c3ec9f-kube-api-access-r547h" (OuterVolumeSpecName: "kube-api-access-r547h") pod "196abf24-e768-4d07-ba20-6df2e2c3ec9f" (UID: "196abf24-e768-4d07-ba20-6df2e2c3ec9f"). InnerVolumeSpecName "kube-api-access-r547h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:14:50 crc kubenswrapper[5129]: I0314 07:14:50.019089 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/196abf24-e768-4d07-ba20-6df2e2c3ec9f-util" (OuterVolumeSpecName: "util") pod "196abf24-e768-4d07-ba20-6df2e2c3ec9f" (UID: "196abf24-e768-4d07-ba20-6df2e2c3ec9f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:14:50 crc kubenswrapper[5129]: I0314 07:14:50.105734 5129 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/196abf24-e768-4d07-ba20-6df2e2c3ec9f-util\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:50 crc kubenswrapper[5129]: I0314 07:14:50.105781 5129 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/196abf24-e768-4d07-ba20-6df2e2c3ec9f-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:50 crc kubenswrapper[5129]: I0314 07:14:50.105796 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r547h\" (UniqueName: \"kubernetes.io/projected/196abf24-e768-4d07-ba20-6df2e2c3ec9f-kube-api-access-r547h\") on node \"crc\" DevicePath \"\"" Mar 14 07:14:50 crc kubenswrapper[5129]: I0314 07:14:50.537971 5129 generic.go:334] "Generic (PLEG): container finished" podID="affec354-64b8-4672-9b9d-80a51b04ada9" containerID="85e0f9d59e1ca6db38ae0e5ca65f8e980f4d6e50d01adca7b9997617ecefb3ae" exitCode=0 Mar 14 07:14:50 crc 
kubenswrapper[5129]: I0314 07:14:50.538050 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkzfm" event={"ID":"affec354-64b8-4672-9b9d-80a51b04ada9","Type":"ContainerDied","Data":"85e0f9d59e1ca6db38ae0e5ca65f8e980f4d6e50d01adca7b9997617ecefb3ae"} Mar 14 07:14:50 crc kubenswrapper[5129]: I0314 07:14:50.544678 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b" Mar 14 07:14:50 crc kubenswrapper[5129]: I0314 07:14:50.544858 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b" event={"ID":"196abf24-e768-4d07-ba20-6df2e2c3ec9f","Type":"ContainerDied","Data":"eda7b92070ed74a2899b21349dc1c9e83d6d80f8343749f057435c88ab2550b4"} Mar 14 07:14:50 crc kubenswrapper[5129]: I0314 07:14:50.544950 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eda7b92070ed74a2899b21349dc1c9e83d6d80f8343749f057435c88ab2550b4" Mar 14 07:14:51 crc kubenswrapper[5129]: I0314 07:14:51.555316 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkzfm" event={"ID":"affec354-64b8-4672-9b9d-80a51b04ada9","Type":"ContainerStarted","Data":"0a0dc68396787f89e1d4d65975dcc826e8ad56343e3c006777e53c6232fa4f95"} Mar 14 07:14:51 crc kubenswrapper[5129]: I0314 07:14:51.577096 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wkzfm" podStartSLOduration=3.171032359 podStartE2EDuration="5.577071917s" podCreationTimestamp="2026-03-14 07:14:46 +0000 UTC" firstStartedPulling="2026-03-14 07:14:48.515727606 +0000 UTC m=+951.267642830" lastFinishedPulling="2026-03-14 07:14:50.921767164 +0000 UTC m=+953.673682388" observedRunningTime="2026-03-14 07:14:51.575678999 +0000 UTC m=+954.327594243" 
watchObservedRunningTime="2026-03-14 07:14:51.577071917 +0000 UTC m=+954.328987131" Mar 14 07:14:55 crc kubenswrapper[5129]: I0314 07:14:55.895702 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-zkslc"] Mar 14 07:14:55 crc kubenswrapper[5129]: E0314 07:14:55.895944 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="196abf24-e768-4d07-ba20-6df2e2c3ec9f" containerName="pull" Mar 14 07:14:55 crc kubenswrapper[5129]: I0314 07:14:55.895958 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="196abf24-e768-4d07-ba20-6df2e2c3ec9f" containerName="pull" Mar 14 07:14:55 crc kubenswrapper[5129]: E0314 07:14:55.895971 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="196abf24-e768-4d07-ba20-6df2e2c3ec9f" containerName="extract" Mar 14 07:14:55 crc kubenswrapper[5129]: I0314 07:14:55.895979 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="196abf24-e768-4d07-ba20-6df2e2c3ec9f" containerName="extract" Mar 14 07:14:55 crc kubenswrapper[5129]: E0314 07:14:55.896002 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="196abf24-e768-4d07-ba20-6df2e2c3ec9f" containerName="util" Mar 14 07:14:55 crc kubenswrapper[5129]: I0314 07:14:55.896011 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="196abf24-e768-4d07-ba20-6df2e2c3ec9f" containerName="util" Mar 14 07:14:55 crc kubenswrapper[5129]: I0314 07:14:55.896123 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="196abf24-e768-4d07-ba20-6df2e2c3ec9f" containerName="extract" Mar 14 07:14:55 crc kubenswrapper[5129]: I0314 07:14:55.896555 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zkslc" Mar 14 07:14:55 crc kubenswrapper[5129]: I0314 07:14:55.899733 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-wbm45" Mar 14 07:14:55 crc kubenswrapper[5129]: I0314 07:14:55.905381 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 14 07:14:55 crc kubenswrapper[5129]: I0314 07:14:55.905405 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 14 07:14:55 crc kubenswrapper[5129]: I0314 07:14:55.918915 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-zkslc"] Mar 14 07:14:56 crc kubenswrapper[5129]: I0314 07:14:56.006076 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vghjr\" (UniqueName: \"kubernetes.io/projected/4400f598-1fed-45cc-b987-ee2190cef8b4-kube-api-access-vghjr\") pod \"nmstate-operator-796d4cfff4-zkslc\" (UID: \"4400f598-1fed-45cc-b987-ee2190cef8b4\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-zkslc" Mar 14 07:14:56 crc kubenswrapper[5129]: I0314 07:14:56.107767 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vghjr\" (UniqueName: \"kubernetes.io/projected/4400f598-1fed-45cc-b987-ee2190cef8b4-kube-api-access-vghjr\") pod \"nmstate-operator-796d4cfff4-zkslc\" (UID: \"4400f598-1fed-45cc-b987-ee2190cef8b4\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-zkslc" Mar 14 07:14:56 crc kubenswrapper[5129]: I0314 07:14:56.137256 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vghjr\" (UniqueName: \"kubernetes.io/projected/4400f598-1fed-45cc-b987-ee2190cef8b4-kube-api-access-vghjr\") pod \"nmstate-operator-796d4cfff4-zkslc\" (UID: 
\"4400f598-1fed-45cc-b987-ee2190cef8b4\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-zkslc" Mar 14 07:14:56 crc kubenswrapper[5129]: I0314 07:14:56.233581 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zkslc" Mar 14 07:14:56 crc kubenswrapper[5129]: I0314 07:14:56.417198 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-zkslc"] Mar 14 07:14:56 crc kubenswrapper[5129]: I0314 07:14:56.609251 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zkslc" event={"ID":"4400f598-1fed-45cc-b987-ee2190cef8b4","Type":"ContainerStarted","Data":"b137f5002ad656f5db408021d1c41f8b68eeb41a61570275a742d41edee22708"} Mar 14 07:14:57 crc kubenswrapper[5129]: I0314 07:14:57.212146 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wkzfm" Mar 14 07:14:57 crc kubenswrapper[5129]: I0314 07:14:57.212192 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wkzfm" Mar 14 07:14:58 crc kubenswrapper[5129]: I0314 07:14:58.281858 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wkzfm" podUID="affec354-64b8-4672-9b9d-80a51b04ada9" containerName="registry-server" probeResult="failure" output=< Mar 14 07:14:58 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 07:14:58 crc kubenswrapper[5129]: > Mar 14 07:14:59 crc kubenswrapper[5129]: I0314 07:14:59.628833 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zkslc" event={"ID":"4400f598-1fed-45cc-b987-ee2190cef8b4","Type":"ContainerStarted","Data":"d5c6b38864aecbcbd21f78dce6e6eca083c9d38fab2f7ccaa1b10ea156fd722f"} Mar 14 07:14:59 crc kubenswrapper[5129]: I0314 07:14:59.652139 5129 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zkslc" podStartSLOduration=2.126490813 podStartE2EDuration="4.652114584s" podCreationTimestamp="2026-03-14 07:14:55 +0000 UTC" firstStartedPulling="2026-03-14 07:14:56.426459694 +0000 UTC m=+959.178374878" lastFinishedPulling="2026-03-14 07:14:58.952083465 +0000 UTC m=+961.703998649" observedRunningTime="2026-03-14 07:14:59.646944711 +0000 UTC m=+962.398859955" watchObservedRunningTime="2026-03-14 07:14:59.652114584 +0000 UTC m=+962.404029808" Mar 14 07:15:00 crc kubenswrapper[5129]: I0314 07:15:00.152844 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht"] Mar 14 07:15:00 crc kubenswrapper[5129]: I0314 07:15:00.153854 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht" Mar 14 07:15:00 crc kubenswrapper[5129]: I0314 07:15:00.156231 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 07:15:00 crc kubenswrapper[5129]: I0314 07:15:00.160068 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 07:15:00 crc kubenswrapper[5129]: I0314 07:15:00.168766 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht"] Mar 14 07:15:00 crc kubenswrapper[5129]: I0314 07:15:00.256240 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ec477a7-244e-4c14-a6b8-f7d09cb2777d-secret-volume\") pod \"collect-profiles-29557875-6nwht\" (UID: \"4ec477a7-244e-4c14-a6b8-f7d09cb2777d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht" Mar 14 07:15:00 crc kubenswrapper[5129]: I0314 07:15:00.256297 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhrxf\" (UniqueName: \"kubernetes.io/projected/4ec477a7-244e-4c14-a6b8-f7d09cb2777d-kube-api-access-xhrxf\") pod \"collect-profiles-29557875-6nwht\" (UID: \"4ec477a7-244e-4c14-a6b8-f7d09cb2777d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht" Mar 14 07:15:00 crc kubenswrapper[5129]: I0314 07:15:00.256467 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ec477a7-244e-4c14-a6b8-f7d09cb2777d-config-volume\") pod \"collect-profiles-29557875-6nwht\" (UID: \"4ec477a7-244e-4c14-a6b8-f7d09cb2777d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht" Mar 14 07:15:00 crc kubenswrapper[5129]: I0314 07:15:00.357297 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ec477a7-244e-4c14-a6b8-f7d09cb2777d-config-volume\") pod \"collect-profiles-29557875-6nwht\" (UID: \"4ec477a7-244e-4c14-a6b8-f7d09cb2777d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht" Mar 14 07:15:00 crc kubenswrapper[5129]: I0314 07:15:00.357370 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ec477a7-244e-4c14-a6b8-f7d09cb2777d-secret-volume\") pod \"collect-profiles-29557875-6nwht\" (UID: \"4ec477a7-244e-4c14-a6b8-f7d09cb2777d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht" Mar 14 07:15:00 crc kubenswrapper[5129]: I0314 07:15:00.357417 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhrxf\" (UniqueName: 
\"kubernetes.io/projected/4ec477a7-244e-4c14-a6b8-f7d09cb2777d-kube-api-access-xhrxf\") pod \"collect-profiles-29557875-6nwht\" (UID: \"4ec477a7-244e-4c14-a6b8-f7d09cb2777d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht" Mar 14 07:15:00 crc kubenswrapper[5129]: I0314 07:15:00.358299 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ec477a7-244e-4c14-a6b8-f7d09cb2777d-config-volume\") pod \"collect-profiles-29557875-6nwht\" (UID: \"4ec477a7-244e-4c14-a6b8-f7d09cb2777d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht" Mar 14 07:15:00 crc kubenswrapper[5129]: I0314 07:15:00.364161 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ec477a7-244e-4c14-a6b8-f7d09cb2777d-secret-volume\") pod \"collect-profiles-29557875-6nwht\" (UID: \"4ec477a7-244e-4c14-a6b8-f7d09cb2777d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht" Mar 14 07:15:00 crc kubenswrapper[5129]: I0314 07:15:00.377020 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhrxf\" (UniqueName: \"kubernetes.io/projected/4ec477a7-244e-4c14-a6b8-f7d09cb2777d-kube-api-access-xhrxf\") pod \"collect-profiles-29557875-6nwht\" (UID: \"4ec477a7-244e-4c14-a6b8-f7d09cb2777d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht" Mar 14 07:15:00 crc kubenswrapper[5129]: I0314 07:15:00.472841 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht" Mar 14 07:15:00 crc kubenswrapper[5129]: I0314 07:15:00.888090 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht"] Mar 14 07:15:00 crc kubenswrapper[5129]: W0314 07:15:00.894975 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ec477a7_244e_4c14_a6b8_f7d09cb2777d.slice/crio-aac244393c32719d26629563832971e0ea04e30eb1a886d32a7d91d9b54e5dd4 WatchSource:0}: Error finding container aac244393c32719d26629563832971e0ea04e30eb1a886d32a7d91d9b54e5dd4: Status 404 returned error can't find the container with id aac244393c32719d26629563832971e0ea04e30eb1a886d32a7d91d9b54e5dd4 Mar 14 07:15:01 crc kubenswrapper[5129]: I0314 07:15:01.642243 5129 generic.go:334] "Generic (PLEG): container finished" podID="4ec477a7-244e-4c14-a6b8-f7d09cb2777d" containerID="26ca9429c6b3fc835f9f8dfd6e7651b540f2cf04b5223bc72de223a735b3200b" exitCode=0 Mar 14 07:15:01 crc kubenswrapper[5129]: I0314 07:15:01.642323 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht" event={"ID":"4ec477a7-244e-4c14-a6b8-f7d09cb2777d","Type":"ContainerDied","Data":"26ca9429c6b3fc835f9f8dfd6e7651b540f2cf04b5223bc72de223a735b3200b"} Mar 14 07:15:01 crc kubenswrapper[5129]: I0314 07:15:01.642810 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht" event={"ID":"4ec477a7-244e-4c14-a6b8-f7d09cb2777d","Type":"ContainerStarted","Data":"aac244393c32719d26629563832971e0ea04e30eb1a886d32a7d91d9b54e5dd4"} Mar 14 07:15:02 crc kubenswrapper[5129]: I0314 07:15:02.880398 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht" Mar 14 07:15:02 crc kubenswrapper[5129]: I0314 07:15:02.905973 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ec477a7-244e-4c14-a6b8-f7d09cb2777d-config-volume\") pod \"4ec477a7-244e-4c14-a6b8-f7d09cb2777d\" (UID: \"4ec477a7-244e-4c14-a6b8-f7d09cb2777d\") " Mar 14 07:15:02 crc kubenswrapper[5129]: I0314 07:15:02.906164 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhrxf\" (UniqueName: \"kubernetes.io/projected/4ec477a7-244e-4c14-a6b8-f7d09cb2777d-kube-api-access-xhrxf\") pod \"4ec477a7-244e-4c14-a6b8-f7d09cb2777d\" (UID: \"4ec477a7-244e-4c14-a6b8-f7d09cb2777d\") " Mar 14 07:15:02 crc kubenswrapper[5129]: I0314 07:15:02.906194 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ec477a7-244e-4c14-a6b8-f7d09cb2777d-secret-volume\") pod \"4ec477a7-244e-4c14-a6b8-f7d09cb2777d\" (UID: \"4ec477a7-244e-4c14-a6b8-f7d09cb2777d\") " Mar 14 07:15:02 crc kubenswrapper[5129]: I0314 07:15:02.906928 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ec477a7-244e-4c14-a6b8-f7d09cb2777d-config-volume" (OuterVolumeSpecName: "config-volume") pod "4ec477a7-244e-4c14-a6b8-f7d09cb2777d" (UID: "4ec477a7-244e-4c14-a6b8-f7d09cb2777d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:15:02 crc kubenswrapper[5129]: I0314 07:15:02.912545 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ec477a7-244e-4c14-a6b8-f7d09cb2777d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4ec477a7-244e-4c14-a6b8-f7d09cb2777d" (UID: "4ec477a7-244e-4c14-a6b8-f7d09cb2777d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:15:02 crc kubenswrapper[5129]: I0314 07:15:02.913726 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ec477a7-244e-4c14-a6b8-f7d09cb2777d-kube-api-access-xhrxf" (OuterVolumeSpecName: "kube-api-access-xhrxf") pod "4ec477a7-244e-4c14-a6b8-f7d09cb2777d" (UID: "4ec477a7-244e-4c14-a6b8-f7d09cb2777d"). InnerVolumeSpecName "kube-api-access-xhrxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:15:03 crc kubenswrapper[5129]: I0314 07:15:03.007731 5129 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ec477a7-244e-4c14-a6b8-f7d09cb2777d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:03 crc kubenswrapper[5129]: I0314 07:15:03.007775 5129 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ec477a7-244e-4c14-a6b8-f7d09cb2777d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:03 crc kubenswrapper[5129]: I0314 07:15:03.007786 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhrxf\" (UniqueName: \"kubernetes.io/projected/4ec477a7-244e-4c14-a6b8-f7d09cb2777d-kube-api-access-xhrxf\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:03 crc kubenswrapper[5129]: I0314 07:15:03.658133 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht" event={"ID":"4ec477a7-244e-4c14-a6b8-f7d09cb2777d","Type":"ContainerDied","Data":"aac244393c32719d26629563832971e0ea04e30eb1a886d32a7d91d9b54e5dd4"} Mar 14 07:15:03 crc kubenswrapper[5129]: I0314 07:15:03.658481 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aac244393c32719d26629563832971e0ea04e30eb1a886d32a7d91d9b54e5dd4" Mar 14 07:15:03 crc kubenswrapper[5129]: I0314 07:15:03.658332 5129 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.677869 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-tqfq9"] Mar 14 07:15:04 crc kubenswrapper[5129]: E0314 07:15:04.678191 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ec477a7-244e-4c14-a6b8-f7d09cb2777d" containerName="collect-profiles" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.678210 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec477a7-244e-4c14-a6b8-f7d09cb2777d" containerName="collect-profiles" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.678385 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ec477a7-244e-4c14-a6b8-f7d09cb2777d" containerName="collect-profiles" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.679244 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tqfq9" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.681584 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-zx7tt" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.684653 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-4rmxr"] Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.685754 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-4rmxr" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.687762 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.689162 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-tqfq9"] Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.714946 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-4rmxr"] Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.732115 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-jlwnk"] Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.732976 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-jlwnk" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.799041 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-zpxkf"] Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.799633 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zpxkf" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.803303 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.805224 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-nww4n" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.806370 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.820839 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-zpxkf"] Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.829891 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ea3fee1e-d109-4b58-86a8-919ace67ad6a-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-4rmxr\" (UID: \"ea3fee1e-d109-4b58-86a8-919ace67ad6a\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-4rmxr" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.829983 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nlpj\" (UniqueName: \"kubernetes.io/projected/ea3fee1e-d109-4b58-86a8-919ace67ad6a-kube-api-access-4nlpj\") pod \"nmstate-webhook-5f558f5558-4rmxr\" (UID: \"ea3fee1e-d109-4b58-86a8-919ace67ad6a\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-4rmxr" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.830361 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nh8j\" (UniqueName: \"kubernetes.io/projected/684cf0e0-3f2e-4904-a9a3-257015dd0f03-kube-api-access-5nh8j\") pod \"nmstate-metrics-9b8c8685d-tqfq9\" (UID: 
\"684cf0e0-3f2e-4904-a9a3-257015dd0f03\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tqfq9" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.931998 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkcq2\" (UniqueName: \"kubernetes.io/projected/1a4817ce-3ee7-4738-a1f2-5ae751ad564f-kube-api-access-fkcq2\") pod \"nmstate-console-plugin-86f58fcf4-zpxkf\" (UID: \"1a4817ce-3ee7-4738-a1f2-5ae751ad564f\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zpxkf" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.932411 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1a4817ce-3ee7-4738-a1f2-5ae751ad564f-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-zpxkf\" (UID: \"1a4817ce-3ee7-4738-a1f2-5ae751ad564f\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zpxkf" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.932448 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nlpj\" (UniqueName: \"kubernetes.io/projected/ea3fee1e-d109-4b58-86a8-919ace67ad6a-kube-api-access-4nlpj\") pod \"nmstate-webhook-5f558f5558-4rmxr\" (UID: \"ea3fee1e-d109-4b58-86a8-919ace67ad6a\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-4rmxr" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.932485 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nh8j\" (UniqueName: \"kubernetes.io/projected/684cf0e0-3f2e-4904-a9a3-257015dd0f03-kube-api-access-5nh8j\") pod \"nmstate-metrics-9b8c8685d-tqfq9\" (UID: \"684cf0e0-3f2e-4904-a9a3-257015dd0f03\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tqfq9" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.932519 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/ea3fee1e-d109-4b58-86a8-919ace67ad6a-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-4rmxr\" (UID: \"ea3fee1e-d109-4b58-86a8-919ace67ad6a\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-4rmxr" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.932549 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3-ovs-socket\") pod \"nmstate-handler-jlwnk\" (UID: \"d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3\") " pod="openshift-nmstate/nmstate-handler-jlwnk" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.932569 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b77cq\" (UniqueName: \"kubernetes.io/projected/d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3-kube-api-access-b77cq\") pod \"nmstate-handler-jlwnk\" (UID: \"d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3\") " pod="openshift-nmstate/nmstate-handler-jlwnk" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.932598 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a4817ce-3ee7-4738-a1f2-5ae751ad564f-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-zpxkf\" (UID: \"1a4817ce-3ee7-4738-a1f2-5ae751ad564f\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zpxkf" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.932639 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3-dbus-socket\") pod \"nmstate-handler-jlwnk\" (UID: \"d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3\") " pod="openshift-nmstate/nmstate-handler-jlwnk" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.932675 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3-nmstate-lock\") pod \"nmstate-handler-jlwnk\" (UID: \"d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3\") " pod="openshift-nmstate/nmstate-handler-jlwnk" Mar 14 07:15:04 crc kubenswrapper[5129]: E0314 07:15:04.932700 5129 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 14 07:15:04 crc kubenswrapper[5129]: E0314 07:15:04.932793 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea3fee1e-d109-4b58-86a8-919ace67ad6a-tls-key-pair podName:ea3fee1e-d109-4b58-86a8-919ace67ad6a nodeName:}" failed. No retries permitted until 2026-03-14 07:15:05.432772425 +0000 UTC m=+968.184687689 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/ea3fee1e-d109-4b58-86a8-919ace67ad6a-tls-key-pair") pod "nmstate-webhook-5f558f5558-4rmxr" (UID: "ea3fee1e-d109-4b58-86a8-919ace67ad6a") : secret "openshift-nmstate-webhook" not found Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.951259 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nh8j\" (UniqueName: \"kubernetes.io/projected/684cf0e0-3f2e-4904-a9a3-257015dd0f03-kube-api-access-5nh8j\") pod \"nmstate-metrics-9b8c8685d-tqfq9\" (UID: \"684cf0e0-3f2e-4904-a9a3-257015dd0f03\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tqfq9" Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.955566 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nlpj\" (UniqueName: \"kubernetes.io/projected/ea3fee1e-d109-4b58-86a8-919ace67ad6a-kube-api-access-4nlpj\") pod \"nmstate-webhook-5f558f5558-4rmxr\" (UID: \"ea3fee1e-d109-4b58-86a8-919ace67ad6a\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-4rmxr" Mar 14 07:15:04 crc 
kubenswrapper[5129]: I0314 07:15:04.989649 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-89d4b6dfb-8d5f5"] Mar 14 07:15:04 crc kubenswrapper[5129]: I0314 07:15:04.990260 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.002293 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-89d4b6dfb-8d5f5"] Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.006338 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tqfq9" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.033215 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3-ovs-socket\") pod \"nmstate-handler-jlwnk\" (UID: \"d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3\") " pod="openshift-nmstate/nmstate-handler-jlwnk" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.033266 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b77cq\" (UniqueName: \"kubernetes.io/projected/d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3-kube-api-access-b77cq\") pod \"nmstate-handler-jlwnk\" (UID: \"d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3\") " pod="openshift-nmstate/nmstate-handler-jlwnk" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.033287 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3-ovs-socket\") pod \"nmstate-handler-jlwnk\" (UID: \"d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3\") " pod="openshift-nmstate/nmstate-handler-jlwnk" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.033305 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1a4817ce-3ee7-4738-a1f2-5ae751ad564f-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-zpxkf\" (UID: \"1a4817ce-3ee7-4738-a1f2-5ae751ad564f\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zpxkf" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.033341 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3-dbus-socket\") pod \"nmstate-handler-jlwnk\" (UID: \"d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3\") " pod="openshift-nmstate/nmstate-handler-jlwnk" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.033379 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1354fbf5-3bcb-481a-bd7b-5808b22f34da-service-ca\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: E0314 07:15:05.033387 5129 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.033400 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1354fbf5-3bcb-481a-bd7b-5808b22f34da-trusted-ca-bundle\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: E0314 07:15:05.033440 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a4817ce-3ee7-4738-a1f2-5ae751ad564f-plugin-serving-cert podName:1a4817ce-3ee7-4738-a1f2-5ae751ad564f nodeName:}" failed. 
No retries permitted until 2026-03-14 07:15:05.533419506 +0000 UTC m=+968.285334690 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/1a4817ce-3ee7-4738-a1f2-5ae751ad564f-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-zpxkf" (UID: "1a4817ce-3ee7-4738-a1f2-5ae751ad564f") : secret "plugin-serving-cert" not found Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.033458 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3-nmstate-lock\") pod \"nmstate-handler-jlwnk\" (UID: \"d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3\") " pod="openshift-nmstate/nmstate-handler-jlwnk" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.033486 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1354fbf5-3bcb-481a-bd7b-5808b22f34da-oauth-serving-cert\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.033507 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkcq2\" (UniqueName: \"kubernetes.io/projected/1a4817ce-3ee7-4738-a1f2-5ae751ad564f-kube-api-access-fkcq2\") pod \"nmstate-console-plugin-86f58fcf4-zpxkf\" (UID: \"1a4817ce-3ee7-4738-a1f2-5ae751ad564f\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zpxkf" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.033530 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpjlv\" (UniqueName: \"kubernetes.io/projected/1354fbf5-3bcb-481a-bd7b-5808b22f34da-kube-api-access-kpjlv\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " 
pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.033567 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1354fbf5-3bcb-481a-bd7b-5808b22f34da-console-serving-cert\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.033586 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1354fbf5-3bcb-481a-bd7b-5808b22f34da-console-oauth-config\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.033630 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3-dbus-socket\") pod \"nmstate-handler-jlwnk\" (UID: \"d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3\") " pod="openshift-nmstate/nmstate-handler-jlwnk" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.033865 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3-nmstate-lock\") pod \"nmstate-handler-jlwnk\" (UID: \"d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3\") " pod="openshift-nmstate/nmstate-handler-jlwnk" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.033964 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1a4817ce-3ee7-4738-a1f2-5ae751ad564f-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-zpxkf\" (UID: \"1a4817ce-3ee7-4738-a1f2-5ae751ad564f\") " 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zpxkf" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.034006 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1354fbf5-3bcb-481a-bd7b-5808b22f34da-console-config\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.034994 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1a4817ce-3ee7-4738-a1f2-5ae751ad564f-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-zpxkf\" (UID: \"1a4817ce-3ee7-4738-a1f2-5ae751ad564f\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zpxkf" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.052456 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkcq2\" (UniqueName: \"kubernetes.io/projected/1a4817ce-3ee7-4738-a1f2-5ae751ad564f-kube-api-access-fkcq2\") pod \"nmstate-console-plugin-86f58fcf4-zpxkf\" (UID: \"1a4817ce-3ee7-4738-a1f2-5ae751ad564f\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zpxkf" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.055451 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b77cq\" (UniqueName: \"kubernetes.io/projected/d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3-kube-api-access-b77cq\") pod \"nmstate-handler-jlwnk\" (UID: \"d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3\") " pod="openshift-nmstate/nmstate-handler-jlwnk" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.134455 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1354fbf5-3bcb-481a-bd7b-5808b22f34da-console-config\") pod \"console-89d4b6dfb-8d5f5\" (UID: 
\"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.134633 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1354fbf5-3bcb-481a-bd7b-5808b22f34da-trusted-ca-bundle\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.134657 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1354fbf5-3bcb-481a-bd7b-5808b22f34da-service-ca\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.134687 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1354fbf5-3bcb-481a-bd7b-5808b22f34da-oauth-serving-cert\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.134719 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpjlv\" (UniqueName: \"kubernetes.io/projected/1354fbf5-3bcb-481a-bd7b-5808b22f34da-kube-api-access-kpjlv\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.134769 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1354fbf5-3bcb-481a-bd7b-5808b22f34da-console-serving-cert\") pod \"console-89d4b6dfb-8d5f5\" (UID: 
\"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.134791 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1354fbf5-3bcb-481a-bd7b-5808b22f34da-console-oauth-config\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.136327 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1354fbf5-3bcb-481a-bd7b-5808b22f34da-oauth-serving-cert\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.136447 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1354fbf5-3bcb-481a-bd7b-5808b22f34da-service-ca\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.137038 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1354fbf5-3bcb-481a-bd7b-5808b22f34da-trusted-ca-bundle\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.138036 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1354fbf5-3bcb-481a-bd7b-5808b22f34da-console-config\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " 
pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.140615 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1354fbf5-3bcb-481a-bd7b-5808b22f34da-console-serving-cert\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.141003 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1354fbf5-3bcb-481a-bd7b-5808b22f34da-console-oauth-config\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.150416 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpjlv\" (UniqueName: \"kubernetes.io/projected/1354fbf5-3bcb-481a-bd7b-5808b22f34da-kube-api-access-kpjlv\") pod \"console-89d4b6dfb-8d5f5\" (UID: \"1354fbf5-3bcb-481a-bd7b-5808b22f34da\") " pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.322114 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.347806 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-jlwnk" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.399710 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-tqfq9"] Mar 14 07:15:05 crc kubenswrapper[5129]: W0314 07:15:05.418816 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod684cf0e0_3f2e_4904_a9a3_257015dd0f03.slice/crio-9ca3cafc5c31fdb089b4cc005a81ed100edf5694d712a19acbb0e2ae0acd2c8b WatchSource:0}: Error finding container 9ca3cafc5c31fdb089b4cc005a81ed100edf5694d712a19acbb0e2ae0acd2c8b: Status 404 returned error can't find the container with id 9ca3cafc5c31fdb089b4cc005a81ed100edf5694d712a19acbb0e2ae0acd2c8b Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.437899 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ea3fee1e-d109-4b58-86a8-919ace67ad6a-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-4rmxr\" (UID: \"ea3fee1e-d109-4b58-86a8-919ace67ad6a\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-4rmxr" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.443648 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ea3fee1e-d109-4b58-86a8-919ace67ad6a-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-4rmxr\" (UID: \"ea3fee1e-d109-4b58-86a8-919ace67ad6a\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-4rmxr" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.538952 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a4817ce-3ee7-4738-a1f2-5ae751ad564f-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-zpxkf\" (UID: \"1a4817ce-3ee7-4738-a1f2-5ae751ad564f\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zpxkf" 
Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.542288 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a4817ce-3ee7-4738-a1f2-5ae751ad564f-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-zpxkf\" (UID: \"1a4817ce-3ee7-4738-a1f2-5ae751ad564f\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zpxkf" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.615994 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-4rmxr" Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.674340 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jlwnk" event={"ID":"d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3","Type":"ContainerStarted","Data":"b72da6600c3fe156d9ebda98128154d0838d12dc09eef8090fdb706a4ab370bc"} Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.677003 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tqfq9" event={"ID":"684cf0e0-3f2e-4904-a9a3-257015dd0f03","Type":"ContainerStarted","Data":"9ca3cafc5c31fdb089b4cc005a81ed100edf5694d712a19acbb0e2ae0acd2c8b"} Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.713269 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-89d4b6dfb-8d5f5"] Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.714944 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zpxkf" Mar 14 07:15:05 crc kubenswrapper[5129]: W0314 07:15:05.729418 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1354fbf5_3bcb_481a_bd7b_5808b22f34da.slice/crio-75f5b0e7b03f100d084e1d551c97f294886f0749c5b30bca9cae47207ba71c10 WatchSource:0}: Error finding container 75f5b0e7b03f100d084e1d551c97f294886f0749c5b30bca9cae47207ba71c10: Status 404 returned error can't find the container with id 75f5b0e7b03f100d084e1d551c97f294886f0749c5b30bca9cae47207ba71c10 Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.800340 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-4rmxr"] Mar 14 07:15:05 crc kubenswrapper[5129]: W0314 07:15:05.818321 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea3fee1e_d109_4b58_86a8_919ace67ad6a.slice/crio-1beb75a375c0e89d3de28ecd90679cb4ea20968d928fe415f18c6f9ffc35cea1 WatchSource:0}: Error finding container 1beb75a375c0e89d3de28ecd90679cb4ea20968d928fe415f18c6f9ffc35cea1: Status 404 returned error can't find the container with id 1beb75a375c0e89d3de28ecd90679cb4ea20968d928fe415f18c6f9ffc35cea1 Mar 14 07:15:05 crc kubenswrapper[5129]: I0314 07:15:05.923386 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-zpxkf"] Mar 14 07:15:05 crc kubenswrapper[5129]: W0314 07:15:05.927989 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a4817ce_3ee7_4738_a1f2_5ae751ad564f.slice/crio-89edce784a39b769e56b94f170d0894b0d8606a901dff1848042386ba5275c27 WatchSource:0}: Error finding container 89edce784a39b769e56b94f170d0894b0d8606a901dff1848042386ba5275c27: Status 404 returned error can't find the container with id 
89edce784a39b769e56b94f170d0894b0d8606a901dff1848042386ba5275c27 Mar 14 07:15:06 crc kubenswrapper[5129]: I0314 07:15:06.682768 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-4rmxr" event={"ID":"ea3fee1e-d109-4b58-86a8-919ace67ad6a","Type":"ContainerStarted","Data":"1beb75a375c0e89d3de28ecd90679cb4ea20968d928fe415f18c6f9ffc35cea1"} Mar 14 07:15:06 crc kubenswrapper[5129]: I0314 07:15:06.684054 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zpxkf" event={"ID":"1a4817ce-3ee7-4738-a1f2-5ae751ad564f","Type":"ContainerStarted","Data":"89edce784a39b769e56b94f170d0894b0d8606a901dff1848042386ba5275c27"} Mar 14 07:15:06 crc kubenswrapper[5129]: I0314 07:15:06.685538 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-89d4b6dfb-8d5f5" event={"ID":"1354fbf5-3bcb-481a-bd7b-5808b22f34da","Type":"ContainerStarted","Data":"26b5d84ebad35df833d98f309bb40a5729cf5b050285a79bb71aa42ca9aea157"} Mar 14 07:15:06 crc kubenswrapper[5129]: I0314 07:15:06.685569 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-89d4b6dfb-8d5f5" event={"ID":"1354fbf5-3bcb-481a-bd7b-5808b22f34da","Type":"ContainerStarted","Data":"75f5b0e7b03f100d084e1d551c97f294886f0749c5b30bca9cae47207ba71c10"} Mar 14 07:15:06 crc kubenswrapper[5129]: I0314 07:15:06.717984 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-89d4b6dfb-8d5f5" podStartSLOduration=2.717960862 podStartE2EDuration="2.717960862s" podCreationTimestamp="2026-03-14 07:15:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:15:06.712992298 +0000 UTC m=+969.464907512" watchObservedRunningTime="2026-03-14 07:15:06.717960862 +0000 UTC m=+969.469876036" Mar 14 07:15:07 crc kubenswrapper[5129]: I0314 07:15:07.267953 5129 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wkzfm" Mar 14 07:15:07 crc kubenswrapper[5129]: I0314 07:15:07.316755 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wkzfm" Mar 14 07:15:07 crc kubenswrapper[5129]: I0314 07:15:07.492576 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wkzfm"] Mar 14 07:15:08 crc kubenswrapper[5129]: I0314 07:15:08.706187 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tqfq9" event={"ID":"684cf0e0-3f2e-4904-a9a3-257015dd0f03","Type":"ContainerStarted","Data":"dfe80a80593f0eb85e12574dfa9432103b1473dba273e0f96ea91faf4258311e"} Mar 14 07:15:08 crc kubenswrapper[5129]: I0314 07:15:08.709171 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jlwnk" event={"ID":"d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3","Type":"ContainerStarted","Data":"1e4a6c01156d85b1e8df88eca47cdf7b73344f92ad78c085642e489c062e8215"} Mar 14 07:15:08 crc kubenswrapper[5129]: I0314 07:15:08.709789 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-jlwnk" Mar 14 07:15:08 crc kubenswrapper[5129]: I0314 07:15:08.711653 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-4rmxr" event={"ID":"ea3fee1e-d109-4b58-86a8-919ace67ad6a","Type":"ContainerStarted","Data":"19f8e9122ae15f6e627c2497f0ebd05931259516c8ae8fcbc9766ad5e36c894b"} Mar 14 07:15:08 crc kubenswrapper[5129]: I0314 07:15:08.711744 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-4rmxr" Mar 14 07:15:08 crc kubenswrapper[5129]: I0314 07:15:08.714223 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zpxkf" 
event={"ID":"1a4817ce-3ee7-4738-a1f2-5ae751ad564f","Type":"ContainerStarted","Data":"519d830de14acd27f694ba82bc16eec939adcf553b30a048c93f695d193ae326"} Mar 14 07:15:08 crc kubenswrapper[5129]: I0314 07:15:08.714300 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wkzfm" podUID="affec354-64b8-4672-9b9d-80a51b04ada9" containerName="registry-server" containerID="cri-o://0a0dc68396787f89e1d4d65975dcc826e8ad56343e3c006777e53c6232fa4f95" gracePeriod=2 Mar 14 07:15:08 crc kubenswrapper[5129]: I0314 07:15:08.730676 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-jlwnk" podStartSLOduration=1.827422398 podStartE2EDuration="4.730651756s" podCreationTimestamp="2026-03-14 07:15:04 +0000 UTC" firstStartedPulling="2026-03-14 07:15:05.369567882 +0000 UTC m=+968.121483066" lastFinishedPulling="2026-03-14 07:15:08.27279724 +0000 UTC m=+971.024712424" observedRunningTime="2026-03-14 07:15:08.725748094 +0000 UTC m=+971.477663328" watchObservedRunningTime="2026-03-14 07:15:08.730651756 +0000 UTC m=+971.482566980" Mar 14 07:15:08 crc kubenswrapper[5129]: I0314 07:15:08.742972 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-4rmxr" podStartSLOduration=2.293328066 podStartE2EDuration="4.742944233s" podCreationTimestamp="2026-03-14 07:15:04 +0000 UTC" firstStartedPulling="2026-03-14 07:15:05.82305271 +0000 UTC m=+968.574967894" lastFinishedPulling="2026-03-14 07:15:08.272668877 +0000 UTC m=+971.024584061" observedRunningTime="2026-03-14 07:15:08.742079632 +0000 UTC m=+971.493994856" watchObservedRunningTime="2026-03-14 07:15:08.742944233 +0000 UTC m=+971.494859437" Mar 14 07:15:08 crc kubenswrapper[5129]: I0314 07:15:08.776827 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zpxkf" 
podStartSLOduration=2.4318383089999998 podStartE2EDuration="4.776791099s" podCreationTimestamp="2026-03-14 07:15:04 +0000 UTC" firstStartedPulling="2026-03-14 07:15:05.93024183 +0000 UTC m=+968.682157014" lastFinishedPulling="2026-03-14 07:15:08.27519462 +0000 UTC m=+971.027109804" observedRunningTime="2026-03-14 07:15:08.770301187 +0000 UTC m=+971.522216381" watchObservedRunningTime="2026-03-14 07:15:08.776791099 +0000 UTC m=+971.528706323" Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.073847 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wkzfm" Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.187991 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bth9h\" (UniqueName: \"kubernetes.io/projected/affec354-64b8-4672-9b9d-80a51b04ada9-kube-api-access-bth9h\") pod \"affec354-64b8-4672-9b9d-80a51b04ada9\" (UID: \"affec354-64b8-4672-9b9d-80a51b04ada9\") " Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.188096 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/affec354-64b8-4672-9b9d-80a51b04ada9-catalog-content\") pod \"affec354-64b8-4672-9b9d-80a51b04ada9\" (UID: \"affec354-64b8-4672-9b9d-80a51b04ada9\") " Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.188163 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/affec354-64b8-4672-9b9d-80a51b04ada9-utilities\") pod \"affec354-64b8-4672-9b9d-80a51b04ada9\" (UID: \"affec354-64b8-4672-9b9d-80a51b04ada9\") " Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.189347 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/affec354-64b8-4672-9b9d-80a51b04ada9-utilities" (OuterVolumeSpecName: "utilities") pod 
"affec354-64b8-4672-9b9d-80a51b04ada9" (UID: "affec354-64b8-4672-9b9d-80a51b04ada9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.193938 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/affec354-64b8-4672-9b9d-80a51b04ada9-kube-api-access-bth9h" (OuterVolumeSpecName: "kube-api-access-bth9h") pod "affec354-64b8-4672-9b9d-80a51b04ada9" (UID: "affec354-64b8-4672-9b9d-80a51b04ada9"). InnerVolumeSpecName "kube-api-access-bth9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.289379 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bth9h\" (UniqueName: \"kubernetes.io/projected/affec354-64b8-4672-9b9d-80a51b04ada9-kube-api-access-bth9h\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.289424 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/affec354-64b8-4672-9b9d-80a51b04ada9-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.306526 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/affec354-64b8-4672-9b9d-80a51b04ada9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "affec354-64b8-4672-9b9d-80a51b04ada9" (UID: "affec354-64b8-4672-9b9d-80a51b04ada9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.391269 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/affec354-64b8-4672-9b9d-80a51b04ada9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.726294 5129 generic.go:334] "Generic (PLEG): container finished" podID="affec354-64b8-4672-9b9d-80a51b04ada9" containerID="0a0dc68396787f89e1d4d65975dcc826e8ad56343e3c006777e53c6232fa4f95" exitCode=0 Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.726418 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wkzfm" Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.726491 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkzfm" event={"ID":"affec354-64b8-4672-9b9d-80a51b04ada9","Type":"ContainerDied","Data":"0a0dc68396787f89e1d4d65975dcc826e8ad56343e3c006777e53c6232fa4f95"} Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.726527 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkzfm" event={"ID":"affec354-64b8-4672-9b9d-80a51b04ada9","Type":"ContainerDied","Data":"52a8a73ce87d9dec3523291dac5189da2856a5c4eedb80e7cb832fd56cde1e67"} Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.726573 5129 scope.go:117] "RemoveContainer" containerID="0a0dc68396787f89e1d4d65975dcc826e8ad56343e3c006777e53c6232fa4f95" Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.756208 5129 scope.go:117] "RemoveContainer" containerID="85e0f9d59e1ca6db38ae0e5ca65f8e980f4d6e50d01adca7b9997617ecefb3ae" Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.767080 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wkzfm"] Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 
07:15:09.771298 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wkzfm"] Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.809261 5129 scope.go:117] "RemoveContainer" containerID="c173a525f0af61c0bb5db92126d23089591b393cbb2aa74927f8c990b619ff15" Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.830404 5129 scope.go:117] "RemoveContainer" containerID="0a0dc68396787f89e1d4d65975dcc826e8ad56343e3c006777e53c6232fa4f95" Mar 14 07:15:09 crc kubenswrapper[5129]: E0314 07:15:09.831032 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a0dc68396787f89e1d4d65975dcc826e8ad56343e3c006777e53c6232fa4f95\": container with ID starting with 0a0dc68396787f89e1d4d65975dcc826e8ad56343e3c006777e53c6232fa4f95 not found: ID does not exist" containerID="0a0dc68396787f89e1d4d65975dcc826e8ad56343e3c006777e53c6232fa4f95" Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.831128 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a0dc68396787f89e1d4d65975dcc826e8ad56343e3c006777e53c6232fa4f95"} err="failed to get container status \"0a0dc68396787f89e1d4d65975dcc826e8ad56343e3c006777e53c6232fa4f95\": rpc error: code = NotFound desc = could not find container \"0a0dc68396787f89e1d4d65975dcc826e8ad56343e3c006777e53c6232fa4f95\": container with ID starting with 0a0dc68396787f89e1d4d65975dcc826e8ad56343e3c006777e53c6232fa4f95 not found: ID does not exist" Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.831199 5129 scope.go:117] "RemoveContainer" containerID="85e0f9d59e1ca6db38ae0e5ca65f8e980f4d6e50d01adca7b9997617ecefb3ae" Mar 14 07:15:09 crc kubenswrapper[5129]: E0314 07:15:09.831929 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85e0f9d59e1ca6db38ae0e5ca65f8e980f4d6e50d01adca7b9997617ecefb3ae\": container with ID 
starting with 85e0f9d59e1ca6db38ae0e5ca65f8e980f4d6e50d01adca7b9997617ecefb3ae not found: ID does not exist" containerID="85e0f9d59e1ca6db38ae0e5ca65f8e980f4d6e50d01adca7b9997617ecefb3ae" Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.831974 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e0f9d59e1ca6db38ae0e5ca65f8e980f4d6e50d01adca7b9997617ecefb3ae"} err="failed to get container status \"85e0f9d59e1ca6db38ae0e5ca65f8e980f4d6e50d01adca7b9997617ecefb3ae\": rpc error: code = NotFound desc = could not find container \"85e0f9d59e1ca6db38ae0e5ca65f8e980f4d6e50d01adca7b9997617ecefb3ae\": container with ID starting with 85e0f9d59e1ca6db38ae0e5ca65f8e980f4d6e50d01adca7b9997617ecefb3ae not found: ID does not exist" Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.832003 5129 scope.go:117] "RemoveContainer" containerID="c173a525f0af61c0bb5db92126d23089591b393cbb2aa74927f8c990b619ff15" Mar 14 07:15:09 crc kubenswrapper[5129]: E0314 07:15:09.832326 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c173a525f0af61c0bb5db92126d23089591b393cbb2aa74927f8c990b619ff15\": container with ID starting with c173a525f0af61c0bb5db92126d23089591b393cbb2aa74927f8c990b619ff15 not found: ID does not exist" containerID="c173a525f0af61c0bb5db92126d23089591b393cbb2aa74927f8c990b619ff15" Mar 14 07:15:09 crc kubenswrapper[5129]: I0314 07:15:09.832370 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c173a525f0af61c0bb5db92126d23089591b393cbb2aa74927f8c990b619ff15"} err="failed to get container status \"c173a525f0af61c0bb5db92126d23089591b393cbb2aa74927f8c990b619ff15\": rpc error: code = NotFound desc = could not find container \"c173a525f0af61c0bb5db92126d23089591b393cbb2aa74927f8c990b619ff15\": container with ID starting with c173a525f0af61c0bb5db92126d23089591b393cbb2aa74927f8c990b619ff15 not found: 
ID does not exist" Mar 14 07:15:10 crc kubenswrapper[5129]: I0314 07:15:10.043030 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="affec354-64b8-4672-9b9d-80a51b04ada9" path="/var/lib/kubelet/pods/affec354-64b8-4672-9b9d-80a51b04ada9/volumes" Mar 14 07:15:11 crc kubenswrapper[5129]: I0314 07:15:11.748213 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tqfq9" event={"ID":"684cf0e0-3f2e-4904-a9a3-257015dd0f03","Type":"ContainerStarted","Data":"5a84c949f8325d67d1c0fd7636396def7c14d7cd7899d0dbf7075cb6173ff5af"} Mar 14 07:15:11 crc kubenswrapper[5129]: I0314 07:15:11.774429 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tqfq9" podStartSLOduration=2.569887664 podStartE2EDuration="7.774409216s" podCreationTimestamp="2026-03-14 07:15:04 +0000 UTC" firstStartedPulling="2026-03-14 07:15:05.421146837 +0000 UTC m=+968.173062011" lastFinishedPulling="2026-03-14 07:15:10.625668389 +0000 UTC m=+973.377583563" observedRunningTime="2026-03-14 07:15:11.770325674 +0000 UTC m=+974.522240868" watchObservedRunningTime="2026-03-14 07:15:11.774409216 +0000 UTC m=+974.526324400" Mar 14 07:15:15 crc kubenswrapper[5129]: I0314 07:15:15.322993 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:15 crc kubenswrapper[5129]: I0314 07:15:15.324092 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:15 crc kubenswrapper[5129]: I0314 07:15:15.330098 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:15 crc kubenswrapper[5129]: I0314 07:15:15.385378 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-jlwnk" Mar 14 07:15:15 crc 
kubenswrapper[5129]: I0314 07:15:15.777905 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-89d4b6dfb-8d5f5" Mar 14 07:15:15 crc kubenswrapper[5129]: I0314 07:15:15.835037 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-vpw78"] Mar 14 07:15:19 crc kubenswrapper[5129]: I0314 07:15:19.575378 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:15:19 crc kubenswrapper[5129]: I0314 07:15:19.575867 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:15:19 crc kubenswrapper[5129]: I0314 07:15:19.575940 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:15:19 crc kubenswrapper[5129]: I0314 07:15:19.576728 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8b4a31888cd52c24930717574880110e2ffda0608e5edc1431b7914f81c314fb"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:15:19 crc kubenswrapper[5129]: I0314 07:15:19.576836 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" 
containerName="machine-config-daemon" containerID="cri-o://8b4a31888cd52c24930717574880110e2ffda0608e5edc1431b7914f81c314fb" gracePeriod=600 Mar 14 07:15:19 crc kubenswrapper[5129]: I0314 07:15:19.846661 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="8b4a31888cd52c24930717574880110e2ffda0608e5edc1431b7914f81c314fb" exitCode=0 Mar 14 07:15:19 crc kubenswrapper[5129]: I0314 07:15:19.846711 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"8b4a31888cd52c24930717574880110e2ffda0608e5edc1431b7914f81c314fb"} Mar 14 07:15:19 crc kubenswrapper[5129]: I0314 07:15:19.846748 5129 scope.go:117] "RemoveContainer" containerID="f77555a687b65978c9c2c4dd7bac28e7a1c8bc7c2438e2e0e0556ac3023a77b8" Mar 14 07:15:20 crc kubenswrapper[5129]: I0314 07:15:20.854028 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"74d416b7b010cf091ac9c019f0f997a8bcaaae0573c048f8fb52b3ba09a111ee"} Mar 14 07:15:25 crc kubenswrapper[5129]: I0314 07:15:25.625152 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-4rmxr" Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.124527 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hbqw2"] Mar 14 07:15:30 crc kubenswrapper[5129]: E0314 07:15:30.125576 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="affec354-64b8-4672-9b9d-80a51b04ada9" containerName="extract-utilities" Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.125670 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="affec354-64b8-4672-9b9d-80a51b04ada9" 
containerName="extract-utilities" Mar 14 07:15:30 crc kubenswrapper[5129]: E0314 07:15:30.125743 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="affec354-64b8-4672-9b9d-80a51b04ada9" containerName="extract-content" Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.125761 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="affec354-64b8-4672-9b9d-80a51b04ada9" containerName="extract-content" Mar 14 07:15:30 crc kubenswrapper[5129]: E0314 07:15:30.125783 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="affec354-64b8-4672-9b9d-80a51b04ada9" containerName="registry-server" Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.125799 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="affec354-64b8-4672-9b9d-80a51b04ada9" containerName="registry-server" Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.126015 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="affec354-64b8-4672-9b9d-80a51b04ada9" containerName="registry-server" Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.127428 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hbqw2" Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.129047 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hbqw2"] Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.311626 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1d2259-7335-4b63-8f50-e2df84977856-utilities\") pod \"redhat-marketplace-hbqw2\" (UID: \"5a1d2259-7335-4b63-8f50-e2df84977856\") " pod="openshift-marketplace/redhat-marketplace-hbqw2" Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.311700 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1d2259-7335-4b63-8f50-e2df84977856-catalog-content\") pod \"redhat-marketplace-hbqw2\" (UID: \"5a1d2259-7335-4b63-8f50-e2df84977856\") " pod="openshift-marketplace/redhat-marketplace-hbqw2" Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.311731 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6p6l\" (UniqueName: \"kubernetes.io/projected/5a1d2259-7335-4b63-8f50-e2df84977856-kube-api-access-q6p6l\") pod \"redhat-marketplace-hbqw2\" (UID: \"5a1d2259-7335-4b63-8f50-e2df84977856\") " pod="openshift-marketplace/redhat-marketplace-hbqw2" Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.412846 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1d2259-7335-4b63-8f50-e2df84977856-utilities\") pod \"redhat-marketplace-hbqw2\" (UID: \"5a1d2259-7335-4b63-8f50-e2df84977856\") " pod="openshift-marketplace/redhat-marketplace-hbqw2" Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.412926 5129 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1d2259-7335-4b63-8f50-e2df84977856-catalog-content\") pod \"redhat-marketplace-hbqw2\" (UID: \"5a1d2259-7335-4b63-8f50-e2df84977856\") " pod="openshift-marketplace/redhat-marketplace-hbqw2" Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.412954 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6p6l\" (UniqueName: \"kubernetes.io/projected/5a1d2259-7335-4b63-8f50-e2df84977856-kube-api-access-q6p6l\") pod \"redhat-marketplace-hbqw2\" (UID: \"5a1d2259-7335-4b63-8f50-e2df84977856\") " pod="openshift-marketplace/redhat-marketplace-hbqw2" Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.413331 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1d2259-7335-4b63-8f50-e2df84977856-utilities\") pod \"redhat-marketplace-hbqw2\" (UID: \"5a1d2259-7335-4b63-8f50-e2df84977856\") " pod="openshift-marketplace/redhat-marketplace-hbqw2" Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.413389 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1d2259-7335-4b63-8f50-e2df84977856-catalog-content\") pod \"redhat-marketplace-hbqw2\" (UID: \"5a1d2259-7335-4b63-8f50-e2df84977856\") " pod="openshift-marketplace/redhat-marketplace-hbqw2" Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.443442 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6p6l\" (UniqueName: \"kubernetes.io/projected/5a1d2259-7335-4b63-8f50-e2df84977856-kube-api-access-q6p6l\") pod \"redhat-marketplace-hbqw2\" (UID: \"5a1d2259-7335-4b63-8f50-e2df84977856\") " pod="openshift-marketplace/redhat-marketplace-hbqw2" Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.481461 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hbqw2" Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.878079 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hbqw2"] Mar 14 07:15:30 crc kubenswrapper[5129]: W0314 07:15:30.883266 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a1d2259_7335_4b63_8f50_e2df84977856.slice/crio-9665aef3712c6e7d2903ca28e7ddd301da3ce13f39263f1254fceef64b0a9cc6 WatchSource:0}: Error finding container 9665aef3712c6e7d2903ca28e7ddd301da3ce13f39263f1254fceef64b0a9cc6: Status 404 returned error can't find the container with id 9665aef3712c6e7d2903ca28e7ddd301da3ce13f39263f1254fceef64b0a9cc6 Mar 14 07:15:30 crc kubenswrapper[5129]: I0314 07:15:30.918410 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hbqw2" event={"ID":"5a1d2259-7335-4b63-8f50-e2df84977856","Type":"ContainerStarted","Data":"9665aef3712c6e7d2903ca28e7ddd301da3ce13f39263f1254fceef64b0a9cc6"} Mar 14 07:15:31 crc kubenswrapper[5129]: I0314 07:15:31.928946 5129 generic.go:334] "Generic (PLEG): container finished" podID="5a1d2259-7335-4b63-8f50-e2df84977856" containerID="2ea7f0c414ce125305767b09be62c0baa2f92fdd122abe5248c4d168c9c9bc7a" exitCode=0 Mar 14 07:15:31 crc kubenswrapper[5129]: I0314 07:15:31.928976 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hbqw2" event={"ID":"5a1d2259-7335-4b63-8f50-e2df84977856","Type":"ContainerDied","Data":"2ea7f0c414ce125305767b09be62c0baa2f92fdd122abe5248c4d168c9c9bc7a"} Mar 14 07:15:32 crc kubenswrapper[5129]: I0314 07:15:32.943937 5129 generic.go:334] "Generic (PLEG): container finished" podID="5a1d2259-7335-4b63-8f50-e2df84977856" containerID="5a1e2c645b2e1d2abff0b0b31bba7cad4cdf39cf63092c2e2cf913bd62f3e83c" exitCode=0 Mar 14 07:15:32 crc kubenswrapper[5129]: I0314 
07:15:32.944136 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hbqw2" event={"ID":"5a1d2259-7335-4b63-8f50-e2df84977856","Type":"ContainerDied","Data":"5a1e2c645b2e1d2abff0b0b31bba7cad4cdf39cf63092c2e2cf913bd62f3e83c"} Mar 14 07:15:33 crc kubenswrapper[5129]: I0314 07:15:33.963893 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hbqw2" event={"ID":"5a1d2259-7335-4b63-8f50-e2df84977856","Type":"ContainerStarted","Data":"a1295b865bed2a3650965acae3d481c8f9e848e9d48aaa9655244674e57ef451"} Mar 14 07:15:33 crc kubenswrapper[5129]: I0314 07:15:33.984709 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hbqw2" podStartSLOduration=2.491567016 podStartE2EDuration="3.984692182s" podCreationTimestamp="2026-03-14 07:15:30 +0000 UTC" firstStartedPulling="2026-03-14 07:15:31.93333369 +0000 UTC m=+994.685248874" lastFinishedPulling="2026-03-14 07:15:33.426458856 +0000 UTC m=+996.178374040" observedRunningTime="2026-03-14 07:15:33.983765538 +0000 UTC m=+996.735680732" watchObservedRunningTime="2026-03-14 07:15:33.984692182 +0000 UTC m=+996.736607366" Mar 14 07:15:35 crc kubenswrapper[5129]: I0314 07:15:35.896704 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dff5s"] Mar 14 07:15:35 crc kubenswrapper[5129]: I0314 07:15:35.898869 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dff5s" Mar 14 07:15:35 crc kubenswrapper[5129]: I0314 07:15:35.913718 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dff5s"] Mar 14 07:15:35 crc kubenswrapper[5129]: I0314 07:15:35.987610 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a078d2e7-9855-4886-8bf9-372c610b5eff-catalog-content\") pod \"certified-operators-dff5s\" (UID: \"a078d2e7-9855-4886-8bf9-372c610b5eff\") " pod="openshift-marketplace/certified-operators-dff5s" Mar 14 07:15:35 crc kubenswrapper[5129]: I0314 07:15:35.987666 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a078d2e7-9855-4886-8bf9-372c610b5eff-utilities\") pod \"certified-operators-dff5s\" (UID: \"a078d2e7-9855-4886-8bf9-372c610b5eff\") " pod="openshift-marketplace/certified-operators-dff5s" Mar 14 07:15:35 crc kubenswrapper[5129]: I0314 07:15:35.987700 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg6x9\" (UniqueName: \"kubernetes.io/projected/a078d2e7-9855-4886-8bf9-372c610b5eff-kube-api-access-cg6x9\") pod \"certified-operators-dff5s\" (UID: \"a078d2e7-9855-4886-8bf9-372c610b5eff\") " pod="openshift-marketplace/certified-operators-dff5s" Mar 14 07:15:36 crc kubenswrapper[5129]: I0314 07:15:36.089080 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a078d2e7-9855-4886-8bf9-372c610b5eff-catalog-content\") pod \"certified-operators-dff5s\" (UID: \"a078d2e7-9855-4886-8bf9-372c610b5eff\") " pod="openshift-marketplace/certified-operators-dff5s" Mar 14 07:15:36 crc kubenswrapper[5129]: I0314 07:15:36.089481 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a078d2e7-9855-4886-8bf9-372c610b5eff-utilities\") pod \"certified-operators-dff5s\" (UID: \"a078d2e7-9855-4886-8bf9-372c610b5eff\") " pod="openshift-marketplace/certified-operators-dff5s" Mar 14 07:15:36 crc kubenswrapper[5129]: I0314 07:15:36.089514 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg6x9\" (UniqueName: \"kubernetes.io/projected/a078d2e7-9855-4886-8bf9-372c610b5eff-kube-api-access-cg6x9\") pod \"certified-operators-dff5s\" (UID: \"a078d2e7-9855-4886-8bf9-372c610b5eff\") " pod="openshift-marketplace/certified-operators-dff5s" Mar 14 07:15:36 crc kubenswrapper[5129]: I0314 07:15:36.089832 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a078d2e7-9855-4886-8bf9-372c610b5eff-catalog-content\") pod \"certified-operators-dff5s\" (UID: \"a078d2e7-9855-4886-8bf9-372c610b5eff\") " pod="openshift-marketplace/certified-operators-dff5s" Mar 14 07:15:36 crc kubenswrapper[5129]: I0314 07:15:36.090071 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a078d2e7-9855-4886-8bf9-372c610b5eff-utilities\") pod \"certified-operators-dff5s\" (UID: \"a078d2e7-9855-4886-8bf9-372c610b5eff\") " pod="openshift-marketplace/certified-operators-dff5s" Mar 14 07:15:36 crc kubenswrapper[5129]: I0314 07:15:36.108965 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg6x9\" (UniqueName: \"kubernetes.io/projected/a078d2e7-9855-4886-8bf9-372c610b5eff-kube-api-access-cg6x9\") pod \"certified-operators-dff5s\" (UID: \"a078d2e7-9855-4886-8bf9-372c610b5eff\") " pod="openshift-marketplace/certified-operators-dff5s" Mar 14 07:15:36 crc kubenswrapper[5129]: I0314 07:15:36.224809 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dff5s" Mar 14 07:15:36 crc kubenswrapper[5129]: I0314 07:15:36.477386 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dff5s"] Mar 14 07:15:36 crc kubenswrapper[5129]: W0314 07:15:36.485393 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda078d2e7_9855_4886_8bf9_372c610b5eff.slice/crio-a79d847f48cf380b666b0975e4b3764ec3a6a88fac86e89144705d08c4d2c013 WatchSource:0}: Error finding container a79d847f48cf380b666b0975e4b3764ec3a6a88fac86e89144705d08c4d2c013: Status 404 returned error can't find the container with id a79d847f48cf380b666b0975e4b3764ec3a6a88fac86e89144705d08c4d2c013 Mar 14 07:15:36 crc kubenswrapper[5129]: I0314 07:15:36.989009 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dff5s" event={"ID":"a078d2e7-9855-4886-8bf9-372c610b5eff","Type":"ContainerDied","Data":"2354a1a96be8d5f5c5708bc7a411d2b3566b8150607ef36b4975f64cc5276bfb"} Mar 14 07:15:36 crc kubenswrapper[5129]: I0314 07:15:36.988890 5129 generic.go:334] "Generic (PLEG): container finished" podID="a078d2e7-9855-4886-8bf9-372c610b5eff" containerID="2354a1a96be8d5f5c5708bc7a411d2b3566b8150607ef36b4975f64cc5276bfb" exitCode=0 Mar 14 07:15:36 crc kubenswrapper[5129]: I0314 07:15:36.989104 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dff5s" event={"ID":"a078d2e7-9855-4886-8bf9-372c610b5eff","Type":"ContainerStarted","Data":"a79d847f48cf380b666b0975e4b3764ec3a6a88fac86e89144705d08c4d2c013"} Mar 14 07:15:37 crc kubenswrapper[5129]: I0314 07:15:37.997192 5129 generic.go:334] "Generic (PLEG): container finished" podID="a078d2e7-9855-4886-8bf9-372c610b5eff" containerID="9acd8efc456f63c3a33a02566214656001e7a7cc94e54f91134e8043a22b8bd0" exitCode=0 Mar 14 07:15:37 crc kubenswrapper[5129]: I0314 
07:15:37.997256 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dff5s" event={"ID":"a078d2e7-9855-4886-8bf9-372c610b5eff","Type":"ContainerDied","Data":"9acd8efc456f63c3a33a02566214656001e7a7cc94e54f91134e8043a22b8bd0"} Mar 14 07:15:38 crc kubenswrapper[5129]: I0314 07:15:38.736104 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m"] Mar 14 07:15:38 crc kubenswrapper[5129]: I0314 07:15:38.737345 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m" Mar 14 07:15:38 crc kubenswrapper[5129]: I0314 07:15:38.742195 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 14 07:15:38 crc kubenswrapper[5129]: I0314 07:15:38.748274 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m"] Mar 14 07:15:38 crc kubenswrapper[5129]: I0314 07:15:38.833236 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2afae4f4-1faf-44cf-81ee-5f11553a1407-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m\" (UID: \"2afae4f4-1faf-44cf-81ee-5f11553a1407\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m" Mar 14 07:15:38 crc kubenswrapper[5129]: I0314 07:15:38.833340 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bm9z\" (UniqueName: \"kubernetes.io/projected/2afae4f4-1faf-44cf-81ee-5f11553a1407-kube-api-access-8bm9z\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m\" (UID: \"2afae4f4-1faf-44cf-81ee-5f11553a1407\") " 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m" Mar 14 07:15:38 crc kubenswrapper[5129]: I0314 07:15:38.833379 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2afae4f4-1faf-44cf-81ee-5f11553a1407-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m\" (UID: \"2afae4f4-1faf-44cf-81ee-5f11553a1407\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m" Mar 14 07:15:38 crc kubenswrapper[5129]: I0314 07:15:38.934702 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bm9z\" (UniqueName: \"kubernetes.io/projected/2afae4f4-1faf-44cf-81ee-5f11553a1407-kube-api-access-8bm9z\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m\" (UID: \"2afae4f4-1faf-44cf-81ee-5f11553a1407\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m" Mar 14 07:15:38 crc kubenswrapper[5129]: I0314 07:15:38.934791 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2afae4f4-1faf-44cf-81ee-5f11553a1407-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m\" (UID: \"2afae4f4-1faf-44cf-81ee-5f11553a1407\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m" Mar 14 07:15:38 crc kubenswrapper[5129]: I0314 07:15:38.934899 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2afae4f4-1faf-44cf-81ee-5f11553a1407-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m\" (UID: \"2afae4f4-1faf-44cf-81ee-5f11553a1407\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m" Mar 14 07:15:38 crc kubenswrapper[5129]: I0314 
07:15:38.935303 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2afae4f4-1faf-44cf-81ee-5f11553a1407-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m\" (UID: \"2afae4f4-1faf-44cf-81ee-5f11553a1407\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m" Mar 14 07:15:38 crc kubenswrapper[5129]: I0314 07:15:38.935430 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2afae4f4-1faf-44cf-81ee-5f11553a1407-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m\" (UID: \"2afae4f4-1faf-44cf-81ee-5f11553a1407\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m" Mar 14 07:15:38 crc kubenswrapper[5129]: I0314 07:15:38.953717 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bm9z\" (UniqueName: \"kubernetes.io/projected/2afae4f4-1faf-44cf-81ee-5f11553a1407-kube-api-access-8bm9z\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m\" (UID: \"2afae4f4-1faf-44cf-81ee-5f11553a1407\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m" Mar 14 07:15:39 crc kubenswrapper[5129]: I0314 07:15:39.004571 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dff5s" event={"ID":"a078d2e7-9855-4886-8bf9-372c610b5eff","Type":"ContainerStarted","Data":"1707221b498a323bc8ccc260d0bf5fa061ebb06488b6f9b51add584ee68590b8"} Mar 14 07:15:39 crc kubenswrapper[5129]: I0314 07:15:39.061288 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m" Mar 14 07:15:39 crc kubenswrapper[5129]: I0314 07:15:39.473630 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dff5s" podStartSLOduration=3.047519857 podStartE2EDuration="4.473583237s" podCreationTimestamp="2026-03-14 07:15:35 +0000 UTC" firstStartedPulling="2026-03-14 07:15:36.990514604 +0000 UTC m=+999.742429798" lastFinishedPulling="2026-03-14 07:15:38.416577994 +0000 UTC m=+1001.168493178" observedRunningTime="2026-03-14 07:15:39.021919887 +0000 UTC m=+1001.773835081" watchObservedRunningTime="2026-03-14 07:15:39.473583237 +0000 UTC m=+1002.225498441" Mar 14 07:15:39 crc kubenswrapper[5129]: I0314 07:15:39.478909 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m"] Mar 14 07:15:39 crc kubenswrapper[5129]: W0314 07:15:39.495757 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2afae4f4_1faf_44cf_81ee_5f11553a1407.slice/crio-799ba517f15afb69b402cbf7c6b0733c647364046948ccd59664701d290a8957 WatchSource:0}: Error finding container 799ba517f15afb69b402cbf7c6b0733c647364046948ccd59664701d290a8957: Status 404 returned error can't find the container with id 799ba517f15afb69b402cbf7c6b0733c647364046948ccd59664701d290a8957 Mar 14 07:15:40 crc kubenswrapper[5129]: I0314 07:15:40.013663 5129 generic.go:334] "Generic (PLEG): container finished" podID="2afae4f4-1faf-44cf-81ee-5f11553a1407" containerID="c4d8e411f52207a53099b58cd42e7b085560263fd3a872faccf53e60318f75db" exitCode=0 Mar 14 07:15:40 crc kubenswrapper[5129]: I0314 07:15:40.013835 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m" 
event={"ID":"2afae4f4-1faf-44cf-81ee-5f11553a1407","Type":"ContainerDied","Data":"c4d8e411f52207a53099b58cd42e7b085560263fd3a872faccf53e60318f75db"} Mar 14 07:15:40 crc kubenswrapper[5129]: I0314 07:15:40.014245 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m" event={"ID":"2afae4f4-1faf-44cf-81ee-5f11553a1407","Type":"ContainerStarted","Data":"799ba517f15afb69b402cbf7c6b0733c647364046948ccd59664701d290a8957"} Mar 14 07:15:40 crc kubenswrapper[5129]: I0314 07:15:40.481980 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hbqw2" Mar 14 07:15:40 crc kubenswrapper[5129]: I0314 07:15:40.482030 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hbqw2" Mar 14 07:15:40 crc kubenswrapper[5129]: I0314 07:15:40.552701 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hbqw2" Mar 14 07:15:40 crc kubenswrapper[5129]: I0314 07:15:40.881006 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-vpw78" podUID="92aed46c-5740-4407-81ed-4ff642a70c54" containerName="console" containerID="cri-o://2c36c6c958b1570e5ca98640d10a3c2833d85fc9b0f3d98768338aa9eaba79db" gracePeriod=15 Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.025790 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vpw78_92aed46c-5740-4407-81ed-4ff642a70c54/console/0.log" Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.025926 5129 generic.go:334] "Generic (PLEG): container finished" podID="92aed46c-5740-4407-81ed-4ff642a70c54" containerID="2c36c6c958b1570e5ca98640d10a3c2833d85fc9b0f3d98768338aa9eaba79db" exitCode=2 Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.025995 5129 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vpw78" event={"ID":"92aed46c-5740-4407-81ed-4ff642a70c54","Type":"ContainerDied","Data":"2c36c6c958b1570e5ca98640d10a3c2833d85fc9b0f3d98768338aa9eaba79db"} Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.091381 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hbqw2" Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.283696 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vpw78_92aed46c-5740-4407-81ed-4ff642a70c54/console/0.log" Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.283759 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.470827 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/92aed46c-5740-4407-81ed-4ff642a70c54-console-oauth-config\") pod \"92aed46c-5740-4407-81ed-4ff642a70c54\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.472356 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-console-config\") pod \"92aed46c-5740-4407-81ed-4ff642a70c54\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.472394 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvws6\" (UniqueName: \"kubernetes.io/projected/92aed46c-5740-4407-81ed-4ff642a70c54-kube-api-access-dvws6\") pod \"92aed46c-5740-4407-81ed-4ff642a70c54\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.472455 5129 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-trusted-ca-bundle\") pod \"92aed46c-5740-4407-81ed-4ff642a70c54\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.472509 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-service-ca\") pod \"92aed46c-5740-4407-81ed-4ff642a70c54\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.472538 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-oauth-serving-cert\") pod \"92aed46c-5740-4407-81ed-4ff642a70c54\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.472570 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/92aed46c-5740-4407-81ed-4ff642a70c54-console-serving-cert\") pod \"92aed46c-5740-4407-81ed-4ff642a70c54\" (UID: \"92aed46c-5740-4407-81ed-4ff642a70c54\") " Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.473980 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "92aed46c-5740-4407-81ed-4ff642a70c54" (UID: "92aed46c-5740-4407-81ed-4ff642a70c54"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.474058 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-console-config" (OuterVolumeSpecName: "console-config") pod "92aed46c-5740-4407-81ed-4ff642a70c54" (UID: "92aed46c-5740-4407-81ed-4ff642a70c54"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.474122 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "92aed46c-5740-4407-81ed-4ff642a70c54" (UID: "92aed46c-5740-4407-81ed-4ff642a70c54"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.474148 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-service-ca" (OuterVolumeSpecName: "service-ca") pod "92aed46c-5740-4407-81ed-4ff642a70c54" (UID: "92aed46c-5740-4407-81ed-4ff642a70c54"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.477109 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92aed46c-5740-4407-81ed-4ff642a70c54-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "92aed46c-5740-4407-81ed-4ff642a70c54" (UID: "92aed46c-5740-4407-81ed-4ff642a70c54"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.478168 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92aed46c-5740-4407-81ed-4ff642a70c54-kube-api-access-dvws6" (OuterVolumeSpecName: "kube-api-access-dvws6") pod "92aed46c-5740-4407-81ed-4ff642a70c54" (UID: "92aed46c-5740-4407-81ed-4ff642a70c54"). InnerVolumeSpecName "kube-api-access-dvws6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.478359 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92aed46c-5740-4407-81ed-4ff642a70c54-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "92aed46c-5740-4407-81ed-4ff642a70c54" (UID: "92aed46c-5740-4407-81ed-4ff642a70c54"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.573735 5129 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.573773 5129 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.573785 5129 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.573795 5129 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/92aed46c-5740-4407-81ed-4ff642a70c54-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.573806 5129 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/92aed46c-5740-4407-81ed-4ff642a70c54-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.573816 5129 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/92aed46c-5740-4407-81ed-4ff642a70c54-console-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:41 crc kubenswrapper[5129]: I0314 07:15:41.573824 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvws6\" (UniqueName: \"kubernetes.io/projected/92aed46c-5740-4407-81ed-4ff642a70c54-kube-api-access-dvws6\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:42 crc kubenswrapper[5129]: I0314 07:15:42.036954 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vpw78_92aed46c-5740-4407-81ed-4ff642a70c54/console/0.log" Mar 14 07:15:42 crc kubenswrapper[5129]: I0314 07:15:42.037173 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:15:42 crc kubenswrapper[5129]: I0314 07:15:42.040534 5129 generic.go:334] "Generic (PLEG): container finished" podID="2afae4f4-1faf-44cf-81ee-5f11553a1407" containerID="58d78708a23bd4ce32101ac79dedb2f4468f5493a480256ddc0b097c7261f820" exitCode=0 Mar 14 07:15:42 crc kubenswrapper[5129]: I0314 07:15:42.048671 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vpw78" event={"ID":"92aed46c-5740-4407-81ed-4ff642a70c54","Type":"ContainerDied","Data":"c68908b8746452e572666602e51cba1b4bc426bb35f3796601aa3f4490720b20"} Mar 14 07:15:42 crc kubenswrapper[5129]: I0314 07:15:42.048731 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m" event={"ID":"2afae4f4-1faf-44cf-81ee-5f11553a1407","Type":"ContainerDied","Data":"58d78708a23bd4ce32101ac79dedb2f4468f5493a480256ddc0b097c7261f820"} Mar 14 07:15:42 crc kubenswrapper[5129]: I0314 07:15:42.048766 5129 scope.go:117] "RemoveContainer" containerID="2c36c6c958b1570e5ca98640d10a3c2833d85fc9b0f3d98768338aa9eaba79db" Mar 14 07:15:43 crc kubenswrapper[5129]: I0314 07:15:43.049581 5129 generic.go:334] "Generic (PLEG): container finished" podID="2afae4f4-1faf-44cf-81ee-5f11553a1407" containerID="9a52a5b247a83b1ff899ab99f6ac8ddf4fb44657a5269fc9dbc7f61cb990166b" exitCode=0 Mar 14 07:15:43 crc kubenswrapper[5129]: I0314 07:15:43.049640 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m" event={"ID":"2afae4f4-1faf-44cf-81ee-5f11553a1407","Type":"ContainerDied","Data":"9a52a5b247a83b1ff899ab99f6ac8ddf4fb44657a5269fc9dbc7f61cb990166b"} Mar 14 07:15:44 crc kubenswrapper[5129]: I0314 07:15:44.321934 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m" Mar 14 07:15:44 crc kubenswrapper[5129]: I0314 07:15:44.515460 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2afae4f4-1faf-44cf-81ee-5f11553a1407-util\") pod \"2afae4f4-1faf-44cf-81ee-5f11553a1407\" (UID: \"2afae4f4-1faf-44cf-81ee-5f11553a1407\") " Mar 14 07:15:44 crc kubenswrapper[5129]: I0314 07:15:44.515576 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2afae4f4-1faf-44cf-81ee-5f11553a1407-bundle\") pod \"2afae4f4-1faf-44cf-81ee-5f11553a1407\" (UID: \"2afae4f4-1faf-44cf-81ee-5f11553a1407\") " Mar 14 07:15:44 crc kubenswrapper[5129]: I0314 07:15:44.515664 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bm9z\" (UniqueName: \"kubernetes.io/projected/2afae4f4-1faf-44cf-81ee-5f11553a1407-kube-api-access-8bm9z\") pod \"2afae4f4-1faf-44cf-81ee-5f11553a1407\" (UID: \"2afae4f4-1faf-44cf-81ee-5f11553a1407\") " Mar 14 07:15:44 crc kubenswrapper[5129]: I0314 07:15:44.517575 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2afae4f4-1faf-44cf-81ee-5f11553a1407-bundle" (OuterVolumeSpecName: "bundle") pod "2afae4f4-1faf-44cf-81ee-5f11553a1407" (UID: "2afae4f4-1faf-44cf-81ee-5f11553a1407"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:15:44 crc kubenswrapper[5129]: I0314 07:15:44.523567 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2afae4f4-1faf-44cf-81ee-5f11553a1407-kube-api-access-8bm9z" (OuterVolumeSpecName: "kube-api-access-8bm9z") pod "2afae4f4-1faf-44cf-81ee-5f11553a1407" (UID: "2afae4f4-1faf-44cf-81ee-5f11553a1407"). InnerVolumeSpecName "kube-api-access-8bm9z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:15:44 crc kubenswrapper[5129]: I0314 07:15:44.545414 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2afae4f4-1faf-44cf-81ee-5f11553a1407-util" (OuterVolumeSpecName: "util") pod "2afae4f4-1faf-44cf-81ee-5f11553a1407" (UID: "2afae4f4-1faf-44cf-81ee-5f11553a1407"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:15:44 crc kubenswrapper[5129]: I0314 07:15:44.616700 5129 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2afae4f4-1faf-44cf-81ee-5f11553a1407-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:44 crc kubenswrapper[5129]: I0314 07:15:44.616744 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bm9z\" (UniqueName: \"kubernetes.io/projected/2afae4f4-1faf-44cf-81ee-5f11553a1407-kube-api-access-8bm9z\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:44 crc kubenswrapper[5129]: I0314 07:15:44.616754 5129 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2afae4f4-1faf-44cf-81ee-5f11553a1407-util\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:45 crc kubenswrapper[5129]: I0314 07:15:45.072677 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m" event={"ID":"2afae4f4-1faf-44cf-81ee-5f11553a1407","Type":"ContainerDied","Data":"799ba517f15afb69b402cbf7c6b0733c647364046948ccd59664701d290a8957"} Mar 14 07:15:45 crc kubenswrapper[5129]: I0314 07:15:45.072751 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="799ba517f15afb69b402cbf7c6b0733c647364046948ccd59664701d290a8957" Mar 14 07:15:45 crc kubenswrapper[5129]: I0314 07:15:45.072809 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m" Mar 14 07:15:45 crc kubenswrapper[5129]: I0314 07:15:45.092258 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hbqw2"] Mar 14 07:15:45 crc kubenswrapper[5129]: I0314 07:15:45.092668 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hbqw2" podUID="5a1d2259-7335-4b63-8f50-e2df84977856" containerName="registry-server" containerID="cri-o://a1295b865bed2a3650965acae3d481c8f9e848e9d48aaa9655244674e57ef451" gracePeriod=2 Mar 14 07:15:45 crc kubenswrapper[5129]: I0314 07:15:45.488674 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hbqw2" Mar 14 07:15:45 crc kubenswrapper[5129]: I0314 07:15:45.631722 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6p6l\" (UniqueName: \"kubernetes.io/projected/5a1d2259-7335-4b63-8f50-e2df84977856-kube-api-access-q6p6l\") pod \"5a1d2259-7335-4b63-8f50-e2df84977856\" (UID: \"5a1d2259-7335-4b63-8f50-e2df84977856\") " Mar 14 07:15:45 crc kubenswrapper[5129]: I0314 07:15:45.631811 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1d2259-7335-4b63-8f50-e2df84977856-utilities\") pod \"5a1d2259-7335-4b63-8f50-e2df84977856\" (UID: \"5a1d2259-7335-4b63-8f50-e2df84977856\") " Mar 14 07:15:45 crc kubenswrapper[5129]: I0314 07:15:45.631841 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1d2259-7335-4b63-8f50-e2df84977856-catalog-content\") pod \"5a1d2259-7335-4b63-8f50-e2df84977856\" (UID: \"5a1d2259-7335-4b63-8f50-e2df84977856\") " Mar 14 07:15:45 crc kubenswrapper[5129]: I0314 07:15:45.633323 5129 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a1d2259-7335-4b63-8f50-e2df84977856-utilities" (OuterVolumeSpecName: "utilities") pod "5a1d2259-7335-4b63-8f50-e2df84977856" (UID: "5a1d2259-7335-4b63-8f50-e2df84977856"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:15:45 crc kubenswrapper[5129]: I0314 07:15:45.637703 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a1d2259-7335-4b63-8f50-e2df84977856-kube-api-access-q6p6l" (OuterVolumeSpecName: "kube-api-access-q6p6l") pod "5a1d2259-7335-4b63-8f50-e2df84977856" (UID: "5a1d2259-7335-4b63-8f50-e2df84977856"). InnerVolumeSpecName "kube-api-access-q6p6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:15:45 crc kubenswrapper[5129]: I0314 07:15:45.668029 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a1d2259-7335-4b63-8f50-e2df84977856-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a1d2259-7335-4b63-8f50-e2df84977856" (UID: "5a1d2259-7335-4b63-8f50-e2df84977856"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:15:45 crc kubenswrapper[5129]: I0314 07:15:45.733667 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6p6l\" (UniqueName: \"kubernetes.io/projected/5a1d2259-7335-4b63-8f50-e2df84977856-kube-api-access-q6p6l\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:45 crc kubenswrapper[5129]: I0314 07:15:45.733715 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1d2259-7335-4b63-8f50-e2df84977856-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:45 crc kubenswrapper[5129]: I0314 07:15:45.733732 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1d2259-7335-4b63-8f50-e2df84977856-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:46 crc kubenswrapper[5129]: I0314 07:15:46.093175 5129 generic.go:334] "Generic (PLEG): container finished" podID="5a1d2259-7335-4b63-8f50-e2df84977856" containerID="a1295b865bed2a3650965acae3d481c8f9e848e9d48aaa9655244674e57ef451" exitCode=0 Mar 14 07:15:46 crc kubenswrapper[5129]: I0314 07:15:46.093341 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hbqw2" Mar 14 07:15:46 crc kubenswrapper[5129]: I0314 07:15:46.093323 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hbqw2" event={"ID":"5a1d2259-7335-4b63-8f50-e2df84977856","Type":"ContainerDied","Data":"a1295b865bed2a3650965acae3d481c8f9e848e9d48aaa9655244674e57ef451"} Mar 14 07:15:46 crc kubenswrapper[5129]: I0314 07:15:46.093891 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hbqw2" event={"ID":"5a1d2259-7335-4b63-8f50-e2df84977856","Type":"ContainerDied","Data":"9665aef3712c6e7d2903ca28e7ddd301da3ce13f39263f1254fceef64b0a9cc6"} Mar 14 07:15:46 crc kubenswrapper[5129]: I0314 07:15:46.093928 5129 scope.go:117] "RemoveContainer" containerID="a1295b865bed2a3650965acae3d481c8f9e848e9d48aaa9655244674e57ef451" Mar 14 07:15:46 crc kubenswrapper[5129]: I0314 07:15:46.133295 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hbqw2"] Mar 14 07:15:46 crc kubenswrapper[5129]: I0314 07:15:46.140079 5129 scope.go:117] "RemoveContainer" containerID="5a1e2c645b2e1d2abff0b0b31bba7cad4cdf39cf63092c2e2cf913bd62f3e83c" Mar 14 07:15:46 crc kubenswrapper[5129]: I0314 07:15:46.142637 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hbqw2"] Mar 14 07:15:46 crc kubenswrapper[5129]: I0314 07:15:46.161197 5129 scope.go:117] "RemoveContainer" containerID="2ea7f0c414ce125305767b09be62c0baa2f92fdd122abe5248c4d168c9c9bc7a" Mar 14 07:15:46 crc kubenswrapper[5129]: I0314 07:15:46.187361 5129 scope.go:117] "RemoveContainer" containerID="a1295b865bed2a3650965acae3d481c8f9e848e9d48aaa9655244674e57ef451" Mar 14 07:15:46 crc kubenswrapper[5129]: E0314 07:15:46.187983 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a1295b865bed2a3650965acae3d481c8f9e848e9d48aaa9655244674e57ef451\": container with ID starting with a1295b865bed2a3650965acae3d481c8f9e848e9d48aaa9655244674e57ef451 not found: ID does not exist" containerID="a1295b865bed2a3650965acae3d481c8f9e848e9d48aaa9655244674e57ef451" Mar 14 07:15:46 crc kubenswrapper[5129]: I0314 07:15:46.188034 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1295b865bed2a3650965acae3d481c8f9e848e9d48aaa9655244674e57ef451"} err="failed to get container status \"a1295b865bed2a3650965acae3d481c8f9e848e9d48aaa9655244674e57ef451\": rpc error: code = NotFound desc = could not find container \"a1295b865bed2a3650965acae3d481c8f9e848e9d48aaa9655244674e57ef451\": container with ID starting with a1295b865bed2a3650965acae3d481c8f9e848e9d48aaa9655244674e57ef451 not found: ID does not exist" Mar 14 07:15:46 crc kubenswrapper[5129]: I0314 07:15:46.188067 5129 scope.go:117] "RemoveContainer" containerID="5a1e2c645b2e1d2abff0b0b31bba7cad4cdf39cf63092c2e2cf913bd62f3e83c" Mar 14 07:15:46 crc kubenswrapper[5129]: E0314 07:15:46.188566 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a1e2c645b2e1d2abff0b0b31bba7cad4cdf39cf63092c2e2cf913bd62f3e83c\": container with ID starting with 5a1e2c645b2e1d2abff0b0b31bba7cad4cdf39cf63092c2e2cf913bd62f3e83c not found: ID does not exist" containerID="5a1e2c645b2e1d2abff0b0b31bba7cad4cdf39cf63092c2e2cf913bd62f3e83c" Mar 14 07:15:46 crc kubenswrapper[5129]: I0314 07:15:46.188656 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a1e2c645b2e1d2abff0b0b31bba7cad4cdf39cf63092c2e2cf913bd62f3e83c"} err="failed to get container status \"5a1e2c645b2e1d2abff0b0b31bba7cad4cdf39cf63092c2e2cf913bd62f3e83c\": rpc error: code = NotFound desc = could not find container \"5a1e2c645b2e1d2abff0b0b31bba7cad4cdf39cf63092c2e2cf913bd62f3e83c\": container with ID 
starting with 5a1e2c645b2e1d2abff0b0b31bba7cad4cdf39cf63092c2e2cf913bd62f3e83c not found: ID does not exist" Mar 14 07:15:46 crc kubenswrapper[5129]: I0314 07:15:46.188689 5129 scope.go:117] "RemoveContainer" containerID="2ea7f0c414ce125305767b09be62c0baa2f92fdd122abe5248c4d168c9c9bc7a" Mar 14 07:15:46 crc kubenswrapper[5129]: E0314 07:15:46.189508 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ea7f0c414ce125305767b09be62c0baa2f92fdd122abe5248c4d168c9c9bc7a\": container with ID starting with 2ea7f0c414ce125305767b09be62c0baa2f92fdd122abe5248c4d168c9c9bc7a not found: ID does not exist" containerID="2ea7f0c414ce125305767b09be62c0baa2f92fdd122abe5248c4d168c9c9bc7a" Mar 14 07:15:46 crc kubenswrapper[5129]: I0314 07:15:46.189541 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ea7f0c414ce125305767b09be62c0baa2f92fdd122abe5248c4d168c9c9bc7a"} err="failed to get container status \"2ea7f0c414ce125305767b09be62c0baa2f92fdd122abe5248c4d168c9c9bc7a\": rpc error: code = NotFound desc = could not find container \"2ea7f0c414ce125305767b09be62c0baa2f92fdd122abe5248c4d168c9c9bc7a\": container with ID starting with 2ea7f0c414ce125305767b09be62c0baa2f92fdd122abe5248c4d168c9c9bc7a not found: ID does not exist" Mar 14 07:15:46 crc kubenswrapper[5129]: I0314 07:15:46.225340 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dff5s" Mar 14 07:15:46 crc kubenswrapper[5129]: I0314 07:15:46.225407 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dff5s" Mar 14 07:15:46 crc kubenswrapper[5129]: I0314 07:15:46.270901 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dff5s" Mar 14 07:15:47 crc kubenswrapper[5129]: I0314 07:15:47.137588 5129 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dff5s" Mar 14 07:15:48 crc kubenswrapper[5129]: I0314 07:15:48.043116 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a1d2259-7335-4b63-8f50-e2df84977856" path="/var/lib/kubelet/pods/5a1d2259-7335-4b63-8f50-e2df84977856/volumes" Mar 14 07:15:49 crc kubenswrapper[5129]: I0314 07:15:49.888587 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dff5s"] Mar 14 07:15:49 crc kubenswrapper[5129]: I0314 07:15:49.888912 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dff5s" podUID="a078d2e7-9855-4886-8bf9-372c610b5eff" containerName="registry-server" containerID="cri-o://1707221b498a323bc8ccc260d0bf5fa061ebb06488b6f9b51add584ee68590b8" gracePeriod=2 Mar 14 07:15:50 crc kubenswrapper[5129]: I0314 07:15:50.119185 5129 generic.go:334] "Generic (PLEG): container finished" podID="a078d2e7-9855-4886-8bf9-372c610b5eff" containerID="1707221b498a323bc8ccc260d0bf5fa061ebb06488b6f9b51add584ee68590b8" exitCode=0 Mar 14 07:15:50 crc kubenswrapper[5129]: I0314 07:15:50.119580 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dff5s" event={"ID":"a078d2e7-9855-4886-8bf9-372c610b5eff","Type":"ContainerDied","Data":"1707221b498a323bc8ccc260d0bf5fa061ebb06488b6f9b51add584ee68590b8"} Mar 14 07:15:50 crc kubenswrapper[5129]: I0314 07:15:50.221170 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dff5s" Mar 14 07:15:50 crc kubenswrapper[5129]: I0314 07:15:50.298982 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a078d2e7-9855-4886-8bf9-372c610b5eff-utilities\") pod \"a078d2e7-9855-4886-8bf9-372c610b5eff\" (UID: \"a078d2e7-9855-4886-8bf9-372c610b5eff\") " Mar 14 07:15:50 crc kubenswrapper[5129]: I0314 07:15:50.299101 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a078d2e7-9855-4886-8bf9-372c610b5eff-catalog-content\") pod \"a078d2e7-9855-4886-8bf9-372c610b5eff\" (UID: \"a078d2e7-9855-4886-8bf9-372c610b5eff\") " Mar 14 07:15:50 crc kubenswrapper[5129]: I0314 07:15:50.299155 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg6x9\" (UniqueName: \"kubernetes.io/projected/a078d2e7-9855-4886-8bf9-372c610b5eff-kube-api-access-cg6x9\") pod \"a078d2e7-9855-4886-8bf9-372c610b5eff\" (UID: \"a078d2e7-9855-4886-8bf9-372c610b5eff\") " Mar 14 07:15:50 crc kubenswrapper[5129]: I0314 07:15:50.300291 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a078d2e7-9855-4886-8bf9-372c610b5eff-utilities" (OuterVolumeSpecName: "utilities") pod "a078d2e7-9855-4886-8bf9-372c610b5eff" (UID: "a078d2e7-9855-4886-8bf9-372c610b5eff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:15:50 crc kubenswrapper[5129]: I0314 07:15:50.305829 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a078d2e7-9855-4886-8bf9-372c610b5eff-kube-api-access-cg6x9" (OuterVolumeSpecName: "kube-api-access-cg6x9") pod "a078d2e7-9855-4886-8bf9-372c610b5eff" (UID: "a078d2e7-9855-4886-8bf9-372c610b5eff"). InnerVolumeSpecName "kube-api-access-cg6x9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:15:50 crc kubenswrapper[5129]: I0314 07:15:50.365041 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a078d2e7-9855-4886-8bf9-372c610b5eff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a078d2e7-9855-4886-8bf9-372c610b5eff" (UID: "a078d2e7-9855-4886-8bf9-372c610b5eff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:15:50 crc kubenswrapper[5129]: I0314 07:15:50.400227 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a078d2e7-9855-4886-8bf9-372c610b5eff-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:50 crc kubenswrapper[5129]: I0314 07:15:50.400259 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a078d2e7-9855-4886-8bf9-372c610b5eff-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:50 crc kubenswrapper[5129]: I0314 07:15:50.400269 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg6x9\" (UniqueName: \"kubernetes.io/projected/a078d2e7-9855-4886-8bf9-372c610b5eff-kube-api-access-cg6x9\") on node \"crc\" DevicePath \"\"" Mar 14 07:15:51 crc kubenswrapper[5129]: I0314 07:15:51.130194 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dff5s" event={"ID":"a078d2e7-9855-4886-8bf9-372c610b5eff","Type":"ContainerDied","Data":"a79d847f48cf380b666b0975e4b3764ec3a6a88fac86e89144705d08c4d2c013"} Mar 14 07:15:51 crc kubenswrapper[5129]: I0314 07:15:51.130247 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dff5s" Mar 14 07:15:51 crc kubenswrapper[5129]: I0314 07:15:51.130259 5129 scope.go:117] "RemoveContainer" containerID="1707221b498a323bc8ccc260d0bf5fa061ebb06488b6f9b51add584ee68590b8" Mar 14 07:15:51 crc kubenswrapper[5129]: I0314 07:15:51.144553 5129 scope.go:117] "RemoveContainer" containerID="9acd8efc456f63c3a33a02566214656001e7a7cc94e54f91134e8043a22b8bd0" Mar 14 07:15:51 crc kubenswrapper[5129]: I0314 07:15:51.161280 5129 scope.go:117] "RemoveContainer" containerID="2354a1a96be8d5f5c5708bc7a411d2b3566b8150607ef36b4975f64cc5276bfb" Mar 14 07:15:51 crc kubenswrapper[5129]: I0314 07:15:51.171920 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dff5s"] Mar 14 07:15:51 crc kubenswrapper[5129]: I0314 07:15:51.177907 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dff5s"] Mar 14 07:15:52 crc kubenswrapper[5129]: I0314 07:15:52.046724 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a078d2e7-9855-4886-8bf9-372c610b5eff" path="/var/lib/kubelet/pods/a078d2e7-9855-4886-8bf9-372c610b5eff/volumes" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.057343 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn"] Mar 14 07:15:55 crc kubenswrapper[5129]: E0314 07:15:55.057736 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a078d2e7-9855-4886-8bf9-372c610b5eff" containerName="extract-content" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.057749 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="a078d2e7-9855-4886-8bf9-372c610b5eff" containerName="extract-content" Mar 14 07:15:55 crc kubenswrapper[5129]: E0314 07:15:55.057762 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2afae4f4-1faf-44cf-81ee-5f11553a1407" containerName="util" Mar 14 
07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.057771 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2afae4f4-1faf-44cf-81ee-5f11553a1407" containerName="util" Mar 14 07:15:55 crc kubenswrapper[5129]: E0314 07:15:55.057790 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2afae4f4-1faf-44cf-81ee-5f11553a1407" containerName="pull" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.057796 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2afae4f4-1faf-44cf-81ee-5f11553a1407" containerName="pull" Mar 14 07:15:55 crc kubenswrapper[5129]: E0314 07:15:55.057809 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1d2259-7335-4b63-8f50-e2df84977856" containerName="extract-content" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.057815 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1d2259-7335-4b63-8f50-e2df84977856" containerName="extract-content" Mar 14 07:15:55 crc kubenswrapper[5129]: E0314 07:15:55.057826 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92aed46c-5740-4407-81ed-4ff642a70c54" containerName="console" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.057832 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="92aed46c-5740-4407-81ed-4ff642a70c54" containerName="console" Mar 14 07:15:55 crc kubenswrapper[5129]: E0314 07:15:55.057840 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a078d2e7-9855-4886-8bf9-372c610b5eff" containerName="registry-server" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.057846 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="a078d2e7-9855-4886-8bf9-372c610b5eff" containerName="registry-server" Mar 14 07:15:55 crc kubenswrapper[5129]: E0314 07:15:55.057861 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1d2259-7335-4b63-8f50-e2df84977856" containerName="registry-server" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.057867 5129 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1d2259-7335-4b63-8f50-e2df84977856" containerName="registry-server" Mar 14 07:15:55 crc kubenswrapper[5129]: E0314 07:15:55.057874 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1d2259-7335-4b63-8f50-e2df84977856" containerName="extract-utilities" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.057880 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1d2259-7335-4b63-8f50-e2df84977856" containerName="extract-utilities" Mar 14 07:15:55 crc kubenswrapper[5129]: E0314 07:15:55.057886 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2afae4f4-1faf-44cf-81ee-5f11553a1407" containerName="extract" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.057892 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2afae4f4-1faf-44cf-81ee-5f11553a1407" containerName="extract" Mar 14 07:15:55 crc kubenswrapper[5129]: E0314 07:15:55.057904 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a078d2e7-9855-4886-8bf9-372c610b5eff" containerName="extract-utilities" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.057909 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="a078d2e7-9855-4886-8bf9-372c610b5eff" containerName="extract-utilities" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.058071 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2afae4f4-1faf-44cf-81ee-5f11553a1407" containerName="extract" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.058087 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="a078d2e7-9855-4886-8bf9-372c610b5eff" containerName="registry-server" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.058103 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a1d2259-7335-4b63-8f50-e2df84977856" containerName="registry-server" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.058115 5129 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="92aed46c-5740-4407-81ed-4ff642a70c54" containerName="console" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.058630 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.065587 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.065696 5129 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.066242 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.066282 5129 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-7g5xf" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.066362 5129 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.076446 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn"] Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.157564 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2c206b3c-9ece-49e5-8227-1bb654d4635d-webhook-cert\") pod \"metallb-operator-controller-manager-5dd66c8db4-67bnn\" (UID: \"2c206b3c-9ece-49e5-8227-1bb654d4635d\") " pod="metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.157630 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql258\" (UniqueName: \"kubernetes.io/projected/2c206b3c-9ece-49e5-8227-1bb654d4635d-kube-api-access-ql258\") pod \"metallb-operator-controller-manager-5dd66c8db4-67bnn\" (UID: \"2c206b3c-9ece-49e5-8227-1bb654d4635d\") " pod="metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.157672 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2c206b3c-9ece-49e5-8227-1bb654d4635d-apiservice-cert\") pod \"metallb-operator-controller-manager-5dd66c8db4-67bnn\" (UID: \"2c206b3c-9ece-49e5-8227-1bb654d4635d\") " pod="metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.259562 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2c206b3c-9ece-49e5-8227-1bb654d4635d-apiservice-cert\") pod \"metallb-operator-controller-manager-5dd66c8db4-67bnn\" (UID: \"2c206b3c-9ece-49e5-8227-1bb654d4635d\") " pod="metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.259751 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2c206b3c-9ece-49e5-8227-1bb654d4635d-webhook-cert\") pod \"metallb-operator-controller-manager-5dd66c8db4-67bnn\" (UID: \"2c206b3c-9ece-49e5-8227-1bb654d4635d\") " pod="metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.259807 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql258\" (UniqueName: \"kubernetes.io/projected/2c206b3c-9ece-49e5-8227-1bb654d4635d-kube-api-access-ql258\") pod 
\"metallb-operator-controller-manager-5dd66c8db4-67bnn\" (UID: \"2c206b3c-9ece-49e5-8227-1bb654d4635d\") " pod="metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.265756 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2c206b3c-9ece-49e5-8227-1bb654d4635d-webhook-cert\") pod \"metallb-operator-controller-manager-5dd66c8db4-67bnn\" (UID: \"2c206b3c-9ece-49e5-8227-1bb654d4635d\") " pod="metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.266797 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2c206b3c-9ece-49e5-8227-1bb654d4635d-apiservice-cert\") pod \"metallb-operator-controller-manager-5dd66c8db4-67bnn\" (UID: \"2c206b3c-9ece-49e5-8227-1bb654d4635d\") " pod="metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.287503 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql258\" (UniqueName: \"kubernetes.io/projected/2c206b3c-9ece-49e5-8227-1bb654d4635d-kube-api-access-ql258\") pod \"metallb-operator-controller-manager-5dd66c8db4-67bnn\" (UID: \"2c206b3c-9ece-49e5-8227-1bb654d4635d\") " pod="metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.306930 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc"] Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.307742 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.310757 5129 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.310858 5129 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-j55pb" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.311012 5129 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.324194 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc"] Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.362030 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b375b429-b57b-46b8-b616-a6a6723cf3c6-webhook-cert\") pod \"metallb-operator-webhook-server-767958b5d7-fxpkc\" (UID: \"b375b429-b57b-46b8-b616-a6a6723cf3c6\") " pod="metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.362074 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b375b429-b57b-46b8-b616-a6a6723cf3c6-apiservice-cert\") pod \"metallb-operator-webhook-server-767958b5d7-fxpkc\" (UID: \"b375b429-b57b-46b8-b616-a6a6723cf3c6\") " pod="metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.362124 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjddn\" (UniqueName: 
\"kubernetes.io/projected/b375b429-b57b-46b8-b616-a6a6723cf3c6-kube-api-access-wjddn\") pod \"metallb-operator-webhook-server-767958b5d7-fxpkc\" (UID: \"b375b429-b57b-46b8-b616-a6a6723cf3c6\") " pod="metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.385757 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.463768 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b375b429-b57b-46b8-b616-a6a6723cf3c6-webhook-cert\") pod \"metallb-operator-webhook-server-767958b5d7-fxpkc\" (UID: \"b375b429-b57b-46b8-b616-a6a6723cf3c6\") " pod="metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.464045 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b375b429-b57b-46b8-b616-a6a6723cf3c6-apiservice-cert\") pod \"metallb-operator-webhook-server-767958b5d7-fxpkc\" (UID: \"b375b429-b57b-46b8-b616-a6a6723cf3c6\") " pod="metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.464166 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjddn\" (UniqueName: \"kubernetes.io/projected/b375b429-b57b-46b8-b616-a6a6723cf3c6-kube-api-access-wjddn\") pod \"metallb-operator-webhook-server-767958b5d7-fxpkc\" (UID: \"b375b429-b57b-46b8-b616-a6a6723cf3c6\") " pod="metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.469221 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/b375b429-b57b-46b8-b616-a6a6723cf3c6-webhook-cert\") pod \"metallb-operator-webhook-server-767958b5d7-fxpkc\" (UID: \"b375b429-b57b-46b8-b616-a6a6723cf3c6\") " pod="metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.469230 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b375b429-b57b-46b8-b616-a6a6723cf3c6-apiservice-cert\") pod \"metallb-operator-webhook-server-767958b5d7-fxpkc\" (UID: \"b375b429-b57b-46b8-b616-a6a6723cf3c6\") " pod="metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.481893 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjddn\" (UniqueName: \"kubernetes.io/projected/b375b429-b57b-46b8-b616-a6a6723cf3c6-kube-api-access-wjddn\") pod \"metallb-operator-webhook-server-767958b5d7-fxpkc\" (UID: \"b375b429-b57b-46b8-b616-a6a6723cf3c6\") " pod="metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.605592 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn"] Mar 14 07:15:55 crc kubenswrapper[5129]: W0314 07:15:55.615325 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c206b3c_9ece_49e5_8227_1bb654d4635d.slice/crio-59600c6f4aff02cad3f78847d2240edd0d94ce96f89ffc6eff5857a4a6c2c9fa WatchSource:0}: Error finding container 59600c6f4aff02cad3f78847d2240edd0d94ce96f89ffc6eff5857a4a6c2c9fa: Status 404 returned error can't find the container with id 59600c6f4aff02cad3f78847d2240edd0d94ce96f89ffc6eff5857a4a6c2c9fa Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.678786 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc" Mar 14 07:15:55 crc kubenswrapper[5129]: I0314 07:15:55.941441 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc"] Mar 14 07:15:56 crc kubenswrapper[5129]: I0314 07:15:56.158034 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn" event={"ID":"2c206b3c-9ece-49e5-8227-1bb654d4635d","Type":"ContainerStarted","Data":"59600c6f4aff02cad3f78847d2240edd0d94ce96f89ffc6eff5857a4a6c2c9fa"} Mar 14 07:15:56 crc kubenswrapper[5129]: I0314 07:15:56.158815 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc" event={"ID":"b375b429-b57b-46b8-b616-a6a6723cf3c6","Type":"ContainerStarted","Data":"264eb2518e7d65b972fd4df400720a46fb2e4abf907d82934bb2d5a6d098ec45"} Mar 14 07:15:59 crc kubenswrapper[5129]: I0314 07:15:59.176505 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn" event={"ID":"2c206b3c-9ece-49e5-8227-1bb654d4635d","Type":"ContainerStarted","Data":"671a6b81428d08435d54c4830fdaba035f5234d4570660da7e9bb62dc34955d0"} Mar 14 07:15:59 crc kubenswrapper[5129]: I0314 07:15:59.177889 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn" Mar 14 07:15:59 crc kubenswrapper[5129]: I0314 07:15:59.201657 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn" podStartSLOduration=1.231748179 podStartE2EDuration="4.201639653s" podCreationTimestamp="2026-03-14 07:15:55 +0000 UTC" firstStartedPulling="2026-03-14 07:15:55.617860073 +0000 UTC m=+1018.369775267" lastFinishedPulling="2026-03-14 07:15:58.587751567 +0000 UTC 
m=+1021.339666741" observedRunningTime="2026-03-14 07:15:59.19631308 +0000 UTC m=+1021.948228264" watchObservedRunningTime="2026-03-14 07:15:59.201639653 +0000 UTC m=+1021.953554837" Mar 14 07:16:00 crc kubenswrapper[5129]: I0314 07:16:00.121556 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557876-2lhvz"] Mar 14 07:16:00 crc kubenswrapper[5129]: I0314 07:16:00.123312 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557876-2lhvz" Mar 14 07:16:00 crc kubenswrapper[5129]: I0314 07:16:00.126392 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557876-2lhvz"] Mar 14 07:16:00 crc kubenswrapper[5129]: I0314 07:16:00.126538 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:16:00 crc kubenswrapper[5129]: I0314 07:16:00.126570 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:16:00 crc kubenswrapper[5129]: I0314 07:16:00.126752 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:16:00 crc kubenswrapper[5129]: I0314 07:16:00.221225 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb6rs\" (UniqueName: \"kubernetes.io/projected/2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4-kube-api-access-kb6rs\") pod \"auto-csr-approver-29557876-2lhvz\" (UID: \"2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4\") " pod="openshift-infra/auto-csr-approver-29557876-2lhvz" Mar 14 07:16:00 crc kubenswrapper[5129]: I0314 07:16:00.322727 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb6rs\" (UniqueName: \"kubernetes.io/projected/2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4-kube-api-access-kb6rs\") pod \"auto-csr-approver-29557876-2lhvz\" 
(UID: \"2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4\") " pod="openshift-infra/auto-csr-approver-29557876-2lhvz" Mar 14 07:16:00 crc kubenswrapper[5129]: I0314 07:16:00.352359 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb6rs\" (UniqueName: \"kubernetes.io/projected/2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4-kube-api-access-kb6rs\") pod \"auto-csr-approver-29557876-2lhvz\" (UID: \"2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4\") " pod="openshift-infra/auto-csr-approver-29557876-2lhvz" Mar 14 07:16:00 crc kubenswrapper[5129]: I0314 07:16:00.445990 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557876-2lhvz" Mar 14 07:16:00 crc kubenswrapper[5129]: I0314 07:16:00.667159 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557876-2lhvz"] Mar 14 07:16:01 crc kubenswrapper[5129]: I0314 07:16:01.188480 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc" event={"ID":"b375b429-b57b-46b8-b616-a6a6723cf3c6","Type":"ContainerStarted","Data":"e9595d282cbaf6375f60452af26bdd4b61a128436a04248479ffc5e532e95f79"} Mar 14 07:16:01 crc kubenswrapper[5129]: I0314 07:16:01.188847 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc" Mar 14 07:16:01 crc kubenswrapper[5129]: I0314 07:16:01.189946 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557876-2lhvz" event={"ID":"2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4","Type":"ContainerStarted","Data":"2411be21fcba08ed2ad18be1f1a2f46ed3599403322b88656579b14488e03fe5"} Mar 14 07:16:01 crc kubenswrapper[5129]: I0314 07:16:01.218960 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc" podStartSLOduration=1.720487997 
podStartE2EDuration="6.218945133s" podCreationTimestamp="2026-03-14 07:15:55 +0000 UTC" firstStartedPulling="2026-03-14 07:15:55.957306789 +0000 UTC m=+1018.709221973" lastFinishedPulling="2026-03-14 07:16:00.455763925 +0000 UTC m=+1023.207679109" observedRunningTime="2026-03-14 07:16:01.212729918 +0000 UTC m=+1023.964645122" watchObservedRunningTime="2026-03-14 07:16:01.218945133 +0000 UTC m=+1023.970860317" Mar 14 07:16:02 crc kubenswrapper[5129]: I0314 07:16:02.197516 5129 generic.go:334] "Generic (PLEG): container finished" podID="2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4" containerID="9ab487e892f5b469301d30f09c86d4c09d4838db765c227e622aae61b0503382" exitCode=0 Mar 14 07:16:02 crc kubenswrapper[5129]: I0314 07:16:02.197647 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557876-2lhvz" event={"ID":"2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4","Type":"ContainerDied","Data":"9ab487e892f5b469301d30f09c86d4c09d4838db765c227e622aae61b0503382"} Mar 14 07:16:03 crc kubenswrapper[5129]: I0314 07:16:03.521569 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557876-2lhvz" Mar 14 07:16:03 crc kubenswrapper[5129]: I0314 07:16:03.665848 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb6rs\" (UniqueName: \"kubernetes.io/projected/2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4-kube-api-access-kb6rs\") pod \"2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4\" (UID: \"2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4\") " Mar 14 07:16:03 crc kubenswrapper[5129]: I0314 07:16:03.670561 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4-kube-api-access-kb6rs" (OuterVolumeSpecName: "kube-api-access-kb6rs") pod "2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4" (UID: "2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4"). InnerVolumeSpecName "kube-api-access-kb6rs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:16:03 crc kubenswrapper[5129]: I0314 07:16:03.767871 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb6rs\" (UniqueName: \"kubernetes.io/projected/2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4-kube-api-access-kb6rs\") on node \"crc\" DevicePath \"\"" Mar 14 07:16:04 crc kubenswrapper[5129]: I0314 07:16:04.210873 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557876-2lhvz" event={"ID":"2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4","Type":"ContainerDied","Data":"2411be21fcba08ed2ad18be1f1a2f46ed3599403322b88656579b14488e03fe5"} Mar 14 07:16:04 crc kubenswrapper[5129]: I0314 07:16:04.210916 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2411be21fcba08ed2ad18be1f1a2f46ed3599403322b88656579b14488e03fe5" Mar 14 07:16:04 crc kubenswrapper[5129]: I0314 07:16:04.210968 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557876-2lhvz" Mar 14 07:16:04 crc kubenswrapper[5129]: I0314 07:16:04.577635 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557870-d86vd"] Mar 14 07:16:04 crc kubenswrapper[5129]: I0314 07:16:04.581813 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557870-d86vd"] Mar 14 07:16:06 crc kubenswrapper[5129]: I0314 07:16:06.043298 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b7ec990-df65-411a-8d53-345e03dd77e1" path="/var/lib/kubelet/pods/1b7ec990-df65-411a-8d53-345e03dd77e1/volumes" Mar 14 07:16:12 crc kubenswrapper[5129]: I0314 07:16:12.063837 5129 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pod92aed46c-5740-4407-81ed-4ff642a70c54"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pod92aed46c-5740-4407-81ed-4ff642a70c54] : 
Timed out while waiting for systemd to remove kubepods-burstable-pod92aed46c_5740_4407_81ed_4ff642a70c54.slice" Mar 14 07:16:12 crc kubenswrapper[5129]: E0314 07:16:12.064243 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods burstable pod92aed46c-5740-4407-81ed-4ff642a70c54] : unable to destroy cgroup paths for cgroup [kubepods burstable pod92aed46c-5740-4407-81ed-4ff642a70c54] : Timed out while waiting for systemd to remove kubepods-burstable-pod92aed46c_5740_4407_81ed_4ff642a70c54.slice" pod="openshift-console/console-f9d7485db-vpw78" podUID="92aed46c-5740-4407-81ed-4ff642a70c54" Mar 14 07:16:12 crc kubenswrapper[5129]: I0314 07:16:12.260550 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vpw78" Mar 14 07:16:12 crc kubenswrapper[5129]: I0314 07:16:12.289199 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-vpw78"] Mar 14 07:16:12 crc kubenswrapper[5129]: I0314 07:16:12.293696 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-vpw78"] Mar 14 07:16:14 crc kubenswrapper[5129]: I0314 07:16:14.046740 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92aed46c-5740-4407-81ed-4ff642a70c54" path="/var/lib/kubelet/pods/92aed46c-5740-4407-81ed-4ff642a70c54/volumes" Mar 14 07:16:15 crc kubenswrapper[5129]: I0314 07:16:15.686425 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-767958b5d7-fxpkc" Mar 14 07:16:16 crc kubenswrapper[5129]: I0314 07:16:16.860679 5129 scope.go:117] "RemoveContainer" containerID="844a61e51b46cb765cedf8df29da5e9c329099b479aa6c82d2579d7d9734574d" Mar 14 07:16:35 crc kubenswrapper[5129]: I0314 07:16:35.390256 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-controller-manager-5dd66c8db4-67bnn" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.144789 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-88nmz"] Mar 14 07:16:36 crc kubenswrapper[5129]: E0314 07:16:36.145096 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4" containerName="oc" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.145110 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4" containerName="oc" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.145260 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4" containerName="oc" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.147570 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.149965 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-wwj7v"] Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.150364 5129 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-lkdr8" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.150644 5129 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.150838 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wwj7v" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.150903 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.156092 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-wwj7v"] Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.156279 5129 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.228729 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-79pdg"] Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.229550 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-79pdg" Mar 14 07:16:36 crc kubenswrapper[5129]: W0314 07:16:36.230950 5129 reflector.go:561] object-"metallb-system"/"speaker-certs-secret": failed to list *v1.Secret: secrets "speaker-certs-secret" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 14 07:16:36 crc kubenswrapper[5129]: E0314 07:16:36.231005 5129 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"speaker-certs-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"speaker-certs-secret\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:16:36 crc kubenswrapper[5129]: W0314 07:16:36.231104 5129 reflector.go:561] object-"metallb-system"/"metallb-memberlist": failed to list *v1.Secret: secrets "metallb-memberlist" is forbidden: User "system:node:crc" cannot list 
resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 14 07:16:36 crc kubenswrapper[5129]: E0314 07:16:36.231126 5129 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-memberlist\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-memberlist\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:16:36 crc kubenswrapper[5129]: W0314 07:16:36.231590 5129 reflector.go:561] object-"metallb-system"/"metallb-excludel2": failed to list *v1.ConfigMap: configmaps "metallb-excludel2" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 14 07:16:36 crc kubenswrapper[5129]: E0314 07:16:36.231641 5129 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-excludel2\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"metallb-excludel2\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:16:36 crc kubenswrapper[5129]: W0314 07:16:36.231787 5129 reflector.go:561] object-"metallb-system"/"speaker-dockercfg-hrzc4": failed to list *v1.Secret: secrets "speaker-dockercfg-hrzc4" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 14 07:16:36 crc kubenswrapper[5129]: E0314 07:16:36.231806 5129 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"speaker-dockercfg-hrzc4\": Failed to watch *v1.Secret: failed to list 
*v1.Secret: secrets \"speaker-dockercfg-hrzc4\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.245886 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-77prk"] Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.246758 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-77prk" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.249034 5129 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.265879 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-77prk"] Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.316843 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-metrics-certs\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.316893 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvd4v\" (UniqueName: \"kubernetes.io/projected/4c5ce029-cd89-4615-95e5-26e366269bc1-kube-api-access-rvd4v\") pod \"frr-k8s-webhook-server-bcc4b6f68-wwj7v\" (UID: \"4c5ce029-cd89-4615-95e5-26e366269bc1\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wwj7v" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.316983 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-frr-startup\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.317011 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c5ce029-cd89-4615-95e5-26e366269bc1-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-wwj7v\" (UID: \"4c5ce029-cd89-4615-95e5-26e366269bc1\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wwj7v" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.317034 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kzvc\" (UniqueName: \"kubernetes.io/projected/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-kube-api-access-2kzvc\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.317169 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-frr-conf\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.317217 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-frr-sockets\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.317302 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-metrics\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.317334 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-reloader\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.418191 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-frr-sockets\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.418244 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/509b7fd3-32d6-4c74-a48c-0e2ed674175d-memberlist\") pod \"speaker-79pdg\" (UID: \"509b7fd3-32d6-4c74-a48c-0e2ed674175d\") " pod="metallb-system/speaker-79pdg" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.418302 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/509b7fd3-32d6-4c74-a48c-0e2ed674175d-metallb-excludel2\") pod \"speaker-79pdg\" (UID: \"509b7fd3-32d6-4c74-a48c-0e2ed674175d\") " pod="metallb-system/speaker-79pdg" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.418331 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-metrics\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " 
pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.418352 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/509b7fd3-32d6-4c74-a48c-0e2ed674175d-metrics-certs\") pod \"speaker-79pdg\" (UID: \"509b7fd3-32d6-4c74-a48c-0e2ed674175d\") " pod="metallb-system/speaker-79pdg" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.418452 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-reloader\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.418503 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnk7h\" (UniqueName: \"kubernetes.io/projected/434ee85e-a99b-416e-b96c-9c019d77850a-kube-api-access-tnk7h\") pod \"controller-7bb4cc7c98-77prk\" (UID: \"434ee85e-a99b-416e-b96c-9c019d77850a\") " pod="metallb-system/controller-7bb4cc7c98-77prk" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.418568 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/434ee85e-a99b-416e-b96c-9c019d77850a-cert\") pod \"controller-7bb4cc7c98-77prk\" (UID: \"434ee85e-a99b-416e-b96c-9c019d77850a\") " pod="metallb-system/controller-7bb4cc7c98-77prk" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.418618 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-metrics-certs\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.418638 
5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/434ee85e-a99b-416e-b96c-9c019d77850a-metrics-certs\") pod \"controller-7bb4cc7c98-77prk\" (UID: \"434ee85e-a99b-416e-b96c-9c019d77850a\") " pod="metallb-system/controller-7bb4cc7c98-77prk" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.418689 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvd4v\" (UniqueName: \"kubernetes.io/projected/4c5ce029-cd89-4615-95e5-26e366269bc1-kube-api-access-rvd4v\") pod \"frr-k8s-webhook-server-bcc4b6f68-wwj7v\" (UID: \"4c5ce029-cd89-4615-95e5-26e366269bc1\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wwj7v" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.418720 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c5ce029-cd89-4615-95e5-26e366269bc1-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-wwj7v\" (UID: \"4c5ce029-cd89-4615-95e5-26e366269bc1\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wwj7v" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.418735 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-frr-startup\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.418754 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kzvc\" (UniqueName: \"kubernetes.io/projected/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-kube-api-access-2kzvc\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: E0314 07:16:36.418755 5129 secret.go:188] Couldn't get 
secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.418774 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwksw\" (UniqueName: \"kubernetes.io/projected/509b7fd3-32d6-4c74-a48c-0e2ed674175d-kube-api-access-gwksw\") pod \"speaker-79pdg\" (UID: \"509b7fd3-32d6-4c74-a48c-0e2ed674175d\") " pod="metallb-system/speaker-79pdg" Mar 14 07:16:36 crc kubenswrapper[5129]: E0314 07:16:36.418824 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-metrics-certs podName:a42d927d-7fdc-49e1-9c20-eb3f410a3b9b nodeName:}" failed. No retries permitted until 2026-03-14 07:16:36.918797695 +0000 UTC m=+1059.670712879 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-metrics-certs") pod "frr-k8s-88nmz" (UID: "a42d927d-7fdc-49e1-9c20-eb3f410a3b9b") : secret "frr-k8s-certs-secret" not found Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.418826 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-frr-sockets\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.418852 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-reloader\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.418858 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-frr-conf\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.419135 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-metrics\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.419164 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-frr-conf\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.419828 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-frr-startup\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.425560 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c5ce029-cd89-4615-95e5-26e366269bc1-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-wwj7v\" (UID: \"4c5ce029-cd89-4615-95e5-26e366269bc1\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wwj7v" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.434064 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kzvc\" (UniqueName: \"kubernetes.io/projected/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-kube-api-access-2kzvc\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc 
kubenswrapper[5129]: I0314 07:16:36.434632 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvd4v\" (UniqueName: \"kubernetes.io/projected/4c5ce029-cd89-4615-95e5-26e366269bc1-kube-api-access-rvd4v\") pod \"frr-k8s-webhook-server-bcc4b6f68-wwj7v\" (UID: \"4c5ce029-cd89-4615-95e5-26e366269bc1\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wwj7v" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.494375 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wwj7v" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.519751 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnk7h\" (UniqueName: \"kubernetes.io/projected/434ee85e-a99b-416e-b96c-9c019d77850a-kube-api-access-tnk7h\") pod \"controller-7bb4cc7c98-77prk\" (UID: \"434ee85e-a99b-416e-b96c-9c019d77850a\") " pod="metallb-system/controller-7bb4cc7c98-77prk" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.519798 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/434ee85e-a99b-416e-b96c-9c019d77850a-cert\") pod \"controller-7bb4cc7c98-77prk\" (UID: \"434ee85e-a99b-416e-b96c-9c019d77850a\") " pod="metallb-system/controller-7bb4cc7c98-77prk" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.519834 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/434ee85e-a99b-416e-b96c-9c019d77850a-metrics-certs\") pod \"controller-7bb4cc7c98-77prk\" (UID: \"434ee85e-a99b-416e-b96c-9c019d77850a\") " pod="metallb-system/controller-7bb4cc7c98-77prk" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.519868 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwksw\" (UniqueName: 
\"kubernetes.io/projected/509b7fd3-32d6-4c74-a48c-0e2ed674175d-kube-api-access-gwksw\") pod \"speaker-79pdg\" (UID: \"509b7fd3-32d6-4c74-a48c-0e2ed674175d\") " pod="metallb-system/speaker-79pdg" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.519927 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/509b7fd3-32d6-4c74-a48c-0e2ed674175d-memberlist\") pod \"speaker-79pdg\" (UID: \"509b7fd3-32d6-4c74-a48c-0e2ed674175d\") " pod="metallb-system/speaker-79pdg" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.519976 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/509b7fd3-32d6-4c74-a48c-0e2ed674175d-metallb-excludel2\") pod \"speaker-79pdg\" (UID: \"509b7fd3-32d6-4c74-a48c-0e2ed674175d\") " pod="metallb-system/speaker-79pdg" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.519999 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/509b7fd3-32d6-4c74-a48c-0e2ed674175d-metrics-certs\") pod \"speaker-79pdg\" (UID: \"509b7fd3-32d6-4c74-a48c-0e2ed674175d\") " pod="metallb-system/speaker-79pdg" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.522351 5129 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.525070 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/434ee85e-a99b-416e-b96c-9c019d77850a-metrics-certs\") pod \"controller-7bb4cc7c98-77prk\" (UID: \"434ee85e-a99b-416e-b96c-9c019d77850a\") " pod="metallb-system/controller-7bb4cc7c98-77prk" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.535156 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/434ee85e-a99b-416e-b96c-9c019d77850a-cert\") pod \"controller-7bb4cc7c98-77prk\" (UID: \"434ee85e-a99b-416e-b96c-9c019d77850a\") " pod="metallb-system/controller-7bb4cc7c98-77prk" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.538109 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnk7h\" (UniqueName: \"kubernetes.io/projected/434ee85e-a99b-416e-b96c-9c019d77850a-kube-api-access-tnk7h\") pod \"controller-7bb4cc7c98-77prk\" (UID: \"434ee85e-a99b-416e-b96c-9c019d77850a\") " pod="metallb-system/controller-7bb4cc7c98-77prk" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.542663 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwksw\" (UniqueName: \"kubernetes.io/projected/509b7fd3-32d6-4c74-a48c-0e2ed674175d-kube-api-access-gwksw\") pod \"speaker-79pdg\" (UID: \"509b7fd3-32d6-4c74-a48c-0e2ed674175d\") " pod="metallb-system/speaker-79pdg" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.560149 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-77prk" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.725869 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-wwj7v"] Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.771255 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-77prk"] Mar 14 07:16:36 crc kubenswrapper[5129]: W0314 07:16:36.776386 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod434ee85e_a99b_416e_b96c_9c019d77850a.slice/crio-9f6ee311c4cb07c677bcc476decc9cb4cc9faabeeb0e9fcb100e8bf83a5dbb10 WatchSource:0}: Error finding container 9f6ee311c4cb07c677bcc476decc9cb4cc9faabeeb0e9fcb100e8bf83a5dbb10: Status 404 returned error can't find the container with id 9f6ee311c4cb07c677bcc476decc9cb4cc9faabeeb0e9fcb100e8bf83a5dbb10 Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.929775 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-metrics-certs\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:36 crc kubenswrapper[5129]: I0314 07:16:36.939324 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42d927d-7fdc-49e1-9c20-eb3f410a3b9b-metrics-certs\") pod \"frr-k8s-88nmz\" (UID: \"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b\") " pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:37 crc kubenswrapper[5129]: I0314 07:16:37.076518 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:37 crc kubenswrapper[5129]: I0314 07:16:37.249001 5129 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 14 07:16:37 crc kubenswrapper[5129]: I0314 07:16:37.253451 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/509b7fd3-32d6-4c74-a48c-0e2ed674175d-metrics-certs\") pod \"speaker-79pdg\" (UID: \"509b7fd3-32d6-4c74-a48c-0e2ed674175d\") " pod="metallb-system/speaker-79pdg" Mar 14 07:16:37 crc kubenswrapper[5129]: I0314 07:16:37.308693 5129 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-hrzc4" Mar 14 07:16:37 crc kubenswrapper[5129]: I0314 07:16:37.405273 5129 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 14 07:16:37 crc kubenswrapper[5129]: I0314 07:16:37.413147 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wwj7v" event={"ID":"4c5ce029-cd89-4615-95e5-26e366269bc1","Type":"ContainerStarted","Data":"5d889329c2c5065811ff08e1dbc8d66dccda8997a7ca38861888da40446610bd"} Mar 14 07:16:37 crc kubenswrapper[5129]: I0314 07:16:37.414822 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-88nmz" event={"ID":"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b","Type":"ContainerStarted","Data":"4bbb1e36fa9f8450c4447c934dc988a55d06a37cf1fdc67980ef9a39c8ad96d8"} Mar 14 07:16:37 crc kubenswrapper[5129]: I0314 07:16:37.416147 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-77prk" event={"ID":"434ee85e-a99b-416e-b96c-9c019d77850a","Type":"ContainerStarted","Data":"29f0ff6a075c61abfec0c331be8f82c67196ecc97a1b27d02851a60896ba54a4"} Mar 14 07:16:37 crc kubenswrapper[5129]: I0314 07:16:37.416174 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/controller-7bb4cc7c98-77prk" event={"ID":"434ee85e-a99b-416e-b96c-9c019d77850a","Type":"ContainerStarted","Data":"cae5034cf17d20628d7e23447b7b66cbfa3bb5b29c616b634241e303c0bfa93b"} Mar 14 07:16:37 crc kubenswrapper[5129]: I0314 07:16:37.416187 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-77prk" event={"ID":"434ee85e-a99b-416e-b96c-9c019d77850a","Type":"ContainerStarted","Data":"9f6ee311c4cb07c677bcc476decc9cb4cc9faabeeb0e9fcb100e8bf83a5dbb10"} Mar 14 07:16:37 crc kubenswrapper[5129]: I0314 07:16:37.416351 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-77prk" Mar 14 07:16:37 crc kubenswrapper[5129]: I0314 07:16:37.416826 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/509b7fd3-32d6-4c74-a48c-0e2ed674175d-memberlist\") pod \"speaker-79pdg\" (UID: \"509b7fd3-32d6-4c74-a48c-0e2ed674175d\") " pod="metallb-system/speaker-79pdg" Mar 14 07:16:37 crc kubenswrapper[5129]: I0314 07:16:37.448762 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-77prk" podStartSLOduration=1.448731031 podStartE2EDuration="1.448731031s" podCreationTimestamp="2026-03-14 07:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:16:37.4450526 +0000 UTC m=+1060.196967824" watchObservedRunningTime="2026-03-14 07:16:37.448731031 +0000 UTC m=+1060.200646255" Mar 14 07:16:37 crc kubenswrapper[5129]: I0314 07:16:37.493676 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 14 07:16:37 crc kubenswrapper[5129]: I0314 07:16:37.502500 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/509b7fd3-32d6-4c74-a48c-0e2ed674175d-metallb-excludel2\") pod \"speaker-79pdg\" (UID: \"509b7fd3-32d6-4c74-a48c-0e2ed674175d\") " pod="metallb-system/speaker-79pdg" Mar 14 07:16:37 crc kubenswrapper[5129]: I0314 07:16:37.746120 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-79pdg" Mar 14 07:16:37 crc kubenswrapper[5129]: W0314 07:16:37.803366 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod509b7fd3_32d6_4c74_a48c_0e2ed674175d.slice/crio-9e2c6505c5c165f1a9ce91070bccfb093e4e6a9abd35aad23802770a4b1bb893 WatchSource:0}: Error finding container 9e2c6505c5c165f1a9ce91070bccfb093e4e6a9abd35aad23802770a4b1bb893: Status 404 returned error can't find the container with id 9e2c6505c5c165f1a9ce91070bccfb093e4e6a9abd35aad23802770a4b1bb893 Mar 14 07:16:38 crc kubenswrapper[5129]: I0314 07:16:38.431857 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-79pdg" event={"ID":"509b7fd3-32d6-4c74-a48c-0e2ed674175d","Type":"ContainerStarted","Data":"8602db69ea10bc9fe629af596208352d94324177c9c175c1569fc213147a2bd6"} Mar 14 07:16:38 crc kubenswrapper[5129]: I0314 07:16:38.432174 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-79pdg" event={"ID":"509b7fd3-32d6-4c74-a48c-0e2ed674175d","Type":"ContainerStarted","Data":"c2e0ccaee583cd26f83d3020db2510e4ae94f44307c987d756262ca8de2f6e04"} Mar 14 07:16:38 crc kubenswrapper[5129]: I0314 07:16:38.432184 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-79pdg" event={"ID":"509b7fd3-32d6-4c74-a48c-0e2ed674175d","Type":"ContainerStarted","Data":"9e2c6505c5c165f1a9ce91070bccfb093e4e6a9abd35aad23802770a4b1bb893"} Mar 14 07:16:38 crc kubenswrapper[5129]: I0314 07:16:38.432705 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-79pdg" Mar 14 
07:16:38 crc kubenswrapper[5129]: I0314 07:16:38.466155 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-79pdg" podStartSLOduration=2.466136005 podStartE2EDuration="2.466136005s" podCreationTimestamp="2026-03-14 07:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:16:38.463570532 +0000 UTC m=+1061.215485716" watchObservedRunningTime="2026-03-14 07:16:38.466136005 +0000 UTC m=+1061.218051189" Mar 14 07:16:44 crc kubenswrapper[5129]: I0314 07:16:44.483869 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wwj7v" event={"ID":"4c5ce029-cd89-4615-95e5-26e366269bc1","Type":"ContainerStarted","Data":"2a0117a4518ee46db893db9b02f7d87c4710b159bf651b424bb3f0e3dfe26063"} Mar 14 07:16:44 crc kubenswrapper[5129]: I0314 07:16:44.484362 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wwj7v" Mar 14 07:16:44 crc kubenswrapper[5129]: I0314 07:16:44.486753 5129 generic.go:334] "Generic (PLEG): container finished" podID="a42d927d-7fdc-49e1-9c20-eb3f410a3b9b" containerID="00b96bc5a7ad5ecec7bf53204ff38a33396c46e4619d8eb84875055d35ac4f01" exitCode=0 Mar 14 07:16:44 crc kubenswrapper[5129]: I0314 07:16:44.486781 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-88nmz" event={"ID":"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b","Type":"ContainerDied","Data":"00b96bc5a7ad5ecec7bf53204ff38a33396c46e4619d8eb84875055d35ac4f01"} Mar 14 07:16:44 crc kubenswrapper[5129]: I0314 07:16:44.498354 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wwj7v" podStartSLOduration=1.881104971 podStartE2EDuration="8.498337294s" podCreationTimestamp="2026-03-14 07:16:36 +0000 UTC" firstStartedPulling="2026-03-14 07:16:36.739681166 
+0000 UTC m=+1059.491596350" lastFinishedPulling="2026-03-14 07:16:43.356913489 +0000 UTC m=+1066.108828673" observedRunningTime="2026-03-14 07:16:44.496978509 +0000 UTC m=+1067.248893683" watchObservedRunningTime="2026-03-14 07:16:44.498337294 +0000 UTC m=+1067.250252498" Mar 14 07:16:45 crc kubenswrapper[5129]: I0314 07:16:45.495799 5129 generic.go:334] "Generic (PLEG): container finished" podID="a42d927d-7fdc-49e1-9c20-eb3f410a3b9b" containerID="66bbe037b4e3efeaad086c90906e391a91d622d9db8c263603b1cdc69b24e354" exitCode=0 Mar 14 07:16:45 crc kubenswrapper[5129]: I0314 07:16:45.495947 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-88nmz" event={"ID":"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b","Type":"ContainerDied","Data":"66bbe037b4e3efeaad086c90906e391a91d622d9db8c263603b1cdc69b24e354"} Mar 14 07:16:46 crc kubenswrapper[5129]: I0314 07:16:46.509835 5129 generic.go:334] "Generic (PLEG): container finished" podID="a42d927d-7fdc-49e1-9c20-eb3f410a3b9b" containerID="df0b023d33753486db7e93d216e0cf8222259cc605a18ced6024a5bdcbdeeaa2" exitCode=0 Mar 14 07:16:46 crc kubenswrapper[5129]: I0314 07:16:46.510063 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-88nmz" event={"ID":"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b","Type":"ContainerDied","Data":"df0b023d33753486db7e93d216e0cf8222259cc605a18ced6024a5bdcbdeeaa2"} Mar 14 07:16:46 crc kubenswrapper[5129]: I0314 07:16:46.565996 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-77prk" Mar 14 07:16:47 crc kubenswrapper[5129]: I0314 07:16:47.522098 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-88nmz" event={"ID":"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b","Type":"ContainerStarted","Data":"74555aeb330100607e0254ad2197d71e10564e46ffcb7064b10823b03b52a972"} Mar 14 07:16:47 crc kubenswrapper[5129]: I0314 07:16:47.522448 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-88nmz" event={"ID":"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b","Type":"ContainerStarted","Data":"1f295f84cc2c0c8103b25a0ed4ec3bf5ddebb869fbc7ecb03a12f7a29057385b"} Mar 14 07:16:47 crc kubenswrapper[5129]: I0314 07:16:47.522461 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-88nmz" event={"ID":"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b","Type":"ContainerStarted","Data":"647b9ed29224a123a91785a367be9cfae15f3f0c98309776eadda98d7d4d0e08"} Mar 14 07:16:47 crc kubenswrapper[5129]: I0314 07:16:47.522470 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-88nmz" event={"ID":"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b","Type":"ContainerStarted","Data":"08d7f6733fc82471f1e412659509c34634fdf6d73f8e148d0fdf7c62a1ee824b"} Mar 14 07:16:47 crc kubenswrapper[5129]: I0314 07:16:47.522478 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-88nmz" event={"ID":"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b","Type":"ContainerStarted","Data":"bfa09d13845ce62ad3963a605698413f8f04abf96a678c472459f552c8e29545"} Mar 14 07:16:48 crc kubenswrapper[5129]: I0314 07:16:48.543046 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-88nmz" event={"ID":"a42d927d-7fdc-49e1-9c20-eb3f410a3b9b","Type":"ContainerStarted","Data":"c0afac2b25fb1c609590ec99d1eba75cd1ab5f21226caecb95d9df86e0189dac"} Mar 14 07:16:48 crc kubenswrapper[5129]: I0314 07:16:48.543419 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:48 crc kubenswrapper[5129]: I0314 07:16:48.577826 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-88nmz" podStartSLOduration=6.377902256 podStartE2EDuration="12.577800175s" podCreationTimestamp="2026-03-14 07:16:36 +0000 UTC" firstStartedPulling="2026-03-14 07:16:37.163208834 +0000 UTC m=+1059.915124018" lastFinishedPulling="2026-03-14 07:16:43.363106753 
+0000 UTC m=+1066.115021937" observedRunningTime="2026-03-14 07:16:48.575076457 +0000 UTC m=+1071.326991681" watchObservedRunningTime="2026-03-14 07:16:48.577800175 +0000 UTC m=+1071.329715399" Mar 14 07:16:52 crc kubenswrapper[5129]: I0314 07:16:52.077057 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:52 crc kubenswrapper[5129]: I0314 07:16:52.139746 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:56 crc kubenswrapper[5129]: I0314 07:16:56.499263 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wwj7v" Mar 14 07:16:57 crc kubenswrapper[5129]: I0314 07:16:57.080148 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-88nmz" Mar 14 07:16:57 crc kubenswrapper[5129]: I0314 07:16:57.752701 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-79pdg" Mar 14 07:16:59 crc kubenswrapper[5129]: I0314 07:16:59.218354 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6"] Mar 14 07:16:59 crc kubenswrapper[5129]: I0314 07:16:59.220316 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6" Mar 14 07:16:59 crc kubenswrapper[5129]: I0314 07:16:59.223725 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 14 07:16:59 crc kubenswrapper[5129]: I0314 07:16:59.225946 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6"] Mar 14 07:16:59 crc kubenswrapper[5129]: I0314 07:16:59.348614 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/322f38ef-7ed1-4282-8ac5-59fcd232784d-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6\" (UID: \"322f38ef-7ed1-4282-8ac5-59fcd232784d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6" Mar 14 07:16:59 crc kubenswrapper[5129]: I0314 07:16:59.348698 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/322f38ef-7ed1-4282-8ac5-59fcd232784d-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6\" (UID: \"322f38ef-7ed1-4282-8ac5-59fcd232784d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6" Mar 14 07:16:59 crc kubenswrapper[5129]: I0314 07:16:59.348904 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnwnq\" (UniqueName: \"kubernetes.io/projected/322f38ef-7ed1-4282-8ac5-59fcd232784d-kube-api-access-pnwnq\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6\" (UID: \"322f38ef-7ed1-4282-8ac5-59fcd232784d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6" Mar 14 07:16:59 crc kubenswrapper[5129]: 
I0314 07:16:59.450233 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/322f38ef-7ed1-4282-8ac5-59fcd232784d-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6\" (UID: \"322f38ef-7ed1-4282-8ac5-59fcd232784d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6" Mar 14 07:16:59 crc kubenswrapper[5129]: I0314 07:16:59.450310 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnwnq\" (UniqueName: \"kubernetes.io/projected/322f38ef-7ed1-4282-8ac5-59fcd232784d-kube-api-access-pnwnq\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6\" (UID: \"322f38ef-7ed1-4282-8ac5-59fcd232784d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6" Mar 14 07:16:59 crc kubenswrapper[5129]: I0314 07:16:59.450359 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/322f38ef-7ed1-4282-8ac5-59fcd232784d-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6\" (UID: \"322f38ef-7ed1-4282-8ac5-59fcd232784d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6" Mar 14 07:16:59 crc kubenswrapper[5129]: I0314 07:16:59.450831 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/322f38ef-7ed1-4282-8ac5-59fcd232784d-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6\" (UID: \"322f38ef-7ed1-4282-8ac5-59fcd232784d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6" Mar 14 07:16:59 crc kubenswrapper[5129]: I0314 07:16:59.450868 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/322f38ef-7ed1-4282-8ac5-59fcd232784d-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6\" (UID: \"322f38ef-7ed1-4282-8ac5-59fcd232784d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6" Mar 14 07:16:59 crc kubenswrapper[5129]: I0314 07:16:59.470751 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnwnq\" (UniqueName: \"kubernetes.io/projected/322f38ef-7ed1-4282-8ac5-59fcd232784d-kube-api-access-pnwnq\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6\" (UID: \"322f38ef-7ed1-4282-8ac5-59fcd232784d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6" Mar 14 07:16:59 crc kubenswrapper[5129]: I0314 07:16:59.537917 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6" Mar 14 07:16:59 crc kubenswrapper[5129]: I0314 07:16:59.974030 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6"] Mar 14 07:17:00 crc kubenswrapper[5129]: I0314 07:17:00.625458 5129 generic.go:334] "Generic (PLEG): container finished" podID="322f38ef-7ed1-4282-8ac5-59fcd232784d" containerID="7f89bf1c580c600ae37eb19cba8be1ec85e920d1b1cc185c6721311657dbf7b0" exitCode=0 Mar 14 07:17:00 crc kubenswrapper[5129]: I0314 07:17:00.625504 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6" event={"ID":"322f38ef-7ed1-4282-8ac5-59fcd232784d","Type":"ContainerDied","Data":"7f89bf1c580c600ae37eb19cba8be1ec85e920d1b1cc185c6721311657dbf7b0"} Mar 14 07:17:00 crc kubenswrapper[5129]: I0314 07:17:00.625539 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6" event={"ID":"322f38ef-7ed1-4282-8ac5-59fcd232784d","Type":"ContainerStarted","Data":"d06971e5037bb5675f70bf14c8a1be967bb1a46b5a1584daa4b806cb4a740459"} Mar 14 07:17:03 crc kubenswrapper[5129]: I0314 07:17:03.643176 5129 generic.go:334] "Generic (PLEG): container finished" podID="322f38ef-7ed1-4282-8ac5-59fcd232784d" containerID="ba1cfe492f6abb58449fe0760a96ab15bc8eb50fe528c7b6fd8bff05ac9673cc" exitCode=0 Mar 14 07:17:03 crc kubenswrapper[5129]: I0314 07:17:03.643275 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6" event={"ID":"322f38ef-7ed1-4282-8ac5-59fcd232784d","Type":"ContainerDied","Data":"ba1cfe492f6abb58449fe0760a96ab15bc8eb50fe528c7b6fd8bff05ac9673cc"} Mar 14 07:17:04 crc kubenswrapper[5129]: I0314 07:17:04.654374 5129 generic.go:334] "Generic (PLEG): container finished" podID="322f38ef-7ed1-4282-8ac5-59fcd232784d" containerID="d49fb8e481002a06b76c55d795b3f14d969d3c2c77bcc2dd1ef0b9a05a243153" exitCode=0 Mar 14 07:17:04 crc kubenswrapper[5129]: I0314 07:17:04.654427 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6" event={"ID":"322f38ef-7ed1-4282-8ac5-59fcd232784d","Type":"ContainerDied","Data":"d49fb8e481002a06b76c55d795b3f14d969d3c2c77bcc2dd1ef0b9a05a243153"} Mar 14 07:17:05 crc kubenswrapper[5129]: I0314 07:17:05.871682 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6" Mar 14 07:17:06 crc kubenswrapper[5129]: I0314 07:17:06.041776 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnwnq\" (UniqueName: \"kubernetes.io/projected/322f38ef-7ed1-4282-8ac5-59fcd232784d-kube-api-access-pnwnq\") pod \"322f38ef-7ed1-4282-8ac5-59fcd232784d\" (UID: \"322f38ef-7ed1-4282-8ac5-59fcd232784d\") " Mar 14 07:17:06 crc kubenswrapper[5129]: I0314 07:17:06.042225 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/322f38ef-7ed1-4282-8ac5-59fcd232784d-bundle\") pod \"322f38ef-7ed1-4282-8ac5-59fcd232784d\" (UID: \"322f38ef-7ed1-4282-8ac5-59fcd232784d\") " Mar 14 07:17:06 crc kubenswrapper[5129]: I0314 07:17:06.042371 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/322f38ef-7ed1-4282-8ac5-59fcd232784d-util\") pod \"322f38ef-7ed1-4282-8ac5-59fcd232784d\" (UID: \"322f38ef-7ed1-4282-8ac5-59fcd232784d\") " Mar 14 07:17:06 crc kubenswrapper[5129]: I0314 07:17:06.044093 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/322f38ef-7ed1-4282-8ac5-59fcd232784d-bundle" (OuterVolumeSpecName: "bundle") pod "322f38ef-7ed1-4282-8ac5-59fcd232784d" (UID: "322f38ef-7ed1-4282-8ac5-59fcd232784d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:17:06 crc kubenswrapper[5129]: I0314 07:17:06.056825 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/322f38ef-7ed1-4282-8ac5-59fcd232784d-kube-api-access-pnwnq" (OuterVolumeSpecName: "kube-api-access-pnwnq") pod "322f38ef-7ed1-4282-8ac5-59fcd232784d" (UID: "322f38ef-7ed1-4282-8ac5-59fcd232784d"). InnerVolumeSpecName "kube-api-access-pnwnq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:17:06 crc kubenswrapper[5129]: I0314 07:17:06.068881 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/322f38ef-7ed1-4282-8ac5-59fcd232784d-util" (OuterVolumeSpecName: "util") pod "322f38ef-7ed1-4282-8ac5-59fcd232784d" (UID: "322f38ef-7ed1-4282-8ac5-59fcd232784d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:17:06 crc kubenswrapper[5129]: I0314 07:17:06.143473 5129 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/322f38ef-7ed1-4282-8ac5-59fcd232784d-util\") on node \"crc\" DevicePath \"\"" Mar 14 07:17:06 crc kubenswrapper[5129]: I0314 07:17:06.143513 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnwnq\" (UniqueName: \"kubernetes.io/projected/322f38ef-7ed1-4282-8ac5-59fcd232784d-kube-api-access-pnwnq\") on node \"crc\" DevicePath \"\"" Mar 14 07:17:06 crc kubenswrapper[5129]: I0314 07:17:06.143524 5129 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/322f38ef-7ed1-4282-8ac5-59fcd232784d-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:17:06 crc kubenswrapper[5129]: I0314 07:17:06.683043 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6" event={"ID":"322f38ef-7ed1-4282-8ac5-59fcd232784d","Type":"ContainerDied","Data":"d06971e5037bb5675f70bf14c8a1be967bb1a46b5a1584daa4b806cb4a740459"} Mar 14 07:17:06 crc kubenswrapper[5129]: I0314 07:17:06.683126 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d06971e5037bb5675f70bf14c8a1be967bb1a46b5a1584daa4b806cb4a740459" Mar 14 07:17:06 crc kubenswrapper[5129]: I0314 07:17:06.683188 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6" Mar 14 07:17:09 crc kubenswrapper[5129]: I0314 07:17:09.142664 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cp2jz"] Mar 14 07:17:09 crc kubenswrapper[5129]: E0314 07:17:09.143119 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="322f38ef-7ed1-4282-8ac5-59fcd232784d" containerName="util" Mar 14 07:17:09 crc kubenswrapper[5129]: I0314 07:17:09.143131 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="322f38ef-7ed1-4282-8ac5-59fcd232784d" containerName="util" Mar 14 07:17:09 crc kubenswrapper[5129]: E0314 07:17:09.143142 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="322f38ef-7ed1-4282-8ac5-59fcd232784d" containerName="pull" Mar 14 07:17:09 crc kubenswrapper[5129]: I0314 07:17:09.143147 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="322f38ef-7ed1-4282-8ac5-59fcd232784d" containerName="pull" Mar 14 07:17:09 crc kubenswrapper[5129]: E0314 07:17:09.143155 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="322f38ef-7ed1-4282-8ac5-59fcd232784d" containerName="extract" Mar 14 07:17:09 crc kubenswrapper[5129]: I0314 07:17:09.143163 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="322f38ef-7ed1-4282-8ac5-59fcd232784d" containerName="extract" Mar 14 07:17:09 crc kubenswrapper[5129]: I0314 07:17:09.143266 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="322f38ef-7ed1-4282-8ac5-59fcd232784d" containerName="extract" Mar 14 07:17:09 crc kubenswrapper[5129]: I0314 07:17:09.143689 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cp2jz" Mar 14 07:17:09 crc kubenswrapper[5129]: I0314 07:17:09.145973 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 14 07:17:09 crc kubenswrapper[5129]: I0314 07:17:09.146045 5129 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-qpbbs" Mar 14 07:17:09 crc kubenswrapper[5129]: I0314 07:17:09.146158 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 14 07:17:09 crc kubenswrapper[5129]: I0314 07:17:09.163050 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cp2jz"] Mar 14 07:17:09 crc kubenswrapper[5129]: I0314 07:17:09.283095 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cl52\" (UniqueName: \"kubernetes.io/projected/58d27422-4ea2-4e53-a895-adaf4878bf22-kube-api-access-6cl52\") pod \"cert-manager-operator-controller-manager-66c8bdd694-cp2jz\" (UID: \"58d27422-4ea2-4e53-a895-adaf4878bf22\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cp2jz" Mar 14 07:17:09 crc kubenswrapper[5129]: I0314 07:17:09.283164 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/58d27422-4ea2-4e53-a895-adaf4878bf22-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-cp2jz\" (UID: \"58d27422-4ea2-4e53-a895-adaf4878bf22\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cp2jz" Mar 14 07:17:09 crc kubenswrapper[5129]: I0314 07:17:09.384972 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6cl52\" (UniqueName: \"kubernetes.io/projected/58d27422-4ea2-4e53-a895-adaf4878bf22-kube-api-access-6cl52\") pod \"cert-manager-operator-controller-manager-66c8bdd694-cp2jz\" (UID: \"58d27422-4ea2-4e53-a895-adaf4878bf22\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cp2jz" Mar 14 07:17:09 crc kubenswrapper[5129]: I0314 07:17:09.385154 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/58d27422-4ea2-4e53-a895-adaf4878bf22-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-cp2jz\" (UID: \"58d27422-4ea2-4e53-a895-adaf4878bf22\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cp2jz" Mar 14 07:17:09 crc kubenswrapper[5129]: I0314 07:17:09.386358 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/58d27422-4ea2-4e53-a895-adaf4878bf22-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-cp2jz\" (UID: \"58d27422-4ea2-4e53-a895-adaf4878bf22\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cp2jz" Mar 14 07:17:09 crc kubenswrapper[5129]: I0314 07:17:09.404589 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cl52\" (UniqueName: \"kubernetes.io/projected/58d27422-4ea2-4e53-a895-adaf4878bf22-kube-api-access-6cl52\") pod \"cert-manager-operator-controller-manager-66c8bdd694-cp2jz\" (UID: \"58d27422-4ea2-4e53-a895-adaf4878bf22\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cp2jz" Mar 14 07:17:09 crc kubenswrapper[5129]: I0314 07:17:09.457124 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cp2jz" Mar 14 07:17:09 crc kubenswrapper[5129]: I0314 07:17:09.885148 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cp2jz"] Mar 14 07:17:10 crc kubenswrapper[5129]: I0314 07:17:10.707273 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cp2jz" event={"ID":"58d27422-4ea2-4e53-a895-adaf4878bf22","Type":"ContainerStarted","Data":"58d953ca9a20eb36a2fc7758798080a7863db3e54dfa42b8103e36d334dd850e"} Mar 14 07:17:12 crc kubenswrapper[5129]: I0314 07:17:12.719380 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cp2jz" event={"ID":"58d27422-4ea2-4e53-a895-adaf4878bf22","Type":"ContainerStarted","Data":"79e9d7620ce3d9e65e611fd7dd078459cf38fb4570907a8dd6689fd39a389ac5"} Mar 14 07:17:12 crc kubenswrapper[5129]: I0314 07:17:12.741406 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cp2jz" podStartSLOduration=1.090914188 podStartE2EDuration="3.741386916s" podCreationTimestamp="2026-03-14 07:17:09 +0000 UTC" firstStartedPulling="2026-03-14 07:17:09.892935278 +0000 UTC m=+1092.644850472" lastFinishedPulling="2026-03-14 07:17:12.543408016 +0000 UTC m=+1095.295323200" observedRunningTime="2026-03-14 07:17:12.736495213 +0000 UTC m=+1095.488410407" watchObservedRunningTime="2026-03-14 07:17:12.741386916 +0000 UTC m=+1095.493302100" Mar 14 07:17:15 crc kubenswrapper[5129]: I0314 07:17:15.554554 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-kgd5m"] Mar 14 07:17:15 crc kubenswrapper[5129]: I0314 07:17:15.555446 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-kgd5m" Mar 14 07:17:15 crc kubenswrapper[5129]: I0314 07:17:15.557129 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 14 07:17:15 crc kubenswrapper[5129]: I0314 07:17:15.557242 5129 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-lwzwd" Mar 14 07:17:15 crc kubenswrapper[5129]: I0314 07:17:15.560477 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 14 07:17:15 crc kubenswrapper[5129]: I0314 07:17:15.560993 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8fef9f0a-fc01-4562-ab16-343197009953-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-kgd5m\" (UID: \"8fef9f0a-fc01-4562-ab16-343197009953\") " pod="cert-manager/cert-manager-webhook-6888856db4-kgd5m" Mar 14 07:17:15 crc kubenswrapper[5129]: I0314 07:17:15.561072 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt2bc\" (UniqueName: \"kubernetes.io/projected/8fef9f0a-fc01-4562-ab16-343197009953-kube-api-access-qt2bc\") pod \"cert-manager-webhook-6888856db4-kgd5m\" (UID: \"8fef9f0a-fc01-4562-ab16-343197009953\") " pod="cert-manager/cert-manager-webhook-6888856db4-kgd5m" Mar 14 07:17:15 crc kubenswrapper[5129]: I0314 07:17:15.577190 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-kgd5m"] Mar 14 07:17:15 crc kubenswrapper[5129]: I0314 07:17:15.661746 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt2bc\" (UniqueName: \"kubernetes.io/projected/8fef9f0a-fc01-4562-ab16-343197009953-kube-api-access-qt2bc\") pod \"cert-manager-webhook-6888856db4-kgd5m\" (UID: 
\"8fef9f0a-fc01-4562-ab16-343197009953\") " pod="cert-manager/cert-manager-webhook-6888856db4-kgd5m" Mar 14 07:17:15 crc kubenswrapper[5129]: I0314 07:17:15.662001 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8fef9f0a-fc01-4562-ab16-343197009953-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-kgd5m\" (UID: \"8fef9f0a-fc01-4562-ab16-343197009953\") " pod="cert-manager/cert-manager-webhook-6888856db4-kgd5m" Mar 14 07:17:15 crc kubenswrapper[5129]: I0314 07:17:15.678845 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8fef9f0a-fc01-4562-ab16-343197009953-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-kgd5m\" (UID: \"8fef9f0a-fc01-4562-ab16-343197009953\") " pod="cert-manager/cert-manager-webhook-6888856db4-kgd5m" Mar 14 07:17:15 crc kubenswrapper[5129]: I0314 07:17:15.683870 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt2bc\" (UniqueName: \"kubernetes.io/projected/8fef9f0a-fc01-4562-ab16-343197009953-kube-api-access-qt2bc\") pod \"cert-manager-webhook-6888856db4-kgd5m\" (UID: \"8fef9f0a-fc01-4562-ab16-343197009953\") " pod="cert-manager/cert-manager-webhook-6888856db4-kgd5m" Mar 14 07:17:15 crc kubenswrapper[5129]: I0314 07:17:15.872034 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-kgd5m" Mar 14 07:17:16 crc kubenswrapper[5129]: I0314 07:17:16.376453 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-kgd5m"] Mar 14 07:17:16 crc kubenswrapper[5129]: I0314 07:17:16.740195 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-kgd5m" event={"ID":"8fef9f0a-fc01-4562-ab16-343197009953","Type":"ContainerStarted","Data":"6fc3bd88e59c773c641bd91305aa53c8fb3280da5d6f032c3cbf34e73c925510"} Mar 14 07:17:17 crc kubenswrapper[5129]: I0314 07:17:17.926707 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-z95qp"] Mar 14 07:17:17 crc kubenswrapper[5129]: I0314 07:17:17.927668 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-z95qp" Mar 14 07:17:17 crc kubenswrapper[5129]: I0314 07:17:17.929610 5129 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-nmhxd" Mar 14 07:17:17 crc kubenswrapper[5129]: I0314 07:17:17.940230 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-z95qp"] Mar 14 07:17:17 crc kubenswrapper[5129]: I0314 07:17:17.992927 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38b33329-e9fa-4b4c-8287-e85ece3d93d6-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-z95qp\" (UID: \"38b33329-e9fa-4b4c-8287-e85ece3d93d6\") " pod="cert-manager/cert-manager-cainjector-5545bd876-z95qp" Mar 14 07:17:17 crc kubenswrapper[5129]: I0314 07:17:17.992993 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bnpc\" (UniqueName: 
\"kubernetes.io/projected/38b33329-e9fa-4b4c-8287-e85ece3d93d6-kube-api-access-6bnpc\") pod \"cert-manager-cainjector-5545bd876-z95qp\" (UID: \"38b33329-e9fa-4b4c-8287-e85ece3d93d6\") " pod="cert-manager/cert-manager-cainjector-5545bd876-z95qp" Mar 14 07:17:18 crc kubenswrapper[5129]: I0314 07:17:18.095185 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bnpc\" (UniqueName: \"kubernetes.io/projected/38b33329-e9fa-4b4c-8287-e85ece3d93d6-kube-api-access-6bnpc\") pod \"cert-manager-cainjector-5545bd876-z95qp\" (UID: \"38b33329-e9fa-4b4c-8287-e85ece3d93d6\") " pod="cert-manager/cert-manager-cainjector-5545bd876-z95qp" Mar 14 07:17:18 crc kubenswrapper[5129]: I0314 07:17:18.095309 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38b33329-e9fa-4b4c-8287-e85ece3d93d6-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-z95qp\" (UID: \"38b33329-e9fa-4b4c-8287-e85ece3d93d6\") " pod="cert-manager/cert-manager-cainjector-5545bd876-z95qp" Mar 14 07:17:18 crc kubenswrapper[5129]: I0314 07:17:18.123439 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38b33329-e9fa-4b4c-8287-e85ece3d93d6-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-z95qp\" (UID: \"38b33329-e9fa-4b4c-8287-e85ece3d93d6\") " pod="cert-manager/cert-manager-cainjector-5545bd876-z95qp" Mar 14 07:17:18 crc kubenswrapper[5129]: I0314 07:17:18.124113 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bnpc\" (UniqueName: \"kubernetes.io/projected/38b33329-e9fa-4b4c-8287-e85ece3d93d6-kube-api-access-6bnpc\") pod \"cert-manager-cainjector-5545bd876-z95qp\" (UID: \"38b33329-e9fa-4b4c-8287-e85ece3d93d6\") " pod="cert-manager/cert-manager-cainjector-5545bd876-z95qp" Mar 14 07:17:18 crc kubenswrapper[5129]: I0314 07:17:18.253041 5129 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-z95qp" Mar 14 07:17:18 crc kubenswrapper[5129]: I0314 07:17:18.483199 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-z95qp"] Mar 14 07:17:18 crc kubenswrapper[5129]: I0314 07:17:18.753188 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-z95qp" event={"ID":"38b33329-e9fa-4b4c-8287-e85ece3d93d6","Type":"ContainerStarted","Data":"6db386801c92d78317eab3fe6d45aa8c1c4d7fe3f7e1f6ecd4f795869c8edff1"} Mar 14 07:17:19 crc kubenswrapper[5129]: I0314 07:17:19.574884 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:17:19 crc kubenswrapper[5129]: I0314 07:17:19.574934 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:17:22 crc kubenswrapper[5129]: I0314 07:17:22.115590 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-kgd5m" event={"ID":"8fef9f0a-fc01-4562-ab16-343197009953","Type":"ContainerStarted","Data":"183d5addc0dc8fdef2932f0b42792b8c83817c28a9bfa95ce613319635a97d70"} Mar 14 07:17:22 crc kubenswrapper[5129]: I0314 07:17:22.117467 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-kgd5m" Mar 14 07:17:22 crc kubenswrapper[5129]: I0314 07:17:22.118191 5129 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="cert-manager/cert-manager-cainjector-5545bd876-z95qp" event={"ID":"38b33329-e9fa-4b4c-8287-e85ece3d93d6","Type":"ContainerStarted","Data":"da562bb59005b6ec9cd2bb1115b296573ea74bc842a003200dcebd4ec86d32f2"} Mar 14 07:17:22 crc kubenswrapper[5129]: I0314 07:17:22.145758 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-kgd5m" podStartSLOduration=2.300548081 podStartE2EDuration="7.145736493s" podCreationTimestamp="2026-03-14 07:17:15 +0000 UTC" firstStartedPulling="2026-03-14 07:17:16.382251661 +0000 UTC m=+1099.134166885" lastFinishedPulling="2026-03-14 07:17:21.227440103 +0000 UTC m=+1103.979355297" observedRunningTime="2026-03-14 07:17:22.13762315 +0000 UTC m=+1104.889538344" watchObservedRunningTime="2026-03-14 07:17:22.145736493 +0000 UTC m=+1104.897651707" Mar 14 07:17:22 crc kubenswrapper[5129]: I0314 07:17:22.155212 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-z95qp" podStartSLOduration=2.422114352 podStartE2EDuration="5.155191382s" podCreationTimestamp="2026-03-14 07:17:17 +0000 UTC" firstStartedPulling="2026-03-14 07:17:18.494803615 +0000 UTC m=+1101.246718789" lastFinishedPulling="2026-03-14 07:17:21.227880635 +0000 UTC m=+1103.979795819" observedRunningTime="2026-03-14 07:17:22.153104995 +0000 UTC m=+1104.905020189" watchObservedRunningTime="2026-03-14 07:17:22.155191382 +0000 UTC m=+1104.907106586" Mar 14 07:17:30 crc kubenswrapper[5129]: I0314 07:17:30.875987 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-kgd5m" Mar 14 07:17:35 crc kubenswrapper[5129]: I0314 07:17:35.787983 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-bjkz4"] Mar 14 07:17:35 crc kubenswrapper[5129]: I0314 07:17:35.790346 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-bjkz4" Mar 14 07:17:35 crc kubenswrapper[5129]: I0314 07:17:35.794189 5129 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-zq49r" Mar 14 07:17:35 crc kubenswrapper[5129]: I0314 07:17:35.807291 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-bjkz4"] Mar 14 07:17:35 crc kubenswrapper[5129]: I0314 07:17:35.932306 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrdk5\" (UniqueName: \"kubernetes.io/projected/a08bceb1-f043-47d5-adea-41adb89f8acc-kube-api-access-vrdk5\") pod \"cert-manager-545d4d4674-bjkz4\" (UID: \"a08bceb1-f043-47d5-adea-41adb89f8acc\") " pod="cert-manager/cert-manager-545d4d4674-bjkz4" Mar 14 07:17:35 crc kubenswrapper[5129]: I0314 07:17:35.933545 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a08bceb1-f043-47d5-adea-41adb89f8acc-bound-sa-token\") pod \"cert-manager-545d4d4674-bjkz4\" (UID: \"a08bceb1-f043-47d5-adea-41adb89f8acc\") " pod="cert-manager/cert-manager-545d4d4674-bjkz4" Mar 14 07:17:36 crc kubenswrapper[5129]: I0314 07:17:36.035322 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a08bceb1-f043-47d5-adea-41adb89f8acc-bound-sa-token\") pod \"cert-manager-545d4d4674-bjkz4\" (UID: \"a08bceb1-f043-47d5-adea-41adb89f8acc\") " pod="cert-manager/cert-manager-545d4d4674-bjkz4" Mar 14 07:17:36 crc kubenswrapper[5129]: I0314 07:17:36.035983 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrdk5\" (UniqueName: \"kubernetes.io/projected/a08bceb1-f043-47d5-adea-41adb89f8acc-kube-api-access-vrdk5\") pod \"cert-manager-545d4d4674-bjkz4\" (UID: 
\"a08bceb1-f043-47d5-adea-41adb89f8acc\") " pod="cert-manager/cert-manager-545d4d4674-bjkz4" Mar 14 07:17:36 crc kubenswrapper[5129]: I0314 07:17:36.058310 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a08bceb1-f043-47d5-adea-41adb89f8acc-bound-sa-token\") pod \"cert-manager-545d4d4674-bjkz4\" (UID: \"a08bceb1-f043-47d5-adea-41adb89f8acc\") " pod="cert-manager/cert-manager-545d4d4674-bjkz4" Mar 14 07:17:36 crc kubenswrapper[5129]: I0314 07:17:36.058445 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrdk5\" (UniqueName: \"kubernetes.io/projected/a08bceb1-f043-47d5-adea-41adb89f8acc-kube-api-access-vrdk5\") pod \"cert-manager-545d4d4674-bjkz4\" (UID: \"a08bceb1-f043-47d5-adea-41adb89f8acc\") " pod="cert-manager/cert-manager-545d4d4674-bjkz4" Mar 14 07:17:36 crc kubenswrapper[5129]: I0314 07:17:36.128631 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-bjkz4" Mar 14 07:17:36 crc kubenswrapper[5129]: I0314 07:17:36.543206 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-bjkz4"] Mar 14 07:17:37 crc kubenswrapper[5129]: I0314 07:17:37.217290 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-bjkz4" event={"ID":"a08bceb1-f043-47d5-adea-41adb89f8acc","Type":"ContainerStarted","Data":"60e6507ae0be37ccc8732d6dba9aafa52c64dc505f27967ca5d9fe98bc2df97a"} Mar 14 07:17:37 crc kubenswrapper[5129]: I0314 07:17:37.217367 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-bjkz4" event={"ID":"a08bceb1-f043-47d5-adea-41adb89f8acc","Type":"ContainerStarted","Data":"5fb24747e18a0bde1748e706192bb8f7df86a13cde8f8ec1ec9e0a2045fe8889"} Mar 14 07:17:37 crc kubenswrapper[5129]: I0314 07:17:37.237141 5129 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="cert-manager/cert-manager-545d4d4674-bjkz4" podStartSLOduration=2.237110733 podStartE2EDuration="2.237110733s" podCreationTimestamp="2026-03-14 07:17:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:17:37.228961858 +0000 UTC m=+1119.980877052" watchObservedRunningTime="2026-03-14 07:17:37.237110733 +0000 UTC m=+1119.989025937" Mar 14 07:17:44 crc kubenswrapper[5129]: I0314 07:17:44.571422 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-psp8m"] Mar 14 07:17:44 crc kubenswrapper[5129]: I0314 07:17:44.573958 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-psp8m" Mar 14 07:17:44 crc kubenswrapper[5129]: I0314 07:17:44.576105 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 14 07:17:44 crc kubenswrapper[5129]: I0314 07:17:44.576321 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-47748" Mar 14 07:17:44 crc kubenswrapper[5129]: I0314 07:17:44.576535 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 14 07:17:44 crc kubenswrapper[5129]: I0314 07:17:44.583341 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-psp8m"] Mar 14 07:17:44 crc kubenswrapper[5129]: I0314 07:17:44.770131 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffwk5\" (UniqueName: \"kubernetes.io/projected/b9e7a979-b6cc-44c0-bc0b-68a4409ba39f-kube-api-access-ffwk5\") pod \"openstack-operator-index-psp8m\" (UID: \"b9e7a979-b6cc-44c0-bc0b-68a4409ba39f\") " pod="openstack-operators/openstack-operator-index-psp8m" Mar 14 07:17:44 crc 
kubenswrapper[5129]: I0314 07:17:44.871576 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffwk5\" (UniqueName: \"kubernetes.io/projected/b9e7a979-b6cc-44c0-bc0b-68a4409ba39f-kube-api-access-ffwk5\") pod \"openstack-operator-index-psp8m\" (UID: \"b9e7a979-b6cc-44c0-bc0b-68a4409ba39f\") " pod="openstack-operators/openstack-operator-index-psp8m" Mar 14 07:17:44 crc kubenswrapper[5129]: I0314 07:17:44.907951 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffwk5\" (UniqueName: \"kubernetes.io/projected/b9e7a979-b6cc-44c0-bc0b-68a4409ba39f-kube-api-access-ffwk5\") pod \"openstack-operator-index-psp8m\" (UID: \"b9e7a979-b6cc-44c0-bc0b-68a4409ba39f\") " pod="openstack-operators/openstack-operator-index-psp8m" Mar 14 07:17:45 crc kubenswrapper[5129]: I0314 07:17:45.201524 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-psp8m" Mar 14 07:17:45 crc kubenswrapper[5129]: I0314 07:17:45.619317 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-psp8m"] Mar 14 07:17:46 crc kubenswrapper[5129]: I0314 07:17:46.288424 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-psp8m" event={"ID":"b9e7a979-b6cc-44c0-bc0b-68a4409ba39f","Type":"ContainerStarted","Data":"5045c2dd06f196d277783329a834a369fd47b513fa80c98bf00251e698b5368e"} Mar 14 07:17:47 crc kubenswrapper[5129]: I0314 07:17:47.295703 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-psp8m" event={"ID":"b9e7a979-b6cc-44c0-bc0b-68a4409ba39f","Type":"ContainerStarted","Data":"c2ac7d3c1643f2e0388558e013678780b261c8f662fe5afeffec90cd3c778cbc"} Mar 14 07:17:47 crc kubenswrapper[5129]: I0314 07:17:47.317943 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-index-psp8m" podStartSLOduration=2.517859225 podStartE2EDuration="3.317913085s" podCreationTimestamp="2026-03-14 07:17:44 +0000 UTC" firstStartedPulling="2026-03-14 07:17:45.629508039 +0000 UTC m=+1128.381423223" lastFinishedPulling="2026-03-14 07:17:46.429561899 +0000 UTC m=+1129.181477083" observedRunningTime="2026-03-14 07:17:47.312384553 +0000 UTC m=+1130.064299747" watchObservedRunningTime="2026-03-14 07:17:47.317913085 +0000 UTC m=+1130.069828329" Mar 14 07:17:47 crc kubenswrapper[5129]: I0314 07:17:47.743957 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-psp8m"] Mar 14 07:17:48 crc kubenswrapper[5129]: I0314 07:17:48.353540 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-56jrj"] Mar 14 07:17:48 crc kubenswrapper[5129]: I0314 07:17:48.355271 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-56jrj" Mar 14 07:17:48 crc kubenswrapper[5129]: I0314 07:17:48.373227 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-56jrj"] Mar 14 07:17:48 crc kubenswrapper[5129]: I0314 07:17:48.434521 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnpjp\" (UniqueName: \"kubernetes.io/projected/39018862-672e-47fd-85bb-c1baa5a8db7b-kube-api-access-xnpjp\") pod \"openstack-operator-index-56jrj\" (UID: \"39018862-672e-47fd-85bb-c1baa5a8db7b\") " pod="openstack-operators/openstack-operator-index-56jrj" Mar 14 07:17:48 crc kubenswrapper[5129]: I0314 07:17:48.535516 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnpjp\" (UniqueName: \"kubernetes.io/projected/39018862-672e-47fd-85bb-c1baa5a8db7b-kube-api-access-xnpjp\") pod \"openstack-operator-index-56jrj\" (UID: 
\"39018862-672e-47fd-85bb-c1baa5a8db7b\") " pod="openstack-operators/openstack-operator-index-56jrj" Mar 14 07:17:48 crc kubenswrapper[5129]: I0314 07:17:48.569890 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnpjp\" (UniqueName: \"kubernetes.io/projected/39018862-672e-47fd-85bb-c1baa5a8db7b-kube-api-access-xnpjp\") pod \"openstack-operator-index-56jrj\" (UID: \"39018862-672e-47fd-85bb-c1baa5a8db7b\") " pod="openstack-operators/openstack-operator-index-56jrj" Mar 14 07:17:48 crc kubenswrapper[5129]: I0314 07:17:48.679532 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-56jrj" Mar 14 07:17:48 crc kubenswrapper[5129]: I0314 07:17:48.895487 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-56jrj"] Mar 14 07:17:48 crc kubenswrapper[5129]: W0314 07:17:48.898543 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39018862_672e_47fd_85bb_c1baa5a8db7b.slice/crio-bff8a914125eb3360080c25d1b88fd46fd42dea552cd3ecd85981fc18e5cac1f WatchSource:0}: Error finding container bff8a914125eb3360080c25d1b88fd46fd42dea552cd3ecd85981fc18e5cac1f: Status 404 returned error can't find the container with id bff8a914125eb3360080c25d1b88fd46fd42dea552cd3ecd85981fc18e5cac1f Mar 14 07:17:49 crc kubenswrapper[5129]: I0314 07:17:49.353555 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-psp8m" podUID="b9e7a979-b6cc-44c0-bc0b-68a4409ba39f" containerName="registry-server" containerID="cri-o://c2ac7d3c1643f2e0388558e013678780b261c8f662fe5afeffec90cd3c778cbc" gracePeriod=2 Mar 14 07:17:49 crc kubenswrapper[5129]: I0314 07:17:49.354151 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-56jrj" 
event={"ID":"39018862-672e-47fd-85bb-c1baa5a8db7b","Type":"ContainerStarted","Data":"bff8a914125eb3360080c25d1b88fd46fd42dea552cd3ecd85981fc18e5cac1f"} Mar 14 07:17:49 crc kubenswrapper[5129]: I0314 07:17:49.574271 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:17:49 crc kubenswrapper[5129]: I0314 07:17:49.574793 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:17:49 crc kubenswrapper[5129]: I0314 07:17:49.805104 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-psp8m" Mar 14 07:17:49 crc kubenswrapper[5129]: I0314 07:17:49.953583 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffwk5\" (UniqueName: \"kubernetes.io/projected/b9e7a979-b6cc-44c0-bc0b-68a4409ba39f-kube-api-access-ffwk5\") pod \"b9e7a979-b6cc-44c0-bc0b-68a4409ba39f\" (UID: \"b9e7a979-b6cc-44c0-bc0b-68a4409ba39f\") " Mar 14 07:17:49 crc kubenswrapper[5129]: I0314 07:17:49.959629 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9e7a979-b6cc-44c0-bc0b-68a4409ba39f-kube-api-access-ffwk5" (OuterVolumeSpecName: "kube-api-access-ffwk5") pod "b9e7a979-b6cc-44c0-bc0b-68a4409ba39f" (UID: "b9e7a979-b6cc-44c0-bc0b-68a4409ba39f"). InnerVolumeSpecName "kube-api-access-ffwk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:17:50 crc kubenswrapper[5129]: I0314 07:17:50.054733 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffwk5\" (UniqueName: \"kubernetes.io/projected/b9e7a979-b6cc-44c0-bc0b-68a4409ba39f-kube-api-access-ffwk5\") on node \"crc\" DevicePath \"\"" Mar 14 07:17:50 crc kubenswrapper[5129]: I0314 07:17:50.363316 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-56jrj" event={"ID":"39018862-672e-47fd-85bb-c1baa5a8db7b","Type":"ContainerStarted","Data":"2b0c5673f2d29e6e0866531d2f7fd75272caa47e7bb864a8d56394acae67259c"} Mar 14 07:17:50 crc kubenswrapper[5129]: I0314 07:17:50.366687 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-psp8m" Mar 14 07:17:50 crc kubenswrapper[5129]: I0314 07:17:50.366831 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-psp8m" event={"ID":"b9e7a979-b6cc-44c0-bc0b-68a4409ba39f","Type":"ContainerDied","Data":"c2ac7d3c1643f2e0388558e013678780b261c8f662fe5afeffec90cd3c778cbc"} Mar 14 07:17:50 crc kubenswrapper[5129]: I0314 07:17:50.366901 5129 scope.go:117] "RemoveContainer" containerID="c2ac7d3c1643f2e0388558e013678780b261c8f662fe5afeffec90cd3c778cbc" Mar 14 07:17:50 crc kubenswrapper[5129]: I0314 07:17:50.367190 5129 generic.go:334] "Generic (PLEG): container finished" podID="b9e7a979-b6cc-44c0-bc0b-68a4409ba39f" containerID="c2ac7d3c1643f2e0388558e013678780b261c8f662fe5afeffec90cd3c778cbc" exitCode=0 Mar 14 07:17:50 crc kubenswrapper[5129]: I0314 07:17:50.367301 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-psp8m" event={"ID":"b9e7a979-b6cc-44c0-bc0b-68a4409ba39f","Type":"ContainerDied","Data":"5045c2dd06f196d277783329a834a369fd47b513fa80c98bf00251e698b5368e"} Mar 14 07:17:50 crc kubenswrapper[5129]: I0314 
07:17:50.385691 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-56jrj" podStartSLOduration=1.867391956 podStartE2EDuration="2.385465465s" podCreationTimestamp="2026-03-14 07:17:48 +0000 UTC" firstStartedPulling="2026-03-14 07:17:48.903169485 +0000 UTC m=+1131.655084669" lastFinishedPulling="2026-03-14 07:17:49.421242994 +0000 UTC m=+1132.173158178" observedRunningTime="2026-03-14 07:17:50.385134016 +0000 UTC m=+1133.137049220" watchObservedRunningTime="2026-03-14 07:17:50.385465465 +0000 UTC m=+1133.137380679" Mar 14 07:17:50 crc kubenswrapper[5129]: I0314 07:17:50.399881 5129 scope.go:117] "RemoveContainer" containerID="c2ac7d3c1643f2e0388558e013678780b261c8f662fe5afeffec90cd3c778cbc" Mar 14 07:17:50 crc kubenswrapper[5129]: E0314 07:17:50.400415 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2ac7d3c1643f2e0388558e013678780b261c8f662fe5afeffec90cd3c778cbc\": container with ID starting with c2ac7d3c1643f2e0388558e013678780b261c8f662fe5afeffec90cd3c778cbc not found: ID does not exist" containerID="c2ac7d3c1643f2e0388558e013678780b261c8f662fe5afeffec90cd3c778cbc" Mar 14 07:17:50 crc kubenswrapper[5129]: I0314 07:17:50.400451 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2ac7d3c1643f2e0388558e013678780b261c8f662fe5afeffec90cd3c778cbc"} err="failed to get container status \"c2ac7d3c1643f2e0388558e013678780b261c8f662fe5afeffec90cd3c778cbc\": rpc error: code = NotFound desc = could not find container \"c2ac7d3c1643f2e0388558e013678780b261c8f662fe5afeffec90cd3c778cbc\": container with ID starting with c2ac7d3c1643f2e0388558e013678780b261c8f662fe5afeffec90cd3c778cbc not found: ID does not exist" Mar 14 07:17:50 crc kubenswrapper[5129]: I0314 07:17:50.410038 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/openstack-operator-index-psp8m"] Mar 14 07:17:50 crc kubenswrapper[5129]: I0314 07:17:50.414325 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-psp8m"] Mar 14 07:17:52 crc kubenswrapper[5129]: I0314 07:17:52.050521 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9e7a979-b6cc-44c0-bc0b-68a4409ba39f" path="/var/lib/kubelet/pods/b9e7a979-b6cc-44c0-bc0b-68a4409ba39f/volumes" Mar 14 07:17:58 crc kubenswrapper[5129]: I0314 07:17:58.680069 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-56jrj" Mar 14 07:17:58 crc kubenswrapper[5129]: I0314 07:17:58.680563 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-56jrj" Mar 14 07:17:58 crc kubenswrapper[5129]: I0314 07:17:58.711910 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-56jrj" Mar 14 07:17:59 crc kubenswrapper[5129]: I0314 07:17:59.482960 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-56jrj" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.132083 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557878-7lvhr"] Mar 14 07:18:00 crc kubenswrapper[5129]: E0314 07:18:00.132702 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9e7a979-b6cc-44c0-bc0b-68a4409ba39f" containerName="registry-server" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.132720 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e7a979-b6cc-44c0-bc0b-68a4409ba39f" containerName="registry-server" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.132857 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9e7a979-b6cc-44c0-bc0b-68a4409ba39f" 
containerName="registry-server" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.133392 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557878-7lvhr" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.136647 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.137351 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.137483 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.149283 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557878-7lvhr"] Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.301741 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2n69\" (UniqueName: \"kubernetes.io/projected/ed5d2dbd-b414-462c-9fb1-432593876d05-kube-api-access-z2n69\") pod \"auto-csr-approver-29557878-7lvhr\" (UID: \"ed5d2dbd-b414-462c-9fb1-432593876d05\") " pod="openshift-infra/auto-csr-approver-29557878-7lvhr" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.402897 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2n69\" (UniqueName: \"kubernetes.io/projected/ed5d2dbd-b414-462c-9fb1-432593876d05-kube-api-access-z2n69\") pod \"auto-csr-approver-29557878-7lvhr\" (UID: \"ed5d2dbd-b414-462c-9fb1-432593876d05\") " pod="openshift-infra/auto-csr-approver-29557878-7lvhr" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.425039 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2n69\" (UniqueName: 
\"kubernetes.io/projected/ed5d2dbd-b414-462c-9fb1-432593876d05-kube-api-access-z2n69\") pod \"auto-csr-approver-29557878-7lvhr\" (UID: \"ed5d2dbd-b414-462c-9fb1-432593876d05\") " pod="openshift-infra/auto-csr-approver-29557878-7lvhr" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.454007 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557878-7lvhr" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.593653 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c"] Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.595874 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.609129 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-zvqbx" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.612895 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c"] Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.709025 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f53d8e2-e38d-425d-948c-bd705f3c273f-util\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c\" (UID: \"5f53d8e2-e38d-425d-948c-bd705f3c273f\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.709140 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f53d8e2-e38d-425d-948c-bd705f3c273f-bundle\") pod 
\"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c\" (UID: \"5f53d8e2-e38d-425d-948c-bd705f3c273f\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.709322 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjdbn\" (UniqueName: \"kubernetes.io/projected/5f53d8e2-e38d-425d-948c-bd705f3c273f-kube-api-access-tjdbn\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c\" (UID: \"5f53d8e2-e38d-425d-948c-bd705f3c273f\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.713373 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557878-7lvhr"] Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.811214 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f53d8e2-e38d-425d-948c-bd705f3c273f-util\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c\" (UID: \"5f53d8e2-e38d-425d-948c-bd705f3c273f\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.811720 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f53d8e2-e38d-425d-948c-bd705f3c273f-bundle\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c\" (UID: \"5f53d8e2-e38d-425d-948c-bd705f3c273f\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.811759 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/5f53d8e2-e38d-425d-948c-bd705f3c273f-util\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c\" (UID: \"5f53d8e2-e38d-425d-948c-bd705f3c273f\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.811793 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjdbn\" (UniqueName: \"kubernetes.io/projected/5f53d8e2-e38d-425d-948c-bd705f3c273f-kube-api-access-tjdbn\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c\" (UID: \"5f53d8e2-e38d-425d-948c-bd705f3c273f\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.812270 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f53d8e2-e38d-425d-948c-bd705f3c273f-bundle\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c\" (UID: \"5f53d8e2-e38d-425d-948c-bd705f3c273f\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.832989 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjdbn\" (UniqueName: \"kubernetes.io/projected/5f53d8e2-e38d-425d-948c-bd705f3c273f-kube-api-access-tjdbn\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c\" (UID: \"5f53d8e2-e38d-425d-948c-bd705f3c273f\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c" Mar 14 07:18:00 crc kubenswrapper[5129]: I0314 07:18:00.924247 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c" Mar 14 07:18:01 crc kubenswrapper[5129]: I0314 07:18:01.419241 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c"] Mar 14 07:18:01 crc kubenswrapper[5129]: W0314 07:18:01.424022 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f53d8e2_e38d_425d_948c_bd705f3c273f.slice/crio-1a253c78d15fd4be40a761483cae8eb34fcfd3d7305fe7891be1c8258bf034ef WatchSource:0}: Error finding container 1a253c78d15fd4be40a761483cae8eb34fcfd3d7305fe7891be1c8258bf034ef: Status 404 returned error can't find the container with id 1a253c78d15fd4be40a761483cae8eb34fcfd3d7305fe7891be1c8258bf034ef Mar 14 07:18:01 crc kubenswrapper[5129]: I0314 07:18:01.461256 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c" event={"ID":"5f53d8e2-e38d-425d-948c-bd705f3c273f","Type":"ContainerStarted","Data":"1a253c78d15fd4be40a761483cae8eb34fcfd3d7305fe7891be1c8258bf034ef"} Mar 14 07:18:01 crc kubenswrapper[5129]: I0314 07:18:01.462501 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557878-7lvhr" event={"ID":"ed5d2dbd-b414-462c-9fb1-432593876d05","Type":"ContainerStarted","Data":"daa6523444ef16152f099e796a7e6588b15ca5ce5ca830fa682a62524017b07f"} Mar 14 07:18:02 crc kubenswrapper[5129]: I0314 07:18:02.473031 5129 generic.go:334] "Generic (PLEG): container finished" podID="5f53d8e2-e38d-425d-948c-bd705f3c273f" containerID="a1957f4f97900226c5ee552020628621dc1aadaeb8833e9af130fd7830c7f424" exitCode=0 Mar 14 07:18:02 crc kubenswrapper[5129]: I0314 07:18:02.473101 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c" event={"ID":"5f53d8e2-e38d-425d-948c-bd705f3c273f","Type":"ContainerDied","Data":"a1957f4f97900226c5ee552020628621dc1aadaeb8833e9af130fd7830c7f424"} Mar 14 07:18:02 crc kubenswrapper[5129]: I0314 07:18:02.475949 5129 generic.go:334] "Generic (PLEG): container finished" podID="ed5d2dbd-b414-462c-9fb1-432593876d05" containerID="be3ea10d03d65f61c2755445872d28b8dc27d34ae0972e6e5b9c64b7297f6126" exitCode=0 Mar 14 07:18:02 crc kubenswrapper[5129]: I0314 07:18:02.475985 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557878-7lvhr" event={"ID":"ed5d2dbd-b414-462c-9fb1-432593876d05","Type":"ContainerDied","Data":"be3ea10d03d65f61c2755445872d28b8dc27d34ae0972e6e5b9c64b7297f6126"} Mar 14 07:18:03 crc kubenswrapper[5129]: I0314 07:18:03.487716 5129 generic.go:334] "Generic (PLEG): container finished" podID="5f53d8e2-e38d-425d-948c-bd705f3c273f" containerID="aad19c0bf8cb2c78dcbf92ff5812e25a95896534cbdb0be841660138a297fb77" exitCode=0 Mar 14 07:18:03 crc kubenswrapper[5129]: I0314 07:18:03.487826 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c" event={"ID":"5f53d8e2-e38d-425d-948c-bd705f3c273f","Type":"ContainerDied","Data":"aad19c0bf8cb2c78dcbf92ff5812e25a95896534cbdb0be841660138a297fb77"} Mar 14 07:18:03 crc kubenswrapper[5129]: I0314 07:18:03.792783 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557878-7lvhr" Mar 14 07:18:03 crc kubenswrapper[5129]: I0314 07:18:03.968637 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2n69\" (UniqueName: \"kubernetes.io/projected/ed5d2dbd-b414-462c-9fb1-432593876d05-kube-api-access-z2n69\") pod \"ed5d2dbd-b414-462c-9fb1-432593876d05\" (UID: \"ed5d2dbd-b414-462c-9fb1-432593876d05\") " Mar 14 07:18:03 crc kubenswrapper[5129]: I0314 07:18:03.975871 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed5d2dbd-b414-462c-9fb1-432593876d05-kube-api-access-z2n69" (OuterVolumeSpecName: "kube-api-access-z2n69") pod "ed5d2dbd-b414-462c-9fb1-432593876d05" (UID: "ed5d2dbd-b414-462c-9fb1-432593876d05"). InnerVolumeSpecName "kube-api-access-z2n69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:18:04 crc kubenswrapper[5129]: I0314 07:18:04.070401 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2n69\" (UniqueName: \"kubernetes.io/projected/ed5d2dbd-b414-462c-9fb1-432593876d05-kube-api-access-z2n69\") on node \"crc\" DevicePath \"\"" Mar 14 07:18:04 crc kubenswrapper[5129]: I0314 07:18:04.500565 5129 generic.go:334] "Generic (PLEG): container finished" podID="5f53d8e2-e38d-425d-948c-bd705f3c273f" containerID="3ee311b0c1ef47a23922a55b803291e7126db9638fb6dd77cfd2e68725f56267" exitCode=0 Mar 14 07:18:04 crc kubenswrapper[5129]: I0314 07:18:04.500630 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c" event={"ID":"5f53d8e2-e38d-425d-948c-bd705f3c273f","Type":"ContainerDied","Data":"3ee311b0c1ef47a23922a55b803291e7126db9638fb6dd77cfd2e68725f56267"} Mar 14 07:18:04 crc kubenswrapper[5129]: I0314 07:18:04.502997 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557878-7lvhr" 
event={"ID":"ed5d2dbd-b414-462c-9fb1-432593876d05","Type":"ContainerDied","Data":"daa6523444ef16152f099e796a7e6588b15ca5ce5ca830fa682a62524017b07f"} Mar 14 07:18:04 crc kubenswrapper[5129]: I0314 07:18:04.503055 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daa6523444ef16152f099e796a7e6588b15ca5ce5ca830fa682a62524017b07f" Mar 14 07:18:04 crc kubenswrapper[5129]: I0314 07:18:04.503052 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557878-7lvhr" Mar 14 07:18:04 crc kubenswrapper[5129]: I0314 07:18:04.863276 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557872-xpltw"] Mar 14 07:18:04 crc kubenswrapper[5129]: I0314 07:18:04.869190 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557872-xpltw"] Mar 14 07:18:05 crc kubenswrapper[5129]: I0314 07:18:05.805053 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c" Mar 14 07:18:05 crc kubenswrapper[5129]: I0314 07:18:05.892968 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjdbn\" (UniqueName: \"kubernetes.io/projected/5f53d8e2-e38d-425d-948c-bd705f3c273f-kube-api-access-tjdbn\") pod \"5f53d8e2-e38d-425d-948c-bd705f3c273f\" (UID: \"5f53d8e2-e38d-425d-948c-bd705f3c273f\") " Mar 14 07:18:05 crc kubenswrapper[5129]: I0314 07:18:05.893665 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f53d8e2-e38d-425d-948c-bd705f3c273f-bundle\") pod \"5f53d8e2-e38d-425d-948c-bd705f3c273f\" (UID: \"5f53d8e2-e38d-425d-948c-bd705f3c273f\") " Mar 14 07:18:05 crc kubenswrapper[5129]: I0314 07:18:05.894363 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f53d8e2-e38d-425d-948c-bd705f3c273f-bundle" (OuterVolumeSpecName: "bundle") pod "5f53d8e2-e38d-425d-948c-bd705f3c273f" (UID: "5f53d8e2-e38d-425d-948c-bd705f3c273f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:18:05 crc kubenswrapper[5129]: I0314 07:18:05.908766 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f53d8e2-e38d-425d-948c-bd705f3c273f-kube-api-access-tjdbn" (OuterVolumeSpecName: "kube-api-access-tjdbn") pod "5f53d8e2-e38d-425d-948c-bd705f3c273f" (UID: "5f53d8e2-e38d-425d-948c-bd705f3c273f"). InnerVolumeSpecName "kube-api-access-tjdbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:18:05 crc kubenswrapper[5129]: I0314 07:18:05.995343 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f53d8e2-e38d-425d-948c-bd705f3c273f-util\") pod \"5f53d8e2-e38d-425d-948c-bd705f3c273f\" (UID: \"5f53d8e2-e38d-425d-948c-bd705f3c273f\") " Mar 14 07:18:05 crc kubenswrapper[5129]: I0314 07:18:05.995668 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjdbn\" (UniqueName: \"kubernetes.io/projected/5f53d8e2-e38d-425d-948c-bd705f3c273f-kube-api-access-tjdbn\") on node \"crc\" DevicePath \"\"" Mar 14 07:18:05 crc kubenswrapper[5129]: I0314 07:18:05.995689 5129 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f53d8e2-e38d-425d-948c-bd705f3c273f-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:18:06 crc kubenswrapper[5129]: I0314 07:18:06.015840 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f53d8e2-e38d-425d-948c-bd705f3c273f-util" (OuterVolumeSpecName: "util") pod "5f53d8e2-e38d-425d-948c-bd705f3c273f" (UID: "5f53d8e2-e38d-425d-948c-bd705f3c273f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:18:06 crc kubenswrapper[5129]: I0314 07:18:06.053386 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f" path="/var/lib/kubelet/pods/f70149ec-bb6b-4dd4-9b6d-f7b33fd0879f/volumes" Mar 14 07:18:06 crc kubenswrapper[5129]: I0314 07:18:06.096437 5129 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f53d8e2-e38d-425d-948c-bd705f3c273f-util\") on node \"crc\" DevicePath \"\"" Mar 14 07:18:06 crc kubenswrapper[5129]: I0314 07:18:06.522663 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c" event={"ID":"5f53d8e2-e38d-425d-948c-bd705f3c273f","Type":"ContainerDied","Data":"1a253c78d15fd4be40a761483cae8eb34fcfd3d7305fe7891be1c8258bf034ef"} Mar 14 07:18:06 crc kubenswrapper[5129]: I0314 07:18:06.522727 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a253c78d15fd4be40a761483cae8eb34fcfd3d7305fe7891be1c8258bf034ef" Mar 14 07:18:06 crc kubenswrapper[5129]: I0314 07:18:06.522757 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c" Mar 14 07:18:12 crc kubenswrapper[5129]: I0314 07:18:12.757282 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6dc56d8cd6-r5vg4"] Mar 14 07:18:12 crc kubenswrapper[5129]: E0314 07:18:12.758158 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f53d8e2-e38d-425d-948c-bd705f3c273f" containerName="util" Mar 14 07:18:12 crc kubenswrapper[5129]: I0314 07:18:12.758195 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f53d8e2-e38d-425d-948c-bd705f3c273f" containerName="util" Mar 14 07:18:12 crc kubenswrapper[5129]: E0314 07:18:12.758229 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f53d8e2-e38d-425d-948c-bd705f3c273f" containerName="pull" Mar 14 07:18:12 crc kubenswrapper[5129]: I0314 07:18:12.758240 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f53d8e2-e38d-425d-948c-bd705f3c273f" containerName="pull" Mar 14 07:18:12 crc kubenswrapper[5129]: E0314 07:18:12.758251 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f53d8e2-e38d-425d-948c-bd705f3c273f" containerName="extract" Mar 14 07:18:12 crc kubenswrapper[5129]: I0314 07:18:12.758262 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f53d8e2-e38d-425d-948c-bd705f3c273f" containerName="extract" Mar 14 07:18:12 crc kubenswrapper[5129]: E0314 07:18:12.758284 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed5d2dbd-b414-462c-9fb1-432593876d05" containerName="oc" Mar 14 07:18:12 crc kubenswrapper[5129]: I0314 07:18:12.758295 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed5d2dbd-b414-462c-9fb1-432593876d05" containerName="oc" Mar 14 07:18:12 crc kubenswrapper[5129]: I0314 07:18:12.758475 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f53d8e2-e38d-425d-948c-bd705f3c273f" 
containerName="extract" Mar 14 07:18:12 crc kubenswrapper[5129]: I0314 07:18:12.758488 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed5d2dbd-b414-462c-9fb1-432593876d05" containerName="oc" Mar 14 07:18:12 crc kubenswrapper[5129]: I0314 07:18:12.759155 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-r5vg4" Mar 14 07:18:12 crc kubenswrapper[5129]: I0314 07:18:12.761123 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-m5qfb" Mar 14 07:18:12 crc kubenswrapper[5129]: I0314 07:18:12.787323 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf72h\" (UniqueName: \"kubernetes.io/projected/ee2defc7-719a-4d20-94ee-0ce74a6015c6-kube-api-access-gf72h\") pod \"openstack-operator-controller-init-6dc56d8cd6-r5vg4\" (UID: \"ee2defc7-719a-4d20-94ee-0ce74a6015c6\") " pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-r5vg4" Mar 14 07:18:12 crc kubenswrapper[5129]: I0314 07:18:12.787665 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6dc56d8cd6-r5vg4"] Mar 14 07:18:12 crc kubenswrapper[5129]: I0314 07:18:12.888500 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf72h\" (UniqueName: \"kubernetes.io/projected/ee2defc7-719a-4d20-94ee-0ce74a6015c6-kube-api-access-gf72h\") pod \"openstack-operator-controller-init-6dc56d8cd6-r5vg4\" (UID: \"ee2defc7-719a-4d20-94ee-0ce74a6015c6\") " pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-r5vg4" Mar 14 07:18:12 crc kubenswrapper[5129]: I0314 07:18:12.912155 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf72h\" (UniqueName: 
\"kubernetes.io/projected/ee2defc7-719a-4d20-94ee-0ce74a6015c6-kube-api-access-gf72h\") pod \"openstack-operator-controller-init-6dc56d8cd6-r5vg4\" (UID: \"ee2defc7-719a-4d20-94ee-0ce74a6015c6\") " pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-r5vg4" Mar 14 07:18:13 crc kubenswrapper[5129]: I0314 07:18:13.074510 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-r5vg4" Mar 14 07:18:13 crc kubenswrapper[5129]: I0314 07:18:13.292271 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6dc56d8cd6-r5vg4"] Mar 14 07:18:13 crc kubenswrapper[5129]: I0314 07:18:13.576203 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-r5vg4" event={"ID":"ee2defc7-719a-4d20-94ee-0ce74a6015c6","Type":"ContainerStarted","Data":"10a8fa24c0087734f63d91458416f353f838725436fbe1c52a8787eb0c87e01a"} Mar 14 07:18:16 crc kubenswrapper[5129]: I0314 07:18:16.954977 5129 scope.go:117] "RemoveContainer" containerID="aa2ba740349323917bb8b1a9dd0ddf369ac27b9f3c1945d011313c578b1e009f" Mar 14 07:18:17 crc kubenswrapper[5129]: I0314 07:18:17.603771 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-r5vg4" event={"ID":"ee2defc7-719a-4d20-94ee-0ce74a6015c6","Type":"ContainerStarted","Data":"91c8578895c0c578567befd0f26363aca3420362ac1b0e4adfd51d28a5bea3a8"} Mar 14 07:18:17 crc kubenswrapper[5129]: I0314 07:18:17.604174 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-r5vg4" Mar 14 07:18:17 crc kubenswrapper[5129]: I0314 07:18:17.647812 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-r5vg4" 
podStartSLOduration=1.650300538 podStartE2EDuration="5.64779719s" podCreationTimestamp="2026-03-14 07:18:12 +0000 UTC" firstStartedPulling="2026-03-14 07:18:13.318801847 +0000 UTC m=+1156.070717031" lastFinishedPulling="2026-03-14 07:18:17.316298499 +0000 UTC m=+1160.068213683" observedRunningTime="2026-03-14 07:18:17.630567997 +0000 UTC m=+1160.382483191" watchObservedRunningTime="2026-03-14 07:18:17.64779719 +0000 UTC m=+1160.399712374" Mar 14 07:18:19 crc kubenswrapper[5129]: I0314 07:18:19.828973 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:18:19 crc kubenswrapper[5129]: I0314 07:18:19.829352 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:18:19 crc kubenswrapper[5129]: I0314 07:18:19.829415 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:18:19 crc kubenswrapper[5129]: I0314 07:18:19.830184 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"74d416b7b010cf091ac9c019f0f997a8bcaaae0573c048f8fb52b3ba09a111ee"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:18:19 crc kubenswrapper[5129]: I0314 07:18:19.830276 5129 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://74d416b7b010cf091ac9c019f0f997a8bcaaae0573c048f8fb52b3ba09a111ee" gracePeriod=600 Mar 14 07:18:20 crc kubenswrapper[5129]: I0314 07:18:20.851375 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="74d416b7b010cf091ac9c019f0f997a8bcaaae0573c048f8fb52b3ba09a111ee" exitCode=0 Mar 14 07:18:20 crc kubenswrapper[5129]: I0314 07:18:20.851461 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"74d416b7b010cf091ac9c019f0f997a8bcaaae0573c048f8fb52b3ba09a111ee"} Mar 14 07:18:20 crc kubenswrapper[5129]: I0314 07:18:20.851912 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"2bfe21ee59c696834fdd6604caeed60a263cff0e96aef565e246c7dd2d9b99d9"} Mar 14 07:18:20 crc kubenswrapper[5129]: I0314 07:18:20.851958 5129 scope.go:117] "RemoveContainer" containerID="8b4a31888cd52c24930717574880110e2ffda0608e5edc1431b7914f81c314fb" Mar 14 07:18:23 crc kubenswrapper[5129]: I0314 07:18:23.077749 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-r5vg4" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.442251 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-s8dzz"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.446596 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8dzz" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.454283 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-s8dzz"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.456444 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-6vg2z"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.457471 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6vg2z" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.461320 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-fw985" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.461631 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-fgrwn" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.468634 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-rkcbz"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.469573 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-rkcbz" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.473396 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-jrl7c" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.475542 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-6vg2z"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.497673 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-rkcbz"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.533242 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-rzk97"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.534279 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rzk97" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.538314 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-k4z2t" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.544570 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-gmxp8"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.545321 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gmxp8" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.546931 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-rzgrr" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.568098 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-rzk97"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.581333 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fjnr7"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.582137 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fjnr7" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.584063 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-p7t9g" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.585330 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-gmxp8"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.601422 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fjnr7"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.612897 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.613902 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.616144 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-b82fx" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.620490 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.621258 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.623233 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqgp9\" (UniqueName: \"kubernetes.io/projected/33a84627-b97d-4f3b-84ec-d81a54c1e56c-kube-api-access-fqgp9\") pod \"glance-operator-controller-manager-5964f64c48-rzk97\" (UID: \"33a84627-b97d-4f3b-84ec-d81a54c1e56c\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rzk97" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.623401 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvm2j\" (UniqueName: \"kubernetes.io/projected/ff95ea02-39f3-4d5c-85c7-c194f3ee2ef5-kube-api-access-tvm2j\") pod \"barbican-operator-controller-manager-d47688694-s8dzz\" (UID: \"ff95ea02-39f3-4d5c-85c7-c194f3ee2ef5\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8dzz" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.623520 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mszlr\" (UniqueName: \"kubernetes.io/projected/689ffa0e-d6ac-4dfb-bd55-3a957aeb1cb6-kube-api-access-mszlr\") pod 
\"designate-operator-controller-manager-66d56f6ff4-rkcbz\" (UID: \"689ffa0e-d6ac-4dfb-bd55-3a957aeb1cb6\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-rkcbz" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.623668 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6drfc\" (UniqueName: \"kubernetes.io/projected/3b985683-b068-4b76-b702-927b15cc10ff-kube-api-access-6drfc\") pod \"cinder-operator-controller-manager-984cd4dcf-6vg2z\" (UID: \"3b985683-b068-4b76-b702-927b15cc10ff\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6vg2z" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.630731 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-cwhxs"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.631940 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-cwhxs" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.636034 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-9xqhx" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.637464 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-n2qxq"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.641426 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-n2qxq" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.646073 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-nfln9" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.670672 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-cwhxs"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.686419 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-n2qxq"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.705129 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-fb6px"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.706032 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-fb6px" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.710332 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-zlvqz" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.726481 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzj24\" (UniqueName: \"kubernetes.io/projected/2c7b2dc9-e244-426d-b611-ee2629816c17-kube-api-access-mzj24\") pod \"ironic-operator-controller-manager-5bc894d9b-cwhxs\" (UID: \"2c7b2dc9-e244-426d-b611-ee2629816c17\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-cwhxs" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.726531 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqgp9\" (UniqueName: \"kubernetes.io/projected/33a84627-b97d-4f3b-84ec-d81a54c1e56c-kube-api-access-fqgp9\") pod \"glance-operator-controller-manager-5964f64c48-rzk97\" (UID: \"33a84627-b97d-4f3b-84ec-d81a54c1e56c\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rzk97" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.726555 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lrnh\" (UniqueName: \"kubernetes.io/projected/a56d7e4a-22ff-40c9-b6f6-b070fc628880-kube-api-access-5lrnh\") pod \"horizon-operator-controller-manager-6d9d6b584d-fjnr7\" (UID: \"a56d7e4a-22ff-40c9-b6f6-b070fc628880\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fjnr7" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.726586 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqcdw\" (UniqueName: 
\"kubernetes.io/projected/0ece0928-a8a0-46b8-98bd-8b35c8d07fca-kube-api-access-kqcdw\") pod \"heat-operator-controller-manager-77b6666d85-gmxp8\" (UID: \"0ece0928-a8a0-46b8-98bd-8b35c8d07fca\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gmxp8" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.726628 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8b67l\" (UID: \"5b7ec223-c34a-45c1-926c-e957e8cd3086\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.726649 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvm2j\" (UniqueName: \"kubernetes.io/projected/ff95ea02-39f3-4d5c-85c7-c194f3ee2ef5-kube-api-access-tvm2j\") pod \"barbican-operator-controller-manager-d47688694-s8dzz\" (UID: \"ff95ea02-39f3-4d5c-85c7-c194f3ee2ef5\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8dzz" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.726672 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mszlr\" (UniqueName: \"kubernetes.io/projected/689ffa0e-d6ac-4dfb-bd55-3a957aeb1cb6-kube-api-access-mszlr\") pod \"designate-operator-controller-manager-66d56f6ff4-rkcbz\" (UID: \"689ffa0e-d6ac-4dfb-bd55-3a957aeb1cb6\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-rkcbz" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.726698 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6drfc\" (UniqueName: \"kubernetes.io/projected/3b985683-b068-4b76-b702-927b15cc10ff-kube-api-access-6drfc\") pod \"cinder-operator-controller-manager-984cd4dcf-6vg2z\" (UID: 
\"3b985683-b068-4b76-b702-927b15cc10ff\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6vg2z" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.726717 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c52bz\" (UniqueName: \"kubernetes.io/projected/5b7ec223-c34a-45c1-926c-e957e8cd3086-kube-api-access-c52bz\") pod \"infra-operator-controller-manager-54dc5b8f8d-8b67l\" (UID: \"5b7ec223-c34a-45c1-926c-e957e8cd3086\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.733820 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-wzdf7"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.734693 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wzdf7" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.739790 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-4mctb" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.748220 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-fb6px"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.749492 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mszlr\" (UniqueName: \"kubernetes.io/projected/689ffa0e-d6ac-4dfb-bd55-3a957aeb1cb6-kube-api-access-mszlr\") pod \"designate-operator-controller-manager-66d56f6ff4-rkcbz\" (UID: \"689ffa0e-d6ac-4dfb-bd55-3a957aeb1cb6\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-rkcbz" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.751214 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-6drfc\" (UniqueName: \"kubernetes.io/projected/3b985683-b068-4b76-b702-927b15cc10ff-kube-api-access-6drfc\") pod \"cinder-operator-controller-manager-984cd4dcf-6vg2z\" (UID: \"3b985683-b068-4b76-b702-927b15cc10ff\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6vg2z" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.751375 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvm2j\" (UniqueName: \"kubernetes.io/projected/ff95ea02-39f3-4d5c-85c7-c194f3ee2ef5-kube-api-access-tvm2j\") pod \"barbican-operator-controller-manager-d47688694-s8dzz\" (UID: \"ff95ea02-39f3-4d5c-85c7-c194f3ee2ef5\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8dzz" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.754540 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-kvm84"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.755381 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-kvm84" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.756803 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqgp9\" (UniqueName: \"kubernetes.io/projected/33a84627-b97d-4f3b-84ec-d81a54c1e56c-kube-api-access-fqgp9\") pod \"glance-operator-controller-manager-5964f64c48-rzk97\" (UID: \"33a84627-b97d-4f3b-84ec-d81a54c1e56c\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rzk97" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.768910 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-lprxs"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.769005 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-nzmcm" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.769901 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7f84474648-lprxs" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.781415 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8dzz" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.802294 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-kvm84"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.817946 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-rnt52" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.818315 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6vg2z" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.822840 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-rkcbz" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.829726 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-wzdf7"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.830388 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l89kf\" (UniqueName: \"kubernetes.io/projected/0bc8a10a-075b-4eb1-96cd-081c4ce39a30-kube-api-access-l89kf\") pod \"keystone-operator-controller-manager-684f77d66d-n2qxq\" (UID: \"0bc8a10a-075b-4eb1-96cd-081c4ce39a30\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-n2qxq" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.830421 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzj24\" (UniqueName: \"kubernetes.io/projected/2c7b2dc9-e244-426d-b611-ee2629816c17-kube-api-access-mzj24\") pod \"ironic-operator-controller-manager-5bc894d9b-cwhxs\" (UID: \"2c7b2dc9-e244-426d-b611-ee2629816c17\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-cwhxs" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.830454 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lrnh\" (UniqueName: \"kubernetes.io/projected/a56d7e4a-22ff-40c9-b6f6-b070fc628880-kube-api-access-5lrnh\") pod \"horizon-operator-controller-manager-6d9d6b584d-fjnr7\" (UID: \"a56d7e4a-22ff-40c9-b6f6-b070fc628880\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fjnr7" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.830479 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqcdw\" (UniqueName: \"kubernetes.io/projected/0ece0928-a8a0-46b8-98bd-8b35c8d07fca-kube-api-access-kqcdw\") pod \"heat-operator-controller-manager-77b6666d85-gmxp8\" (UID: \"0ece0928-a8a0-46b8-98bd-8b35c8d07fca\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gmxp8" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.830509 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv9ft\" (UniqueName: \"kubernetes.io/projected/cb4dadcd-075b-4d15-b6e8-90baf37ff7d0-kube-api-access-cv9ft\") pod \"neutron-operator-controller-manager-776c5696bf-wzdf7\" (UID: \"cb4dadcd-075b-4d15-b6e8-90baf37ff7d0\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wzdf7" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.830542 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8b67l\" (UID: \"5b7ec223-c34a-45c1-926c-e957e8cd3086\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.830576 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c52bz\" (UniqueName: \"kubernetes.io/projected/5b7ec223-c34a-45c1-926c-e957e8cd3086-kube-api-access-c52bz\") pod \"infra-operator-controller-manager-54dc5b8f8d-8b67l\" (UID: \"5b7ec223-c34a-45c1-926c-e957e8cd3086\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.830651 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcl26\" (UniqueName: 
\"kubernetes.io/projected/04e8856d-e262-463a-9162-cb7ceef75a38-kube-api-access-hcl26\") pod \"manila-operator-controller-manager-57b484b4df-fb6px\" (UID: \"04e8856d-e262-463a-9162-cb7ceef75a38\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-fb6px" Mar 14 07:18:41 crc kubenswrapper[5129]: E0314 07:18:41.831315 5129 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:41 crc kubenswrapper[5129]: E0314 07:18:41.831360 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert podName:5b7ec223-c34a-45c1-926c-e957e8cd3086 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:42.33134314 +0000 UTC m=+1185.083258334 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert") pod "infra-operator-controller-manager-54dc5b8f8d-8b67l" (UID: "5b7ec223-c34a-45c1-926c-e957e8cd3086") : secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.844911 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-2dmkf"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.845830 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-2dmkf" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.850752 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-szld6" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.869712 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-lprxs"] Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.881947 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rzk97" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.883191 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c52bz\" (UniqueName: \"kubernetes.io/projected/5b7ec223-c34a-45c1-926c-e957e8cd3086-kube-api-access-c52bz\") pod \"infra-operator-controller-manager-54dc5b8f8d-8b67l\" (UID: \"5b7ec223-c34a-45c1-926c-e957e8cd3086\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.933305 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lrnh\" (UniqueName: \"kubernetes.io/projected/a56d7e4a-22ff-40c9-b6f6-b070fc628880-kube-api-access-5lrnh\") pod \"horizon-operator-controller-manager-6d9d6b584d-fjnr7\" (UID: \"a56d7e4a-22ff-40c9-b6f6-b070fc628880\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fjnr7" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.933929 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqcdw\" (UniqueName: \"kubernetes.io/projected/0ece0928-a8a0-46b8-98bd-8b35c8d07fca-kube-api-access-kqcdw\") pod \"heat-operator-controller-manager-77b6666d85-gmxp8\" (UID: \"0ece0928-a8a0-46b8-98bd-8b35c8d07fca\") " 
pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gmxp8" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.934113 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzj24\" (UniqueName: \"kubernetes.io/projected/2c7b2dc9-e244-426d-b611-ee2629816c17-kube-api-access-mzj24\") pod \"ironic-operator-controller-manager-5bc894d9b-cwhxs\" (UID: \"2c7b2dc9-e244-426d-b611-ee2629816c17\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-cwhxs" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.939222 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcl26\" (UniqueName: \"kubernetes.io/projected/04e8856d-e262-463a-9162-cb7ceef75a38-kube-api-access-hcl26\") pod \"manila-operator-controller-manager-57b484b4df-fb6px\" (UID: \"04e8856d-e262-463a-9162-cb7ceef75a38\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-fb6px" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.939308 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l89kf\" (UniqueName: \"kubernetes.io/projected/0bc8a10a-075b-4eb1-96cd-081c4ce39a30-kube-api-access-l89kf\") pod \"keystone-operator-controller-manager-684f77d66d-n2qxq\" (UID: \"0bc8a10a-075b-4eb1-96cd-081c4ce39a30\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-n2qxq" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.939347 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvz72\" (UniqueName: \"kubernetes.io/projected/cc8b7c57-6f95-4a5c-ba46-2d6099c2d2b5-kube-api-access-hvz72\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-kvm84\" (UID: \"cc8b7c57-6f95-4a5c-ba46-2d6099c2d2b5\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-kvm84" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.939390 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv9ft\" (UniqueName: \"kubernetes.io/projected/cb4dadcd-075b-4d15-b6e8-90baf37ff7d0-kube-api-access-cv9ft\") pod \"neutron-operator-controller-manager-776c5696bf-wzdf7\" (UID: \"cb4dadcd-075b-4d15-b6e8-90baf37ff7d0\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wzdf7" Mar 14 07:18:41 crc kubenswrapper[5129]: I0314 07:18:41.939448 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stmqz\" (UniqueName: \"kubernetes.io/projected/91e2b49a-9bac-44a1-9b90-22f62a1ce727-kube-api-access-stmqz\") pod \"nova-operator-controller-manager-7f84474648-lprxs\" (UID: \"91e2b49a-9bac-44a1-9b90-22f62a1ce727\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-lprxs" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.014059 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-cwhxs" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.017585 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcl26\" (UniqueName: \"kubernetes.io/projected/04e8856d-e262-463a-9162-cb7ceef75a38-kube-api-access-hcl26\") pod \"manila-operator-controller-manager-57b484b4df-fb6px\" (UID: \"04e8856d-e262-463a-9162-cb7ceef75a38\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-fb6px" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.028655 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-2dmkf"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.033211 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv9ft\" (UniqueName: \"kubernetes.io/projected/cb4dadcd-075b-4d15-b6e8-90baf37ff7d0-kube-api-access-cv9ft\") pod 
\"neutron-operator-controller-manager-776c5696bf-wzdf7\" (UID: \"cb4dadcd-075b-4d15-b6e8-90baf37ff7d0\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wzdf7" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.044123 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stmqz\" (UniqueName: \"kubernetes.io/projected/91e2b49a-9bac-44a1-9b90-22f62a1ce727-kube-api-access-stmqz\") pod \"nova-operator-controller-manager-7f84474648-lprxs\" (UID: \"91e2b49a-9bac-44a1-9b90-22f62a1ce727\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-lprxs" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.044228 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvl2s\" (UniqueName: \"kubernetes.io/projected/f447b14b-2d09-416b-96b4-126ab3dc2515-kube-api-access-jvl2s\") pod \"octavia-operator-controller-manager-5f4f55cb5c-2dmkf\" (UID: \"f447b14b-2d09-416b-96b4-126ab3dc2515\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-2dmkf" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.044306 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvz72\" (UniqueName: \"kubernetes.io/projected/cc8b7c57-6f95-4a5c-ba46-2d6099c2d2b5-kube-api-access-hvz72\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-kvm84\" (UID: \"cc8b7c57-6f95-4a5c-ba46-2d6099c2d2b5\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-kvm84" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.070566 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-fb6px" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.072234 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l89kf\" (UniqueName: \"kubernetes.io/projected/0bc8a10a-075b-4eb1-96cd-081c4ce39a30-kube-api-access-l89kf\") pod \"keystone-operator-controller-manager-684f77d66d-n2qxq\" (UID: \"0bc8a10a-075b-4eb1-96cd-081c4ce39a30\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-n2qxq" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.094039 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stmqz\" (UniqueName: \"kubernetes.io/projected/91e2b49a-9bac-44a1-9b90-22f62a1ce727-kube-api-access-stmqz\") pod \"nova-operator-controller-manager-7f84474648-lprxs\" (UID: \"91e2b49a-9bac-44a1-9b90-22f62a1ce727\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-lprxs" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.101155 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvz72\" (UniqueName: \"kubernetes.io/projected/cc8b7c57-6f95-4a5c-ba46-2d6099c2d2b5-kube-api-access-hvz72\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-kvm84\" (UID: \"cc8b7c57-6f95-4a5c-ba46-2d6099c2d2b5\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-kvm84" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.104125 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-kvm84" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.123409 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7f84474648-lprxs" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.145263 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvl2s\" (UniqueName: \"kubernetes.io/projected/f447b14b-2d09-416b-96b4-126ab3dc2515-kube-api-access-jvl2s\") pod \"octavia-operator-controller-manager-5f4f55cb5c-2dmkf\" (UID: \"f447b14b-2d09-416b-96b4-126ab3dc2515\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-2dmkf" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.148572 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.149488 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-9z48t"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.150224 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-9z48t" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.151853 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.157997 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.158323 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-24xf7" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.158439 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-rw4p9" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.159683 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-9z48t"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.166674 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.172784 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gmxp8" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.183087 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-h5zmn"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.183130 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvl2s\" (UniqueName: \"kubernetes.io/projected/f447b14b-2d09-416b-96b4-126ab3dc2515-kube-api-access-jvl2s\") pod \"octavia-operator-controller-manager-5f4f55cb5c-2dmkf\" (UID: \"f447b14b-2d09-416b-96b4-126ab3dc2515\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-2dmkf" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.185452 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h5zmn" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.189965 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-v52pb" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.204080 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fjnr7" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.220301 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-h5zmn"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.234583 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-2dmkf" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.235131 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-4tm5b"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.237518 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-4tm5b" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.241267 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-kgqkj" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.246235 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-4tm5b"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.246649 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpxdf\" (UniqueName: \"kubernetes.io/projected/e789f354-e686-4cc9-a705-3af685a25988-kube-api-access-lpxdf\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7744kbgv\" (UID: \"e789f354-e686-4cc9-a705-3af685a25988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.246741 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwk5b\" (UniqueName: \"kubernetes.io/projected/e21e58a4-940c-4131-9d23-645393687367-kube-api-access-pwk5b\") pod \"ovn-operator-controller-manager-bbc5b68f9-9z48t\" (UID: \"e21e58a4-940c-4131-9d23-645393687367\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-9z48t" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.246857 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e789f354-e686-4cc9-a705-3af685a25988-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7744kbgv\" (UID: \"e789f354-e686-4cc9-a705-3af685a25988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.261199 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-kpb92"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.262265 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-kpb92" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.267646 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wzdf7" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.269962 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-7qspb" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.278780 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-n2qxq" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.294293 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-kpb92"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.294760 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nmvb4"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.299192 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nmvb4" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.303308 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-j9rbj" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.323442 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nmvb4"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.330362 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-58xnz"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.331429 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-58xnz" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.336293 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-dhw8b" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.346975 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-58xnz"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.348447 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khnhz\" (UniqueName: \"kubernetes.io/projected/3a4f21bd-d487-4af5-b741-4971cf4b11d1-kube-api-access-khnhz\") pod \"swift-operator-controller-manager-7f9cc5dd44-4tm5b\" (UID: \"3a4f21bd-d487-4af5-b741-4971cf4b11d1\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-4tm5b" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.348535 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/e789f354-e686-4cc9-a705-3af685a25988-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7744kbgv\" (UID: \"e789f354-e686-4cc9-a705-3af685a25988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.348573 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpxdf\" (UniqueName: \"kubernetes.io/projected/e789f354-e686-4cc9-a705-3af685a25988-kube-api-access-lpxdf\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7744kbgv\" (UID: \"e789f354-e686-4cc9-a705-3af685a25988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.348625 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgn2b\" (UniqueName: \"kubernetes.io/projected/1bf2bb51-4fd1-4d88-b663-3d41e4236ecd-kube-api-access-zgn2b\") pod \"placement-operator-controller-manager-574d45c66c-h5zmn\" (UID: \"1bf2bb51-4fd1-4d88-b663-3d41e4236ecd\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h5zmn" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.348656 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8b67l\" (UID: \"5b7ec223-c34a-45c1-926c-e957e8cd3086\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.348680 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwk5b\" (UniqueName: \"kubernetes.io/projected/e21e58a4-940c-4131-9d23-645393687367-kube-api-access-pwk5b\") pod \"ovn-operator-controller-manager-bbc5b68f9-9z48t\" (UID: 
\"e21e58a4-940c-4131-9d23-645393687367\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-9z48t" Mar 14 07:18:42 crc kubenswrapper[5129]: E0314 07:18:42.349961 5129 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:42 crc kubenswrapper[5129]: E0314 07:18:42.350018 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e789f354-e686-4cc9-a705-3af685a25988-cert podName:e789f354-e686-4cc9-a705-3af685a25988 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:42.850000145 +0000 UTC m=+1185.601915329 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e789f354-e686-4cc9-a705-3af685a25988-cert") pod "openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" (UID: "e789f354-e686-4cc9-a705-3af685a25988") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:42 crc kubenswrapper[5129]: E0314 07:18:42.350487 5129 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:42 crc kubenswrapper[5129]: E0314 07:18:42.353824 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert podName:5b7ec223-c34a-45c1-926c-e957e8cd3086 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:43.353786339 +0000 UTC m=+1186.105701533 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert") pod "infra-operator-controller-manager-54dc5b8f8d-8b67l" (UID: "5b7ec223-c34a-45c1-926c-e957e8cd3086") : secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.370374 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwk5b\" (UniqueName: \"kubernetes.io/projected/e21e58a4-940c-4131-9d23-645393687367-kube-api-access-pwk5b\") pod \"ovn-operator-controller-manager-bbc5b68f9-9z48t\" (UID: \"e21e58a4-940c-4131-9d23-645393687367\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-9z48t" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.375531 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpxdf\" (UniqueName: \"kubernetes.io/projected/e789f354-e686-4cc9-a705-3af685a25988-kube-api-access-lpxdf\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7744kbgv\" (UID: \"e789f354-e686-4cc9-a705-3af685a25988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.387771 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.388711 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.413566 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.413786 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.413923 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-mfbhx" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.433863 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.453154 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgn2b\" (UniqueName: \"kubernetes.io/projected/1bf2bb51-4fd1-4d88-b663-3d41e4236ecd-kube-api-access-zgn2b\") pod \"placement-operator-controller-manager-574d45c66c-h5zmn\" (UID: \"1bf2bb51-4fd1-4d88-b663-3d41e4236ecd\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h5zmn" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.453274 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hccpw\" (UniqueName: \"kubernetes.io/projected/15bf3262-1e7c-42a9-bf65-f8507856d922-kube-api-access-hccpw\") pod \"telemetry-operator-controller-manager-6854b8b9d9-kpb92\" (UID: \"15bf3262-1e7c-42a9-bf65-f8507856d922\") " pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-kpb92" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.453307 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4flch\" 
(UniqueName: \"kubernetes.io/projected/b1d9e89a-c41d-44a3-8fae-5a6b69f0f99f-kube-api-access-4flch\") pod \"watcher-operator-controller-manager-6c4d75f7f9-58xnz\" (UID: \"b1d9e89a-c41d-44a3-8fae-5a6b69f0f99f\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-58xnz" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.453354 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khnhz\" (UniqueName: \"kubernetes.io/projected/3a4f21bd-d487-4af5-b741-4971cf4b11d1-kube-api-access-khnhz\") pod \"swift-operator-controller-manager-7f9cc5dd44-4tm5b\" (UID: \"3a4f21bd-d487-4af5-b741-4971cf4b11d1\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-4tm5b" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.453395 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nb98\" (UniqueName: \"kubernetes.io/projected/1d8c0991-2d49-42c4-bed5-62c86ef72f24-kube-api-access-2nb98\") pod \"test-operator-controller-manager-5c5cb9c4d7-nmvb4\" (UID: \"1d8c0991-2d49-42c4-bed5-62c86ef72f24\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nmvb4" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.456807 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xgnfq"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.457714 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xgnfq" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.462017 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-hvlvx" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.484436 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgn2b\" (UniqueName: \"kubernetes.io/projected/1bf2bb51-4fd1-4d88-b663-3d41e4236ecd-kube-api-access-zgn2b\") pod \"placement-operator-controller-manager-574d45c66c-h5zmn\" (UID: \"1bf2bb51-4fd1-4d88-b663-3d41e4236ecd\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h5zmn" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.486413 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khnhz\" (UniqueName: \"kubernetes.io/projected/3a4f21bd-d487-4af5-b741-4971cf4b11d1-kube-api-access-khnhz\") pod \"swift-operator-controller-manager-7f9cc5dd44-4tm5b\" (UID: \"3a4f21bd-d487-4af5-b741-4971cf4b11d1\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-4tm5b" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.488314 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xgnfq"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.569534 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ls2h\" (UniqueName: \"kubernetes.io/projected/0be8f03b-22f7-421f-9da8-a3653e323613-kube-api-access-9ls2h\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xgnfq\" (UID: \"0be8f03b-22f7-421f-9da8-a3653e323613\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xgnfq" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.570838 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.570955 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hccpw\" (UniqueName: \"kubernetes.io/projected/15bf3262-1e7c-42a9-bf65-f8507856d922-kube-api-access-hccpw\") pod \"telemetry-operator-controller-manager-6854b8b9d9-kpb92\" (UID: \"15bf3262-1e7c-42a9-bf65-f8507856d922\") " pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-kpb92" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.571032 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4flch\" (UniqueName: \"kubernetes.io/projected/b1d9e89a-c41d-44a3-8fae-5a6b69f0f99f-kube-api-access-4flch\") pod \"watcher-operator-controller-manager-6c4d75f7f9-58xnz\" (UID: \"b1d9e89a-c41d-44a3-8fae-5a6b69f0f99f\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-58xnz" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.571117 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nb98\" (UniqueName: \"kubernetes.io/projected/1d8c0991-2d49-42c4-bed5-62c86ef72f24-kube-api-access-2nb98\") pod \"test-operator-controller-manager-5c5cb9c4d7-nmvb4\" (UID: \"1d8c0991-2d49-42c4-bed5-62c86ef72f24\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nmvb4" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.571183 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.571271 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjhp2\" (UniqueName: \"kubernetes.io/projected/4484eedf-8b6d-45a2-af19-09ada3258a22-kube-api-access-sjhp2\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.589957 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-9z48t" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.594654 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hccpw\" (UniqueName: \"kubernetes.io/projected/15bf3262-1e7c-42a9-bf65-f8507856d922-kube-api-access-hccpw\") pod \"telemetry-operator-controller-manager-6854b8b9d9-kpb92\" (UID: \"15bf3262-1e7c-42a9-bf65-f8507856d922\") " pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-kpb92" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.602085 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nb98\" (UniqueName: \"kubernetes.io/projected/1d8c0991-2d49-42c4-bed5-62c86ef72f24-kube-api-access-2nb98\") pod \"test-operator-controller-manager-5c5cb9c4d7-nmvb4\" (UID: \"1d8c0991-2d49-42c4-bed5-62c86ef72f24\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nmvb4" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.606841 5129 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-4flch\" (UniqueName: \"kubernetes.io/projected/b1d9e89a-c41d-44a3-8fae-5a6b69f0f99f-kube-api-access-4flch\") pod \"watcher-operator-controller-manager-6c4d75f7f9-58xnz\" (UID: \"b1d9e89a-c41d-44a3-8fae-5a6b69f0f99f\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-58xnz" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.617102 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h5zmn" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.627192 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-s8dzz"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.631940 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-4tm5b" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.648123 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-kpb92" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.673887 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.673967 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjhp2\" (UniqueName: \"kubernetes.io/projected/4484eedf-8b6d-45a2-af19-09ada3258a22-kube-api-access-sjhp2\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.674030 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ls2h\" (UniqueName: \"kubernetes.io/projected/0be8f03b-22f7-421f-9da8-a3653e323613-kube-api-access-9ls2h\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xgnfq\" (UID: \"0be8f03b-22f7-421f-9da8-a3653e323613\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xgnfq" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.674117 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:42 crc kubenswrapper[5129]: E0314 07:18:42.674332 5129 
secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 07:18:42 crc kubenswrapper[5129]: E0314 07:18:42.674379 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-metrics-certs podName:4484eedf-8b6d-45a2-af19-09ada3258a22 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:43.174364831 +0000 UTC m=+1185.926280015 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-metrics-certs") pod "openstack-operator-controller-manager-6484b7b757-jwxjp" (UID: "4484eedf-8b6d-45a2-af19-09ada3258a22") : secret "metrics-server-cert" not found Mar 14 07:18:42 crc kubenswrapper[5129]: E0314 07:18:42.674419 5129 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 07:18:42 crc kubenswrapper[5129]: E0314 07:18:42.674438 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-webhook-certs podName:4484eedf-8b6d-45a2-af19-09ada3258a22 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:43.174431522 +0000 UTC m=+1185.926346696 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-webhook-certs") pod "openstack-operator-controller-manager-6484b7b757-jwxjp" (UID: "4484eedf-8b6d-45a2-af19-09ada3258a22") : secret "webhook-server-cert" not found Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.693793 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjhp2\" (UniqueName: \"kubernetes.io/projected/4484eedf-8b6d-45a2-af19-09ada3258a22-kube-api-access-sjhp2\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.707211 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ls2h\" (UniqueName: \"kubernetes.io/projected/0be8f03b-22f7-421f-9da8-a3653e323613-kube-api-access-9ls2h\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xgnfq\" (UID: \"0be8f03b-22f7-421f-9da8-a3653e323613\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xgnfq" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.718689 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nmvb4" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.721307 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-58xnz" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.727506 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-rzk97"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.762186 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-rkcbz"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.771010 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-6vg2z"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.796079 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xgnfq" Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.878584 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e789f354-e686-4cc9-a705-3af685a25988-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7744kbgv\" (UID: \"e789f354-e686-4cc9-a705-3af685a25988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" Mar 14 07:18:42 crc kubenswrapper[5129]: E0314 07:18:42.878775 5129 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:42 crc kubenswrapper[5129]: E0314 07:18:42.878827 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e789f354-e686-4cc9-a705-3af685a25988-cert podName:e789f354-e686-4cc9-a705-3af685a25988 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:43.87880948 +0000 UTC m=+1186.630724664 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e789f354-e686-4cc9-a705-3af685a25988-cert") pod "openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" (UID: "e789f354-e686-4cc9-a705-3af685a25988") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:42 crc kubenswrapper[5129]: W0314 07:18:42.887735 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod689ffa0e_d6ac_4dfb_bd55_3a957aeb1cb6.slice/crio-1d6b974d4b2aa0af665b358fb892c855cd75b99309e4988a1f060873e543e445 WatchSource:0}: Error finding container 1d6b974d4b2aa0af665b358fb892c855cd75b99309e4988a1f060873e543e445: Status 404 returned error can't find the container with id 1d6b974d4b2aa0af665b358fb892c855cd75b99309e4988a1f060873e543e445 Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.938029 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-cwhxs"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.945009 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-lprxs"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.952390 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-kvm84"] Mar 14 07:18:42 crc kubenswrapper[5129]: I0314 07:18:42.968269 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-fb6px"] Mar 14 07:18:43 crc kubenswrapper[5129]: W0314 07:18:43.016845 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04e8856d_e262_463a_9162_cb7ceef75a38.slice/crio-c9b8173614c895a66dbb8222448c91b43e513718e898c77d27113a61ee182ff1 WatchSource:0}: Error finding container 
c9b8173614c895a66dbb8222448c91b43e513718e898c77d27113a61ee182ff1: Status 404 returned error can't find the container with id c9b8173614c895a66dbb8222448c91b43e513718e898c77d27113a61ee182ff1 Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.118639 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-rkcbz" event={"ID":"689ffa0e-d6ac-4dfb-bd55-3a957aeb1cb6","Type":"ContainerStarted","Data":"1d6b974d4b2aa0af665b358fb892c855cd75b99309e4988a1f060873e543e445"} Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.120444 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rzk97" event={"ID":"33a84627-b97d-4f3b-84ec-d81a54c1e56c","Type":"ContainerStarted","Data":"75798eaa3ee753b292b9c1b4dcde8ee767f6451cc821bde8cc88ee355b4e7da1"} Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.121353 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-fb6px" event={"ID":"04e8856d-e262-463a-9162-cb7ceef75a38","Type":"ContainerStarted","Data":"c9b8173614c895a66dbb8222448c91b43e513718e898c77d27113a61ee182ff1"} Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.122378 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-kvm84" event={"ID":"cc8b7c57-6f95-4a5c-ba46-2d6099c2d2b5","Type":"ContainerStarted","Data":"397372bce6ed1708ba7a9c02e1114f507360ad7ed20d198ce3bcdb10a7c9de9a"} Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.123193 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8dzz" event={"ID":"ff95ea02-39f3-4d5c-85c7-c194f3ee2ef5","Type":"ContainerStarted","Data":"7f1ff21906604723c313548adf3f68752f80c58a6d6e6c6591d7e33a2c920982"} Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.124209 
5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6vg2z" event={"ID":"3b985683-b068-4b76-b702-927b15cc10ff","Type":"ContainerStarted","Data":"b8c10e2f7e4d6cb05476277aadc4015546a0fa2e1e7afce70c3545bccc0cee3d"} Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.125624 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7f84474648-lprxs" event={"ID":"91e2b49a-9bac-44a1-9b90-22f62a1ce727","Type":"ContainerStarted","Data":"05b4bb9b82c3e7f3e9d2e2040addddb699906be8b7836176ffeb40dc74863b77"} Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.127138 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-cwhxs" event={"ID":"2c7b2dc9-e244-426d-b611-ee2629816c17","Type":"ContainerStarted","Data":"a324cb4cd67e6327bb47a0090ecd0926fbdc0f2f008443807a4f97849271a642"} Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.185364 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.185461 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.185532 5129 secret.go:188] Couldn't get secret 
openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.185612 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-metrics-certs podName:4484eedf-8b6d-45a2-af19-09ada3258a22 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:44.185580631 +0000 UTC m=+1186.937495815 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-metrics-certs") pod "openstack-operator-controller-manager-6484b7b757-jwxjp" (UID: "4484eedf-8b6d-45a2-af19-09ada3258a22") : secret "metrics-server-cert" not found Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.185816 5129 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.185886 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-webhook-certs podName:4484eedf-8b6d-45a2-af19-09ada3258a22 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:44.185869679 +0000 UTC m=+1186.937784863 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-webhook-certs") pod "openstack-operator-controller-manager-6484b7b757-jwxjp" (UID: "4484eedf-8b6d-45a2-af19-09ada3258a22") : secret "webhook-server-cert" not found Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.188751 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-gmxp8"] Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.195124 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fjnr7"] Mar 14 07:18:43 crc kubenswrapper[5129]: W0314 07:18:43.198556 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda56d7e4a_22ff_40c9_b6f6_b070fc628880.slice/crio-612eb92522665d27bdc63b21bb6bc22c28fe78f2c71f3c64e18abb3d7d561748 WatchSource:0}: Error finding container 612eb92522665d27bdc63b21bb6bc22c28fe78f2c71f3c64e18abb3d7d561748: Status 404 returned error can't find the container with id 612eb92522665d27bdc63b21bb6bc22c28fe78f2c71f3c64e18abb3d7d561748 Mar 14 07:18:43 crc kubenswrapper[5129]: W0314 07:18:43.202842 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ece0928_a8a0_46b8_98bd_8b35c8d07fca.slice/crio-f4d37e560a038784b8126197f4a95f3cd6cd9179a7f4acae8b2e9ac761d32e43 WatchSource:0}: Error finding container f4d37e560a038784b8126197f4a95f3cd6cd9179a7f4acae8b2e9ac761d32e43: Status 404 returned error can't find the container with id f4d37e560a038784b8126197f4a95f3cd6cd9179a7f4acae8b2e9ac761d32e43 Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.388876 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert\") 
pod \"infra-operator-controller-manager-54dc5b8f8d-8b67l\" (UID: \"5b7ec223-c34a-45c1-926c-e957e8cd3086\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.389400 5129 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.389457 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert podName:5b7ec223-c34a-45c1-926c-e957e8cd3086 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:45.389437744 +0000 UTC m=+1188.141352938 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert") pod "infra-operator-controller-manager-54dc5b8f8d-8b67l" (UID: "5b7ec223-c34a-45c1-926c-e957e8cd3086") : secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.424561 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-n2qxq"] Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.438233 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-2dmkf"] Mar 14 07:18:43 crc kubenswrapper[5129]: W0314 07:18:43.439672 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bc8a10a_075b_4eb1_96cd_081c4ce39a30.slice/crio-7a01c12123215b508665545b40992b71c4348606acbee45d143359a8a0a517c1 WatchSource:0}: Error finding container 7a01c12123215b508665545b40992b71c4348606acbee45d143359a8a0a517c1: Status 404 returned error can't find the container with id 7a01c12123215b508665545b40992b71c4348606acbee45d143359a8a0a517c1 Mar 14 
07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.448139 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-wzdf7"] Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.463352 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-kpb92"] Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.540160 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-4tm5b"] Mar 14 07:18:43 crc kubenswrapper[5129]: W0314 07:18:43.542300 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a4f21bd_d487_4af5_b741_4971cf4b11d1.slice/crio-6809cd5a36c245471770f89ec867b4e9468d15d912921ff720673da94263e605 WatchSource:0}: Error finding container 6809cd5a36c245471770f89ec867b4e9468d15d912921ff720673da94263e605: Status 404 returned error can't find the container with id 6809cd5a36c245471770f89ec867b4e9468d15d912921ff720673da94263e605 Mar 14 07:18:43 crc kubenswrapper[5129]: W0314 07:18:43.542816 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode21e58a4_940c_4131_9d23_645393687367.slice/crio-bc0cd3a7eb653c72d11827d1da8549463da3793af15cf0285445fc51e2fe2689 WatchSource:0}: Error finding container bc0cd3a7eb653c72d11827d1da8549463da3793af15cf0285445fc51e2fe2689: Status 404 returned error can't find the container with id bc0cd3a7eb653c72d11827d1da8549463da3793af15cf0285445fc51e2fe2689 Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.544105 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-9z48t"] Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.545170 5129 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pwk5b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-9z48t_openstack-operators(e21e58a4-940c-4131-9d23-645393687367): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.547665 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-9z48t" podUID="e21e58a4-940c-4131-9d23-645393687367" Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.554532 5129 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-khnhz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-7f9cc5dd44-4tm5b_openstack-operators(3a4f21bd-d487-4af5-b741-4971cf4b11d1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 07:18:43 crc kubenswrapper[5129]: W0314 07:18:43.557560 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d8c0991_2d49_42c4_bed5_62c86ef72f24.slice/crio-13b7385dd4713a4b17911de7d7a93a7e6e04fb1b28c6938ebe2224e78e33b514 WatchSource:0}: Error finding container 13b7385dd4713a4b17911de7d7a93a7e6e04fb1b28c6938ebe2224e78e33b514: Status 404 returned error can't find the container with id 13b7385dd4713a4b17911de7d7a93a7e6e04fb1b28c6938ebe2224e78e33b514 Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.557578 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-4tm5b" podUID="3a4f21bd-d487-4af5-b741-4971cf4b11d1" Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.558981 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xgnfq"] Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.559889 5129 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2nb98,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-nmvb4_openstack-operators(1d8c0991-2d49-42c4-bed5-62c86ef72f24): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.560992 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nmvb4" podUID="1d8c0991-2d49-42c4-bed5-62c86ef72f24" Mar 14 07:18:43 crc kubenswrapper[5129]: W0314 07:18:43.563296 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0be8f03b_22f7_421f_9da8_a3653e323613.slice/crio-395b8c5f87bc3604f04cae3139c8df9db3c45153a8550043f8fe7487eb171df9 WatchSource:0}: Error finding container 395b8c5f87bc3604f04cae3139c8df9db3c45153a8550043f8fe7487eb171df9: Status 404 returned error can't find the container with id 395b8c5f87bc3604f04cae3139c8df9db3c45153a8550043f8fe7487eb171df9 Mar 14 07:18:43 crc kubenswrapper[5129]: W0314 07:18:43.565156 5129 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1d9e89a_c41d_44a3_8fae_5a6b69f0f99f.slice/crio-feef43d0cf72c6b68501e9551530e5631610fe2eb3e8d82891cfd271c3401e03 WatchSource:0}: Error finding container feef43d0cf72c6b68501e9551530e5631610fe2eb3e8d82891cfd271c3401e03: Status 404 returned error can't find the container with id feef43d0cf72c6b68501e9551530e5631610fe2eb3e8d82891cfd271c3401e03 Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.565960 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nmvb4"] Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.571690 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-58xnz"] Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.574102 5129 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9ls2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-xgnfq_openstack-operators(0be8f03b-22f7-421f-9da8-a3653e323613): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.575397 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xgnfq" podUID="0be8f03b-22f7-421f-9da8-a3653e323613" Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.576086 5129 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4flch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-58xnz_openstack-operators(b1d9e89a-c41d-44a3-8fae-5a6b69f0f99f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.576634 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-h5zmn"] Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.577277 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-58xnz" podUID="b1d9e89a-c41d-44a3-8fae-5a6b69f0f99f" Mar 14 07:18:43 crc kubenswrapper[5129]: W0314 07:18:43.583498 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bf2bb51_4fd1_4d88_b663_3d41e4236ecd.slice/crio-28c24be1ef20f284b59dbd3b5d05f3fdf276eae8f8a5c62435691c770154dc0d WatchSource:0}: Error finding container 28c24be1ef20f284b59dbd3b5d05f3fdf276eae8f8a5c62435691c770154dc0d: Status 404 returned error can't find the container with id 
28c24be1ef20f284b59dbd3b5d05f3fdf276eae8f8a5c62435691c770154dc0d Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.586858 5129 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zgn2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-574d45c66c-h5zmn_openstack-operators(1bf2bb51-4fd1-4d88-b663-3d41e4236ecd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.588273 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h5zmn" podUID="1bf2bb51-4fd1-4d88-b663-3d41e4236ecd" Mar 14 07:18:43 crc kubenswrapper[5129]: I0314 07:18:43.902382 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e789f354-e686-4cc9-a705-3af685a25988-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7744kbgv\" (UID: \"e789f354-e686-4cc9-a705-3af685a25988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.902772 5129 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not 
found Mar 14 07:18:43 crc kubenswrapper[5129]: E0314 07:18:43.902845 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e789f354-e686-4cc9-a705-3af685a25988-cert podName:e789f354-e686-4cc9-a705-3af685a25988 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:45.902827605 +0000 UTC m=+1188.654742779 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e789f354-e686-4cc9-a705-3af685a25988-cert") pod "openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" (UID: "e789f354-e686-4cc9-a705-3af685a25988") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:44 crc kubenswrapper[5129]: I0314 07:18:44.134757 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-9z48t" event={"ID":"e21e58a4-940c-4131-9d23-645393687367","Type":"ContainerStarted","Data":"bc0cd3a7eb653c72d11827d1da8549463da3793af15cf0285445fc51e2fe2689"} Mar 14 07:18:44 crc kubenswrapper[5129]: E0314 07:18:44.136168 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-9z48t" podUID="e21e58a4-940c-4131-9d23-645393687367" Mar 14 07:18:44 crc kubenswrapper[5129]: I0314 07:18:44.136505 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-2dmkf" event={"ID":"f447b14b-2d09-416b-96b4-126ab3dc2515","Type":"ContainerStarted","Data":"afb8ee654fc0252a9b0626ae6e20e0c2061509e37d1dd14f72621898d3b2bba7"} Mar 14 07:18:44 crc kubenswrapper[5129]: I0314 07:18:44.138968 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xgnfq" event={"ID":"0be8f03b-22f7-421f-9da8-a3653e323613","Type":"ContainerStarted","Data":"395b8c5f87bc3604f04cae3139c8df9db3c45153a8550043f8fe7487eb171df9"} Mar 14 07:18:44 crc kubenswrapper[5129]: E0314 07:18:44.139887 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xgnfq" podUID="0be8f03b-22f7-421f-9da8-a3653e323613" Mar 14 07:18:44 crc kubenswrapper[5129]: I0314 07:18:44.141412 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wzdf7" event={"ID":"cb4dadcd-075b-4d15-b6e8-90baf37ff7d0","Type":"ContainerStarted","Data":"d2716c2692f098c31bf450a828b6def3e26927cdf60a090d1ead93db3de8dd1b"} Mar 14 07:18:44 crc kubenswrapper[5129]: I0314 07:18:44.143062 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-58xnz" event={"ID":"b1d9e89a-c41d-44a3-8fae-5a6b69f0f99f","Type":"ContainerStarted","Data":"feef43d0cf72c6b68501e9551530e5631610fe2eb3e8d82891cfd271c3401e03"} Mar 14 07:18:44 crc kubenswrapper[5129]: I0314 07:18:44.144890 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-n2qxq" event={"ID":"0bc8a10a-075b-4eb1-96cd-081c4ce39a30","Type":"ContainerStarted","Data":"7a01c12123215b508665545b40992b71c4348606acbee45d143359a8a0a517c1"} Mar 14 07:18:44 crc kubenswrapper[5129]: I0314 07:18:44.146623 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fjnr7" 
event={"ID":"a56d7e4a-22ff-40c9-b6f6-b070fc628880","Type":"ContainerStarted","Data":"612eb92522665d27bdc63b21bb6bc22c28fe78f2c71f3c64e18abb3d7d561748"} Mar 14 07:18:44 crc kubenswrapper[5129]: E0314 07:18:44.147750 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-58xnz" podUID="b1d9e89a-c41d-44a3-8fae-5a6b69f0f99f" Mar 14 07:18:44 crc kubenswrapper[5129]: I0314 07:18:44.160543 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nmvb4" event={"ID":"1d8c0991-2d49-42c4-bed5-62c86ef72f24","Type":"ContainerStarted","Data":"13b7385dd4713a4b17911de7d7a93a7e6e04fb1b28c6938ebe2224e78e33b514"} Mar 14 07:18:44 crc kubenswrapper[5129]: E0314 07:18:44.162219 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nmvb4" podUID="1d8c0991-2d49-42c4-bed5-62c86ef72f24" Mar 14 07:18:44 crc kubenswrapper[5129]: I0314 07:18:44.163000 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-4tm5b" event={"ID":"3a4f21bd-d487-4af5-b741-4971cf4b11d1","Type":"ContainerStarted","Data":"6809cd5a36c245471770f89ec867b4e9468d15d912921ff720673da94263e605"} Mar 14 07:18:44 crc kubenswrapper[5129]: E0314 07:18:44.164196 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5\\\"\"" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-4tm5b" podUID="3a4f21bd-d487-4af5-b741-4971cf4b11d1" Mar 14 07:18:44 crc kubenswrapper[5129]: I0314 07:18:44.165572 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-kpb92" event={"ID":"15bf3262-1e7c-42a9-bf65-f8507856d922","Type":"ContainerStarted","Data":"fe4539736b07834019e54a539f2b365d3bca502c125014cc974a8d862ad4fe2b"} Mar 14 07:18:44 crc kubenswrapper[5129]: I0314 07:18:44.167379 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h5zmn" event={"ID":"1bf2bb51-4fd1-4d88-b663-3d41e4236ecd","Type":"ContainerStarted","Data":"28c24be1ef20f284b59dbd3b5d05f3fdf276eae8f8a5c62435691c770154dc0d"} Mar 14 07:18:44 crc kubenswrapper[5129]: I0314 07:18:44.171480 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gmxp8" event={"ID":"0ece0928-a8a0-46b8-98bd-8b35c8d07fca","Type":"ContainerStarted","Data":"f4d37e560a038784b8126197f4a95f3cd6cd9179a7f4acae8b2e9ac761d32e43"} Mar 14 07:18:44 crc kubenswrapper[5129]: E0314 07:18:44.172922 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h5zmn" podUID="1bf2bb51-4fd1-4d88-b663-3d41e4236ecd" Mar 14 07:18:44 crc kubenswrapper[5129]: I0314 07:18:44.212667 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:44 crc kubenswrapper[5129]: E0314 07:18:44.212790 5129 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 07:18:44 crc kubenswrapper[5129]: E0314 07:18:44.212831 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-webhook-certs podName:4484eedf-8b6d-45a2-af19-09ada3258a22 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:46.212816505 +0000 UTC m=+1188.964731689 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-webhook-certs") pod "openstack-operator-controller-manager-6484b7b757-jwxjp" (UID: "4484eedf-8b6d-45a2-af19-09ada3258a22") : secret "webhook-server-cert" not found Mar 14 07:18:44 crc kubenswrapper[5129]: I0314 07:18:44.214059 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:44 crc kubenswrapper[5129]: E0314 07:18:44.214342 5129 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 07:18:44 crc kubenswrapper[5129]: E0314 07:18:44.214380 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-metrics-certs podName:4484eedf-8b6d-45a2-af19-09ada3258a22 
nodeName:}" failed. No retries permitted until 2026-03-14 07:18:46.214371977 +0000 UTC m=+1188.966287161 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-metrics-certs") pod "openstack-operator-controller-manager-6484b7b757-jwxjp" (UID: "4484eedf-8b6d-45a2-af19-09ada3258a22") : secret "metrics-server-cert" not found Mar 14 07:18:45 crc kubenswrapper[5129]: E0314 07:18:45.181412 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5\\\"\"" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-4tm5b" podUID="3a4f21bd-d487-4af5-b741-4971cf4b11d1" Mar 14 07:18:45 crc kubenswrapper[5129]: E0314 07:18:45.182862 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-9z48t" podUID="e21e58a4-940c-4131-9d23-645393687367" Mar 14 07:18:45 crc kubenswrapper[5129]: E0314 07:18:45.182908 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nmvb4" podUID="1d8c0991-2d49-42c4-bed5-62c86ef72f24" Mar 14 07:18:45 crc kubenswrapper[5129]: E0314 07:18:45.182932 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" 
with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-58xnz" podUID="b1d9e89a-c41d-44a3-8fae-5a6b69f0f99f" Mar 14 07:18:45 crc kubenswrapper[5129]: E0314 07:18:45.182996 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xgnfq" podUID="0be8f03b-22f7-421f-9da8-a3653e323613" Mar 14 07:18:45 crc kubenswrapper[5129]: E0314 07:18:45.183009 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h5zmn" podUID="1bf2bb51-4fd1-4d88-b663-3d41e4236ecd" Mar 14 07:18:45 crc kubenswrapper[5129]: I0314 07:18:45.430846 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8b67l\" (UID: \"5b7ec223-c34a-45c1-926c-e957e8cd3086\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" Mar 14 07:18:45 crc kubenswrapper[5129]: E0314 07:18:45.431020 5129 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:45 crc kubenswrapper[5129]: E0314 07:18:45.431109 5129 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert podName:5b7ec223-c34a-45c1-926c-e957e8cd3086 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:49.43108783 +0000 UTC m=+1192.183003084 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert") pod "infra-operator-controller-manager-54dc5b8f8d-8b67l" (UID: "5b7ec223-c34a-45c1-926c-e957e8cd3086") : secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:45 crc kubenswrapper[5129]: I0314 07:18:45.936897 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e789f354-e686-4cc9-a705-3af685a25988-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7744kbgv\" (UID: \"e789f354-e686-4cc9-a705-3af685a25988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" Mar 14 07:18:45 crc kubenswrapper[5129]: E0314 07:18:45.937147 5129 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:45 crc kubenswrapper[5129]: E0314 07:18:45.937282 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e789f354-e686-4cc9-a705-3af685a25988-cert podName:e789f354-e686-4cc9-a705-3af685a25988 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:49.937253191 +0000 UTC m=+1192.689168375 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e789f354-e686-4cc9-a705-3af685a25988-cert") pod "openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" (UID: "e789f354-e686-4cc9-a705-3af685a25988") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:46 crc kubenswrapper[5129]: I0314 07:18:46.241179 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:46 crc kubenswrapper[5129]: I0314 07:18:46.241261 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:46 crc kubenswrapper[5129]: E0314 07:18:46.241357 5129 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 07:18:46 crc kubenswrapper[5129]: E0314 07:18:46.241455 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-metrics-certs podName:4484eedf-8b6d-45a2-af19-09ada3258a22 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:50.241437632 +0000 UTC m=+1192.993352816 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-metrics-certs") pod "openstack-operator-controller-manager-6484b7b757-jwxjp" (UID: "4484eedf-8b6d-45a2-af19-09ada3258a22") : secret "metrics-server-cert" not found Mar 14 07:18:46 crc kubenswrapper[5129]: E0314 07:18:46.241361 5129 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 07:18:46 crc kubenswrapper[5129]: E0314 07:18:46.241492 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-webhook-certs podName:4484eedf-8b6d-45a2-af19-09ada3258a22 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:50.241484424 +0000 UTC m=+1192.993399608 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-webhook-certs") pod "openstack-operator-controller-manager-6484b7b757-jwxjp" (UID: "4484eedf-8b6d-45a2-af19-09ada3258a22") : secret "webhook-server-cert" not found Mar 14 07:18:49 crc kubenswrapper[5129]: I0314 07:18:49.495151 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8b67l\" (UID: \"5b7ec223-c34a-45c1-926c-e957e8cd3086\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" Mar 14 07:18:49 crc kubenswrapper[5129]: E0314 07:18:49.495374 5129 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:49 crc kubenswrapper[5129]: E0314 07:18:49.496019 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert 
podName:5b7ec223-c34a-45c1-926c-e957e8cd3086 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:57.495982224 +0000 UTC m=+1200.247897448 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert") pod "infra-operator-controller-manager-54dc5b8f8d-8b67l" (UID: "5b7ec223-c34a-45c1-926c-e957e8cd3086") : secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:50 crc kubenswrapper[5129]: I0314 07:18:50.002566 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e789f354-e686-4cc9-a705-3af685a25988-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7744kbgv\" (UID: \"e789f354-e686-4cc9-a705-3af685a25988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" Mar 14 07:18:50 crc kubenswrapper[5129]: E0314 07:18:50.002700 5129 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:50 crc kubenswrapper[5129]: E0314 07:18:50.003157 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e789f354-e686-4cc9-a705-3af685a25988-cert podName:e789f354-e686-4cc9-a705-3af685a25988 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:58.003141194 +0000 UTC m=+1200.755056378 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e789f354-e686-4cc9-a705-3af685a25988-cert") pod "openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" (UID: "e789f354-e686-4cc9-a705-3af685a25988") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 07:18:50 crc kubenswrapper[5129]: I0314 07:18:50.307295 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:50 crc kubenswrapper[5129]: I0314 07:18:50.307407 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:50 crc kubenswrapper[5129]: E0314 07:18:50.307508 5129 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 07:18:50 crc kubenswrapper[5129]: E0314 07:18:50.307508 5129 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 07:18:50 crc kubenswrapper[5129]: E0314 07:18:50.307554 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-webhook-certs podName:4484eedf-8b6d-45a2-af19-09ada3258a22 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:58.30753883 +0000 UTC m=+1201.059454014 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-webhook-certs") pod "openstack-operator-controller-manager-6484b7b757-jwxjp" (UID: "4484eedf-8b6d-45a2-af19-09ada3258a22") : secret "webhook-server-cert" not found Mar 14 07:18:50 crc kubenswrapper[5129]: E0314 07:18:50.307566 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-metrics-certs podName:4484eedf-8b6d-45a2-af19-09ada3258a22 nodeName:}" failed. No retries permitted until 2026-03-14 07:18:58.30756079 +0000 UTC m=+1201.059475974 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-metrics-certs") pod "openstack-operator-controller-manager-6484b7b757-jwxjp" (UID: "4484eedf-8b6d-45a2-af19-09ada3258a22") : secret "metrics-server-cert" not found Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.256763 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-fb6px" event={"ID":"04e8856d-e262-463a-9162-cb7ceef75a38","Type":"ContainerStarted","Data":"104f717307dce1b2bf4dbbeeab3b6a94800ebdf672475e1ed46f95ce6e530f3f"} Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.258099 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-fb6px" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.260615 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gmxp8" event={"ID":"0ece0928-a8a0-46b8-98bd-8b35c8d07fca","Type":"ContainerStarted","Data":"4ff31454927a981953b0dafafb2e27ca696dce4920dfa14b4396fc9aa9ccf19d"} Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.261087 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gmxp8" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.262646 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wzdf7" event={"ID":"cb4dadcd-075b-4d15-b6e8-90baf37ff7d0","Type":"ContainerStarted","Data":"d4b5995316c73e4324d6a310ce2c57652f5e23183d090964f2a3d1c7fcf75ffc"} Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.263093 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wzdf7" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.264638 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fjnr7" event={"ID":"a56d7e4a-22ff-40c9-b6f6-b070fc628880","Type":"ContainerStarted","Data":"78f636abb898fa079a0775aed240522d34d7db328855ce8aaca8b35a4f75ecd7"} Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.264927 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fjnr7" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.266156 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-cwhxs" event={"ID":"2c7b2dc9-e244-426d-b611-ee2629816c17","Type":"ContainerStarted","Data":"a4971398b2c39013f19f4c756d5a1d4c3fc5bdd370e025c3b894bdd25438ab10"} Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.266237 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-cwhxs" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.267579 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-2dmkf" 
event={"ID":"f447b14b-2d09-416b-96b4-126ab3dc2515","Type":"ContainerStarted","Data":"30940e2501229f9f25df7b2cfe6b2a08732b92a974c9b95c4ec4a89b1d6f021e"} Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.267645 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-2dmkf" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.268986 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6vg2z" event={"ID":"3b985683-b068-4b76-b702-927b15cc10ff","Type":"ContainerStarted","Data":"a1aa81eacc5c9433816c46d2c577cba12a2e9c2fc36bf7b28316b54584745233"} Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.269343 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6vg2z" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.270704 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rzk97" event={"ID":"33a84627-b97d-4f3b-84ec-d81a54c1e56c","Type":"ContainerStarted","Data":"e6125d82b0468d8d830354a3b7b572351b1063f87e4db7a91f6c655662ad8cb1"} Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.271050 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rzk97" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.272311 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-kpb92" event={"ID":"15bf3262-1e7c-42a9-bf65-f8507856d922","Type":"ContainerStarted","Data":"6c7f23ea8a6dfeaf1c87eb99c377c0304351751fbd402f5a05ddb92c4c3fb2c5"} Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.272728 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-kpb92" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.274192 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-rkcbz" event={"ID":"689ffa0e-d6ac-4dfb-bd55-3a957aeb1cb6","Type":"ContainerStarted","Data":"0e33093a25c5ecbcbe50ec072d6022ecbcc02de7a64b2e70a49187d367588556"} Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.274540 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-rkcbz" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.276026 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7f84474648-lprxs" event={"ID":"91e2b49a-9bac-44a1-9b90-22f62a1ce727","Type":"ContainerStarted","Data":"e03a9abbfbb4a8899b828194a3a9559d2db6a4c4741e798bead0c9eaac6d0a5d"} Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.276522 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7f84474648-lprxs" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.277535 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-n2qxq" event={"ID":"0bc8a10a-075b-4eb1-96cd-081c4ce39a30","Type":"ContainerStarted","Data":"dc41e39d39da578e763c45242ac45d5f5e4afabe72131e5a46cbb2e8904146ef"} Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.277893 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-n2qxq" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.282331 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-kvm84" 
event={"ID":"cc8b7c57-6f95-4a5c-ba46-2d6099c2d2b5","Type":"ContainerStarted","Data":"d401fb830449a8355314643a00e38a46ca94bc98d007bcad216f6bdce5c84f68"} Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.282508 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-kvm84" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.283699 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8dzz" event={"ID":"ff95ea02-39f3-4d5c-85c7-c194f3ee2ef5","Type":"ContainerStarted","Data":"116a4f4968400dc918c771f5b4280c89efef252ccdf5af1ee75be70979004faa"} Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.284216 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8dzz" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.343260 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-fb6px" podStartSLOduration=4.744892774 podStartE2EDuration="15.343244923s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:43.02147596 +0000 UTC m=+1185.773391144" lastFinishedPulling="2026-03-14 07:18:53.619828109 +0000 UTC m=+1196.371743293" observedRunningTime="2026-03-14 07:18:56.290350559 +0000 UTC m=+1199.042265753" watchObservedRunningTime="2026-03-14 07:18:56.343244923 +0000 UTC m=+1199.095160107" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.345233 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fjnr7" podStartSLOduration=3.066889925 podStartE2EDuration="15.345226778s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:43.200952854 +0000 UTC m=+1185.952868038" 
lastFinishedPulling="2026-03-14 07:18:55.479289687 +0000 UTC m=+1198.231204891" observedRunningTime="2026-03-14 07:18:56.340739844 +0000 UTC m=+1199.092655028" watchObservedRunningTime="2026-03-14 07:18:56.345226778 +0000 UTC m=+1199.097141962" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.539367 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-kvm84" podStartSLOduration=3.083819829 podStartE2EDuration="15.539352013s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:43.010009625 +0000 UTC m=+1185.761924809" lastFinishedPulling="2026-03-14 07:18:55.465541809 +0000 UTC m=+1198.217456993" observedRunningTime="2026-03-14 07:18:56.484079654 +0000 UTC m=+1199.235994838" watchObservedRunningTime="2026-03-14 07:18:56.539352013 +0000 UTC m=+1199.291267197" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.541234 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-kpb92" podStartSLOduration=3.471928777 podStartE2EDuration="15.541226605s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:43.452485967 +0000 UTC m=+1186.204401151" lastFinishedPulling="2026-03-14 07:18:55.521783795 +0000 UTC m=+1198.273698979" observedRunningTime="2026-03-14 07:18:56.535284071 +0000 UTC m=+1199.287199245" watchObservedRunningTime="2026-03-14 07:18:56.541226605 +0000 UTC m=+1199.293141799" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.596197 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-cwhxs" podStartSLOduration=3.587171425 podStartE2EDuration="15.596180755s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:43.016444043 +0000 UTC m=+1185.768359227" 
lastFinishedPulling="2026-03-14 07:18:55.025453373 +0000 UTC m=+1197.777368557" observedRunningTime="2026-03-14 07:18:56.588528734 +0000 UTC m=+1199.340443918" watchObservedRunningTime="2026-03-14 07:18:56.596180755 +0000 UTC m=+1199.348095939" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.647318 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gmxp8" podStartSLOduration=3.34804628 podStartE2EDuration="15.647301379s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:43.208648255 +0000 UTC m=+1185.960563439" lastFinishedPulling="2026-03-14 07:18:55.507903354 +0000 UTC m=+1198.259818538" observedRunningTime="2026-03-14 07:18:56.6444262 +0000 UTC m=+1199.396341394" watchObservedRunningTime="2026-03-14 07:18:56.647301379 +0000 UTC m=+1199.399216563" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.676688 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-rkcbz" podStartSLOduration=3.118636006 podStartE2EDuration="15.676671657s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:42.907507848 +0000 UTC m=+1185.659423032" lastFinishedPulling="2026-03-14 07:18:55.465543489 +0000 UTC m=+1198.217458683" observedRunningTime="2026-03-14 07:18:56.675778163 +0000 UTC m=+1199.427693347" watchObservedRunningTime="2026-03-14 07:18:56.676671657 +0000 UTC m=+1199.428586841" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.699899 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rzk97" podStartSLOduration=6.737711557 podStartE2EDuration="15.699884885s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:42.825453803 +0000 UTC m=+1185.577368987" 
lastFinishedPulling="2026-03-14 07:18:51.787627131 +0000 UTC m=+1194.539542315" observedRunningTime="2026-03-14 07:18:56.695655289 +0000 UTC m=+1199.447570483" watchObservedRunningTime="2026-03-14 07:18:56.699884885 +0000 UTC m=+1199.451800069" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.757979 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6vg2z" podStartSLOduration=3.200560708 podStartE2EDuration="15.757964602s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:42.950474499 +0000 UTC m=+1185.702389683" lastFinishedPulling="2026-03-14 07:18:55.507878393 +0000 UTC m=+1198.259793577" observedRunningTime="2026-03-14 07:18:56.740154992 +0000 UTC m=+1199.492070186" watchObservedRunningTime="2026-03-14 07:18:56.757964602 +0000 UTC m=+1199.509879786" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.759659 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wzdf7" podStartSLOduration=3.703735957 podStartE2EDuration="15.759652728s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:43.452401614 +0000 UTC m=+1186.204316798" lastFinishedPulling="2026-03-14 07:18:55.508318385 +0000 UTC m=+1198.260233569" observedRunningTime="2026-03-14 07:18:56.756985495 +0000 UTC m=+1199.508900689" watchObservedRunningTime="2026-03-14 07:18:56.759652728 +0000 UTC m=+1199.511567902" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.797207 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-n2qxq" podStartSLOduration=3.644925782 podStartE2EDuration="15.7971926s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:43.442729999 +0000 UTC m=+1186.194645183" 
lastFinishedPulling="2026-03-14 07:18:55.594996807 +0000 UTC m=+1198.346912001" observedRunningTime="2026-03-14 07:18:56.787420331 +0000 UTC m=+1199.539335515" watchObservedRunningTime="2026-03-14 07:18:56.7971926 +0000 UTC m=+1199.549107784" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.875313 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-2dmkf" podStartSLOduration=3.796447966 podStartE2EDuration="15.875291636s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:43.429018792 +0000 UTC m=+1186.180933976" lastFinishedPulling="2026-03-14 07:18:55.507862462 +0000 UTC m=+1198.259777646" observedRunningTime="2026-03-14 07:18:56.846727272 +0000 UTC m=+1199.598642446" watchObservedRunningTime="2026-03-14 07:18:56.875291636 +0000 UTC m=+1199.627206820" Mar 14 07:18:56 crc kubenswrapper[5129]: I0314 07:18:56.917319 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7f84474648-lprxs" podStartSLOduration=3.462701472 podStartE2EDuration="15.917304331s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:43.010981962 +0000 UTC m=+1185.762897146" lastFinishedPulling="2026-03-14 07:18:55.465584821 +0000 UTC m=+1198.217500005" observedRunningTime="2026-03-14 07:18:56.888310394 +0000 UTC m=+1199.640225578" watchObservedRunningTime="2026-03-14 07:18:56.917304331 +0000 UTC m=+1199.669219515" Mar 14 07:18:57 crc kubenswrapper[5129]: I0314 07:18:57.500219 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8b67l\" (UID: \"5b7ec223-c34a-45c1-926c-e957e8cd3086\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" Mar 14 07:18:57 
crc kubenswrapper[5129]: E0314 07:18:57.500471 5129 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:57 crc kubenswrapper[5129]: E0314 07:18:57.500526 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert podName:5b7ec223-c34a-45c1-926c-e957e8cd3086 nodeName:}" failed. No retries permitted until 2026-03-14 07:19:13.500507101 +0000 UTC m=+1216.252422285 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert") pod "infra-operator-controller-manager-54dc5b8f8d-8b67l" (UID: "5b7ec223-c34a-45c1-926c-e957e8cd3086") : secret "infra-operator-webhook-server-cert" not found Mar 14 07:18:58 crc kubenswrapper[5129]: I0314 07:18:58.005701 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e789f354-e686-4cc9-a705-3af685a25988-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7744kbgv\" (UID: \"e789f354-e686-4cc9-a705-3af685a25988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" Mar 14 07:18:58 crc kubenswrapper[5129]: I0314 07:18:58.010666 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e789f354-e686-4cc9-a705-3af685a25988-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7744kbgv\" (UID: \"e789f354-e686-4cc9-a705-3af685a25988\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" Mar 14 07:18:58 crc kubenswrapper[5129]: I0314 07:18:58.172175 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-rw4p9" Mar 14 07:18:58 crc kubenswrapper[5129]: 
I0314 07:18:58.180866 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" Mar 14 07:18:58 crc kubenswrapper[5129]: I0314 07:18:58.311089 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:58 crc kubenswrapper[5129]: I0314 07:18:58.312429 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:58 crc kubenswrapper[5129]: I0314 07:18:58.330504 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:58 crc kubenswrapper[5129]: I0314 07:18:58.336214 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4484eedf-8b6d-45a2-af19-09ada3258a22-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-jwxjp\" (UID: \"4484eedf-8b6d-45a2-af19-09ada3258a22\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:58 crc kubenswrapper[5129]: 
I0314 07:18:58.366364 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-mfbhx" Mar 14 07:18:58 crc kubenswrapper[5129]: I0314 07:18:58.374896 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:58 crc kubenswrapper[5129]: I0314 07:18:58.658156 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8dzz" podStartSLOduration=7.101804405 podStartE2EDuration="17.658139068s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:42.636549281 +0000 UTC m=+1185.388464465" lastFinishedPulling="2026-03-14 07:18:53.192883944 +0000 UTC m=+1195.944799128" observedRunningTime="2026-03-14 07:18:56.925515087 +0000 UTC m=+1199.677430281" watchObservedRunningTime="2026-03-14 07:18:58.658139068 +0000 UTC m=+1201.410054252" Mar 14 07:18:58 crc kubenswrapper[5129]: I0314 07:18:58.662488 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv"] Mar 14 07:18:58 crc kubenswrapper[5129]: W0314 07:18:58.663030 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode789f354_e686_4cc9_a705_3af685a25988.slice/crio-129875d41ba0bd5544082efbb81e1200b949fc7300c541fb5d6cceed53a3d056 WatchSource:0}: Error finding container 129875d41ba0bd5544082efbb81e1200b949fc7300c541fb5d6cceed53a3d056: Status 404 returned error can't find the container with id 129875d41ba0bd5544082efbb81e1200b949fc7300c541fb5d6cceed53a3d056 Mar 14 07:18:58 crc kubenswrapper[5129]: I0314 07:18:58.809661 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp"] Mar 
14 07:18:58 crc kubenswrapper[5129]: W0314 07:18:58.821746 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4484eedf_8b6d_45a2_af19_09ada3258a22.slice/crio-74e1dbdee0d6287a016c3d7ee4b650d51ddd0f53fbc599668a7e20bb5762a65d WatchSource:0}: Error finding container 74e1dbdee0d6287a016c3d7ee4b650d51ddd0f53fbc599668a7e20bb5762a65d: Status 404 returned error can't find the container with id 74e1dbdee0d6287a016c3d7ee4b650d51ddd0f53fbc599668a7e20bb5762a65d Mar 14 07:18:59 crc kubenswrapper[5129]: I0314 07:18:59.305116 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" event={"ID":"e789f354-e686-4cc9-a705-3af685a25988","Type":"ContainerStarted","Data":"129875d41ba0bd5544082efbb81e1200b949fc7300c541fb5d6cceed53a3d056"} Mar 14 07:18:59 crc kubenswrapper[5129]: I0314 07:18:59.307117 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" event={"ID":"4484eedf-8b6d-45a2-af19-09ada3258a22","Type":"ContainerStarted","Data":"2cee6d852a19f16a01683b97bcb8043337362b7c0e284d669c4542fb6f026c24"} Mar 14 07:18:59 crc kubenswrapper[5129]: I0314 07:18:59.307154 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" event={"ID":"4484eedf-8b6d-45a2-af19-09ada3258a22","Type":"ContainerStarted","Data":"74e1dbdee0d6287a016c3d7ee4b650d51ddd0f53fbc599668a7e20bb5762a65d"} Mar 14 07:18:59 crc kubenswrapper[5129]: I0314 07:18:59.307308 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:18:59 crc kubenswrapper[5129]: I0314 07:18:59.342763 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" podStartSLOduration=17.342743205 podStartE2EDuration="17.342743205s" podCreationTimestamp="2026-03-14 07:18:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:18:59.332157074 +0000 UTC m=+1202.084072258" watchObservedRunningTime="2026-03-14 07:18:59.342743205 +0000 UTC m=+1202.094658389" Mar 14 07:19:01 crc kubenswrapper[5129]: I0314 07:19:01.784809 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8dzz" Mar 14 07:19:01 crc kubenswrapper[5129]: I0314 07:19:01.822688 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-6vg2z" Mar 14 07:19:01 crc kubenswrapper[5129]: I0314 07:19:01.826867 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-rkcbz" Mar 14 07:19:01 crc kubenswrapper[5129]: I0314 07:19:01.886025 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rzk97" Mar 14 07:19:02 crc kubenswrapper[5129]: I0314 07:19:02.017666 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-cwhxs" Mar 14 07:19:02 crc kubenswrapper[5129]: I0314 07:19:02.076033 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-fb6px" Mar 14 07:19:02 crc kubenswrapper[5129]: I0314 07:19:02.107433 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-kvm84" Mar 14 07:19:02 crc 
kubenswrapper[5129]: I0314 07:19:02.134289 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7f84474648-lprxs" Mar 14 07:19:02 crc kubenswrapper[5129]: I0314 07:19:02.176473 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gmxp8" Mar 14 07:19:02 crc kubenswrapper[5129]: I0314 07:19:02.209203 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fjnr7" Mar 14 07:19:02 crc kubenswrapper[5129]: I0314 07:19:02.245220 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-2dmkf" Mar 14 07:19:02 crc kubenswrapper[5129]: I0314 07:19:02.272956 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-wzdf7" Mar 14 07:19:02 crc kubenswrapper[5129]: I0314 07:19:02.281770 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-n2qxq" Mar 14 07:19:02 crc kubenswrapper[5129]: I0314 07:19:02.651339 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-kpb92" Mar 14 07:19:06 crc kubenswrapper[5129]: I0314 07:19:06.358640 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nmvb4" event={"ID":"1d8c0991-2d49-42c4-bed5-62c86ef72f24","Type":"ContainerStarted","Data":"494164e71380a43775b4364420a993fc8d92810c5d26bd487a88ca23655d0b50"} Mar 14 07:19:06 crc kubenswrapper[5129]: I0314 07:19:06.359285 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nmvb4" Mar 14 07:19:06 crc kubenswrapper[5129]: I0314 07:19:06.360326 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-4tm5b" event={"ID":"3a4f21bd-d487-4af5-b741-4971cf4b11d1","Type":"ContainerStarted","Data":"d32c7555b67b1bf8ef557a17ff55e4d5b2105324e75e01eb1a8de7945c8d9226"} Mar 14 07:19:06 crc kubenswrapper[5129]: I0314 07:19:06.360440 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-4tm5b" Mar 14 07:19:06 crc kubenswrapper[5129]: I0314 07:19:06.361624 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h5zmn" event={"ID":"1bf2bb51-4fd1-4d88-b663-3d41e4236ecd","Type":"ContainerStarted","Data":"0066286e1f9d84487e0ef42f236c329200d56824a415d51a7a88fdbd98a556c4"} Mar 14 07:19:06 crc kubenswrapper[5129]: I0314 07:19:06.361788 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h5zmn" Mar 14 07:19:06 crc kubenswrapper[5129]: I0314 07:19:06.364074 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" event={"ID":"e789f354-e686-4cc9-a705-3af685a25988","Type":"ContainerStarted","Data":"e0db7c78bcc805badc5ea2827e6530d80adc203d2a1747929834e096bdfb1924"} Mar 14 07:19:06 crc kubenswrapper[5129]: I0314 07:19:06.364211 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" Mar 14 07:19:06 crc kubenswrapper[5129]: I0314 07:19:06.365443 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-58xnz" 
event={"ID":"b1d9e89a-c41d-44a3-8fae-5a6b69f0f99f","Type":"ContainerStarted","Data":"d7f7c4e195ca8994e0c671153b014be168f788acd5abadffdeeeb8440f11b57f"} Mar 14 07:19:06 crc kubenswrapper[5129]: I0314 07:19:06.365637 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-58xnz" Mar 14 07:19:06 crc kubenswrapper[5129]: I0314 07:19:06.366848 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-9z48t" event={"ID":"e21e58a4-940c-4131-9d23-645393687367","Type":"ContainerStarted","Data":"48304b972562e4b60cfa9db26d0be0a08d0909ab4250ef56285b26303c95d26d"} Mar 14 07:19:06 crc kubenswrapper[5129]: I0314 07:19:06.367039 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-9z48t" Mar 14 07:19:06 crc kubenswrapper[5129]: I0314 07:19:06.378557 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nmvb4" podStartSLOduration=3.549571211 podStartE2EDuration="25.378537734s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:43.559764516 +0000 UTC m=+1186.311679700" lastFinishedPulling="2026-03-14 07:19:05.388731009 +0000 UTC m=+1208.140646223" observedRunningTime="2026-03-14 07:19:06.377885336 +0000 UTC m=+1209.129800520" watchObservedRunningTime="2026-03-14 07:19:06.378537734 +0000 UTC m=+1209.130452918" Mar 14 07:19:06 crc kubenswrapper[5129]: I0314 07:19:06.406277 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" podStartSLOduration=18.682135324 podStartE2EDuration="25.406261366s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:58.664978477 +0000 UTC m=+1201.416893661" 
lastFinishedPulling="2026-03-14 07:19:05.389104479 +0000 UTC m=+1208.141019703" observedRunningTime="2026-03-14 07:19:06.404458746 +0000 UTC m=+1209.156373940" watchObservedRunningTime="2026-03-14 07:19:06.406261366 +0000 UTC m=+1209.158176550" Mar 14 07:19:06 crc kubenswrapper[5129]: I0314 07:19:06.425781 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h5zmn" podStartSLOduration=3.598499575 podStartE2EDuration="25.425762392s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:43.586758917 +0000 UTC m=+1186.338674101" lastFinishedPulling="2026-03-14 07:19:05.414021734 +0000 UTC m=+1208.165936918" observedRunningTime="2026-03-14 07:19:06.425263819 +0000 UTC m=+1209.177179013" watchObservedRunningTime="2026-03-14 07:19:06.425762392 +0000 UTC m=+1209.177677586" Mar 14 07:19:06 crc kubenswrapper[5129]: I0314 07:19:06.483732 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-58xnz" podStartSLOduration=2.665866737 podStartE2EDuration="24.483713605s" podCreationTimestamp="2026-03-14 07:18:42 +0000 UTC" firstStartedPulling="2026-03-14 07:18:43.575901259 +0000 UTC m=+1186.327816443" lastFinishedPulling="2026-03-14 07:19:05.393748127 +0000 UTC m=+1208.145663311" observedRunningTime="2026-03-14 07:19:06.479203651 +0000 UTC m=+1209.231118835" watchObservedRunningTime="2026-03-14 07:19:06.483713605 +0000 UTC m=+1209.235628789" Mar 14 07:19:06 crc kubenswrapper[5129]: I0314 07:19:06.484565 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-4tm5b" podStartSLOduration=6.526529543 podStartE2EDuration="25.484557918s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:43.554373038 +0000 UTC m=+1186.306288222" 
lastFinishedPulling="2026-03-14 07:19:02.512401413 +0000 UTC m=+1205.264316597" observedRunningTime="2026-03-14 07:19:06.456433485 +0000 UTC m=+1209.208348669" watchObservedRunningTime="2026-03-14 07:19:06.484557918 +0000 UTC m=+1209.236473102" Mar 14 07:19:06 crc kubenswrapper[5129]: I0314 07:19:06.501698 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-9z48t" podStartSLOduration=3.658406691 podStartE2EDuration="25.501679399s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:18:43.54503636 +0000 UTC m=+1186.296951544" lastFinishedPulling="2026-03-14 07:19:05.388309048 +0000 UTC m=+1208.140224252" observedRunningTime="2026-03-14 07:19:06.495357585 +0000 UTC m=+1209.247272769" watchObservedRunningTime="2026-03-14 07:19:06.501679399 +0000 UTC m=+1209.253594583" Mar 14 07:19:07 crc kubenswrapper[5129]: I0314 07:19:07.376936 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xgnfq" event={"ID":"0be8f03b-22f7-421f-9da8-a3653e323613","Type":"ContainerStarted","Data":"4e62e7593a7dd05736af25a36c4a6f2cb80b01c5d64df06d9928fde3999b5acd"} Mar 14 07:19:07 crc kubenswrapper[5129]: I0314 07:19:07.397658 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xgnfq" podStartSLOduration=2.708291783 podStartE2EDuration="25.397587373s" podCreationTimestamp="2026-03-14 07:18:42 +0000 UTC" firstStartedPulling="2026-03-14 07:18:43.573962806 +0000 UTC m=+1186.325878000" lastFinishedPulling="2026-03-14 07:19:06.263258406 +0000 UTC m=+1209.015173590" observedRunningTime="2026-03-14 07:19:07.390452276 +0000 UTC m=+1210.142367480" watchObservedRunningTime="2026-03-14 07:19:07.397587373 +0000 UTC m=+1210.149502567" Mar 14 07:19:08 crc kubenswrapper[5129]: I0314 07:19:08.383038 5129 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-jwxjp" Mar 14 07:19:12 crc kubenswrapper[5129]: I0314 07:19:12.593694 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-9z48t" Mar 14 07:19:12 crc kubenswrapper[5129]: I0314 07:19:12.623458 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-h5zmn" Mar 14 07:19:12 crc kubenswrapper[5129]: I0314 07:19:12.635105 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-4tm5b" Mar 14 07:19:12 crc kubenswrapper[5129]: I0314 07:19:12.723208 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nmvb4" Mar 14 07:19:12 crc kubenswrapper[5129]: I0314 07:19:12.727262 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-58xnz" Mar 14 07:19:13 crc kubenswrapper[5129]: I0314 07:19:13.527576 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8b67l\" (UID: \"5b7ec223-c34a-45c1-926c-e957e8cd3086\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" Mar 14 07:19:13 crc kubenswrapper[5129]: I0314 07:19:13.534317 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b7ec223-c34a-45c1-926c-e957e8cd3086-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8b67l\" (UID: \"5b7ec223-c34a-45c1-926c-e957e8cd3086\") " 
pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" Mar 14 07:19:13 crc kubenswrapper[5129]: I0314 07:19:13.738828 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-b82fx" Mar 14 07:19:13 crc kubenswrapper[5129]: I0314 07:19:13.747919 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" Mar 14 07:19:14 crc kubenswrapper[5129]: I0314 07:19:14.248903 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l"] Mar 14 07:19:14 crc kubenswrapper[5129]: I0314 07:19:14.257297 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:19:14 crc kubenswrapper[5129]: I0314 07:19:14.441765 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" event={"ID":"5b7ec223-c34a-45c1-926c-e957e8cd3086","Type":"ContainerStarted","Data":"d8b94e88788593f5d595d00e79620f6c52cafd516a363265a1925a556bd8f8ef"} Mar 14 07:19:26 crc kubenswrapper[5129]: I0314 07:19:18.186115 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" Mar 14 07:19:28 crc kubenswrapper[5129]: I0314 07:19:28.547384 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" event={"ID":"5b7ec223-c34a-45c1-926c-e957e8cd3086","Type":"ContainerStarted","Data":"93cf4eb55a11af5ac1e086ba7fef706e77d73debe66eb5bd7038448fa6c466fb"} Mar 14 07:19:29 crc kubenswrapper[5129]: I0314 07:19:29.556652 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" Mar 14 
07:19:33 crc kubenswrapper[5129]: I0314 07:19:33.753141 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" Mar 14 07:19:33 crc kubenswrapper[5129]: I0314 07:19:33.776272 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8b67l" podStartSLOduration=38.709472844 podStartE2EDuration="52.776256412s" podCreationTimestamp="2026-03-14 07:18:41 +0000 UTC" firstStartedPulling="2026-03-14 07:19:14.257106708 +0000 UTC m=+1217.009021882" lastFinishedPulling="2026-03-14 07:19:28.323890256 +0000 UTC m=+1231.075805450" observedRunningTime="2026-03-14 07:19:29.586709104 +0000 UTC m=+1232.338624338" watchObservedRunningTime="2026-03-14 07:19:33.776256412 +0000 UTC m=+1236.528171596" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.450314 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-kddtf"] Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.452420 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-kddtf" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.458315 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-trlg4" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.458501 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.458651 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.458957 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.461288 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-kddtf"] Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.516189 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64696987c5-d5jt6"] Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.517873 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-d5jt6" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.520618 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.533473 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-d5jt6"] Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.633321 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jws7c\" (UniqueName: \"kubernetes.io/projected/665de518-5068-49e6-a169-4fa6185f0712-kube-api-access-jws7c\") pod \"dnsmasq-dns-5448ff6dc7-kddtf\" (UID: \"665de518-5068-49e6-a169-4fa6185f0712\") " pod="openstack/dnsmasq-dns-5448ff6dc7-kddtf" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.633385 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxfgk\" (UniqueName: \"kubernetes.io/projected/8e7e5c36-ac27-46ce-b7b1-5a540d94236b-kube-api-access-zxfgk\") pod \"dnsmasq-dns-64696987c5-d5jt6\" (UID: \"8e7e5c36-ac27-46ce-b7b1-5a540d94236b\") " pod="openstack/dnsmasq-dns-64696987c5-d5jt6" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.633413 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/665de518-5068-49e6-a169-4fa6185f0712-config\") pod \"dnsmasq-dns-5448ff6dc7-kddtf\" (UID: \"665de518-5068-49e6-a169-4fa6185f0712\") " pod="openstack/dnsmasq-dns-5448ff6dc7-kddtf" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.633450 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e7e5c36-ac27-46ce-b7b1-5a540d94236b-dns-svc\") pod \"dnsmasq-dns-64696987c5-d5jt6\" (UID: \"8e7e5c36-ac27-46ce-b7b1-5a540d94236b\") " 
pod="openstack/dnsmasq-dns-64696987c5-d5jt6" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.633473 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e7e5c36-ac27-46ce-b7b1-5a540d94236b-config\") pod \"dnsmasq-dns-64696987c5-d5jt6\" (UID: \"8e7e5c36-ac27-46ce-b7b1-5a540d94236b\") " pod="openstack/dnsmasq-dns-64696987c5-d5jt6" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.734834 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/665de518-5068-49e6-a169-4fa6185f0712-config\") pod \"dnsmasq-dns-5448ff6dc7-kddtf\" (UID: \"665de518-5068-49e6-a169-4fa6185f0712\") " pod="openstack/dnsmasq-dns-5448ff6dc7-kddtf" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.734900 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e7e5c36-ac27-46ce-b7b1-5a540d94236b-dns-svc\") pod \"dnsmasq-dns-64696987c5-d5jt6\" (UID: \"8e7e5c36-ac27-46ce-b7b1-5a540d94236b\") " pod="openstack/dnsmasq-dns-64696987c5-d5jt6" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.734927 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e7e5c36-ac27-46ce-b7b1-5a540d94236b-config\") pod \"dnsmasq-dns-64696987c5-d5jt6\" (UID: \"8e7e5c36-ac27-46ce-b7b1-5a540d94236b\") " pod="openstack/dnsmasq-dns-64696987c5-d5jt6" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.734989 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jws7c\" (UniqueName: \"kubernetes.io/projected/665de518-5068-49e6-a169-4fa6185f0712-kube-api-access-jws7c\") pod \"dnsmasq-dns-5448ff6dc7-kddtf\" (UID: \"665de518-5068-49e6-a169-4fa6185f0712\") " pod="openstack/dnsmasq-dns-5448ff6dc7-kddtf" Mar 14 07:19:48 crc 
kubenswrapper[5129]: I0314 07:19:48.735012 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxfgk\" (UniqueName: \"kubernetes.io/projected/8e7e5c36-ac27-46ce-b7b1-5a540d94236b-kube-api-access-zxfgk\") pod \"dnsmasq-dns-64696987c5-d5jt6\" (UID: \"8e7e5c36-ac27-46ce-b7b1-5a540d94236b\") " pod="openstack/dnsmasq-dns-64696987c5-d5jt6" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.736244 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/665de518-5068-49e6-a169-4fa6185f0712-config\") pod \"dnsmasq-dns-5448ff6dc7-kddtf\" (UID: \"665de518-5068-49e6-a169-4fa6185f0712\") " pod="openstack/dnsmasq-dns-5448ff6dc7-kddtf" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.736709 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e7e5c36-ac27-46ce-b7b1-5a540d94236b-dns-svc\") pod \"dnsmasq-dns-64696987c5-d5jt6\" (UID: \"8e7e5c36-ac27-46ce-b7b1-5a540d94236b\") " pod="openstack/dnsmasq-dns-64696987c5-d5jt6" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.737114 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e7e5c36-ac27-46ce-b7b1-5a540d94236b-config\") pod \"dnsmasq-dns-64696987c5-d5jt6\" (UID: \"8e7e5c36-ac27-46ce-b7b1-5a540d94236b\") " pod="openstack/dnsmasq-dns-64696987c5-d5jt6" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.754174 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jws7c\" (UniqueName: \"kubernetes.io/projected/665de518-5068-49e6-a169-4fa6185f0712-kube-api-access-jws7c\") pod \"dnsmasq-dns-5448ff6dc7-kddtf\" (UID: \"665de518-5068-49e6-a169-4fa6185f0712\") " pod="openstack/dnsmasq-dns-5448ff6dc7-kddtf" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.754575 5129 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-zxfgk\" (UniqueName: \"kubernetes.io/projected/8e7e5c36-ac27-46ce-b7b1-5a540d94236b-kube-api-access-zxfgk\") pod \"dnsmasq-dns-64696987c5-d5jt6\" (UID: \"8e7e5c36-ac27-46ce-b7b1-5a540d94236b\") " pod="openstack/dnsmasq-dns-64696987c5-d5jt6" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.775626 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-kddtf" Mar 14 07:19:48 crc kubenswrapper[5129]: I0314 07:19:48.835978 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-d5jt6" Mar 14 07:19:49 crc kubenswrapper[5129]: I0314 07:19:49.205101 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-kddtf"] Mar 14 07:19:49 crc kubenswrapper[5129]: W0314 07:19:49.206981 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod665de518_5068_49e6_a169_4fa6185f0712.slice/crio-6d4e9168ed9a3344eb9bd790ede789a0f36f7f73e498f0b5fe0284b6c982e9fc WatchSource:0}: Error finding container 6d4e9168ed9a3344eb9bd790ede789a0f36f7f73e498f0b5fe0284b6c982e9fc: Status 404 returned error can't find the container with id 6d4e9168ed9a3344eb9bd790ede789a0f36f7f73e498f0b5fe0284b6c982e9fc Mar 14 07:19:49 crc kubenswrapper[5129]: I0314 07:19:49.285840 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-d5jt6"] Mar 14 07:19:49 crc kubenswrapper[5129]: W0314 07:19:49.294801 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e7e5c36_ac27_46ce_b7b1_5a540d94236b.slice/crio-5a96323f3cddfffccc4bdfb8ec593199e92b67a96449fa8e4397cda40e14c3c2 WatchSource:0}: Error finding container 5a96323f3cddfffccc4bdfb8ec593199e92b67a96449fa8e4397cda40e14c3c2: Status 404 returned error can't find the container with id 
5a96323f3cddfffccc4bdfb8ec593199e92b67a96449fa8e4397cda40e14c3c2 Mar 14 07:19:49 crc kubenswrapper[5129]: I0314 07:19:49.701660 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-d5jt6" event={"ID":"8e7e5c36-ac27-46ce-b7b1-5a540d94236b","Type":"ContainerStarted","Data":"5a96323f3cddfffccc4bdfb8ec593199e92b67a96449fa8e4397cda40e14c3c2"} Mar 14 07:19:49 crc kubenswrapper[5129]: I0314 07:19:49.703119 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-kddtf" event={"ID":"665de518-5068-49e6-a169-4fa6185f0712","Type":"ContainerStarted","Data":"6d4e9168ed9a3344eb9bd790ede789a0f36f7f73e498f0b5fe0284b6c982e9fc"} Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.052690 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-kddtf"] Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.083320 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-gqmzs"] Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.084858 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.102248 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-gqmzs"] Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.181257 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj6dl\" (UniqueName: \"kubernetes.io/projected/fa25a41c-461a-4e47-b408-37f7a30eef64-kube-api-access-nj6dl\") pod \"dnsmasq-dns-658f55c9f5-gqmzs\" (UID: \"fa25a41c-461a-4e47-b408-37f7a30eef64\") " pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.181505 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa25a41c-461a-4e47-b408-37f7a30eef64-config\") pod \"dnsmasq-dns-658f55c9f5-gqmzs\" (UID: \"fa25a41c-461a-4e47-b408-37f7a30eef64\") " pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.181535 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa25a41c-461a-4e47-b408-37f7a30eef64-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-gqmzs\" (UID: \"fa25a41c-461a-4e47-b408-37f7a30eef64\") " pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.282468 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj6dl\" (UniqueName: \"kubernetes.io/projected/fa25a41c-461a-4e47-b408-37f7a30eef64-kube-api-access-nj6dl\") pod \"dnsmasq-dns-658f55c9f5-gqmzs\" (UID: \"fa25a41c-461a-4e47-b408-37f7a30eef64\") " pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.282530 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/fa25a41c-461a-4e47-b408-37f7a30eef64-config\") pod \"dnsmasq-dns-658f55c9f5-gqmzs\" (UID: \"fa25a41c-461a-4e47-b408-37f7a30eef64\") " pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.282558 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa25a41c-461a-4e47-b408-37f7a30eef64-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-gqmzs\" (UID: \"fa25a41c-461a-4e47-b408-37f7a30eef64\") " pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.283766 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa25a41c-461a-4e47-b408-37f7a30eef64-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-gqmzs\" (UID: \"fa25a41c-461a-4e47-b408-37f7a30eef64\") " pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.287635 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa25a41c-461a-4e47-b408-37f7a30eef64-config\") pod \"dnsmasq-dns-658f55c9f5-gqmzs\" (UID: \"fa25a41c-461a-4e47-b408-37f7a30eef64\") " pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.319187 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj6dl\" (UniqueName: \"kubernetes.io/projected/fa25a41c-461a-4e47-b408-37f7a30eef64-kube-api-access-nj6dl\") pod \"dnsmasq-dns-658f55c9f5-gqmzs\" (UID: \"fa25a41c-461a-4e47-b408-37f7a30eef64\") " pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.374864 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-d5jt6"] Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.414683 5129 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-54b5dffb47-d2ct6"] Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.415887 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.435430 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-d2ct6"] Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.469339 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.586851 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-d2ct6\" (UID: \"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd\") " pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.586945 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd-config\") pod \"dnsmasq-dns-54b5dffb47-d2ct6\" (UID: \"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd\") " pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.586977 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddvgt\" (UniqueName: \"kubernetes.io/projected/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd-kube-api-access-ddvgt\") pod \"dnsmasq-dns-54b5dffb47-d2ct6\" (UID: \"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd\") " pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.688469 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-d2ct6\" (UID: \"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd\") " pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.688783 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd-config\") pod \"dnsmasq-dns-54b5dffb47-d2ct6\" (UID: \"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd\") " pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.688811 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddvgt\" (UniqueName: \"kubernetes.io/projected/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd-kube-api-access-ddvgt\") pod \"dnsmasq-dns-54b5dffb47-d2ct6\" (UID: \"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd\") " pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.689542 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd-config\") pod \"dnsmasq-dns-54b5dffb47-d2ct6\" (UID: \"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd\") " pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.689543 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-d2ct6\" (UID: \"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd\") " pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.725431 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddvgt\" (UniqueName: \"kubernetes.io/projected/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd-kube-api-access-ddvgt\") pod 
\"dnsmasq-dns-54b5dffb47-d2ct6\" (UID: \"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd\") " pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" Mar 14 07:19:51 crc kubenswrapper[5129]: I0314 07:19:51.765329 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.030736 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-d2ct6"] Mar 14 07:19:52 crc kubenswrapper[5129]: W0314 07:19:52.032132 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab4cf8c8_94d5_4e58_b268_a5ba23eaa4cd.slice/crio-ddad2764100808dc9a248b0272f875d8b7e0ef9b8709fe2756dcde9d2c494abf WatchSource:0}: Error finding container ddad2764100808dc9a248b0272f875d8b7e0ef9b8709fe2756dcde9d2c494abf: Status 404 returned error can't find the container with id ddad2764100808dc9a248b0272f875d8b7e0ef9b8709fe2756dcde9d2c494abf Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.054246 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-gqmzs"] Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.270069 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.277536 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.280989 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.281154 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.281673 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.282566 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.282811 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.283092 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-626g5" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.285415 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.291316 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.402969 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.403018 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.403047 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6261e6b-f331-4dcb-8380-167e8f547e1b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.403065 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7zg6\" (UniqueName: \"kubernetes.io/projected/f6261e6b-f331-4dcb-8380-167e8f547e1b-kube-api-access-m7zg6\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.403091 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.403112 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6261e6b-f331-4dcb-8380-167e8f547e1b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.403140 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.403176 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.403255 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.403284 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.403326 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.504765 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-m7zg6\" (UniqueName: \"kubernetes.io/projected/f6261e6b-f331-4dcb-8380-167e8f547e1b-kube-api-access-m7zg6\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.504838 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.504872 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6261e6b-f331-4dcb-8380-167e8f547e1b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.504910 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.504931 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.504989 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.505017 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.505047 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.505080 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.505112 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.505136 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6261e6b-f331-4dcb-8380-167e8f547e1b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.505831 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.506406 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.507302 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.507485 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.509190 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.509430 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.514314 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6261e6b-f331-4dcb-8380-167e8f547e1b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.515192 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.516676 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.521332 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6261e6b-f331-4dcb-8380-167e8f547e1b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.522814 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7zg6\" (UniqueName: 
\"kubernetes.io/projected/f6261e6b-f331-4dcb-8380-167e8f547e1b-kube-api-access-m7zg6\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.531339 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.578572 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.579871 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.585228 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.585362 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.585478 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.585481 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.585594 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-v5pfk" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.585934 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.586221 5129 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.605134 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.657198 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.707910 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.707976 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.708021 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.708048 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 
07:19:52.708085 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.708103 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.708135 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.708178 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndhbj\" (UniqueName: \"kubernetes.io/projected/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-kube-api-access-ndhbj\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.708194 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.708242 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-config-data\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.708672 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.762361 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" event={"ID":"fa25a41c-461a-4e47-b408-37f7a30eef64","Type":"ContainerStarted","Data":"a0adae411e2f9f7bc6c5779fae4a6acd805d52fdbdb8756543a5f5dfeb13aee0"} Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.763694 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" event={"ID":"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd","Type":"ContainerStarted","Data":"ddad2764100808dc9a248b0272f875d8b7e0ef9b8709fe2756dcde9d2c494abf"} Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.810309 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.810355 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " 
pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.810375 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.810414 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.810450 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndhbj\" (UniqueName: \"kubernetes.io/projected/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-kube-api-access-ndhbj\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.810472 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.810635 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-config-data\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.810690 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.810714 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.810745 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.810785 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.813908 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.821884 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: 
\"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.822378 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.822981 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-config-data\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.823761 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.824007 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.824929 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.825167 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.825884 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.830630 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.856228 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndhbj\" (UniqueName: \"kubernetes.io/projected/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-kube-api-access-ndhbj\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.864668 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " pod="openstack/rabbitmq-server-0" Mar 14 07:19:52 crc kubenswrapper[5129]: I0314 07:19:52.913856 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 07:19:53 crc kubenswrapper[5129]: I0314 07:19:53.920807 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 14 07:19:53 crc kubenswrapper[5129]: I0314 07:19:53.923095 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 14 07:19:53 crc kubenswrapper[5129]: I0314 07:19:53.928281 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-gq57p" Mar 14 07:19:53 crc kubenswrapper[5129]: I0314 07:19:53.928517 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 14 07:19:53 crc kubenswrapper[5129]: I0314 07:19:53.928855 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 14 07:19:53 crc kubenswrapper[5129]: I0314 07:19:53.929632 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 14 07:19:53 crc kubenswrapper[5129]: I0314 07:19:53.929696 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 14 07:19:53 crc kubenswrapper[5129]: I0314 07:19:53.939140 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.032379 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eca83e14-f023-4dbd-b646-c9fc5a9e177e-kolla-config\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.032425 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbr7t\" (UniqueName: 
\"kubernetes.io/projected/eca83e14-f023-4dbd-b646-c9fc5a9e177e-kube-api-access-mbr7t\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.032456 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca83e14-f023-4dbd-b646-c9fc5a9e177e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.032571 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca83e14-f023-4dbd-b646-c9fc5a9e177e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.032593 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca83e14-f023-4dbd-b646-c9fc5a9e177e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.032648 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.032669 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/eca83e14-f023-4dbd-b646-c9fc5a9e177e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.032695 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/eca83e14-f023-4dbd-b646-c9fc5a9e177e-config-data-default\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.134676 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/eca83e14-f023-4dbd-b646-c9fc5a9e177e-config-data-default\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.134750 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eca83e14-f023-4dbd-b646-c9fc5a9e177e-kolla-config\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.134773 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbr7t\" (UniqueName: \"kubernetes.io/projected/eca83e14-f023-4dbd-b646-c9fc5a9e177e-kube-api-access-mbr7t\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.135059 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca83e14-f023-4dbd-b646-c9fc5a9e177e-galera-tls-certs\") pod 
\"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.135128 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca83e14-f023-4dbd-b646-c9fc5a9e177e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.135154 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca83e14-f023-4dbd-b646-c9fc5a9e177e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.135210 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.135241 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/eca83e14-f023-4dbd-b646-c9fc5a9e177e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.135539 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/eca83e14-f023-4dbd-b646-c9fc5a9e177e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: 
I0314 07:19:54.135689 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/eca83e14-f023-4dbd-b646-c9fc5a9e177e-config-data-default\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.135767 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eca83e14-f023-4dbd-b646-c9fc5a9e177e-kolla-config\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.135946 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.137101 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca83e14-f023-4dbd-b646-c9fc5a9e177e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.141433 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca83e14-f023-4dbd-b646-c9fc5a9e177e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.157440 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eca83e14-f023-4dbd-b646-c9fc5a9e177e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.166326 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.189150 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbr7t\" (UniqueName: \"kubernetes.io/projected/eca83e14-f023-4dbd-b646-c9fc5a9e177e-kube-api-access-mbr7t\") pod \"openstack-galera-0\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " pod="openstack/openstack-galera-0" Mar 14 07:19:54 crc kubenswrapper[5129]: I0314 07:19:54.254420 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.277892 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.279761 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.282673 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.282979 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.283274 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-nvvs9" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.283432 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.302855 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.352470 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54d01ce1-11e5-4fc7-a120-44d9d3407142-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.352518 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/54d01ce1-11e5-4fc7-a120-44d9d3407142-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.352551 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4mph\" (UniqueName: 
\"kubernetes.io/projected/54d01ce1-11e5-4fc7-a120-44d9d3407142-kube-api-access-k4mph\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.352784 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/54d01ce1-11e5-4fc7-a120-44d9d3407142-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.352858 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.353003 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/54d01ce1-11e5-4fc7-a120-44d9d3407142-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.353099 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d01ce1-11e5-4fc7-a120-44d9d3407142-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.353130 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/54d01ce1-11e5-4fc7-a120-44d9d3407142-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.458060 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.458210 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/54d01ce1-11e5-4fc7-a120-44d9d3407142-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.458293 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d01ce1-11e5-4fc7-a120-44d9d3407142-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.458318 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/54d01ce1-11e5-4fc7-a120-44d9d3407142-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.458381 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54d01ce1-11e5-4fc7-a120-44d9d3407142-combined-ca-bundle\") 
pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.458440 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/54d01ce1-11e5-4fc7-a120-44d9d3407142-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.458691 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4mph\" (UniqueName: \"kubernetes.io/projected/54d01ce1-11e5-4fc7-a120-44d9d3407142-kube-api-access-k4mph\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.458767 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/54d01ce1-11e5-4fc7-a120-44d9d3407142-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.459291 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/54d01ce1-11e5-4fc7-a120-44d9d3407142-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.459706 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") device 
mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.462315 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/54d01ce1-11e5-4fc7-a120-44d9d3407142-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.463050 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d01ce1-11e5-4fc7-a120-44d9d3407142-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.465454 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/54d01ce1-11e5-4fc7-a120-44d9d3407142-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.467927 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/54d01ce1-11e5-4fc7-a120-44d9d3407142-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.474307 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54d01ce1-11e5-4fc7-a120-44d9d3407142-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: 
I0314 07:19:55.488281 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4mph\" (UniqueName: \"kubernetes.io/projected/54d01ce1-11e5-4fc7-a120-44d9d3407142-kube-api-access-k4mph\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.491784 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.600028 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.762960 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.764038 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.770509 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.777694 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-n2m52" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.778007 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.778178 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.864962 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/659ee685-6b83-4af2-bd2e-e5ce9372e408-config-data\") pod \"memcached-0\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " pod="openstack/memcached-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.865001 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl5dm\" (UniqueName: \"kubernetes.io/projected/659ee685-6b83-4af2-bd2e-e5ce9372e408-kube-api-access-wl5dm\") pod \"memcached-0\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " pod="openstack/memcached-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.865032 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/659ee685-6b83-4af2-bd2e-e5ce9372e408-kolla-config\") pod \"memcached-0\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " pod="openstack/memcached-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.865064 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659ee685-6b83-4af2-bd2e-e5ce9372e408-combined-ca-bundle\") pod \"memcached-0\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " pod="openstack/memcached-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.865196 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/659ee685-6b83-4af2-bd2e-e5ce9372e408-memcached-tls-certs\") pod \"memcached-0\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " pod="openstack/memcached-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.967402 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/659ee685-6b83-4af2-bd2e-e5ce9372e408-config-data\") pod \"memcached-0\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " pod="openstack/memcached-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.967441 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl5dm\" (UniqueName: \"kubernetes.io/projected/659ee685-6b83-4af2-bd2e-e5ce9372e408-kube-api-access-wl5dm\") pod \"memcached-0\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " pod="openstack/memcached-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.967471 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/659ee685-6b83-4af2-bd2e-e5ce9372e408-kolla-config\") pod \"memcached-0\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " pod="openstack/memcached-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.967503 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659ee685-6b83-4af2-bd2e-e5ce9372e408-combined-ca-bundle\") pod \"memcached-0\" (UID: 
\"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " pod="openstack/memcached-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.967540 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/659ee685-6b83-4af2-bd2e-e5ce9372e408-memcached-tls-certs\") pod \"memcached-0\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " pod="openstack/memcached-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.968260 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/659ee685-6b83-4af2-bd2e-e5ce9372e408-config-data\") pod \"memcached-0\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " pod="openstack/memcached-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.968577 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/659ee685-6b83-4af2-bd2e-e5ce9372e408-kolla-config\") pod \"memcached-0\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " pod="openstack/memcached-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.973233 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659ee685-6b83-4af2-bd2e-e5ce9372e408-combined-ca-bundle\") pod \"memcached-0\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " pod="openstack/memcached-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.980122 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/659ee685-6b83-4af2-bd2e-e5ce9372e408-memcached-tls-certs\") pod \"memcached-0\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " pod="openstack/memcached-0" Mar 14 07:19:55 crc kubenswrapper[5129]: I0314 07:19:55.984712 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl5dm\" (UniqueName: 
\"kubernetes.io/projected/659ee685-6b83-4af2-bd2e-e5ce9372e408-kube-api-access-wl5dm\") pod \"memcached-0\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " pod="openstack/memcached-0" Mar 14 07:19:56 crc kubenswrapper[5129]: I0314 07:19:56.105908 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 14 07:19:57 crc kubenswrapper[5129]: I0314 07:19:57.659514 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 07:19:57 crc kubenswrapper[5129]: I0314 07:19:57.660406 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 07:19:57 crc kubenswrapper[5129]: I0314 07:19:57.664625 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-w9qw7" Mar 14 07:19:57 crc kubenswrapper[5129]: I0314 07:19:57.672234 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 07:19:57 crc kubenswrapper[5129]: I0314 07:19:57.795877 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frx8j\" (UniqueName: \"kubernetes.io/projected/99b46a46-8c64-44f3-b7d7-b07c09be258d-kube-api-access-frx8j\") pod \"kube-state-metrics-0\" (UID: \"99b46a46-8c64-44f3-b7d7-b07c09be258d\") " pod="openstack/kube-state-metrics-0" Mar 14 07:19:57 crc kubenswrapper[5129]: I0314 07:19:57.897294 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frx8j\" (UniqueName: \"kubernetes.io/projected/99b46a46-8c64-44f3-b7d7-b07c09be258d-kube-api-access-frx8j\") pod \"kube-state-metrics-0\" (UID: \"99b46a46-8c64-44f3-b7d7-b07c09be258d\") " pod="openstack/kube-state-metrics-0" Mar 14 07:19:57 crc kubenswrapper[5129]: I0314 07:19:57.930468 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frx8j\" (UniqueName: 
\"kubernetes.io/projected/99b46a46-8c64-44f3-b7d7-b07c09be258d-kube-api-access-frx8j\") pod \"kube-state-metrics-0\" (UID: \"99b46a46-8c64-44f3-b7d7-b07c09be258d\") " pod="openstack/kube-state-metrics-0" Mar 14 07:19:57 crc kubenswrapper[5129]: I0314 07:19:57.976122 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.035973 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9bsx2"] Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.038226 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.047240 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.047625 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9bsx2"] Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.047774 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.047812 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-b6dvj" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.099968 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-cfdh9"] Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.101902 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.107658 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cfdh9"] Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.114693 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-scripts\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.114737 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf7kq\" (UniqueName: \"kubernetes.io/projected/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-kube-api-access-xf7kq\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.114809 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-combined-ca-bundle\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.114842 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-ovn-controller-tls-certs\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.114890 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-var-run\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.114914 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-var-log-ovn\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.114931 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-var-run-ovn\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.216067 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-combined-ca-bundle\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.216116 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-ovn-controller-tls-certs\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.216153 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwl68\" (UniqueName: 
\"kubernetes.io/projected/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-kube-api-access-hwl68\") pod \"ovn-controller-ovs-cfdh9\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.216173 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-var-run\") pod \"ovn-controller-ovs-cfdh9\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.216207 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-var-run\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.216230 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-var-run-ovn\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.216244 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-var-log-ovn\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.216276 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-etc-ovs\") pod \"ovn-controller-ovs-cfdh9\" (UID: 
\"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.216297 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-var-log\") pod \"ovn-controller-ovs-cfdh9\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.216319 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-scripts\") pod \"ovn-controller-ovs-cfdh9\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.216338 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-scripts\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.216359 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf7kq\" (UniqueName: \"kubernetes.io/projected/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-kube-api-access-xf7kq\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.216376 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-var-lib\") pod \"ovn-controller-ovs-cfdh9\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 
07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.217491 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-var-run\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.217540 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-var-run-ovn\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.217617 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-var-log-ovn\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.218874 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-scripts\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.221295 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-ovn-controller-tls-certs\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.223891 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-combined-ca-bundle\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.236415 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf7kq\" (UniqueName: \"kubernetes.io/projected/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-kube-api-access-xf7kq\") pod \"ovn-controller-9bsx2\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.317716 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-var-lib\") pod \"ovn-controller-ovs-cfdh9\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.317818 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwl68\" (UniqueName: \"kubernetes.io/projected/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-kube-api-access-hwl68\") pod \"ovn-controller-ovs-cfdh9\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.317851 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-var-run\") pod \"ovn-controller-ovs-cfdh9\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.317919 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-etc-ovs\") pod \"ovn-controller-ovs-cfdh9\" (UID: 
\"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.317952 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-var-log\") pod \"ovn-controller-ovs-cfdh9\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.317985 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-scripts\") pod \"ovn-controller-ovs-cfdh9\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.318177 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-var-lib\") pod \"ovn-controller-ovs-cfdh9\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.318274 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-var-run\") pod \"ovn-controller-ovs-cfdh9\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.318471 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-var-log\") pod \"ovn-controller-ovs-cfdh9\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.318562 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-etc-ovs\") pod \"ovn-controller-ovs-cfdh9\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.320730 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-scripts\") pod \"ovn-controller-ovs-cfdh9\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.346509 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwl68\" (UniqueName: \"kubernetes.io/projected/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-kube-api-access-hwl68\") pod \"ovn-controller-ovs-cfdh9\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.353899 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9bsx2" Mar 14 07:19:59 crc kubenswrapper[5129]: I0314 07:19:59.419334 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:20:00 crc kubenswrapper[5129]: I0314 07:20:00.126327 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557880-jmwkx"] Mar 14 07:20:00 crc kubenswrapper[5129]: I0314 07:20:00.130346 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557880-jmwkx" Mar 14 07:20:00 crc kubenswrapper[5129]: I0314 07:20:00.132822 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:20:00 crc kubenswrapper[5129]: I0314 07:20:00.133139 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:20:00 crc kubenswrapper[5129]: I0314 07:20:00.135224 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:20:00 crc kubenswrapper[5129]: I0314 07:20:00.136763 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557880-jmwkx"] Mar 14 07:20:00 crc kubenswrapper[5129]: I0314 07:20:00.232964 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65746\" (UniqueName: \"kubernetes.io/projected/acac1a33-44b5-4200-b5c1-91a2339283b9-kube-api-access-65746\") pod \"auto-csr-approver-29557880-jmwkx\" (UID: \"acac1a33-44b5-4200-b5c1-91a2339283b9\") " pod="openshift-infra/auto-csr-approver-29557880-jmwkx" Mar 14 07:20:00 crc kubenswrapper[5129]: I0314 07:20:00.334911 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65746\" (UniqueName: \"kubernetes.io/projected/acac1a33-44b5-4200-b5c1-91a2339283b9-kube-api-access-65746\") pod \"auto-csr-approver-29557880-jmwkx\" (UID: \"acac1a33-44b5-4200-b5c1-91a2339283b9\") " pod="openshift-infra/auto-csr-approver-29557880-jmwkx" Mar 14 07:20:00 crc kubenswrapper[5129]: I0314 07:20:00.370288 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65746\" (UniqueName: \"kubernetes.io/projected/acac1a33-44b5-4200-b5c1-91a2339283b9-kube-api-access-65746\") pod \"auto-csr-approver-29557880-jmwkx\" (UID: \"acac1a33-44b5-4200-b5c1-91a2339283b9\") " 
pod="openshift-infra/auto-csr-approver-29557880-jmwkx" Mar 14 07:20:00 crc kubenswrapper[5129]: I0314 07:20:00.451273 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557880-jmwkx" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.480844 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.483066 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.491466 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.521777 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.521915 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.522068 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.522175 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.522486 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-shznb" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.623423 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 
crc kubenswrapper[5129]: I0314 07:20:02.623485 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.623523 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.623545 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.623652 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.623718 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-config\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.623775 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ptn6\" (UniqueName: \"kubernetes.io/projected/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-kube-api-access-7ptn6\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.623861 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.725348 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.725398 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.725430 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.725451 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.725473 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.725497 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-config\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.725524 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ptn6\" (UniqueName: \"kubernetes.io/projected/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-kube-api-access-7ptn6\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.725554 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.726424 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") device mount path \"/mnt/openstack/pv07\"" 
pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.726933 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-config\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.727386 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.727620 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.737732 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.737746 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.738051 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.747307 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ptn6\" (UniqueName: \"kubernetes.io/projected/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-kube-api-access-7ptn6\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.749231 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:02 crc kubenswrapper[5129]: I0314 07:20:02.853233 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:04 crc kubenswrapper[5129]: E0314 07:20:04.092795 5129 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 14 07:20:04 crc kubenswrapper[5129]: E0314 07:20:04.092998 5129 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jws7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities
{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5448ff6dc7-kddtf_openstack(665de518-5068-49e6-a169-4fa6185f0712): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 07:20:04 crc kubenswrapper[5129]: E0314 07:20:04.094940 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5448ff6dc7-kddtf" podUID="665de518-5068-49e6-a169-4fa6185f0712" Mar 14 07:20:04 crc kubenswrapper[5129]: E0314 07:20:04.101132 5129 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 14 07:20:04 crc kubenswrapper[5129]: E0314 07:20:04.101324 5129 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zxfgk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-64696987c5-d5jt6_openstack(8e7e5c36-ac27-46ce-b7b1-5a540d94236b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 07:20:04 crc kubenswrapper[5129]: E0314 07:20:04.102551 5129 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-64696987c5-d5jt6" podUID="8e7e5c36-ac27-46ce-b7b1-5a540d94236b" Mar 14 07:20:04 crc kubenswrapper[5129]: I0314 07:20:04.588237 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 07:20:04 crc kubenswrapper[5129]: I0314 07:20:04.687627 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 14 07:20:04 crc kubenswrapper[5129]: I0314 07:20:04.824462 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 07:20:04 crc kubenswrapper[5129]: W0314 07:20:04.830003 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99b46a46_8c64_44f3_b7d7_b07c09be258d.slice/crio-d0044acc42d94bfe174400345b683041caad94b064e53d549876e206c543139a WatchSource:0}: Error finding container d0044acc42d94bfe174400345b683041caad94b064e53d549876e206c543139a: Status 404 returned error can't find the container with id d0044acc42d94bfe174400345b683041caad94b064e53d549876e206c543139a Mar 14 07:20:04 crc kubenswrapper[5129]: I0314 07:20:04.850265 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 14 07:20:04 crc kubenswrapper[5129]: W0314 07:20:04.859704 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod659ee685_6b83_4af2_bd2e_e5ce9372e408.slice/crio-e2e2589920cfdc3b915a03e0c59ca4d3599190e1d0272f7ed29588b81322f69e WatchSource:0}: Error finding container e2e2589920cfdc3b915a03e0c59ca4d3599190e1d0272f7ed29588b81322f69e: Status 404 returned error can't find the container with id e2e2589920cfdc3b915a03e0c59ca4d3599190e1d0272f7ed29588b81322f69e Mar 14 07:20:04 crc kubenswrapper[5129]: I0314 
07:20:04.863844 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9bsx2"] Mar 14 07:20:04 crc kubenswrapper[5129]: I0314 07:20:04.874348 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 07:20:04 crc kubenswrapper[5129]: I0314 07:20:04.876215 5129 generic.go:334] "Generic (PLEG): container finished" podID="fa25a41c-461a-4e47-b408-37f7a30eef64" containerID="b40da2f8dca73f725bcd044cc15a587cd9193ff2e2676da319a8026f3d2e37c6" exitCode=0 Mar 14 07:20:04 crc kubenswrapper[5129]: I0314 07:20:04.876269 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" event={"ID":"fa25a41c-461a-4e47-b408-37f7a30eef64","Type":"ContainerDied","Data":"b40da2f8dca73f725bcd044cc15a587cd9193ff2e2676da319a8026f3d2e37c6"} Mar 14 07:20:04 crc kubenswrapper[5129]: W0314 07:20:04.877507 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6261e6b_f331_4dcb_8380_167e8f547e1b.slice/crio-600515935a33ea69c5e4fbd0d79d4b199a7f7e89cf5337f3055894057821f3c4 WatchSource:0}: Error finding container 600515935a33ea69c5e4fbd0d79d4b199a7f7e89cf5337f3055894057821f3c4: Status 404 returned error can't find the container with id 600515935a33ea69c5e4fbd0d79d4b199a7f7e89cf5337f3055894057821f3c4 Mar 14 07:20:04 crc kubenswrapper[5129]: I0314 07:20:04.877958 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d291cef2-24d8-4ae6-aa4f-dfa8e782db15","Type":"ContainerStarted","Data":"680615c8f16f3b261768db9c4ddfd8b7d973d879da64bbb56cb94b530c17a8bb"} Mar 14 07:20:04 crc kubenswrapper[5129]: I0314 07:20:04.882185 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"99b46a46-8c64-44f3-b7d7-b07c09be258d","Type":"ContainerStarted","Data":"d0044acc42d94bfe174400345b683041caad94b064e53d549876e206c543139a"} Mar 14 
07:20:04 crc kubenswrapper[5129]: I0314 07:20:04.886101 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"eca83e14-f023-4dbd-b646-c9fc5a9e177e","Type":"ContainerStarted","Data":"42e109de26a9b91954d843c71030ca80112572c8f504742616b8e1b0269362a0"} Mar 14 07:20:04 crc kubenswrapper[5129]: I0314 07:20:04.887559 5129 generic.go:334] "Generic (PLEG): container finished" podID="ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd" containerID="652c41f508e2dd85291629291a3a63140db9fc98ae3737d6075dc9f8a1397fa5" exitCode=0 Mar 14 07:20:04 crc kubenswrapper[5129]: I0314 07:20:04.888603 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" event={"ID":"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd","Type":"ContainerDied","Data":"652c41f508e2dd85291629291a3a63140db9fc98ae3737d6075dc9f8a1397fa5"} Mar 14 07:20:04 crc kubenswrapper[5129]: I0314 07:20:04.896029 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557880-jmwkx"] Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.020331 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.037524 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.039288 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.046541 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.046812 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-8sn7c" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.046992 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.047162 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.107125 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.166657 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d992f450-3800-45e2-abf4-41597a15f0c3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.166722 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d992f450-3800-45e2-abf4-41597a15f0c3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.166760 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d992f450-3800-45e2-abf4-41597a15f0c3-scripts\") pod \"ovsdbserver-sb-0\" 
(UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.166793 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.166826 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d992f450-3800-45e2-abf4-41597a15f0c3-config\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.167975 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8chx\" (UniqueName: \"kubernetes.io/projected/d992f450-3800-45e2-abf4-41597a15f0c3-kube-api-access-n8chx\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.168098 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d992f450-3800-45e2-abf4-41597a15f0c3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.168136 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d992f450-3800-45e2-abf4-41597a15f0c3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc 
kubenswrapper[5129]: I0314 07:20:05.192680 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 14 07:20:05 crc kubenswrapper[5129]: E0314 07:20:05.238643 5129 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 14 07:20:05 crc kubenswrapper[5129]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/fa25a41c-461a-4e47-b408-37f7a30eef64/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 14 07:20:05 crc kubenswrapper[5129]: > podSandboxID="a0adae411e2f9f7bc6c5779fae4a6acd805d52fdbdb8756543a5f5dfeb13aee0" Mar 14 07:20:05 crc kubenswrapper[5129]: E0314 07:20:05.239016 5129 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 07:20:05 crc kubenswrapper[5129]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nj6dl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-658f55c9f5-gqmzs_openstack(fa25a41c-461a-4e47-b408-37f7a30eef64): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/fa25a41c-461a-4e47-b408-37f7a30eef64/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 14 07:20:05 crc kubenswrapper[5129]: > logger="UnhandledError" Mar 14 07:20:05 crc kubenswrapper[5129]: E0314 07:20:05.240195 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/fa25a41c-461a-4e47-b408-37f7a30eef64/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" podUID="fa25a41c-461a-4e47-b408-37f7a30eef64" Mar 14 07:20:05 crc kubenswrapper[5129]: W0314 07:20:05.270937 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2172c89_1ef3_451d_abc8_2ee8ac3bb4a7.slice/crio-0ec6ffd0dcd362ce56136bf653ba0a2b11b2c895fb574f364bb8252a5efab785 WatchSource:0}: Error 
finding container 0ec6ffd0dcd362ce56136bf653ba0a2b11b2c895fb574f364bb8252a5efab785: Status 404 returned error can't find the container with id 0ec6ffd0dcd362ce56136bf653ba0a2b11b2c895fb574f364bb8252a5efab785 Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.271518 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8chx\" (UniqueName: \"kubernetes.io/projected/d992f450-3800-45e2-abf4-41597a15f0c3-kube-api-access-n8chx\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.271603 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d992f450-3800-45e2-abf4-41597a15f0c3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.271642 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d992f450-3800-45e2-abf4-41597a15f0c3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.273172 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d992f450-3800-45e2-abf4-41597a15f0c3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.273241 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d992f450-3800-45e2-abf4-41597a15f0c3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.273280 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d992f450-3800-45e2-abf4-41597a15f0c3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.273304 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d992f450-3800-45e2-abf4-41597a15f0c3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.273340 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.273365 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d992f450-3800-45e2-abf4-41597a15f0c3-config\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.274241 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.275077 5129 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d992f450-3800-45e2-abf4-41597a15f0c3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.278194 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d992f450-3800-45e2-abf4-41597a15f0c3-config\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.279502 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d992f450-3800-45e2-abf4-41597a15f0c3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.279713 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d992f450-3800-45e2-abf4-41597a15f0c3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.280244 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d992f450-3800-45e2-abf4-41597a15f0c3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.298413 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8chx\" (UniqueName: \"kubernetes.io/projected/d992f450-3800-45e2-abf4-41597a15f0c3-kube-api-access-n8chx\") pod \"ovsdbserver-sb-0\" (UID: 
\"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.299381 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.360988 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-d5jt6" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.367312 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-kddtf" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.445764 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.476035 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jws7c\" (UniqueName: \"kubernetes.io/projected/665de518-5068-49e6-a169-4fa6185f0712-kube-api-access-jws7c\") pod \"665de518-5068-49e6-a169-4fa6185f0712\" (UID: \"665de518-5068-49e6-a169-4fa6185f0712\") " Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.476270 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxfgk\" (UniqueName: \"kubernetes.io/projected/8e7e5c36-ac27-46ce-b7b1-5a540d94236b-kube-api-access-zxfgk\") pod \"8e7e5c36-ac27-46ce-b7b1-5a540d94236b\" (UID: \"8e7e5c36-ac27-46ce-b7b1-5a540d94236b\") " Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.476340 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e7e5c36-ac27-46ce-b7b1-5a540d94236b-dns-svc\") pod \"8e7e5c36-ac27-46ce-b7b1-5a540d94236b\" (UID: 
\"8e7e5c36-ac27-46ce-b7b1-5a540d94236b\") " Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.476378 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/665de518-5068-49e6-a169-4fa6185f0712-config\") pod \"665de518-5068-49e6-a169-4fa6185f0712\" (UID: \"665de518-5068-49e6-a169-4fa6185f0712\") " Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.476426 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e7e5c36-ac27-46ce-b7b1-5a540d94236b-config\") pod \"8e7e5c36-ac27-46ce-b7b1-5a540d94236b\" (UID: \"8e7e5c36-ac27-46ce-b7b1-5a540d94236b\") " Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.476867 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/665de518-5068-49e6-a169-4fa6185f0712-config" (OuterVolumeSpecName: "config") pod "665de518-5068-49e6-a169-4fa6185f0712" (UID: "665de518-5068-49e6-a169-4fa6185f0712"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.476893 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e7e5c36-ac27-46ce-b7b1-5a540d94236b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8e7e5c36-ac27-46ce-b7b1-5a540d94236b" (UID: "8e7e5c36-ac27-46ce-b7b1-5a540d94236b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.477291 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e7e5c36-ac27-46ce-b7b1-5a540d94236b-config" (OuterVolumeSpecName: "config") pod "8e7e5c36-ac27-46ce-b7b1-5a540d94236b" (UID: "8e7e5c36-ac27-46ce-b7b1-5a540d94236b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.480188 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/665de518-5068-49e6-a169-4fa6185f0712-kube-api-access-jws7c" (OuterVolumeSpecName: "kube-api-access-jws7c") pod "665de518-5068-49e6-a169-4fa6185f0712" (UID: "665de518-5068-49e6-a169-4fa6185f0712"). InnerVolumeSpecName "kube-api-access-jws7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.480224 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e7e5c36-ac27-46ce-b7b1-5a540d94236b-kube-api-access-zxfgk" (OuterVolumeSpecName: "kube-api-access-zxfgk") pod "8e7e5c36-ac27-46ce-b7b1-5a540d94236b" (UID: "8e7e5c36-ac27-46ce-b7b1-5a540d94236b"). InnerVolumeSpecName "kube-api-access-zxfgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.579227 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e7e5c36-ac27-46ce-b7b1-5a540d94236b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.579293 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/665de518-5068-49e6-a169-4fa6185f0712-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.579302 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e7e5c36-ac27-46ce-b7b1-5a540d94236b-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.579313 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jws7c\" (UniqueName: \"kubernetes.io/projected/665de518-5068-49e6-a169-4fa6185f0712-kube-api-access-jws7c\") on node 
\"crc\" DevicePath \"\"" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.579325 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxfgk\" (UniqueName: \"kubernetes.io/projected/8e7e5c36-ac27-46ce-b7b1-5a540d94236b-kube-api-access-zxfgk\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.897347 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" event={"ID":"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd","Type":"ContainerStarted","Data":"0d372268522dc1fcfce8e19f1e697638c8cc7793c3e38c091a0a66668a3c33d3"} Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.900461 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.902423 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-d5jt6" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.902432 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-d5jt6" event={"ID":"8e7e5c36-ac27-46ce-b7b1-5a540d94236b","Type":"ContainerDied","Data":"5a96323f3cddfffccc4bdfb8ec593199e92b67a96449fa8e4397cda40e14c3c2"} Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.912426 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-kddtf" event={"ID":"665de518-5068-49e6-a169-4fa6185f0712","Type":"ContainerDied","Data":"6d4e9168ed9a3344eb9bd790ede789a0f36f7f73e498f0b5fe0284b6c982e9fc"} Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.912672 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-kddtf" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.916451 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" podStartSLOduration=2.65257473 podStartE2EDuration="14.916436591s" podCreationTimestamp="2026-03-14 07:19:51 +0000 UTC" firstStartedPulling="2026-03-14 07:19:52.043173955 +0000 UTC m=+1254.795089139" lastFinishedPulling="2026-03-14 07:20:04.307035816 +0000 UTC m=+1267.058951000" observedRunningTime="2026-03-14 07:20:05.915270179 +0000 UTC m=+1268.667185363" watchObservedRunningTime="2026-03-14 07:20:05.916436591 +0000 UTC m=+1268.668351775" Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.920444 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9bsx2" event={"ID":"b6a688fe-1537-4ed7-a1ae-2070ba6b1219","Type":"ContainerStarted","Data":"73e2eb2a448314f0941c9b89f3971d708261829a3793e4b70a331517faf92a07"} Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.923252 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7","Type":"ContainerStarted","Data":"0ec6ffd0dcd362ce56136bf653ba0a2b11b2c895fb574f364bb8252a5efab785"} Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.924491 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f6261e6b-f331-4dcb-8380-167e8f547e1b","Type":"ContainerStarted","Data":"600515935a33ea69c5e4fbd0d79d4b199a7f7e89cf5337f3055894057821f3c4"} Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.926162 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557880-jmwkx" event={"ID":"acac1a33-44b5-4200-b5c1-91a2339283b9","Type":"ContainerStarted","Data":"e9951518597538552fafb37c3c0d1d84db68594bd58a3296029dd87244cd6fb8"} Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 
07:20:05.927850 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"659ee685-6b83-4af2-bd2e-e5ce9372e408","Type":"ContainerStarted","Data":"e2e2589920cfdc3b915a03e0c59ca4d3599190e1d0272f7ed29588b81322f69e"} Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.930251 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"54d01ce1-11e5-4fc7-a120-44d9d3407142","Type":"ContainerStarted","Data":"d285c2fb5c214972150d8fcfeedc394a9c958449f997ac0d0e5f4e88262012ec"} Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.967745 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-d5jt6"] Mar 14 07:20:05 crc kubenswrapper[5129]: I0314 07:20:05.974223 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-d5jt6"] Mar 14 07:20:06 crc kubenswrapper[5129]: I0314 07:20:06.047211 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e7e5c36-ac27-46ce-b7b1-5a540d94236b" path="/var/lib/kubelet/pods/8e7e5c36-ac27-46ce-b7b1-5a540d94236b/volumes" Mar 14 07:20:06 crc kubenswrapper[5129]: I0314 07:20:06.047543 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-kddtf"] Mar 14 07:20:06 crc kubenswrapper[5129]: I0314 07:20:06.047563 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-kddtf"] Mar 14 07:20:06 crc kubenswrapper[5129]: I0314 07:20:06.090333 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cfdh9"] Mar 14 07:20:06 crc kubenswrapper[5129]: I0314 07:20:06.681354 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 07:20:07 crc kubenswrapper[5129]: W0314 07:20:07.113555 5129 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda67a2a4b_fa14_43cf_983b_45df5afc8e4e.slice/crio-dbd0a143bc3475d85459ab0b07ed8da56f8674c1d3e4aee9e8f813091b079dff WatchSource:0}: Error finding container dbd0a143bc3475d85459ab0b07ed8da56f8674c1d3e4aee9e8f813091b079dff: Status 404 returned error can't find the container with id dbd0a143bc3475d85459ab0b07ed8da56f8674c1d3e4aee9e8f813091b079dff Mar 14 07:20:07 crc kubenswrapper[5129]: W0314 07:20:07.172740 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd992f450_3800_45e2_abf4_41597a15f0c3.slice/crio-1fb694959f161f9b05e2ba82059447ae12e5957805fb401864e8376aa812141b WatchSource:0}: Error finding container 1fb694959f161f9b05e2ba82059447ae12e5957805fb401864e8376aa812141b: Status 404 returned error can't find the container with id 1fb694959f161f9b05e2ba82059447ae12e5957805fb401864e8376aa812141b Mar 14 07:20:07 crc kubenswrapper[5129]: I0314 07:20:07.945666 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cfdh9" event={"ID":"a67a2a4b-fa14-43cf-983b-45df5afc8e4e","Type":"ContainerStarted","Data":"dbd0a143bc3475d85459ab0b07ed8da56f8674c1d3e4aee9e8f813091b079dff"} Mar 14 07:20:07 crc kubenswrapper[5129]: I0314 07:20:07.947364 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d992f450-3800-45e2-abf4-41597a15f0c3","Type":"ContainerStarted","Data":"1fb694959f161f9b05e2ba82059447ae12e5957805fb401864e8376aa812141b"} Mar 14 07:20:08 crc kubenswrapper[5129]: I0314 07:20:08.059052 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="665de518-5068-49e6-a169-4fa6185f0712" path="/var/lib/kubelet/pods/665de518-5068-49e6-a169-4fa6185f0712/volumes" Mar 14 07:20:11 crc kubenswrapper[5129]: I0314 07:20:11.766856 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" 
Mar 14 07:20:11 crc kubenswrapper[5129]: I0314 07:20:11.854733 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-gqmzs"] Mar 14 07:20:12 crc kubenswrapper[5129]: I0314 07:20:12.984452 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"99b46a46-8c64-44f3-b7d7-b07c09be258d","Type":"ContainerStarted","Data":"d4f66ee58bcc9a68ef28d7dc15a122f4e873c7523c8d8445de9c76a88aa880b0"} Mar 14 07:20:12 crc kubenswrapper[5129]: I0314 07:20:12.985089 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 14 07:20:12 crc kubenswrapper[5129]: I0314 07:20:12.986433 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9bsx2" event={"ID":"b6a688fe-1537-4ed7-a1ae-2070ba6b1219","Type":"ContainerStarted","Data":"59e2a48189e053c185dad60e343a538636c21342c31aa09e71839cd2b4df3a28"} Mar 14 07:20:12 crc kubenswrapper[5129]: I0314 07:20:12.986536 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9bsx2" Mar 14 07:20:12 crc kubenswrapper[5129]: I0314 07:20:12.988575 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"eca83e14-f023-4dbd-b646-c9fc5a9e177e","Type":"ContainerStarted","Data":"629316b26a984d1db8aef6f3acb03e45b17d93b3f59eac2ee52a07abfa37eaac"} Mar 14 07:20:12 crc kubenswrapper[5129]: I0314 07:20:12.991925 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7","Type":"ContainerStarted","Data":"dab8a5ebc5177ef76172886ed36e2482dbb0fd550653e8a3c821777fa2ad984e"} Mar 14 07:20:12 crc kubenswrapper[5129]: I0314 07:20:12.997563 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" podUID="fa25a41c-461a-4e47-b408-37f7a30eef64" containerName="dnsmasq-dns" 
containerID="cri-o://845deb2394e31401b64334262cac181728a60ca426441c66490e4e8d2b9e439a" gracePeriod=10 Mar 14 07:20:12 crc kubenswrapper[5129]: I0314 07:20:12.997971 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" event={"ID":"fa25a41c-461a-4e47-b408-37f7a30eef64","Type":"ContainerStarted","Data":"845deb2394e31401b64334262cac181728a60ca426441c66490e4e8d2b9e439a"} Mar 14 07:20:12 crc kubenswrapper[5129]: I0314 07:20:12.998027 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.020089 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cfdh9" event={"ID":"a67a2a4b-fa14-43cf-983b-45df5afc8e4e","Type":"ContainerStarted","Data":"0beb9f26c6263c32221d7a8d342037e7bb146fd310587dc1747cd887a2bb31fc"} Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.024871 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"54d01ce1-11e5-4fc7-a120-44d9d3407142","Type":"ContainerStarted","Data":"2fdf9b5c6b6465e84cb1682373d72f4eee6de798eebaff1e1b5c40f811253937"} Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.028567 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=8.379381706 podStartE2EDuration="16.028547317s" podCreationTimestamp="2026-03-14 07:19:57 +0000 UTC" firstStartedPulling="2026-03-14 07:20:04.834862378 +0000 UTC m=+1267.586777562" lastFinishedPulling="2026-03-14 07:20:12.484027989 +0000 UTC m=+1275.235943173" observedRunningTime="2026-03-14 07:20:13.023215212 +0000 UTC m=+1275.775130396" watchObservedRunningTime="2026-03-14 07:20:13.028547317 +0000 UTC m=+1275.780462491" Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.035631 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29557880-jmwkx" event={"ID":"acac1a33-44b5-4200-b5c1-91a2339283b9","Type":"ContainerStarted","Data":"eed30ff2173035b830ab6b839b3e1b27368ac2ee4887106fc76a47c2d18dd2ca"} Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.039353 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" podStartSLOduration=9.793001279 podStartE2EDuration="22.039339722s" podCreationTimestamp="2026-03-14 07:19:51 +0000 UTC" firstStartedPulling="2026-03-14 07:19:52.066417529 +0000 UTC m=+1254.818332713" lastFinishedPulling="2026-03-14 07:20:04.312755972 +0000 UTC m=+1267.064671156" observedRunningTime="2026-03-14 07:20:13.038050126 +0000 UTC m=+1275.789965310" watchObservedRunningTime="2026-03-14 07:20:13.039339722 +0000 UTC m=+1275.791254906" Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.039636 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"659ee685-6b83-4af2-bd2e-e5ce9372e408","Type":"ContainerStarted","Data":"babbc10ebff9f1149bd8ac464ac8ab29bfe228ddf2812b62e0b09490fb06eb73"} Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.039681 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.041955 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d992f450-3800-45e2-abf4-41597a15f0c3","Type":"ContainerStarted","Data":"27a037ff17952e87a1fb3f9495575e1bd75902ebd3287783d0d5ba6c1a9e147c"} Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.093636 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9bsx2" podStartSLOduration=6.961997033 podStartE2EDuration="14.093615732s" podCreationTimestamp="2026-03-14 07:19:59 +0000 UTC" firstStartedPulling="2026-03-14 07:20:04.873428061 +0000 UTC m=+1267.625343245" 
lastFinishedPulling="2026-03-14 07:20:12.00504676 +0000 UTC m=+1274.756961944" observedRunningTime="2026-03-14 07:20:13.090528788 +0000 UTC m=+1275.842443982" watchObservedRunningTime="2026-03-14 07:20:13.093615732 +0000 UTC m=+1275.845530916" Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.103986 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557880-jmwkx" podStartSLOduration=6.013438628 podStartE2EDuration="13.103968565s" podCreationTimestamp="2026-03-14 07:20:00 +0000 UTC" firstStartedPulling="2026-03-14 07:20:04.914487942 +0000 UTC m=+1267.666403126" lastFinishedPulling="2026-03-14 07:20:12.005017879 +0000 UTC m=+1274.756933063" observedRunningTime="2026-03-14 07:20:13.10157451 +0000 UTC m=+1275.853489694" watchObservedRunningTime="2026-03-14 07:20:13.103968565 +0000 UTC m=+1275.855883749" Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.138672 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=11.67942082 podStartE2EDuration="18.138647341s" podCreationTimestamp="2026-03-14 07:19:55 +0000 UTC" firstStartedPulling="2026-03-14 07:20:04.865438103 +0000 UTC m=+1267.617353287" lastFinishedPulling="2026-03-14 07:20:11.324664614 +0000 UTC m=+1274.076579808" observedRunningTime="2026-03-14 07:20:13.134324184 +0000 UTC m=+1275.886239368" watchObservedRunningTime="2026-03-14 07:20:13.138647341 +0000 UTC m=+1275.890562525" Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.726751 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.836093 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa25a41c-461a-4e47-b408-37f7a30eef64-dns-svc\") pod \"fa25a41c-461a-4e47-b408-37f7a30eef64\" (UID: \"fa25a41c-461a-4e47-b408-37f7a30eef64\") " Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.836162 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa25a41c-461a-4e47-b408-37f7a30eef64-config\") pod \"fa25a41c-461a-4e47-b408-37f7a30eef64\" (UID: \"fa25a41c-461a-4e47-b408-37f7a30eef64\") " Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.836195 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj6dl\" (UniqueName: \"kubernetes.io/projected/fa25a41c-461a-4e47-b408-37f7a30eef64-kube-api-access-nj6dl\") pod \"fa25a41c-461a-4e47-b408-37f7a30eef64\" (UID: \"fa25a41c-461a-4e47-b408-37f7a30eef64\") " Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.841625 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa25a41c-461a-4e47-b408-37f7a30eef64-kube-api-access-nj6dl" (OuterVolumeSpecName: "kube-api-access-nj6dl") pod "fa25a41c-461a-4e47-b408-37f7a30eef64" (UID: "fa25a41c-461a-4e47-b408-37f7a30eef64"). InnerVolumeSpecName "kube-api-access-nj6dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:13 crc kubenswrapper[5129]: E0314 07:20:13.870980 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fa25a41c-461a-4e47-b408-37f7a30eef64-dns-svc podName:fa25a41c-461a-4e47-b408-37f7a30eef64 nodeName:}" failed. No retries permitted until 2026-03-14 07:20:14.370951534 +0000 UTC m=+1277.122866718 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/fa25a41c-461a-4e47-b408-37f7a30eef64-dns-svc") pod "fa25a41c-461a-4e47-b408-37f7a30eef64" (UID: "fa25a41c-461a-4e47-b408-37f7a30eef64") : error deleting /var/lib/kubelet/pods/fa25a41c-461a-4e47-b408-37f7a30eef64/volume-subpaths: remove /var/lib/kubelet/pods/fa25a41c-461a-4e47-b408-37f7a30eef64/volume-subpaths: no such file or directory Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.871401 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa25a41c-461a-4e47-b408-37f7a30eef64-config" (OuterVolumeSpecName: "config") pod "fa25a41c-461a-4e47-b408-37f7a30eef64" (UID: "fa25a41c-461a-4e47-b408-37f7a30eef64"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.941512 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa25a41c-461a-4e47-b408-37f7a30eef64-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:13 crc kubenswrapper[5129]: I0314 07:20:13.941579 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj6dl\" (UniqueName: \"kubernetes.io/projected/fa25a41c-461a-4e47-b408-37f7a30eef64-kube-api-access-nj6dl\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.058988 5129 generic.go:334] "Generic (PLEG): container finished" podID="acac1a33-44b5-4200-b5c1-91a2339283b9" containerID="eed30ff2173035b830ab6b839b3e1b27368ac2ee4887106fc76a47c2d18dd2ca" exitCode=0 Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.059054 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557880-jmwkx" event={"ID":"acac1a33-44b5-4200-b5c1-91a2339283b9","Type":"ContainerDied","Data":"eed30ff2173035b830ab6b839b3e1b27368ac2ee4887106fc76a47c2d18dd2ca"} Mar 14 07:20:14 crc 
kubenswrapper[5129]: I0314 07:20:14.061466 5129 generic.go:334] "Generic (PLEG): container finished" podID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerID="0beb9f26c6263c32221d7a8d342037e7bb146fd310587dc1747cd887a2bb31fc" exitCode=0 Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.061513 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cfdh9" event={"ID":"a67a2a4b-fa14-43cf-983b-45df5afc8e4e","Type":"ContainerDied","Data":"0beb9f26c6263c32221d7a8d342037e7bb146fd310587dc1747cd887a2bb31fc"} Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.064965 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f6261e6b-f331-4dcb-8380-167e8f547e1b","Type":"ContainerStarted","Data":"50c855d9f68ff01d5db8cb50e2191c259ba79b1ab08bfa29c635893e5624e8d0"} Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.066784 5129 generic.go:334] "Generic (PLEG): container finished" podID="fa25a41c-461a-4e47-b408-37f7a30eef64" containerID="845deb2394e31401b64334262cac181728a60ca426441c66490e4e8d2b9e439a" exitCode=0 Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.066833 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" event={"ID":"fa25a41c-461a-4e47-b408-37f7a30eef64","Type":"ContainerDied","Data":"845deb2394e31401b64334262cac181728a60ca426441c66490e4e8d2b9e439a"} Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.066853 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" event={"ID":"fa25a41c-461a-4e47-b408-37f7a30eef64","Type":"ContainerDied","Data":"a0adae411e2f9f7bc6c5779fae4a6acd805d52fdbdb8756543a5f5dfeb13aee0"} Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.066872 5129 scope.go:117] "RemoveContainer" containerID="845deb2394e31401b64334262cac181728a60ca426441c66490e4e8d2b9e439a" Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.066986 5129 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-gqmzs" Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.077414 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d291cef2-24d8-4ae6-aa4f-dfa8e782db15","Type":"ContainerStarted","Data":"6503e2aa7dd95bfb4855a65e7d0dea939ea6ca565ff7c365cb8197e3cf19b122"} Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.116304 5129 scope.go:117] "RemoveContainer" containerID="b40da2f8dca73f725bcd044cc15a587cd9193ff2e2676da319a8026f3d2e37c6" Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.144794 5129 scope.go:117] "RemoveContainer" containerID="845deb2394e31401b64334262cac181728a60ca426441c66490e4e8d2b9e439a" Mar 14 07:20:14 crc kubenswrapper[5129]: E0314 07:20:14.145344 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"845deb2394e31401b64334262cac181728a60ca426441c66490e4e8d2b9e439a\": container with ID starting with 845deb2394e31401b64334262cac181728a60ca426441c66490e4e8d2b9e439a not found: ID does not exist" containerID="845deb2394e31401b64334262cac181728a60ca426441c66490e4e8d2b9e439a" Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.145386 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"845deb2394e31401b64334262cac181728a60ca426441c66490e4e8d2b9e439a"} err="failed to get container status \"845deb2394e31401b64334262cac181728a60ca426441c66490e4e8d2b9e439a\": rpc error: code = NotFound desc = could not find container \"845deb2394e31401b64334262cac181728a60ca426441c66490e4e8d2b9e439a\": container with ID starting with 845deb2394e31401b64334262cac181728a60ca426441c66490e4e8d2b9e439a not found: ID does not exist" Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.145433 5129 scope.go:117] "RemoveContainer" 
containerID="b40da2f8dca73f725bcd044cc15a587cd9193ff2e2676da319a8026f3d2e37c6" Mar 14 07:20:14 crc kubenswrapper[5129]: E0314 07:20:14.146508 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b40da2f8dca73f725bcd044cc15a587cd9193ff2e2676da319a8026f3d2e37c6\": container with ID starting with b40da2f8dca73f725bcd044cc15a587cd9193ff2e2676da319a8026f3d2e37c6 not found: ID does not exist" containerID="b40da2f8dca73f725bcd044cc15a587cd9193ff2e2676da319a8026f3d2e37c6" Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.146559 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b40da2f8dca73f725bcd044cc15a587cd9193ff2e2676da319a8026f3d2e37c6"} err="failed to get container status \"b40da2f8dca73f725bcd044cc15a587cd9193ff2e2676da319a8026f3d2e37c6\": rpc error: code = NotFound desc = could not find container \"b40da2f8dca73f725bcd044cc15a587cd9193ff2e2676da319a8026f3d2e37c6\": container with ID starting with b40da2f8dca73f725bcd044cc15a587cd9193ff2e2676da319a8026f3d2e37c6 not found: ID does not exist" Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.449867 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa25a41c-461a-4e47-b408-37f7a30eef64-dns-svc\") pod \"fa25a41c-461a-4e47-b408-37f7a30eef64\" (UID: \"fa25a41c-461a-4e47-b408-37f7a30eef64\") " Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.450404 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa25a41c-461a-4e47-b408-37f7a30eef64-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fa25a41c-461a-4e47-b408-37f7a30eef64" (UID: "fa25a41c-461a-4e47-b408-37f7a30eef64"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.450727 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa25a41c-461a-4e47-b408-37f7a30eef64-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.697388 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-gqmzs"] Mar 14 07:20:14 crc kubenswrapper[5129]: I0314 07:20:14.703307 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-gqmzs"] Mar 14 07:20:14 crc kubenswrapper[5129]: E0314 07:20:14.799245 5129 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa25a41c_461a_4e47_b408_37f7a30eef64.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa25a41c_461a_4e47_b408_37f7a30eef64.slice/crio-a0adae411e2f9f7bc6c5779fae4a6acd805d52fdbdb8756543a5f5dfeb13aee0\": RecentStats: unable to find data in memory cache]" Mar 14 07:20:15 crc kubenswrapper[5129]: I0314 07:20:15.087766 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cfdh9" event={"ID":"a67a2a4b-fa14-43cf-983b-45df5afc8e4e","Type":"ContainerStarted","Data":"dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62"} Mar 14 07:20:15 crc kubenswrapper[5129]: I0314 07:20:15.088088 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cfdh9" event={"ID":"a67a2a4b-fa14-43cf-983b-45df5afc8e4e","Type":"ContainerStarted","Data":"d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef"} Mar 14 07:20:15 crc kubenswrapper[5129]: I0314 07:20:15.552280 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557880-jmwkx" Mar 14 07:20:15 crc kubenswrapper[5129]: I0314 07:20:15.569999 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-cfdh9" podStartSLOduration=11.927530756 podStartE2EDuration="16.569981224s" podCreationTimestamp="2026-03-14 07:19:59 +0000 UTC" firstStartedPulling="2026-03-14 07:20:07.11627986 +0000 UTC m=+1269.868195084" lastFinishedPulling="2026-03-14 07:20:11.758730368 +0000 UTC m=+1274.510645552" observedRunningTime="2026-03-14 07:20:15.110992189 +0000 UTC m=+1277.862907383" watchObservedRunningTime="2026-03-14 07:20:15.569981224 +0000 UTC m=+1278.321896418" Mar 14 07:20:15 crc kubenswrapper[5129]: I0314 07:20:15.670217 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65746\" (UniqueName: \"kubernetes.io/projected/acac1a33-44b5-4200-b5c1-91a2339283b9-kube-api-access-65746\") pod \"acac1a33-44b5-4200-b5c1-91a2339283b9\" (UID: \"acac1a33-44b5-4200-b5c1-91a2339283b9\") " Mar 14 07:20:15 crc kubenswrapper[5129]: I0314 07:20:15.676422 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acac1a33-44b5-4200-b5c1-91a2339283b9-kube-api-access-65746" (OuterVolumeSpecName: "kube-api-access-65746") pod "acac1a33-44b5-4200-b5c1-91a2339283b9" (UID: "acac1a33-44b5-4200-b5c1-91a2339283b9"). InnerVolumeSpecName "kube-api-access-65746". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:15 crc kubenswrapper[5129]: I0314 07:20:15.771664 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65746\" (UniqueName: \"kubernetes.io/projected/acac1a33-44b5-4200-b5c1-91a2339283b9-kube-api-access-65746\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:16 crc kubenswrapper[5129]: I0314 07:20:16.051762 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa25a41c-461a-4e47-b408-37f7a30eef64" path="/var/lib/kubelet/pods/fa25a41c-461a-4e47-b408-37f7a30eef64/volumes" Mar 14 07:20:16 crc kubenswrapper[5129]: I0314 07:20:16.102386 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557880-jmwkx" Mar 14 07:20:16 crc kubenswrapper[5129]: I0314 07:20:16.102480 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557880-jmwkx" event={"ID":"acac1a33-44b5-4200-b5c1-91a2339283b9","Type":"ContainerDied","Data":"e9951518597538552fafb37c3c0d1d84db68594bd58a3296029dd87244cd6fb8"} Mar 14 07:20:16 crc kubenswrapper[5129]: I0314 07:20:16.102520 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9951518597538552fafb37c3c0d1d84db68594bd58a3296029dd87244cd6fb8" Mar 14 07:20:16 crc kubenswrapper[5129]: I0314 07:20:16.102542 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:20:16 crc kubenswrapper[5129]: I0314 07:20:16.102591 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:20:16 crc kubenswrapper[5129]: I0314 07:20:16.634489 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557874-pdjsp"] Mar 14 07:20:16 crc kubenswrapper[5129]: I0314 07:20:16.640468 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-infra/auto-csr-approver-29557874-pdjsp"] Mar 14 07:20:17 crc kubenswrapper[5129]: I0314 07:20:17.108923 5129 generic.go:334] "Generic (PLEG): container finished" podID="54d01ce1-11e5-4fc7-a120-44d9d3407142" containerID="2fdf9b5c6b6465e84cb1682373d72f4eee6de798eebaff1e1b5c40f811253937" exitCode=0 Mar 14 07:20:17 crc kubenswrapper[5129]: I0314 07:20:17.108992 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"54d01ce1-11e5-4fc7-a120-44d9d3407142","Type":"ContainerDied","Data":"2fdf9b5c6b6465e84cb1682373d72f4eee6de798eebaff1e1b5c40f811253937"} Mar 14 07:20:17 crc kubenswrapper[5129]: I0314 07:20:17.110973 5129 generic.go:334] "Generic (PLEG): container finished" podID="eca83e14-f023-4dbd-b646-c9fc5a9e177e" containerID="629316b26a984d1db8aef6f3acb03e45b17d93b3f59eac2ee52a07abfa37eaac" exitCode=0 Mar 14 07:20:17 crc kubenswrapper[5129]: I0314 07:20:17.111005 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"eca83e14-f023-4dbd-b646-c9fc5a9e177e","Type":"ContainerDied","Data":"629316b26a984d1db8aef6f3acb03e45b17d93b3f59eac2ee52a07abfa37eaac"} Mar 14 07:20:17 crc kubenswrapper[5129]: I0314 07:20:17.117312 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7","Type":"ContainerStarted","Data":"ae523da99b25c8f299915e49323351372ae554b450e86eb9aa2d7dd42d03c9b4"} Mar 14 07:20:17 crc kubenswrapper[5129]: I0314 07:20:17.119741 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d992f450-3800-45e2-abf4-41597a15f0c3","Type":"ContainerStarted","Data":"9e3b177f1620bb347c28bfeccd9f33da12c8938fc3fa807f4c5eaa40e5094650"} Mar 14 07:20:17 crc kubenswrapper[5129]: I0314 07:20:17.178058 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.319323604 
podStartE2EDuration="16.178035012s" podCreationTimestamp="2026-03-14 07:20:01 +0000 UTC" firstStartedPulling="2026-03-14 07:20:05.273292042 +0000 UTC m=+1268.025207226" lastFinishedPulling="2026-03-14 07:20:16.13200345 +0000 UTC m=+1278.883918634" observedRunningTime="2026-03-14 07:20:17.169498559 +0000 UTC m=+1279.921413773" watchObservedRunningTime="2026-03-14 07:20:17.178035012 +0000 UTC m=+1279.929950236" Mar 14 07:20:17 crc kubenswrapper[5129]: I0314 07:20:17.191228 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.219521596 podStartE2EDuration="13.191208292s" podCreationTimestamp="2026-03-14 07:20:04 +0000 UTC" firstStartedPulling="2026-03-14 07:20:07.19245487 +0000 UTC m=+1269.944370054" lastFinishedPulling="2026-03-14 07:20:16.164141526 +0000 UTC m=+1278.916056750" observedRunningTime="2026-03-14 07:20:17.190281867 +0000 UTC m=+1279.942197111" watchObservedRunningTime="2026-03-14 07:20:17.191208292 +0000 UTC m=+1279.943123476" Mar 14 07:20:17 crc kubenswrapper[5129]: I0314 07:20:17.446769 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:17 crc kubenswrapper[5129]: I0314 07:20:17.503508 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:17 crc kubenswrapper[5129]: I0314 07:20:17.854073 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:17 crc kubenswrapper[5129]: I0314 07:20:17.854134 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:17 crc kubenswrapper[5129]: I0314 07:20:17.924782 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:17 crc kubenswrapper[5129]: I0314 07:20:17.981105 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/kube-state-metrics-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.060095 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b5741c7-590a-44e3-bdf6-b71f5f7aec49" path="/var/lib/kubelet/pods/0b5741c7-590a-44e3-bdf6-b71f5f7aec49/volumes" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.128969 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"54d01ce1-11e5-4fc7-a120-44d9d3407142","Type":"ContainerStarted","Data":"347a7b8e1e7a603f4afb767a2a29b743978ea8a3bd63e76edd3e8e66b4e8268c"} Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.131430 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"eca83e14-f023-4dbd-b646-c9fc5a9e177e","Type":"ContainerStarted","Data":"d5cc3a1871dfed563ca9fb888eec58509f6dd98c1be35ad885efedeb3d7c677f"} Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.132083 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.170788 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=17.416032286 podStartE2EDuration="24.170537704s" podCreationTimestamp="2026-03-14 07:19:54 +0000 UTC" firstStartedPulling="2026-03-14 07:20:05.047970213 +0000 UTC m=+1267.799885397" lastFinishedPulling="2026-03-14 07:20:11.802475631 +0000 UTC m=+1274.554390815" observedRunningTime="2026-03-14 07:20:18.149931682 +0000 UTC m=+1280.901846906" watchObservedRunningTime="2026-03-14 07:20:18.170537704 +0000 UTC m=+1280.922452918" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.179420 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.180122 5129 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=19.206843499 podStartE2EDuration="26.180108266s" podCreationTimestamp="2026-03-14 07:19:52 +0000 UTC" firstStartedPulling="2026-03-14 07:20:04.700654296 +0000 UTC m=+1267.452569470" lastFinishedPulling="2026-03-14 07:20:11.673919053 +0000 UTC m=+1274.425834237" observedRunningTime="2026-03-14 07:20:18.169970559 +0000 UTC m=+1280.921885763" watchObservedRunningTime="2026-03-14 07:20:18.180108266 +0000 UTC m=+1280.932023460" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.187869 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.369730 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-trjzl"] Mar 14 07:20:18 crc kubenswrapper[5129]: E0314 07:20:18.370467 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa25a41c-461a-4e47-b408-37f7a30eef64" containerName="dnsmasq-dns" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.370489 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa25a41c-461a-4e47-b408-37f7a30eef64" containerName="dnsmasq-dns" Mar 14 07:20:18 crc kubenswrapper[5129]: E0314 07:20:18.370505 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa25a41c-461a-4e47-b408-37f7a30eef64" containerName="init" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.370513 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa25a41c-461a-4e47-b408-37f7a30eef64" containerName="init" Mar 14 07:20:18 crc kubenswrapper[5129]: E0314 07:20:18.370567 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acac1a33-44b5-4200-b5c1-91a2339283b9" containerName="oc" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.370576 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="acac1a33-44b5-4200-b5c1-91a2339283b9" containerName="oc" Mar 14 07:20:18 crc 
kubenswrapper[5129]: I0314 07:20:18.370802 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa25a41c-461a-4e47-b408-37f7a30eef64" containerName="dnsmasq-dns" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.370842 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="acac1a33-44b5-4200-b5c1-91a2339283b9" containerName="oc" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.371812 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7988f9db49-trjzl" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.374536 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.394403 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-trjzl"] Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.422791 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-p2z68"] Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.423777 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.432010 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.442386 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-p2z68"] Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.498907 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-trjzl"] Mar 14 07:20:18 crc kubenswrapper[5129]: E0314 07:20:18.499577 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-s56dk ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7988f9db49-trjzl" podUID="2d8ce01f-2ddf-4640-999a-3d6c79eb66da" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.519271 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s56dk\" (UniqueName: \"kubernetes.io/projected/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-kube-api-access-s56dk\") pod \"dnsmasq-dns-7988f9db49-trjzl\" (UID: \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\") " pod="openstack/dnsmasq-dns-7988f9db49-trjzl" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.519320 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-config\") pod \"dnsmasq-dns-7988f9db49-trjzl\" (UID: \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\") " pod="openstack/dnsmasq-dns-7988f9db49-trjzl" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.519346 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/9d934f07-49c3-4356-ae16-0c35f0935625-ovs-rundir\") pod \"ovn-controller-metrics-p2z68\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.519379 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d934f07-49c3-4356-ae16-0c35f0935625-combined-ca-bundle\") pod \"ovn-controller-metrics-p2z68\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.519403 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d934f07-49c3-4356-ae16-0c35f0935625-config\") pod \"ovn-controller-metrics-p2z68\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.519429 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phz5b\" (UniqueName: \"kubernetes.io/projected/9d934f07-49c3-4356-ae16-0c35f0935625-kube-api-access-phz5b\") pod \"ovn-controller-metrics-p2z68\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.519495 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-ovsdbserver-sb\") pod \"dnsmasq-dns-7988f9db49-trjzl\" (UID: \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\") " pod="openstack/dnsmasq-dns-7988f9db49-trjzl" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.519510 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d934f07-49c3-4356-ae16-0c35f0935625-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-p2z68\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.519529 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-dns-svc\") pod \"dnsmasq-dns-7988f9db49-trjzl\" (UID: \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\") " pod="openstack/dnsmasq-dns-7988f9db49-trjzl" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.519579 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9d934f07-49c3-4356-ae16-0c35f0935625-ovn-rundir\") pod \"ovn-controller-metrics-p2z68\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.579934 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-rg5wh"] Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.583875 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.586476 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.588015 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.588414 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.598125 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.598800 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.598820 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-p4smh" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.599136 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.621865 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-ovsdbserver-sb\") pod \"dnsmasq-dns-7988f9db49-trjzl\" (UID: \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\") " pod="openstack/dnsmasq-dns-7988f9db49-trjzl" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.621907 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d934f07-49c3-4356-ae16-0c35f0935625-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-p2z68\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.621959 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-dns-svc\") pod \"dnsmasq-dns-7988f9db49-trjzl\" (UID: \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\") " pod="openstack/dnsmasq-dns-7988f9db49-trjzl" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.622030 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9d934f07-49c3-4356-ae16-0c35f0935625-ovn-rundir\") pod \"ovn-controller-metrics-p2z68\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.622058 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s56dk\" (UniqueName: \"kubernetes.io/projected/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-kube-api-access-s56dk\") pod \"dnsmasq-dns-7988f9db49-trjzl\" (UID: \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\") " pod="openstack/dnsmasq-dns-7988f9db49-trjzl" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.622102 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-config\") pod \"dnsmasq-dns-7988f9db49-trjzl\" (UID: \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\") " pod="openstack/dnsmasq-dns-7988f9db49-trjzl" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.622125 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9d934f07-49c3-4356-ae16-0c35f0935625-ovs-rundir\") pod \"ovn-controller-metrics-p2z68\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.622150 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d934f07-49c3-4356-ae16-0c35f0935625-combined-ca-bundle\") pod \"ovn-controller-metrics-p2z68\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.622194 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9d934f07-49c3-4356-ae16-0c35f0935625-config\") pod \"ovn-controller-metrics-p2z68\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.622218 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phz5b\" (UniqueName: \"kubernetes.io/projected/9d934f07-49c3-4356-ae16-0c35f0935625-kube-api-access-phz5b\") pod \"ovn-controller-metrics-p2z68\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.623043 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-ovsdbserver-sb\") pod \"dnsmasq-dns-7988f9db49-trjzl\" (UID: \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\") " pod="openstack/dnsmasq-dns-7988f9db49-trjzl" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.623682 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-config\") pod \"dnsmasq-dns-7988f9db49-trjzl\" (UID: \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\") " pod="openstack/dnsmasq-dns-7988f9db49-trjzl" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.624272 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-dns-svc\") pod \"dnsmasq-dns-7988f9db49-trjzl\" (UID: \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\") " pod="openstack/dnsmasq-dns-7988f9db49-trjzl" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.624348 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d934f07-49c3-4356-ae16-0c35f0935625-config\") pod 
\"ovn-controller-metrics-p2z68\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.624520 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9d934f07-49c3-4356-ae16-0c35f0935625-ovn-rundir\") pod \"ovn-controller-metrics-p2z68\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.624540 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9d934f07-49c3-4356-ae16-0c35f0935625-ovs-rundir\") pod \"ovn-controller-metrics-p2z68\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.624553 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-rg5wh"] Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.642195 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d934f07-49c3-4356-ae16-0c35f0935625-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-p2z68\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.646107 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d934f07-49c3-4356-ae16-0c35f0935625-combined-ca-bundle\") pod \"ovn-controller-metrics-p2z68\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.649785 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phz5b\" (UniqueName: 
\"kubernetes.io/projected/9d934f07-49c3-4356-ae16-0c35f0935625-kube-api-access-phz5b\") pod \"ovn-controller-metrics-p2z68\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.681056 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s56dk\" (UniqueName: \"kubernetes.io/projected/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-kube-api-access-s56dk\") pod \"dnsmasq-dns-7988f9db49-trjzl\" (UID: \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\") " pod="openstack/dnsmasq-dns-7988f9db49-trjzl" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.711789 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.728460 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ad7859-e34b-4393-b696-548fd7ac8e1d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.728516 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-config\") pod \"dnsmasq-dns-5d944d7b75-rg5wh\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.728536 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ad7859-e34b-4393-b696-548fd7ac8e1d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.728556 5129 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-rg5wh\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.728578 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/72ad7859-e34b-4393-b696-548fd7ac8e1d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.728616 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-rg5wh\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.728651 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-rg5wh\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.728674 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv4l6\" (UniqueName: \"kubernetes.io/projected/72ad7859-e34b-4393-b696-548fd7ac8e1d-kube-api-access-qv4l6\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.728712 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72ad7859-e34b-4393-b696-548fd7ac8e1d-scripts\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.728756 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72ad7859-e34b-4393-b696-548fd7ac8e1d-config\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.728776 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwf85\" (UniqueName: \"kubernetes.io/projected/b1d4c67b-973d-44fd-9da3-8752c61ac482-kube-api-access-pwf85\") pod \"dnsmasq-dns-5d944d7b75-rg5wh\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.728805 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ad7859-e34b-4393-b696-548fd7ac8e1d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.747200 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.830686 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-rg5wh\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.831078 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv4l6\" (UniqueName: \"kubernetes.io/projected/72ad7859-e34b-4393-b696-548fd7ac8e1d-kube-api-access-qv4l6\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.831120 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72ad7859-e34b-4393-b696-548fd7ac8e1d-scripts\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.831175 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72ad7859-e34b-4393-b696-548fd7ac8e1d-config\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.831204 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwf85\" (UniqueName: \"kubernetes.io/projected/b1d4c67b-973d-44fd-9da3-8752c61ac482-kube-api-access-pwf85\") pod \"dnsmasq-dns-5d944d7b75-rg5wh\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.831243 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ad7859-e34b-4393-b696-548fd7ac8e1d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.831295 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ad7859-e34b-4393-b696-548fd7ac8e1d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.831319 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-config\") pod \"dnsmasq-dns-5d944d7b75-rg5wh\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.831341 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ad7859-e34b-4393-b696-548fd7ac8e1d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.831367 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-rg5wh\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.831395 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/72ad7859-e34b-4393-b696-548fd7ac8e1d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.831423 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-rg5wh\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.832810 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72ad7859-e34b-4393-b696-548fd7ac8e1d-config\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.832915 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-config\") pod \"dnsmasq-dns-5d944d7b75-rg5wh\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.833096 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-rg5wh\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.833145 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/72ad7859-e34b-4393-b696-548fd7ac8e1d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc 
kubenswrapper[5129]: I0314 07:20:18.833228 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-rg5wh\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.833266 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72ad7859-e34b-4393-b696-548fd7ac8e1d-scripts\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.833515 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-rg5wh\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.835547 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ad7859-e34b-4393-b696-548fd7ac8e1d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.839031 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ad7859-e34b-4393-b696-548fd7ac8e1d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.839105 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/72ad7859-e34b-4393-b696-548fd7ac8e1d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.851030 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwf85\" (UniqueName: \"kubernetes.io/projected/b1d4c67b-973d-44fd-9da3-8752c61ac482-kube-api-access-pwf85\") pod \"dnsmasq-dns-5d944d7b75-rg5wh\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.880945 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv4l6\" (UniqueName: \"kubernetes.io/projected/72ad7859-e34b-4393-b696-548fd7ac8e1d-kube-api-access-qv4l6\") pod \"ovn-northd-0\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " pod="openstack/ovn-northd-0" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.911034 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:18 crc kubenswrapper[5129]: I0314 07:20:18.926731 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 14 07:20:19 crc kubenswrapper[5129]: I0314 07:20:19.141138 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7988f9db49-trjzl" Mar 14 07:20:19 crc kubenswrapper[5129]: I0314 07:20:19.151223 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7988f9db49-trjzl" Mar 14 07:20:19 crc kubenswrapper[5129]: I0314 07:20:19.237213 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-config\") pod \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\" (UID: \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\") " Mar 14 07:20:19 crc kubenswrapper[5129]: I0314 07:20:19.237450 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-dns-svc\") pod \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\" (UID: \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\") " Mar 14 07:20:19 crc kubenswrapper[5129]: I0314 07:20:19.237511 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-ovsdbserver-sb\") pod \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\" (UID: \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\") " Mar 14 07:20:19 crc kubenswrapper[5129]: I0314 07:20:19.237672 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s56dk\" (UniqueName: \"kubernetes.io/projected/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-kube-api-access-s56dk\") pod \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\" (UID: \"2d8ce01f-2ddf-4640-999a-3d6c79eb66da\") " Mar 14 07:20:19 crc kubenswrapper[5129]: I0314 07:20:19.239158 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2d8ce01f-2ddf-4640-999a-3d6c79eb66da" (UID: "2d8ce01f-2ddf-4640-999a-3d6c79eb66da"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:19 crc kubenswrapper[5129]: I0314 07:20:19.239515 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-config" (OuterVolumeSpecName: "config") pod "2d8ce01f-2ddf-4640-999a-3d6c79eb66da" (UID: "2d8ce01f-2ddf-4640-999a-3d6c79eb66da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:19 crc kubenswrapper[5129]: I0314 07:20:19.240134 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2d8ce01f-2ddf-4640-999a-3d6c79eb66da" (UID: "2d8ce01f-2ddf-4640-999a-3d6c79eb66da"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:19 crc kubenswrapper[5129]: I0314 07:20:19.241892 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:19 crc kubenswrapper[5129]: I0314 07:20:19.241960 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:19 crc kubenswrapper[5129]: I0314 07:20:19.241973 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:19 crc kubenswrapper[5129]: I0314 07:20:19.243326 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-kube-api-access-s56dk" (OuterVolumeSpecName: "kube-api-access-s56dk") pod "2d8ce01f-2ddf-4640-999a-3d6c79eb66da" (UID: 
"2d8ce01f-2ddf-4640-999a-3d6c79eb66da"). InnerVolumeSpecName "kube-api-access-s56dk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:19 crc kubenswrapper[5129]: I0314 07:20:19.317961 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-p2z68"] Mar 14 07:20:19 crc kubenswrapper[5129]: I0314 07:20:19.324879 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 14 07:20:19 crc kubenswrapper[5129]: I0314 07:20:19.343155 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s56dk\" (UniqueName: \"kubernetes.io/projected/2d8ce01f-2ddf-4640-999a-3d6c79eb66da-kube-api-access-s56dk\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:19 crc kubenswrapper[5129]: I0314 07:20:19.443669 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-rg5wh"] Mar 14 07:20:19 crc kubenswrapper[5129]: W0314 07:20:19.448580 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1d4c67b_973d_44fd_9da3_8752c61ac482.slice/crio-b8de59cf531c4f1cbd587a162820b60fe4411105bb5d5e3455a3623c5041f825 WatchSource:0}: Error finding container b8de59cf531c4f1cbd587a162820b60fe4411105bb5d5e3455a3623c5041f825: Status 404 returned error can't find the container with id b8de59cf531c4f1cbd587a162820b60fe4411105bb5d5e3455a3623c5041f825 Mar 14 07:20:20 crc kubenswrapper[5129]: I0314 07:20:20.152478 5129 generic.go:334] "Generic (PLEG): container finished" podID="b1d4c67b-973d-44fd-9da3-8752c61ac482" containerID="83c0dc244bf1a008bf5058a0d244744eb013628d4c8cf0d0b5a110e8bff34c62" exitCode=0 Mar 14 07:20:20 crc kubenswrapper[5129]: I0314 07:20:20.152817 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" 
event={"ID":"b1d4c67b-973d-44fd-9da3-8752c61ac482","Type":"ContainerDied","Data":"83c0dc244bf1a008bf5058a0d244744eb013628d4c8cf0d0b5a110e8bff34c62"} Mar 14 07:20:20 crc kubenswrapper[5129]: I0314 07:20:20.152848 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" event={"ID":"b1d4c67b-973d-44fd-9da3-8752c61ac482","Type":"ContainerStarted","Data":"b8de59cf531c4f1cbd587a162820b60fe4411105bb5d5e3455a3623c5041f825"} Mar 14 07:20:20 crc kubenswrapper[5129]: I0314 07:20:20.157552 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p2z68" event={"ID":"9d934f07-49c3-4356-ae16-0c35f0935625","Type":"ContainerStarted","Data":"252a4f363d08c5b6cdff2d3573a9472b02174c2666aea18e478d986cc1704a09"} Mar 14 07:20:20 crc kubenswrapper[5129]: I0314 07:20:20.157584 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p2z68" event={"ID":"9d934f07-49c3-4356-ae16-0c35f0935625","Type":"ContainerStarted","Data":"c9654dbe22fe7af31a336802f519f1aa82324ec7af4b67905af69fa1889657f2"} Mar 14 07:20:20 crc kubenswrapper[5129]: I0314 07:20:20.166871 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7988f9db49-trjzl" Mar 14 07:20:20 crc kubenswrapper[5129]: I0314 07:20:20.166868 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"72ad7859-e34b-4393-b696-548fd7ac8e1d","Type":"ContainerStarted","Data":"d5ea91c26a8f8e3e2cdddf83fca8f1d7c2ee1497148b94cdde0cc7199396cb35"} Mar 14 07:20:20 crc kubenswrapper[5129]: I0314 07:20:20.200165 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-p2z68" podStartSLOduration=2.200142656 podStartE2EDuration="2.200142656s" podCreationTimestamp="2026-03-14 07:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:20:20.195622322 +0000 UTC m=+1282.947537536" watchObservedRunningTime="2026-03-14 07:20:20.200142656 +0000 UTC m=+1282.952057840" Mar 14 07:20:20 crc kubenswrapper[5129]: I0314 07:20:20.366695 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-trjzl"] Mar 14 07:20:20 crc kubenswrapper[5129]: I0314 07:20:20.373908 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-trjzl"] Mar 14 07:20:21 crc kubenswrapper[5129]: I0314 07:20:21.108011 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 14 07:20:21 crc kubenswrapper[5129]: I0314 07:20:21.187132 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" event={"ID":"b1d4c67b-973d-44fd-9da3-8752c61ac482","Type":"ContainerStarted","Data":"f8f371762fe7278072fb2755d3e60c04930e3dc686a5e4d293c407946595bc92"} Mar 14 07:20:21 crc kubenswrapper[5129]: I0314 07:20:21.188086 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:21 crc kubenswrapper[5129]: I0314 07:20:21.195259 5129 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"72ad7859-e34b-4393-b696-548fd7ac8e1d","Type":"ContainerStarted","Data":"c43534dc2a23fd4cd80aec67aba6aa51e1ac7a2b3c68810b70ff6a152f4d2e66"} Mar 14 07:20:21 crc kubenswrapper[5129]: I0314 07:20:21.195313 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"72ad7859-e34b-4393-b696-548fd7ac8e1d","Type":"ContainerStarted","Data":"d0cdce7e2d0eb6e6c377ac56ed26b1e0c8ae3d516df683f25a4cfa59b208e395"} Mar 14 07:20:21 crc kubenswrapper[5129]: I0314 07:20:21.220099 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" podStartSLOduration=3.220068356 podStartE2EDuration="3.220068356s" podCreationTimestamp="2026-03-14 07:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:20:21.218437631 +0000 UTC m=+1283.970352825" watchObservedRunningTime="2026-03-14 07:20:21.220068356 +0000 UTC m=+1283.971983580" Mar 14 07:20:21 crc kubenswrapper[5129]: I0314 07:20:21.245871 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.416359795 podStartE2EDuration="3.245849929s" podCreationTimestamp="2026-03-14 07:20:18 +0000 UTC" firstStartedPulling="2026-03-14 07:20:19.33173986 +0000 UTC m=+1282.083655044" lastFinishedPulling="2026-03-14 07:20:20.161229994 +0000 UTC m=+1282.913145178" observedRunningTime="2026-03-14 07:20:21.241171093 +0000 UTC m=+1283.993086277" watchObservedRunningTime="2026-03-14 07:20:21.245849929 +0000 UTC m=+1283.997765113" Mar 14 07:20:22 crc kubenswrapper[5129]: I0314 07:20:22.044169 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d8ce01f-2ddf-4640-999a-3d6c79eb66da" path="/var/lib/kubelet/pods/2d8ce01f-2ddf-4640-999a-3d6c79eb66da/volumes" Mar 14 07:20:22 crc kubenswrapper[5129]: 
I0314 07:20:22.200371 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 14 07:20:24 crc kubenswrapper[5129]: I0314 07:20:24.255409 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 14 07:20:24 crc kubenswrapper[5129]: I0314 07:20:24.255833 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 14 07:20:25 crc kubenswrapper[5129]: I0314 07:20:25.600900 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 14 07:20:25 crc kubenswrapper[5129]: I0314 07:20:25.601229 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 14 07:20:25 crc kubenswrapper[5129]: I0314 07:20:25.684349 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 14 07:20:26 crc kubenswrapper[5129]: I0314 07:20:26.322146 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 14 07:20:27 crc kubenswrapper[5129]: I0314 07:20:27.965565 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-rg5wh"] Mar 14 07:20:27 crc kubenswrapper[5129]: I0314 07:20:27.966145 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" podUID="b1d4c67b-973d-44fd-9da3-8752c61ac482" containerName="dnsmasq-dns" containerID="cri-o://f8f371762fe7278072fb2755d3e60c04930e3dc686a5e4d293c407946595bc92" gracePeriod=10 Mar 14 07:20:27 crc kubenswrapper[5129]: I0314 07:20:27.967526 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:27 crc kubenswrapper[5129]: I0314 07:20:27.991982 5129 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-7b9fd7d84c-2bjf4"] Mar 14 07:20:27 crc kubenswrapper[5129]: I0314 07:20:27.993251 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.019831 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-2bjf4"] Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.098009 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-config\") pod \"dnsmasq-dns-7b9fd7d84c-2bjf4\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.098073 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph762\" (UniqueName: \"kubernetes.io/projected/baf0d43c-9c51-470c-9863-406b5a2a523b-kube-api-access-ph762\") pod \"dnsmasq-dns-7b9fd7d84c-2bjf4\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.098111 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-2bjf4\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.098173 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-2bjf4\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " 
pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.098206 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-2bjf4\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.199634 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph762\" (UniqueName: \"kubernetes.io/projected/baf0d43c-9c51-470c-9863-406b5a2a523b-kube-api-access-ph762\") pod \"dnsmasq-dns-7b9fd7d84c-2bjf4\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.199803 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-2bjf4\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.200002 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-2bjf4\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.200086 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-2bjf4\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 
14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.200157 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-config\") pod \"dnsmasq-dns-7b9fd7d84c-2bjf4\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.200918 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-2bjf4\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.200942 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-config\") pod \"dnsmasq-dns-7b9fd7d84c-2bjf4\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.201009 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-2bjf4\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.201157 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-2bjf4\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.221467 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ph762\" (UniqueName: \"kubernetes.io/projected/baf0d43c-9c51-470c-9863-406b5a2a523b-kube-api-access-ph762\") pod \"dnsmasq-dns-7b9fd7d84c-2bjf4\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.245779 5129 generic.go:334] "Generic (PLEG): container finished" podID="b1d4c67b-973d-44fd-9da3-8752c61ac482" containerID="f8f371762fe7278072fb2755d3e60c04930e3dc686a5e4d293c407946595bc92" exitCode=0 Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.245856 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" event={"ID":"b1d4c67b-973d-44fd-9da3-8752c61ac482","Type":"ContainerDied","Data":"f8f371762fe7278072fb2755d3e60c04930e3dc686a5e4d293c407946595bc92"} Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.318964 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.835510 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-2bjf4"] Mar 14 07:20:28 crc kubenswrapper[5129]: I0314 07:20:28.936527 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" podUID="b1d4c67b-973d-44fd-9da3-8752c61ac482" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.129008 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.136552 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.138838 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.139125 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.139374 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-g7j4t" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.139563 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.158113 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.218409 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-cache\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.218467 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42ppz\" (UniqueName: \"kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-kube-api-access-42ppz\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.218527 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-lock\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 
14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.218568 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.218620 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.218827 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.253826 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" event={"ID":"baf0d43c-9c51-470c-9863-406b5a2a523b","Type":"ContainerStarted","Data":"042644784f815af94ab0940d761153215bad47eaf29f3a3a872e611049a5db0d"} Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.320354 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.320502 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-cache\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.320545 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42ppz\" (UniqueName: \"kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-kube-api-access-42ppz\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: E0314 07:20:29.320544 5129 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 07:20:29 crc kubenswrapper[5129]: E0314 07:20:29.320644 5129 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.320657 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-lock\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: E0314 07:20:29.320697 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift podName:2c0c6778-6f35-4243-9e82-ca3c8f3968fc nodeName:}" failed. No retries permitted until 2026-03-14 07:20:29.820680776 +0000 UTC m=+1292.572595960 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift") pod "swift-storage-0" (UID: "2c0c6778-6f35-4243-9e82-ca3c8f3968fc") : configmap "swift-ring-files" not found Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.320726 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.320777 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.321118 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-lock\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.321236 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.321312 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-cache\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " 
pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.330843 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.340919 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42ppz\" (UniqueName: \"kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-kube-api-access-42ppz\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.352319 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.479016 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-66xz6"] Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.480152 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.481833 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.482736 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.482798 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.496273 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-66xz6"] Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.523659 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/babd1361-ef0f-4921-905f-cbb1af24beea-combined-ca-bundle\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.523742 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/babd1361-ef0f-4921-905f-cbb1af24beea-ring-data-devices\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.523769 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/babd1361-ef0f-4921-905f-cbb1af24beea-dispersionconf\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: 
I0314 07:20:29.523807 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/babd1361-ef0f-4921-905f-cbb1af24beea-etc-swift\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.523848 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmsz2\" (UniqueName: \"kubernetes.io/projected/babd1361-ef0f-4921-905f-cbb1af24beea-kube-api-access-hmsz2\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.523882 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/babd1361-ef0f-4921-905f-cbb1af24beea-swiftconf\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.523982 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/babd1361-ef0f-4921-905f-cbb1af24beea-scripts\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.625343 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/babd1361-ef0f-4921-905f-cbb1af24beea-combined-ca-bundle\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 
07:20:29.625702 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/babd1361-ef0f-4921-905f-cbb1af24beea-ring-data-devices\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.625720 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/babd1361-ef0f-4921-905f-cbb1af24beea-dispersionconf\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.625749 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/babd1361-ef0f-4921-905f-cbb1af24beea-etc-swift\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.625779 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmsz2\" (UniqueName: \"kubernetes.io/projected/babd1361-ef0f-4921-905f-cbb1af24beea-kube-api-access-hmsz2\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.625809 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/babd1361-ef0f-4921-905f-cbb1af24beea-swiftconf\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.625870 5129 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/babd1361-ef0f-4921-905f-cbb1af24beea-scripts\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.626882 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/babd1361-ef0f-4921-905f-cbb1af24beea-scripts\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.627159 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/babd1361-ef0f-4921-905f-cbb1af24beea-ring-data-devices\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.627783 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/babd1361-ef0f-4921-905f-cbb1af24beea-etc-swift\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.631384 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/babd1361-ef0f-4921-905f-cbb1af24beea-dispersionconf\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.632291 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/babd1361-ef0f-4921-905f-cbb1af24beea-swiftconf\") pod 
\"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.632531 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/babd1361-ef0f-4921-905f-cbb1af24beea-combined-ca-bundle\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.657250 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmsz2\" (UniqueName: \"kubernetes.io/projected/babd1361-ef0f-4921-905f-cbb1af24beea-kube-api-access-hmsz2\") pod \"swift-ring-rebalance-66xz6\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.743048 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.808733 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.827940 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-ovsdbserver-sb\") pod \"b1d4c67b-973d-44fd-9da3-8752c61ac482\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.827992 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwf85\" (UniqueName: \"kubernetes.io/projected/b1d4c67b-973d-44fd-9da3-8752c61ac482-kube-api-access-pwf85\") pod \"b1d4c67b-973d-44fd-9da3-8752c61ac482\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.828066 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-config\") pod \"b1d4c67b-973d-44fd-9da3-8752c61ac482\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.828130 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-dns-svc\") pod \"b1d4c67b-973d-44fd-9da3-8752c61ac482\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.828169 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-ovsdbserver-nb\") pod \"b1d4c67b-973d-44fd-9da3-8752c61ac482\" (UID: \"b1d4c67b-973d-44fd-9da3-8752c61ac482\") " Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.828596 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:29 crc kubenswrapper[5129]: E0314 07:20:29.828801 5129 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 07:20:29 crc kubenswrapper[5129]: E0314 07:20:29.828828 5129 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 07:20:29 crc kubenswrapper[5129]: E0314 07:20:29.828982 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift podName:2c0c6778-6f35-4243-9e82-ca3c8f3968fc nodeName:}" failed. No retries permitted until 2026-03-14 07:20:30.828963324 +0000 UTC m=+1293.580878508 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift") pod "swift-storage-0" (UID: "2c0c6778-6f35-4243-9e82-ca3c8f3968fc") : configmap "swift-ring-files" not found Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.833484 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d4c67b-973d-44fd-9da3-8752c61ac482-kube-api-access-pwf85" (OuterVolumeSpecName: "kube-api-access-pwf85") pod "b1d4c67b-973d-44fd-9da3-8752c61ac482" (UID: "b1d4c67b-973d-44fd-9da3-8752c61ac482"). InnerVolumeSpecName "kube-api-access-pwf85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.872095 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b1d4c67b-973d-44fd-9da3-8752c61ac482" (UID: "b1d4c67b-973d-44fd-9da3-8752c61ac482"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.881473 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-config" (OuterVolumeSpecName: "config") pod "b1d4c67b-973d-44fd-9da3-8752c61ac482" (UID: "b1d4c67b-973d-44fd-9da3-8752c61ac482"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.885435 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b1d4c67b-973d-44fd-9da3-8752c61ac482" (UID: "b1d4c67b-973d-44fd-9da3-8752c61ac482"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.892794 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b1d4c67b-973d-44fd-9da3-8752c61ac482" (UID: "b1d4c67b-973d-44fd-9da3-8752c61ac482"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.929788 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.930023 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.930034 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.930043 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwf85\" (UniqueName: \"kubernetes.io/projected/b1d4c67b-973d-44fd-9da3-8752c61ac482-kube-api-access-pwf85\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:29 crc kubenswrapper[5129]: I0314 07:20:29.930054 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d4c67b-973d-44fd-9da3-8752c61ac482-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:30 crc kubenswrapper[5129]: I0314 07:20:30.231006 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-66xz6"] Mar 14 07:20:30 crc kubenswrapper[5129]: I0314 07:20:30.264219 5129 generic.go:334] "Generic (PLEG): container finished" podID="baf0d43c-9c51-470c-9863-406b5a2a523b" containerID="f7d6ee3f639fd61f9fb30fbfb9f3da3ff3a7d2253225681de5df251f37862907" exitCode=0 Mar 14 07:20:30 crc kubenswrapper[5129]: I0314 07:20:30.264304 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" 
event={"ID":"baf0d43c-9c51-470c-9863-406b5a2a523b","Type":"ContainerDied","Data":"f7d6ee3f639fd61f9fb30fbfb9f3da3ff3a7d2253225681de5df251f37862907"} Mar 14 07:20:30 crc kubenswrapper[5129]: W0314 07:20:30.265285 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbabd1361_ef0f_4921_905f_cbb1af24beea.slice/crio-d5df789972239db935363b82381ca252c4ead2710517cc58af9f3e637f9e2d04 WatchSource:0}: Error finding container d5df789972239db935363b82381ca252c4ead2710517cc58af9f3e637f9e2d04: Status 404 returned error can't find the container with id d5df789972239db935363b82381ca252c4ead2710517cc58af9f3e637f9e2d04 Mar 14 07:20:30 crc kubenswrapper[5129]: I0314 07:20:30.268297 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" event={"ID":"b1d4c67b-973d-44fd-9da3-8752c61ac482","Type":"ContainerDied","Data":"b8de59cf531c4f1cbd587a162820b60fe4411105bb5d5e3455a3623c5041f825"} Mar 14 07:20:30 crc kubenswrapper[5129]: I0314 07:20:30.268341 5129 scope.go:117] "RemoveContainer" containerID="f8f371762fe7278072fb2755d3e60c04930e3dc686a5e4d293c407946595bc92" Mar 14 07:20:30 crc kubenswrapper[5129]: I0314 07:20:30.268403 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-rg5wh" Mar 14 07:20:30 crc kubenswrapper[5129]: I0314 07:20:30.304361 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-rg5wh"] Mar 14 07:20:30 crc kubenswrapper[5129]: I0314 07:20:30.314390 5129 scope.go:117] "RemoveContainer" containerID="83c0dc244bf1a008bf5058a0d244744eb013628d4c8cf0d0b5a110e8bff34c62" Mar 14 07:20:30 crc kubenswrapper[5129]: I0314 07:20:30.321560 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-rg5wh"] Mar 14 07:20:30 crc kubenswrapper[5129]: I0314 07:20:30.848804 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:30 crc kubenswrapper[5129]: E0314 07:20:30.849068 5129 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 07:20:30 crc kubenswrapper[5129]: E0314 07:20:30.849312 5129 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 07:20:30 crc kubenswrapper[5129]: E0314 07:20:30.849395 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift podName:2c0c6778-6f35-4243-9e82-ca3c8f3968fc nodeName:}" failed. No retries permitted until 2026-03-14 07:20:32.849369708 +0000 UTC m=+1295.601284902 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift") pod "swift-storage-0" (UID: "2c0c6778-6f35-4243-9e82-ca3c8f3968fc") : configmap "swift-ring-files" not found Mar 14 07:20:30 crc kubenswrapper[5129]: I0314 07:20:30.991205 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.121223 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.279372 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-66xz6" event={"ID":"babd1361-ef0f-4921-905f-cbb1af24beea","Type":"ContainerStarted","Data":"d5df789972239db935363b82381ca252c4ead2710517cc58af9f3e637f9e2d04"} Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.526704 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-nhsz7"] Mar 14 07:20:31 crc kubenswrapper[5129]: E0314 07:20:31.527102 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d4c67b-973d-44fd-9da3-8752c61ac482" containerName="dnsmasq-dns" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.527121 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d4c67b-973d-44fd-9da3-8752c61ac482" containerName="dnsmasq-dns" Mar 14 07:20:31 crc kubenswrapper[5129]: E0314 07:20:31.527167 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d4c67b-973d-44fd-9da3-8752c61ac482" containerName="init" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.527175 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d4c67b-973d-44fd-9da3-8752c61ac482" containerName="init" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.527350 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1d4c67b-973d-44fd-9da3-8752c61ac482" 
containerName="dnsmasq-dns" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.528062 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nhsz7" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.534401 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f3f5-account-create-update-t48fl"] Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.547049 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nhsz7"] Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.547159 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f3f5-account-create-update-t48fl" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.549304 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.553912 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f3f5-account-create-update-t48fl"] Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.570284 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmfgn\" (UniqueName: \"kubernetes.io/projected/4545cab3-7bdb-4e55-95b6-cbb057d2bbf9-kube-api-access-cmfgn\") pod \"glance-db-create-nhsz7\" (UID: \"4545cab3-7bdb-4e55-95b6-cbb057d2bbf9\") " pod="openstack/glance-db-create-nhsz7" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.570472 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4545cab3-7bdb-4e55-95b6-cbb057d2bbf9-operator-scripts\") pod \"glance-db-create-nhsz7\" (UID: \"4545cab3-7bdb-4e55-95b6-cbb057d2bbf9\") " pod="openstack/glance-db-create-nhsz7" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.672751 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d80f5-edb1-4774-8072-6af87a16888f-operator-scripts\") pod \"glance-f3f5-account-create-update-t48fl\" (UID: \"ba7d80f5-edb1-4774-8072-6af87a16888f\") " pod="openstack/glance-f3f5-account-create-update-t48fl" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.673120 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4545cab3-7bdb-4e55-95b6-cbb057d2bbf9-operator-scripts\") pod \"glance-db-create-nhsz7\" (UID: \"4545cab3-7bdb-4e55-95b6-cbb057d2bbf9\") " pod="openstack/glance-db-create-nhsz7" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.673193 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx9jg\" (UniqueName: \"kubernetes.io/projected/ba7d80f5-edb1-4774-8072-6af87a16888f-kube-api-access-kx9jg\") pod \"glance-f3f5-account-create-update-t48fl\" (UID: \"ba7d80f5-edb1-4774-8072-6af87a16888f\") " pod="openstack/glance-f3f5-account-create-update-t48fl" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.673234 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmfgn\" (UniqueName: \"kubernetes.io/projected/4545cab3-7bdb-4e55-95b6-cbb057d2bbf9-kube-api-access-cmfgn\") pod \"glance-db-create-nhsz7\" (UID: \"4545cab3-7bdb-4e55-95b6-cbb057d2bbf9\") " pod="openstack/glance-db-create-nhsz7" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.674261 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4545cab3-7bdb-4e55-95b6-cbb057d2bbf9-operator-scripts\") pod \"glance-db-create-nhsz7\" (UID: \"4545cab3-7bdb-4e55-95b6-cbb057d2bbf9\") " pod="openstack/glance-db-create-nhsz7" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.706696 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmfgn\" (UniqueName: \"kubernetes.io/projected/4545cab3-7bdb-4e55-95b6-cbb057d2bbf9-kube-api-access-cmfgn\") pod \"glance-db-create-nhsz7\" (UID: \"4545cab3-7bdb-4e55-95b6-cbb057d2bbf9\") " pod="openstack/glance-db-create-nhsz7" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.774733 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx9jg\" (UniqueName: \"kubernetes.io/projected/ba7d80f5-edb1-4774-8072-6af87a16888f-kube-api-access-kx9jg\") pod \"glance-f3f5-account-create-update-t48fl\" (UID: \"ba7d80f5-edb1-4774-8072-6af87a16888f\") " pod="openstack/glance-f3f5-account-create-update-t48fl" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.774862 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d80f5-edb1-4774-8072-6af87a16888f-operator-scripts\") pod \"glance-f3f5-account-create-update-t48fl\" (UID: \"ba7d80f5-edb1-4774-8072-6af87a16888f\") " pod="openstack/glance-f3f5-account-create-update-t48fl" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.775530 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d80f5-edb1-4774-8072-6af87a16888f-operator-scripts\") pod \"glance-f3f5-account-create-update-t48fl\" (UID: \"ba7d80f5-edb1-4774-8072-6af87a16888f\") " pod="openstack/glance-f3f5-account-create-update-t48fl" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.792170 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx9jg\" (UniqueName: \"kubernetes.io/projected/ba7d80f5-edb1-4774-8072-6af87a16888f-kube-api-access-kx9jg\") pod \"glance-f3f5-account-create-update-t48fl\" (UID: \"ba7d80f5-edb1-4774-8072-6af87a16888f\") " pod="openstack/glance-f3f5-account-create-update-t48fl" Mar 14 07:20:31 crc 
kubenswrapper[5129]: I0314 07:20:31.848147 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nhsz7" Mar 14 07:20:31 crc kubenswrapper[5129]: I0314 07:20:31.861699 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f3f5-account-create-update-t48fl" Mar 14 07:20:32 crc kubenswrapper[5129]: I0314 07:20:32.050204 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1d4c67b-973d-44fd-9da3-8752c61ac482" path="/var/lib/kubelet/pods/b1d4c67b-973d-44fd-9da3-8752c61ac482/volumes" Mar 14 07:20:32 crc kubenswrapper[5129]: I0314 07:20:32.291025 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" event={"ID":"baf0d43c-9c51-470c-9863-406b5a2a523b","Type":"ContainerStarted","Data":"94622b0d2d04ac62d54bff88e740d79ee2b30b8094d80870f9821e12c72f54b6"} Mar 14 07:20:32 crc kubenswrapper[5129]: I0314 07:20:32.291289 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:20:32 crc kubenswrapper[5129]: I0314 07:20:32.326717 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" podStartSLOduration=5.32669586 podStartE2EDuration="5.32669586s" podCreationTimestamp="2026-03-14 07:20:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:20:32.311912136 +0000 UTC m=+1295.063827330" watchObservedRunningTime="2026-03-14 07:20:32.32669586 +0000 UTC m=+1295.078611064" Mar 14 07:20:32 crc kubenswrapper[5129]: I0314 07:20:32.365553 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nhsz7"] Mar 14 07:20:32 crc kubenswrapper[5129]: I0314 07:20:32.374958 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f3f5-account-create-update-t48fl"] Mar 
14 07:20:32 crc kubenswrapper[5129]: I0314 07:20:32.894994 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:32 crc kubenswrapper[5129]: E0314 07:20:32.895226 5129 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 07:20:32 crc kubenswrapper[5129]: E0314 07:20:32.895501 5129 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 07:20:32 crc kubenswrapper[5129]: E0314 07:20:32.895567 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift podName:2c0c6778-6f35-4243-9e82-ca3c8f3968fc nodeName:}" failed. No retries permitted until 2026-03-14 07:20:36.895544802 +0000 UTC m=+1299.647459986 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift") pod "swift-storage-0" (UID: "2c0c6778-6f35-4243-9e82-ca3c8f3968fc") : configmap "swift-ring-files" not found Mar 14 07:20:32 crc kubenswrapper[5129]: I0314 07:20:32.897001 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-wmhsw"] Mar 14 07:20:32 crc kubenswrapper[5129]: I0314 07:20:32.897920 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-wmhsw" Mar 14 07:20:32 crc kubenswrapper[5129]: I0314 07:20:32.900270 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 14 07:20:32 crc kubenswrapper[5129]: I0314 07:20:32.915023 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wmhsw"] Mar 14 07:20:32 crc kubenswrapper[5129]: I0314 07:20:32.996891 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl6h4\" (UniqueName: \"kubernetes.io/projected/1d57f145-47a6-44dd-9e6a-f2574f3d589f-kube-api-access-nl6h4\") pod \"root-account-create-update-wmhsw\" (UID: \"1d57f145-47a6-44dd-9e6a-f2574f3d589f\") " pod="openstack/root-account-create-update-wmhsw" Mar 14 07:20:32 crc kubenswrapper[5129]: I0314 07:20:32.996938 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d57f145-47a6-44dd-9e6a-f2574f3d589f-operator-scripts\") pod \"root-account-create-update-wmhsw\" (UID: \"1d57f145-47a6-44dd-9e6a-f2574f3d589f\") " pod="openstack/root-account-create-update-wmhsw" Mar 14 07:20:33 crc kubenswrapper[5129]: I0314 07:20:33.099009 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl6h4\" (UniqueName: \"kubernetes.io/projected/1d57f145-47a6-44dd-9e6a-f2574f3d589f-kube-api-access-nl6h4\") pod \"root-account-create-update-wmhsw\" (UID: \"1d57f145-47a6-44dd-9e6a-f2574f3d589f\") " pod="openstack/root-account-create-update-wmhsw" Mar 14 07:20:33 crc kubenswrapper[5129]: I0314 07:20:33.099073 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d57f145-47a6-44dd-9e6a-f2574f3d589f-operator-scripts\") pod \"root-account-create-update-wmhsw\" (UID: 
\"1d57f145-47a6-44dd-9e6a-f2574f3d589f\") " pod="openstack/root-account-create-update-wmhsw" Mar 14 07:20:33 crc kubenswrapper[5129]: I0314 07:20:33.100356 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d57f145-47a6-44dd-9e6a-f2574f3d589f-operator-scripts\") pod \"root-account-create-update-wmhsw\" (UID: \"1d57f145-47a6-44dd-9e6a-f2574f3d589f\") " pod="openstack/root-account-create-update-wmhsw" Mar 14 07:20:33 crc kubenswrapper[5129]: I0314 07:20:33.122654 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl6h4\" (UniqueName: \"kubernetes.io/projected/1d57f145-47a6-44dd-9e6a-f2574f3d589f-kube-api-access-nl6h4\") pod \"root-account-create-update-wmhsw\" (UID: \"1d57f145-47a6-44dd-9e6a-f2574f3d589f\") " pod="openstack/root-account-create-update-wmhsw" Mar 14 07:20:33 crc kubenswrapper[5129]: I0314 07:20:33.258771 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-wmhsw" Mar 14 07:20:33 crc kubenswrapper[5129]: I0314 07:20:33.298882 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f3f5-account-create-update-t48fl" event={"ID":"ba7d80f5-edb1-4774-8072-6af87a16888f","Type":"ContainerStarted","Data":"fd0b708c36292614c9ec32553528882e75998aa088ebf0dafcac02f2f58f98b7"} Mar 14 07:20:34 crc kubenswrapper[5129]: W0314 07:20:34.077745 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4545cab3_7bdb_4e55_95b6_cbb057d2bbf9.slice/crio-6082f20dffb1f335aff049b7e7f3cd7a41039d4dcde2ce43958a41801a7a0d6b WatchSource:0}: Error finding container 6082f20dffb1f335aff049b7e7f3cd7a41039d4dcde2ce43958a41801a7a0d6b: Status 404 returned error can't find the container with id 6082f20dffb1f335aff049b7e7f3cd7a41039d4dcde2ce43958a41801a7a0d6b Mar 14 07:20:34 crc kubenswrapper[5129]: I0314 07:20:34.308005 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nhsz7" event={"ID":"4545cab3-7bdb-4e55-95b6-cbb057d2bbf9","Type":"ContainerStarted","Data":"6082f20dffb1f335aff049b7e7f3cd7a41039d4dcde2ce43958a41801a7a0d6b"} Mar 14 07:20:34 crc kubenswrapper[5129]: I0314 07:20:34.519359 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wmhsw"] Mar 14 07:20:34 crc kubenswrapper[5129]: W0314 07:20:34.530353 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d57f145_47a6_44dd_9e6a_f2574f3d589f.slice/crio-6e694e628e25dec0d5047cc24e10b4b9a6f3ba4126973e9add8bb474575d3aaa WatchSource:0}: Error finding container 6e694e628e25dec0d5047cc24e10b4b9a6f3ba4126973e9add8bb474575d3aaa: Status 404 returned error can't find the container with id 6e694e628e25dec0d5047cc24e10b4b9a6f3ba4126973e9add8bb474575d3aaa Mar 14 07:20:35 crc kubenswrapper[5129]: 
I0314 07:20:35.317830 5129 generic.go:334] "Generic (PLEG): container finished" podID="1d57f145-47a6-44dd-9e6a-f2574f3d589f" containerID="37ea063e6c3a1ea8642e41cfcdf5d7f9ece1f10520882047ed8dde1ce66791ee" exitCode=0 Mar 14 07:20:35 crc kubenswrapper[5129]: I0314 07:20:35.317936 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wmhsw" event={"ID":"1d57f145-47a6-44dd-9e6a-f2574f3d589f","Type":"ContainerDied","Data":"37ea063e6c3a1ea8642e41cfcdf5d7f9ece1f10520882047ed8dde1ce66791ee"} Mar 14 07:20:35 crc kubenswrapper[5129]: I0314 07:20:35.318240 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wmhsw" event={"ID":"1d57f145-47a6-44dd-9e6a-f2574f3d589f","Type":"ContainerStarted","Data":"6e694e628e25dec0d5047cc24e10b4b9a6f3ba4126973e9add8bb474575d3aaa"} Mar 14 07:20:35 crc kubenswrapper[5129]: I0314 07:20:35.319776 5129 generic.go:334] "Generic (PLEG): container finished" podID="4545cab3-7bdb-4e55-95b6-cbb057d2bbf9" containerID="62bd5a629f85e5a8e103427ea5018ceb216c75ee329d681eba7351287bdeff37" exitCode=0 Mar 14 07:20:35 crc kubenswrapper[5129]: I0314 07:20:35.319816 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nhsz7" event={"ID":"4545cab3-7bdb-4e55-95b6-cbb057d2bbf9","Type":"ContainerDied","Data":"62bd5a629f85e5a8e103427ea5018ceb216c75ee329d681eba7351287bdeff37"} Mar 14 07:20:35 crc kubenswrapper[5129]: I0314 07:20:35.321137 5129 generic.go:334] "Generic (PLEG): container finished" podID="ba7d80f5-edb1-4774-8072-6af87a16888f" containerID="3db82d92800e77d3b7ece09cef6001114bf8a07f1cf5ac6893920c938bc328a9" exitCode=0 Mar 14 07:20:35 crc kubenswrapper[5129]: I0314 07:20:35.321205 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f3f5-account-create-update-t48fl" event={"ID":"ba7d80f5-edb1-4774-8072-6af87a16888f","Type":"ContainerDied","Data":"3db82d92800e77d3b7ece09cef6001114bf8a07f1cf5ac6893920c938bc328a9"} Mar 14 
07:20:35 crc kubenswrapper[5129]: I0314 07:20:35.323274 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-66xz6" event={"ID":"babd1361-ef0f-4921-905f-cbb1af24beea","Type":"ContainerStarted","Data":"74d5ec83fe89138910b7e8dfccdd1837156e7d6178650fedbb4f80fe74cdc1ff"} Mar 14 07:20:35 crc kubenswrapper[5129]: I0314 07:20:35.371619 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-66xz6" podStartSLOduration=2.429481048 podStartE2EDuration="6.371589905s" podCreationTimestamp="2026-03-14 07:20:29 +0000 UTC" firstStartedPulling="2026-03-14 07:20:30.269136585 +0000 UTC m=+1293.021051769" lastFinishedPulling="2026-03-14 07:20:34.211245432 +0000 UTC m=+1296.963160626" observedRunningTime="2026-03-14 07:20:35.367087282 +0000 UTC m=+1298.119002476" watchObservedRunningTime="2026-03-14 07:20:35.371589905 +0000 UTC m=+1298.123505089" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.728897 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-wmhsw" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.769660 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl6h4\" (UniqueName: \"kubernetes.io/projected/1d57f145-47a6-44dd-9e6a-f2574f3d589f-kube-api-access-nl6h4\") pod \"1d57f145-47a6-44dd-9e6a-f2574f3d589f\" (UID: \"1d57f145-47a6-44dd-9e6a-f2574f3d589f\") " Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.769783 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d57f145-47a6-44dd-9e6a-f2574f3d589f-operator-scripts\") pod \"1d57f145-47a6-44dd-9e6a-f2574f3d589f\" (UID: \"1d57f145-47a6-44dd-9e6a-f2574f3d589f\") " Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.771418 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d57f145-47a6-44dd-9e6a-f2574f3d589f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1d57f145-47a6-44dd-9e6a-f2574f3d589f" (UID: "1d57f145-47a6-44dd-9e6a-f2574f3d589f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.788865 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d57f145-47a6-44dd-9e6a-f2574f3d589f-kube-api-access-nl6h4" (OuterVolumeSpecName: "kube-api-access-nl6h4") pod "1d57f145-47a6-44dd-9e6a-f2574f3d589f" (UID: "1d57f145-47a6-44dd-9e6a-f2574f3d589f"). InnerVolumeSpecName "kube-api-access-nl6h4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.841595 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-mqldm"] Mar 14 07:20:36 crc kubenswrapper[5129]: E0314 07:20:36.842026 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d57f145-47a6-44dd-9e6a-f2574f3d589f" containerName="mariadb-account-create-update" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.842046 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d57f145-47a6-44dd-9e6a-f2574f3d589f" containerName="mariadb-account-create-update" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.842253 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d57f145-47a6-44dd-9e6a-f2574f3d589f" containerName="mariadb-account-create-update" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.842863 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mqldm" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.860755 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mqldm"] Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.870477 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f3f5-account-create-update-t48fl" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.871350 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9qp5\" (UniqueName: \"kubernetes.io/projected/680430ff-d825-4af7-bb3d-d8cfe32f1e4f-kube-api-access-g9qp5\") pod \"keystone-db-create-mqldm\" (UID: \"680430ff-d825-4af7-bb3d-d8cfe32f1e4f\") " pod="openstack/keystone-db-create-mqldm" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.871401 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/680430ff-d825-4af7-bb3d-d8cfe32f1e4f-operator-scripts\") pod \"keystone-db-create-mqldm\" (UID: \"680430ff-d825-4af7-bb3d-d8cfe32f1e4f\") " pod="openstack/keystone-db-create-mqldm" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.871500 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl6h4\" (UniqueName: \"kubernetes.io/projected/1d57f145-47a6-44dd-9e6a-f2574f3d589f-kube-api-access-nl6h4\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.871515 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d57f145-47a6-44dd-9e6a-f2574f3d589f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.878106 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-nhsz7" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.960864 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5ea2-account-create-update-pzlww"] Mar 14 07:20:36 crc kubenswrapper[5129]: E0314 07:20:36.961224 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4545cab3-7bdb-4e55-95b6-cbb057d2bbf9" containerName="mariadb-database-create" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.961236 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4545cab3-7bdb-4e55-95b6-cbb057d2bbf9" containerName="mariadb-database-create" Mar 14 07:20:36 crc kubenswrapper[5129]: E0314 07:20:36.961254 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba7d80f5-edb1-4774-8072-6af87a16888f" containerName="mariadb-account-create-update" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.961260 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba7d80f5-edb1-4774-8072-6af87a16888f" containerName="mariadb-account-create-update" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.961415 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba7d80f5-edb1-4774-8072-6af87a16888f" containerName="mariadb-account-create-update" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.961427 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="4545cab3-7bdb-4e55-95b6-cbb057d2bbf9" containerName="mariadb-database-create" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.961943 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5ea2-account-create-update-pzlww" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.966193 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.971107 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5ea2-account-create-update-pzlww"] Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.974242 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4545cab3-7bdb-4e55-95b6-cbb057d2bbf9-operator-scripts\") pod \"4545cab3-7bdb-4e55-95b6-cbb057d2bbf9\" (UID: \"4545cab3-7bdb-4e55-95b6-cbb057d2bbf9\") " Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.974463 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx9jg\" (UniqueName: \"kubernetes.io/projected/ba7d80f5-edb1-4774-8072-6af87a16888f-kube-api-access-kx9jg\") pod \"ba7d80f5-edb1-4774-8072-6af87a16888f\" (UID: \"ba7d80f5-edb1-4774-8072-6af87a16888f\") " Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.974499 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d80f5-edb1-4774-8072-6af87a16888f-operator-scripts\") pod \"ba7d80f5-edb1-4774-8072-6af87a16888f\" (UID: \"ba7d80f5-edb1-4774-8072-6af87a16888f\") " Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.974557 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmfgn\" (UniqueName: \"kubernetes.io/projected/4545cab3-7bdb-4e55-95b6-cbb057d2bbf9-kube-api-access-cmfgn\") pod \"4545cab3-7bdb-4e55-95b6-cbb057d2bbf9\" (UID: \"4545cab3-7bdb-4e55-95b6-cbb057d2bbf9\") " Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.974885 5129 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-g9qp5\" (UniqueName: \"kubernetes.io/projected/680430ff-d825-4af7-bb3d-d8cfe32f1e4f-kube-api-access-g9qp5\") pod \"keystone-db-create-mqldm\" (UID: \"680430ff-d825-4af7-bb3d-d8cfe32f1e4f\") " pod="openstack/keystone-db-create-mqldm" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.974962 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/680430ff-d825-4af7-bb3d-d8cfe32f1e4f-operator-scripts\") pod \"keystone-db-create-mqldm\" (UID: \"680430ff-d825-4af7-bb3d-d8cfe32f1e4f\") " pod="openstack/keystone-db-create-mqldm" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.975073 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.975076 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba7d80f5-edb1-4774-8072-6af87a16888f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba7d80f5-edb1-4774-8072-6af87a16888f" (UID: "ba7d80f5-edb1-4774-8072-6af87a16888f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.975169 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d80f5-edb1-4774-8072-6af87a16888f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:36 crc kubenswrapper[5129]: E0314 07:20:36.975407 5129 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 07:20:36 crc kubenswrapper[5129]: E0314 07:20:36.975421 5129 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 07:20:36 crc kubenswrapper[5129]: E0314 07:20:36.975457 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift podName:2c0c6778-6f35-4243-9e82-ca3c8f3968fc nodeName:}" failed. No retries permitted until 2026-03-14 07:20:44.975444288 +0000 UTC m=+1307.727359472 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift") pod "swift-storage-0" (UID: "2c0c6778-6f35-4243-9e82-ca3c8f3968fc") : configmap "swift-ring-files" not found Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.975880 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/680430ff-d825-4af7-bb3d-d8cfe32f1e4f-operator-scripts\") pod \"keystone-db-create-mqldm\" (UID: \"680430ff-d825-4af7-bb3d-d8cfe32f1e4f\") " pod="openstack/keystone-db-create-mqldm" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.976194 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4545cab3-7bdb-4e55-95b6-cbb057d2bbf9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4545cab3-7bdb-4e55-95b6-cbb057d2bbf9" (UID: "4545cab3-7bdb-4e55-95b6-cbb057d2bbf9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:36 crc kubenswrapper[5129]: I0314 07:20:36.978226 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba7d80f5-edb1-4774-8072-6af87a16888f-kube-api-access-kx9jg" (OuterVolumeSpecName: "kube-api-access-kx9jg") pod "ba7d80f5-edb1-4774-8072-6af87a16888f" (UID: "ba7d80f5-edb1-4774-8072-6af87a16888f"). InnerVolumeSpecName "kube-api-access-kx9jg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.004883 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9qp5\" (UniqueName: \"kubernetes.io/projected/680430ff-d825-4af7-bb3d-d8cfe32f1e4f-kube-api-access-g9qp5\") pod \"keystone-db-create-mqldm\" (UID: \"680430ff-d825-4af7-bb3d-d8cfe32f1e4f\") " pod="openstack/keystone-db-create-mqldm" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.006776 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4545cab3-7bdb-4e55-95b6-cbb057d2bbf9-kube-api-access-cmfgn" (OuterVolumeSpecName: "kube-api-access-cmfgn") pod "4545cab3-7bdb-4e55-95b6-cbb057d2bbf9" (UID: "4545cab3-7bdb-4e55-95b6-cbb057d2bbf9"). InnerVolumeSpecName "kube-api-access-cmfgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.076389 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-8js4z"] Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.089287 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8js4z" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.089510 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k6w9\" (UniqueName: \"kubernetes.io/projected/68b5edf2-4ac6-4377-8bde-268655955533-kube-api-access-8k6w9\") pod \"keystone-5ea2-account-create-update-pzlww\" (UID: \"68b5edf2-4ac6-4377-8bde-268655955533\") " pod="openstack/keystone-5ea2-account-create-update-pzlww" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.089823 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68b5edf2-4ac6-4377-8bde-268655955533-operator-scripts\") pod \"keystone-5ea2-account-create-update-pzlww\" (UID: \"68b5edf2-4ac6-4377-8bde-268655955533\") " pod="openstack/keystone-5ea2-account-create-update-pzlww" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.089909 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx9jg\" (UniqueName: \"kubernetes.io/projected/ba7d80f5-edb1-4774-8072-6af87a16888f-kube-api-access-kx9jg\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.089930 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmfgn\" (UniqueName: \"kubernetes.io/projected/4545cab3-7bdb-4e55-95b6-cbb057d2bbf9-kube-api-access-cmfgn\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.089945 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4545cab3-7bdb-4e55-95b6-cbb057d2bbf9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.105989 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8js4z"] Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 
07:20:37.144044 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-ee9b-account-create-update-xxxtd"] Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.145012 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ee9b-account-create-update-xxxtd" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.147315 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.151323 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ee9b-account-create-update-xxxtd"] Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.191739 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj4kr\" (UniqueName: \"kubernetes.io/projected/189b1a66-f6a7-4db2-bb51-b22d1725a41b-kube-api-access-fj4kr\") pod \"placement-db-create-8js4z\" (UID: \"189b1a66-f6a7-4db2-bb51-b22d1725a41b\") " pod="openstack/placement-db-create-8js4z" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.191782 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/189b1a66-f6a7-4db2-bb51-b22d1725a41b-operator-scripts\") pod \"placement-db-create-8js4z\" (UID: \"189b1a66-f6a7-4db2-bb51-b22d1725a41b\") " pod="openstack/placement-db-create-8js4z" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.191825 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68b5edf2-4ac6-4377-8bde-268655955533-operator-scripts\") pod \"keystone-5ea2-account-create-update-pzlww\" (UID: \"68b5edf2-4ac6-4377-8bde-268655955533\") " pod="openstack/keystone-5ea2-account-create-update-pzlww" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.191853 5129 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1bbd954-27ab-473e-b15f-42a06ad72887-operator-scripts\") pod \"placement-ee9b-account-create-update-xxxtd\" (UID: \"c1bbd954-27ab-473e-b15f-42a06ad72887\") " pod="openstack/placement-ee9b-account-create-update-xxxtd" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.191879 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nfxk\" (UniqueName: \"kubernetes.io/projected/c1bbd954-27ab-473e-b15f-42a06ad72887-kube-api-access-6nfxk\") pod \"placement-ee9b-account-create-update-xxxtd\" (UID: \"c1bbd954-27ab-473e-b15f-42a06ad72887\") " pod="openstack/placement-ee9b-account-create-update-xxxtd" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.191903 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k6w9\" (UniqueName: \"kubernetes.io/projected/68b5edf2-4ac6-4377-8bde-268655955533-kube-api-access-8k6w9\") pod \"keystone-5ea2-account-create-update-pzlww\" (UID: \"68b5edf2-4ac6-4377-8bde-268655955533\") " pod="openstack/keystone-5ea2-account-create-update-pzlww" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.192672 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mqldm" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.192893 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68b5edf2-4ac6-4377-8bde-268655955533-operator-scripts\") pod \"keystone-5ea2-account-create-update-pzlww\" (UID: \"68b5edf2-4ac6-4377-8bde-268655955533\") " pod="openstack/keystone-5ea2-account-create-update-pzlww" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.211795 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k6w9\" (UniqueName: \"kubernetes.io/projected/68b5edf2-4ac6-4377-8bde-268655955533-kube-api-access-8k6w9\") pod \"keystone-5ea2-account-create-update-pzlww\" (UID: \"68b5edf2-4ac6-4377-8bde-268655955533\") " pod="openstack/keystone-5ea2-account-create-update-pzlww" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.291492 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5ea2-account-create-update-pzlww" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.292615 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj4kr\" (UniqueName: \"kubernetes.io/projected/189b1a66-f6a7-4db2-bb51-b22d1725a41b-kube-api-access-fj4kr\") pod \"placement-db-create-8js4z\" (UID: \"189b1a66-f6a7-4db2-bb51-b22d1725a41b\") " pod="openstack/placement-db-create-8js4z" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.292653 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/189b1a66-f6a7-4db2-bb51-b22d1725a41b-operator-scripts\") pod \"placement-db-create-8js4z\" (UID: \"189b1a66-f6a7-4db2-bb51-b22d1725a41b\") " pod="openstack/placement-db-create-8js4z" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.292700 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1bbd954-27ab-473e-b15f-42a06ad72887-operator-scripts\") pod \"placement-ee9b-account-create-update-xxxtd\" (UID: \"c1bbd954-27ab-473e-b15f-42a06ad72887\") " pod="openstack/placement-ee9b-account-create-update-xxxtd" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.292747 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nfxk\" (UniqueName: \"kubernetes.io/projected/c1bbd954-27ab-473e-b15f-42a06ad72887-kube-api-access-6nfxk\") pod \"placement-ee9b-account-create-update-xxxtd\" (UID: \"c1bbd954-27ab-473e-b15f-42a06ad72887\") " pod="openstack/placement-ee9b-account-create-update-xxxtd" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.293688 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1bbd954-27ab-473e-b15f-42a06ad72887-operator-scripts\") pod 
\"placement-ee9b-account-create-update-xxxtd\" (UID: \"c1bbd954-27ab-473e-b15f-42a06ad72887\") " pod="openstack/placement-ee9b-account-create-update-xxxtd" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.294064 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/189b1a66-f6a7-4db2-bb51-b22d1725a41b-operator-scripts\") pod \"placement-db-create-8js4z\" (UID: \"189b1a66-f6a7-4db2-bb51-b22d1725a41b\") " pod="openstack/placement-db-create-8js4z" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.309098 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj4kr\" (UniqueName: \"kubernetes.io/projected/189b1a66-f6a7-4db2-bb51-b22d1725a41b-kube-api-access-fj4kr\") pod \"placement-db-create-8js4z\" (UID: \"189b1a66-f6a7-4db2-bb51-b22d1725a41b\") " pod="openstack/placement-db-create-8js4z" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.311358 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nfxk\" (UniqueName: \"kubernetes.io/projected/c1bbd954-27ab-473e-b15f-42a06ad72887-kube-api-access-6nfxk\") pod \"placement-ee9b-account-create-update-xxxtd\" (UID: \"c1bbd954-27ab-473e-b15f-42a06ad72887\") " pod="openstack/placement-ee9b-account-create-update-xxxtd" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.345136 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f3f5-account-create-update-t48fl" event={"ID":"ba7d80f5-edb1-4774-8072-6af87a16888f","Type":"ContainerDied","Data":"fd0b708c36292614c9ec32553528882e75998aa088ebf0dafcac02f2f58f98b7"} Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.345178 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd0b708c36292614c9ec32553528882e75998aa088ebf0dafcac02f2f58f98b7" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.345246 5129 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/glance-f3f5-account-create-update-t48fl" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.349742 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wmhsw" event={"ID":"1d57f145-47a6-44dd-9e6a-f2574f3d589f","Type":"ContainerDied","Data":"6e694e628e25dec0d5047cc24e10b4b9a6f3ba4126973e9add8bb474575d3aaa"} Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.349776 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e694e628e25dec0d5047cc24e10b4b9a6f3ba4126973e9add8bb474575d3aaa" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.349823 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wmhsw" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.351623 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nhsz7" event={"ID":"4545cab3-7bdb-4e55-95b6-cbb057d2bbf9","Type":"ContainerDied","Data":"6082f20dffb1f335aff049b7e7f3cd7a41039d4dcde2ce43958a41801a7a0d6b"} Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.351637 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6082f20dffb1f335aff049b7e7f3cd7a41039d4dcde2ce43958a41801a7a0d6b" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.351694 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nhsz7" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.407393 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8js4z" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.475030 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-ee9b-account-create-update-xxxtd" Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.623425 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mqldm"] Mar 14 07:20:37 crc kubenswrapper[5129]: W0314 07:20:37.625520 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod680430ff_d825_4af7_bb3d_d8cfe32f1e4f.slice/crio-ad9d1d7d90be7017e931ccbefeba581632857a06804fadcc8fe0fec2d91ef31b WatchSource:0}: Error finding container ad9d1d7d90be7017e931ccbefeba581632857a06804fadcc8fe0fec2d91ef31b: Status 404 returned error can't find the container with id ad9d1d7d90be7017e931ccbefeba581632857a06804fadcc8fe0fec2d91ef31b Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.764031 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5ea2-account-create-update-pzlww"] Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.864982 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8js4z"] Mar 14 07:20:37 crc kubenswrapper[5129]: W0314 07:20:37.894885 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod189b1a66_f6a7_4db2_bb51_b22d1725a41b.slice/crio-391ea6b06a92e5101f296dbdea79851846f53a45b43012c5acc4d047db871555 WatchSource:0}: Error finding container 391ea6b06a92e5101f296dbdea79851846f53a45b43012c5acc4d047db871555: Status 404 returned error can't find the container with id 391ea6b06a92e5101f296dbdea79851846f53a45b43012c5acc4d047db871555 Mar 14 07:20:37 crc kubenswrapper[5129]: I0314 07:20:37.974543 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ee9b-account-create-update-xxxtd"] Mar 14 07:20:37 crc kubenswrapper[5129]: W0314 07:20:37.977980 5129 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1bbd954_27ab_473e_b15f_42a06ad72887.slice/crio-1fe0feb474d94a84385cc627e03cf0fa80bde584fb58577d7fd14249faf4beef WatchSource:0}: Error finding container 1fe0feb474d94a84385cc627e03cf0fa80bde584fb58577d7fd14249faf4beef: Status 404 returned error can't find the container with id 1fe0feb474d94a84385cc627e03cf0fa80bde584fb58577d7fd14249faf4beef Mar 14 07:20:38 crc kubenswrapper[5129]: I0314 07:20:38.320703 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:20:38 crc kubenswrapper[5129]: I0314 07:20:38.363164 5129 generic.go:334] "Generic (PLEG): container finished" podID="68b5edf2-4ac6-4377-8bde-268655955533" containerID="66af8e6e95a3ad382ac1cd343ee394ecbadd2bfbe11f4bccced2734d290357c4" exitCode=0 Mar 14 07:20:38 crc kubenswrapper[5129]: I0314 07:20:38.363228 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5ea2-account-create-update-pzlww" event={"ID":"68b5edf2-4ac6-4377-8bde-268655955533","Type":"ContainerDied","Data":"66af8e6e95a3ad382ac1cd343ee394ecbadd2bfbe11f4bccced2734d290357c4"} Mar 14 07:20:38 crc kubenswrapper[5129]: I0314 07:20:38.363287 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5ea2-account-create-update-pzlww" event={"ID":"68b5edf2-4ac6-4377-8bde-268655955533","Type":"ContainerStarted","Data":"264da87cd05ee84d684fe3f1b32b07c957b0060b19f760227f0372755a973eb2"} Mar 14 07:20:38 crc kubenswrapper[5129]: I0314 07:20:38.365158 5129 generic.go:334] "Generic (PLEG): container finished" podID="189b1a66-f6a7-4db2-bb51-b22d1725a41b" containerID="793f4948eea9bbde654020a67c725e8c5e272cc69a7a8ca7eabd48ce098086fb" exitCode=0 Mar 14 07:20:38 crc kubenswrapper[5129]: I0314 07:20:38.365229 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8js4z" 
event={"ID":"189b1a66-f6a7-4db2-bb51-b22d1725a41b","Type":"ContainerDied","Data":"793f4948eea9bbde654020a67c725e8c5e272cc69a7a8ca7eabd48ce098086fb"} Mar 14 07:20:38 crc kubenswrapper[5129]: I0314 07:20:38.365255 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8js4z" event={"ID":"189b1a66-f6a7-4db2-bb51-b22d1725a41b","Type":"ContainerStarted","Data":"391ea6b06a92e5101f296dbdea79851846f53a45b43012c5acc4d047db871555"} Mar 14 07:20:38 crc kubenswrapper[5129]: I0314 07:20:38.367747 5129 generic.go:334] "Generic (PLEG): container finished" podID="680430ff-d825-4af7-bb3d-d8cfe32f1e4f" containerID="a7b578ca7c856174b66ad424c57c60fbf8244f1fe4ddbf628dbbf6c90301a784" exitCode=0 Mar 14 07:20:38 crc kubenswrapper[5129]: I0314 07:20:38.367809 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mqldm" event={"ID":"680430ff-d825-4af7-bb3d-d8cfe32f1e4f","Type":"ContainerDied","Data":"a7b578ca7c856174b66ad424c57c60fbf8244f1fe4ddbf628dbbf6c90301a784"} Mar 14 07:20:38 crc kubenswrapper[5129]: I0314 07:20:38.367835 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mqldm" event={"ID":"680430ff-d825-4af7-bb3d-d8cfe32f1e4f","Type":"ContainerStarted","Data":"ad9d1d7d90be7017e931ccbefeba581632857a06804fadcc8fe0fec2d91ef31b"} Mar 14 07:20:38 crc kubenswrapper[5129]: I0314 07:20:38.376930 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ee9b-account-create-update-xxxtd" event={"ID":"c1bbd954-27ab-473e-b15f-42a06ad72887","Type":"ContainerStarted","Data":"7dfeaff5aa0d4fc1216ce7e841aba0740560b55e377ce234c07c519df6a88449"} Mar 14 07:20:38 crc kubenswrapper[5129]: I0314 07:20:38.376991 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ee9b-account-create-update-xxxtd" event={"ID":"c1bbd954-27ab-473e-b15f-42a06ad72887","Type":"ContainerStarted","Data":"1fe0feb474d94a84385cc627e03cf0fa80bde584fb58577d7fd14249faf4beef"} Mar 
14 07:20:38 crc kubenswrapper[5129]: I0314 07:20:38.387263 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-d2ct6"] Mar 14 07:20:38 crc kubenswrapper[5129]: I0314 07:20:38.387588 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" podUID="ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd" containerName="dnsmasq-dns" containerID="cri-o://0d372268522dc1fcfce8e19f1e697638c8cc7793c3e38c091a0a66668a3c33d3" gracePeriod=10 Mar 14 07:20:38 crc kubenswrapper[5129]: I0314 07:20:38.435193 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-ee9b-account-create-update-xxxtd" podStartSLOduration=1.43517486 podStartE2EDuration="1.43517486s" podCreationTimestamp="2026-03-14 07:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:20:38.42896152 +0000 UTC m=+1301.180876704" watchObservedRunningTime="2026-03-14 07:20:38.43517486 +0000 UTC m=+1301.187090034" Mar 14 07:20:38 crc kubenswrapper[5129]: I0314 07:20:38.993961 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.017117 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.176319 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd-dns-svc\") pod \"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd\" (UID: \"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd\") " Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.176493 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddvgt\" (UniqueName: \"kubernetes.io/projected/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd-kube-api-access-ddvgt\") pod \"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd\" (UID: \"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd\") " Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.176535 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd-config\") pod \"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd\" (UID: \"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd\") " Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.182158 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd-kube-api-access-ddvgt" (OuterVolumeSpecName: "kube-api-access-ddvgt") pod "ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd" (UID: "ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd"). InnerVolumeSpecName "kube-api-access-ddvgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.214328 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd" (UID: "ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.241230 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd-config" (OuterVolumeSpecName: "config") pod "ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd" (UID: "ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.279142 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddvgt\" (UniqueName: \"kubernetes.io/projected/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd-kube-api-access-ddvgt\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.279183 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.279195 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.385805 5129 generic.go:334] "Generic (PLEG): container finished" podID="c1bbd954-27ab-473e-b15f-42a06ad72887" containerID="7dfeaff5aa0d4fc1216ce7e841aba0740560b55e377ce234c07c519df6a88449" exitCode=0 Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.385860 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ee9b-account-create-update-xxxtd" event={"ID":"c1bbd954-27ab-473e-b15f-42a06ad72887","Type":"ContainerDied","Data":"7dfeaff5aa0d4fc1216ce7e841aba0740560b55e377ce234c07c519df6a88449"} Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.389507 5129 generic.go:334] "Generic (PLEG): container finished" 
podID="ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd" containerID="0d372268522dc1fcfce8e19f1e697638c8cc7793c3e38c091a0a66668a3c33d3" exitCode=0 Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.389689 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.390383 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" event={"ID":"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd","Type":"ContainerDied","Data":"0d372268522dc1fcfce8e19f1e697638c8cc7793c3e38c091a0a66668a3c33d3"} Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.390482 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-d2ct6" event={"ID":"ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd","Type":"ContainerDied","Data":"ddad2764100808dc9a248b0272f875d8b7e0ef9b8709fe2756dcde9d2c494abf"} Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.390560 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-wmhsw"] Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.390821 5129 scope.go:117] "RemoveContainer" containerID="0d372268522dc1fcfce8e19f1e697638c8cc7793c3e38c091a0a66668a3c33d3" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.402277 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-wmhsw"] Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.439418 5129 scope.go:117] "RemoveContainer" containerID="652c41f508e2dd85291629291a3a63140db9fc98ae3737d6075dc9f8a1397fa5" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.440269 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-d2ct6"] Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.454209 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-d2ct6"] Mar 14 07:20:39 crc 
kubenswrapper[5129]: I0314 07:20:39.473387 5129 scope.go:117] "RemoveContainer" containerID="0d372268522dc1fcfce8e19f1e697638c8cc7793c3e38c091a0a66668a3c33d3" Mar 14 07:20:39 crc kubenswrapper[5129]: E0314 07:20:39.478574 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d372268522dc1fcfce8e19f1e697638c8cc7793c3e38c091a0a66668a3c33d3\": container with ID starting with 0d372268522dc1fcfce8e19f1e697638c8cc7793c3e38c091a0a66668a3c33d3 not found: ID does not exist" containerID="0d372268522dc1fcfce8e19f1e697638c8cc7793c3e38c091a0a66668a3c33d3" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.478641 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d372268522dc1fcfce8e19f1e697638c8cc7793c3e38c091a0a66668a3c33d3"} err="failed to get container status \"0d372268522dc1fcfce8e19f1e697638c8cc7793c3e38c091a0a66668a3c33d3\": rpc error: code = NotFound desc = could not find container \"0d372268522dc1fcfce8e19f1e697638c8cc7793c3e38c091a0a66668a3c33d3\": container with ID starting with 0d372268522dc1fcfce8e19f1e697638c8cc7793c3e38c091a0a66668a3c33d3 not found: ID does not exist" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.478670 5129 scope.go:117] "RemoveContainer" containerID="652c41f508e2dd85291629291a3a63140db9fc98ae3737d6075dc9f8a1397fa5" Mar 14 07:20:39 crc kubenswrapper[5129]: E0314 07:20:39.478922 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"652c41f508e2dd85291629291a3a63140db9fc98ae3737d6075dc9f8a1397fa5\": container with ID starting with 652c41f508e2dd85291629291a3a63140db9fc98ae3737d6075dc9f8a1397fa5 not found: ID does not exist" containerID="652c41f508e2dd85291629291a3a63140db9fc98ae3737d6075dc9f8a1397fa5" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.478945 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"652c41f508e2dd85291629291a3a63140db9fc98ae3737d6075dc9f8a1397fa5"} err="failed to get container status \"652c41f508e2dd85291629291a3a63140db9fc98ae3737d6075dc9f8a1397fa5\": rpc error: code = NotFound desc = could not find container \"652c41f508e2dd85291629291a3a63140db9fc98ae3737d6075dc9f8a1397fa5\": container with ID starting with 652c41f508e2dd85291629291a3a63140db9fc98ae3737d6075dc9f8a1397fa5 not found: ID does not exist" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.794647 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5ea2-account-create-update-pzlww" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.882124 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mqldm" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.890521 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k6w9\" (UniqueName: \"kubernetes.io/projected/68b5edf2-4ac6-4377-8bde-268655955533-kube-api-access-8k6w9\") pod \"68b5edf2-4ac6-4377-8bde-268655955533\" (UID: \"68b5edf2-4ac6-4377-8bde-268655955533\") " Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.890793 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68b5edf2-4ac6-4377-8bde-268655955533-operator-scripts\") pod \"68b5edf2-4ac6-4377-8bde-268655955533\" (UID: \"68b5edf2-4ac6-4377-8bde-268655955533\") " Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.892483 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68b5edf2-4ac6-4377-8bde-268655955533-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68b5edf2-4ac6-4377-8bde-268655955533" (UID: "68b5edf2-4ac6-4377-8bde-268655955533"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.892735 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8js4z" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.910828 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68b5edf2-4ac6-4377-8bde-268655955533-kube-api-access-8k6w9" (OuterVolumeSpecName: "kube-api-access-8k6w9") pod "68b5edf2-4ac6-4377-8bde-268655955533" (UID: "68b5edf2-4ac6-4377-8bde-268655955533"). InnerVolumeSpecName "kube-api-access-8k6w9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.992407 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/680430ff-d825-4af7-bb3d-d8cfe32f1e4f-operator-scripts\") pod \"680430ff-d825-4af7-bb3d-d8cfe32f1e4f\" (UID: \"680430ff-d825-4af7-bb3d-d8cfe32f1e4f\") " Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.992477 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj4kr\" (UniqueName: \"kubernetes.io/projected/189b1a66-f6a7-4db2-bb51-b22d1725a41b-kube-api-access-fj4kr\") pod \"189b1a66-f6a7-4db2-bb51-b22d1725a41b\" (UID: \"189b1a66-f6a7-4db2-bb51-b22d1725a41b\") " Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.992530 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9qp5\" (UniqueName: \"kubernetes.io/projected/680430ff-d825-4af7-bb3d-d8cfe32f1e4f-kube-api-access-g9qp5\") pod \"680430ff-d825-4af7-bb3d-d8cfe32f1e4f\" (UID: \"680430ff-d825-4af7-bb3d-d8cfe32f1e4f\") " Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.992629 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/189b1a66-f6a7-4db2-bb51-b22d1725a41b-operator-scripts\") pod \"189b1a66-f6a7-4db2-bb51-b22d1725a41b\" (UID: \"189b1a66-f6a7-4db2-bb51-b22d1725a41b\") " Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.993021 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68b5edf2-4ac6-4377-8bde-268655955533-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.993047 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k6w9\" (UniqueName: \"kubernetes.io/projected/68b5edf2-4ac6-4377-8bde-268655955533-kube-api-access-8k6w9\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.993477 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/189b1a66-f6a7-4db2-bb51-b22d1725a41b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "189b1a66-f6a7-4db2-bb51-b22d1725a41b" (UID: "189b1a66-f6a7-4db2-bb51-b22d1725a41b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.994329 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/680430ff-d825-4af7-bb3d-d8cfe32f1e4f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "680430ff-d825-4af7-bb3d-d8cfe32f1e4f" (UID: "680430ff-d825-4af7-bb3d-d8cfe32f1e4f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.996076 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/680430ff-d825-4af7-bb3d-d8cfe32f1e4f-kube-api-access-g9qp5" (OuterVolumeSpecName: "kube-api-access-g9qp5") pod "680430ff-d825-4af7-bb3d-d8cfe32f1e4f" (UID: "680430ff-d825-4af7-bb3d-d8cfe32f1e4f"). 
InnerVolumeSpecName "kube-api-access-g9qp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:39 crc kubenswrapper[5129]: I0314 07:20:39.996786 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/189b1a66-f6a7-4db2-bb51-b22d1725a41b-kube-api-access-fj4kr" (OuterVolumeSpecName: "kube-api-access-fj4kr") pod "189b1a66-f6a7-4db2-bb51-b22d1725a41b" (UID: "189b1a66-f6a7-4db2-bb51-b22d1725a41b"). InnerVolumeSpecName "kube-api-access-fj4kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.045925 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d57f145-47a6-44dd-9e6a-f2574f3d589f" path="/var/lib/kubelet/pods/1d57f145-47a6-44dd-9e6a-f2574f3d589f/volumes" Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.046478 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd" path="/var/lib/kubelet/pods/ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd/volumes" Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.095172 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/680430ff-d825-4af7-bb3d-d8cfe32f1e4f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.095212 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj4kr\" (UniqueName: \"kubernetes.io/projected/189b1a66-f6a7-4db2-bb51-b22d1725a41b-kube-api-access-fj4kr\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.095228 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9qp5\" (UniqueName: \"kubernetes.io/projected/680430ff-d825-4af7-bb3d-d8cfe32f1e4f-kube-api-access-g9qp5\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.095239 5129 reconciler_common.go:293] 
"Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/189b1a66-f6a7-4db2-bb51-b22d1725a41b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.401080 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5ea2-account-create-update-pzlww" event={"ID":"68b5edf2-4ac6-4377-8bde-268655955533","Type":"ContainerDied","Data":"264da87cd05ee84d684fe3f1b32b07c957b0060b19f760227f0372755a973eb2"} Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.401148 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="264da87cd05ee84d684fe3f1b32b07c957b0060b19f760227f0372755a973eb2" Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.401088 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5ea2-account-create-update-pzlww" Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.402969 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8js4z" event={"ID":"189b1a66-f6a7-4db2-bb51-b22d1725a41b","Type":"ContainerDied","Data":"391ea6b06a92e5101f296dbdea79851846f53a45b43012c5acc4d047db871555"} Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.403047 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="391ea6b06a92e5101f296dbdea79851846f53a45b43012c5acc4d047db871555" Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.403136 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8js4z" Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.406112 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mqldm" event={"ID":"680430ff-d825-4af7-bb3d-d8cfe32f1e4f","Type":"ContainerDied","Data":"ad9d1d7d90be7017e931ccbefeba581632857a06804fadcc8fe0fec2d91ef31b"} Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.406158 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad9d1d7d90be7017e931ccbefeba581632857a06804fadcc8fe0fec2d91ef31b" Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.406226 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mqldm" Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.779693 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ee9b-account-create-update-xxxtd" Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.912946 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1bbd954-27ab-473e-b15f-42a06ad72887-operator-scripts\") pod \"c1bbd954-27ab-473e-b15f-42a06ad72887\" (UID: \"c1bbd954-27ab-473e-b15f-42a06ad72887\") " Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.913031 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nfxk\" (UniqueName: \"kubernetes.io/projected/c1bbd954-27ab-473e-b15f-42a06ad72887-kube-api-access-6nfxk\") pod \"c1bbd954-27ab-473e-b15f-42a06ad72887\" (UID: \"c1bbd954-27ab-473e-b15f-42a06ad72887\") " Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.914733 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1bbd954-27ab-473e-b15f-42a06ad72887-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"c1bbd954-27ab-473e-b15f-42a06ad72887" (UID: "c1bbd954-27ab-473e-b15f-42a06ad72887"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:40 crc kubenswrapper[5129]: I0314 07:20:40.916812 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1bbd954-27ab-473e-b15f-42a06ad72887-kube-api-access-6nfxk" (OuterVolumeSpecName: "kube-api-access-6nfxk") pod "c1bbd954-27ab-473e-b15f-42a06ad72887" (UID: "c1bbd954-27ab-473e-b15f-42a06ad72887"). InnerVolumeSpecName "kube-api-access-6nfxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.021268 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1bbd954-27ab-473e-b15f-42a06ad72887-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.021835 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nfxk\" (UniqueName: \"kubernetes.io/projected/c1bbd954-27ab-473e-b15f-42a06ad72887-kube-api-access-6nfxk\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.419805 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ee9b-account-create-update-xxxtd" event={"ID":"c1bbd954-27ab-473e-b15f-42a06ad72887","Type":"ContainerDied","Data":"1fe0feb474d94a84385cc627e03cf0fa80bde584fb58577d7fd14249faf4beef"} Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.419843 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fe0feb474d94a84385cc627e03cf0fa80bde584fb58577d7fd14249faf4beef" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.419893 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-ee9b-account-create-update-xxxtd" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.677140 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-llsfr"] Mar 14 07:20:41 crc kubenswrapper[5129]: E0314 07:20:41.677753 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd" containerName="dnsmasq-dns" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.677784 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd" containerName="dnsmasq-dns" Mar 14 07:20:41 crc kubenswrapper[5129]: E0314 07:20:41.677801 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="680430ff-d825-4af7-bb3d-d8cfe32f1e4f" containerName="mariadb-database-create" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.677813 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="680430ff-d825-4af7-bb3d-d8cfe32f1e4f" containerName="mariadb-database-create" Mar 14 07:20:41 crc kubenswrapper[5129]: E0314 07:20:41.677835 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1bbd954-27ab-473e-b15f-42a06ad72887" containerName="mariadb-account-create-update" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.677847 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1bbd954-27ab-473e-b15f-42a06ad72887" containerName="mariadb-account-create-update" Mar 14 07:20:41 crc kubenswrapper[5129]: E0314 07:20:41.677868 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd" containerName="init" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.677878 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd" containerName="init" Mar 14 07:20:41 crc kubenswrapper[5129]: E0314 07:20:41.677901 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="189b1a66-f6a7-4db2-bb51-b22d1725a41b" containerName="mariadb-database-create" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.677911 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="189b1a66-f6a7-4db2-bb51-b22d1725a41b" containerName="mariadb-database-create" Mar 14 07:20:41 crc kubenswrapper[5129]: E0314 07:20:41.677931 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b5edf2-4ac6-4377-8bde-268655955533" containerName="mariadb-account-create-update" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.677943 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b5edf2-4ac6-4377-8bde-268655955533" containerName="mariadb-account-create-update" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.678178 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="680430ff-d825-4af7-bb3d-d8cfe32f1e4f" containerName="mariadb-database-create" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.678195 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1bbd954-27ab-473e-b15f-42a06ad72887" containerName="mariadb-account-create-update" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.678206 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab4cf8c8-94d5-4e58-b268-a5ba23eaa4cd" containerName="dnsmasq-dns" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.678222 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="189b1a66-f6a7-4db2-bb51-b22d1725a41b" containerName="mariadb-database-create" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.678239 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="68b5edf2-4ac6-4377-8bde-268655955533" containerName="mariadb-account-create-update" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.678872 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-llsfr" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.681059 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.681250 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-gptdj" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.741128 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8203ff-5259-4d83-a96b-362be3884609-combined-ca-bundle\") pod \"glance-db-sync-llsfr\" (UID: \"2c8203ff-5259-4d83-a96b-362be3884609\") " pod="openstack/glance-db-sync-llsfr" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.741196 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c8203ff-5259-4d83-a96b-362be3884609-db-sync-config-data\") pod \"glance-db-sync-llsfr\" (UID: \"2c8203ff-5259-4d83-a96b-362be3884609\") " pod="openstack/glance-db-sync-llsfr" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.741236 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8203ff-5259-4d83-a96b-362be3884609-config-data\") pod \"glance-db-sync-llsfr\" (UID: \"2c8203ff-5259-4d83-a96b-362be3884609\") " pod="openstack/glance-db-sync-llsfr" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.741258 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk8zl\" (UniqueName: \"kubernetes.io/projected/2c8203ff-5259-4d83-a96b-362be3884609-kube-api-access-rk8zl\") pod \"glance-db-sync-llsfr\" (UID: \"2c8203ff-5259-4d83-a96b-362be3884609\") " pod="openstack/glance-db-sync-llsfr" Mar 14 07:20:41 
crc kubenswrapper[5129]: I0314 07:20:41.779717 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-llsfr"] Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.843530 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8203ff-5259-4d83-a96b-362be3884609-combined-ca-bundle\") pod \"glance-db-sync-llsfr\" (UID: \"2c8203ff-5259-4d83-a96b-362be3884609\") " pod="openstack/glance-db-sync-llsfr" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.843601 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c8203ff-5259-4d83-a96b-362be3884609-db-sync-config-data\") pod \"glance-db-sync-llsfr\" (UID: \"2c8203ff-5259-4d83-a96b-362be3884609\") " pod="openstack/glance-db-sync-llsfr" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.843670 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8203ff-5259-4d83-a96b-362be3884609-config-data\") pod \"glance-db-sync-llsfr\" (UID: \"2c8203ff-5259-4d83-a96b-362be3884609\") " pod="openstack/glance-db-sync-llsfr" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.843697 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk8zl\" (UniqueName: \"kubernetes.io/projected/2c8203ff-5259-4d83-a96b-362be3884609-kube-api-access-rk8zl\") pod \"glance-db-sync-llsfr\" (UID: \"2c8203ff-5259-4d83-a96b-362be3884609\") " pod="openstack/glance-db-sync-llsfr" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.863452 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8203ff-5259-4d83-a96b-362be3884609-config-data\") pod \"glance-db-sync-llsfr\" (UID: \"2c8203ff-5259-4d83-a96b-362be3884609\") " 
pod="openstack/glance-db-sync-llsfr" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.870219 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8203ff-5259-4d83-a96b-362be3884609-combined-ca-bundle\") pod \"glance-db-sync-llsfr\" (UID: \"2c8203ff-5259-4d83-a96b-362be3884609\") " pod="openstack/glance-db-sync-llsfr" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.870844 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk8zl\" (UniqueName: \"kubernetes.io/projected/2c8203ff-5259-4d83-a96b-362be3884609-kube-api-access-rk8zl\") pod \"glance-db-sync-llsfr\" (UID: \"2c8203ff-5259-4d83-a96b-362be3884609\") " pod="openstack/glance-db-sync-llsfr" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.877101 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c8203ff-5259-4d83-a96b-362be3884609-db-sync-config-data\") pod \"glance-db-sync-llsfr\" (UID: \"2c8203ff-5259-4d83-a96b-362be3884609\") " pod="openstack/glance-db-sync-llsfr" Mar 14 07:20:41 crc kubenswrapper[5129]: I0314 07:20:41.995248 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-llsfr" Mar 14 07:20:42 crc kubenswrapper[5129]: I0314 07:20:42.433525 5129 generic.go:334] "Generic (PLEG): container finished" podID="babd1361-ef0f-4921-905f-cbb1af24beea" containerID="74d5ec83fe89138910b7e8dfccdd1837156e7d6178650fedbb4f80fe74cdc1ff" exitCode=0 Mar 14 07:20:42 crc kubenswrapper[5129]: I0314 07:20:42.433836 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-66xz6" event={"ID":"babd1361-ef0f-4921-905f-cbb1af24beea","Type":"ContainerDied","Data":"74d5ec83fe89138910b7e8dfccdd1837156e7d6178650fedbb4f80fe74cdc1ff"} Mar 14 07:20:42 crc kubenswrapper[5129]: I0314 07:20:42.552081 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-llsfr"] Mar 14 07:20:42 crc kubenswrapper[5129]: W0314 07:20:42.553121 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c8203ff_5259_4d83_a96b_362be3884609.slice/crio-fd62d03c8993213f0f417e73b17fedb978316fe1312124f179bd57ebf70ebcf7 WatchSource:0}: Error finding container fd62d03c8993213f0f417e73b17fedb978316fe1312124f179bd57ebf70ebcf7: Status 404 returned error can't find the container with id fd62d03c8993213f0f417e73b17fedb978316fe1312124f179bd57ebf70ebcf7 Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.443086 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-llsfr" event={"ID":"2c8203ff-5259-4d83-a96b-362be3884609","Type":"ContainerStarted","Data":"fd62d03c8993213f0f417e73b17fedb978316fe1312124f179bd57ebf70ebcf7"} Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.745656 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.881301 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/babd1361-ef0f-4921-905f-cbb1af24beea-scripts\") pod \"babd1361-ef0f-4921-905f-cbb1af24beea\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.881981 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/babd1361-ef0f-4921-905f-cbb1af24beea-dispersionconf\") pod \"babd1361-ef0f-4921-905f-cbb1af24beea\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.882045 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/babd1361-ef0f-4921-905f-cbb1af24beea-combined-ca-bundle\") pod \"babd1361-ef0f-4921-905f-cbb1af24beea\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.882134 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/babd1361-ef0f-4921-905f-cbb1af24beea-ring-data-devices\") pod \"babd1361-ef0f-4921-905f-cbb1af24beea\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.882167 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/babd1361-ef0f-4921-905f-cbb1af24beea-etc-swift\") pod \"babd1361-ef0f-4921-905f-cbb1af24beea\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.882843 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/babd1361-ef0f-4921-905f-cbb1af24beea-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "babd1361-ef0f-4921-905f-cbb1af24beea" (UID: "babd1361-ef0f-4921-905f-cbb1af24beea"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.882851 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmsz2\" (UniqueName: \"kubernetes.io/projected/babd1361-ef0f-4921-905f-cbb1af24beea-kube-api-access-hmsz2\") pod \"babd1361-ef0f-4921-905f-cbb1af24beea\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.882892 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/babd1361-ef0f-4921-905f-cbb1af24beea-swiftconf\") pod \"babd1361-ef0f-4921-905f-cbb1af24beea\" (UID: \"babd1361-ef0f-4921-905f-cbb1af24beea\") " Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.883529 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/babd1361-ef0f-4921-905f-cbb1af24beea-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.887717 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/babd1361-ef0f-4921-905f-cbb1af24beea-kube-api-access-hmsz2" (OuterVolumeSpecName: "kube-api-access-hmsz2") pod "babd1361-ef0f-4921-905f-cbb1af24beea" (UID: "babd1361-ef0f-4921-905f-cbb1af24beea"). InnerVolumeSpecName "kube-api-access-hmsz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.889198 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/babd1361-ef0f-4921-905f-cbb1af24beea-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "babd1361-ef0f-4921-905f-cbb1af24beea" (UID: "babd1361-ef0f-4921-905f-cbb1af24beea"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.891666 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/babd1361-ef0f-4921-905f-cbb1af24beea-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "babd1361-ef0f-4921-905f-cbb1af24beea" (UID: "babd1361-ef0f-4921-905f-cbb1af24beea"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.906890 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/babd1361-ef0f-4921-905f-cbb1af24beea-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "babd1361-ef0f-4921-905f-cbb1af24beea" (UID: "babd1361-ef0f-4921-905f-cbb1af24beea"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.909441 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/babd1361-ef0f-4921-905f-cbb1af24beea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "babd1361-ef0f-4921-905f-cbb1af24beea" (UID: "babd1361-ef0f-4921-905f-cbb1af24beea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.911988 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/babd1361-ef0f-4921-905f-cbb1af24beea-scripts" (OuterVolumeSpecName: "scripts") pod "babd1361-ef0f-4921-905f-cbb1af24beea" (UID: "babd1361-ef0f-4921-905f-cbb1af24beea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.985572 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/babd1361-ef0f-4921-905f-cbb1af24beea-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.985620 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmsz2\" (UniqueName: \"kubernetes.io/projected/babd1361-ef0f-4921-905f-cbb1af24beea-kube-api-access-hmsz2\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.985635 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/babd1361-ef0f-4921-905f-cbb1af24beea-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.985642 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/babd1361-ef0f-4921-905f-cbb1af24beea-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.985650 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/babd1361-ef0f-4921-905f-cbb1af24beea-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:43 crc kubenswrapper[5129]: I0314 07:20:43.985658 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/babd1361-ef0f-4921-905f-cbb1af24beea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.381691 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-5dxcc"] Mar 14 07:20:44 crc kubenswrapper[5129]: E0314 07:20:44.381993 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="babd1361-ef0f-4921-905f-cbb1af24beea" containerName="swift-ring-rebalance" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.382005 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="babd1361-ef0f-4921-905f-cbb1af24beea" containerName="swift-ring-rebalance" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.382160 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="babd1361-ef0f-4921-905f-cbb1af24beea" containerName="swift-ring-rebalance" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.382687 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5dxcc" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.389284 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.411725 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5dxcc"] Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.416141 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9bsx2" podUID="b6a688fe-1537-4ed7-a1ae-2070ba6b1219" containerName="ovn-controller" probeResult="failure" output=< Mar 14 07:20:44 crc kubenswrapper[5129]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 14 07:20:44 crc kubenswrapper[5129]: > Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.451962 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-66xz6" event={"ID":"babd1361-ef0f-4921-905f-cbb1af24beea","Type":"ContainerDied","Data":"d5df789972239db935363b82381ca252c4ead2710517cc58af9f3e637f9e2d04"} Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.452001 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5df789972239db935363b82381ca252c4ead2710517cc58af9f3e637f9e2d04" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.452024 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-66xz6" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.458216 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.465100 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.495999 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lt4m\" (UniqueName: \"kubernetes.io/projected/5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7-kube-api-access-7lt4m\") pod \"root-account-create-update-5dxcc\" (UID: \"5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7\") " pod="openstack/root-account-create-update-5dxcc" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.496053 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7-operator-scripts\") pod \"root-account-create-update-5dxcc\" (UID: \"5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7\") " pod="openstack/root-account-create-update-5dxcc" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.597914 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lt4m\" (UniqueName: \"kubernetes.io/projected/5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7-kube-api-access-7lt4m\") pod \"root-account-create-update-5dxcc\" (UID: \"5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7\") " pod="openstack/root-account-create-update-5dxcc" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.597977 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7-operator-scripts\") pod \"root-account-create-update-5dxcc\" (UID: 
\"5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7\") " pod="openstack/root-account-create-update-5dxcc" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.599241 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7-operator-scripts\") pod \"root-account-create-update-5dxcc\" (UID: \"5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7\") " pod="openstack/root-account-create-update-5dxcc" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.614993 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lt4m\" (UniqueName: \"kubernetes.io/projected/5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7-kube-api-access-7lt4m\") pod \"root-account-create-update-5dxcc\" (UID: \"5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7\") " pod="openstack/root-account-create-update-5dxcc" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.673844 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9bsx2-config-xd5g9"] Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.675252 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.683981 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.688277 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9bsx2-config-xd5g9"] Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.699024 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5dxcc" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.808072 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72f2a1c1-55c3-4c98-80b6-b457645d525b-scripts\") pod \"ovn-controller-9bsx2-config-xd5g9\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.808209 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/72f2a1c1-55c3-4c98-80b6-b457645d525b-var-run-ovn\") pod \"ovn-controller-9bsx2-config-xd5g9\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.808269 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n77z8\" (UniqueName: \"kubernetes.io/projected/72f2a1c1-55c3-4c98-80b6-b457645d525b-kube-api-access-n77z8\") pod \"ovn-controller-9bsx2-config-xd5g9\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.808304 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/72f2a1c1-55c3-4c98-80b6-b457645d525b-var-run\") pod \"ovn-controller-9bsx2-config-xd5g9\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.808337 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/72f2a1c1-55c3-4c98-80b6-b457645d525b-var-log-ovn\") pod 
\"ovn-controller-9bsx2-config-xd5g9\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.808375 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/72f2a1c1-55c3-4c98-80b6-b457645d525b-additional-scripts\") pod \"ovn-controller-9bsx2-config-xd5g9\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.910230 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n77z8\" (UniqueName: \"kubernetes.io/projected/72f2a1c1-55c3-4c98-80b6-b457645d525b-kube-api-access-n77z8\") pod \"ovn-controller-9bsx2-config-xd5g9\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.910282 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/72f2a1c1-55c3-4c98-80b6-b457645d525b-var-run\") pod \"ovn-controller-9bsx2-config-xd5g9\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.910327 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/72f2a1c1-55c3-4c98-80b6-b457645d525b-var-log-ovn\") pod \"ovn-controller-9bsx2-config-xd5g9\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.910363 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/72f2a1c1-55c3-4c98-80b6-b457645d525b-additional-scripts\") pod \"ovn-controller-9bsx2-config-xd5g9\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.910431 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72f2a1c1-55c3-4c98-80b6-b457645d525b-scripts\") pod \"ovn-controller-9bsx2-config-xd5g9\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.910499 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/72f2a1c1-55c3-4c98-80b6-b457645d525b-var-run-ovn\") pod \"ovn-controller-9bsx2-config-xd5g9\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.910891 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/72f2a1c1-55c3-4c98-80b6-b457645d525b-var-run-ovn\") pod \"ovn-controller-9bsx2-config-xd5g9\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.911243 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/72f2a1c1-55c3-4c98-80b6-b457645d525b-var-run\") pod \"ovn-controller-9bsx2-config-xd5g9\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.911299 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/72f2a1c1-55c3-4c98-80b6-b457645d525b-var-log-ovn\") pod \"ovn-controller-9bsx2-config-xd5g9\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.912137 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/72f2a1c1-55c3-4c98-80b6-b457645d525b-additional-scripts\") pod \"ovn-controller-9bsx2-config-xd5g9\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.914213 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72f2a1c1-55c3-4c98-80b6-b457645d525b-scripts\") pod \"ovn-controller-9bsx2-config-xd5g9\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.950356 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n77z8\" (UniqueName: \"kubernetes.io/projected/72f2a1c1-55c3-4c98-80b6-b457645d525b-kube-api-access-n77z8\") pod \"ovn-controller-9bsx2-config-xd5g9\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:44 crc kubenswrapper[5129]: I0314 07:20:44.995856 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:45 crc kubenswrapper[5129]: I0314 07:20:45.012043 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:45 crc kubenswrapper[5129]: I0314 07:20:45.027227 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift\") pod \"swift-storage-0\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " pod="openstack/swift-storage-0" Mar 14 07:20:45 crc kubenswrapper[5129]: I0314 07:20:45.060223 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 14 07:20:45 crc kubenswrapper[5129]: I0314 07:20:45.248377 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5dxcc"] Mar 14 07:20:45 crc kubenswrapper[5129]: I0314 07:20:45.464501 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5dxcc" event={"ID":"5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7","Type":"ContainerStarted","Data":"b854816ffdb413b1f325152b1eea18a46d450c72ee8805a2b0a3db09e84d6a28"} Mar 14 07:20:45 crc kubenswrapper[5129]: I0314 07:20:45.465510 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5dxcc" event={"ID":"5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7","Type":"ContainerStarted","Data":"089a5227c9569fa5fd2770bc347850d8492e25bf896785e295532d613d4bc3fb"} Mar 14 07:20:45 crc kubenswrapper[5129]: I0314 07:20:45.483858 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9bsx2-config-xd5g9"] Mar 14 07:20:45 crc kubenswrapper[5129]: I0314 07:20:45.492492 5129 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-5dxcc" podStartSLOduration=1.49247273 podStartE2EDuration="1.49247273s" podCreationTimestamp="2026-03-14 07:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:20:45.478018106 +0000 UTC m=+1308.229933330" watchObservedRunningTime="2026-03-14 07:20:45.49247273 +0000 UTC m=+1308.244387914" Mar 14 07:20:45 crc kubenswrapper[5129]: I0314 07:20:45.617633 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 14 07:20:45 crc kubenswrapper[5129]: W0314 07:20:45.705380 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c0c6778_6f35_4243_9e82_ca3c8f3968fc.slice/crio-1a8ac9864a22197bd62d032b41808e93c4f88caf9eb643f52eb3ddfe8310a45c WatchSource:0}: Error finding container 1a8ac9864a22197bd62d032b41808e93c4f88caf9eb643f52eb3ddfe8310a45c: Status 404 returned error can't find the container with id 1a8ac9864a22197bd62d032b41808e93c4f88caf9eb643f52eb3ddfe8310a45c Mar 14 07:20:46 crc kubenswrapper[5129]: I0314 07:20:46.471340 5129 generic.go:334] "Generic (PLEG): container finished" podID="5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7" containerID="b854816ffdb413b1f325152b1eea18a46d450c72ee8805a2b0a3db09e84d6a28" exitCode=0 Mar 14 07:20:46 crc kubenswrapper[5129]: I0314 07:20:46.471576 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5dxcc" event={"ID":"5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7","Type":"ContainerDied","Data":"b854816ffdb413b1f325152b1eea18a46d450c72ee8805a2b0a3db09e84d6a28"} Mar 14 07:20:46 crc kubenswrapper[5129]: I0314 07:20:46.474867 5129 generic.go:334] "Generic (PLEG): container finished" podID="72f2a1c1-55c3-4c98-80b6-b457645d525b" 
containerID="280e70bcc439b2c014146604829b8ba14a084f751f5b8f4de23c94c4c3333070" exitCode=0 Mar 14 07:20:46 crc kubenswrapper[5129]: I0314 07:20:46.474963 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9bsx2-config-xd5g9" event={"ID":"72f2a1c1-55c3-4c98-80b6-b457645d525b","Type":"ContainerDied","Data":"280e70bcc439b2c014146604829b8ba14a084f751f5b8f4de23c94c4c3333070"} Mar 14 07:20:46 crc kubenswrapper[5129]: I0314 07:20:46.474984 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9bsx2-config-xd5g9" event={"ID":"72f2a1c1-55c3-4c98-80b6-b457645d525b","Type":"ContainerStarted","Data":"f9359c3feb5b13dd7731af29089aa2c79dad122156078a07a40e3850a3a0bfb3"} Mar 14 07:20:46 crc kubenswrapper[5129]: I0314 07:20:46.476383 5129 generic.go:334] "Generic (PLEG): container finished" podID="f6261e6b-f331-4dcb-8380-167e8f547e1b" containerID="50c855d9f68ff01d5db8cb50e2191c259ba79b1ab08bfa29c635893e5624e8d0" exitCode=0 Mar 14 07:20:46 crc kubenswrapper[5129]: I0314 07:20:46.476422 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f6261e6b-f331-4dcb-8380-167e8f547e1b","Type":"ContainerDied","Data":"50c855d9f68ff01d5db8cb50e2191c259ba79b1ab08bfa29c635893e5624e8d0"} Mar 14 07:20:46 crc kubenswrapper[5129]: I0314 07:20:46.478679 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerStarted","Data":"1a8ac9864a22197bd62d032b41808e93c4f88caf9eb643f52eb3ddfe8310a45c"} Mar 14 07:20:46 crc kubenswrapper[5129]: I0314 07:20:46.480860 5129 generic.go:334] "Generic (PLEG): container finished" podID="d291cef2-24d8-4ae6-aa4f-dfa8e782db15" containerID="6503e2aa7dd95bfb4855a65e7d0dea939ea6ca565ff7c365cb8197e3cf19b122" exitCode=0 Mar 14 07:20:46 crc kubenswrapper[5129]: I0314 07:20:46.480885 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"d291cef2-24d8-4ae6-aa4f-dfa8e782db15","Type":"ContainerDied","Data":"6503e2aa7dd95bfb4855a65e7d0dea939ea6ca565ff7c365cb8197e3cf19b122"} Mar 14 07:20:47 crc kubenswrapper[5129]: I0314 07:20:47.529888 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d291cef2-24d8-4ae6-aa4f-dfa8e782db15","Type":"ContainerStarted","Data":"dec65be26b8c4e1660c706156e48f471c694d12966264082fc806b6ecd1fc6c1"} Mar 14 07:20:47 crc kubenswrapper[5129]: I0314 07:20:47.530419 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 14 07:20:47 crc kubenswrapper[5129]: I0314 07:20:47.531867 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f6261e6b-f331-4dcb-8380-167e8f547e1b","Type":"ContainerStarted","Data":"277f280a8838bdfe5fc5653338c9173353a6393548f25eb001dedc1540db7dbb"} Mar 14 07:20:47 crc kubenswrapper[5129]: I0314 07:20:47.532177 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:20:47 crc kubenswrapper[5129]: I0314 07:20:47.537962 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerStarted","Data":"92720726c9e708458b5da5758e478b9521c17f64ffaa2789a7359d2da3f264d0"} Mar 14 07:20:47 crc kubenswrapper[5129]: I0314 07:20:47.537993 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerStarted","Data":"f1efa38796d71cf5fcb64262b9513b8dbb5d0472b2f9b02765e6507ec1e1670d"} Mar 14 07:20:47 crc kubenswrapper[5129]: I0314 07:20:47.538005 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerStarted","Data":"89c397e9fa15d1498f1dfb3620f6d0877fdad87ff4762961a8804fdb1503a9fc"} Mar 14 07:20:47 crc kubenswrapper[5129]: I0314 07:20:47.538014 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerStarted","Data":"89fd8247fe552c4e8c71705aad2c39088f4d80948c9776ee701e127517dfcd2e"} Mar 14 07:20:47 crc kubenswrapper[5129]: I0314 07:20:47.562314 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=48.87726679 podStartE2EDuration="56.562296949s" podCreationTimestamp="2026-03-14 07:19:51 +0000 UTC" firstStartedPulling="2026-03-14 07:20:04.68870401 +0000 UTC m=+1267.440619194" lastFinishedPulling="2026-03-14 07:20:12.373734169 +0000 UTC m=+1275.125649353" observedRunningTime="2026-03-14 07:20:47.548612165 +0000 UTC m=+1310.300527349" watchObservedRunningTime="2026-03-14 07:20:47.562296949 +0000 UTC m=+1310.314212133" Mar 14 07:20:47 crc kubenswrapper[5129]: I0314 07:20:47.580755 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=49.789138182 podStartE2EDuration="56.580735202s" podCreationTimestamp="2026-03-14 07:19:51 +0000 UTC" firstStartedPulling="2026-03-14 07:20:04.882279352 +0000 UTC m=+1267.634194536" lastFinishedPulling="2026-03-14 07:20:11.673876372 +0000 UTC m=+1274.425791556" observedRunningTime="2026-03-14 07:20:47.579304343 +0000 UTC m=+1310.331219527" watchObservedRunningTime="2026-03-14 07:20:47.580735202 +0000 UTC m=+1310.332650386" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.111947 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.114399 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5dxcc" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.208198 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7-operator-scripts\") pod \"5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7\" (UID: \"5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7\") " Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.208238 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/72f2a1c1-55c3-4c98-80b6-b457645d525b-var-log-ovn\") pod \"72f2a1c1-55c3-4c98-80b6-b457645d525b\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.208321 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/72f2a1c1-55c3-4c98-80b6-b457645d525b-additional-scripts\") pod \"72f2a1c1-55c3-4c98-80b6-b457645d525b\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.208355 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72f2a1c1-55c3-4c98-80b6-b457645d525b-scripts\") pod \"72f2a1c1-55c3-4c98-80b6-b457645d525b\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.208382 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lt4m\" (UniqueName: \"kubernetes.io/projected/5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7-kube-api-access-7lt4m\") pod \"5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7\" 
(UID: \"5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7\") " Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.208438 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n77z8\" (UniqueName: \"kubernetes.io/projected/72f2a1c1-55c3-4c98-80b6-b457645d525b-kube-api-access-n77z8\") pod \"72f2a1c1-55c3-4c98-80b6-b457645d525b\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.208453 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/72f2a1c1-55c3-4c98-80b6-b457645d525b-var-run\") pod \"72f2a1c1-55c3-4c98-80b6-b457645d525b\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.208472 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/72f2a1c1-55c3-4c98-80b6-b457645d525b-var-run-ovn\") pod \"72f2a1c1-55c3-4c98-80b6-b457645d525b\" (UID: \"72f2a1c1-55c3-4c98-80b6-b457645d525b\") " Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.208865 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72f2a1c1-55c3-4c98-80b6-b457645d525b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "72f2a1c1-55c3-4c98-80b6-b457645d525b" (UID: "72f2a1c1-55c3-4c98-80b6-b457645d525b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.209927 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72f2a1c1-55c3-4c98-80b6-b457645d525b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "72f2a1c1-55c3-4c98-80b6-b457645d525b" (UID: "72f2a1c1-55c3-4c98-80b6-b457645d525b"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.210140 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72f2a1c1-55c3-4c98-80b6-b457645d525b-scripts" (OuterVolumeSpecName: "scripts") pod "72f2a1c1-55c3-4c98-80b6-b457645d525b" (UID: "72f2a1c1-55c3-4c98-80b6-b457645d525b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.210092 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72f2a1c1-55c3-4c98-80b6-b457645d525b-var-run" (OuterVolumeSpecName: "var-run") pod "72f2a1c1-55c3-4c98-80b6-b457645d525b" (UID: "72f2a1c1-55c3-4c98-80b6-b457645d525b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.210372 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72f2a1c1-55c3-4c98-80b6-b457645d525b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "72f2a1c1-55c3-4c98-80b6-b457645d525b" (UID: "72f2a1c1-55c3-4c98-80b6-b457645d525b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.210703 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7" (UID: "5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.216837 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7-kube-api-access-7lt4m" (OuterVolumeSpecName: "kube-api-access-7lt4m") pod "5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7" (UID: "5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7"). InnerVolumeSpecName "kube-api-access-7lt4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.221329 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72f2a1c1-55c3-4c98-80b6-b457645d525b-kube-api-access-n77z8" (OuterVolumeSpecName: "kube-api-access-n77z8") pod "72f2a1c1-55c3-4c98-80b6-b457645d525b" (UID: "72f2a1c1-55c3-4c98-80b6-b457645d525b"). InnerVolumeSpecName "kube-api-access-n77z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.310253 5129 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/72f2a1c1-55c3-4c98-80b6-b457645d525b-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.310292 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72f2a1c1-55c3-4c98-80b6-b457645d525b-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.310302 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lt4m\" (UniqueName: \"kubernetes.io/projected/5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7-kube-api-access-7lt4m\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.310314 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n77z8\" (UniqueName: 
\"kubernetes.io/projected/72f2a1c1-55c3-4c98-80b6-b457645d525b-kube-api-access-n77z8\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.310323 5129 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/72f2a1c1-55c3-4c98-80b6-b457645d525b-var-run\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.310331 5129 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/72f2a1c1-55c3-4c98-80b6-b457645d525b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.310339 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.310346 5129 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/72f2a1c1-55c3-4c98-80b6-b457645d525b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.551168 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5dxcc" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.551386 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5dxcc" event={"ID":"5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7","Type":"ContainerDied","Data":"089a5227c9569fa5fd2770bc347850d8492e25bf896785e295532d613d4bc3fb"} Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.551415 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="089a5227c9569fa5fd2770bc347850d8492e25bf896785e295532d613d4bc3fb" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.558146 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9bsx2-config-xd5g9" Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.558171 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9bsx2-config-xd5g9" event={"ID":"72f2a1c1-55c3-4c98-80b6-b457645d525b","Type":"ContainerDied","Data":"f9359c3feb5b13dd7731af29089aa2c79dad122156078a07a40e3850a3a0bfb3"} Mar 14 07:20:48 crc kubenswrapper[5129]: I0314 07:20:48.558261 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9359c3feb5b13dd7731af29089aa2c79dad122156078a07a40e3850a3a0bfb3" Mar 14 07:20:49 crc kubenswrapper[5129]: I0314 07:20:49.236726 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9bsx2-config-xd5g9"] Mar 14 07:20:49 crc kubenswrapper[5129]: I0314 07:20:49.249564 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9bsx2-config-xd5g9"] Mar 14 07:20:49 crc kubenswrapper[5129]: I0314 07:20:49.393867 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-9bsx2" Mar 14 07:20:49 crc kubenswrapper[5129]: I0314 07:20:49.573996 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:20:49 crc kubenswrapper[5129]: I0314 07:20:49.574051 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:20:50 crc kubenswrapper[5129]: I0314 07:20:50.046788 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72f2a1c1-55c3-4c98-80b6-b457645d525b" path="/var/lib/kubelet/pods/72f2a1c1-55c3-4c98-80b6-b457645d525b/volumes" Mar 14 07:20:58 crc kubenswrapper[5129]: E0314 07:20:58.118057 5129 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:dae5e39780d5a15eed030c7009f8e5317139d447558ac83f038497be594be120" Mar 14 07:20:58 crc kubenswrapper[5129]: E0314 07:20:58.118630 5129 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:dae5e39780d5a15eed030c7009f8e5317139d447558ac83f038497be594be120,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rk8zl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-llsfr_openstack(2c8203ff-5259-4d83-a96b-362be3884609): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Mar 14 07:20:58 crc kubenswrapper[5129]: E0314 07:20:58.119910 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-llsfr" podUID="2c8203ff-5259-4d83-a96b-362be3884609" Mar 14 07:20:58 crc kubenswrapper[5129]: I0314 07:20:58.642173 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerStarted","Data":"3059f4424e061a65f5c7669f2d08d51ae956e9986395c5c1a40f1a6c63f29fe4"} Mar 14 07:20:58 crc kubenswrapper[5129]: I0314 07:20:58.642547 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerStarted","Data":"1bfacbb3c7f4e07f82826af5ace3d723b967a5f7ed7522a9988f754dcb73cd76"} Mar 14 07:20:58 crc kubenswrapper[5129]: I0314 07:20:58.642561 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerStarted","Data":"ea8689b6701500735737f4995bd8adb2dcef261718f8b5bd911de435e0c3a742"} Mar 14 07:20:58 crc kubenswrapper[5129]: E0314 07:20:58.643406 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:dae5e39780d5a15eed030c7009f8e5317139d447558ac83f038497be594be120\\\"\"" pod="openstack/glance-db-sync-llsfr" podUID="2c8203ff-5259-4d83-a96b-362be3884609" Mar 14 07:20:59 crc kubenswrapper[5129]: I0314 07:20:59.656201 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerStarted","Data":"de4e9cf22659332ab4a321d87b2be0419c0f5ba9584d4883e24f44b8cd760689"} Mar 14 07:21:00 crc kubenswrapper[5129]: I0314 07:21:00.669176 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerStarted","Data":"1439671428e36f062ecec5e844057f831307ced7145f7e08d2685ea64612ca03"} Mar 14 07:21:00 crc kubenswrapper[5129]: I0314 07:21:00.669490 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerStarted","Data":"21d34fb8c8ae530f947029ee8db5c31bb1fff7d650c0adaa7c67025d5674e1a9"} Mar 14 07:21:00 crc kubenswrapper[5129]: I0314 07:21:00.669501 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerStarted","Data":"d28de17046ef16ad3ac1f241f5db591353ee653132756124d068067d81af0578"} Mar 14 07:21:00 crc kubenswrapper[5129]: I0314 07:21:00.669509 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerStarted","Data":"3e9b50ce398570eef3d701e17b2478e6a2800a85b0238429627e5231237c15ba"} Mar 14 07:21:01 crc kubenswrapper[5129]: I0314 07:21:01.683795 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerStarted","Data":"387ec0f7caa0f65ef81af98a02901cb03b71d081ad6cad271dfe418e7919ac2a"} Mar 14 07:21:02 crc kubenswrapper[5129]: I0314 07:21:02.660969 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:21:02 crc kubenswrapper[5129]: I0314 07:21:02.713819 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerStarted","Data":"68f079c63627945e154f1a8ae1e17bd4c418f8de737ec76b1962ceb2dce0568b"} Mar 14 07:21:02 crc kubenswrapper[5129]: I0314 07:21:02.713917 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerStarted","Data":"8dfc0ae842b54e831f58e889aa510e0e67ccf64cc21775a958610f8a34d354da"} Mar 14 07:21:02 crc kubenswrapper[5129]: I0314 07:21:02.772582 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.855169177 podStartE2EDuration="34.772560767s" podCreationTimestamp="2026-03-14 07:20:28 +0000 UTC" firstStartedPulling="2026-03-14 07:20:45.708589888 +0000 UTC m=+1308.460505072" lastFinishedPulling="2026-03-14 07:20:59.625981478 +0000 UTC m=+1322.377896662" observedRunningTime="2026-03-14 07:21:02.772478205 +0000 UTC m=+1325.524393389" watchObservedRunningTime="2026-03-14 07:21:02.772560767 +0000 UTC m=+1325.524475971" Mar 14 07:21:02 crc kubenswrapper[5129]: I0314 07:21:02.917788 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.046774 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67754df655-mzts4"] Mar 14 07:21:03 crc kubenswrapper[5129]: E0314 07:21:03.047191 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7" containerName="mariadb-account-create-update" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.047213 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7" containerName="mariadb-account-create-update" Mar 14 07:21:03 crc kubenswrapper[5129]: E0314 07:21:03.047239 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72f2a1c1-55c3-4c98-80b6-b457645d525b" 
containerName="ovn-config" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.047248 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="72f2a1c1-55c3-4c98-80b6-b457645d525b" containerName="ovn-config" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.047444 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="72f2a1c1-55c3-4c98-80b6-b457645d525b" containerName="ovn-config" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.047457 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7" containerName="mariadb-account-create-update" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.048531 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.050983 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.054791 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67754df655-mzts4"] Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.163872 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w26b4\" (UniqueName: \"kubernetes.io/projected/9f4da728-25d7-4876-8039-c6db1f4ee858-kube-api-access-w26b4\") pod \"dnsmasq-dns-67754df655-mzts4\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.163955 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-config\") pod \"dnsmasq-dns-67754df655-mzts4\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc kubenswrapper[5129]: 
I0314 07:21:03.164039 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-dns-svc\") pod \"dnsmasq-dns-67754df655-mzts4\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.164079 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-ovsdbserver-sb\") pod \"dnsmasq-dns-67754df655-mzts4\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.164110 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-ovsdbserver-nb\") pod \"dnsmasq-dns-67754df655-mzts4\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.164183 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-dns-swift-storage-0\") pod \"dnsmasq-dns-67754df655-mzts4\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.266176 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-dns-svc\") pod \"dnsmasq-dns-67754df655-mzts4\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc 
kubenswrapper[5129]: I0314 07:21:03.266244 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-ovsdbserver-sb\") pod \"dnsmasq-dns-67754df655-mzts4\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.266284 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-ovsdbserver-nb\") pod \"dnsmasq-dns-67754df655-mzts4\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.266340 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-dns-swift-storage-0\") pod \"dnsmasq-dns-67754df655-mzts4\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.266406 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w26b4\" (UniqueName: \"kubernetes.io/projected/9f4da728-25d7-4876-8039-c6db1f4ee858-kube-api-access-w26b4\") pod \"dnsmasq-dns-67754df655-mzts4\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.266472 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-config\") pod \"dnsmasq-dns-67754df655-mzts4\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.267785 
5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-ovsdbserver-nb\") pod \"dnsmasq-dns-67754df655-mzts4\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.267801 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-dns-svc\") pod \"dnsmasq-dns-67754df655-mzts4\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.267832 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-config\") pod \"dnsmasq-dns-67754df655-mzts4\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.267875 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-ovsdbserver-sb\") pod \"dnsmasq-dns-67754df655-mzts4\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.268184 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-dns-swift-storage-0\") pod \"dnsmasq-dns-67754df655-mzts4\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.287140 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w26b4\" (UniqueName: 
\"kubernetes.io/projected/9f4da728-25d7-4876-8039-c6db1f4ee858-kube-api-access-w26b4\") pod \"dnsmasq-dns-67754df655-mzts4\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:03 crc kubenswrapper[5129]: I0314 07:21:03.366495 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.438998 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67754df655-mzts4"] Mar 14 07:21:04 crc kubenswrapper[5129]: W0314 07:21:04.459787 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f4da728_25d7_4876_8039_c6db1f4ee858.slice/crio-55723da1a4ff49141705706221a2538988f6d88f5e9eac53172988972287b560 WatchSource:0}: Error finding container 55723da1a4ff49141705706221a2538988f6d88f5e9eac53172988972287b560: Status 404 returned error can't find the container with id 55723da1a4ff49141705706221a2538988f6d88f5e9eac53172988972287b560 Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.607584 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-lqhkw"] Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.608708 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-lqhkw" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.655578 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lqhkw"] Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.691413 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf76f7dd-0392-4e68-a441-c982f7055f24-operator-scripts\") pod \"cinder-db-create-lqhkw\" (UID: \"bf76f7dd-0392-4e68-a441-c982f7055f24\") " pod="openstack/cinder-db-create-lqhkw" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.691496 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9whb\" (UniqueName: \"kubernetes.io/projected/bf76f7dd-0392-4e68-a441-c982f7055f24-kube-api-access-q9whb\") pod \"cinder-db-create-lqhkw\" (UID: \"bf76f7dd-0392-4e68-a441-c982f7055f24\") " pod="openstack/cinder-db-create-lqhkw" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.721004 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9e6d-account-create-update-l4gqw"] Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.725495 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9e6d-account-create-update-l4gqw" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.727549 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.731571 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-mzts4" event={"ID":"9f4da728-25d7-4876-8039-c6db1f4ee858","Type":"ContainerStarted","Data":"55723da1a4ff49141705706221a2538988f6d88f5e9eac53172988972287b560"} Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.731687 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9e6d-account-create-update-l4gqw"] Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.792496 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf76f7dd-0392-4e68-a441-c982f7055f24-operator-scripts\") pod \"cinder-db-create-lqhkw\" (UID: \"bf76f7dd-0392-4e68-a441-c982f7055f24\") " pod="openstack/cinder-db-create-lqhkw" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.792543 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9whb\" (UniqueName: \"kubernetes.io/projected/bf76f7dd-0392-4e68-a441-c982f7055f24-kube-api-access-q9whb\") pod \"cinder-db-create-lqhkw\" (UID: \"bf76f7dd-0392-4e68-a441-c982f7055f24\") " pod="openstack/cinder-db-create-lqhkw" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.792589 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8vjr\" (UniqueName: \"kubernetes.io/projected/dd17de58-336d-485a-aef2-dbab072a4007-kube-api-access-r8vjr\") pod \"cinder-9e6d-account-create-update-l4gqw\" (UID: \"dd17de58-336d-485a-aef2-dbab072a4007\") " pod="openstack/cinder-9e6d-account-create-update-l4gqw" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 
07:21:04.792628 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd17de58-336d-485a-aef2-dbab072a4007-operator-scripts\") pod \"cinder-9e6d-account-create-update-l4gqw\" (UID: \"dd17de58-336d-485a-aef2-dbab072a4007\") " pod="openstack/cinder-9e6d-account-create-update-l4gqw" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.793161 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf76f7dd-0392-4e68-a441-c982f7055f24-operator-scripts\") pod \"cinder-db-create-lqhkw\" (UID: \"bf76f7dd-0392-4e68-a441-c982f7055f24\") " pod="openstack/cinder-db-create-lqhkw" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.806379 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-7ggl5"] Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.807341 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7ggl5" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.819237 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7ggl5"] Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.825673 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9whb\" (UniqueName: \"kubernetes.io/projected/bf76f7dd-0392-4e68-a441-c982f7055f24-kube-api-access-q9whb\") pod \"cinder-db-create-lqhkw\" (UID: \"bf76f7dd-0392-4e68-a441-c982f7055f24\") " pod="openstack/cinder-db-create-lqhkw" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.893973 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwh4m\" (UniqueName: \"kubernetes.io/projected/a9190972-0c7d-4d51-a42e-089c06e17395-kube-api-access-lwh4m\") pod \"barbican-db-create-7ggl5\" (UID: \"a9190972-0c7d-4d51-a42e-089c06e17395\") " pod="openstack/barbican-db-create-7ggl5" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.894099 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9190972-0c7d-4d51-a42e-089c06e17395-operator-scripts\") pod \"barbican-db-create-7ggl5\" (UID: \"a9190972-0c7d-4d51-a42e-089c06e17395\") " pod="openstack/barbican-db-create-7ggl5" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.894145 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8vjr\" (UniqueName: \"kubernetes.io/projected/dd17de58-336d-485a-aef2-dbab072a4007-kube-api-access-r8vjr\") pod \"cinder-9e6d-account-create-update-l4gqw\" (UID: \"dd17de58-336d-485a-aef2-dbab072a4007\") " pod="openstack/cinder-9e6d-account-create-update-l4gqw" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.894173 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd17de58-336d-485a-aef2-dbab072a4007-operator-scripts\") pod \"cinder-9e6d-account-create-update-l4gqw\" (UID: \"dd17de58-336d-485a-aef2-dbab072a4007\") " pod="openstack/cinder-9e6d-account-create-update-l4gqw" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.895033 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd17de58-336d-485a-aef2-dbab072a4007-operator-scripts\") pod \"cinder-9e6d-account-create-update-l4gqw\" (UID: \"dd17de58-336d-485a-aef2-dbab072a4007\") " pod="openstack/cinder-9e6d-account-create-update-l4gqw" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.906888 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-fwnrp"] Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.907777 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fwnrp" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.916349 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-c51b-account-create-update-7gs2v"] Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.923782 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8vjr\" (UniqueName: \"kubernetes.io/projected/dd17de58-336d-485a-aef2-dbab072a4007-kube-api-access-r8vjr\") pod \"cinder-9e6d-account-create-update-l4gqw\" (UID: \"dd17de58-336d-485a-aef2-dbab072a4007\") " pod="openstack/cinder-9e6d-account-create-update-l4gqw" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.929490 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-c51b-account-create-update-7gs2v" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.939566 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.940315 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fwnrp"] Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.946163 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lqhkw" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.959410 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c51b-account-create-update-7gs2v"] Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.997452 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-rkl9q"] Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.997635 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/534a50f3-4cdf-4532-9d16-d61c1792d403-operator-scripts\") pod \"barbican-c51b-account-create-update-7gs2v\" (UID: \"534a50f3-4cdf-4532-9d16-d61c1792d403\") " pod="openstack/barbican-c51b-account-create-update-7gs2v" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.997703 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9190972-0c7d-4d51-a42e-089c06e17395-operator-scripts\") pod \"barbican-db-create-7ggl5\" (UID: \"a9190972-0c7d-4d51-a42e-089c06e17395\") " pod="openstack/barbican-db-create-7ggl5" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.997734 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx42h\" (UniqueName: 
\"kubernetes.io/projected/410747b6-e92a-4253-a502-6e43b9e6048b-kube-api-access-rx42h\") pod \"neutron-db-create-fwnrp\" (UID: \"410747b6-e92a-4253-a502-6e43b9e6048b\") " pod="openstack/neutron-db-create-fwnrp" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.997761 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2d2s\" (UniqueName: \"kubernetes.io/projected/534a50f3-4cdf-4532-9d16-d61c1792d403-kube-api-access-r2d2s\") pod \"barbican-c51b-account-create-update-7gs2v\" (UID: \"534a50f3-4cdf-4532-9d16-d61c1792d403\") " pod="openstack/barbican-c51b-account-create-update-7gs2v" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.997853 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwh4m\" (UniqueName: \"kubernetes.io/projected/a9190972-0c7d-4d51-a42e-089c06e17395-kube-api-access-lwh4m\") pod \"barbican-db-create-7ggl5\" (UID: \"a9190972-0c7d-4d51-a42e-089c06e17395\") " pod="openstack/barbican-db-create-7ggl5" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.997886 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/410747b6-e92a-4253-a502-6e43b9e6048b-operator-scripts\") pod \"neutron-db-create-fwnrp\" (UID: \"410747b6-e92a-4253-a502-6e43b9e6048b\") " pod="openstack/neutron-db-create-fwnrp" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.998525 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rkl9q" Mar 14 07:21:04 crc kubenswrapper[5129]: I0314 07:21:04.998587 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9190972-0c7d-4d51-a42e-089c06e17395-operator-scripts\") pod \"barbican-db-create-7ggl5\" (UID: \"a9190972-0c7d-4d51-a42e-089c06e17395\") " pod="openstack/barbican-db-create-7ggl5" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.001241 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.002064 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.002357 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r97ql" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.003087 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rkl9q"] Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.004179 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.023788 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwh4m\" (UniqueName: \"kubernetes.io/projected/a9190972-0c7d-4d51-a42e-089c06e17395-kube-api-access-lwh4m\") pod \"barbican-db-create-7ggl5\" (UID: \"a9190972-0c7d-4d51-a42e-089c06e17395\") " pod="openstack/barbican-db-create-7ggl5" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.088958 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9e6d-account-create-update-l4gqw" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.100882 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/534a50f3-4cdf-4532-9d16-d61c1792d403-operator-scripts\") pod \"barbican-c51b-account-create-update-7gs2v\" (UID: \"534a50f3-4cdf-4532-9d16-d61c1792d403\") " pod="openstack/barbican-c51b-account-create-update-7gs2v" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.100956 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx42h\" (UniqueName: \"kubernetes.io/projected/410747b6-e92a-4253-a502-6e43b9e6048b-kube-api-access-rx42h\") pod \"neutron-db-create-fwnrp\" (UID: \"410747b6-e92a-4253-a502-6e43b9e6048b\") " pod="openstack/neutron-db-create-fwnrp" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.100984 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2d2s\" (UniqueName: \"kubernetes.io/projected/534a50f3-4cdf-4532-9d16-d61c1792d403-kube-api-access-r2d2s\") pod \"barbican-c51b-account-create-update-7gs2v\" (UID: \"534a50f3-4cdf-4532-9d16-d61c1792d403\") " pod="openstack/barbican-c51b-account-create-update-7gs2v" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.101058 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-642rk\" (UniqueName: \"kubernetes.io/projected/4cb44baa-73cf-418a-9cb5-4e7fa092524c-kube-api-access-642rk\") pod \"keystone-db-sync-rkl9q\" (UID: \"4cb44baa-73cf-418a-9cb5-4e7fa092524c\") " pod="openstack/keystone-db-sync-rkl9q" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.101138 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4cb44baa-73cf-418a-9cb5-4e7fa092524c-combined-ca-bundle\") pod \"keystone-db-sync-rkl9q\" (UID: \"4cb44baa-73cf-418a-9cb5-4e7fa092524c\") " pod="openstack/keystone-db-sync-rkl9q" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.101270 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb44baa-73cf-418a-9cb5-4e7fa092524c-config-data\") pod \"keystone-db-sync-rkl9q\" (UID: \"4cb44baa-73cf-418a-9cb5-4e7fa092524c\") " pod="openstack/keystone-db-sync-rkl9q" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.101310 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/410747b6-e92a-4253-a502-6e43b9e6048b-operator-scripts\") pod \"neutron-db-create-fwnrp\" (UID: \"410747b6-e92a-4253-a502-6e43b9e6048b\") " pod="openstack/neutron-db-create-fwnrp" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.102036 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/534a50f3-4cdf-4532-9d16-d61c1792d403-operator-scripts\") pod \"barbican-c51b-account-create-update-7gs2v\" (UID: \"534a50f3-4cdf-4532-9d16-d61c1792d403\") " pod="openstack/barbican-c51b-account-create-update-7gs2v" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.105028 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/410747b6-e92a-4253-a502-6e43b9e6048b-operator-scripts\") pod \"neutron-db-create-fwnrp\" (UID: \"410747b6-e92a-4253-a502-6e43b9e6048b\") " pod="openstack/neutron-db-create-fwnrp" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.122885 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b690-account-create-update-x9m2c"] Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.124012 5129 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b690-account-create-update-x9m2c" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.124930 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2d2s\" (UniqueName: \"kubernetes.io/projected/534a50f3-4cdf-4532-9d16-d61c1792d403-kube-api-access-r2d2s\") pod \"barbican-c51b-account-create-update-7gs2v\" (UID: \"534a50f3-4cdf-4532-9d16-d61c1792d403\") " pod="openstack/barbican-c51b-account-create-update-7gs2v" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.129309 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.130027 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b690-account-create-update-x9m2c"] Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.133251 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx42h\" (UniqueName: \"kubernetes.io/projected/410747b6-e92a-4253-a502-6e43b9e6048b-kube-api-access-rx42h\") pod \"neutron-db-create-fwnrp\" (UID: \"410747b6-e92a-4253-a502-6e43b9e6048b\") " pod="openstack/neutron-db-create-fwnrp" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.172556 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7ggl5" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.202720 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7x4p\" (UniqueName: \"kubernetes.io/projected/f1170613-23db-4ba5-9e73-d74a8f68fa8e-kube-api-access-p7x4p\") pod \"neutron-b690-account-create-update-x9m2c\" (UID: \"f1170613-23db-4ba5-9e73-d74a8f68fa8e\") " pod="openstack/neutron-b690-account-create-update-x9m2c" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.202805 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-642rk\" (UniqueName: \"kubernetes.io/projected/4cb44baa-73cf-418a-9cb5-4e7fa092524c-kube-api-access-642rk\") pod \"keystone-db-sync-rkl9q\" (UID: \"4cb44baa-73cf-418a-9cb5-4e7fa092524c\") " pod="openstack/keystone-db-sync-rkl9q" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.202866 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb44baa-73cf-418a-9cb5-4e7fa092524c-combined-ca-bundle\") pod \"keystone-db-sync-rkl9q\" (UID: \"4cb44baa-73cf-418a-9cb5-4e7fa092524c\") " pod="openstack/keystone-db-sync-rkl9q" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.202899 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb44baa-73cf-418a-9cb5-4e7fa092524c-config-data\") pod \"keystone-db-sync-rkl9q\" (UID: \"4cb44baa-73cf-418a-9cb5-4e7fa092524c\") " pod="openstack/keystone-db-sync-rkl9q" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.202944 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1170613-23db-4ba5-9e73-d74a8f68fa8e-operator-scripts\") pod \"neutron-b690-account-create-update-x9m2c\" (UID: 
\"f1170613-23db-4ba5-9e73-d74a8f68fa8e\") " pod="openstack/neutron-b690-account-create-update-x9m2c" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.207386 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb44baa-73cf-418a-9cb5-4e7fa092524c-config-data\") pod \"keystone-db-sync-rkl9q\" (UID: \"4cb44baa-73cf-418a-9cb5-4e7fa092524c\") " pod="openstack/keystone-db-sync-rkl9q" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.210581 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb44baa-73cf-418a-9cb5-4e7fa092524c-combined-ca-bundle\") pod \"keystone-db-sync-rkl9q\" (UID: \"4cb44baa-73cf-418a-9cb5-4e7fa092524c\") " pod="openstack/keystone-db-sync-rkl9q" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.222650 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-642rk\" (UniqueName: \"kubernetes.io/projected/4cb44baa-73cf-418a-9cb5-4e7fa092524c-kube-api-access-642rk\") pod \"keystone-db-sync-rkl9q\" (UID: \"4cb44baa-73cf-418a-9cb5-4e7fa092524c\") " pod="openstack/keystone-db-sync-rkl9q" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.228134 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fwnrp" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.260082 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-c51b-account-create-update-7gs2v" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.306372 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7x4p\" (UniqueName: \"kubernetes.io/projected/f1170613-23db-4ba5-9e73-d74a8f68fa8e-kube-api-access-p7x4p\") pod \"neutron-b690-account-create-update-x9m2c\" (UID: \"f1170613-23db-4ba5-9e73-d74a8f68fa8e\") " pod="openstack/neutron-b690-account-create-update-x9m2c" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.306478 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1170613-23db-4ba5-9e73-d74a8f68fa8e-operator-scripts\") pod \"neutron-b690-account-create-update-x9m2c\" (UID: \"f1170613-23db-4ba5-9e73-d74a8f68fa8e\") " pod="openstack/neutron-b690-account-create-update-x9m2c" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.307135 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1170613-23db-4ba5-9e73-d74a8f68fa8e-operator-scripts\") pod \"neutron-b690-account-create-update-x9m2c\" (UID: \"f1170613-23db-4ba5-9e73-d74a8f68fa8e\") " pod="openstack/neutron-b690-account-create-update-x9m2c" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.313828 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rkl9q" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.359147 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7x4p\" (UniqueName: \"kubernetes.io/projected/f1170613-23db-4ba5-9e73-d74a8f68fa8e-kube-api-access-p7x4p\") pod \"neutron-b690-account-create-update-x9m2c\" (UID: \"f1170613-23db-4ba5-9e73-d74a8f68fa8e\") " pod="openstack/neutron-b690-account-create-update-x9m2c" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.442112 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lqhkw"] Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.470749 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b690-account-create-update-x9m2c" Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.488033 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9e6d-account-create-update-l4gqw"] Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.562589 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7ggl5"] Mar 14 07:21:05 crc kubenswrapper[5129]: W0314 07:21:05.574532 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9190972_0c7d_4d51_a42e_089c06e17395.slice/crio-eb8473f037fae886a460b8b331c543a228a0a701aa8e64e02f5974ef9b59823b WatchSource:0}: Error finding container eb8473f037fae886a460b8b331c543a228a0a701aa8e64e02f5974ef9b59823b: Status 404 returned error can't find the container with id eb8473f037fae886a460b8b331c543a228a0a701aa8e64e02f5974ef9b59823b Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.656912 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fwnrp"] Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.739695 5129 generic.go:334] "Generic (PLEG): container finished" 
podID="9f4da728-25d7-4876-8039-c6db1f4ee858" containerID="f1adc22aa4959719845b2e7c497fb8f4b284d7d2c33af89e5135942b5661c5a5" exitCode=0 Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.739779 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-mzts4" event={"ID":"9f4da728-25d7-4876-8039-c6db1f4ee858","Type":"ContainerDied","Data":"f1adc22aa4959719845b2e7c497fb8f4b284d7d2c33af89e5135942b5661c5a5"} Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.741935 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9e6d-account-create-update-l4gqw" event={"ID":"dd17de58-336d-485a-aef2-dbab072a4007","Type":"ContainerStarted","Data":"a88463f02852763f83fafa45c4219988197c9d3039389b7b618afa982ba3b35c"} Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.744211 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fwnrp" event={"ID":"410747b6-e92a-4253-a502-6e43b9e6048b","Type":"ContainerStarted","Data":"59c6407f153a9aa3e5661a10d724a07689264ca765d0fa66d02b735572cbff59"} Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.745638 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lqhkw" event={"ID":"bf76f7dd-0392-4e68-a441-c982f7055f24","Type":"ContainerStarted","Data":"429822ed9e5dbfa56fa8df65a2650e6417e4bf4afde015702d61ca98a82766c9"} Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.753717 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7ggl5" event={"ID":"a9190972-0c7d-4d51-a42e-089c06e17395","Type":"ContainerStarted","Data":"eb8473f037fae886a460b8b331c543a228a0a701aa8e64e02f5974ef9b59823b"} Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.895466 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c51b-account-create-update-7gs2v"] Mar 14 07:21:05 crc kubenswrapper[5129]: W0314 07:21:05.903331 5129 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod534a50f3_4cdf_4532_9d16_d61c1792d403.slice/crio-4880660c973ac5b02f5558cea7a59185f8daa8175ea706753a2fc9e23d0feeed WatchSource:0}: Error finding container 4880660c973ac5b02f5558cea7a59185f8daa8175ea706753a2fc9e23d0feeed: Status 404 returned error can't find the container with id 4880660c973ac5b02f5558cea7a59185f8daa8175ea706753a2fc9e23d0feeed Mar 14 07:21:05 crc kubenswrapper[5129]: I0314 07:21:05.960787 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rkl9q"] Mar 14 07:21:05 crc kubenswrapper[5129]: W0314 07:21:05.976166 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cb44baa_73cf_418a_9cb5_4e7fa092524c.slice/crio-0e8c0283b72b6c244c5e13f38fc019db18220ccb75bde50b09a280888da82bfa WatchSource:0}: Error finding container 0e8c0283b72b6c244c5e13f38fc019db18220ccb75bde50b09a280888da82bfa: Status 404 returned error can't find the container with id 0e8c0283b72b6c244c5e13f38fc019db18220ccb75bde50b09a280888da82bfa Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.124532 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b690-account-create-update-x9m2c"] Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.763865 5129 generic.go:334] "Generic (PLEG): container finished" podID="534a50f3-4cdf-4532-9d16-d61c1792d403" containerID="05a12761bd380c5b63fa4611c9eefc9182b51f8797441885cb0882e5f1428152" exitCode=0 Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.763950 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c51b-account-create-update-7gs2v" event={"ID":"534a50f3-4cdf-4532-9d16-d61c1792d403","Type":"ContainerDied","Data":"05a12761bd380c5b63fa4611c9eefc9182b51f8797441885cb0882e5f1428152"} Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.763980 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-c51b-account-create-update-7gs2v" event={"ID":"534a50f3-4cdf-4532-9d16-d61c1792d403","Type":"ContainerStarted","Data":"4880660c973ac5b02f5558cea7a59185f8daa8175ea706753a2fc9e23d0feeed"} Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.767812 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-mzts4" event={"ID":"9f4da728-25d7-4876-8039-c6db1f4ee858","Type":"ContainerStarted","Data":"0d67e389bdca0ea85d32e76fea34d68d08559fb7bc1f65eb1207c6c50fa51da3"} Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.767979 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.770486 5129 generic.go:334] "Generic (PLEG): container finished" podID="f1170613-23db-4ba5-9e73-d74a8f68fa8e" containerID="b9a8d6affbde66a3c9a08a97803076b6f677cdee9cb6d3732d5df6ebc8f3d2ac" exitCode=0 Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.770558 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b690-account-create-update-x9m2c" event={"ID":"f1170613-23db-4ba5-9e73-d74a8f68fa8e","Type":"ContainerDied","Data":"b9a8d6affbde66a3c9a08a97803076b6f677cdee9cb6d3732d5df6ebc8f3d2ac"} Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.770586 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b690-account-create-update-x9m2c" event={"ID":"f1170613-23db-4ba5-9e73-d74a8f68fa8e","Type":"ContainerStarted","Data":"49675e26a1a1a13093aeb1a50a2333cd41b377db13042c9df99cb1d93f585d1c"} Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.772553 5129 generic.go:334] "Generic (PLEG): container finished" podID="dd17de58-336d-485a-aef2-dbab072a4007" containerID="6b11ff7c689d36d9d7972ebdf5768535187d1d46df224de6748d31adcb31b8af" exitCode=0 Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.772630 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-9e6d-account-create-update-l4gqw" event={"ID":"dd17de58-336d-485a-aef2-dbab072a4007","Type":"ContainerDied","Data":"6b11ff7c689d36d9d7972ebdf5768535187d1d46df224de6748d31adcb31b8af"} Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.782304 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rkl9q" event={"ID":"4cb44baa-73cf-418a-9cb5-4e7fa092524c","Type":"ContainerStarted","Data":"0e8c0283b72b6c244c5e13f38fc019db18220ccb75bde50b09a280888da82bfa"} Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.786667 5129 generic.go:334] "Generic (PLEG): container finished" podID="410747b6-e92a-4253-a502-6e43b9e6048b" containerID="a51d9b57d69ec5670560c0fbc28cb54dabfd328b623f7c1ef7fa2ab822811ff1" exitCode=0 Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.786737 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fwnrp" event={"ID":"410747b6-e92a-4253-a502-6e43b9e6048b","Type":"ContainerDied","Data":"a51d9b57d69ec5670560c0fbc28cb54dabfd328b623f7c1ef7fa2ab822811ff1"} Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.788662 5129 generic.go:334] "Generic (PLEG): container finished" podID="bf76f7dd-0392-4e68-a441-c982f7055f24" containerID="e8ade8ce19ee6a7c918aab839e11b18d21502a596836bd2580c22973f1d8e9d4" exitCode=0 Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.788719 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lqhkw" event={"ID":"bf76f7dd-0392-4e68-a441-c982f7055f24","Type":"ContainerDied","Data":"e8ade8ce19ee6a7c918aab839e11b18d21502a596836bd2580c22973f1d8e9d4"} Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.797858 5129 generic.go:334] "Generic (PLEG): container finished" podID="a9190972-0c7d-4d51-a42e-089c06e17395" containerID="32afa5447dc7817f3bfebeca62cca531b5c3a36aacad13e6ca1180d089257887" exitCode=0 Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.797911 5129 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/barbican-db-create-7ggl5" event={"ID":"a9190972-0c7d-4d51-a42e-089c06e17395","Type":"ContainerDied","Data":"32afa5447dc7817f3bfebeca62cca531b5c3a36aacad13e6ca1180d089257887"} Mar 14 07:21:06 crc kubenswrapper[5129]: I0314 07:21:06.816241 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67754df655-mzts4" podStartSLOduration=3.816218095 podStartE2EDuration="3.816218095s" podCreationTimestamp="2026-03-14 07:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:06.812061972 +0000 UTC m=+1329.563977156" watchObservedRunningTime="2026-03-14 07:21:06.816218095 +0000 UTC m=+1329.568133289" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.010258 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b690-account-create-update-x9m2c" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.017944 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fwnrp" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.024294 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7ggl5" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.036966 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lqhkw" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.043710 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c51b-account-create-update-7gs2v" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.052464 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9e6d-account-create-update-l4gqw" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.070922 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8vjr\" (UniqueName: \"kubernetes.io/projected/dd17de58-336d-485a-aef2-dbab072a4007-kube-api-access-r8vjr\") pod \"dd17de58-336d-485a-aef2-dbab072a4007\" (UID: \"dd17de58-336d-485a-aef2-dbab072a4007\") " Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.071000 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7x4p\" (UniqueName: \"kubernetes.io/projected/f1170613-23db-4ba5-9e73-d74a8f68fa8e-kube-api-access-p7x4p\") pod \"f1170613-23db-4ba5-9e73-d74a8f68fa8e\" (UID: \"f1170613-23db-4ba5-9e73-d74a8f68fa8e\") " Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.071045 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9190972-0c7d-4d51-a42e-089c06e17395-operator-scripts\") pod \"a9190972-0c7d-4d51-a42e-089c06e17395\" (UID: \"a9190972-0c7d-4d51-a42e-089c06e17395\") " Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.071099 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/534a50f3-4cdf-4532-9d16-d61c1792d403-operator-scripts\") pod \"534a50f3-4cdf-4532-9d16-d61c1792d403\" (UID: \"534a50f3-4cdf-4532-9d16-d61c1792d403\") " Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.071142 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9whb\" (UniqueName: \"kubernetes.io/projected/bf76f7dd-0392-4e68-a441-c982f7055f24-kube-api-access-q9whb\") pod \"bf76f7dd-0392-4e68-a441-c982f7055f24\" (UID: \"bf76f7dd-0392-4e68-a441-c982f7055f24\") " Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.071183 5129 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd17de58-336d-485a-aef2-dbab072a4007-operator-scripts\") pod \"dd17de58-336d-485a-aef2-dbab072a4007\" (UID: \"dd17de58-336d-485a-aef2-dbab072a4007\") " Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.071211 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1170613-23db-4ba5-9e73-d74a8f68fa8e-operator-scripts\") pod \"f1170613-23db-4ba5-9e73-d74a8f68fa8e\" (UID: \"f1170613-23db-4ba5-9e73-d74a8f68fa8e\") " Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.071229 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2d2s\" (UniqueName: \"kubernetes.io/projected/534a50f3-4cdf-4532-9d16-d61c1792d403-kube-api-access-r2d2s\") pod \"534a50f3-4cdf-4532-9d16-d61c1792d403\" (UID: \"534a50f3-4cdf-4532-9d16-d61c1792d403\") " Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.071311 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/410747b6-e92a-4253-a502-6e43b9e6048b-operator-scripts\") pod \"410747b6-e92a-4253-a502-6e43b9e6048b\" (UID: \"410747b6-e92a-4253-a502-6e43b9e6048b\") " Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.071327 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx42h\" (UniqueName: \"kubernetes.io/projected/410747b6-e92a-4253-a502-6e43b9e6048b-kube-api-access-rx42h\") pod \"410747b6-e92a-4253-a502-6e43b9e6048b\" (UID: \"410747b6-e92a-4253-a502-6e43b9e6048b\") " Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.071346 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf76f7dd-0392-4e68-a441-c982f7055f24-operator-scripts\") pod 
\"bf76f7dd-0392-4e68-a441-c982f7055f24\" (UID: \"bf76f7dd-0392-4e68-a441-c982f7055f24\") " Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.071362 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwh4m\" (UniqueName: \"kubernetes.io/projected/a9190972-0c7d-4d51-a42e-089c06e17395-kube-api-access-lwh4m\") pod \"a9190972-0c7d-4d51-a42e-089c06e17395\" (UID: \"a9190972-0c7d-4d51-a42e-089c06e17395\") " Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.073397 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1170613-23db-4ba5-9e73-d74a8f68fa8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1170613-23db-4ba5-9e73-d74a8f68fa8e" (UID: "f1170613-23db-4ba5-9e73-d74a8f68fa8e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.075468 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/534a50f3-4cdf-4532-9d16-d61c1792d403-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "534a50f3-4cdf-4532-9d16-d61c1792d403" (UID: "534a50f3-4cdf-4532-9d16-d61c1792d403"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.075562 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf76f7dd-0392-4e68-a441-c982f7055f24-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf76f7dd-0392-4e68-a441-c982f7055f24" (UID: "bf76f7dd-0392-4e68-a441-c982f7055f24"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.077661 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd17de58-336d-485a-aef2-dbab072a4007-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dd17de58-336d-485a-aef2-dbab072a4007" (UID: "dd17de58-336d-485a-aef2-dbab072a4007"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.077401 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/410747b6-e92a-4253-a502-6e43b9e6048b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "410747b6-e92a-4253-a502-6e43b9e6048b" (UID: "410747b6-e92a-4253-a502-6e43b9e6048b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.077775 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9190972-0c7d-4d51-a42e-089c06e17395-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a9190972-0c7d-4d51-a42e-089c06e17395" (UID: "a9190972-0c7d-4d51-a42e-089c06e17395"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.082700 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/534a50f3-4cdf-4532-9d16-d61c1792d403-kube-api-access-r2d2s" (OuterVolumeSpecName: "kube-api-access-r2d2s") pod "534a50f3-4cdf-4532-9d16-d61c1792d403" (UID: "534a50f3-4cdf-4532-9d16-d61c1792d403"). InnerVolumeSpecName "kube-api-access-r2d2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.082879 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1170613-23db-4ba5-9e73-d74a8f68fa8e-kube-api-access-p7x4p" (OuterVolumeSpecName: "kube-api-access-p7x4p") pod "f1170613-23db-4ba5-9e73-d74a8f68fa8e" (UID: "f1170613-23db-4ba5-9e73-d74a8f68fa8e"). InnerVolumeSpecName "kube-api-access-p7x4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.083576 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd17de58-336d-485a-aef2-dbab072a4007-kube-api-access-r8vjr" (OuterVolumeSpecName: "kube-api-access-r8vjr") pod "dd17de58-336d-485a-aef2-dbab072a4007" (UID: "dd17de58-336d-485a-aef2-dbab072a4007"). InnerVolumeSpecName "kube-api-access-r8vjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.086087 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf76f7dd-0392-4e68-a441-c982f7055f24-kube-api-access-q9whb" (OuterVolumeSpecName: "kube-api-access-q9whb") pod "bf76f7dd-0392-4e68-a441-c982f7055f24" (UID: "bf76f7dd-0392-4e68-a441-c982f7055f24"). InnerVolumeSpecName "kube-api-access-q9whb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.094453 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/410747b6-e92a-4253-a502-6e43b9e6048b-kube-api-access-rx42h" (OuterVolumeSpecName: "kube-api-access-rx42h") pod "410747b6-e92a-4253-a502-6e43b9e6048b" (UID: "410747b6-e92a-4253-a502-6e43b9e6048b"). InnerVolumeSpecName "kube-api-access-rx42h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.109064 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9190972-0c7d-4d51-a42e-089c06e17395-kube-api-access-lwh4m" (OuterVolumeSpecName: "kube-api-access-lwh4m") pod "a9190972-0c7d-4d51-a42e-089c06e17395" (UID: "a9190972-0c7d-4d51-a42e-089c06e17395"). InnerVolumeSpecName "kube-api-access-lwh4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.173522 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd17de58-336d-485a-aef2-dbab072a4007-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.173558 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1170613-23db-4ba5-9e73-d74a8f68fa8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.173568 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2d2s\" (UniqueName: \"kubernetes.io/projected/534a50f3-4cdf-4532-9d16-d61c1792d403-kube-api-access-r2d2s\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.173579 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/410747b6-e92a-4253-a502-6e43b9e6048b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.173587 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx42h\" (UniqueName: \"kubernetes.io/projected/410747b6-e92a-4253-a502-6e43b9e6048b-kube-api-access-rx42h\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.173609 5129 reconciler_common.go:293] 
"Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf76f7dd-0392-4e68-a441-c982f7055f24-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.173618 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwh4m\" (UniqueName: \"kubernetes.io/projected/a9190972-0c7d-4d51-a42e-089c06e17395-kube-api-access-lwh4m\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.173626 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8vjr\" (UniqueName: \"kubernetes.io/projected/dd17de58-336d-485a-aef2-dbab072a4007-kube-api-access-r8vjr\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.173635 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7x4p\" (UniqueName: \"kubernetes.io/projected/f1170613-23db-4ba5-9e73-d74a8f68fa8e-kube-api-access-p7x4p\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.173643 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9190972-0c7d-4d51-a42e-089c06e17395-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.173651 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/534a50f3-4cdf-4532-9d16-d61c1792d403-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.173659 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9whb\" (UniqueName: \"kubernetes.io/projected/bf76f7dd-0392-4e68-a441-c982f7055f24-kube-api-access-q9whb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.842792 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-create-7ggl5" event={"ID":"a9190972-0c7d-4d51-a42e-089c06e17395","Type":"ContainerDied","Data":"eb8473f037fae886a460b8b331c543a228a0a701aa8e64e02f5974ef9b59823b"} Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.842845 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb8473f037fae886a460b8b331c543a228a0a701aa8e64e02f5974ef9b59823b" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.842915 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7ggl5" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.844592 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c51b-account-create-update-7gs2v" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.844612 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c51b-account-create-update-7gs2v" event={"ID":"534a50f3-4cdf-4532-9d16-d61c1792d403","Type":"ContainerDied","Data":"4880660c973ac5b02f5558cea7a59185f8daa8175ea706753a2fc9e23d0feeed"} Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.844680 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4880660c973ac5b02f5558cea7a59185f8daa8175ea706753a2fc9e23d0feeed" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.846201 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b690-account-create-update-x9m2c" event={"ID":"f1170613-23db-4ba5-9e73-d74a8f68fa8e","Type":"ContainerDied","Data":"49675e26a1a1a13093aeb1a50a2333cd41b377db13042c9df99cb1d93f585d1c"} Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.846225 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b690-account-create-update-x9m2c" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.846241 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49675e26a1a1a13093aeb1a50a2333cd41b377db13042c9df99cb1d93f585d1c" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.847928 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9e6d-account-create-update-l4gqw" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.847943 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9e6d-account-create-update-l4gqw" event={"ID":"dd17de58-336d-485a-aef2-dbab072a4007","Type":"ContainerDied","Data":"a88463f02852763f83fafa45c4219988197c9d3039389b7b618afa982ba3b35c"} Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.847965 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a88463f02852763f83fafa45c4219988197c9d3039389b7b618afa982ba3b35c" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.850198 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fwnrp" event={"ID":"410747b6-e92a-4253-a502-6e43b9e6048b","Type":"ContainerDied","Data":"59c6407f153a9aa3e5661a10d724a07689264ca765d0fa66d02b735572cbff59"} Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.850225 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59c6407f153a9aa3e5661a10d724a07689264ca765d0fa66d02b735572cbff59" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.850385 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-fwnrp" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.852180 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lqhkw" event={"ID":"bf76f7dd-0392-4e68-a441-c982f7055f24","Type":"ContainerDied","Data":"429822ed9e5dbfa56fa8df65a2650e6417e4bf4afde015702d61ca98a82766c9"} Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.852207 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="429822ed9e5dbfa56fa8df65a2650e6417e4bf4afde015702d61ca98a82766c9" Mar 14 07:21:11 crc kubenswrapper[5129]: I0314 07:21:11.852211 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lqhkw" Mar 14 07:21:12 crc kubenswrapper[5129]: I0314 07:21:12.864232 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rkl9q" event={"ID":"4cb44baa-73cf-418a-9cb5-4e7fa092524c","Type":"ContainerStarted","Data":"60729269e9e74cce75eaa29022a5770b1873da001478e25f920eb77f5ad7805a"} Mar 14 07:21:12 crc kubenswrapper[5129]: I0314 07:21:12.890320 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-rkl9q" podStartSLOduration=3.032786385 podStartE2EDuration="8.890287517s" podCreationTimestamp="2026-03-14 07:21:04 +0000 UTC" firstStartedPulling="2026-03-14 07:21:05.981002695 +0000 UTC m=+1328.732917879" lastFinishedPulling="2026-03-14 07:21:11.838503807 +0000 UTC m=+1334.590419011" observedRunningTime="2026-03-14 07:21:12.889845915 +0000 UTC m=+1335.641761109" watchObservedRunningTime="2026-03-14 07:21:12.890287517 +0000 UTC m=+1335.642202731" Mar 14 07:21:13 crc kubenswrapper[5129]: I0314 07:21:13.368847 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:21:13 crc kubenswrapper[5129]: I0314 07:21:13.449831 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7b9fd7d84c-2bjf4"] Mar 14 07:21:13 crc kubenswrapper[5129]: I0314 07:21:13.450272 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" podUID="baf0d43c-9c51-470c-9863-406b5a2a523b" containerName="dnsmasq-dns" containerID="cri-o://94622b0d2d04ac62d54bff88e740d79ee2b30b8094d80870f9821e12c72f54b6" gracePeriod=10 Mar 14 07:21:13 crc kubenswrapper[5129]: I0314 07:21:13.876430 5129 generic.go:334] "Generic (PLEG): container finished" podID="baf0d43c-9c51-470c-9863-406b5a2a523b" containerID="94622b0d2d04ac62d54bff88e740d79ee2b30b8094d80870f9821e12c72f54b6" exitCode=0 Mar 14 07:21:13 crc kubenswrapper[5129]: I0314 07:21:13.876698 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" event={"ID":"baf0d43c-9c51-470c-9863-406b5a2a523b","Type":"ContainerDied","Data":"94622b0d2d04ac62d54bff88e740d79ee2b30b8094d80870f9821e12c72f54b6"} Mar 14 07:21:13 crc kubenswrapper[5129]: I0314 07:21:13.876742 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" event={"ID":"baf0d43c-9c51-470c-9863-406b5a2a523b","Type":"ContainerDied","Data":"042644784f815af94ab0940d761153215bad47eaf29f3a3a872e611049a5db0d"} Mar 14 07:21:13 crc kubenswrapper[5129]: I0314 07:21:13.876755 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="042644784f815af94ab0940d761153215bad47eaf29f3a3a872e611049a5db0d" Mar 14 07:21:13 crc kubenswrapper[5129]: I0314 07:21:13.895985 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:21:14 crc kubenswrapper[5129]: I0314 07:21:14.021077 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-ovsdbserver-sb\") pod \"baf0d43c-9c51-470c-9863-406b5a2a523b\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " Mar 14 07:21:14 crc kubenswrapper[5129]: I0314 07:21:14.021449 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-config\") pod \"baf0d43c-9c51-470c-9863-406b5a2a523b\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " Mar 14 07:21:14 crc kubenswrapper[5129]: I0314 07:21:14.021469 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-dns-svc\") pod \"baf0d43c-9c51-470c-9863-406b5a2a523b\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " Mar 14 07:21:14 crc kubenswrapper[5129]: I0314 07:21:14.022082 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph762\" (UniqueName: \"kubernetes.io/projected/baf0d43c-9c51-470c-9863-406b5a2a523b-kube-api-access-ph762\") pod \"baf0d43c-9c51-470c-9863-406b5a2a523b\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " Mar 14 07:21:14 crc kubenswrapper[5129]: I0314 07:21:14.022150 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-ovsdbserver-nb\") pod \"baf0d43c-9c51-470c-9863-406b5a2a523b\" (UID: \"baf0d43c-9c51-470c-9863-406b5a2a523b\") " Mar 14 07:21:14 crc kubenswrapper[5129]: I0314 07:21:14.027017 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/baf0d43c-9c51-470c-9863-406b5a2a523b-kube-api-access-ph762" (OuterVolumeSpecName: "kube-api-access-ph762") pod "baf0d43c-9c51-470c-9863-406b5a2a523b" (UID: "baf0d43c-9c51-470c-9863-406b5a2a523b"). InnerVolumeSpecName "kube-api-access-ph762". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:14 crc kubenswrapper[5129]: I0314 07:21:14.074091 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "baf0d43c-9c51-470c-9863-406b5a2a523b" (UID: "baf0d43c-9c51-470c-9863-406b5a2a523b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:14 crc kubenswrapper[5129]: I0314 07:21:14.075860 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "baf0d43c-9c51-470c-9863-406b5a2a523b" (UID: "baf0d43c-9c51-470c-9863-406b5a2a523b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:14 crc kubenswrapper[5129]: I0314 07:21:14.080994 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "baf0d43c-9c51-470c-9863-406b5a2a523b" (UID: "baf0d43c-9c51-470c-9863-406b5a2a523b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:14 crc kubenswrapper[5129]: I0314 07:21:14.088819 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-config" (OuterVolumeSpecName: "config") pod "baf0d43c-9c51-470c-9863-406b5a2a523b" (UID: "baf0d43c-9c51-470c-9863-406b5a2a523b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:14 crc kubenswrapper[5129]: I0314 07:21:14.123947 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:14 crc kubenswrapper[5129]: I0314 07:21:14.123984 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:14 crc kubenswrapper[5129]: I0314 07:21:14.123996 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:14 crc kubenswrapper[5129]: I0314 07:21:14.124005 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph762\" (UniqueName: \"kubernetes.io/projected/baf0d43c-9c51-470c-9863-406b5a2a523b-kube-api-access-ph762\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:14 crc kubenswrapper[5129]: I0314 07:21:14.124021 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/baf0d43c-9c51-470c-9863-406b5a2a523b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:14 crc kubenswrapper[5129]: I0314 07:21:14.882932 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-2bjf4" Mar 14 07:21:14 crc kubenswrapper[5129]: I0314 07:21:14.931148 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-2bjf4"] Mar 14 07:21:14 crc kubenswrapper[5129]: I0314 07:21:14.940886 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-2bjf4"] Mar 14 07:21:15 crc kubenswrapper[5129]: I0314 07:21:15.893870 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-llsfr" event={"ID":"2c8203ff-5259-4d83-a96b-362be3884609","Type":"ContainerStarted","Data":"4f8425c516d3f71ead14b88caaec29687a3c0127db216c8109ec15e81d091767"} Mar 14 07:21:15 crc kubenswrapper[5129]: I0314 07:21:15.908085 5129 generic.go:334] "Generic (PLEG): container finished" podID="4cb44baa-73cf-418a-9cb5-4e7fa092524c" containerID="60729269e9e74cce75eaa29022a5770b1873da001478e25f920eb77f5ad7805a" exitCode=0 Mar 14 07:21:15 crc kubenswrapper[5129]: I0314 07:21:15.908148 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rkl9q" event={"ID":"4cb44baa-73cf-418a-9cb5-4e7fa092524c","Type":"ContainerDied","Data":"60729269e9e74cce75eaa29022a5770b1873da001478e25f920eb77f5ad7805a"} Mar 14 07:21:15 crc kubenswrapper[5129]: I0314 07:21:15.923310 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-llsfr" podStartSLOduration=2.9831988259999997 podStartE2EDuration="34.923287837s" podCreationTimestamp="2026-03-14 07:20:41 +0000 UTC" firstStartedPulling="2026-03-14 07:20:42.556118906 +0000 UTC m=+1305.308034100" lastFinishedPulling="2026-03-14 07:21:14.496207937 +0000 UTC m=+1337.248123111" observedRunningTime="2026-03-14 07:21:15.918020904 +0000 UTC m=+1338.669936098" watchObservedRunningTime="2026-03-14 07:21:15.923287837 +0000 UTC m=+1338.675203031" Mar 14 07:21:16 crc kubenswrapper[5129]: I0314 07:21:16.049003 5129 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="baf0d43c-9c51-470c-9863-406b5a2a523b" path="/var/lib/kubelet/pods/baf0d43c-9c51-470c-9863-406b5a2a523b/volumes" Mar 14 07:21:17 crc kubenswrapper[5129]: I0314 07:21:17.209563 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rkl9q" Mar 14 07:21:17 crc kubenswrapper[5129]: I0314 07:21:17.372641 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb44baa-73cf-418a-9cb5-4e7fa092524c-config-data\") pod \"4cb44baa-73cf-418a-9cb5-4e7fa092524c\" (UID: \"4cb44baa-73cf-418a-9cb5-4e7fa092524c\") " Mar 14 07:21:17 crc kubenswrapper[5129]: I0314 07:21:17.372694 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-642rk\" (UniqueName: \"kubernetes.io/projected/4cb44baa-73cf-418a-9cb5-4e7fa092524c-kube-api-access-642rk\") pod \"4cb44baa-73cf-418a-9cb5-4e7fa092524c\" (UID: \"4cb44baa-73cf-418a-9cb5-4e7fa092524c\") " Mar 14 07:21:17 crc kubenswrapper[5129]: I0314 07:21:17.372764 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb44baa-73cf-418a-9cb5-4e7fa092524c-combined-ca-bundle\") pod \"4cb44baa-73cf-418a-9cb5-4e7fa092524c\" (UID: \"4cb44baa-73cf-418a-9cb5-4e7fa092524c\") " Mar 14 07:21:17 crc kubenswrapper[5129]: I0314 07:21:17.383842 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cb44baa-73cf-418a-9cb5-4e7fa092524c-kube-api-access-642rk" (OuterVolumeSpecName: "kube-api-access-642rk") pod "4cb44baa-73cf-418a-9cb5-4e7fa092524c" (UID: "4cb44baa-73cf-418a-9cb5-4e7fa092524c"). InnerVolumeSpecName "kube-api-access-642rk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:17 crc kubenswrapper[5129]: I0314 07:21:17.389827 5129 scope.go:117] "RemoveContainer" containerID="2095f9a78fa9f5125a06244efe3139c7f3d1b406945f084663cb839e0e9cdadc" Mar 14 07:21:17 crc kubenswrapper[5129]: I0314 07:21:17.404184 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb44baa-73cf-418a-9cb5-4e7fa092524c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cb44baa-73cf-418a-9cb5-4e7fa092524c" (UID: "4cb44baa-73cf-418a-9cb5-4e7fa092524c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:17 crc kubenswrapper[5129]: I0314 07:21:17.424504 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb44baa-73cf-418a-9cb5-4e7fa092524c-config-data" (OuterVolumeSpecName: "config-data") pod "4cb44baa-73cf-418a-9cb5-4e7fa092524c" (UID: "4cb44baa-73cf-418a-9cb5-4e7fa092524c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:17 crc kubenswrapper[5129]: I0314 07:21:17.475087 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb44baa-73cf-418a-9cb5-4e7fa092524c-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:17 crc kubenswrapper[5129]: I0314 07:21:17.475127 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-642rk\" (UniqueName: \"kubernetes.io/projected/4cb44baa-73cf-418a-9cb5-4e7fa092524c-kube-api-access-642rk\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:17 crc kubenswrapper[5129]: I0314 07:21:17.475137 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb44baa-73cf-418a-9cb5-4e7fa092524c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:17 crc kubenswrapper[5129]: I0314 07:21:17.925813 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rkl9q" event={"ID":"4cb44baa-73cf-418a-9cb5-4e7fa092524c","Type":"ContainerDied","Data":"0e8c0283b72b6c244c5e13f38fc019db18220ccb75bde50b09a280888da82bfa"} Mar 14 07:21:17 crc kubenswrapper[5129]: I0314 07:21:17.926170 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e8c0283b72b6c244c5e13f38fc019db18220ccb75bde50b09a280888da82bfa" Mar 14 07:21:17 crc kubenswrapper[5129]: I0314 07:21:17.925918 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rkl9q" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.729793 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-r656l"] Mar 14 07:21:18 crc kubenswrapper[5129]: E0314 07:21:18.730120 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf0d43c-9c51-470c-9863-406b5a2a523b" containerName="dnsmasq-dns" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.730132 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf0d43c-9c51-470c-9863-406b5a2a523b" containerName="dnsmasq-dns" Mar 14 07:21:18 crc kubenswrapper[5129]: E0314 07:21:18.730146 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf0d43c-9c51-470c-9863-406b5a2a523b" containerName="init" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.730151 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf0d43c-9c51-470c-9863-406b5a2a523b" containerName="init" Mar 14 07:21:18 crc kubenswrapper[5129]: E0314 07:21:18.730162 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd17de58-336d-485a-aef2-dbab072a4007" containerName="mariadb-account-create-update" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.730175 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd17de58-336d-485a-aef2-dbab072a4007" containerName="mariadb-account-create-update" Mar 14 07:21:18 crc kubenswrapper[5129]: E0314 07:21:18.730190 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410747b6-e92a-4253-a502-6e43b9e6048b" containerName="mariadb-database-create" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.730196 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="410747b6-e92a-4253-a502-6e43b9e6048b" containerName="mariadb-database-create" Mar 14 07:21:18 crc kubenswrapper[5129]: E0314 07:21:18.730205 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1170613-23db-4ba5-9e73-d74a8f68fa8e" 
containerName="mariadb-account-create-update" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.730213 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1170613-23db-4ba5-9e73-d74a8f68fa8e" containerName="mariadb-account-create-update" Mar 14 07:21:18 crc kubenswrapper[5129]: E0314 07:21:18.730227 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb44baa-73cf-418a-9cb5-4e7fa092524c" containerName="keystone-db-sync" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.730233 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb44baa-73cf-418a-9cb5-4e7fa092524c" containerName="keystone-db-sync" Mar 14 07:21:18 crc kubenswrapper[5129]: E0314 07:21:18.730245 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf76f7dd-0392-4e68-a441-c982f7055f24" containerName="mariadb-database-create" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.730250 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf76f7dd-0392-4e68-a441-c982f7055f24" containerName="mariadb-database-create" Mar 14 07:21:18 crc kubenswrapper[5129]: E0314 07:21:18.730257 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9190972-0c7d-4d51-a42e-089c06e17395" containerName="mariadb-database-create" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.730263 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9190972-0c7d-4d51-a42e-089c06e17395" containerName="mariadb-database-create" Mar 14 07:21:18 crc kubenswrapper[5129]: E0314 07:21:18.730272 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534a50f3-4cdf-4532-9d16-d61c1792d403" containerName="mariadb-account-create-update" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.730277 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="534a50f3-4cdf-4532-9d16-d61c1792d403" containerName="mariadb-account-create-update" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.730416 5129 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4cb44baa-73cf-418a-9cb5-4e7fa092524c" containerName="keystone-db-sync" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.730426 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="410747b6-e92a-4253-a502-6e43b9e6048b" containerName="mariadb-database-create" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.730435 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="baf0d43c-9c51-470c-9863-406b5a2a523b" containerName="dnsmasq-dns" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.730445 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1170613-23db-4ba5-9e73-d74a8f68fa8e" containerName="mariadb-account-create-update" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.730457 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd17de58-336d-485a-aef2-dbab072a4007" containerName="mariadb-account-create-update" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.730466 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9190972-0c7d-4d51-a42e-089c06e17395" containerName="mariadb-database-create" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.730475 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="534a50f3-4cdf-4532-9d16-d61c1792d403" containerName="mariadb-account-create-update" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.730483 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf76f7dd-0392-4e68-a441-c982f7055f24" containerName="mariadb-database-create" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.731183 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.735293 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.735433 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.735551 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r97ql" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.735676 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.735824 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.754703 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b9fb8978c-q5zs4"] Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.756003 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.769374 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r656l"] Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.797662 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b9fb8978c-q5zs4"] Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.801495 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-config\") pod \"dnsmasq-dns-b9fb8978c-q5zs4\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.801541 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-fernet-keys\") pod \"keystone-bootstrap-r656l\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.801564 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-ovsdbserver-sb\") pod \"dnsmasq-dns-b9fb8978c-q5zs4\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.801590 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-combined-ca-bundle\") pod \"keystone-bootstrap-r656l\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:18 
crc kubenswrapper[5129]: I0314 07:21:18.801617 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-ovsdbserver-nb\") pod \"dnsmasq-dns-b9fb8978c-q5zs4\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.801693 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-config-data\") pod \"keystone-bootstrap-r656l\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.801719 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-dns-swift-storage-0\") pod \"dnsmasq-dns-b9fb8978c-q5zs4\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.801737 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-dns-svc\") pod \"dnsmasq-dns-b9fb8978c-q5zs4\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.801773 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-credential-keys\") pod \"keystone-bootstrap-r656l\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " pod="openstack/keystone-bootstrap-r656l" Mar 14 
07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.801798 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp7nt\" (UniqueName: \"kubernetes.io/projected/682fd149-047a-4aca-a945-8711c5e44c9a-kube-api-access-vp7nt\") pod \"keystone-bootstrap-r656l\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.801821 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrfvk\" (UniqueName: \"kubernetes.io/projected/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-kube-api-access-rrfvk\") pod \"dnsmasq-dns-b9fb8978c-q5zs4\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.801841 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-scripts\") pod \"keystone-bootstrap-r656l\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.903153 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-config\") pod \"dnsmasq-dns-b9fb8978c-q5zs4\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.903203 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-fernet-keys\") pod \"keystone-bootstrap-r656l\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 
07:21:18.903227 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-ovsdbserver-sb\") pod \"dnsmasq-dns-b9fb8978c-q5zs4\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.903253 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-combined-ca-bundle\") pod \"keystone-bootstrap-r656l\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.903268 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-ovsdbserver-nb\") pod \"dnsmasq-dns-b9fb8978c-q5zs4\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.903305 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-config-data\") pod \"keystone-bootstrap-r656l\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.903324 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-dns-swift-storage-0\") pod \"dnsmasq-dns-b9fb8978c-q5zs4\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.903342 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-dns-svc\") pod \"dnsmasq-dns-b9fb8978c-q5zs4\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.903370 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-credential-keys\") pod \"keystone-bootstrap-r656l\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.903398 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp7nt\" (UniqueName: \"kubernetes.io/projected/682fd149-047a-4aca-a945-8711c5e44c9a-kube-api-access-vp7nt\") pod \"keystone-bootstrap-r656l\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.903420 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrfvk\" (UniqueName: \"kubernetes.io/projected/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-kube-api-access-rrfvk\") pod \"dnsmasq-dns-b9fb8978c-q5zs4\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.903442 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-scripts\") pod \"keystone-bootstrap-r656l\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.906448 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-ovsdbserver-sb\") pod \"dnsmasq-dns-b9fb8978c-q5zs4\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.906992 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-config\") pod \"dnsmasq-dns-b9fb8978c-q5zs4\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.907503 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-ovsdbserver-nb\") pod \"dnsmasq-dns-b9fb8978c-q5zs4\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.907561 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-dns-swift-storage-0\") pod \"dnsmasq-dns-b9fb8978c-q5zs4\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.908251 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-dns-svc\") pod \"dnsmasq-dns-b9fb8978c-q5zs4\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.914743 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-config-data\") pod \"keystone-bootstrap-r656l\" (UID: 
\"682fd149-047a-4aca-a945-8711c5e44c9a\") " pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.915683 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-fernet-keys\") pod \"keystone-bootstrap-r656l\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.915982 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-scripts\") pod \"keystone-bootstrap-r656l\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.921215 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-combined-ca-bundle\") pod \"keystone-bootstrap-r656l\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.923206 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-credential-keys\") pod \"keystone-bootstrap-r656l\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.935703 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-b8r27"] Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.937058 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.954204 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-vd7tn" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.954364 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.955581 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrfvk\" (UniqueName: \"kubernetes.io/projected/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-kube-api-access-rrfvk\") pod \"dnsmasq-dns-b9fb8978c-q5zs4\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.962152 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.967437 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.973783 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.976771 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.977122 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 07:21:18 crc kubenswrapper[5129]: I0314 07:21:18.998200 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.009178 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp7nt\" (UniqueName: \"kubernetes.io/projected/682fd149-047a-4aca-a945-8711c5e44c9a-kube-api-access-vp7nt\") pod \"keystone-bootstrap-r656l\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.025691 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-b8r27"] Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.075431 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.087344 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.102294 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9fb8978c-q5zs4"] Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.119221 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b43a32ae-63bf-4627-bbdc-deb131defd74-log-httpd\") pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.119264 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22cf633c-cf29-4f88-9ef2-693aee84d48d-etc-machine-id\") pod \"cinder-db-sync-b8r27\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.119317 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b43a32ae-63bf-4627-bbdc-deb131defd74-run-httpd\") pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.119342 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-config-data\") pod \"cinder-db-sync-b8r27\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.119371 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-scripts\") 
pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.119387 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvjnk\" (UniqueName: \"kubernetes.io/projected/b43a32ae-63bf-4627-bbdc-deb131defd74-kube-api-access-mvjnk\") pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.119401 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-db-sync-config-data\") pod \"cinder-db-sync-b8r27\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.119418 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r2ts\" (UniqueName: \"kubernetes.io/projected/22cf633c-cf29-4f88-9ef2-693aee84d48d-kube-api-access-7r2ts\") pod \"cinder-db-sync-b8r27\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.119441 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-config-data\") pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.119456 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.119471 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.119499 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-scripts\") pod \"cinder-db-sync-b8r27\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.119523 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-combined-ca-bundle\") pod \"cinder-db-sync-b8r27\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.128685 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-wcl8h"] Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.130005 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wcl8h" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.150032 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5zr48" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.150236 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.176780 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-2lvwr"] Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.177987 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-2lvwr" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.189701 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.207168 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bldkx" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.207420 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.219652 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-x5pc5"] Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.220745 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.224241 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.224401 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.224644 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ztjd2" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.224840 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wcl8h"] Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.226425 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22cf633c-cf29-4f88-9ef2-693aee84d48d-etc-machine-id\") pod \"cinder-db-sync-b8r27\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.226467 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3b45e803-7558-488a-bc9b-39982239b9a5-db-sync-config-data\") pod \"barbican-db-sync-wcl8h\" (UID: \"3b45e803-7558-488a-bc9b-39982239b9a5\") " pod="openstack/barbican-db-sync-wcl8h" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.226498 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b43a32ae-63bf-4627-bbdc-deb131defd74-run-httpd\") pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.226521 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-config-data\") pod \"cinder-db-sync-b8r27\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.226550 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-scripts\") pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.226570 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvjnk\" (UniqueName: \"kubernetes.io/projected/b43a32ae-63bf-4627-bbdc-deb131defd74-kube-api-access-mvjnk\") pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.226585 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-db-sync-config-data\") pod \"cinder-db-sync-b8r27\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.226614 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r2ts\" (UniqueName: \"kubernetes.io/projected/22cf633c-cf29-4f88-9ef2-693aee84d48d-kube-api-access-7r2ts\") pod \"cinder-db-sync-b8r27\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.226640 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-config-data\") pod \"ceilometer-0\" (UID: 
\"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.226659 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.226674 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.226692 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm2ww\" (UniqueName: \"kubernetes.io/projected/3b45e803-7558-488a-bc9b-39982239b9a5-kube-api-access-gm2ww\") pod \"barbican-db-sync-wcl8h\" (UID: \"3b45e803-7558-488a-bc9b-39982239b9a5\") " pod="openstack/barbican-db-sync-wcl8h" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.227776 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22cf633c-cf29-4f88-9ef2-693aee84d48d-etc-machine-id\") pod \"cinder-db-sync-b8r27\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.227948 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb-config\") pod \"neutron-db-sync-2lvwr\" (UID: \"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb\") " pod="openstack/neutron-db-sync-2lvwr" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 
07:21:19.228000 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-scripts\") pod \"cinder-db-sync-b8r27\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.228036 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-combined-ca-bundle\") pod \"cinder-db-sync-b8r27\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.228076 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b45e803-7558-488a-bc9b-39982239b9a5-combined-ca-bundle\") pod \"barbican-db-sync-wcl8h\" (UID: \"3b45e803-7558-488a-bc9b-39982239b9a5\") " pod="openstack/barbican-db-sync-wcl8h" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.228124 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb-combined-ca-bundle\") pod \"neutron-db-sync-2lvwr\" (UID: \"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb\") " pod="openstack/neutron-db-sync-2lvwr" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.228171 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl7qm\" (UniqueName: \"kubernetes.io/projected/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb-kube-api-access-pl7qm\") pod \"neutron-db-sync-2lvwr\" (UID: \"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb\") " pod="openstack/neutron-db-sync-2lvwr" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.228195 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b43a32ae-63bf-4627-bbdc-deb131defd74-log-httpd\") pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.228192 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b43a32ae-63bf-4627-bbdc-deb131defd74-run-httpd\") pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.232043 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.237752 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-x5pc5"] Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.239568 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-config-data\") pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.240351 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b43a32ae-63bf-4627-bbdc-deb131defd74-log-httpd\") pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.248472 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-scripts\") pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.254479 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-config-data\") pod \"cinder-db-sync-b8r27\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.256392 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-db-sync-config-data\") pod \"cinder-db-sync-b8r27\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.257389 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.260101 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-combined-ca-bundle\") pod \"cinder-db-sync-b8r27\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.260478 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-scripts\") pod \"cinder-db-sync-b8r27\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc 
kubenswrapper[5129]: I0314 07:21:19.270725 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r2ts\" (UniqueName: \"kubernetes.io/projected/22cf633c-cf29-4f88-9ef2-693aee84d48d-kube-api-access-7r2ts\") pod \"cinder-db-sync-b8r27\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.270806 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-2lvwr"] Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.271888 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvjnk\" (UniqueName: \"kubernetes.io/projected/b43a32ae-63bf-4627-bbdc-deb131defd74-kube-api-access-mvjnk\") pod \"ceilometer-0\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.307478 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bd4958d8f-gknfd"] Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.308886 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.336360 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd4958d8f-gknfd\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.336836 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn5wh\" (UniqueName: \"kubernetes.io/projected/7f4ee137-42c5-4c71-943f-767cd4c43b5b-kube-api-access-gn5wh\") pod \"placement-db-sync-x5pc5\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.336865 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd4958d8f-gknfd\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.336885 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f4ee137-42c5-4c71-943f-767cd4c43b5b-logs\") pod \"placement-db-sync-x5pc5\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.336912 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f4ee137-42c5-4c71-943f-767cd4c43b5b-combined-ca-bundle\") pod \"placement-db-sync-x5pc5\" (UID: 
\"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.336927 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd4958d8f-gknfd\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.336953 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm2ww\" (UniqueName: \"kubernetes.io/projected/3b45e803-7558-488a-bc9b-39982239b9a5-kube-api-access-gm2ww\") pod \"barbican-db-sync-wcl8h\" (UID: \"3b45e803-7558-488a-bc9b-39982239b9a5\") " pod="openstack/barbican-db-sync-wcl8h" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.336971 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb-config\") pod \"neutron-db-sync-2lvwr\" (UID: \"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb\") " pod="openstack/neutron-db-sync-2lvwr" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.336991 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-dns-svc\") pod \"dnsmasq-dns-7bd4958d8f-gknfd\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.337034 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb5vl\" (UniqueName: \"kubernetes.io/projected/609bf990-fa73-4e76-93f9-f72278fdda15-kube-api-access-bb5vl\") pod \"dnsmasq-dns-7bd4958d8f-gknfd\" (UID: 
\"609bf990-fa73-4e76-93f9-f72278fdda15\") " pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.337057 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f4ee137-42c5-4c71-943f-767cd4c43b5b-scripts\") pod \"placement-db-sync-x5pc5\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.337074 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b45e803-7558-488a-bc9b-39982239b9a5-combined-ca-bundle\") pod \"barbican-db-sync-wcl8h\" (UID: \"3b45e803-7558-488a-bc9b-39982239b9a5\") " pod="openstack/barbican-db-sync-wcl8h" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.337101 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb-combined-ca-bundle\") pod \"neutron-db-sync-2lvwr\" (UID: \"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb\") " pod="openstack/neutron-db-sync-2lvwr" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.337127 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-config\") pod \"dnsmasq-dns-7bd4958d8f-gknfd\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.337148 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl7qm\" (UniqueName: \"kubernetes.io/projected/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb-kube-api-access-pl7qm\") pod \"neutron-db-sync-2lvwr\" (UID: \"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb\") " 
pod="openstack/neutron-db-sync-2lvwr" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.337172 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3b45e803-7558-488a-bc9b-39982239b9a5-db-sync-config-data\") pod \"barbican-db-sync-wcl8h\" (UID: \"3b45e803-7558-488a-bc9b-39982239b9a5\") " pod="openstack/barbican-db-sync-wcl8h" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.337194 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f4ee137-42c5-4c71-943f-767cd4c43b5b-config-data\") pod \"placement-db-sync-x5pc5\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.341215 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd4958d8f-gknfd"] Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.348272 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3b45e803-7558-488a-bc9b-39982239b9a5-db-sync-config-data\") pod \"barbican-db-sync-wcl8h\" (UID: \"3b45e803-7558-488a-bc9b-39982239b9a5\") " pod="openstack/barbican-db-sync-wcl8h" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.348893 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb-combined-ca-bundle\") pod \"neutron-db-sync-2lvwr\" (UID: \"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb\") " pod="openstack/neutron-db-sync-2lvwr" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.354389 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb-config\") pod \"neutron-db-sync-2lvwr\" (UID: 
\"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb\") " pod="openstack/neutron-db-sync-2lvwr" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.366402 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b45e803-7558-488a-bc9b-39982239b9a5-combined-ca-bundle\") pod \"barbican-db-sync-wcl8h\" (UID: \"3b45e803-7558-488a-bc9b-39982239b9a5\") " pod="openstack/barbican-db-sync-wcl8h" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.367361 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl7qm\" (UniqueName: \"kubernetes.io/projected/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb-kube-api-access-pl7qm\") pod \"neutron-db-sync-2lvwr\" (UID: \"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb\") " pod="openstack/neutron-db-sync-2lvwr" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.369048 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm2ww\" (UniqueName: \"kubernetes.io/projected/3b45e803-7558-488a-bc9b-39982239b9a5-kube-api-access-gm2ww\") pod \"barbican-db-sync-wcl8h\" (UID: \"3b45e803-7558-488a-bc9b-39982239b9a5\") " pod="openstack/barbican-db-sync-wcl8h" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.384725 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-b8r27" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.395057 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-2lvwr" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.439692 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb5vl\" (UniqueName: \"kubernetes.io/projected/609bf990-fa73-4e76-93f9-f72278fdda15-kube-api-access-bb5vl\") pod \"dnsmasq-dns-7bd4958d8f-gknfd\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.439772 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f4ee137-42c5-4c71-943f-767cd4c43b5b-scripts\") pod \"placement-db-sync-x5pc5\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.439866 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-config\") pod \"dnsmasq-dns-7bd4958d8f-gknfd\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.439948 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f4ee137-42c5-4c71-943f-767cd4c43b5b-config-data\") pod \"placement-db-sync-x5pc5\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.439990 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd4958d8f-gknfd\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc 
kubenswrapper[5129]: I0314 07:21:19.440033 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn5wh\" (UniqueName: \"kubernetes.io/projected/7f4ee137-42c5-4c71-943f-767cd4c43b5b-kube-api-access-gn5wh\") pod \"placement-db-sync-x5pc5\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.440062 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd4958d8f-gknfd\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.440092 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f4ee137-42c5-4c71-943f-767cd4c43b5b-logs\") pod \"placement-db-sync-x5pc5\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.440135 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f4ee137-42c5-4c71-943f-767cd4c43b5b-combined-ca-bundle\") pod \"placement-db-sync-x5pc5\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.440161 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd4958d8f-gknfd\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.440228 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-dns-svc\") pod \"dnsmasq-dns-7bd4958d8f-gknfd\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.445993 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f4ee137-42c5-4c71-943f-767cd4c43b5b-logs\") pod \"placement-db-sync-x5pc5\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.446573 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd4958d8f-gknfd\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.447037 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-config\") pod \"dnsmasq-dns-7bd4958d8f-gknfd\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.448169 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd4958d8f-gknfd\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.448240 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd4958d8f-gknfd\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.449761 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-dns-svc\") pod \"dnsmasq-dns-7bd4958d8f-gknfd\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.452196 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f4ee137-42c5-4c71-943f-767cd4c43b5b-combined-ca-bundle\") pod \"placement-db-sync-x5pc5\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.454338 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f4ee137-42c5-4c71-943f-767cd4c43b5b-config-data\") pod \"placement-db-sync-x5pc5\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.471579 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.472360 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f4ee137-42c5-4c71-943f-767cd4c43b5b-scripts\") pod \"placement-db-sync-x5pc5\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.481316 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn5wh\" (UniqueName: \"kubernetes.io/projected/7f4ee137-42c5-4c71-943f-767cd4c43b5b-kube-api-access-gn5wh\") pod \"placement-db-sync-x5pc5\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.490227 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb5vl\" (UniqueName: \"kubernetes.io/projected/609bf990-fa73-4e76-93f9-f72278fdda15-kube-api-access-bb5vl\") pod \"dnsmasq-dns-7bd4958d8f-gknfd\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.576540 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.576666 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.636218 
5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wcl8h" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.723852 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.735051 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.753433 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r656l"] Mar 14 07:21:19 crc kubenswrapper[5129]: I0314 07:21:19.955240 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9fb8978c-q5zs4"] Mar 14 07:21:20 crc kubenswrapper[5129]: I0314 07:21:20.031315 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" event={"ID":"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce","Type":"ContainerStarted","Data":"170fa8f8dfc64ed028fe05a1cb8fe54ccd5548250a8ff8a0b64a4f3f81be95ff"} Mar 14 07:21:20 crc kubenswrapper[5129]: I0314 07:21:20.033811 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r656l" event={"ID":"682fd149-047a-4aca-a945-8711c5e44c9a","Type":"ContainerStarted","Data":"cd40e28ce383409e289bb73efa1fecd54bf4ef7885fae5bda541967db7712df2"} Mar 14 07:21:20 crc kubenswrapper[5129]: I0314 07:21:20.144890 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:21:20 crc kubenswrapper[5129]: W0314 07:21:20.155712 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb43a32ae_63bf_4627_bbdc_deb131defd74.slice/crio-54e320dde3bdb097bd7e2cb407a73e0203896298157ce06a21c609b581e32812 WatchSource:0}: Error finding container 54e320dde3bdb097bd7e2cb407a73e0203896298157ce06a21c609b581e32812: Status 
404 returned error can't find the container with id 54e320dde3bdb097bd7e2cb407a73e0203896298157ce06a21c609b581e32812 Mar 14 07:21:20 crc kubenswrapper[5129]: I0314 07:21:20.164996 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-b8r27"] Mar 14 07:21:20 crc kubenswrapper[5129]: W0314 07:21:20.178177 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22cf633c_cf29_4f88_9ef2_693aee84d48d.slice/crio-810cf992f9ed8e4e9c318d8b37fb2eabbb826de89429157cb52b2b4743dcad42 WatchSource:0}: Error finding container 810cf992f9ed8e4e9c318d8b37fb2eabbb826de89429157cb52b2b4743dcad42: Status 404 returned error can't find the container with id 810cf992f9ed8e4e9c318d8b37fb2eabbb826de89429157cb52b2b4743dcad42 Mar 14 07:21:20 crc kubenswrapper[5129]: I0314 07:21:20.303064 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-2lvwr"] Mar 14 07:21:20 crc kubenswrapper[5129]: W0314 07:21:20.304887 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2c82cd2_eaef_43f5_b14d_f892c3c8d0fb.slice/crio-b96e110f91416cf73bd2b0af8a893a79453e6ec26b7680229fa041d35f64185a WatchSource:0}: Error finding container b96e110f91416cf73bd2b0af8a893a79453e6ec26b7680229fa041d35f64185a: Status 404 returned error can't find the container with id b96e110f91416cf73bd2b0af8a893a79453e6ec26b7680229fa041d35f64185a Mar 14 07:21:20 crc kubenswrapper[5129]: I0314 07:21:20.384948 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wcl8h"] Mar 14 07:21:20 crc kubenswrapper[5129]: I0314 07:21:20.400089 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd4958d8f-gknfd"] Mar 14 07:21:20 crc kubenswrapper[5129]: I0314 07:21:20.508380 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-x5pc5"] Mar 14 
07:21:20 crc kubenswrapper[5129]: W0314 07:21:20.525806 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f4ee137_42c5_4c71_943f_767cd4c43b5b.slice/crio-79c279e4df085b69039f10f41cc6140348eec98c8ee2bc446db31c603a72f00e WatchSource:0}: Error finding container 79c279e4df085b69039f10f41cc6140348eec98c8ee2bc446db31c603a72f00e: Status 404 returned error can't find the container with id 79c279e4df085b69039f10f41cc6140348eec98c8ee2bc446db31c603a72f00e Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.052726 5129 generic.go:334] "Generic (PLEG): container finished" podID="8aa7236f-04ea-4dfb-97f7-0e307a80f9ce" containerID="4b8e4961f2316db6ba46ee448b0de838b4e47209d41ff45d6c10f00d852abe9f" exitCode=0 Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.052787 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" event={"ID":"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce","Type":"ContainerDied","Data":"4b8e4961f2316db6ba46ee448b0de838b4e47209d41ff45d6c10f00d852abe9f"} Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.057075 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r656l" event={"ID":"682fd149-047a-4aca-a945-8711c5e44c9a","Type":"ContainerStarted","Data":"029512079db9793f0a83f9994180a737873eded1aea3b898b28d36c92dc6e03d"} Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.059671 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b8r27" event={"ID":"22cf633c-cf29-4f88-9ef2-693aee84d48d","Type":"ContainerStarted","Data":"810cf992f9ed8e4e9c318d8b37fb2eabbb826de89429157cb52b2b4743dcad42"} Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.060961 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x5pc5" 
event={"ID":"7f4ee137-42c5-4c71-943f-767cd4c43b5b","Type":"ContainerStarted","Data":"79c279e4df085b69039f10f41cc6140348eec98c8ee2bc446db31c603a72f00e"} Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.069727 5129 generic.go:334] "Generic (PLEG): container finished" podID="609bf990-fa73-4e76-93f9-f72278fdda15" containerID="922f66dac307e2c216f86fbbdbe5f52a629651234f2f1423a7a5201d7d237750" exitCode=0 Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.069886 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" event={"ID":"609bf990-fa73-4e76-93f9-f72278fdda15","Type":"ContainerDied","Data":"922f66dac307e2c216f86fbbdbe5f52a629651234f2f1423a7a5201d7d237750"} Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.069944 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" event={"ID":"609bf990-fa73-4e76-93f9-f72278fdda15","Type":"ContainerStarted","Data":"e8d6e8ef0c5a32caec43d567f9ef4a0a5b35f03636aaa0efbe8f35fc85f08456"} Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.071826 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b43a32ae-63bf-4627-bbdc-deb131defd74","Type":"ContainerStarted","Data":"54e320dde3bdb097bd7e2cb407a73e0203896298157ce06a21c609b581e32812"} Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.086473 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wcl8h" event={"ID":"3b45e803-7558-488a-bc9b-39982239b9a5","Type":"ContainerStarted","Data":"d3d8e2cc64636489b491a9a1962723e996df083e56ee648ff377108d9fcf11ed"} Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.097069 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-2lvwr" event={"ID":"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb","Type":"ContainerStarted","Data":"3962e27294b5cc1e11ab43a51823ea69248516d08e03f0c0c3df80ff9127b020"} Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 
07:21:21.097450 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-2lvwr" event={"ID":"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb","Type":"ContainerStarted","Data":"b96e110f91416cf73bd2b0af8a893a79453e6ec26b7680229fa041d35f64185a"} Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.140547 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-r656l" podStartSLOduration=3.140525679 podStartE2EDuration="3.140525679s" podCreationTimestamp="2026-03-14 07:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:21.119067203 +0000 UTC m=+1343.870982377" watchObservedRunningTime="2026-03-14 07:21:21.140525679 +0000 UTC m=+1343.892440863" Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.146325 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-2lvwr" podStartSLOduration=2.1463084869999998 podStartE2EDuration="2.146308487s" podCreationTimestamp="2026-03-14 07:21:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:21.145866255 +0000 UTC m=+1343.897781439" watchObservedRunningTime="2026-03-14 07:21:21.146308487 +0000 UTC m=+1343.898223671" Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.564590 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.591870 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-ovsdbserver-sb\") pod \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.592200 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-ovsdbserver-nb\") pod \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.592283 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrfvk\" (UniqueName: \"kubernetes.io/projected/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-kube-api-access-rrfvk\") pod \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.592345 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-config\") pod \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.592432 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-dns-swift-storage-0\") pod \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.592480 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-dns-svc\") pod \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\" (UID: \"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce\") " Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.602373 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-kube-api-access-rrfvk" (OuterVolumeSpecName: "kube-api-access-rrfvk") pod "8aa7236f-04ea-4dfb-97f7-0e307a80f9ce" (UID: "8aa7236f-04ea-4dfb-97f7-0e307a80f9ce"). InnerVolumeSpecName "kube-api-access-rrfvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.620074 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8aa7236f-04ea-4dfb-97f7-0e307a80f9ce" (UID: "8aa7236f-04ea-4dfb-97f7-0e307a80f9ce"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.649084 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8aa7236f-04ea-4dfb-97f7-0e307a80f9ce" (UID: "8aa7236f-04ea-4dfb-97f7-0e307a80f9ce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.657289 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-config" (OuterVolumeSpecName: "config") pod "8aa7236f-04ea-4dfb-97f7-0e307a80f9ce" (UID: "8aa7236f-04ea-4dfb-97f7-0e307a80f9ce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.661065 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8aa7236f-04ea-4dfb-97f7-0e307a80f9ce" (UID: "8aa7236f-04ea-4dfb-97f7-0e307a80f9ce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.670271 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8aa7236f-04ea-4dfb-97f7-0e307a80f9ce" (UID: "8aa7236f-04ea-4dfb-97f7-0e307a80f9ce"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.695696 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.695734 5129 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.695746 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.695756 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:21 crc 
kubenswrapper[5129]: I0314 07:21:21.695764 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.695776 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrfvk\" (UniqueName: \"kubernetes.io/projected/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce-kube-api-access-rrfvk\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:21 crc kubenswrapper[5129]: I0314 07:21:21.704306 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:21:22 crc kubenswrapper[5129]: I0314 07:21:22.129385 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" event={"ID":"609bf990-fa73-4e76-93f9-f72278fdda15","Type":"ContainerStarted","Data":"1af81f0196eff0f512a14b6c0aed44358a69e57b69333925c2c33d2b6dbc0b1b"} Mar 14 07:21:22 crc kubenswrapper[5129]: I0314 07:21:22.131500 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:22 crc kubenswrapper[5129]: I0314 07:21:22.138062 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" event={"ID":"8aa7236f-04ea-4dfb-97f7-0e307a80f9ce","Type":"ContainerDied","Data":"170fa8f8dfc64ed028fe05a1cb8fe54ccd5548250a8ff8a0b64a4f3f81be95ff"} Mar 14 07:21:22 crc kubenswrapper[5129]: I0314 07:21:22.138145 5129 scope.go:117] "RemoveContainer" containerID="4b8e4961f2316db6ba46ee448b0de838b4e47209d41ff45d6c10f00d852abe9f" Mar 14 07:21:22 crc kubenswrapper[5129]: I0314 07:21:22.142418 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b9fb8978c-q5zs4" Mar 14 07:21:22 crc kubenswrapper[5129]: I0314 07:21:22.163667 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" podStartSLOduration=3.163644947 podStartE2EDuration="3.163644947s" podCreationTimestamp="2026-03-14 07:21:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:22.163000049 +0000 UTC m=+1344.914915243" watchObservedRunningTime="2026-03-14 07:21:22.163644947 +0000 UTC m=+1344.915560161" Mar 14 07:21:22 crc kubenswrapper[5129]: I0314 07:21:22.291241 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9fb8978c-q5zs4"] Mar 14 07:21:22 crc kubenswrapper[5129]: I0314 07:21:22.303887 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b9fb8978c-q5zs4"] Mar 14 07:21:24 crc kubenswrapper[5129]: I0314 07:21:24.047376 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aa7236f-04ea-4dfb-97f7-0e307a80f9ce" path="/var/lib/kubelet/pods/8aa7236f-04ea-4dfb-97f7-0e307a80f9ce/volumes" Mar 14 07:21:25 crc kubenswrapper[5129]: I0314 07:21:25.179551 5129 generic.go:334] "Generic (PLEG): container finished" podID="682fd149-047a-4aca-a945-8711c5e44c9a" containerID="029512079db9793f0a83f9994180a737873eded1aea3b898b28d36c92dc6e03d" exitCode=0 Mar 14 07:21:25 crc kubenswrapper[5129]: I0314 07:21:25.179666 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r656l" event={"ID":"682fd149-047a-4aca-a945-8711c5e44c9a","Type":"ContainerDied","Data":"029512079db9793f0a83f9994180a737873eded1aea3b898b28d36c92dc6e03d"} Mar 14 07:21:25 crc kubenswrapper[5129]: I0314 07:21:25.183026 5129 generic.go:334] "Generic (PLEG): container finished" podID="2c8203ff-5259-4d83-a96b-362be3884609" 
containerID="4f8425c516d3f71ead14b88caaec29687a3c0127db216c8109ec15e81d091767" exitCode=0 Mar 14 07:21:25 crc kubenswrapper[5129]: I0314 07:21:25.183071 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-llsfr" event={"ID":"2c8203ff-5259-4d83-a96b-362be3884609","Type":"ContainerDied","Data":"4f8425c516d3f71ead14b88caaec29687a3c0127db216c8109ec15e81d091767"} Mar 14 07:21:27 crc kubenswrapper[5129]: I0314 07:21:27.994661 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-llsfr" Mar 14 07:21:28 crc kubenswrapper[5129]: I0314 07:21:28.121050 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk8zl\" (UniqueName: \"kubernetes.io/projected/2c8203ff-5259-4d83-a96b-362be3884609-kube-api-access-rk8zl\") pod \"2c8203ff-5259-4d83-a96b-362be3884609\" (UID: \"2c8203ff-5259-4d83-a96b-362be3884609\") " Mar 14 07:21:28 crc kubenswrapper[5129]: I0314 07:21:28.121320 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c8203ff-5259-4d83-a96b-362be3884609-db-sync-config-data\") pod \"2c8203ff-5259-4d83-a96b-362be3884609\" (UID: \"2c8203ff-5259-4d83-a96b-362be3884609\") " Mar 14 07:21:28 crc kubenswrapper[5129]: I0314 07:21:28.121419 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8203ff-5259-4d83-a96b-362be3884609-config-data\") pod \"2c8203ff-5259-4d83-a96b-362be3884609\" (UID: \"2c8203ff-5259-4d83-a96b-362be3884609\") " Mar 14 07:21:28 crc kubenswrapper[5129]: I0314 07:21:28.121449 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8203ff-5259-4d83-a96b-362be3884609-combined-ca-bundle\") pod \"2c8203ff-5259-4d83-a96b-362be3884609\" (UID: 
\"2c8203ff-5259-4d83-a96b-362be3884609\") " Mar 14 07:21:28 crc kubenswrapper[5129]: I0314 07:21:28.127502 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c8203ff-5259-4d83-a96b-362be3884609-kube-api-access-rk8zl" (OuterVolumeSpecName: "kube-api-access-rk8zl") pod "2c8203ff-5259-4d83-a96b-362be3884609" (UID: "2c8203ff-5259-4d83-a96b-362be3884609"). InnerVolumeSpecName "kube-api-access-rk8zl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:28 crc kubenswrapper[5129]: I0314 07:21:28.132796 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8203ff-5259-4d83-a96b-362be3884609-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2c8203ff-5259-4d83-a96b-362be3884609" (UID: "2c8203ff-5259-4d83-a96b-362be3884609"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:28 crc kubenswrapper[5129]: I0314 07:21:28.167857 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8203ff-5259-4d83-a96b-362be3884609-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c8203ff-5259-4d83-a96b-362be3884609" (UID: "2c8203ff-5259-4d83-a96b-362be3884609"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:28 crc kubenswrapper[5129]: I0314 07:21:28.173805 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8203ff-5259-4d83-a96b-362be3884609-config-data" (OuterVolumeSpecName: "config-data") pod "2c8203ff-5259-4d83-a96b-362be3884609" (UID: "2c8203ff-5259-4d83-a96b-362be3884609"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:28 crc kubenswrapper[5129]: I0314 07:21:28.222939 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8203ff-5259-4d83-a96b-362be3884609-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:28 crc kubenswrapper[5129]: I0314 07:21:28.222968 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8203ff-5259-4d83-a96b-362be3884609-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:28 crc kubenswrapper[5129]: I0314 07:21:28.222977 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk8zl\" (UniqueName: \"kubernetes.io/projected/2c8203ff-5259-4d83-a96b-362be3884609-kube-api-access-rk8zl\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:28 crc kubenswrapper[5129]: I0314 07:21:28.222986 5129 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2c8203ff-5259-4d83-a96b-362be3884609-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:28 crc kubenswrapper[5129]: I0314 07:21:28.233155 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-llsfr" event={"ID":"2c8203ff-5259-4d83-a96b-362be3884609","Type":"ContainerDied","Data":"fd62d03c8993213f0f417e73b17fedb978316fe1312124f179bd57ebf70ebcf7"} Mar 14 07:21:28 crc kubenswrapper[5129]: I0314 07:21:28.233199 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd62d03c8993213f0f417e73b17fedb978316fe1312124f179bd57ebf70ebcf7" Mar 14 07:21:28 crc kubenswrapper[5129]: I0314 07:21:28.233250 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-llsfr" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.414059 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd4958d8f-gknfd"] Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.417238 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" podUID="609bf990-fa73-4e76-93f9-f72278fdda15" containerName="dnsmasq-dns" containerID="cri-o://1af81f0196eff0f512a14b6c0aed44358a69e57b69333925c2c33d2b6dbc0b1b" gracePeriod=10 Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.422285 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.500330 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-c5sv9"] Mar 14 07:21:29 crc kubenswrapper[5129]: E0314 07:21:29.501407 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aa7236f-04ea-4dfb-97f7-0e307a80f9ce" containerName="init" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.501425 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa7236f-04ea-4dfb-97f7-0e307a80f9ce" containerName="init" Mar 14 07:21:29 crc kubenswrapper[5129]: E0314 07:21:29.501460 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8203ff-5259-4d83-a96b-362be3884609" containerName="glance-db-sync" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.501467 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8203ff-5259-4d83-a96b-362be3884609" containerName="glance-db-sync" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.501817 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8203ff-5259-4d83-a96b-362be3884609" containerName="glance-db-sync" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.501845 5129 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8aa7236f-04ea-4dfb-97f7-0e307a80f9ce" containerName="init" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.504994 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.569034 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-c5sv9"] Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.660270 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-config\") pod \"dnsmasq-dns-759cc7f497-c5sv9\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.660493 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-ovsdbserver-nb\") pod \"dnsmasq-dns-759cc7f497-c5sv9\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.660668 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-ovsdbserver-sb\") pod \"dnsmasq-dns-759cc7f497-c5sv9\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.661038 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-dns-svc\") pod \"dnsmasq-dns-759cc7f497-c5sv9\" (UID: 
\"01bda50c-7553-4997-b592-57430c227d18\") " pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.661150 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-dns-swift-storage-0\") pod \"dnsmasq-dns-759cc7f497-c5sv9\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.661264 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vznpq\" (UniqueName: \"kubernetes.io/projected/01bda50c-7553-4997-b592-57430c227d18-kube-api-access-vznpq\") pod \"dnsmasq-dns-759cc7f497-c5sv9\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.736124 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" podUID="609bf990-fa73-4e76-93f9-f72278fdda15" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.147:5353: connect: connection refused" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.762436 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-dns-swift-storage-0\") pod \"dnsmasq-dns-759cc7f497-c5sv9\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.762509 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vznpq\" (UniqueName: \"kubernetes.io/projected/01bda50c-7553-4997-b592-57430c227d18-kube-api-access-vznpq\") pod \"dnsmasq-dns-759cc7f497-c5sv9\" (UID: 
\"01bda50c-7553-4997-b592-57430c227d18\") " pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.762534 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-config\") pod \"dnsmasq-dns-759cc7f497-c5sv9\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.762550 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-ovsdbserver-nb\") pod \"dnsmasq-dns-759cc7f497-c5sv9\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.762585 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-ovsdbserver-sb\") pod \"dnsmasq-dns-759cc7f497-c5sv9\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.762670 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-dns-svc\") pod \"dnsmasq-dns-759cc7f497-c5sv9\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.763531 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-dns-svc\") pod \"dnsmasq-dns-759cc7f497-c5sv9\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 
crc kubenswrapper[5129]: I0314 07:21:29.763898 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-ovsdbserver-nb\") pod \"dnsmasq-dns-759cc7f497-c5sv9\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.764034 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-ovsdbserver-sb\") pod \"dnsmasq-dns-759cc7f497-c5sv9\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.764110 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-dns-swift-storage-0\") pod \"dnsmasq-dns-759cc7f497-c5sv9\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.764364 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-config\") pod \"dnsmasq-dns-759cc7f497-c5sv9\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.785575 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vznpq\" (UniqueName: \"kubernetes.io/projected/01bda50c-7553-4997-b592-57430c227d18-kube-api-access-vznpq\") pod \"dnsmasq-dns-759cc7f497-c5sv9\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:29 crc kubenswrapper[5129]: I0314 07:21:29.832202 5129 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.257014 5129 generic.go:334] "Generic (PLEG): container finished" podID="609bf990-fa73-4e76-93f9-f72278fdda15" containerID="1af81f0196eff0f512a14b6c0aed44358a69e57b69333925c2c33d2b6dbc0b1b" exitCode=0 Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.257056 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" event={"ID":"609bf990-fa73-4e76-93f9-f72278fdda15","Type":"ContainerDied","Data":"1af81f0196eff0f512a14b6c0aed44358a69e57b69333925c2c33d2b6dbc0b1b"} Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.383129 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.384591 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.387048 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-gptdj" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.387241 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.387427 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.393929 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.512200 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.514368 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.516836 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.523454 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.576342 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b683e569-51fc-41f2-bcbb-c3b080cbdf91-logs\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.576457 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.576498 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b683e569-51fc-41f2-bcbb-c3b080cbdf91-config-data\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.576545 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b683e569-51fc-41f2-bcbb-c3b080cbdf91-scripts\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 
07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.576591 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qtr9\" (UniqueName: \"kubernetes.io/projected/b683e569-51fc-41f2-bcbb-c3b080cbdf91-kube-api-access-8qtr9\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.576823 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b683e569-51fc-41f2-bcbb-c3b080cbdf91-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.576901 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b683e569-51fc-41f2-bcbb-c3b080cbdf91-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.678671 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32295485-d05c-4619-b75d-638deb38c9c0-logs\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.678752 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32295485-d05c-4619-b75d-638deb38c9c0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " 
pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.678802 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.678827 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32295485-d05c-4619-b75d-638deb38c9c0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.678863 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b683e569-51fc-41f2-bcbb-c3b080cbdf91-logs\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.678905 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32295485-d05c-4619-b75d-638deb38c9c0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.678947 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfmj5\" (UniqueName: \"kubernetes.io/projected/32295485-d05c-4619-b75d-638deb38c9c0-kube-api-access-qfmj5\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " 
pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.679001 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32295485-d05c-4619-b75d-638deb38c9c0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.679041 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.679316 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b683e569-51fc-41f2-bcbb-c3b080cbdf91-config-data\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.679381 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b683e569-51fc-41f2-bcbb-c3b080cbdf91-scripts\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.679437 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qtr9\" (UniqueName: \"kubernetes.io/projected/b683e569-51fc-41f2-bcbb-c3b080cbdf91-kube-api-access-8qtr9\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc 
kubenswrapper[5129]: I0314 07:21:30.679533 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.679645 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b683e569-51fc-41f2-bcbb-c3b080cbdf91-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.679684 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b683e569-51fc-41f2-bcbb-c3b080cbdf91-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.680127 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b683e569-51fc-41f2-bcbb-c3b080cbdf91-logs\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.680326 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b683e569-51fc-41f2-bcbb-c3b080cbdf91-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.684839 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b683e569-51fc-41f2-bcbb-c3b080cbdf91-scripts\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.686103 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b683e569-51fc-41f2-bcbb-c3b080cbdf91-config-data\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.686341 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b683e569-51fc-41f2-bcbb-c3b080cbdf91-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.707076 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.708792 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qtr9\" (UniqueName: \"kubernetes.io/projected/b683e569-51fc-41f2-bcbb-c3b080cbdf91-kube-api-access-8qtr9\") pod \"glance-default-external-api-0\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.781090 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/32295485-d05c-4619-b75d-638deb38c9c0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.781139 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfmj5\" (UniqueName: \"kubernetes.io/projected/32295485-d05c-4619-b75d-638deb38c9c0-kube-api-access-qfmj5\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.781172 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32295485-d05c-4619-b75d-638deb38c9c0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.781246 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32295485-d05c-4619-b75d-638deb38c9c0-logs\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.781274 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32295485-d05c-4619-b75d-638deb38c9c0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.781297 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.781313 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32295485-d05c-4619-b75d-638deb38c9c0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.781735 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32295485-d05c-4619-b75d-638deb38c9c0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.781957 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32295485-d05c-4619-b75d-638deb38c9c0-logs\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.782361 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.786258 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32295485-d05c-4619-b75d-638deb38c9c0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " 
pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.786276 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32295485-d05c-4619-b75d-638deb38c9c0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.787775 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32295485-d05c-4619-b75d-638deb38c9c0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.798892 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfmj5\" (UniqueName: \"kubernetes.io/projected/32295485-d05c-4619-b75d-638deb38c9c0-kube-api-access-qfmj5\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.808449 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:30 crc kubenswrapper[5129]: I0314 07:21:30.828796 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:31 crc kubenswrapper[5129]: I0314 07:21:31.003263 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:21:32 crc kubenswrapper[5129]: I0314 07:21:32.206864 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:21:32 crc kubenswrapper[5129]: I0314 07:21:32.271078 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:21:32 crc kubenswrapper[5129]: I0314 07:21:32.919567 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.018662 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp7nt\" (UniqueName: \"kubernetes.io/projected/682fd149-047a-4aca-a945-8711c5e44c9a-kube-api-access-vp7nt\") pod \"682fd149-047a-4aca-a945-8711c5e44c9a\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.018754 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-combined-ca-bundle\") pod \"682fd149-047a-4aca-a945-8711c5e44c9a\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.018868 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-config-data\") pod \"682fd149-047a-4aca-a945-8711c5e44c9a\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.018890 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-credential-keys\") pod \"682fd149-047a-4aca-a945-8711c5e44c9a\" (UID: 
\"682fd149-047a-4aca-a945-8711c5e44c9a\") " Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.018918 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-scripts\") pod \"682fd149-047a-4aca-a945-8711c5e44c9a\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.018958 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-fernet-keys\") pod \"682fd149-047a-4aca-a945-8711c5e44c9a\" (UID: \"682fd149-047a-4aca-a945-8711c5e44c9a\") " Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.024850 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "682fd149-047a-4aca-a945-8711c5e44c9a" (UID: "682fd149-047a-4aca-a945-8711c5e44c9a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.025971 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "682fd149-047a-4aca-a945-8711c5e44c9a" (UID: "682fd149-047a-4aca-a945-8711c5e44c9a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.029906 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/682fd149-047a-4aca-a945-8711c5e44c9a-kube-api-access-vp7nt" (OuterVolumeSpecName: "kube-api-access-vp7nt") pod "682fd149-047a-4aca-a945-8711c5e44c9a" (UID: "682fd149-047a-4aca-a945-8711c5e44c9a"). InnerVolumeSpecName "kube-api-access-vp7nt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.046680 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-scripts" (OuterVolumeSpecName: "scripts") pod "682fd149-047a-4aca-a945-8711c5e44c9a" (UID: "682fd149-047a-4aca-a945-8711c5e44c9a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.049925 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "682fd149-047a-4aca-a945-8711c5e44c9a" (UID: "682fd149-047a-4aca-a945-8711c5e44c9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.062068 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-config-data" (OuterVolumeSpecName: "config-data") pod "682fd149-047a-4aca-a945-8711c5e44c9a" (UID: "682fd149-047a-4aca-a945-8711c5e44c9a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.121619 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.121650 5129 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.121660 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.121668 5129 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.121677 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp7nt\" (UniqueName: \"kubernetes.io/projected/682fd149-047a-4aca-a945-8711c5e44c9a-kube-api-access-vp7nt\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.121686 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/682fd149-047a-4aca-a945-8711c5e44c9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.289415 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r656l" event={"ID":"682fd149-047a-4aca-a945-8711c5e44c9a","Type":"ContainerDied","Data":"cd40e28ce383409e289bb73efa1fecd54bf4ef7885fae5bda541967db7712df2"} Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 
07:21:33.289461 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd40e28ce383409e289bb73efa1fecd54bf4ef7885fae5bda541967db7712df2" Mar 14 07:21:33 crc kubenswrapper[5129]: I0314 07:21:33.289565 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r656l" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.081272 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-r656l"] Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.088922 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-r656l"] Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.179968 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-4wxfh"] Mar 14 07:21:34 crc kubenswrapper[5129]: E0314 07:21:34.182625 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="682fd149-047a-4aca-a945-8711c5e44c9a" containerName="keystone-bootstrap" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.182653 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="682fd149-047a-4aca-a945-8711c5e44c9a" containerName="keystone-bootstrap" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.182850 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="682fd149-047a-4aca-a945-8711c5e44c9a" containerName="keystone-bootstrap" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.183427 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.185440 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.185616 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.185616 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.186799 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.190819 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4wxfh"] Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.197938 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r97ql" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.347247 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-combined-ca-bundle\") pod \"keystone-bootstrap-4wxfh\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.347280 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-scripts\") pod \"keystone-bootstrap-4wxfh\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.347312 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-credential-keys\") pod \"keystone-bootstrap-4wxfh\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.347393 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-config-data\") pod \"keystone-bootstrap-4wxfh\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.347457 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-fernet-keys\") pod \"keystone-bootstrap-4wxfh\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.347504 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2t77\" (UniqueName: \"kubernetes.io/projected/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-kube-api-access-l2t77\") pod \"keystone-bootstrap-4wxfh\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.448875 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-credential-keys\") pod \"keystone-bootstrap-4wxfh\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.448941 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-config-data\") pod \"keystone-bootstrap-4wxfh\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.448989 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-fernet-keys\") pod \"keystone-bootstrap-4wxfh\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.449029 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2t77\" (UniqueName: \"kubernetes.io/projected/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-kube-api-access-l2t77\") pod \"keystone-bootstrap-4wxfh\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.449078 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-combined-ca-bundle\") pod \"keystone-bootstrap-4wxfh\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.449098 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-scripts\") pod \"keystone-bootstrap-4wxfh\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.459361 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-scripts\") pod \"keystone-bootstrap-4wxfh\" (UID: 
\"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.459430 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-credential-keys\") pod \"keystone-bootstrap-4wxfh\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.459679 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-combined-ca-bundle\") pod \"keystone-bootstrap-4wxfh\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.459809 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-config-data\") pod \"keystone-bootstrap-4wxfh\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.461219 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-fernet-keys\") pod \"keystone-bootstrap-4wxfh\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 07:21:34.469848 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2t77\" (UniqueName: \"kubernetes.io/projected/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-kube-api-access-l2t77\") pod \"keystone-bootstrap-4wxfh\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:34 crc kubenswrapper[5129]: I0314 
07:21:34.505825 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:36 crc kubenswrapper[5129]: I0314 07:21:36.046719 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="682fd149-047a-4aca-a945-8711c5e44c9a" path="/var/lib/kubelet/pods/682fd149-047a-4aca-a945-8711c5e44c9a/volumes" Mar 14 07:21:39 crc kubenswrapper[5129]: I0314 07:21:39.736310 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" podUID="609bf990-fa73-4e76-93f9-f72278fdda15" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.147:5353: i/o timeout" Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.351195 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.358228 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" event={"ID":"609bf990-fa73-4e76-93f9-f72278fdda15","Type":"ContainerDied","Data":"e8d6e8ef0c5a32caec43d567f9ef4a0a5b35f03636aaa0efbe8f35fc85f08456"} Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.358262 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.358283 5129 scope.go:117] "RemoveContainer" containerID="1af81f0196eff0f512a14b6c0aed44358a69e57b69333925c2c33d2b6dbc0b1b" Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.360398 5129 generic.go:334] "Generic (PLEG): container finished" podID="e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb" containerID="3962e27294b5cc1e11ab43a51823ea69248516d08e03f0c0c3df80ff9127b020" exitCode=0 Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.360434 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-2lvwr" event={"ID":"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb","Type":"ContainerDied","Data":"3962e27294b5cc1e11ab43a51823ea69248516d08e03f0c0c3df80ff9127b020"} Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.447126 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-dns-svc\") pod \"609bf990-fa73-4e76-93f9-f72278fdda15\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.447170 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-config\") pod \"609bf990-fa73-4e76-93f9-f72278fdda15\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.447223 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-ovsdbserver-nb\") pod \"609bf990-fa73-4e76-93f9-f72278fdda15\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.447343 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-bb5vl\" (UniqueName: \"kubernetes.io/projected/609bf990-fa73-4e76-93f9-f72278fdda15-kube-api-access-bb5vl\") pod \"609bf990-fa73-4e76-93f9-f72278fdda15\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.447379 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-ovsdbserver-sb\") pod \"609bf990-fa73-4e76-93f9-f72278fdda15\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.447410 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-dns-swift-storage-0\") pod \"609bf990-fa73-4e76-93f9-f72278fdda15\" (UID: \"609bf990-fa73-4e76-93f9-f72278fdda15\") " Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.454761 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/609bf990-fa73-4e76-93f9-f72278fdda15-kube-api-access-bb5vl" (OuterVolumeSpecName: "kube-api-access-bb5vl") pod "609bf990-fa73-4e76-93f9-f72278fdda15" (UID: "609bf990-fa73-4e76-93f9-f72278fdda15"). InnerVolumeSpecName "kube-api-access-bb5vl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.498593 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "609bf990-fa73-4e76-93f9-f72278fdda15" (UID: "609bf990-fa73-4e76-93f9-f72278fdda15"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.504990 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "609bf990-fa73-4e76-93f9-f72278fdda15" (UID: "609bf990-fa73-4e76-93f9-f72278fdda15"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.509160 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-config" (OuterVolumeSpecName: "config") pod "609bf990-fa73-4e76-93f9-f72278fdda15" (UID: "609bf990-fa73-4e76-93f9-f72278fdda15"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.519207 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "609bf990-fa73-4e76-93f9-f72278fdda15" (UID: "609bf990-fa73-4e76-93f9-f72278fdda15"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.521753 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "609bf990-fa73-4e76-93f9-f72278fdda15" (UID: "609bf990-fa73-4e76-93f9-f72278fdda15"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.549434 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb5vl\" (UniqueName: \"kubernetes.io/projected/609bf990-fa73-4e76-93f9-f72278fdda15-kube-api-access-bb5vl\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.549474 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.549486 5129 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.549500 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.549513 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.549525 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/609bf990-fa73-4e76-93f9-f72278fdda15-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.712902 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd4958d8f-gknfd"] Mar 14 07:21:40 crc kubenswrapper[5129]: I0314 07:21:40.721193 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bd4958d8f-gknfd"] Mar 14 
07:21:41 crc kubenswrapper[5129]: E0314 07:21:41.633221 5129 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b" Mar 14 07:21:41 crc kubenswrapper[5129]: E0314 07:21:41.633707 5129 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extra
cted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7r2ts,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-b8r27_openstack(22cf633c-cf29-4f88-9ef2-693aee84d48d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 07:21:41 crc kubenswrapper[5129]: E0314 07:21:41.634922 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-b8r27" podUID="22cf633c-cf29-4f88-9ef2-693aee84d48d" Mar 14 07:21:41 crc kubenswrapper[5129]: I0314 07:21:41.636393 5129 scope.go:117] "RemoveContainer" containerID="922f66dac307e2c216f86fbbdbe5f52a629651234f2f1423a7a5201d7d237750" Mar 14 07:21:41 crc kubenswrapper[5129]: I0314 07:21:41.993546 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-2lvwr" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.050051 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="609bf990-fa73-4e76-93f9-f72278fdda15" path="/var/lib/kubelet/pods/609bf990-fa73-4e76-93f9-f72278fdda15/volumes" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.076378 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb-config\") pod \"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb\" (UID: \"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb\") " Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.076498 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl7qm\" (UniqueName: \"kubernetes.io/projected/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb-kube-api-access-pl7qm\") pod \"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb\" (UID: \"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb\") " Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.076680 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb-combined-ca-bundle\") pod \"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb\" (UID: \"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb\") " Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.085365 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb-kube-api-access-pl7qm" (OuterVolumeSpecName: "kube-api-access-pl7qm") pod "e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb" (UID: "e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb"). InnerVolumeSpecName "kube-api-access-pl7qm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.116671 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb" (UID: "e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.126442 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-c5sv9"] Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.136568 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb-config" (OuterVolumeSpecName: "config") pod "e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb" (UID: "e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.178257 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.178284 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.178294 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl7qm\" (UniqueName: \"kubernetes.io/projected/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb-kube-api-access-pl7qm\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.235862 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.322011 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4wxfh"] Mar 14 07:21:42 crc kubenswrapper[5129]: W0314 07:21:42.354162 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9fd78eb_874d_4fbd_b8b3_7e23a32503b5.slice/crio-0a24fef5b1ac06da67c6991ff79481df3fc6adfb03b875048944045da1814587 WatchSource:0}: Error finding container 0a24fef5b1ac06da67c6991ff79481df3fc6adfb03b875048944045da1814587: Status 404 returned error can't find the container with id 0a24fef5b1ac06da67c6991ff79481df3fc6adfb03b875048944045da1814587 Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.410201 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x5pc5" 
event={"ID":"7f4ee137-42c5-4c71-943f-767cd4c43b5b","Type":"ContainerStarted","Data":"688b601624978ad581d2874e16b4fd640acb7ecdf38357ac8796f3c33fdb6a93"} Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.426761 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4wxfh" event={"ID":"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5","Type":"ContainerStarted","Data":"0a24fef5b1ac06da67c6991ff79481df3fc6adfb03b875048944045da1814587"} Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.451394 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" event={"ID":"01bda50c-7553-4997-b592-57430c227d18","Type":"ContainerStarted","Data":"5331df50af8b1b516095f38c6ed4a01cc7da0bccb624f41913a0666e26acf343"} Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.452630 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" event={"ID":"01bda50c-7553-4997-b592-57430c227d18","Type":"ContainerStarted","Data":"52484474d44021030c17574872b522a6feb466f94b1020a68dad29aaaa3af88a"} Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.470862 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b43a32ae-63bf-4627-bbdc-deb131defd74","Type":"ContainerStarted","Data":"dc7e65d99e50ea8ef70644c1ae7042bd9cb31116ec6009c82470eb2460cb5c7b"} Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.488854 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wcl8h" event={"ID":"3b45e803-7558-488a-bc9b-39982239b9a5","Type":"ContainerStarted","Data":"b6129aad28f25a4355bc45fcf76cc3d1877902860adc2871b519d96f22803f4c"} Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.493037 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-x5pc5" podStartSLOduration=2.451271068 podStartE2EDuration="23.493017703s" podCreationTimestamp="2026-03-14 07:21:19 +0000 UTC" 
firstStartedPulling="2026-03-14 07:21:20.528481458 +0000 UTC m=+1343.280396642" lastFinishedPulling="2026-03-14 07:21:41.570228093 +0000 UTC m=+1364.322143277" observedRunningTime="2026-03-14 07:21:42.451716958 +0000 UTC m=+1365.203632142" watchObservedRunningTime="2026-03-14 07:21:42.493017703 +0000 UTC m=+1365.244932887" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.512095 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-wcl8h" podStartSLOduration=2.388098997 podStartE2EDuration="23.512077894s" podCreationTimestamp="2026-03-14 07:21:19 +0000 UTC" firstStartedPulling="2026-03-14 07:21:20.467805793 +0000 UTC m=+1343.219720977" lastFinishedPulling="2026-03-14 07:21:41.59178469 +0000 UTC m=+1364.343699874" observedRunningTime="2026-03-14 07:21:42.509104434 +0000 UTC m=+1365.261019618" watchObservedRunningTime="2026-03-14 07:21:42.512077894 +0000 UTC m=+1365.263993078" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.529373 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-2lvwr" event={"ID":"e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb","Type":"ContainerDied","Data":"b96e110f91416cf73bd2b0af8a893a79453e6ec26b7680229fa041d35f64185a"} Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.529419 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b96e110f91416cf73bd2b0af8a893a79453e6ec26b7680229fa041d35f64185a" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.529514 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-2lvwr" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.547672 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b683e569-51fc-41f2-bcbb-c3b080cbdf91","Type":"ContainerStarted","Data":"80974a998b7059047e1f2325840e49b57ee1d6aab47172b119ecba7c322abaa4"} Mar 14 07:21:42 crc kubenswrapper[5129]: E0314 07:21:42.595560 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b\\\"\"" pod="openstack/cinder-db-sync-b8r27" podUID="22cf633c-cf29-4f88-9ef2-693aee84d48d" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.689918 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.760082 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-c5sv9"] Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.816690 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-4c2zz"] Mar 14 07:21:42 crc kubenswrapper[5129]: E0314 07:21:42.817086 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="609bf990-fa73-4e76-93f9-f72278fdda15" containerName="init" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.817099 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="609bf990-fa73-4e76-93f9-f72278fdda15" containerName="init" Mar 14 07:21:42 crc kubenswrapper[5129]: E0314 07:21:42.817109 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb" containerName="neutron-db-sync" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.817115 5129 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb" containerName="neutron-db-sync" Mar 14 07:21:42 crc kubenswrapper[5129]: E0314 07:21:42.817126 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="609bf990-fa73-4e76-93f9-f72278fdda15" containerName="dnsmasq-dns" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.817132 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="609bf990-fa73-4e76-93f9-f72278fdda15" containerName="dnsmasq-dns" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.817294 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb" containerName="neutron-db-sync" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.817310 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="609bf990-fa73-4e76-93f9-f72278fdda15" containerName="dnsmasq-dns" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.818145 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.848891 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-4c2zz"] Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.886069 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7f8d656d4-bfw28"] Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.888176 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.893506 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.893588 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.893914 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bldkx" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.894190 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.899229 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-ovsdbserver-sb\") pod \"dnsmasq-dns-6d67d65cb9-4c2zz\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.899349 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-ovsdbserver-nb\") pod \"dnsmasq-dns-6d67d65cb9-4c2zz\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.899422 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-dns-swift-storage-0\") pod \"dnsmasq-dns-6d67d65cb9-4c2zz\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:42 crc 
kubenswrapper[5129]: I0314 07:21:42.899529 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4k9q\" (UniqueName: \"kubernetes.io/projected/78ce367e-be55-4091-915c-319a0e9f4986-kube-api-access-n4k9q\") pod \"dnsmasq-dns-6d67d65cb9-4c2zz\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.899650 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-config\") pod \"dnsmasq-dns-6d67d65cb9-4c2zz\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.901616 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-dns-svc\") pod \"dnsmasq-dns-6d67d65cb9-4c2zz\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:42 crc kubenswrapper[5129]: I0314 07:21:42.903208 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f8d656d4-bfw28"] Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.004693 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-config\") pod \"neutron-7f8d656d4-bfw28\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.004845 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-dns-svc\") pod 
\"dnsmasq-dns-6d67d65cb9-4c2zz\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.004905 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-combined-ca-bundle\") pod \"neutron-7f8d656d4-bfw28\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.005008 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x97mr\" (UniqueName: \"kubernetes.io/projected/d0a291cb-33cf-45e8-8a69-697f1503e4fb-kube-api-access-x97mr\") pod \"neutron-7f8d656d4-bfw28\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.005097 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-httpd-config\") pod \"neutron-7f8d656d4-bfw28\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.005128 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-ovsdbserver-sb\") pod \"dnsmasq-dns-6d67d65cb9-4c2zz\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.005149 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6d67d65cb9-4c2zz\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.005178 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-dns-swift-storage-0\") pod \"dnsmasq-dns-6d67d65cb9-4c2zz\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.005219 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-ovndb-tls-certs\") pod \"neutron-7f8d656d4-bfw28\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.005254 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4k9q\" (UniqueName: \"kubernetes.io/projected/78ce367e-be55-4091-915c-319a0e9f4986-kube-api-access-n4k9q\") pod \"dnsmasq-dns-6d67d65cb9-4c2zz\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.005292 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-config\") pod \"dnsmasq-dns-6d67d65cb9-4c2zz\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.005968 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-ovsdbserver-sb\") pod \"dnsmasq-dns-6d67d65cb9-4c2zz\" (UID: 
\"78ce367e-be55-4091-915c-319a0e9f4986\") " pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.005983 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-dns-svc\") pod \"dnsmasq-dns-6d67d65cb9-4c2zz\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.006458 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-config\") pod \"dnsmasq-dns-6d67d65cb9-4c2zz\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.006621 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-dns-swift-storage-0\") pod \"dnsmasq-dns-6d67d65cb9-4c2zz\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.006696 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-ovsdbserver-nb\") pod \"dnsmasq-dns-6d67d65cb9-4c2zz\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.038357 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4k9q\" (UniqueName: \"kubernetes.io/projected/78ce367e-be55-4091-915c-319a0e9f4986-kube-api-access-n4k9q\") pod \"dnsmasq-dns-6d67d65cb9-4c2zz\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 
07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.106587 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-combined-ca-bundle\") pod \"neutron-7f8d656d4-bfw28\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.106701 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x97mr\" (UniqueName: \"kubernetes.io/projected/d0a291cb-33cf-45e8-8a69-697f1503e4fb-kube-api-access-x97mr\") pod \"neutron-7f8d656d4-bfw28\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.106740 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-httpd-config\") pod \"neutron-7f8d656d4-bfw28\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.106813 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-ovndb-tls-certs\") pod \"neutron-7f8d656d4-bfw28\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.106878 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-config\") pod \"neutron-7f8d656d4-bfw28\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.111647 5129 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-combined-ca-bundle\") pod \"neutron-7f8d656d4-bfw28\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.112367 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-httpd-config\") pod \"neutron-7f8d656d4-bfw28\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.113316 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-ovndb-tls-certs\") pod \"neutron-7f8d656d4-bfw28\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.123324 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-config\") pod \"neutron-7f8d656d4-bfw28\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.150307 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x97mr\" (UniqueName: \"kubernetes.io/projected/d0a291cb-33cf-45e8-8a69-697f1503e4fb-kube-api-access-x97mr\") pod \"neutron-7f8d656d4-bfw28\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.188015 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.218514 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.567240 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"32295485-d05c-4619-b75d-638deb38c9c0","Type":"ContainerStarted","Data":"38aadd8ffed95ef5064fbd249cdc01ae832d442e51d3e05e94866e8903a07974"} Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.609798 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4wxfh" event={"ID":"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5","Type":"ContainerStarted","Data":"6d5f8bf87e692aabeac10226f1e8e09b6e6634992f400961227fd04fb3fa5a3d"} Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.619449 5129 generic.go:334] "Generic (PLEG): container finished" podID="01bda50c-7553-4997-b592-57430c227d18" containerID="5331df50af8b1b516095f38c6ed4a01cc7da0bccb624f41913a0666e26acf343" exitCode=0 Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.620650 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" event={"ID":"01bda50c-7553-4997-b592-57430c227d18","Type":"ContainerDied","Data":"5331df50af8b1b516095f38c6ed4a01cc7da0bccb624f41913a0666e26acf343"} Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.640110 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-4wxfh" podStartSLOduration=9.640090764 podStartE2EDuration="9.640090764s" podCreationTimestamp="2026-03-14 07:21:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:43.629036318 +0000 UTC m=+1366.380951502" watchObservedRunningTime="2026-03-14 07:21:43.640090764 +0000 UTC 
m=+1366.392005948" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.760206 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-4c2zz"] Mar 14 07:21:43 crc kubenswrapper[5129]: W0314 07:21:43.800734 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78ce367e_be55_4091_915c_319a0e9f4986.slice/crio-e41cb410bbb3761c49e0957a20d503b79be8e9711847efccfad03fdc065bca37 WatchSource:0}: Error finding container e41cb410bbb3761c49e0957a20d503b79be8e9711847efccfad03fdc065bca37: Status 404 returned error can't find the container with id e41cb410bbb3761c49e0957a20d503b79be8e9711847efccfad03fdc065bca37 Mar 14 07:21:43 crc kubenswrapper[5129]: E0314 07:21:43.936510 5129 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 14 07:21:43 crc kubenswrapper[5129]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/01bda50c-7553-4997-b592-57430c227d18/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 14 07:21:43 crc kubenswrapper[5129]: > podSandboxID="52484474d44021030c17574872b522a6feb466f94b1020a68dad29aaaa3af88a" Mar 14 07:21:43 crc kubenswrapper[5129]: E0314 07:21:43.936713 5129 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 07:21:43 crc kubenswrapper[5129]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n86h549h66h5b5h5dfh587h56bh555h586h5h67fh584h665h5f8h689h64dh58fhf6hd9h648h5bfhcfh55fh696h7bh55fh5f8h65h5dh57h645h596q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vznpq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-759cc7f497-c5sv9_openstack(01bda50c-7553-4997-b592-57430c227d18): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/01bda50c-7553-4997-b592-57430c227d18/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 14 07:21:43 crc kubenswrapper[5129]: > logger="UnhandledError" Mar 14 07:21:43 crc kubenswrapper[5129]: E0314 07:21:43.938976 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/01bda50c-7553-4997-b592-57430c227d18/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" podUID="01bda50c-7553-4997-b592-57430c227d18" Mar 14 07:21:43 crc kubenswrapper[5129]: I0314 07:21:43.973950 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f8d656d4-bfw28"] Mar 14 07:21:44 crc kubenswrapper[5129]: W0314 07:21:44.569638 5129 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0a291cb_33cf_45e8_8a69_697f1503e4fb.slice/crio-e6dd8fa37c015d99e06d39dc2d2c6bca96f12281515a6eeed438bd636ae71e71 WatchSource:0}: Error finding container e6dd8fa37c015d99e06d39dc2d2c6bca96f12281515a6eeed438bd636ae71e71: Status 404 returned error can't find the container with id e6dd8fa37c015d99e06d39dc2d2c6bca96f12281515a6eeed438bd636ae71e71 Mar 14 07:21:44 crc kubenswrapper[5129]: I0314 07:21:44.631964 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f8d656d4-bfw28" event={"ID":"d0a291cb-33cf-45e8-8a69-697f1503e4fb","Type":"ContainerStarted","Data":"e6dd8fa37c015d99e06d39dc2d2c6bca96f12281515a6eeed438bd636ae71e71"} Mar 14 07:21:44 crc kubenswrapper[5129]: I0314 07:21:44.644644 5129 generic.go:334] "Generic (PLEG): container finished" podID="78ce367e-be55-4091-915c-319a0e9f4986" containerID="83d768513bc8963f3630e976d5b5c42e5fc8f8d407509b8ba31a5738ae2f708b" exitCode=0 Mar 14 07:21:44 crc kubenswrapper[5129]: I0314 07:21:44.644828 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" event={"ID":"78ce367e-be55-4091-915c-319a0e9f4986","Type":"ContainerDied","Data":"83d768513bc8963f3630e976d5b5c42e5fc8f8d407509b8ba31a5738ae2f708b"} Mar 14 07:21:44 crc kubenswrapper[5129]: I0314 07:21:44.644928 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" event={"ID":"78ce367e-be55-4091-915c-319a0e9f4986","Type":"ContainerStarted","Data":"e41cb410bbb3761c49e0957a20d503b79be8e9711847efccfad03fdc065bca37"} Mar 14 07:21:44 crc kubenswrapper[5129]: I0314 07:21:44.649803 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b683e569-51fc-41f2-bcbb-c3b080cbdf91","Type":"ContainerStarted","Data":"682d2ed4954dc4e4fb7f293ca71f5f085fe3f99dd30cfc6fa7f95d6e09eebf34"} Mar 14 07:21:44 crc kubenswrapper[5129]: I0314 07:21:44.649983 5129 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b683e569-51fc-41f2-bcbb-c3b080cbdf91","Type":"ContainerStarted","Data":"d6bb852176112c2234cf3876699b9390e59e88974d70296aee119813e4d5e0b1"} Mar 14 07:21:44 crc kubenswrapper[5129]: I0314 07:21:44.650191 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b683e569-51fc-41f2-bcbb-c3b080cbdf91" containerName="glance-log" containerID="cri-o://d6bb852176112c2234cf3876699b9390e59e88974d70296aee119813e4d5e0b1" gracePeriod=30 Mar 14 07:21:44 crc kubenswrapper[5129]: I0314 07:21:44.650559 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b683e569-51fc-41f2-bcbb-c3b080cbdf91" containerName="glance-httpd" containerID="cri-o://682d2ed4954dc4e4fb7f293ca71f5f085fe3f99dd30cfc6fa7f95d6e09eebf34" gracePeriod=30 Mar 14 07:21:44 crc kubenswrapper[5129]: I0314 07:21:44.662190 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="32295485-d05c-4619-b75d-638deb38c9c0" containerName="glance-log" containerID="cri-o://071d4c470a706315a4ab4d2dbcecacc5edcf67f5816419489da0c1c9e353496f" gracePeriod=30 Mar 14 07:21:44 crc kubenswrapper[5129]: I0314 07:21:44.662468 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"32295485-d05c-4619-b75d-638deb38c9c0","Type":"ContainerStarted","Data":"54d3e699c738bc194b2a33f855142c97e9cee53bfb0ad24709038db8332022df"} Mar 14 07:21:44 crc kubenswrapper[5129]: I0314 07:21:44.662546 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"32295485-d05c-4619-b75d-638deb38c9c0","Type":"ContainerStarted","Data":"071d4c470a706315a4ab4d2dbcecacc5edcf67f5816419489da0c1c9e353496f"} Mar 14 07:21:44 crc kubenswrapper[5129]: 
I0314 07:21:44.662786 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="32295485-d05c-4619-b75d-638deb38c9c0" containerName="glance-httpd" containerID="cri-o://54d3e699c738bc194b2a33f855142c97e9cee53bfb0ad24709038db8332022df" gracePeriod=30 Mar 14 07:21:44 crc kubenswrapper[5129]: I0314 07:21:44.698821 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=15.698805561 podStartE2EDuration="15.698805561s" podCreationTimestamp="2026-03-14 07:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:44.693804717 +0000 UTC m=+1367.445719901" watchObservedRunningTime="2026-03-14 07:21:44.698805561 +0000 UTC m=+1367.450720745" Mar 14 07:21:44 crc kubenswrapper[5129]: I0314 07:21:44.734338 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=15.734322062 podStartE2EDuration="15.734322062s" podCreationTimestamp="2026-03-14 07:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:44.733969692 +0000 UTC m=+1367.485884886" watchObservedRunningTime="2026-03-14 07:21:44.734322062 +0000 UTC m=+1367.486237246" Mar 14 07:21:44 crc kubenswrapper[5129]: I0314 07:21:44.737282 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7bd4958d8f-gknfd" podUID="609bf990-fa73-4e76-93f9-f72278fdda15" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.147:5353: i/o timeout" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.032545 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.142379 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-ovsdbserver-nb\") pod \"01bda50c-7553-4997-b592-57430c227d18\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.142472 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-dns-svc\") pod \"01bda50c-7553-4997-b592-57430c227d18\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.142520 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-config\") pod \"01bda50c-7553-4997-b592-57430c227d18\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.142567 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vznpq\" (UniqueName: \"kubernetes.io/projected/01bda50c-7553-4997-b592-57430c227d18-kube-api-access-vznpq\") pod \"01bda50c-7553-4997-b592-57430c227d18\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.142676 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-ovsdbserver-sb\") pod \"01bda50c-7553-4997-b592-57430c227d18\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.142724 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-dns-swift-storage-0\") pod \"01bda50c-7553-4997-b592-57430c227d18\" (UID: \"01bda50c-7553-4997-b592-57430c227d18\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.148147 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01bda50c-7553-4997-b592-57430c227d18-kube-api-access-vznpq" (OuterVolumeSpecName: "kube-api-access-vznpq") pod "01bda50c-7553-4997-b592-57430c227d18" (UID: "01bda50c-7553-4997-b592-57430c227d18"). InnerVolumeSpecName "kube-api-access-vznpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.248773 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vznpq\" (UniqueName: \"kubernetes.io/projected/01bda50c-7553-4997-b592-57430c227d18-kube-api-access-vznpq\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.409933 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "01bda50c-7553-4997-b592-57430c227d18" (UID: "01bda50c-7553-4997-b592-57430c227d18"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.456588 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.472275 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "01bda50c-7553-4997-b592-57430c227d18" (UID: "01bda50c-7553-4997-b592-57430c227d18"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.477883 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "01bda50c-7553-4997-b592-57430c227d18" (UID: "01bda50c-7553-4997-b592-57430c227d18"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.486615 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-config" (OuterVolumeSpecName: "config") pod "01bda50c-7553-4997-b592-57430c227d18" (UID: "01bda50c-7553-4997-b592-57430c227d18"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.490125 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-798779f645-jt8hz"] Mar 14 07:21:45 crc kubenswrapper[5129]: E0314 07:21:45.490656 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bda50c-7553-4997-b592-57430c227d18" containerName="init" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.490762 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bda50c-7553-4997-b592-57430c227d18" containerName="init" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.491019 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bda50c-7553-4997-b592-57430c227d18" containerName="init" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.491949 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.501164 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "01bda50c-7553-4997-b592-57430c227d18" (UID: "01bda50c-7553-4997-b592-57430c227d18"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.501664 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.503246 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.513707 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.522282 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-798779f645-jt8hz"] Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.560268 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.560301 5129 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.560314 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.560324 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01bda50c-7553-4997-b592-57430c227d18-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.661366 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b683e569-51fc-41f2-bcbb-c3b080cbdf91-httpd-run\") pod \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.661435 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b683e569-51fc-41f2-bcbb-c3b080cbdf91-logs\") pod \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " Mar 14 07:21:45 crc 
kubenswrapper[5129]: I0314 07:21:45.661521 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qtr9\" (UniqueName: \"kubernetes.io/projected/b683e569-51fc-41f2-bcbb-c3b080cbdf91-kube-api-access-8qtr9\") pod \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.661543 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b683e569-51fc-41f2-bcbb-c3b080cbdf91-config-data\") pod \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.661588 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.661665 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b683e569-51fc-41f2-bcbb-c3b080cbdf91-combined-ca-bundle\") pod \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.661736 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b683e569-51fc-41f2-bcbb-c3b080cbdf91-scripts\") pod \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\" (UID: \"b683e569-51fc-41f2-bcbb-c3b080cbdf91\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.662400 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-config\") pod \"neutron-798779f645-jt8hz\" 
(UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.662463 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-internal-tls-certs\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.662511 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-combined-ca-bundle\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.662543 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-public-tls-certs\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.662622 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr8nz\" (UniqueName: \"kubernetes.io/projected/f8803f0b-1655-4721-a92e-8241f500d9a5-kube-api-access-wr8nz\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.662678 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-httpd-config\") pod 
\"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.662697 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-ovndb-tls-certs\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.668385 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b683e569-51fc-41f2-bcbb-c3b080cbdf91-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b683e569-51fc-41f2-bcbb-c3b080cbdf91" (UID: "b683e569-51fc-41f2-bcbb-c3b080cbdf91"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.668748 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b683e569-51fc-41f2-bcbb-c3b080cbdf91-logs" (OuterVolumeSpecName: "logs") pod "b683e569-51fc-41f2-bcbb-c3b080cbdf91" (UID: "b683e569-51fc-41f2-bcbb-c3b080cbdf91"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.669561 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "b683e569-51fc-41f2-bcbb-c3b080cbdf91" (UID: "b683e569-51fc-41f2-bcbb-c3b080cbdf91"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.670745 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b683e569-51fc-41f2-bcbb-c3b080cbdf91-kube-api-access-8qtr9" (OuterVolumeSpecName: "kube-api-access-8qtr9") pod "b683e569-51fc-41f2-bcbb-c3b080cbdf91" (UID: "b683e569-51fc-41f2-bcbb-c3b080cbdf91"). InnerVolumeSpecName "kube-api-access-8qtr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.677460 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f8d656d4-bfw28" event={"ID":"d0a291cb-33cf-45e8-8a69-697f1503e4fb","Type":"ContainerStarted","Data":"14e1f2bdf58c845f4d0ee2c1e34bb9aa30c8152615ec6a1270697b0b4d949a6b"} Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.677497 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f8d656d4-bfw28" event={"ID":"d0a291cb-33cf-45e8-8a69-697f1503e4fb","Type":"ContainerStarted","Data":"79c401fa0036c778172e1bdeb74a8a13f2286ff8e77fe730b93bf628fd2f40be"} Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.678525 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.681896 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b683e569-51fc-41f2-bcbb-c3b080cbdf91-scripts" (OuterVolumeSpecName: "scripts") pod "b683e569-51fc-41f2-bcbb-c3b080cbdf91" (UID: "b683e569-51fc-41f2-bcbb-c3b080cbdf91"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.682770 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" event={"ID":"01bda50c-7553-4997-b592-57430c227d18","Type":"ContainerDied","Data":"52484474d44021030c17574872b522a6feb466f94b1020a68dad29aaaa3af88a"} Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.682822 5129 scope.go:117] "RemoveContainer" containerID="5331df50af8b1b516095f38c6ed4a01cc7da0bccb624f41913a0666e26acf343" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.682955 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-c5sv9" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.688360 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b43a32ae-63bf-4627-bbdc-deb131defd74","Type":"ContainerStarted","Data":"9384ab87298e2eeab16b06733e1b00cefca7ef04298183017cade4161e32481e"} Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.689718 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" event={"ID":"78ce367e-be55-4091-915c-319a0e9f4986","Type":"ContainerStarted","Data":"a94725cfb09d99dd393166cca4d6186bdfd260fce260e7fb1127ad0662a92c11"} Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.690206 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.692173 5129 generic.go:334] "Generic (PLEG): container finished" podID="b683e569-51fc-41f2-bcbb-c3b080cbdf91" containerID="682d2ed4954dc4e4fb7f293ca71f5f085fe3f99dd30cfc6fa7f95d6e09eebf34" exitCode=143 Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.692194 5129 generic.go:334] "Generic (PLEG): container finished" podID="b683e569-51fc-41f2-bcbb-c3b080cbdf91" 
containerID="d6bb852176112c2234cf3876699b9390e59e88974d70296aee119813e4d5e0b1" exitCode=143 Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.692221 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b683e569-51fc-41f2-bcbb-c3b080cbdf91","Type":"ContainerDied","Data":"682d2ed4954dc4e4fb7f293ca71f5f085fe3f99dd30cfc6fa7f95d6e09eebf34"} Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.692236 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b683e569-51fc-41f2-bcbb-c3b080cbdf91","Type":"ContainerDied","Data":"d6bb852176112c2234cf3876699b9390e59e88974d70296aee119813e4d5e0b1"} Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.692246 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b683e569-51fc-41f2-bcbb-c3b080cbdf91","Type":"ContainerDied","Data":"80974a998b7059047e1f2325840e49b57ee1d6aab47172b119ecba7c322abaa4"} Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.692284 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.695265 5129 generic.go:334] "Generic (PLEG): container finished" podID="32295485-d05c-4619-b75d-638deb38c9c0" containerID="54d3e699c738bc194b2a33f855142c97e9cee53bfb0ad24709038db8332022df" exitCode=143 Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.705300 5129 generic.go:334] "Generic (PLEG): container finished" podID="32295485-d05c-4619-b75d-638deb38c9c0" containerID="071d4c470a706315a4ab4d2dbcecacc5edcf67f5816419489da0c1c9e353496f" exitCode=143 Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.696026 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"32295485-d05c-4619-b75d-638deb38c9c0","Type":"ContainerDied","Data":"54d3e699c738bc194b2a33f855142c97e9cee53bfb0ad24709038db8332022df"} Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.705770 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"32295485-d05c-4619-b75d-638deb38c9c0","Type":"ContainerDied","Data":"071d4c470a706315a4ab4d2dbcecacc5edcf67f5816419489da0c1c9e353496f"} Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.705875 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"32295485-d05c-4619-b75d-638deb38c9c0","Type":"ContainerDied","Data":"38aadd8ffed95ef5064fbd249cdc01ae832d442e51d3e05e94866e8903a07974"} Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.705961 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38aadd8ffed95ef5064fbd249cdc01ae832d442e51d3e05e94866e8903a07974" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.706784 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7f8d656d4-bfw28" podStartSLOduration=3.70677153 podStartE2EDuration="3.70677153s" 
podCreationTimestamp="2026-03-14 07:21:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:45.695565551 +0000 UTC m=+1368.447480735" watchObservedRunningTime="2026-03-14 07:21:45.70677153 +0000 UTC m=+1368.458686714" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.718661 5129 generic.go:334] "Generic (PLEG): container finished" podID="7f4ee137-42c5-4c71-943f-767cd4c43b5b" containerID="688b601624978ad581d2874e16b4fd640acb7ecdf38357ac8796f3c33fdb6a93" exitCode=0 Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.718754 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x5pc5" event={"ID":"7f4ee137-42c5-4c71-943f-767cd4c43b5b","Type":"ContainerDied","Data":"688b601624978ad581d2874e16b4fd640acb7ecdf38357ac8796f3c33fdb6a93"} Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.734812 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b683e569-51fc-41f2-bcbb-c3b080cbdf91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b683e569-51fc-41f2-bcbb-c3b080cbdf91" (UID: "b683e569-51fc-41f2-bcbb-c3b080cbdf91"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.753252 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" podStartSLOduration=3.753231434 podStartE2EDuration="3.753231434s" podCreationTimestamp="2026-03-14 07:21:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:45.750805529 +0000 UTC m=+1368.502720713" watchObservedRunningTime="2026-03-14 07:21:45.753231434 +0000 UTC m=+1368.505146628" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.764479 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr8nz\" (UniqueName: \"kubernetes.io/projected/f8803f0b-1655-4721-a92e-8241f500d9a5-kube-api-access-wr8nz\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.764535 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-httpd-config\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.764554 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-ovndb-tls-certs\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.764626 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-config\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.764667 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-internal-tls-certs\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.764684 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-combined-ca-bundle\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.764714 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-public-tls-certs\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.764761 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b683e569-51fc-41f2-bcbb-c3b080cbdf91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.764771 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b683e569-51fc-41f2-bcbb-c3b080cbdf91-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.764780 5129 reconciler_common.go:293] "Volume detached for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b683e569-51fc-41f2-bcbb-c3b080cbdf91-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.764790 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b683e569-51fc-41f2-bcbb-c3b080cbdf91-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.764798 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qtr9\" (UniqueName: \"kubernetes.io/projected/b683e569-51fc-41f2-bcbb-c3b080cbdf91-kube-api-access-8qtr9\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.764816 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.774938 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-config\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.780278 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-public-tls-certs\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.780683 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-combined-ca-bundle\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " 
pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.781668 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-ovndb-tls-certs\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.783227 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-httpd-config\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.785545 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-internal-tls-certs\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.786717 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr8nz\" (UniqueName: \"kubernetes.io/projected/f8803f0b-1655-4721-a92e-8241f500d9a5-kube-api-access-wr8nz\") pod \"neutron-798779f645-jt8hz\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.788426 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b683e569-51fc-41f2-bcbb-c3b080cbdf91-config-data" (OuterVolumeSpecName: "config-data") pod "b683e569-51fc-41f2-bcbb-c3b080cbdf91" (UID: "b683e569-51fc-41f2-bcbb-c3b080cbdf91"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.809795 5129 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.820228 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.824937 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.866324 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b683e569-51fc-41f2-bcbb-c3b080cbdf91-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.868192 5129 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.919722 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-c5sv9"] Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.930086 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-c5sv9"] Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.936281 5129 scope.go:117] "RemoveContainer" containerID="682d2ed4954dc4e4fb7f293ca71f5f085fe3f99dd30cfc6fa7f95d6e09eebf34" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.965114 5129 scope.go:117] "RemoveContainer" containerID="d6bb852176112c2234cf3876699b9390e59e88974d70296aee119813e4d5e0b1" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.969104 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32295485-d05c-4619-b75d-638deb38c9c0-combined-ca-bundle\") pod \"32295485-d05c-4619-b75d-638deb38c9c0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.969145 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"32295485-d05c-4619-b75d-638deb38c9c0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.969189 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32295485-d05c-4619-b75d-638deb38c9c0-scripts\") pod \"32295485-d05c-4619-b75d-638deb38c9c0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.969301 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32295485-d05c-4619-b75d-638deb38c9c0-httpd-run\") pod \"32295485-d05c-4619-b75d-638deb38c9c0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.969363 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32295485-d05c-4619-b75d-638deb38c9c0-config-data\") pod \"32295485-d05c-4619-b75d-638deb38c9c0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.969420 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfmj5\" (UniqueName: \"kubernetes.io/projected/32295485-d05c-4619-b75d-638deb38c9c0-kube-api-access-qfmj5\") pod \"32295485-d05c-4619-b75d-638deb38c9c0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.969494 5129 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32295485-d05c-4619-b75d-638deb38c9c0-logs\") pod \"32295485-d05c-4619-b75d-638deb38c9c0\" (UID: \"32295485-d05c-4619-b75d-638deb38c9c0\") " Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.969993 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32295485-d05c-4619-b75d-638deb38c9c0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "32295485-d05c-4619-b75d-638deb38c9c0" (UID: "32295485-d05c-4619-b75d-638deb38c9c0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.970156 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32295485-d05c-4619-b75d-638deb38c9c0-logs" (OuterVolumeSpecName: "logs") pod "32295485-d05c-4619-b75d-638deb38c9c0" (UID: "32295485-d05c-4619-b75d-638deb38c9c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.974098 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "32295485-d05c-4619-b75d-638deb38c9c0" (UID: "32295485-d05c-4619-b75d-638deb38c9c0"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.974618 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32295485-d05c-4619-b75d-638deb38c9c0-scripts" (OuterVolumeSpecName: "scripts") pod "32295485-d05c-4619-b75d-638deb38c9c0" (UID: "32295485-d05c-4619-b75d-638deb38c9c0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:45 crc kubenswrapper[5129]: I0314 07:21:45.976429 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32295485-d05c-4619-b75d-638deb38c9c0-kube-api-access-qfmj5" (OuterVolumeSpecName: "kube-api-access-qfmj5") pod "32295485-d05c-4619-b75d-638deb38c9c0" (UID: "32295485-d05c-4619-b75d-638deb38c9c0"). InnerVolumeSpecName "kube-api-access-qfmj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.000781 5129 scope.go:117] "RemoveContainer" containerID="682d2ed4954dc4e4fb7f293ca71f5f085fe3f99dd30cfc6fa7f95d6e09eebf34" Mar 14 07:21:46 crc kubenswrapper[5129]: E0314 07:21:46.004790 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"682d2ed4954dc4e4fb7f293ca71f5f085fe3f99dd30cfc6fa7f95d6e09eebf34\": container with ID starting with 682d2ed4954dc4e4fb7f293ca71f5f085fe3f99dd30cfc6fa7f95d6e09eebf34 not found: ID does not exist" containerID="682d2ed4954dc4e4fb7f293ca71f5f085fe3f99dd30cfc6fa7f95d6e09eebf34" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.004823 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"682d2ed4954dc4e4fb7f293ca71f5f085fe3f99dd30cfc6fa7f95d6e09eebf34"} err="failed to get container status \"682d2ed4954dc4e4fb7f293ca71f5f085fe3f99dd30cfc6fa7f95d6e09eebf34\": rpc error: code = NotFound desc = could not find container \"682d2ed4954dc4e4fb7f293ca71f5f085fe3f99dd30cfc6fa7f95d6e09eebf34\": container with ID starting with 682d2ed4954dc4e4fb7f293ca71f5f085fe3f99dd30cfc6fa7f95d6e09eebf34 not found: ID does not exist" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.004845 5129 scope.go:117] "RemoveContainer" containerID="d6bb852176112c2234cf3876699b9390e59e88974d70296aee119813e4d5e0b1" Mar 14 07:21:46 crc kubenswrapper[5129]: E0314 07:21:46.009658 5129 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6bb852176112c2234cf3876699b9390e59e88974d70296aee119813e4d5e0b1\": container with ID starting with d6bb852176112c2234cf3876699b9390e59e88974d70296aee119813e4d5e0b1 not found: ID does not exist" containerID="d6bb852176112c2234cf3876699b9390e59e88974d70296aee119813e4d5e0b1" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.009708 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6bb852176112c2234cf3876699b9390e59e88974d70296aee119813e4d5e0b1"} err="failed to get container status \"d6bb852176112c2234cf3876699b9390e59e88974d70296aee119813e4d5e0b1\": rpc error: code = NotFound desc = could not find container \"d6bb852176112c2234cf3876699b9390e59e88974d70296aee119813e4d5e0b1\": container with ID starting with d6bb852176112c2234cf3876699b9390e59e88974d70296aee119813e4d5e0b1 not found: ID does not exist" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.009737 5129 scope.go:117] "RemoveContainer" containerID="682d2ed4954dc4e4fb7f293ca71f5f085fe3f99dd30cfc6fa7f95d6e09eebf34" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.010043 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"682d2ed4954dc4e4fb7f293ca71f5f085fe3f99dd30cfc6fa7f95d6e09eebf34"} err="failed to get container status \"682d2ed4954dc4e4fb7f293ca71f5f085fe3f99dd30cfc6fa7f95d6e09eebf34\": rpc error: code = NotFound desc = could not find container \"682d2ed4954dc4e4fb7f293ca71f5f085fe3f99dd30cfc6fa7f95d6e09eebf34\": container with ID starting with 682d2ed4954dc4e4fb7f293ca71f5f085fe3f99dd30cfc6fa7f95d6e09eebf34 not found: ID does not exist" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.010059 5129 scope.go:117] "RemoveContainer" containerID="d6bb852176112c2234cf3876699b9390e59e88974d70296aee119813e4d5e0b1" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 
07:21:46.010033 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32295485-d05c-4619-b75d-638deb38c9c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32295485-d05c-4619-b75d-638deb38c9c0" (UID: "32295485-d05c-4619-b75d-638deb38c9c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.010350 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6bb852176112c2234cf3876699b9390e59e88974d70296aee119813e4d5e0b1"} err="failed to get container status \"d6bb852176112c2234cf3876699b9390e59e88974d70296aee119813e4d5e0b1\": rpc error: code = NotFound desc = could not find container \"d6bb852176112c2234cf3876699b9390e59e88974d70296aee119813e4d5e0b1\": container with ID starting with d6bb852176112c2234cf3876699b9390e59e88974d70296aee119813e4d5e0b1 not found: ID does not exist" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.044062 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32295485-d05c-4619-b75d-638deb38c9c0-config-data" (OuterVolumeSpecName: "config-data") pod "32295485-d05c-4619-b75d-638deb38c9c0" (UID: "32295485-d05c-4619-b75d-638deb38c9c0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.047438 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01bda50c-7553-4997-b592-57430c227d18" path="/var/lib/kubelet/pods/01bda50c-7553-4997-b592-57430c227d18/volumes" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.059586 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.069661 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.073842 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32295485-d05c-4619-b75d-638deb38c9c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.073963 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.074021 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32295485-d05c-4619-b75d-638deb38c9c0-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.074073 5129 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32295485-d05c-4619-b75d-638deb38c9c0-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.074124 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32295485-d05c-4619-b75d-638deb38c9c0-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.074174 5129 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfmj5\" (UniqueName: \"kubernetes.io/projected/32295485-d05c-4619-b75d-638deb38c9c0-kube-api-access-qfmj5\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.074248 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32295485-d05c-4619-b75d-638deb38c9c0-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.077387 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:21:46 crc kubenswrapper[5129]: E0314 07:21:46.078065 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b683e569-51fc-41f2-bcbb-c3b080cbdf91" containerName="glance-httpd" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.078087 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b683e569-51fc-41f2-bcbb-c3b080cbdf91" containerName="glance-httpd" Mar 14 07:21:46 crc kubenswrapper[5129]: E0314 07:21:46.082162 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32295485-d05c-4619-b75d-638deb38c9c0" containerName="glance-httpd" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.082195 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="32295485-d05c-4619-b75d-638deb38c9c0" containerName="glance-httpd" Mar 14 07:21:46 crc kubenswrapper[5129]: E0314 07:21:46.082241 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32295485-d05c-4619-b75d-638deb38c9c0" containerName="glance-log" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.082248 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="32295485-d05c-4619-b75d-638deb38c9c0" containerName="glance-log" Mar 14 07:21:46 crc kubenswrapper[5129]: E0314 07:21:46.082266 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b683e569-51fc-41f2-bcbb-c3b080cbdf91" containerName="glance-log" Mar 14 
07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.082274 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b683e569-51fc-41f2-bcbb-c3b080cbdf91" containerName="glance-log" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.082544 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b683e569-51fc-41f2-bcbb-c3b080cbdf91" containerName="glance-httpd" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.082561 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="32295485-d05c-4619-b75d-638deb38c9c0" containerName="glance-log" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.082576 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="32295485-d05c-4619-b75d-638deb38c9c0" containerName="glance-httpd" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.082583 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b683e569-51fc-41f2-bcbb-c3b080cbdf91" containerName="glance-log" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.085512 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.085704 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.087979 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.091878 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.094450 5129 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.177964 5129 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.279503 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-scripts\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.279567 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90a4650a-066d-455d-987d-a67b396fd4d9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.279610 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: 
\"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.279637 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-config-data\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.279672 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90a4650a-066d-455d-987d-a67b396fd4d9-logs\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.279689 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.279717 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn4z8\" (UniqueName: \"kubernetes.io/projected/90a4650a-066d-455d-987d-a67b396fd4d9-kube-api-access-bn4z8\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.279845 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-public-tls-certs\") 
pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.381540 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.381673 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-scripts\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.381719 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90a4650a-066d-455d-987d-a67b396fd4d9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.381754 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.381786 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-config-data\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " 
pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.381825 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90a4650a-066d-455d-987d-a67b396fd4d9-logs\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.381843 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.381867 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn4z8\" (UniqueName: \"kubernetes.io/projected/90a4650a-066d-455d-987d-a67b396fd4d9-kube-api-access-bn4z8\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.382182 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.382274 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90a4650a-066d-455d-987d-a67b396fd4d9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" 
Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.382559 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90a4650a-066d-455d-987d-a67b396fd4d9-logs\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.389347 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.389842 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.390248 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-scripts\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.397938 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-config-data\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.401841 5129 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-bn4z8\" (UniqueName: \"kubernetes.io/projected/90a4650a-066d-455d-987d-a67b396fd4d9-kube-api-access-bn4z8\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.415812 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.485624 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-798779f645-jt8hz"] Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.705374 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.742100 5129 generic.go:334] "Generic (PLEG): container finished" podID="e9fd78eb-874d-4fbd-b8b3-7e23a32503b5" containerID="6d5f8bf87e692aabeac10226f1e8e09b6e6634992f400961227fd04fb3fa5a3d" exitCode=0 Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.742154 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4wxfh" event={"ID":"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5","Type":"ContainerDied","Data":"6d5f8bf87e692aabeac10226f1e8e09b6e6634992f400961227fd04fb3fa5a3d"} Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.746791 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-798779f645-jt8hz" event={"ID":"f8803f0b-1655-4721-a92e-8241f500d9a5","Type":"ContainerStarted","Data":"8054a637992c524ebc44788bf45661558d65c19eb31bfd95c92ef46f20cf3e49"} Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.778466 5129 generic.go:334] "Generic (PLEG): container finished" 
podID="3b45e803-7558-488a-bc9b-39982239b9a5" containerID="b6129aad28f25a4355bc45fcf76cc3d1877902860adc2871b519d96f22803f4c" exitCode=0 Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.778536 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wcl8h" event={"ID":"3b45e803-7558-488a-bc9b-39982239b9a5","Type":"ContainerDied","Data":"b6129aad28f25a4355bc45fcf76cc3d1877902860adc2871b519d96f22803f4c"} Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.805589 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.879716 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.892817 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.911338 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.919014 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.923072 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.932198 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.932671 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.991160 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.991231 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb182a34-3807-465d-b706-929d0abe4904-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.991333 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb182a34-3807-465d-b706-929d0abe4904-logs\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.991372 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.991415 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvbd4\" (UniqueName: \"kubernetes.io/projected/bb182a34-3807-465d-b706-929d0abe4904-kube-api-access-kvbd4\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.991448 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.991498 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:46 crc kubenswrapper[5129]: I0314 07:21:46.991571 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.102840 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kvbd4\" (UniqueName: \"kubernetes.io/projected/bb182a34-3807-465d-b706-929d0abe4904-kube-api-access-kvbd4\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.102918 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.103000 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.103297 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.103464 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.103510 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/bb182a34-3807-465d-b706-929d0abe4904-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.103586 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb182a34-3807-465d-b706-929d0abe4904-logs\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.103666 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.105166 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb182a34-3807-465d-b706-929d0abe4904-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.107474 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.108476 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb182a34-3807-465d-b706-929d0abe4904-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.113360 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.119326 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.125426 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.130711 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvbd4\" (UniqueName: \"kubernetes.io/projected/bb182a34-3807-465d-b706-929d0abe4904-kube-api-access-kvbd4\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.132588 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " 
pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.144146 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.236224 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.291744 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.408170 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn5wh\" (UniqueName: \"kubernetes.io/projected/7f4ee137-42c5-4c71-943f-767cd4c43b5b-kube-api-access-gn5wh\") pod \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.408287 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f4ee137-42c5-4c71-943f-767cd4c43b5b-scripts\") pod \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.408428 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f4ee137-42c5-4c71-943f-767cd4c43b5b-combined-ca-bundle\") pod \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.409451 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/7f4ee137-42c5-4c71-943f-767cd4c43b5b-logs\") pod \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.409533 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f4ee137-42c5-4c71-943f-767cd4c43b5b-config-data\") pod \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\" (UID: \"7f4ee137-42c5-4c71-943f-767cd4c43b5b\") " Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.411966 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f4ee137-42c5-4c71-943f-767cd4c43b5b-logs" (OuterVolumeSpecName: "logs") pod "7f4ee137-42c5-4c71-943f-767cd4c43b5b" (UID: "7f4ee137-42c5-4c71-943f-767cd4c43b5b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.416963 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f4ee137-42c5-4c71-943f-767cd4c43b5b-scripts" (OuterVolumeSpecName: "scripts") pod "7f4ee137-42c5-4c71-943f-767cd4c43b5b" (UID: "7f4ee137-42c5-4c71-943f-767cd4c43b5b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.420785 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f4ee137-42c5-4c71-943f-767cd4c43b5b-kube-api-access-gn5wh" (OuterVolumeSpecName: "kube-api-access-gn5wh") pod "7f4ee137-42c5-4c71-943f-767cd4c43b5b" (UID: "7f4ee137-42c5-4c71-943f-767cd4c43b5b"). InnerVolumeSpecName "kube-api-access-gn5wh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.464049 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f4ee137-42c5-4c71-943f-767cd4c43b5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f4ee137-42c5-4c71-943f-767cd4c43b5b" (UID: "7f4ee137-42c5-4c71-943f-767cd4c43b5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.464416 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f4ee137-42c5-4c71-943f-767cd4c43b5b-config-data" (OuterVolumeSpecName: "config-data") pod "7f4ee137-42c5-4c71-943f-767cd4c43b5b" (UID: "7f4ee137-42c5-4c71-943f-767cd4c43b5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.488879 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.512120 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f4ee137-42c5-4c71-943f-767cd4c43b5b-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.512147 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f4ee137-42c5-4c71-943f-767cd4c43b5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.512179 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f4ee137-42c5-4c71-943f-767cd4c43b5b-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.512189 5129 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7f4ee137-42c5-4c71-943f-767cd4c43b5b-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.512199 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn5wh\" (UniqueName: \"kubernetes.io/projected/7f4ee137-42c5-4c71-943f-767cd4c43b5b-kube-api-access-gn5wh\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.817871 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"90a4650a-066d-455d-987d-a67b396fd4d9","Type":"ContainerStarted","Data":"d7838e8a795c34cca10b06e361f98d01ce99725bee18229a7703a2ac15865a32"} Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.821389 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x5pc5" event={"ID":"7f4ee137-42c5-4c71-943f-767cd4c43b5b","Type":"ContainerDied","Data":"79c279e4df085b69039f10f41cc6140348eec98c8ee2bc446db31c603a72f00e"} Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.821436 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79c279e4df085b69039f10f41cc6140348eec98c8ee2bc446db31c603a72f00e" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.821483 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-x5pc5" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.827780 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-798779f645-jt8hz" event={"ID":"f8803f0b-1655-4721-a92e-8241f500d9a5","Type":"ContainerStarted","Data":"e7914ce04ee9148973aef1627d62c03d2644a8e703facc10c2922ec5de2807dc"} Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.827822 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-798779f645-jt8hz" event={"ID":"f8803f0b-1655-4721-a92e-8241f500d9a5","Type":"ContainerStarted","Data":"39c6a62ea5b9ad99beda56d6e7b2e80859bc775547164da440ad044eca959f8e"} Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.829651 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.844071 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.892256 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-798779f645-jt8hz" podStartSLOduration=2.892233884 podStartE2EDuration="2.892233884s" podCreationTimestamp="2026-03-14 07:21:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:47.862871339 +0000 UTC m=+1370.614786523" watchObservedRunningTime="2026-03-14 07:21:47.892233884 +0000 UTC m=+1370.644149068" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.925370 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-586cb48554-pw8mx"] Mar 14 07:21:47 crc kubenswrapper[5129]: E0314 07:21:47.925713 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f4ee137-42c5-4c71-943f-767cd4c43b5b" containerName="placement-db-sync" Mar 14 07:21:47 crc kubenswrapper[5129]: 
I0314 07:21:47.925729 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f4ee137-42c5-4c71-943f-767cd4c43b5b" containerName="placement-db-sync" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.925883 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f4ee137-42c5-4c71-943f-767cd4c43b5b" containerName="placement-db-sync" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.926921 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.929651 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.932140 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.932361 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ztjd2" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.932470 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.932554 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 14 07:21:47 crc kubenswrapper[5129]: I0314 07:21:47.986662 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-586cb48554-pw8mx"] Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.020765 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-scripts\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 
07:21:48.021010 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-internal-tls-certs\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.021043 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-public-tls-certs\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.021088 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbvgz\" (UniqueName: \"kubernetes.io/projected/abf3a56d-8229-4cb2-8c84-b8f12e11753f-kube-api-access-dbvgz\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.021133 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-combined-ca-bundle\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.021184 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-config-data\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 
07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.021204 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abf3a56d-8229-4cb2-8c84-b8f12e11753f-logs\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.059592 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32295485-d05c-4619-b75d-638deb38c9c0" path="/var/lib/kubelet/pods/32295485-d05c-4619-b75d-638deb38c9c0/volumes" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.077051 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b683e569-51fc-41f2-bcbb-c3b080cbdf91" path="/var/lib/kubelet/pods/b683e569-51fc-41f2-bcbb-c3b080cbdf91/volumes" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.125559 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbvgz\" (UniqueName: \"kubernetes.io/projected/abf3a56d-8229-4cb2-8c84-b8f12e11753f-kube-api-access-dbvgz\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.126513 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-combined-ca-bundle\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.126581 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-config-data\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " 
pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.126630 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abf3a56d-8229-4cb2-8c84-b8f12e11753f-logs\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.126679 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-scripts\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.126697 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-internal-tls-certs\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.126722 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-public-tls-certs\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.127889 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abf3a56d-8229-4cb2-8c84-b8f12e11753f-logs\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.147043 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-public-tls-certs\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.147816 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-combined-ca-bundle\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.153057 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-scripts\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.157157 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-config-data\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.182348 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-internal-tls-certs\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.223673 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbvgz\" (UniqueName: 
\"kubernetes.io/projected/abf3a56d-8229-4cb2-8c84-b8f12e11753f-kube-api-access-dbvgz\") pod \"placement-586cb48554-pw8mx\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.253126 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.492284 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wcl8h" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.573107 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b45e803-7558-488a-bc9b-39982239b9a5-combined-ca-bundle\") pod \"3b45e803-7558-488a-bc9b-39982239b9a5\" (UID: \"3b45e803-7558-488a-bc9b-39982239b9a5\") " Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.573206 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm2ww\" (UniqueName: \"kubernetes.io/projected/3b45e803-7558-488a-bc9b-39982239b9a5-kube-api-access-gm2ww\") pod \"3b45e803-7558-488a-bc9b-39982239b9a5\" (UID: \"3b45e803-7558-488a-bc9b-39982239b9a5\") " Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.573289 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3b45e803-7558-488a-bc9b-39982239b9a5-db-sync-config-data\") pod \"3b45e803-7558-488a-bc9b-39982239b9a5\" (UID: \"3b45e803-7558-488a-bc9b-39982239b9a5\") " Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.582893 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b45e803-7558-488a-bc9b-39982239b9a5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3b45e803-7558-488a-bc9b-39982239b9a5" (UID: 
"3b45e803-7558-488a-bc9b-39982239b9a5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.583382 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b45e803-7558-488a-bc9b-39982239b9a5-kube-api-access-gm2ww" (OuterVolumeSpecName: "kube-api-access-gm2ww") pod "3b45e803-7558-488a-bc9b-39982239b9a5" (UID: "3b45e803-7558-488a-bc9b-39982239b9a5"). InnerVolumeSpecName "kube-api-access-gm2ww". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.608708 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b45e803-7558-488a-bc9b-39982239b9a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b45e803-7558-488a-bc9b-39982239b9a5" (UID: "3b45e803-7558-488a-bc9b-39982239b9a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.676656 5129 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3b45e803-7558-488a-bc9b-39982239b9a5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.676693 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b45e803-7558-488a-bc9b-39982239b9a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.676702 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm2ww\" (UniqueName: \"kubernetes.io/projected/3b45e803-7558-488a-bc9b-39982239b9a5-kube-api-access-gm2ww\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.736059 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.848779 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-586cb48554-pw8mx"] Mar 14 07:21:48 crc kubenswrapper[5129]: W0314 07:21:48.853776 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabf3a56d_8229_4cb2_8c84_b8f12e11753f.slice/crio-2052816c5c5effe5ed1b04b1c5af4261524c0a9b579bd45cef3f326de28bd443 WatchSource:0}: Error finding container 2052816c5c5effe5ed1b04b1c5af4261524c0a9b579bd45cef3f326de28bd443: Status 404 returned error can't find the container with id 2052816c5c5effe5ed1b04b1c5af4261524c0a9b579bd45cef3f326de28bd443 Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.866788 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb182a34-3807-465d-b706-929d0abe4904","Type":"ContainerStarted","Data":"0360e27fb95df7cd16afa2a836684c36cdef6ee7059adcb428f3ebdbf54b7f69"} Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.866849 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb182a34-3807-465d-b706-929d0abe4904","Type":"ContainerStarted","Data":"7f5fafd24b6f41853d26333565ae435f18109848a05677376eb869366381cbfa"} Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.876453 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4wxfh" event={"ID":"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5","Type":"ContainerDied","Data":"0a24fef5b1ac06da67c6991ff79481df3fc6adfb03b875048944045da1814587"} Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.876490 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a24fef5b1ac06da67c6991ff79481df3fc6adfb03b875048944045da1814587" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.876651 5129 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4wxfh" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.883250 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-fernet-keys\") pod \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.883304 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-config-data\") pod \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.883346 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-combined-ca-bundle\") pod \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.883442 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2t77\" (UniqueName: \"kubernetes.io/projected/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-kube-api-access-l2t77\") pod \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.883510 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-scripts\") pod \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.883544 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-credential-keys\") pod \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\" (UID: \"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5\") " Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.894944 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-scripts" (OuterVolumeSpecName: "scripts") pod "e9fd78eb-874d-4fbd-b8b3-7e23a32503b5" (UID: "e9fd78eb-874d-4fbd-b8b3-7e23a32503b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.895616 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e9fd78eb-874d-4fbd-b8b3-7e23a32503b5" (UID: "e9fd78eb-874d-4fbd-b8b3-7e23a32503b5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.898847 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e9fd78eb-874d-4fbd-b8b3-7e23a32503b5" (UID: "e9fd78eb-874d-4fbd-b8b3-7e23a32503b5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.899648 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-kube-api-access-l2t77" (OuterVolumeSpecName: "kube-api-access-l2t77") pod "e9fd78eb-874d-4fbd-b8b3-7e23a32503b5" (UID: "e9fd78eb-874d-4fbd-b8b3-7e23a32503b5"). InnerVolumeSpecName "kube-api-access-l2t77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.925339 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wcl8h" event={"ID":"3b45e803-7558-488a-bc9b-39982239b9a5","Type":"ContainerDied","Data":"d3d8e2cc64636489b491a9a1962723e996df083e56ee648ff377108d9fcf11ed"} Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.925577 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3d8e2cc64636489b491a9a1962723e996df083e56ee648ff377108d9fcf11ed" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.926032 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wcl8h" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.962671 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9fd78eb-874d-4fbd-b8b3-7e23a32503b5" (UID: "e9fd78eb-874d-4fbd-b8b3-7e23a32503b5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.973757 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"90a4650a-066d-455d-987d-a67b396fd4d9","Type":"ContainerStarted","Data":"caa8b38235def44e44e3cb70551fb1ad580918f03302884d0e6b1226ec93a775"} Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.977851 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-754fd75497-x4zwc"] Mar 14 07:21:48 crc kubenswrapper[5129]: E0314 07:21:48.978571 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9fd78eb-874d-4fbd-b8b3-7e23a32503b5" containerName="keystone-bootstrap" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.978589 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9fd78eb-874d-4fbd-b8b3-7e23a32503b5" containerName="keystone-bootstrap" Mar 14 07:21:48 crc kubenswrapper[5129]: E0314 07:21:48.978630 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b45e803-7558-488a-bc9b-39982239b9a5" containerName="barbican-db-sync" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.978640 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b45e803-7558-488a-bc9b-39982239b9a5" containerName="barbican-db-sync" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.978807 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9fd78eb-874d-4fbd-b8b3-7e23a32503b5" containerName="keystone-bootstrap" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.978823 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b45e803-7558-488a-bc9b-39982239b9a5" containerName="barbican-db-sync" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.979317 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.992007 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.992182 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.993592 5129 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.993626 5129 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.993743 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.993755 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2t77\" (UniqueName: \"kubernetes.io/projected/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-kube-api-access-l2t77\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:48 crc kubenswrapper[5129]: I0314 07:21:48.993763 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.010083 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-config-data" (OuterVolumeSpecName: "config-data") pod 
"e9fd78eb-874d-4fbd-b8b3-7e23a32503b5" (UID: "e9fd78eb-874d-4fbd-b8b3-7e23a32503b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.027351 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-754fd75497-x4zwc"] Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.095324 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-config-data\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.095757 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-fernet-keys\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.095842 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-combined-ca-bundle\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.095980 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-internal-tls-certs\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.096116 
5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-credential-keys\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.096191 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-public-tls-certs\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.096313 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dg7g\" (UniqueName: \"kubernetes.io/projected/be987b8a-a47d-46a9-bce9-6969473125ff-kube-api-access-9dg7g\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.096425 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-scripts\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.096516 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.101660 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-654c655ccc-bwh2q"] Mar 14 07:21:49 crc 
kubenswrapper[5129]: I0314 07:21:49.102993 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.134935 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5zr48" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.146326 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.146881 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.200038 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-internal-tls-certs\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.201576 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-credential-keys\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.201986 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-public-tls-certs\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.202072 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9dg7g\" (UniqueName: \"kubernetes.io/projected/be987b8a-a47d-46a9-bce9-6969473125ff-kube-api-access-9dg7g\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.202116 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68957226-c1ac-43ef-8dba-69e7eb7a805d-config-data\") pod \"barbican-worker-654c655ccc-bwh2q\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.202154 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn62d\" (UniqueName: \"kubernetes.io/projected/68957226-c1ac-43ef-8dba-69e7eb7a805d-kube-api-access-pn62d\") pod \"barbican-worker-654c655ccc-bwh2q\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.202177 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-scripts\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.202243 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68957226-c1ac-43ef-8dba-69e7eb7a805d-combined-ca-bundle\") pod \"barbican-worker-654c655ccc-bwh2q\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.202266 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-config-data\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.202299 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-fernet-keys\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.202320 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68957226-c1ac-43ef-8dba-69e7eb7a805d-logs\") pod \"barbican-worker-654c655ccc-bwh2q\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.202353 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68957226-c1ac-43ef-8dba-69e7eb7a805d-config-data-custom\") pod \"barbican-worker-654c655ccc-bwh2q\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.202381 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-combined-ca-bundle\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.214383 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-internal-tls-certs\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.227262 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-public-tls-certs\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.231287 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-fernet-keys\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.244193 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-config-data\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.245041 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-combined-ca-bundle\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.245199 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-scripts\") pod \"keystone-754fd75497-x4zwc\" 
(UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.247246 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-654c655ccc-bwh2q"] Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.248199 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-credential-keys\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.251270 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dg7g\" (UniqueName: \"kubernetes.io/projected/be987b8a-a47d-46a9-bce9-6969473125ff-kube-api-access-9dg7g\") pod \"keystone-754fd75497-x4zwc\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.258684 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7bb8f997b6-dl5q6"] Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.260093 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.265902 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-4c2zz"] Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.266113 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" podUID="78ce367e-be55-4091-915c-319a0e9f4986" containerName="dnsmasq-dns" containerID="cri-o://a94725cfb09d99dd393166cca4d6186bdfd260fce260e7fb1127ad0662a92c11" gracePeriod=10 Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.268170 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.282260 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7bb8f997b6-dl5q6"] Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.299863 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-xtd6w"] Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.301269 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.305777 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68957226-c1ac-43ef-8dba-69e7eb7a805d-config-data\") pod \"barbican-worker-654c655ccc-bwh2q\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.305818 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn62d\" (UniqueName: \"kubernetes.io/projected/68957226-c1ac-43ef-8dba-69e7eb7a805d-kube-api-access-pn62d\") pod \"barbican-worker-654c655ccc-bwh2q\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.305853 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68957226-c1ac-43ef-8dba-69e7eb7a805d-combined-ca-bundle\") pod \"barbican-worker-654c655ccc-bwh2q\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.305874 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68957226-c1ac-43ef-8dba-69e7eb7a805d-logs\") pod \"barbican-worker-654c655ccc-bwh2q\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.305894 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68957226-c1ac-43ef-8dba-69e7eb7a805d-config-data-custom\") pod \"barbican-worker-654c655ccc-bwh2q\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") 
" pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.314969 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68957226-c1ac-43ef-8dba-69e7eb7a805d-config-data\") pod \"barbican-worker-654c655ccc-bwh2q\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.315238 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68957226-c1ac-43ef-8dba-69e7eb7a805d-logs\") pod \"barbican-worker-654c655ccc-bwh2q\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.315269 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-xtd6w"] Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.319000 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68957226-c1ac-43ef-8dba-69e7eb7a805d-config-data-custom\") pod \"barbican-worker-654c655ccc-bwh2q\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.324147 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6449ddfb8d-8cpfw"] Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.331984 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.333727 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.337298 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68957226-c1ac-43ef-8dba-69e7eb7a805d-combined-ca-bundle\") pod \"barbican-worker-654c655ccc-bwh2q\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.347046 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6449ddfb8d-8cpfw"] Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.363489 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn62d\" (UniqueName: \"kubernetes.io/projected/68957226-c1ac-43ef-8dba-69e7eb7a805d-kube-api-access-pn62d\") pod \"barbican-worker-654c655ccc-bwh2q\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.374671 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-696c7b8d5f-j2w5q"] Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.376379 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.378701 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.412356 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-config\") pod \"dnsmasq-dns-7fc46d7df7-xtd6w\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.412422 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba0650a3-3274-43ea-8c60-fa69e20086dd-config-data-custom\") pod \"barbican-keystone-listener-7bb8f997b6-dl5q6\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.412514 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22p2q\" (UniqueName: \"kubernetes.io/projected/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-kube-api-access-22p2q\") pod \"barbican-api-6449ddfb8d-8cpfw\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.412593 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn8xj\" (UniqueName: \"kubernetes.io/projected/7dd8255e-0613-43d3-a721-eb5cde92ae4f-kube-api-access-sn8xj\") pod \"dnsmasq-dns-7fc46d7df7-xtd6w\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.412848 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8gbg\" (UniqueName: 
\"kubernetes.io/projected/ba0650a3-3274-43ea-8c60-fa69e20086dd-kube-api-access-g8gbg\") pod \"barbican-keystone-listener-7bb8f997b6-dl5q6\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.412984 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-dns-svc\") pod \"dnsmasq-dns-7fc46d7df7-xtd6w\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.413156 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc46d7df7-xtd6w\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.413312 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba0650a3-3274-43ea-8c60-fa69e20086dd-combined-ca-bundle\") pod \"barbican-keystone-listener-7bb8f997b6-dl5q6\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.413608 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-config-data-custom\") pod \"barbican-api-6449ddfb8d-8cpfw\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.413936 5129 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-config-data\") pod \"barbican-api-6449ddfb8d-8cpfw\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.413968 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc46d7df7-xtd6w\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.414090 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba0650a3-3274-43ea-8c60-fa69e20086dd-logs\") pod \"barbican-keystone-listener-7bb8f997b6-dl5q6\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.414109 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-logs\") pod \"barbican-api-6449ddfb8d-8cpfw\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.414227 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc46d7df7-xtd6w\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc 
kubenswrapper[5129]: I0314 07:21:49.414276 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba0650a3-3274-43ea-8c60-fa69e20086dd-config-data\") pod \"barbican-keystone-listener-7bb8f997b6-dl5q6\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.414497 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-combined-ca-bundle\") pod \"barbican-api-6449ddfb8d-8cpfw\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.414901 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-54888cd7bb-schqh"] Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.429349 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.449675 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-54888cd7bb-schqh"] Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.462860 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.472840 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-696c7b8d5f-j2w5q"] Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.516869 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn8xj\" (UniqueName: \"kubernetes.io/projected/7dd8255e-0613-43d3-a721-eb5cde92ae4f-kube-api-access-sn8xj\") pod \"dnsmasq-dns-7fc46d7df7-xtd6w\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.516915 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8gbg\" (UniqueName: \"kubernetes.io/projected/ba0650a3-3274-43ea-8c60-fa69e20086dd-kube-api-access-g8gbg\") pod \"barbican-keystone-listener-7bb8f997b6-dl5q6\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.516943 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-dns-svc\") pod \"dnsmasq-dns-7fc46d7df7-xtd6w\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.516967 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233604f0-adda-4669-b868-b96791d98bca-config-data\") pod \"barbican-worker-696c7b8d5f-j2w5q\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.516999 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc46d7df7-xtd6w\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517027 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba0650a3-3274-43ea-8c60-fa69e20086dd-combined-ca-bundle\") pod \"barbican-keystone-listener-7bb8f997b6-dl5q6\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517054 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233604f0-adda-4669-b868-b96791d98bca-logs\") pod \"barbican-worker-696c7b8d5f-j2w5q\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517076 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data-custom\") pod \"barbican-keystone-listener-54888cd7bb-schqh\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517093 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/233604f0-adda-4669-b868-b96791d98bca-config-data-custom\") pod \"barbican-worker-696c7b8d5f-j2w5q\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517110 
5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-config-data-custom\") pod \"barbican-api-6449ddfb8d-8cpfw\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517131 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzhdp\" (UniqueName: \"kubernetes.io/projected/30ca9513-5ae9-4520-8012-3c941786ce2a-kube-api-access-kzhdp\") pod \"barbican-keystone-listener-54888cd7bb-schqh\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517159 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30ca9513-5ae9-4520-8012-3c941786ce2a-logs\") pod \"barbican-keystone-listener-54888cd7bb-schqh\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517175 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-config-data\") pod \"barbican-api-6449ddfb8d-8cpfw\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517191 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc46d7df7-xtd6w\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 
crc kubenswrapper[5129]: I0314 07:21:49.517207 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba0650a3-3274-43ea-8c60-fa69e20086dd-logs\") pod \"barbican-keystone-listener-7bb8f997b6-dl5q6\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517224 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-logs\") pod \"barbican-api-6449ddfb8d-8cpfw\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517240 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc46d7df7-xtd6w\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517261 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data\") pod \"barbican-keystone-listener-54888cd7bb-schqh\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517283 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6sqx\" (UniqueName: \"kubernetes.io/projected/233604f0-adda-4669-b868-b96791d98bca-kube-api-access-t6sqx\") pod \"barbican-worker-696c7b8d5f-j2w5q\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " 
pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517305 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba0650a3-3274-43ea-8c60-fa69e20086dd-config-data\") pod \"barbican-keystone-listener-7bb8f997b6-dl5q6\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517324 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-combined-ca-bundle\") pod \"barbican-keystone-listener-54888cd7bb-schqh\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517343 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-combined-ca-bundle\") pod \"barbican-api-6449ddfb8d-8cpfw\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517362 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233604f0-adda-4669-b868-b96791d98bca-combined-ca-bundle\") pod \"barbican-worker-696c7b8d5f-j2w5q\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517379 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-config\") pod \"dnsmasq-dns-7fc46d7df7-xtd6w\" 
(UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517396 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba0650a3-3274-43ea-8c60-fa69e20086dd-config-data-custom\") pod \"barbican-keystone-listener-7bb8f997b6-dl5q6\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.517419 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22p2q\" (UniqueName: \"kubernetes.io/projected/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-kube-api-access-22p2q\") pod \"barbican-api-6449ddfb8d-8cpfw\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.518195 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc46d7df7-xtd6w\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.518213 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-dns-svc\") pod \"dnsmasq-dns-7fc46d7df7-xtd6w\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.520485 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba0650a3-3274-43ea-8c60-fa69e20086dd-logs\") pod \"barbican-keystone-listener-7bb8f997b6-dl5q6\" (UID: 
\"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.524329 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc46d7df7-xtd6w\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.525864 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-logs\") pod \"barbican-api-6449ddfb8d-8cpfw\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.527773 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc46d7df7-xtd6w\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.538686 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-combined-ca-bundle\") pod \"barbican-api-6449ddfb8d-8cpfw\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.538799 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn8xj\" (UniqueName: \"kubernetes.io/projected/7dd8255e-0613-43d3-a721-eb5cde92ae4f-kube-api-access-sn8xj\") pod \"dnsmasq-dns-7fc46d7df7-xtd6w\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " 
pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.538864 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-config\") pod \"dnsmasq-dns-7fc46d7df7-xtd6w\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.541165 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22p2q\" (UniqueName: \"kubernetes.io/projected/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-kube-api-access-22p2q\") pod \"barbican-api-6449ddfb8d-8cpfw\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.546523 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8gbg\" (UniqueName: \"kubernetes.io/projected/ba0650a3-3274-43ea-8c60-fa69e20086dd-kube-api-access-g8gbg\") pod \"barbican-keystone-listener-7bb8f997b6-dl5q6\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.549598 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-config-data-custom\") pod \"barbican-api-6449ddfb8d-8cpfw\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.552926 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba0650a3-3274-43ea-8c60-fa69e20086dd-combined-ca-bundle\") pod \"barbican-keystone-listener-7bb8f997b6-dl5q6\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " 
pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.553738 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-75b4586cb8-pfpxj"] Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.555527 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.563798 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-75b4586cb8-pfpxj"] Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.566312 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-config-data\") pod \"barbican-api-6449ddfb8d-8cpfw\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.566613 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba0650a3-3274-43ea-8c60-fa69e20086dd-config-data-custom\") pod \"barbican-keystone-listener-7bb8f997b6-dl5q6\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.574097 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.574696 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.574741 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.576580 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2bfe21ee59c696834fdd6604caeed60a263cff0e96aef565e246c7dd2d9b99d9"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.576651 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://2bfe21ee59c696834fdd6604caeed60a263cff0e96aef565e246c7dd2d9b99d9" gracePeriod=600 Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.585634 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba0650a3-3274-43ea-8c60-fa69e20086dd-config-data\") pod \"barbican-keystone-listener-7bb8f997b6-dl5q6\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.617881 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.620697 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233604f0-adda-4669-b868-b96791d98bca-config-data\") pod \"barbican-worker-696c7b8d5f-j2w5q\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.620849 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233604f0-adda-4669-b868-b96791d98bca-logs\") pod \"barbican-worker-696c7b8d5f-j2w5q\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.620877 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data-custom\") pod \"barbican-keystone-listener-54888cd7bb-schqh\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.620923 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/233604f0-adda-4669-b868-b96791d98bca-config-data-custom\") pod \"barbican-worker-696c7b8d5f-j2w5q\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.620955 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzhdp\" (UniqueName: \"kubernetes.io/projected/30ca9513-5ae9-4520-8012-3c941786ce2a-kube-api-access-kzhdp\") pod \"barbican-keystone-listener-54888cd7bb-schqh\" 
(UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.621015 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30ca9513-5ae9-4520-8012-3c941786ce2a-logs\") pod \"barbican-keystone-listener-54888cd7bb-schqh\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.621048 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data\") pod \"barbican-keystone-listener-54888cd7bb-schqh\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.621090 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6sqx\" (UniqueName: \"kubernetes.io/projected/233604f0-adda-4669-b868-b96791d98bca-kube-api-access-t6sqx\") pod \"barbican-worker-696c7b8d5f-j2w5q\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.621131 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-combined-ca-bundle\") pod \"barbican-keystone-listener-54888cd7bb-schqh\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.621190 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/233604f0-adda-4669-b868-b96791d98bca-combined-ca-bundle\") pod \"barbican-worker-696c7b8d5f-j2w5q\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.622400 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30ca9513-5ae9-4520-8012-3c941786ce2a-logs\") pod \"barbican-keystone-listener-54888cd7bb-schqh\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.623961 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233604f0-adda-4669-b868-b96791d98bca-logs\") pod \"barbican-worker-696c7b8d5f-j2w5q\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.628166 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233604f0-adda-4669-b868-b96791d98bca-config-data\") pod \"barbican-worker-696c7b8d5f-j2w5q\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.628204 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data\") pod \"barbican-keystone-listener-54888cd7bb-schqh\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.630016 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-combined-ca-bundle\") pod \"barbican-keystone-listener-54888cd7bb-schqh\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.630668 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-bb6676db4-77lhl"] Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.632063 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.632116 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/233604f0-adda-4669-b868-b96791d98bca-config-data-custom\") pod \"barbican-worker-696c7b8d5f-j2w5q\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.640519 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233604f0-adda-4669-b868-b96791d98bca-combined-ca-bundle\") pod \"barbican-worker-696c7b8d5f-j2w5q\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.643906 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzhdp\" (UniqueName: \"kubernetes.io/projected/30ca9513-5ae9-4520-8012-3c941786ce2a-kube-api-access-kzhdp\") pod \"barbican-keystone-listener-54888cd7bb-schqh\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.643973 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-bb6676db4-77lhl"] Mar 14 07:21:49 crc 
kubenswrapper[5129]: I0314 07:21:49.656103 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data-custom\") pod \"barbican-keystone-listener-54888cd7bb-schqh\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.665236 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6sqx\" (UniqueName: \"kubernetes.io/projected/233604f0-adda-4669-b868-b96791d98bca-kube-api-access-t6sqx\") pod \"barbican-worker-696c7b8d5f-j2w5q\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.722318 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhsj7\" (UniqueName: \"kubernetes.io/projected/8da79e9b-0c3f-4d66-9813-08116725c6a4-kube-api-access-bhsj7\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.722451 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pcdt\" (UniqueName: \"kubernetes.io/projected/8ef31003-6429-4270-b29d-750a82d4c7fe-kube-api-access-9pcdt\") pod \"barbican-api-bb6676db4-77lhl\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.722919 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-config-data\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " 
pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.723002 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-internal-tls-certs\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.723363 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef31003-6429-4270-b29d-750a82d4c7fe-combined-ca-bundle\") pod \"barbican-api-bb6676db4-77lhl\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.723391 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8da79e9b-0c3f-4d66-9813-08116725c6a4-logs\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.723412 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-public-tls-certs\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.723431 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ef31003-6429-4270-b29d-750a82d4c7fe-config-data-custom\") pod \"barbican-api-bb6676db4-77lhl\" (UID: 
\"8ef31003-6429-4270-b29d-750a82d4c7fe\") " pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.723484 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-scripts\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.723502 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef31003-6429-4270-b29d-750a82d4c7fe-config-data\") pod \"barbican-api-bb6676db4-77lhl\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.723524 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-combined-ca-bundle\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.723861 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef31003-6429-4270-b29d-750a82d4c7fe-logs\") pod \"barbican-api-bb6676db4-77lhl\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.733918 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.764957 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.817481 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.824638 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhsj7\" (UniqueName: \"kubernetes.io/projected/8da79e9b-0c3f-4d66-9813-08116725c6a4-kube-api-access-bhsj7\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.824700 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pcdt\" (UniqueName: \"kubernetes.io/projected/8ef31003-6429-4270-b29d-750a82d4c7fe-kube-api-access-9pcdt\") pod \"barbican-api-bb6676db4-77lhl\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.828809 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-config-data\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.828856 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-internal-tls-certs\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.828889 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8ef31003-6429-4270-b29d-750a82d4c7fe-combined-ca-bundle\") pod \"barbican-api-bb6676db4-77lhl\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.828909 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8da79e9b-0c3f-4d66-9813-08116725c6a4-logs\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.828933 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-public-tls-certs\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.828952 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ef31003-6429-4270-b29d-750a82d4c7fe-config-data-custom\") pod \"barbican-api-bb6676db4-77lhl\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.828996 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-scripts\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.829014 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef31003-6429-4270-b29d-750a82d4c7fe-config-data\") pod 
\"barbican-api-bb6676db4-77lhl\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.829035 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-combined-ca-bundle\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.829063 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef31003-6429-4270-b29d-750a82d4c7fe-logs\") pod \"barbican-api-bb6676db4-77lhl\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.829532 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef31003-6429-4270-b29d-750a82d4c7fe-logs\") pod \"barbican-api-bb6676db4-77lhl\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.845910 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.854051 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8da79e9b-0c3f-4d66-9813-08116725c6a4-logs\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.858728 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef31003-6429-4270-b29d-750a82d4c7fe-combined-ca-bundle\") pod \"barbican-api-bb6676db4-77lhl\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.860217 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-internal-tls-certs\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.868337 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef31003-6429-4270-b29d-750a82d4c7fe-config-data\") pod \"barbican-api-bb6676db4-77lhl\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.873862 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ef31003-6429-4270-b29d-750a82d4c7fe-config-data-custom\") pod \"barbican-api-bb6676db4-77lhl\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:49 crc 
kubenswrapper[5129]: I0314 07:21:49.874389 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-combined-ca-bundle\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.878179 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pcdt\" (UniqueName: \"kubernetes.io/projected/8ef31003-6429-4270-b29d-750a82d4c7fe-kube-api-access-9pcdt\") pod \"barbican-api-bb6676db4-77lhl\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.883487 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhsj7\" (UniqueName: \"kubernetes.io/projected/8da79e9b-0c3f-4d66-9813-08116725c6a4-kube-api-access-bhsj7\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.886211 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-config-data\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.893250 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-public-tls-certs\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.893655 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-scripts\") pod \"placement-75b4586cb8-pfpxj\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:49 crc kubenswrapper[5129]: I0314 07:21:49.944354 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:50 crc kubenswrapper[5129]: I0314 07:21:50.025882 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586cb48554-pw8mx" event={"ID":"abf3a56d-8229-4cb2-8c84-b8f12e11753f","Type":"ContainerStarted","Data":"bfdba54752345d810fef18c5d01e38f5e42ccb3533ed5c1793e550d7c0d7b044"} Mar 14 07:21:50 crc kubenswrapper[5129]: I0314 07:21:50.025926 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586cb48554-pw8mx" event={"ID":"abf3a56d-8229-4cb2-8c84-b8f12e11753f","Type":"ContainerStarted","Data":"2052816c5c5effe5ed1b04b1c5af4261524c0a9b579bd45cef3f326de28bd443"} Mar 14 07:21:50 crc kubenswrapper[5129]: I0314 07:21:50.042172 5129 generic.go:334] "Generic (PLEG): container finished" podID="78ce367e-be55-4091-915c-319a0e9f4986" containerID="a94725cfb09d99dd393166cca4d6186bdfd260fce260e7fb1127ad0662a92c11" exitCode=0 Mar 14 07:21:50 crc kubenswrapper[5129]: I0314 07:21:50.051655 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="2bfe21ee59c696834fdd6604caeed60a263cff0e96aef565e246c7dd2d9b99d9" exitCode=0 Mar 14 07:21:50 crc kubenswrapper[5129]: I0314 07:21:50.062402 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" event={"ID":"78ce367e-be55-4091-915c-319a0e9f4986","Type":"ContainerDied","Data":"a94725cfb09d99dd393166cca4d6186bdfd260fce260e7fb1127ad0662a92c11"} Mar 14 07:21:50 crc kubenswrapper[5129]: I0314 07:21:50.062454 5129 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"2bfe21ee59c696834fdd6604caeed60a263cff0e96aef565e246c7dd2d9b99d9"} Mar 14 07:21:50 crc kubenswrapper[5129]: I0314 07:21:50.062480 5129 scope.go:117] "RemoveContainer" containerID="74d416b7b010cf091ac9c019f0f997a8bcaaae0573c048f8fb52b3ba09a111ee" Mar 14 07:21:50 crc kubenswrapper[5129]: I0314 07:21:50.124979 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:50 crc kubenswrapper[5129]: I0314 07:21:50.224939 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-754fd75497-x4zwc"] Mar 14 07:21:50 crc kubenswrapper[5129]: I0314 07:21:50.320255 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-654c655ccc-bwh2q"] Mar 14 07:21:50 crc kubenswrapper[5129]: I0314 07:21:50.400896 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7bb8f997b6-dl5q6"] Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.059760 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"90a4650a-066d-455d-987d-a67b396fd4d9","Type":"ContainerStarted","Data":"af802b2d7421fed32265120f9b6ea9961c9329ba6323820ed2fe2213b264238c"} Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.503570 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.678124 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-ovsdbserver-nb\") pod \"78ce367e-be55-4091-915c-319a0e9f4986\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.678383 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4k9q\" (UniqueName: \"kubernetes.io/projected/78ce367e-be55-4091-915c-319a0e9f4986-kube-api-access-n4k9q\") pod \"78ce367e-be55-4091-915c-319a0e9f4986\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.678438 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-dns-swift-storage-0\") pod \"78ce367e-be55-4091-915c-319a0e9f4986\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.678464 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-config\") pod \"78ce367e-be55-4091-915c-319a0e9f4986\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.678659 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-ovsdbserver-sb\") pod \"78ce367e-be55-4091-915c-319a0e9f4986\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.678674 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-dns-svc\") pod \"78ce367e-be55-4091-915c-319a0e9f4986\" (UID: \"78ce367e-be55-4091-915c-319a0e9f4986\") " Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.688043 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78ce367e-be55-4091-915c-319a0e9f4986-kube-api-access-n4k9q" (OuterVolumeSpecName: "kube-api-access-n4k9q") pod "78ce367e-be55-4091-915c-319a0e9f4986" (UID: "78ce367e-be55-4091-915c-319a0e9f4986"). InnerVolumeSpecName "kube-api-access-n4k9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.720566 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-54888cd7bb-schqh"] Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.735129 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "78ce367e-be55-4091-915c-319a0e9f4986" (UID: "78ce367e-be55-4091-915c-319a0e9f4986"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.775454 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6449ddfb8d-8cpfw"] Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.780434 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4k9q\" (UniqueName: \"kubernetes.io/projected/78ce367e-be55-4091-915c-319a0e9f4986-kube-api-access-n4k9q\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.780469 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.795902 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "78ce367e-be55-4091-915c-319a0e9f4986" (UID: "78ce367e-be55-4091-915c-319a0e9f4986"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.823577 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "78ce367e-be55-4091-915c-319a0e9f4986" (UID: "78ce367e-be55-4091-915c-319a0e9f4986"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.824827 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-config" (OuterVolumeSpecName: "config") pod "78ce367e-be55-4091-915c-319a0e9f4986" (UID: "78ce367e-be55-4091-915c-319a0e9f4986"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.831140 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "78ce367e-be55-4091-915c-319a0e9f4986" (UID: "78ce367e-be55-4091-915c-319a0e9f4986"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.881882 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.881914 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.881923 5129 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.881933 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ce367e-be55-4091-915c-319a0e9f4986-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:51 crc kubenswrapper[5129]: I0314 07:21:51.983146 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-bb6676db4-77lhl"] Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.071981 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" 
event={"ID":"78ce367e-be55-4091-915c-319a0e9f4986","Type":"ContainerDied","Data":"e41cb410bbb3761c49e0957a20d503b79be8e9711847efccfad03fdc065bca37"} Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.072028 5129 scope.go:117] "RemoveContainer" containerID="a94725cfb09d99dd393166cca4d6186bdfd260fce260e7fb1127ad0662a92c11" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.072032 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d67d65cb9-4c2zz" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.080279 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb182a34-3807-465d-b706-929d0abe4904","Type":"ContainerStarted","Data":"10f5dac652b667a1b23938a6a39a2c391a126cd5863bfdd75416622e28ddf506"} Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.090564 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-654c655ccc-bwh2q" event={"ID":"68957226-c1ac-43ef-8dba-69e7eb7a805d","Type":"ContainerStarted","Data":"2e392f9edb731bc80686fead875011060ecf2d681d616cb9a685e47dc56a814f"} Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.093124 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-754fd75497-x4zwc" event={"ID":"be987b8a-a47d-46a9-bce9-6969473125ff","Type":"ContainerStarted","Data":"953eea29bdc354c617411e73cad623b7f0a9af66f593b7fba568db94fda9d685"} Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.093154 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-754fd75497-x4zwc" event={"ID":"be987b8a-a47d-46a9-bce9-6969473125ff","Type":"ContainerStarted","Data":"19ff2bc2e9aacd6fda05c8222e2579a06b5c003b8d219db120364b26800f5d13"} Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.093263 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 
07:21:52.095695 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586cb48554-pw8mx" event={"ID":"abf3a56d-8229-4cb2-8c84-b8f12e11753f","Type":"ContainerStarted","Data":"89be6468868206b8ef75f2a751b7422d7bb17faca56f1a6498091a0de034e89c"} Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.096979 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.097012 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.100430 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" event={"ID":"ba0650a3-3274-43ea-8c60-fa69e20086dd","Type":"ContainerStarted","Data":"e1f3f41257a4ade9e94ebc05875b022b73262abc71e13fe6d1cd7a013b41562e"} Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.121330 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.121310078 podStartE2EDuration="6.121310078s" podCreationTimestamp="2026-03-14 07:21:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:52.115188534 +0000 UTC m=+1374.867103718" watchObservedRunningTime="2026-03-14 07:21:52.121310078 +0000 UTC m=+1374.873225262" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.148970 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-696c7b8d5f-j2w5q"] Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.157756 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-75b4586cb8-pfpxj"] Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.167536 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/placement-586cb48554-pw8mx" podStartSLOduration=5.167514214 podStartE2EDuration="5.167514214s" podCreationTimestamp="2026-03-14 07:21:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:52.141936449 +0000 UTC m=+1374.893851643" watchObservedRunningTime="2026-03-14 07:21:52.167514214 +0000 UTC m=+1374.919429398" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.179827 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-xtd6w"] Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.181348 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-754fd75497-x4zwc" podStartSLOduration=4.181328743 podStartE2EDuration="4.181328743s" podCreationTimestamp="2026-03-14 07:21:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:52.161184725 +0000 UTC m=+1374.913099919" watchObservedRunningTime="2026-03-14 07:21:52.181328743 +0000 UTC m=+1374.933243927" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.192446 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.19242932 podStartE2EDuration="6.19242932s" podCreationTimestamp="2026-03-14 07:21:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:52.18603568 +0000 UTC m=+1374.937950874" watchObservedRunningTime="2026-03-14 07:21:52.19242932 +0000 UTC m=+1374.944344504" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.218240 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-4c2zz"] Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.232524 5129 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-4c2zz"] Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.450217 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6449ddfb8d-8cpfw"] Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.484655 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6cc754bc48-djssr"] Mar 14 07:21:52 crc kubenswrapper[5129]: E0314 07:21:52.485075 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ce367e-be55-4091-915c-319a0e9f4986" containerName="init" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.485089 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ce367e-be55-4091-915c-319a0e9f4986" containerName="init" Mar 14 07:21:52 crc kubenswrapper[5129]: E0314 07:21:52.485112 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ce367e-be55-4091-915c-319a0e9f4986" containerName="dnsmasq-dns" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.485118 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ce367e-be55-4091-915c-319a0e9f4986" containerName="dnsmasq-dns" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.485314 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="78ce367e-be55-4091-915c-319a0e9f4986" containerName="dnsmasq-dns" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.486305 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.488636 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.488929 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.497421 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6cc754bc48-djssr"] Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.597553 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-config-data\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.597664 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-public-tls-certs\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.597783 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-combined-ca-bundle\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.597812 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-internal-tls-certs\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.597835 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-config-data-custom\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.597870 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77vpv\" (UniqueName: \"kubernetes.io/projected/9d7fc10c-3f26-4459-9577-e7f09371a44b-kube-api-access-77vpv\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.597890 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d7fc10c-3f26-4459-9577-e7f09371a44b-logs\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.699333 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-public-tls-certs\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.699455 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-combined-ca-bundle\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.699487 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-internal-tls-certs\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.699508 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-config-data-custom\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.699536 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77vpv\" (UniqueName: \"kubernetes.io/projected/9d7fc10c-3f26-4459-9577-e7f09371a44b-kube-api-access-77vpv\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.699551 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d7fc10c-3f26-4459-9577-e7f09371a44b-logs\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.699622 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-config-data\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.702850 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d7fc10c-3f26-4459-9577-e7f09371a44b-logs\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.703201 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-public-tls-certs\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.706060 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-config-data\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.706279 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-internal-tls-certs\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.709832 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-combined-ca-bundle\") pod 
\"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.714474 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-config-data-custom\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.718061 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77vpv\" (UniqueName: \"kubernetes.io/projected/9d7fc10c-3f26-4459-9577-e7f09371a44b-kube-api-access-77vpv\") pod \"barbican-api-6cc754bc48-djssr\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:52 crc kubenswrapper[5129]: I0314 07:21:52.813133 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:54 crc kubenswrapper[5129]: I0314 07:21:54.047558 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78ce367e-be55-4091-915c-319a0e9f4986" path="/var/lib/kubelet/pods/78ce367e-be55-4091-915c-319a0e9f4986/volumes" Mar 14 07:21:54 crc kubenswrapper[5129]: W0314 07:21:54.397757 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30ca9513_5ae9_4520_8012_3c941786ce2a.slice/crio-c80dc4bf162b474b3071f41b2f9c48160db5cdc8397f4bb35b1410cb11c1ab51 WatchSource:0}: Error finding container c80dc4bf162b474b3071f41b2f9c48160db5cdc8397f4bb35b1410cb11c1ab51: Status 404 returned error can't find the container with id c80dc4bf162b474b3071f41b2f9c48160db5cdc8397f4bb35b1410cb11c1ab51 Mar 14 07:21:54 crc kubenswrapper[5129]: I0314 07:21:54.413916 5129 scope.go:117] "RemoveContainer" containerID="83d768513bc8963f3630e976d5b5c42e5fc8f8d407509b8ba31a5738ae2f708b" Mar 14 07:21:54 crc kubenswrapper[5129]: W0314 07:21:54.420366 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod510c476d_74b7_483b_a1ae_2bc9ccda6fe8.slice/crio-af2f1d8a83fe7ccd061082c92771757f78c9240abacb969887b342802d5c6d21 WatchSource:0}: Error finding container af2f1d8a83fe7ccd061082c92771757f78c9240abacb969887b342802d5c6d21: Status 404 returned error can't find the container with id af2f1d8a83fe7ccd061082c92771757f78c9240abacb969887b342802d5c6d21 Mar 14 07:21:54 crc kubenswrapper[5129]: W0314 07:21:54.789487 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dd8255e_0613_43d3_a721_eb5cde92ae4f.slice/crio-df11f17919d7149ea860ee5cf97d86549a8f9011446c8448b6031c76ecc27464 WatchSource:0}: Error finding container df11f17919d7149ea860ee5cf97d86549a8f9011446c8448b6031c76ecc27464: 
Status 404 returned error can't find the container with id df11f17919d7149ea860ee5cf97d86549a8f9011446c8448b6031c76ecc27464 Mar 14 07:21:55 crc kubenswrapper[5129]: I0314 07:21:55.146803 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-696c7b8d5f-j2w5q" event={"ID":"233604f0-adda-4669-b868-b96791d98bca","Type":"ContainerStarted","Data":"f7af0415a56cb4e739d7f5fdc29c7672f0c3842490e43ef212b01081a0025a67"} Mar 14 07:21:55 crc kubenswrapper[5129]: I0314 07:21:55.152546 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3"} Mar 14 07:21:55 crc kubenswrapper[5129]: I0314 07:21:55.157494 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bb6676db4-77lhl" event={"ID":"8ef31003-6429-4270-b29d-750a82d4c7fe","Type":"ContainerStarted","Data":"89a9e538c966dc821174f7bac98a3b50866bbddaf5b9a0182be4d873fd8ccf65"} Mar 14 07:21:55 crc kubenswrapper[5129]: I0314 07:21:55.158532 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75b4586cb8-pfpxj" event={"ID":"8da79e9b-0c3f-4d66-9813-08116725c6a4","Type":"ContainerStarted","Data":"e05aad8860f6b477225eadd8146b46e6b4b7e8c4a703290e40abe4f403f0a7bf"} Mar 14 07:21:55 crc kubenswrapper[5129]: I0314 07:21:55.160825 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" event={"ID":"30ca9513-5ae9-4520-8012-3c941786ce2a","Type":"ContainerStarted","Data":"c80dc4bf162b474b3071f41b2f9c48160db5cdc8397f4bb35b1410cb11c1ab51"} Mar 14 07:21:55 crc kubenswrapper[5129]: I0314 07:21:55.164363 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6449ddfb8d-8cpfw" 
event={"ID":"510c476d-74b7-483b-a1ae-2bc9ccda6fe8","Type":"ContainerStarted","Data":"af2f1d8a83fe7ccd061082c92771757f78c9240abacb969887b342802d5c6d21"} Mar 14 07:21:55 crc kubenswrapper[5129]: I0314 07:21:55.165818 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" event={"ID":"7dd8255e-0613-43d3-a721-eb5cde92ae4f","Type":"ContainerStarted","Data":"df11f17919d7149ea860ee5cf97d86549a8f9011446c8448b6031c76ecc27464"} Mar 14 07:21:55 crc kubenswrapper[5129]: I0314 07:21:55.352509 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6cc754bc48-djssr"] Mar 14 07:21:55 crc kubenswrapper[5129]: I0314 07:21:55.405769 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.179146 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6449ddfb8d-8cpfw" event={"ID":"510c476d-74b7-483b-a1ae-2bc9ccda6fe8","Type":"ContainerStarted","Data":"e37ffdde2e4e4900e39a1937b980b1898ed78c38210b23a3ea0ee50dfb263705"} Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.179526 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.179535 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6449ddfb8d-8cpfw" event={"ID":"510c476d-74b7-483b-a1ae-2bc9ccda6fe8","Type":"ContainerStarted","Data":"7127226b234f5d089eb24f420b4cddaf51e88c88359940ae806686d0fe0de9f8"} Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.179546 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.179231 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6449ddfb8d-8cpfw" 
podUID="510c476d-74b7-483b-a1ae-2bc9ccda6fe8" containerName="barbican-api-log" containerID="cri-o://7127226b234f5d089eb24f420b4cddaf51e88c88359940ae806686d0fe0de9f8" gracePeriod=30 Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.179694 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6449ddfb8d-8cpfw" podUID="510c476d-74b7-483b-a1ae-2bc9ccda6fe8" containerName="barbican-api" containerID="cri-o://e37ffdde2e4e4900e39a1937b980b1898ed78c38210b23a3ea0ee50dfb263705" gracePeriod=30 Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.185852 5129 generic.go:334] "Generic (PLEG): container finished" podID="7dd8255e-0613-43d3-a721-eb5cde92ae4f" containerID="929c627cb87f00956cc9d542f9760717e4d500d49f2dc7e682c12c51fce27b12" exitCode=0 Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.185914 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" event={"ID":"7dd8255e-0613-43d3-a721-eb5cde92ae4f","Type":"ContainerDied","Data":"929c627cb87f00956cc9d542f9760717e4d500d49f2dc7e682c12c51fce27b12"} Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.188897 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cc754bc48-djssr" event={"ID":"9d7fc10c-3f26-4459-9577-e7f09371a44b","Type":"ContainerStarted","Data":"b7fedc334b9aeda9f3eff633b2bb5ad2a6353604791c1e6f2adeb911962cfdac"} Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.188928 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cc754bc48-djssr" event={"ID":"9d7fc10c-3f26-4459-9577-e7f09371a44b","Type":"ContainerStarted","Data":"415248023f335d941d9c0d17ead22257eaba6c308138cb3f4b361ef356ebbed0"} Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.190980 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bb6676db4-77lhl" 
event={"ID":"8ef31003-6429-4270-b29d-750a82d4c7fe","Type":"ContainerStarted","Data":"1da488b585b0059125d8aabd8a6fee64f2e7a5cb392a9b19f3684e31d319a78c"} Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.191006 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bb6676db4-77lhl" event={"ID":"8ef31003-6429-4270-b29d-750a82d4c7fe","Type":"ContainerStarted","Data":"bc2bfdcea0f584c7cdb892ae7e18c5b85cac0e919f18190071a9405176c04647"} Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.191442 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.191496 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.193471 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b43a32ae-63bf-4627-bbdc-deb131defd74","Type":"ContainerStarted","Data":"bba5c15ca0fb9a9fac552c4411716314cada4dff957f275a3944a0fa32561b45"} Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.197020 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75b4586cb8-pfpxj" event={"ID":"8da79e9b-0c3f-4d66-9813-08116725c6a4","Type":"ContainerStarted","Data":"19c89364d2b386b626087226ef1de11773409433b1f6b4165c7de722f1c08207"} Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.197051 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75b4586cb8-pfpxj" event={"ID":"8da79e9b-0c3f-4d66-9813-08116725c6a4","Type":"ContainerStarted","Data":"c8a2eb3d81e166b08f15ce66729e9485cc211076230f60d2b3882494534f0f34"} Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.197066 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.197087 5129 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.202808 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6449ddfb8d-8cpfw" podStartSLOduration=7.202791842 podStartE2EDuration="7.202791842s" podCreationTimestamp="2026-03-14 07:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:56.198552998 +0000 UTC m=+1378.950468182" watchObservedRunningTime="2026-03-14 07:21:56.202791842 +0000 UTC m=+1378.954707026" Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.226882 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-75b4586cb8-pfpxj" podStartSLOduration=7.226866166 podStartE2EDuration="7.226866166s" podCreationTimestamp="2026-03-14 07:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:56.219444958 +0000 UTC m=+1378.971360142" watchObservedRunningTime="2026-03-14 07:21:56.226866166 +0000 UTC m=+1378.978781340" Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.235945 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-bb6676db4-77lhl" podStartSLOduration=7.235926179 podStartE2EDuration="7.235926179s" podCreationTimestamp="2026-03-14 07:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:56.234002557 +0000 UTC m=+1378.985917741" watchObservedRunningTime="2026-03-14 07:21:56.235926179 +0000 UTC m=+1378.987841353" Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.705572 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0" Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.705877 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.776983 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.823106 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.881101 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.986297 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-logs\") pod \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.986545 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-combined-ca-bundle\") pod \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.986730 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-config-data\") pod \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.986826 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-config-data-custom\") pod \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.986864 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-logs" (OuterVolumeSpecName: "logs") pod "510c476d-74b7-483b-a1ae-2bc9ccda6fe8" (UID: "510c476d-74b7-483b-a1ae-2bc9ccda6fe8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.986961 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22p2q\" (UniqueName: \"kubernetes.io/projected/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-kube-api-access-22p2q\") pod \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\" (UID: \"510c476d-74b7-483b-a1ae-2bc9ccda6fe8\") " Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.987482 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.993189 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "510c476d-74b7-483b-a1ae-2bc9ccda6fe8" (UID: "510c476d-74b7-483b-a1ae-2bc9ccda6fe8"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:56 crc kubenswrapper[5129]: I0314 07:21:56.994892 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-kube-api-access-22p2q" (OuterVolumeSpecName: "kube-api-access-22p2q") pod "510c476d-74b7-483b-a1ae-2bc9ccda6fe8" (UID: "510c476d-74b7-483b-a1ae-2bc9ccda6fe8"). InnerVolumeSpecName "kube-api-access-22p2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.041628 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "510c476d-74b7-483b-a1ae-2bc9ccda6fe8" (UID: "510c476d-74b7-483b-a1ae-2bc9ccda6fe8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.054005 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-config-data" (OuterVolumeSpecName: "config-data") pod "510c476d-74b7-483b-a1ae-2bc9ccda6fe8" (UID: "510c476d-74b7-483b-a1ae-2bc9ccda6fe8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.089037 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.089074 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22p2q\" (UniqueName: \"kubernetes.io/projected/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-kube-api-access-22p2q\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.089089 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.089099 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/510c476d-74b7-483b-a1ae-2bc9ccda6fe8-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.233850 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-696c7b8d5f-j2w5q" event={"ID":"233604f0-adda-4669-b868-b96791d98bca","Type":"ContainerStarted","Data":"d22ea494cb98d26b0c0cd83af3d1839afa617384538263f11bce0e451a9ccbff"} Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.238791 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.238832 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.258295 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cc754bc48-djssr" 
event={"ID":"9d7fc10c-3f26-4459-9577-e7f09371a44b","Type":"ContainerStarted","Data":"f0d88a61613b0d796589b600c918f3c42969fe8081ec6149ed9e97446ae73149"} Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.258788 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.258824 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.268859 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" event={"ID":"ba0650a3-3274-43ea-8c60-fa69e20086dd","Type":"ContainerStarted","Data":"6652b5e68edae742c22329ea9a31cb25ff11c4e1879d3f639a119913b9d413f1"} Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.274429 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" event={"ID":"30ca9513-5ae9-4520-8012-3c941786ce2a","Type":"ContainerStarted","Data":"d76db94856217343e1382a613c04bd6a275d213a9da38e6ea54fa7e5058c934f"} Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.286418 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6cc754bc48-djssr" podStartSLOduration=5.286393744 podStartE2EDuration="5.286393744s" podCreationTimestamp="2026-03-14 07:21:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:57.279196802 +0000 UTC m=+1380.031111986" watchObservedRunningTime="2026-03-14 07:21:57.286393744 +0000 UTC m=+1380.038308928" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.297937 5129 generic.go:334] "Generic (PLEG): container finished" podID="510c476d-74b7-483b-a1ae-2bc9ccda6fe8" containerID="e37ffdde2e4e4900e39a1937b980b1898ed78c38210b23a3ea0ee50dfb263705" exitCode=0 Mar 
14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.297974 5129 generic.go:334] "Generic (PLEG): container finished" podID="510c476d-74b7-483b-a1ae-2bc9ccda6fe8" containerID="7127226b234f5d089eb24f420b4cddaf51e88c88359940ae806686d0fe0de9f8" exitCode=143 Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.297989 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6449ddfb8d-8cpfw" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.298082 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6449ddfb8d-8cpfw" event={"ID":"510c476d-74b7-483b-a1ae-2bc9ccda6fe8","Type":"ContainerDied","Data":"e37ffdde2e4e4900e39a1937b980b1898ed78c38210b23a3ea0ee50dfb263705"} Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.298117 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6449ddfb8d-8cpfw" event={"ID":"510c476d-74b7-483b-a1ae-2bc9ccda6fe8","Type":"ContainerDied","Data":"7127226b234f5d089eb24f420b4cddaf51e88c88359940ae806686d0fe0de9f8"} Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.298131 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6449ddfb8d-8cpfw" event={"ID":"510c476d-74b7-483b-a1ae-2bc9ccda6fe8","Type":"ContainerDied","Data":"af2f1d8a83fe7ccd061082c92771757f78c9240abacb969887b342802d5c6d21"} Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.298149 5129 scope.go:117] "RemoveContainer" containerID="e37ffdde2e4e4900e39a1937b980b1898ed78c38210b23a3ea0ee50dfb263705" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.305100 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-654c655ccc-bwh2q" event={"ID":"68957226-c1ac-43ef-8dba-69e7eb7a805d","Type":"ContainerStarted","Data":"69b2d4fe5b0580f2987f8ca18bbab4c417b9e45b6ed10eb453c0fc7044ab9ffa"} Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.316385 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" event={"ID":"7dd8255e-0613-43d3-a721-eb5cde92ae4f","Type":"ContainerStarted","Data":"53edb1146f5f8f489bc6613f5e07c4b08ed1f9ea90aafb42bc12fb7af35e5dfc"} Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.316423 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.317541 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.317562 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.323653 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.343654 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" podStartSLOduration=8.343635536 podStartE2EDuration="8.343635536s" podCreationTimestamp="2026-03-14 07:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:21:57.340970995 +0000 UTC m=+1380.092886179" watchObservedRunningTime="2026-03-14 07:21:57.343635536 +0000 UTC m=+1380.095550720" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.375505 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.383330 5129 scope.go:117] "RemoveContainer" containerID="7127226b234f5d089eb24f420b4cddaf51e88c88359940ae806686d0fe0de9f8" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.401702 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-api-6449ddfb8d-8cpfw"] Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.431821 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6449ddfb8d-8cpfw"] Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.484942 5129 scope.go:117] "RemoveContainer" containerID="e37ffdde2e4e4900e39a1937b980b1898ed78c38210b23a3ea0ee50dfb263705" Mar 14 07:21:57 crc kubenswrapper[5129]: E0314 07:21:57.485547 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e37ffdde2e4e4900e39a1937b980b1898ed78c38210b23a3ea0ee50dfb263705\": container with ID starting with e37ffdde2e4e4900e39a1937b980b1898ed78c38210b23a3ea0ee50dfb263705 not found: ID does not exist" containerID="e37ffdde2e4e4900e39a1937b980b1898ed78c38210b23a3ea0ee50dfb263705" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.485577 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e37ffdde2e4e4900e39a1937b980b1898ed78c38210b23a3ea0ee50dfb263705"} err="failed to get container status \"e37ffdde2e4e4900e39a1937b980b1898ed78c38210b23a3ea0ee50dfb263705\": rpc error: code = NotFound desc = could not find container \"e37ffdde2e4e4900e39a1937b980b1898ed78c38210b23a3ea0ee50dfb263705\": container with ID starting with e37ffdde2e4e4900e39a1937b980b1898ed78c38210b23a3ea0ee50dfb263705 not found: ID does not exist" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.485626 5129 scope.go:117] "RemoveContainer" containerID="7127226b234f5d089eb24f420b4cddaf51e88c88359940ae806686d0fe0de9f8" Mar 14 07:21:57 crc kubenswrapper[5129]: E0314 07:21:57.485829 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7127226b234f5d089eb24f420b4cddaf51e88c88359940ae806686d0fe0de9f8\": container with ID starting with 7127226b234f5d089eb24f420b4cddaf51e88c88359940ae806686d0fe0de9f8 not found: ID 
does not exist" containerID="7127226b234f5d089eb24f420b4cddaf51e88c88359940ae806686d0fe0de9f8" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.485850 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7127226b234f5d089eb24f420b4cddaf51e88c88359940ae806686d0fe0de9f8"} err="failed to get container status \"7127226b234f5d089eb24f420b4cddaf51e88c88359940ae806686d0fe0de9f8\": rpc error: code = NotFound desc = could not find container \"7127226b234f5d089eb24f420b4cddaf51e88c88359940ae806686d0fe0de9f8\": container with ID starting with 7127226b234f5d089eb24f420b4cddaf51e88c88359940ae806686d0fe0de9f8 not found: ID does not exist" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.485862 5129 scope.go:117] "RemoveContainer" containerID="e37ffdde2e4e4900e39a1937b980b1898ed78c38210b23a3ea0ee50dfb263705" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.486012 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e37ffdde2e4e4900e39a1937b980b1898ed78c38210b23a3ea0ee50dfb263705"} err="failed to get container status \"e37ffdde2e4e4900e39a1937b980b1898ed78c38210b23a3ea0ee50dfb263705\": rpc error: code = NotFound desc = could not find container \"e37ffdde2e4e4900e39a1937b980b1898ed78c38210b23a3ea0ee50dfb263705\": container with ID starting with e37ffdde2e4e4900e39a1937b980b1898ed78c38210b23a3ea0ee50dfb263705 not found: ID does not exist" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.486028 5129 scope.go:117] "RemoveContainer" containerID="7127226b234f5d089eb24f420b4cddaf51e88c88359940ae806686d0fe0de9f8" Mar 14 07:21:57 crc kubenswrapper[5129]: I0314 07:21:57.486166 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7127226b234f5d089eb24f420b4cddaf51e88c88359940ae806686d0fe0de9f8"} err="failed to get container status \"7127226b234f5d089eb24f420b4cddaf51e88c88359940ae806686d0fe0de9f8\": rpc error: code = 
NotFound desc = could not find container \"7127226b234f5d089eb24f420b4cddaf51e88c88359940ae806686d0fe0de9f8\": container with ID starting with 7127226b234f5d089eb24f420b4cddaf51e88c88359940ae806686d0fe0de9f8 not found: ID does not exist" Mar 14 07:21:58 crc kubenswrapper[5129]: I0314 07:21:58.052955 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="510c476d-74b7-483b-a1ae-2bc9ccda6fe8" path="/var/lib/kubelet/pods/510c476d-74b7-483b-a1ae-2bc9ccda6fe8/volumes" Mar 14 07:21:58 crc kubenswrapper[5129]: I0314 07:21:58.325972 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" event={"ID":"ba0650a3-3274-43ea-8c60-fa69e20086dd","Type":"ContainerStarted","Data":"8d13f9b0d58ca30ed9aa69b801959d666313273fbeb9d73fae38843476719fff"} Mar 14 07:21:58 crc kubenswrapper[5129]: I0314 07:21:58.329772 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" event={"ID":"30ca9513-5ae9-4520-8012-3c941786ce2a","Type":"ContainerStarted","Data":"ee25cb26a5aa1f0bb656d11fce4519f1563d782dbc7cf1fd218ebf2964d004ce"} Mar 14 07:21:58 crc kubenswrapper[5129]: I0314 07:21:58.331429 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b8r27" event={"ID":"22cf633c-cf29-4f88-9ef2-693aee84d48d","Type":"ContainerStarted","Data":"52bd0a20548a621f45a9955bd8dc359566b8030a3da89b698db926a9a9020962"} Mar 14 07:21:58 crc kubenswrapper[5129]: I0314 07:21:58.334012 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-654c655ccc-bwh2q" event={"ID":"68957226-c1ac-43ef-8dba-69e7eb7a805d","Type":"ContainerStarted","Data":"8fbdacd00dedbd22ffaa9dcae34c3055a4dad50b4b16a964817c8aa2dcbe22a6"} Mar 14 07:21:58 crc kubenswrapper[5129]: I0314 07:21:58.339561 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-696c7b8d5f-j2w5q" 
event={"ID":"233604f0-adda-4669-b868-b96791d98bca","Type":"ContainerStarted","Data":"3617459b9e0934de4b0cdcd527b0ea212329a10cd6b69d43c83978c12d92190e"} Mar 14 07:21:58 crc kubenswrapper[5129]: I0314 07:21:58.341415 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:58 crc kubenswrapper[5129]: I0314 07:21:58.341445 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 14 07:21:58 crc kubenswrapper[5129]: I0314 07:21:58.374820 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" podStartSLOduration=3.913216375 podStartE2EDuration="9.374799196s" podCreationTimestamp="2026-03-14 07:21:49 +0000 UTC" firstStartedPulling="2026-03-14 07:21:51.183595048 +0000 UTC m=+1373.935510252" lastFinishedPulling="2026-03-14 07:21:56.645177889 +0000 UTC m=+1379.397093073" observedRunningTime="2026-03-14 07:21:58.348088621 +0000 UTC m=+1381.100003805" watchObservedRunningTime="2026-03-14 07:21:58.374799196 +0000 UTC m=+1381.126714380" Mar 14 07:21:58 crc kubenswrapper[5129]: I0314 07:21:58.378555 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-654c655ccc-bwh2q" podStartSLOduration=4.97390825 podStartE2EDuration="10.378544396s" podCreationTimestamp="2026-03-14 07:21:48 +0000 UTC" firstStartedPulling="2026-03-14 07:21:51.18365183 +0000 UTC m=+1373.935567014" lastFinishedPulling="2026-03-14 07:21:56.588287956 +0000 UTC m=+1379.340203160" observedRunningTime="2026-03-14 07:21:58.371245411 +0000 UTC m=+1381.123160615" watchObservedRunningTime="2026-03-14 07:21:58.378544396 +0000 UTC m=+1381.130459590" Mar 14 07:21:58 crc kubenswrapper[5129]: I0314 07:21:58.404295 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-b8r27" podStartSLOduration=3.939641721 
podStartE2EDuration="40.404276515s" podCreationTimestamp="2026-03-14 07:21:18 +0000 UTC" firstStartedPulling="2026-03-14 07:21:20.1807452 +0000 UTC m=+1342.932660384" lastFinishedPulling="2026-03-14 07:21:56.645379994 +0000 UTC m=+1379.397295178" observedRunningTime="2026-03-14 07:21:58.400870693 +0000 UTC m=+1381.152785897" watchObservedRunningTime="2026-03-14 07:21:58.404276515 +0000 UTC m=+1381.156191699" Mar 14 07:21:58 crc kubenswrapper[5129]: I0314 07:21:58.432527 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-696c7b8d5f-j2w5q" podStartSLOduration=7.587482294 podStartE2EDuration="9.4325105s" podCreationTimestamp="2026-03-14 07:21:49 +0000 UTC" firstStartedPulling="2026-03-14 07:21:54.774930648 +0000 UTC m=+1377.526845832" lastFinishedPulling="2026-03-14 07:21:56.619958854 +0000 UTC m=+1379.371874038" observedRunningTime="2026-03-14 07:21:58.424117516 +0000 UTC m=+1381.176032710" watchObservedRunningTime="2026-03-14 07:21:58.4325105 +0000 UTC m=+1381.184425684" Mar 14 07:21:58 crc kubenswrapper[5129]: I0314 07:21:58.471847 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-654c655ccc-bwh2q"] Mar 14 07:21:58 crc kubenswrapper[5129]: I0314 07:21:58.474278 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" podStartSLOduration=7.209182842 podStartE2EDuration="9.474254787s" podCreationTimestamp="2026-03-14 07:21:49 +0000 UTC" firstStartedPulling="2026-03-14 07:21:54.413905888 +0000 UTC m=+1377.165821082" lastFinishedPulling="2026-03-14 07:21:56.678977843 +0000 UTC m=+1379.430893027" observedRunningTime="2026-03-14 07:21:58.44409117 +0000 UTC m=+1381.196006354" watchObservedRunningTime="2026-03-14 07:21:58.474254787 +0000 UTC m=+1381.226169971" Mar 14 07:21:58 crc kubenswrapper[5129]: I0314 07:21:58.490619 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-keystone-listener-7bb8f997b6-dl5q6"] Mar 14 07:21:59 crc kubenswrapper[5129]: I0314 07:21:59.346674 5129 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 07:21:59 crc kubenswrapper[5129]: I0314 07:21:59.346948 5129 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 07:21:59 crc kubenswrapper[5129]: I0314 07:21:59.962541 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.074804 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.132885 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557882-j24cn"] Mar 14 07:22:00 crc kubenswrapper[5129]: E0314 07:22:00.133245 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="510c476d-74b7-483b-a1ae-2bc9ccda6fe8" containerName="barbican-api-log" Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.133260 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="510c476d-74b7-483b-a1ae-2bc9ccda6fe8" containerName="barbican-api-log" Mar 14 07:22:00 crc kubenswrapper[5129]: E0314 07:22:00.133278 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="510c476d-74b7-483b-a1ae-2bc9ccda6fe8" containerName="barbican-api" Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.133285 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="510c476d-74b7-483b-a1ae-2bc9ccda6fe8" containerName="barbican-api" Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.134123 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="510c476d-74b7-483b-a1ae-2bc9ccda6fe8" containerName="barbican-api-log" Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.134148 5129 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="510c476d-74b7-483b-a1ae-2bc9ccda6fe8" containerName="barbican-api" Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.135798 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557882-j24cn" Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.138953 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.139142 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.139260 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.148371 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557882-j24cn"] Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.256829 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmlkt\" (UniqueName: \"kubernetes.io/projected/67606449-22cf-4aed-82df-32cece6daffb-kube-api-access-wmlkt\") pod \"auto-csr-approver-29557882-j24cn\" (UID: \"67606449-22cf-4aed-82df-32cece6daffb\") " pod="openshift-infra/auto-csr-approver-29557882-j24cn" Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.353847 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" podUID="ba0650a3-3274-43ea-8c60-fa69e20086dd" containerName="barbican-keystone-listener-log" containerID="cri-o://6652b5e68edae742c22329ea9a31cb25ff11c4e1879d3f639a119913b9d413f1" gracePeriod=30 Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.353892 5129 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" podUID="ba0650a3-3274-43ea-8c60-fa69e20086dd" containerName="barbican-keystone-listener" containerID="cri-o://8d13f9b0d58ca30ed9aa69b801959d666313273fbeb9d73fae38843476719fff" gracePeriod=30 Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.354016 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-654c655ccc-bwh2q" podUID="68957226-c1ac-43ef-8dba-69e7eb7a805d" containerName="barbican-worker-log" containerID="cri-o://69b2d4fe5b0580f2987f8ca18bbab4c417b9e45b6ed10eb453c0fc7044ab9ffa" gracePeriod=30 Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.354078 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-654c655ccc-bwh2q" podUID="68957226-c1ac-43ef-8dba-69e7eb7a805d" containerName="barbican-worker" containerID="cri-o://8fbdacd00dedbd22ffaa9dcae34c3055a4dad50b4b16a964817c8aa2dcbe22a6" gracePeriod=30 Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.358650 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmlkt\" (UniqueName: \"kubernetes.io/projected/67606449-22cf-4aed-82df-32cece6daffb-kube-api-access-wmlkt\") pod \"auto-csr-approver-29557882-j24cn\" (UID: \"67606449-22cf-4aed-82df-32cece6daffb\") " pod="openshift-infra/auto-csr-approver-29557882-j24cn" Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.387539 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmlkt\" (UniqueName: \"kubernetes.io/projected/67606449-22cf-4aed-82df-32cece6daffb-kube-api-access-wmlkt\") pod \"auto-csr-approver-29557882-j24cn\" (UID: \"67606449-22cf-4aed-82df-32cece6daffb\") " pod="openshift-infra/auto-csr-approver-29557882-j24cn" Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.466767 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557882-j24cn" Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.491752 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.491850 5129 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 07:22:00 crc kubenswrapper[5129]: I0314 07:22:00.495013 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 14 07:22:01 crc kubenswrapper[5129]: I0314 07:22:01.365251 5129 generic.go:334] "Generic (PLEG): container finished" podID="68957226-c1ac-43ef-8dba-69e7eb7a805d" containerID="69b2d4fe5b0580f2987f8ca18bbab4c417b9e45b6ed10eb453c0fc7044ab9ffa" exitCode=143 Mar 14 07:22:01 crc kubenswrapper[5129]: I0314 07:22:01.365336 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-654c655ccc-bwh2q" event={"ID":"68957226-c1ac-43ef-8dba-69e7eb7a805d","Type":"ContainerDied","Data":"69b2d4fe5b0580f2987f8ca18bbab4c417b9e45b6ed10eb453c0fc7044ab9ffa"} Mar 14 07:22:01 crc kubenswrapper[5129]: I0314 07:22:01.369936 5129 generic.go:334] "Generic (PLEG): container finished" podID="ba0650a3-3274-43ea-8c60-fa69e20086dd" containerID="6652b5e68edae742c22329ea9a31cb25ff11c4e1879d3f639a119913b9d413f1" exitCode=143 Mar 14 07:22:01 crc kubenswrapper[5129]: I0314 07:22:01.370016 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" event={"ID":"ba0650a3-3274-43ea-8c60-fa69e20086dd","Type":"ContainerDied","Data":"6652b5e68edae742c22329ea9a31cb25ff11c4e1879d3f639a119913b9d413f1"} Mar 14 07:22:02 crc kubenswrapper[5129]: I0314 07:22:02.380572 5129 generic.go:334] "Generic (PLEG): container finished" podID="22cf633c-cf29-4f88-9ef2-693aee84d48d" containerID="52bd0a20548a621f45a9955bd8dc359566b8030a3da89b698db926a9a9020962" 
exitCode=0 Mar 14 07:22:02 crc kubenswrapper[5129]: I0314 07:22:02.380721 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b8r27" event={"ID":"22cf633c-cf29-4f88-9ef2-693aee84d48d","Type":"ContainerDied","Data":"52bd0a20548a621f45a9955bd8dc359566b8030a3da89b698db926a9a9020962"} Mar 14 07:22:02 crc kubenswrapper[5129]: I0314 07:22:02.384727 5129 generic.go:334] "Generic (PLEG): container finished" podID="68957226-c1ac-43ef-8dba-69e7eb7a805d" containerID="8fbdacd00dedbd22ffaa9dcae34c3055a4dad50b4b16a964817c8aa2dcbe22a6" exitCode=0 Mar 14 07:22:02 crc kubenswrapper[5129]: I0314 07:22:02.384817 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-654c655ccc-bwh2q" event={"ID":"68957226-c1ac-43ef-8dba-69e7eb7a805d","Type":"ContainerDied","Data":"8fbdacd00dedbd22ffaa9dcae34c3055a4dad50b4b16a964817c8aa2dcbe22a6"} Mar 14 07:22:02 crc kubenswrapper[5129]: I0314 07:22:02.387910 5129 generic.go:334] "Generic (PLEG): container finished" podID="ba0650a3-3274-43ea-8c60-fa69e20086dd" containerID="8d13f9b0d58ca30ed9aa69b801959d666313273fbeb9d73fae38843476719fff" exitCode=0 Mar 14 07:22:02 crc kubenswrapper[5129]: I0314 07:22:02.387944 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" event={"ID":"ba0650a3-3274-43ea-8c60-fa69e20086dd","Type":"ContainerDied","Data":"8d13f9b0d58ca30ed9aa69b801959d666313273fbeb9d73fae38843476719fff"} Mar 14 07:22:04 crc kubenswrapper[5129]: I0314 07:22:04.192485 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:22:04 crc kubenswrapper[5129]: I0314 07:22:04.284919 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:22:04 crc kubenswrapper[5129]: I0314 07:22:04.351315 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-bb6676db4-77lhl"] 
Mar 14 07:22:04 crc kubenswrapper[5129]: I0314 07:22:04.351637 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-bb6676db4-77lhl" podUID="8ef31003-6429-4270-b29d-750a82d4c7fe" containerName="barbican-api-log" containerID="cri-o://bc2bfdcea0f584c7cdb892ae7e18c5b85cac0e919f18190071a9405176c04647" gracePeriod=30 Mar 14 07:22:04 crc kubenswrapper[5129]: I0314 07:22:04.352566 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-bb6676db4-77lhl" podUID="8ef31003-6429-4270-b29d-750a82d4c7fe" containerName="barbican-api" containerID="cri-o://1da488b585b0059125d8aabd8a6fee64f2e7a5cb392a9b19f3684e31d319a78c" gracePeriod=30 Mar 14 07:22:04 crc kubenswrapper[5129]: I0314 07:22:04.378790 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-bb6676db4-77lhl" podUID="8ef31003-6429-4270-b29d-750a82d4c7fe" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": EOF" Mar 14 07:22:04 crc kubenswrapper[5129]: I0314 07:22:04.378906 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-bb6676db4-77lhl" podUID="8ef31003-6429-4270-b29d-750a82d4c7fe" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": EOF" Mar 14 07:22:04 crc kubenswrapper[5129]: I0314 07:22:04.735929 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:22:04 crc kubenswrapper[5129]: I0314 07:22:04.808609 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67754df655-mzts4"] Mar 14 07:22:04 crc kubenswrapper[5129]: I0314 07:22:04.809028 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67754df655-mzts4" podUID="9f4da728-25d7-4876-8039-c6db1f4ee858" containerName="dnsmasq-dns" 
containerID="cri-o://0d67e389bdca0ea85d32e76fea34d68d08559fb7bc1f65eb1207c6c50fa51da3" gracePeriod=10 Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.429506 5129 generic.go:334] "Generic (PLEG): container finished" podID="8ef31003-6429-4270-b29d-750a82d4c7fe" containerID="bc2bfdcea0f584c7cdb892ae7e18c5b85cac0e919f18190071a9405176c04647" exitCode=143 Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.429574 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bb6676db4-77lhl" event={"ID":"8ef31003-6429-4270-b29d-750a82d4c7fe","Type":"ContainerDied","Data":"bc2bfdcea0f584c7cdb892ae7e18c5b85cac0e919f18190071a9405176c04647"} Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.432424 5129 generic.go:334] "Generic (PLEG): container finished" podID="9f4da728-25d7-4876-8039-c6db1f4ee858" containerID="0d67e389bdca0ea85d32e76fea34d68d08559fb7bc1f65eb1207c6c50fa51da3" exitCode=0 Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.432469 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-mzts4" event={"ID":"9f4da728-25d7-4876-8039-c6db1f4ee858","Type":"ContainerDied","Data":"0d67e389bdca0ea85d32e76fea34d68d08559fb7bc1f65eb1207c6c50fa51da3"} Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.782642 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-b8r27" Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.883348 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-db-sync-config-data\") pod \"22cf633c-cf29-4f88-9ef2-693aee84d48d\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.883983 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-scripts\") pod \"22cf633c-cf29-4f88-9ef2-693aee84d48d\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.884046 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-config-data\") pod \"22cf633c-cf29-4f88-9ef2-693aee84d48d\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.884133 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-combined-ca-bundle\") pod \"22cf633c-cf29-4f88-9ef2-693aee84d48d\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.884163 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22cf633c-cf29-4f88-9ef2-693aee84d48d-etc-machine-id\") pod \"22cf633c-cf29-4f88-9ef2-693aee84d48d\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.884182 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r2ts\" 
(UniqueName: \"kubernetes.io/projected/22cf633c-cf29-4f88-9ef2-693aee84d48d-kube-api-access-7r2ts\") pod \"22cf633c-cf29-4f88-9ef2-693aee84d48d\" (UID: \"22cf633c-cf29-4f88-9ef2-693aee84d48d\") " Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.885839 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/22cf633c-cf29-4f88-9ef2-693aee84d48d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "22cf633c-cf29-4f88-9ef2-693aee84d48d" (UID: "22cf633c-cf29-4f88-9ef2-693aee84d48d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.896779 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22cf633c-cf29-4f88-9ef2-693aee84d48d-kube-api-access-7r2ts" (OuterVolumeSpecName: "kube-api-access-7r2ts") pod "22cf633c-cf29-4f88-9ef2-693aee84d48d" (UID: "22cf633c-cf29-4f88-9ef2-693aee84d48d"). InnerVolumeSpecName "kube-api-access-7r2ts". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.906945 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-scripts" (OuterVolumeSpecName: "scripts") pod "22cf633c-cf29-4f88-9ef2-693aee84d48d" (UID: "22cf633c-cf29-4f88-9ef2-693aee84d48d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.943748 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "22cf633c-cf29-4f88-9ef2-693aee84d48d" (UID: "22cf633c-cf29-4f88-9ef2-693aee84d48d"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.973261 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22cf633c-cf29-4f88-9ef2-693aee84d48d" (UID: "22cf633c-cf29-4f88-9ef2-693aee84d48d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.988637 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-config-data" (OuterVolumeSpecName: "config-data") pod "22cf633c-cf29-4f88-9ef2-693aee84d48d" (UID: "22cf633c-cf29-4f88-9ef2-693aee84d48d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.989322 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.989357 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.989373 5129 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22cf633c-cf29-4f88-9ef2-693aee84d48d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.989384 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r2ts\" (UniqueName: \"kubernetes.io/projected/22cf633c-cf29-4f88-9ef2-693aee84d48d-kube-api-access-7r2ts\") on node \"crc\" 
DevicePath \"\"" Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.989396 5129 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:05 crc kubenswrapper[5129]: I0314 07:22:05.989406 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22cf633c-cf29-4f88-9ef2-693aee84d48d-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.441244 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b8r27" event={"ID":"22cf633c-cf29-4f88-9ef2-693aee84d48d","Type":"ContainerDied","Data":"810cf992f9ed8e4e9c318d8b37fb2eabbb826de89429157cb52b2b4743dcad42"} Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.441279 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="810cf992f9ed8e4e9c318d8b37fb2eabbb826de89429157cb52b2b4743dcad42" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.441330 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-b8r27" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.798908 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.816899 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.917080 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8gbg\" (UniqueName: \"kubernetes.io/projected/ba0650a3-3274-43ea-8c60-fa69e20086dd-kube-api-access-g8gbg\") pod \"ba0650a3-3274-43ea-8c60-fa69e20086dd\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.917414 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-dns-swift-storage-0\") pod \"9f4da728-25d7-4876-8039-c6db1f4ee858\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.917443 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-ovsdbserver-sb\") pod \"9f4da728-25d7-4876-8039-c6db1f4ee858\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.917500 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-dns-svc\") pod \"9f4da728-25d7-4876-8039-c6db1f4ee858\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.917535 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba0650a3-3274-43ea-8c60-fa69e20086dd-config-data-custom\") pod \"ba0650a3-3274-43ea-8c60-fa69e20086dd\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.917576 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba0650a3-3274-43ea-8c60-fa69e20086dd-logs\") pod \"ba0650a3-3274-43ea-8c60-fa69e20086dd\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.917621 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba0650a3-3274-43ea-8c60-fa69e20086dd-combined-ca-bundle\") pod \"ba0650a3-3274-43ea-8c60-fa69e20086dd\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.917653 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w26b4\" (UniqueName: \"kubernetes.io/projected/9f4da728-25d7-4876-8039-c6db1f4ee858-kube-api-access-w26b4\") pod \"9f4da728-25d7-4876-8039-c6db1f4ee858\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.917764 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba0650a3-3274-43ea-8c60-fa69e20086dd-config-data\") pod \"ba0650a3-3274-43ea-8c60-fa69e20086dd\" (UID: \"ba0650a3-3274-43ea-8c60-fa69e20086dd\") " Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.917817 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-ovsdbserver-nb\") pod \"9f4da728-25d7-4876-8039-c6db1f4ee858\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.917834 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-config\") pod \"9f4da728-25d7-4876-8039-c6db1f4ee858\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " Mar 14 07:22:06 crc 
kubenswrapper[5129]: I0314 07:22:06.922904 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba0650a3-3274-43ea-8c60-fa69e20086dd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ba0650a3-3274-43ea-8c60-fa69e20086dd" (UID: "ba0650a3-3274-43ea-8c60-fa69e20086dd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.925072 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f4da728-25d7-4876-8039-c6db1f4ee858-kube-api-access-w26b4" (OuterVolumeSpecName: "kube-api-access-w26b4") pod "9f4da728-25d7-4876-8039-c6db1f4ee858" (UID: "9f4da728-25d7-4876-8039-c6db1f4ee858"). InnerVolumeSpecName "kube-api-access-w26b4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.926994 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba0650a3-3274-43ea-8c60-fa69e20086dd-kube-api-access-g8gbg" (OuterVolumeSpecName: "kube-api-access-g8gbg") pod "ba0650a3-3274-43ea-8c60-fa69e20086dd" (UID: "ba0650a3-3274-43ea-8c60-fa69e20086dd"). InnerVolumeSpecName "kube-api-access-g8gbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.928410 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba0650a3-3274-43ea-8c60-fa69e20086dd-logs" (OuterVolumeSpecName: "logs") pod "ba0650a3-3274-43ea-8c60-fa69e20086dd" (UID: "ba0650a3-3274-43ea-8c60-fa69e20086dd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.979903 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.981402 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:22:06 crc kubenswrapper[5129]: E0314 07:22:06.982308 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68957226-c1ac-43ef-8dba-69e7eb7a805d" containerName="barbican-worker-log" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.982322 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="68957226-c1ac-43ef-8dba-69e7eb7a805d" containerName="barbican-worker-log" Mar 14 07:22:06 crc kubenswrapper[5129]: E0314 07:22:06.982334 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba0650a3-3274-43ea-8c60-fa69e20086dd" containerName="barbican-keystone-listener" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.982340 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba0650a3-3274-43ea-8c60-fa69e20086dd" containerName="barbican-keystone-listener" Mar 14 07:22:06 crc kubenswrapper[5129]: E0314 07:22:06.982367 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f4da728-25d7-4876-8039-c6db1f4ee858" containerName="init" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.982373 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f4da728-25d7-4876-8039-c6db1f4ee858" containerName="init" Mar 14 07:22:06 crc kubenswrapper[5129]: E0314 07:22:06.982391 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68957226-c1ac-43ef-8dba-69e7eb7a805d" containerName="barbican-worker" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.982399 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="68957226-c1ac-43ef-8dba-69e7eb7a805d" containerName="barbican-worker" Mar 14 07:22:06 crc kubenswrapper[5129]: E0314 07:22:06.982408 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22cf633c-cf29-4f88-9ef2-693aee84d48d" 
containerName="cinder-db-sync" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.982415 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="22cf633c-cf29-4f88-9ef2-693aee84d48d" containerName="cinder-db-sync" Mar 14 07:22:06 crc kubenswrapper[5129]: E0314 07:22:06.982444 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba0650a3-3274-43ea-8c60-fa69e20086dd" containerName="barbican-keystone-listener-log" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.982450 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba0650a3-3274-43ea-8c60-fa69e20086dd" containerName="barbican-keystone-listener-log" Mar 14 07:22:06 crc kubenswrapper[5129]: E0314 07:22:06.982461 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f4da728-25d7-4876-8039-c6db1f4ee858" containerName="dnsmasq-dns" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.982467 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f4da728-25d7-4876-8039-c6db1f4ee858" containerName="dnsmasq-dns" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.982656 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba0650a3-3274-43ea-8c60-fa69e20086dd" containerName="barbican-keystone-listener" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.982692 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f4da728-25d7-4876-8039-c6db1f4ee858" containerName="dnsmasq-dns" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.982702 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba0650a3-3274-43ea-8c60-fa69e20086dd" containerName="barbican-keystone-listener-log" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.982709 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="68957226-c1ac-43ef-8dba-69e7eb7a805d" containerName="barbican-worker" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.982718 5129 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="68957226-c1ac-43ef-8dba-69e7eb7a805d" containerName="barbican-worker-log" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.982728 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="22cf633c-cf29-4f88-9ef2-693aee84d48d" containerName="cinder-db-sync" Mar 14 07:22:06 crc kubenswrapper[5129]: I0314 07:22:06.983673 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:06.994810 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-vd7tn" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:06.994983 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:06.995068 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:06.995157 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.017588 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.033435 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn62d\" (UniqueName: \"kubernetes.io/projected/68957226-c1ac-43ef-8dba-69e7eb7a805d-kube-api-access-pn62d\") pod \"68957226-c1ac-43ef-8dba-69e7eb7a805d\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.033534 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68957226-c1ac-43ef-8dba-69e7eb7a805d-config-data\") pod \"68957226-c1ac-43ef-8dba-69e7eb7a805d\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " Mar 
14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.033645 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68957226-c1ac-43ef-8dba-69e7eb7a805d-config-data-custom\") pod \"68957226-c1ac-43ef-8dba-69e7eb7a805d\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.033812 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68957226-c1ac-43ef-8dba-69e7eb7a805d-logs\") pod \"68957226-c1ac-43ef-8dba-69e7eb7a805d\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.033906 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68957226-c1ac-43ef-8dba-69e7eb7a805d-combined-ca-bundle\") pod \"68957226-c1ac-43ef-8dba-69e7eb7a805d\" (UID: \"68957226-c1ac-43ef-8dba-69e7eb7a805d\") " Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.034703 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.034988 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.035145 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-g64zr\" (UniqueName: \"kubernetes.io/projected/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-kube-api-access-g64zr\") pod \"cinder-scheduler-0\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.035249 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-config-data\") pod \"cinder-scheduler-0\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.035302 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-scripts\") pod \"cinder-scheduler-0\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.035374 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.036058 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68957226-c1ac-43ef-8dba-69e7eb7a805d-logs" (OuterVolumeSpecName: "logs") pod "68957226-c1ac-43ef-8dba-69e7eb7a805d" (UID: "68957226-c1ac-43ef-8dba-69e7eb7a805d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.039778 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba0650a3-3274-43ea-8c60-fa69e20086dd-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.039821 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba0650a3-3274-43ea-8c60-fa69e20086dd-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.039831 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w26b4\" (UniqueName: \"kubernetes.io/projected/9f4da728-25d7-4876-8039-c6db1f4ee858-kube-api-access-w26b4\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.039842 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8gbg\" (UniqueName: \"kubernetes.io/projected/ba0650a3-3274-43ea-8c60-fa69e20086dd-kube-api-access-g8gbg\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.061808 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-vh8bg"] Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.063643 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.075892 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68957226-c1ac-43ef-8dba-69e7eb7a805d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "68957226-c1ac-43ef-8dba-69e7eb7a805d" (UID: "68957226-c1ac-43ef-8dba-69e7eb7a805d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.084487 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68957226-c1ac-43ef-8dba-69e7eb7a805d-kube-api-access-pn62d" (OuterVolumeSpecName: "kube-api-access-pn62d") pod "68957226-c1ac-43ef-8dba-69e7eb7a805d" (UID: "68957226-c1ac-43ef-8dba-69e7eb7a805d"). InnerVolumeSpecName "kube-api-access-pn62d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.085074 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-vh8bg"] Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.143055 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-dns-svc\") pod \"dnsmasq-dns-58b85ccffc-vh8bg\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") " pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.143102 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g64zr\" (UniqueName: \"kubernetes.io/projected/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-kube-api-access-g64zr\") pod \"cinder-scheduler-0\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.143147 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-config-data\") pod \"cinder-scheduler-0\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.143172 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-scripts\") pod \"cinder-scheduler-0\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.143192 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-dns-swift-storage-0\") pod \"dnsmasq-dns-58b85ccffc-vh8bg\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") " pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.143225 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.143260 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-ovsdbserver-nb\") pod \"dnsmasq-dns-58b85ccffc-vh8bg\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") " pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.143295 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-config\") pod \"dnsmasq-dns-58b85ccffc-vh8bg\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") " pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.143321 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6xp4\" (UniqueName: 
\"kubernetes.io/projected/21f24524-573b-4948-b866-2dc0828a860f-kube-api-access-p6xp4\") pod \"dnsmasq-dns-58b85ccffc-vh8bg\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") " pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.143352 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.143368 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.143568 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-ovsdbserver-sb\") pod \"dnsmasq-dns-58b85ccffc-vh8bg\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") " pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.143749 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68957226-c1ac-43ef-8dba-69e7eb7a805d-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.143761 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn62d\" (UniqueName: \"kubernetes.io/projected/68957226-c1ac-43ef-8dba-69e7eb7a805d-kube-api-access-pn62d\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.143771 5129 reconciler_common.go:293] 
"Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68957226-c1ac-43ef-8dba-69e7eb7a805d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.144206 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.164157 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9f4da728-25d7-4876-8039-c6db1f4ee858" (UID: "9f4da728-25d7-4876-8039-c6db1f4ee858"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.164836 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-scripts\") pod \"cinder-scheduler-0\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.165321 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.165684 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.167617 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.170797 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba0650a3-3274-43ea-8c60-fa69e20086dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba0650a3-3274-43ea-8c60-fa69e20086dd" (UID: "ba0650a3-3274-43ea-8c60-fa69e20086dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.174920 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g64zr\" (UniqueName: \"kubernetes.io/projected/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-kube-api-access-g64zr\") pod \"cinder-scheduler-0\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.211920 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.213652 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.222995 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.232546 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.246803 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-config" (OuterVolumeSpecName: "config") pod "9f4da728-25d7-4876-8039-c6db1f4ee858" (UID: "9f4da728-25d7-4876-8039-c6db1f4ee858"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.246984 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9f4da728-25d7-4876-8039-c6db1f4ee858" (UID: "9f4da728-25d7-4876-8039-c6db1f4ee858"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:07 crc kubenswrapper[5129]: W0314 07:22:07.247985 5129 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/9f4da728-25d7-4876-8039-c6db1f4ee858/volumes/kubernetes.io~configmap/config Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.247999 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-config" (OuterVolumeSpecName: "config") pod "9f4da728-25d7-4876-8039-c6db1f4ee858" (UID: "9f4da728-25d7-4876-8039-c6db1f4ee858"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.250758 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-config\") pod \"9f4da728-25d7-4876-8039-c6db1f4ee858\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.250873 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-dns-swift-storage-0\") pod \"9f4da728-25d7-4876-8039-c6db1f4ee858\" (UID: \"9f4da728-25d7-4876-8039-c6db1f4ee858\") " Mar 14 07:22:07 crc kubenswrapper[5129]: W0314 07:22:07.250941 5129 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/9f4da728-25d7-4876-8039-c6db1f4ee858/volumes/kubernetes.io~configmap/dns-swift-storage-0 Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.250959 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9f4da728-25d7-4876-8039-c6db1f4ee858" (UID: "9f4da728-25d7-4876-8039-c6db1f4ee858"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.251170 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-config\") pod \"dnsmasq-dns-58b85ccffc-vh8bg\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") " pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.251216 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6xp4\" (UniqueName: \"kubernetes.io/projected/21f24524-573b-4948-b866-2dc0828a860f-kube-api-access-p6xp4\") pod \"dnsmasq-dns-58b85ccffc-vh8bg\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") " pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.251316 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-ovsdbserver-sb\") pod \"dnsmasq-dns-58b85ccffc-vh8bg\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") " pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.251450 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-dns-svc\") pod \"dnsmasq-dns-58b85ccffc-vh8bg\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") " pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.251561 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-dns-swift-storage-0\") pod \"dnsmasq-dns-58b85ccffc-vh8bg\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") " pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 
14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.251998 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-config\") pod \"dnsmasq-dns-58b85ccffc-vh8bg\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") " pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.252063 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-ovsdbserver-sb\") pod \"dnsmasq-dns-58b85ccffc-vh8bg\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") " pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.252426 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-ovsdbserver-nb\") pod \"dnsmasq-dns-58b85ccffc-vh8bg\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") " pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.252667 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba0650a3-3274-43ea-8c60-fa69e20086dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.252690 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.252699 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.252707 5129 
reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.258775 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-dns-svc\") pod \"dnsmasq-dns-58b85ccffc-vh8bg\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") " pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.260030 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f4da728-25d7-4876-8039-c6db1f4ee858" (UID: "9f4da728-25d7-4876-8039-c6db1f4ee858"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.260303 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9f4da728-25d7-4876-8039-c6db1f4ee858" (UID: "9f4da728-25d7-4876-8039-c6db1f4ee858"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.260718 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-dns-swift-storage-0\") pod \"dnsmasq-dns-58b85ccffc-vh8bg\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") " pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.261515 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-ovsdbserver-nb\") pod \"dnsmasq-dns-58b85ccffc-vh8bg\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") " pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.264496 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68957226-c1ac-43ef-8dba-69e7eb7a805d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68957226-c1ac-43ef-8dba-69e7eb7a805d" (UID: "68957226-c1ac-43ef-8dba-69e7eb7a805d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.278357 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6xp4\" (UniqueName: \"kubernetes.io/projected/21f24524-573b-4948-b866-2dc0828a860f-kube-api-access-p6xp4\") pod \"dnsmasq-dns-58b85ccffc-vh8bg\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") " pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.280841 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba0650a3-3274-43ea-8c60-fa69e20086dd-config-data" (OuterVolumeSpecName: "config-data") pod "ba0650a3-3274-43ea-8c60-fa69e20086dd" (UID: "ba0650a3-3274-43ea-8c60-fa69e20086dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.312270 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68957226-c1ac-43ef-8dba-69e7eb7a805d-config-data" (OuterVolumeSpecName: "config-data") pod "68957226-c1ac-43ef-8dba-69e7eb7a805d" (UID: "68957226-c1ac-43ef-8dba-69e7eb7a805d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.321561 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557882-j24cn"] Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.330007 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.353687 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmpr6\" (UniqueName: \"kubernetes.io/projected/66db65fb-a7cf-4b0f-bad9-11215f942f34-kube-api-access-bmpr6\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.353750 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-config-data\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.353801 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66db65fb-a7cf-4b0f-bad9-11215f942f34-logs\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.353821 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-scripts\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.353849 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-config-data-custom\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.353929 5129 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.353957 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/66db65fb-a7cf-4b0f-bad9-11215f942f34-etc-machine-id\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.354063 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.354076 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68957226-c1ac-43ef-8dba-69e7eb7a805d-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.354084 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba0650a3-3274-43ea-8c60-fa69e20086dd-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.354092 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f4da728-25d7-4876-8039-c6db1f4ee858-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.354103 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68957226-c1ac-43ef-8dba-69e7eb7a805d-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.408073 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.452877 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b43a32ae-63bf-4627-bbdc-deb131defd74","Type":"ContainerStarted","Data":"7686ee942427f823745c94372a0c72fd20e3b280d82bdf52620f653c4b97882b"} Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.453057 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerName="ceilometer-central-agent" containerID="cri-o://dc7e65d99e50ea8ef70644c1ae7042bd9cb31116ec6009c82470eb2460cb5c7b" gracePeriod=30 Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.453734 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.454015 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerName="proxy-httpd" containerID="cri-o://7686ee942427f823745c94372a0c72fd20e3b280d82bdf52620f653c4b97882b" gracePeriod=30 Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.454062 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerName="sg-core" containerID="cri-o://bba5c15ca0fb9a9fac552c4411716314cada4dff957f275a3944a0fa32561b45" gracePeriod=30 Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.454094 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerName="ceilometer-notification-agent" 
containerID="cri-o://9384ab87298e2eeab16b06733e1b00cefca7ef04298183017cade4161e32481e" gracePeriod=30 Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.456257 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/66db65fb-a7cf-4b0f-bad9-11215f942f34-etc-machine-id\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.456389 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/66db65fb-a7cf-4b0f-bad9-11215f942f34-etc-machine-id\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.456471 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmpr6\" (UniqueName: \"kubernetes.io/projected/66db65fb-a7cf-4b0f-bad9-11215f942f34-kube-api-access-bmpr6\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.456565 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-config-data\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.456670 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66db65fb-a7cf-4b0f-bad9-11215f942f34-logs\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.456745 5129 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-scripts\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.456812 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-config-data-custom\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.456906 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.458550 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66db65fb-a7cf-4b0f-bad9-11215f942f34-logs\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.461043 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.461155 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-scripts\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 
07:22:07.464312 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" event={"ID":"ba0650a3-3274-43ea-8c60-fa69e20086dd","Type":"ContainerDied","Data":"e1f3f41257a4ade9e94ebc05875b022b73262abc71e13fe6d1cd7a013b41562e"} Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.464479 5129 scope.go:117] "RemoveContainer" containerID="8d13f9b0d58ca30ed9aa69b801959d666313273fbeb9d73fae38843476719fff" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.464908 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7bb8f997b6-dl5q6" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.489467 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmpr6\" (UniqueName: \"kubernetes.io/projected/66db65fb-a7cf-4b0f-bad9-11215f942f34-kube-api-access-bmpr6\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.490664 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.953542057 podStartE2EDuration="49.49065153s" podCreationTimestamp="2026-03-14 07:21:18 +0000 UTC" firstStartedPulling="2026-03-14 07:21:20.16171587 +0000 UTC m=+1342.913631054" lastFinishedPulling="2026-03-14 07:22:06.698825343 +0000 UTC m=+1389.450740527" observedRunningTime="2026-03-14 07:22:07.4839163 +0000 UTC m=+1390.235831484" watchObservedRunningTime="2026-03-14 07:22:07.49065153 +0000 UTC m=+1390.242566714" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.496887 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557882-j24cn" event={"ID":"67606449-22cf-4aed-82df-32cece6daffb","Type":"ContainerStarted","Data":"8f7b1bbf60fa4c5ba72415db3f5b7ad992c668305c09f26a6c4830332e295e97"} Mar 14 07:22:07 crc kubenswrapper[5129]: 
I0314 07:22:07.510118 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-mzts4" event={"ID":"9f4da728-25d7-4876-8039-c6db1f4ee858","Type":"ContainerDied","Data":"55723da1a4ff49141705706221a2538988f6d88f5e9eac53172988972287b560"} Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.513391 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67754df655-mzts4" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.514959 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-config-data\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.517690 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-config-data-custom\") pod \"cinder-api-0\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.523143 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-654c655ccc-bwh2q" event={"ID":"68957226-c1ac-43ef-8dba-69e7eb7a805d","Type":"ContainerDied","Data":"2e392f9edb731bc80686fead875011060ecf2d681d616cb9a685e47dc56a814f"} Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.523223 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-654c655ccc-bwh2q" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.544452 5129 scope.go:117] "RemoveContainer" containerID="6652b5e68edae742c22329ea9a31cb25ff11c4e1879d3f639a119913b9d413f1" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.549291 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.674501 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-7bb8f997b6-dl5q6"] Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.720741 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-7bb8f997b6-dl5q6"] Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.737530 5129 scope.go:117] "RemoveContainer" containerID="0d67e389bdca0ea85d32e76fea34d68d08559fb7bc1f65eb1207c6c50fa51da3" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.742408 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-654c655ccc-bwh2q"] Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.798326 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-654c655ccc-bwh2q"] Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.817325 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67754df655-mzts4"] Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.841421 5129 scope.go:117] "RemoveContainer" containerID="f1adc22aa4959719845b2e7c497fb8f4b284d7d2c33af89e5135942b5661c5a5" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.847059 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67754df655-mzts4"] Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.866770 5129 scope.go:117] "RemoveContainer" containerID="8fbdacd00dedbd22ffaa9dcae34c3055a4dad50b4b16a964817c8aa2dcbe22a6" Mar 14 07:22:07 crc kubenswrapper[5129]: I0314 07:22:07.914877 5129 scope.go:117] "RemoveContainer" containerID="69b2d4fe5b0580f2987f8ca18bbab4c417b9e45b6ed10eb453c0fc7044ab9ffa" Mar 14 07:22:08 crc kubenswrapper[5129]: I0314 07:22:08.008052 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:22:08 crc kubenswrapper[5129]: I0314 
07:22:08.046845 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68957226-c1ac-43ef-8dba-69e7eb7a805d" path="/var/lib/kubelet/pods/68957226-c1ac-43ef-8dba-69e7eb7a805d/volumes" Mar 14 07:22:08 crc kubenswrapper[5129]: I0314 07:22:08.047411 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f4da728-25d7-4876-8039-c6db1f4ee858" path="/var/lib/kubelet/pods/9f4da728-25d7-4876-8039-c6db1f4ee858/volumes" Mar 14 07:22:08 crc kubenswrapper[5129]: I0314 07:22:08.048336 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba0650a3-3274-43ea-8c60-fa69e20086dd" path="/var/lib/kubelet/pods/ba0650a3-3274-43ea-8c60-fa69e20086dd/volumes" Mar 14 07:22:08 crc kubenswrapper[5129]: I0314 07:22:08.176158 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-vh8bg"] Mar 14 07:22:08 crc kubenswrapper[5129]: W0314 07:22:08.182070 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21f24524_573b_4948_b866_2dc0828a860f.slice/crio-f1e594c26072910157d861498a6a38e073ecc3ba4a314306dd150fe8eb6fbd14 WatchSource:0}: Error finding container f1e594c26072910157d861498a6a38e073ecc3ba4a314306dd150fe8eb6fbd14: Status 404 returned error can't find the container with id f1e594c26072910157d861498a6a38e073ecc3ba4a314306dd150fe8eb6fbd14 Mar 14 07:22:08 crc kubenswrapper[5129]: I0314 07:22:08.272736 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:22:08 crc kubenswrapper[5129]: I0314 07:22:08.552898 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557882-j24cn" event={"ID":"67606449-22cf-4aed-82df-32cece6daffb","Type":"ContainerStarted","Data":"58636ddf9056d5cf6b9725a0fc761ea443236adbe98622fdee0c49765d0989de"} Mar 14 07:22:08 crc kubenswrapper[5129]: I0314 07:22:08.567022 5129 generic.go:334] "Generic (PLEG): container 
finished" podID="21f24524-573b-4948-b866-2dc0828a860f" containerID="86b1d76f2f88d2f87fa1167fa9ed1d09d2e2515fe9194103ba75fd503968c821" exitCode=0 Mar 14 07:22:08 crc kubenswrapper[5129]: I0314 07:22:08.567081 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" event={"ID":"21f24524-573b-4948-b866-2dc0828a860f","Type":"ContainerDied","Data":"86b1d76f2f88d2f87fa1167fa9ed1d09d2e2515fe9194103ba75fd503968c821"} Mar 14 07:22:08 crc kubenswrapper[5129]: I0314 07:22:08.567107 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" event={"ID":"21f24524-573b-4948-b866-2dc0828a860f","Type":"ContainerStarted","Data":"f1e594c26072910157d861498a6a38e073ecc3ba4a314306dd150fe8eb6fbd14"} Mar 14 07:22:08 crc kubenswrapper[5129]: I0314 07:22:08.583381 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557882-j24cn" podStartSLOduration=7.733805737 podStartE2EDuration="8.583360557s" podCreationTimestamp="2026-03-14 07:22:00 +0000 UTC" firstStartedPulling="2026-03-14 07:22:07.315922816 +0000 UTC m=+1390.067838000" lastFinishedPulling="2026-03-14 07:22:08.165477636 +0000 UTC m=+1390.917392820" observedRunningTime="2026-03-14 07:22:08.567910644 +0000 UTC m=+1391.319825828" watchObservedRunningTime="2026-03-14 07:22:08.583360557 +0000 UTC m=+1391.335275741" Mar 14 07:22:08 crc kubenswrapper[5129]: I0314 07:22:08.601937 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e85da6c4-a9a7-4d1c-baf9-67777bf925e4","Type":"ContainerStarted","Data":"5a0ead8315ddfc4701d9be88e2e8eba4e9c4f26fcf9b62fe6be8c6c7a64cf77d"} Mar 14 07:22:08 crc kubenswrapper[5129]: I0314 07:22:08.607238 5129 generic.go:334] "Generic (PLEG): container finished" podID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerID="7686ee942427f823745c94372a0c72fd20e3b280d82bdf52620f653c4b97882b" exitCode=0 Mar 14 07:22:08 crc 
kubenswrapper[5129]: I0314 07:22:08.607263 5129 generic.go:334] "Generic (PLEG): container finished" podID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerID="bba5c15ca0fb9a9fac552c4411716314cada4dff957f275a3944a0fa32561b45" exitCode=2 Mar 14 07:22:08 crc kubenswrapper[5129]: I0314 07:22:08.607273 5129 generic.go:334] "Generic (PLEG): container finished" podID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerID="dc7e65d99e50ea8ef70644c1ae7042bd9cb31116ec6009c82470eb2460cb5c7b" exitCode=0 Mar 14 07:22:08 crc kubenswrapper[5129]: I0314 07:22:08.607304 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b43a32ae-63bf-4627-bbdc-deb131defd74","Type":"ContainerDied","Data":"7686ee942427f823745c94372a0c72fd20e3b280d82bdf52620f653c4b97882b"} Mar 14 07:22:08 crc kubenswrapper[5129]: I0314 07:22:08.607321 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b43a32ae-63bf-4627-bbdc-deb131defd74","Type":"ContainerDied","Data":"bba5c15ca0fb9a9fac552c4411716314cada4dff957f275a3944a0fa32561b45"} Mar 14 07:22:08 crc kubenswrapper[5129]: I0314 07:22:08.607329 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b43a32ae-63bf-4627-bbdc-deb131defd74","Type":"ContainerDied","Data":"dc7e65d99e50ea8ef70644c1ae7042bd9cb31116ec6009c82470eb2460cb5c7b"} Mar 14 07:22:08 crc kubenswrapper[5129]: I0314 07:22:08.615300 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"66db65fb-a7cf-4b0f-bad9-11215f942f34","Type":"ContainerStarted","Data":"045a80443a4786e6b57f24d134578b773e85fd5ff08dc61ea39a57bf5f797f26"} Mar 14 07:22:09 crc kubenswrapper[5129]: I0314 07:22:09.680773 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:22:09 crc kubenswrapper[5129]: I0314 07:22:09.692501 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" 
event={"ID":"21f24524-573b-4948-b866-2dc0828a860f","Type":"ContainerStarted","Data":"cc33061d7eea06e4d1027500b33204f6b46c9cab6e12536bff0261c3fc7da2f0"} Mar 14 07:22:09 crc kubenswrapper[5129]: I0314 07:22:09.693726 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:09 crc kubenswrapper[5129]: I0314 07:22:09.699492 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e85da6c4-a9a7-4d1c-baf9-67777bf925e4","Type":"ContainerStarted","Data":"54c4e5e103da85eef14350d71cc80bf6d594ca2c6cf104e4dc39b976a4b5ff8e"} Mar 14 07:22:09 crc kubenswrapper[5129]: I0314 07:22:09.708137 5129 generic.go:334] "Generic (PLEG): container finished" podID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerID="9384ab87298e2eeab16b06733e1b00cefca7ef04298183017cade4161e32481e" exitCode=0 Mar 14 07:22:09 crc kubenswrapper[5129]: I0314 07:22:09.708195 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b43a32ae-63bf-4627-bbdc-deb131defd74","Type":"ContainerDied","Data":"9384ab87298e2eeab16b06733e1b00cefca7ef04298183017cade4161e32481e"} Mar 14 07:22:09 crc kubenswrapper[5129]: I0314 07:22:09.717889 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"66db65fb-a7cf-4b0f-bad9-11215f942f34","Type":"ContainerStarted","Data":"c3e8b48b6549e352be0c80a05ecaa084aa321d66328997d3d1c9ecc26a9620b3"} Mar 14 07:22:09 crc kubenswrapper[5129]: I0314 07:22:09.719662 5129 generic.go:334] "Generic (PLEG): container finished" podID="67606449-22cf-4aed-82df-32cece6daffb" containerID="58636ddf9056d5cf6b9725a0fc761ea443236adbe98622fdee0c49765d0989de" exitCode=0 Mar 14 07:22:09 crc kubenswrapper[5129]: I0314 07:22:09.719704 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557882-j24cn" 
event={"ID":"67606449-22cf-4aed-82df-32cece6daffb","Type":"ContainerDied","Data":"58636ddf9056d5cf6b9725a0fc761ea443236adbe98622fdee0c49765d0989de"} Mar 14 07:22:09 crc kubenswrapper[5129]: I0314 07:22:09.728121 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" podStartSLOduration=2.728097227 podStartE2EDuration="2.728097227s" podCreationTimestamp="2026-03-14 07:22:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:09.713224288 +0000 UTC m=+1392.465139472" watchObservedRunningTime="2026-03-14 07:22:09.728097227 +0000 UTC m=+1392.480012411" Mar 14 07:22:09 crc kubenswrapper[5129]: I0314 07:22:09.773133 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-bb6676db4-77lhl" podUID="8ef31003-6429-4270-b29d-750a82d4c7fe" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:39790->10.217.0.166:9311: read: connection reset by peer" Mar 14 07:22:09 crc kubenswrapper[5129]: I0314 07:22:09.773570 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-bb6676db4-77lhl" podUID="8ef31003-6429-4270-b29d-750a82d4c7fe" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:39782->10.217.0.166:9311: read: connection reset by peer" Mar 14 07:22:09 crc kubenswrapper[5129]: I0314 07:22:09.930799 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:09 crc kubenswrapper[5129]: I0314 07:22:09.945525 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-bb6676db4-77lhl" podUID="8ef31003-6429-4270-b29d-750a82d4c7fe" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: connect: connection refused" Mar 14 07:22:09 crc kubenswrapper[5129]: I0314 07:22:09.945639 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-bb6676db4-77lhl" podUID="8ef31003-6429-4270-b29d-750a82d4c7fe" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: connect: connection refused" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.032058 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b43a32ae-63bf-4627-bbdc-deb131defd74-run-httpd\") pod \"b43a32ae-63bf-4627-bbdc-deb131defd74\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.032160 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvjnk\" (UniqueName: \"kubernetes.io/projected/b43a32ae-63bf-4627-bbdc-deb131defd74-kube-api-access-mvjnk\") pod \"b43a32ae-63bf-4627-bbdc-deb131defd74\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.032184 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-sg-core-conf-yaml\") pod \"b43a32ae-63bf-4627-bbdc-deb131defd74\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.032211 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-config-data\") pod \"b43a32ae-63bf-4627-bbdc-deb131defd74\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.032282 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b43a32ae-63bf-4627-bbdc-deb131defd74-log-httpd\") pod \"b43a32ae-63bf-4627-bbdc-deb131defd74\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.032315 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-scripts\") pod \"b43a32ae-63bf-4627-bbdc-deb131defd74\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.032372 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-combined-ca-bundle\") pod \"b43a32ae-63bf-4627-bbdc-deb131defd74\" (UID: \"b43a32ae-63bf-4627-bbdc-deb131defd74\") " Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.034446 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b43a32ae-63bf-4627-bbdc-deb131defd74-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b43a32ae-63bf-4627-bbdc-deb131defd74" (UID: "b43a32ae-63bf-4627-bbdc-deb131defd74"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.041660 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b43a32ae-63bf-4627-bbdc-deb131defd74-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b43a32ae-63bf-4627-bbdc-deb131defd74" (UID: "b43a32ae-63bf-4627-bbdc-deb131defd74"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.044861 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-scripts" (OuterVolumeSpecName: "scripts") pod "b43a32ae-63bf-4627-bbdc-deb131defd74" (UID: "b43a32ae-63bf-4627-bbdc-deb131defd74"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.058976 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b43a32ae-63bf-4627-bbdc-deb131defd74-kube-api-access-mvjnk" (OuterVolumeSpecName: "kube-api-access-mvjnk") pod "b43a32ae-63bf-4627-bbdc-deb131defd74" (UID: "b43a32ae-63bf-4627-bbdc-deb131defd74"). InnerVolumeSpecName "kube-api-access-mvjnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.082683 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b43a32ae-63bf-4627-bbdc-deb131defd74" (UID: "b43a32ae-63bf-4627-bbdc-deb131defd74"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.134366 5129 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b43a32ae-63bf-4627-bbdc-deb131defd74-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.134406 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvjnk\" (UniqueName: \"kubernetes.io/projected/b43a32ae-63bf-4627-bbdc-deb131defd74-kube-api-access-mvjnk\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.134422 5129 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.134436 5129 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b43a32ae-63bf-4627-bbdc-deb131defd74-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.134448 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.138752 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b43a32ae-63bf-4627-bbdc-deb131defd74" (UID: "b43a32ae-63bf-4627-bbdc-deb131defd74"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.172027 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-config-data" (OuterVolumeSpecName: "config-data") pod "b43a32ae-63bf-4627-bbdc-deb131defd74" (UID: "b43a32ae-63bf-4627-bbdc-deb131defd74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.183871 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.235741 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.235771 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b43a32ae-63bf-4627-bbdc-deb131defd74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.337184 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ef31003-6429-4270-b29d-750a82d4c7fe-config-data-custom\") pod \"8ef31003-6429-4270-b29d-750a82d4c7fe\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.337262 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef31003-6429-4270-b29d-750a82d4c7fe-config-data\") pod \"8ef31003-6429-4270-b29d-750a82d4c7fe\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.337293 5129 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef31003-6429-4270-b29d-750a82d4c7fe-combined-ca-bundle\") pod \"8ef31003-6429-4270-b29d-750a82d4c7fe\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.337360 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef31003-6429-4270-b29d-750a82d4c7fe-logs\") pod \"8ef31003-6429-4270-b29d-750a82d4c7fe\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.337402 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pcdt\" (UniqueName: \"kubernetes.io/projected/8ef31003-6429-4270-b29d-750a82d4c7fe-kube-api-access-9pcdt\") pod \"8ef31003-6429-4270-b29d-750a82d4c7fe\" (UID: \"8ef31003-6429-4270-b29d-750a82d4c7fe\") " Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.337785 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ef31003-6429-4270-b29d-750a82d4c7fe-logs" (OuterVolumeSpecName: "logs") pod "8ef31003-6429-4270-b29d-750a82d4c7fe" (UID: "8ef31003-6429-4270-b29d-750a82d4c7fe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.340780 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef31003-6429-4270-b29d-750a82d4c7fe-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8ef31003-6429-4270-b29d-750a82d4c7fe" (UID: "8ef31003-6429-4270-b29d-750a82d4c7fe"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.342266 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef31003-6429-4270-b29d-750a82d4c7fe-kube-api-access-9pcdt" (OuterVolumeSpecName: "kube-api-access-9pcdt") pod "8ef31003-6429-4270-b29d-750a82d4c7fe" (UID: "8ef31003-6429-4270-b29d-750a82d4c7fe"). InnerVolumeSpecName "kube-api-access-9pcdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.367897 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef31003-6429-4270-b29d-750a82d4c7fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ef31003-6429-4270-b29d-750a82d4c7fe" (UID: "8ef31003-6429-4270-b29d-750a82d4c7fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.384116 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef31003-6429-4270-b29d-750a82d4c7fe-config-data" (OuterVolumeSpecName: "config-data") pod "8ef31003-6429-4270-b29d-750a82d4c7fe" (UID: "8ef31003-6429-4270-b29d-750a82d4c7fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.439500 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef31003-6429-4270-b29d-750a82d4c7fe-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.439577 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef31003-6429-4270-b29d-750a82d4c7fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.439589 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef31003-6429-4270-b29d-750a82d4c7fe-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.440536 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pcdt\" (UniqueName: \"kubernetes.io/projected/8ef31003-6429-4270-b29d-750a82d4c7fe-kube-api-access-9pcdt\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.440566 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ef31003-6429-4270-b29d-750a82d4c7fe-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.736760 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"66db65fb-a7cf-4b0f-bad9-11215f942f34","Type":"ContainerStarted","Data":"895e6dae613957366e41dc6feef7cbb7092cb532edb0343befa22d397196cafb"} Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.736792 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="66db65fb-a7cf-4b0f-bad9-11215f942f34" containerName="cinder-api-log" 
containerID="cri-o://c3e8b48b6549e352be0c80a05ecaa084aa321d66328997d3d1c9ecc26a9620b3" gracePeriod=30 Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.736976 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="66db65fb-a7cf-4b0f-bad9-11215f942f34" containerName="cinder-api" containerID="cri-o://895e6dae613957366e41dc6feef7cbb7092cb532edb0343befa22d397196cafb" gracePeriod=30 Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.737055 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.747118 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e85da6c4-a9a7-4d1c-baf9-67777bf925e4","Type":"ContainerStarted","Data":"e2021e39f00b32741f46431ba0263f5febc6ead8fc83d21ae5eeaeb05afe22b1"} Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.750632 5129 generic.go:334] "Generic (PLEG): container finished" podID="8ef31003-6429-4270-b29d-750a82d4c7fe" containerID="1da488b585b0059125d8aabd8a6fee64f2e7a5cb392a9b19f3684e31d319a78c" exitCode=0 Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.750778 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-bb6676db4-77lhl" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.750899 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bb6676db4-77lhl" event={"ID":"8ef31003-6429-4270-b29d-750a82d4c7fe","Type":"ContainerDied","Data":"1da488b585b0059125d8aabd8a6fee64f2e7a5cb392a9b19f3684e31d319a78c"} Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.750958 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bb6676db4-77lhl" event={"ID":"8ef31003-6429-4270-b29d-750a82d4c7fe","Type":"ContainerDied","Data":"89a9e538c966dc821174f7bac98a3b50866bbddaf5b9a0182be4d873fd8ccf65"} Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.750999 5129 scope.go:117] "RemoveContainer" containerID="1da488b585b0059125d8aabd8a6fee64f2e7a5cb392a9b19f3684e31d319a78c" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.757831 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.760042 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b43a32ae-63bf-4627-bbdc-deb131defd74","Type":"ContainerDied","Data":"54e320dde3bdb097bd7e2cb407a73e0203896298157ce06a21c609b581e32812"} Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.775920 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.7758956919999997 podStartE2EDuration="3.775895692s" podCreationTimestamp="2026-03-14 07:22:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:10.770501918 +0000 UTC m=+1393.522417152" watchObservedRunningTime="2026-03-14 07:22:10.775895692 +0000 UTC m=+1393.527810886" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.797838 5129 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.959511779 podStartE2EDuration="4.797810578s" podCreationTimestamp="2026-03-14 07:22:06 +0000 UTC" firstStartedPulling="2026-03-14 07:22:08.016885251 +0000 UTC m=+1390.768800435" lastFinishedPulling="2026-03-14 07:22:08.85518405 +0000 UTC m=+1391.607099234" observedRunningTime="2026-03-14 07:22:10.789395643 +0000 UTC m=+1393.541310827" watchObservedRunningTime="2026-03-14 07:22:10.797810578 +0000 UTC m=+1393.549725762" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.826271 5129 scope.go:117] "RemoveContainer" containerID="bc2bfdcea0f584c7cdb892ae7e18c5b85cac0e919f18190071a9405176c04647" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.829296 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-bb6676db4-77lhl"] Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.850141 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-bb6676db4-77lhl"] Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.858893 5129 scope.go:117] "RemoveContainer" containerID="1da488b585b0059125d8aabd8a6fee64f2e7a5cb392a9b19f3684e31d319a78c" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.862991 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:10 crc kubenswrapper[5129]: E0314 07:22:10.864407 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1da488b585b0059125d8aabd8a6fee64f2e7a5cb392a9b19f3684e31d319a78c\": container with ID starting with 1da488b585b0059125d8aabd8a6fee64f2e7a5cb392a9b19f3684e31d319a78c not found: ID does not exist" containerID="1da488b585b0059125d8aabd8a6fee64f2e7a5cb392a9b19f3684e31d319a78c" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.864449 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1da488b585b0059125d8aabd8a6fee64f2e7a5cb392a9b19f3684e31d319a78c"} err="failed to get container status \"1da488b585b0059125d8aabd8a6fee64f2e7a5cb392a9b19f3684e31d319a78c\": rpc error: code = NotFound desc = could not find container \"1da488b585b0059125d8aabd8a6fee64f2e7a5cb392a9b19f3684e31d319a78c\": container with ID starting with 1da488b585b0059125d8aabd8a6fee64f2e7a5cb392a9b19f3684e31d319a78c not found: ID does not exist" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.864473 5129 scope.go:117] "RemoveContainer" containerID="bc2bfdcea0f584c7cdb892ae7e18c5b85cac0e919f18190071a9405176c04647" Mar 14 07:22:10 crc kubenswrapper[5129]: E0314 07:22:10.864968 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc2bfdcea0f584c7cdb892ae7e18c5b85cac0e919f18190071a9405176c04647\": container with ID starting with bc2bfdcea0f584c7cdb892ae7e18c5b85cac0e919f18190071a9405176c04647 not found: ID does not exist" containerID="bc2bfdcea0f584c7cdb892ae7e18c5b85cac0e919f18190071a9405176c04647" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.864995 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc2bfdcea0f584c7cdb892ae7e18c5b85cac0e919f18190071a9405176c04647"} err="failed to get container status \"bc2bfdcea0f584c7cdb892ae7e18c5b85cac0e919f18190071a9405176c04647\": rpc error: code = NotFound desc = could not find container \"bc2bfdcea0f584c7cdb892ae7e18c5b85cac0e919f18190071a9405176c04647\": container with ID starting with bc2bfdcea0f584c7cdb892ae7e18c5b85cac0e919f18190071a9405176c04647 not found: ID does not exist" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.865016 5129 scope.go:117] "RemoveContainer" containerID="7686ee942427f823745c94372a0c72fd20e3b280d82bdf52620f653c4b97882b" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.873794 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.904746 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:10 crc kubenswrapper[5129]: E0314 07:22:10.905149 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef31003-6429-4270-b29d-750a82d4c7fe" containerName="barbican-api-log" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.905165 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef31003-6429-4270-b29d-750a82d4c7fe" containerName="barbican-api-log" Mar 14 07:22:10 crc kubenswrapper[5129]: E0314 07:22:10.905179 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerName="ceilometer-central-agent" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.905185 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerName="ceilometer-central-agent" Mar 14 07:22:10 crc kubenswrapper[5129]: E0314 07:22:10.905206 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerName="sg-core" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.905212 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerName="sg-core" Mar 14 07:22:10 crc kubenswrapper[5129]: E0314 07:22:10.905227 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerName="ceilometer-notification-agent" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.905233 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerName="ceilometer-notification-agent" Mar 14 07:22:10 crc kubenswrapper[5129]: E0314 07:22:10.905249 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerName="proxy-httpd" 
Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.905254 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerName="proxy-httpd" Mar 14 07:22:10 crc kubenswrapper[5129]: E0314 07:22:10.905265 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef31003-6429-4270-b29d-750a82d4c7fe" containerName="barbican-api" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.905273 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef31003-6429-4270-b29d-750a82d4c7fe" containerName="barbican-api" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.905445 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerName="ceilometer-central-agent" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.905457 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef31003-6429-4270-b29d-750a82d4c7fe" containerName="barbican-api-log" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.905469 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerName="sg-core" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.905479 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef31003-6429-4270-b29d-750a82d4c7fe" containerName="barbican-api" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.905490 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerName="ceilometer-notification-agent" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.905502 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b43a32ae-63bf-4627-bbdc-deb131defd74" containerName="proxy-httpd" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.907088 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.911624 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.911853 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.920763 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:10 crc kubenswrapper[5129]: I0314 07:22:10.945872 5129 scope.go:117] "RemoveContainer" containerID="bba5c15ca0fb9a9fac552c4411716314cada4dff957f275a3944a0fa32561b45" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.049456 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-run-httpd\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.049505 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.049568 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-scripts\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.049593 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-log-httpd\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.049700 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.049724 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-config-data\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.049857 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7zzj\" (UniqueName: \"kubernetes.io/projected/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-kube-api-access-h7zzj\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.069980 5129 scope.go:117] "RemoveContainer" containerID="9384ab87298e2eeab16b06733e1b00cefca7ef04298183017cade4161e32481e" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.134773 5129 scope.go:117] "RemoveContainer" containerID="dc7e65d99e50ea8ef70644c1ae7042bd9cb31116ec6009c82470eb2460cb5c7b" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.153486 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-scripts\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " 
pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.153538 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-log-httpd\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.153651 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.153676 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-config-data\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.153702 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7zzj\" (UniqueName: \"kubernetes.io/projected/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-kube-api-access-h7zzj\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.153726 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-run-httpd\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.153750 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.158489 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.161450 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-log-httpd\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.166567 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.168298 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-scripts\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.168762 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-run-httpd\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.175013 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-config-data\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.177353 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7zzj\" (UniqueName: \"kubernetes.io/projected/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-kube-api-access-h7zzj\") pod \"ceilometer-0\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.247778 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.277468 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557882-j24cn" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.358233 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmlkt\" (UniqueName: \"kubernetes.io/projected/67606449-22cf-4aed-82df-32cece6daffb-kube-api-access-wmlkt\") pod \"67606449-22cf-4aed-82df-32cece6daffb\" (UID: \"67606449-22cf-4aed-82df-32cece6daffb\") " Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.363984 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67606449-22cf-4aed-82df-32cece6daffb-kube-api-access-wmlkt" (OuterVolumeSpecName: "kube-api-access-wmlkt") pod "67606449-22cf-4aed-82df-32cece6daffb" (UID: "67606449-22cf-4aed-82df-32cece6daffb"). InnerVolumeSpecName "kube-api-access-wmlkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.380759 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.460211 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-config-data\") pod \"66db65fb-a7cf-4b0f-bad9-11215f942f34\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.460307 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66db65fb-a7cf-4b0f-bad9-11215f942f34-logs\") pod \"66db65fb-a7cf-4b0f-bad9-11215f942f34\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.460333 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/66db65fb-a7cf-4b0f-bad9-11215f942f34-etc-machine-id\") pod \"66db65fb-a7cf-4b0f-bad9-11215f942f34\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.460369 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-scripts\") pod \"66db65fb-a7cf-4b0f-bad9-11215f942f34\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.460420 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmpr6\" (UniqueName: \"kubernetes.io/projected/66db65fb-a7cf-4b0f-bad9-11215f942f34-kube-api-access-bmpr6\") pod \"66db65fb-a7cf-4b0f-bad9-11215f942f34\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.460479 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-config-data-custom\") pod \"66db65fb-a7cf-4b0f-bad9-11215f942f34\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.460564 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-combined-ca-bundle\") pod \"66db65fb-a7cf-4b0f-bad9-11215f942f34\" (UID: \"66db65fb-a7cf-4b0f-bad9-11215f942f34\") " Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.461064 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmlkt\" (UniqueName: \"kubernetes.io/projected/67606449-22cf-4aed-82df-32cece6daffb-kube-api-access-wmlkt\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.461355 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66db65fb-a7cf-4b0f-bad9-11215f942f34-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "66db65fb-a7cf-4b0f-bad9-11215f942f34" (UID: "66db65fb-a7cf-4b0f-bad9-11215f942f34"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.462593 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66db65fb-a7cf-4b0f-bad9-11215f942f34-logs" (OuterVolumeSpecName: "logs") pod "66db65fb-a7cf-4b0f-bad9-11215f942f34" (UID: "66db65fb-a7cf-4b0f-bad9-11215f942f34"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.465391 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "66db65fb-a7cf-4b0f-bad9-11215f942f34" (UID: "66db65fb-a7cf-4b0f-bad9-11215f942f34"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.465406 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66db65fb-a7cf-4b0f-bad9-11215f942f34-kube-api-access-bmpr6" (OuterVolumeSpecName: "kube-api-access-bmpr6") pod "66db65fb-a7cf-4b0f-bad9-11215f942f34" (UID: "66db65fb-a7cf-4b0f-bad9-11215f942f34"). InnerVolumeSpecName "kube-api-access-bmpr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.468768 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-scripts" (OuterVolumeSpecName: "scripts") pod "66db65fb-a7cf-4b0f-bad9-11215f942f34" (UID: "66db65fb-a7cf-4b0f-bad9-11215f942f34"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.497349 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66db65fb-a7cf-4b0f-bad9-11215f942f34" (UID: "66db65fb-a7cf-4b0f-bad9-11215f942f34"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.511934 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-config-data" (OuterVolumeSpecName: "config-data") pod "66db65fb-a7cf-4b0f-bad9-11215f942f34" (UID: "66db65fb-a7cf-4b0f-bad9-11215f942f34"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.562967 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.562999 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.563008 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66db65fb-a7cf-4b0f-bad9-11215f942f34-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.563016 5129 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/66db65fb-a7cf-4b0f-bad9-11215f942f34-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.563024 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.563032 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmpr6\" (UniqueName: 
\"kubernetes.io/projected/66db65fb-a7cf-4b0f-bad9-11215f942f34-kube-api-access-bmpr6\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.563042 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66db65fb-a7cf-4b0f-bad9-11215f942f34-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.641353 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557876-2lhvz"] Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.649560 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557876-2lhvz"] Mar 14 07:22:11 crc kubenswrapper[5129]: W0314 07:22:11.738814 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16e4a3fd_270a_4dd0_b1ca_b6bf8893e4fc.slice/crio-1fab7382c4421818f90ce9aac8701bbf824a7472b1f09c761bfad83b2a19d54e WatchSource:0}: Error finding container 1fab7382c4421818f90ce9aac8701bbf824a7472b1f09c761bfad83b2a19d54e: Status 404 returned error can't find the container with id 1fab7382c4421818f90ce9aac8701bbf824a7472b1f09c761bfad83b2a19d54e Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.744325 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.770183 5129 generic.go:334] "Generic (PLEG): container finished" podID="66db65fb-a7cf-4b0f-bad9-11215f942f34" containerID="895e6dae613957366e41dc6feef7cbb7092cb532edb0343befa22d397196cafb" exitCode=0 Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.770212 5129 generic.go:334] "Generic (PLEG): container finished" podID="66db65fb-a7cf-4b0f-bad9-11215f942f34" containerID="c3e8b48b6549e352be0c80a05ecaa084aa321d66328997d3d1c9ecc26a9620b3" exitCode=143 Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 
07:22:11.770226 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.770260 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"66db65fb-a7cf-4b0f-bad9-11215f942f34","Type":"ContainerDied","Data":"895e6dae613957366e41dc6feef7cbb7092cb532edb0343befa22d397196cafb"} Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.770296 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"66db65fb-a7cf-4b0f-bad9-11215f942f34","Type":"ContainerDied","Data":"c3e8b48b6549e352be0c80a05ecaa084aa321d66328997d3d1c9ecc26a9620b3"} Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.770307 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"66db65fb-a7cf-4b0f-bad9-11215f942f34","Type":"ContainerDied","Data":"045a80443a4786e6b57f24d134578b773e85fd5ff08dc61ea39a57bf5f797f26"} Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.770324 5129 scope.go:117] "RemoveContainer" containerID="895e6dae613957366e41dc6feef7cbb7092cb532edb0343befa22d397196cafb" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.774193 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557882-j24cn" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.777243 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557882-j24cn" event={"ID":"67606449-22cf-4aed-82df-32cece6daffb","Type":"ContainerDied","Data":"8f7b1bbf60fa4c5ba72415db3f5b7ad992c668305c09f26a6c4830332e295e97"} Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.777289 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f7b1bbf60fa4c5ba72415db3f5b7ad992c668305c09f26a6c4830332e295e97" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.780050 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc","Type":"ContainerStarted","Data":"1fab7382c4421818f90ce9aac8701bbf824a7472b1f09c761bfad83b2a19d54e"} Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.793799 5129 scope.go:117] "RemoveContainer" containerID="c3e8b48b6549e352be0c80a05ecaa084aa321d66328997d3d1c9ecc26a9620b3" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.811818 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.819495 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.831401 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:22:11 crc kubenswrapper[5129]: E0314 07:22:11.831790 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67606449-22cf-4aed-82df-32cece6daffb" containerName="oc" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.831804 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="67606449-22cf-4aed-82df-32cece6daffb" containerName="oc" Mar 14 07:22:11 crc kubenswrapper[5129]: E0314 07:22:11.831824 5129 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="66db65fb-a7cf-4b0f-bad9-11215f942f34" containerName="cinder-api-log" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.831832 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="66db65fb-a7cf-4b0f-bad9-11215f942f34" containerName="cinder-api-log" Mar 14 07:22:11 crc kubenswrapper[5129]: E0314 07:22:11.831843 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66db65fb-a7cf-4b0f-bad9-11215f942f34" containerName="cinder-api" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.831853 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="66db65fb-a7cf-4b0f-bad9-11215f942f34" containerName="cinder-api" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.832039 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="66db65fb-a7cf-4b0f-bad9-11215f942f34" containerName="cinder-api" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.832062 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="67606449-22cf-4aed-82df-32cece6daffb" containerName="oc" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.832082 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="66db65fb-a7cf-4b0f-bad9-11215f942f34" containerName="cinder-api-log" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.834462 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.835943 5129 scope.go:117] "RemoveContainer" containerID="895e6dae613957366e41dc6feef7cbb7092cb532edb0343befa22d397196cafb" Mar 14 07:22:11 crc kubenswrapper[5129]: E0314 07:22:11.836367 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"895e6dae613957366e41dc6feef7cbb7092cb532edb0343befa22d397196cafb\": container with ID starting with 895e6dae613957366e41dc6feef7cbb7092cb532edb0343befa22d397196cafb not found: ID does not exist" containerID="895e6dae613957366e41dc6feef7cbb7092cb532edb0343befa22d397196cafb" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.836408 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"895e6dae613957366e41dc6feef7cbb7092cb532edb0343befa22d397196cafb"} err="failed to get container status \"895e6dae613957366e41dc6feef7cbb7092cb532edb0343befa22d397196cafb\": rpc error: code = NotFound desc = could not find container \"895e6dae613957366e41dc6feef7cbb7092cb532edb0343befa22d397196cafb\": container with ID starting with 895e6dae613957366e41dc6feef7cbb7092cb532edb0343befa22d397196cafb not found: ID does not exist" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.836432 5129 scope.go:117] "RemoveContainer" containerID="c3e8b48b6549e352be0c80a05ecaa084aa321d66328997d3d1c9ecc26a9620b3" Mar 14 07:22:11 crc kubenswrapper[5129]: E0314 07:22:11.836709 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3e8b48b6549e352be0c80a05ecaa084aa321d66328997d3d1c9ecc26a9620b3\": container with ID starting with c3e8b48b6549e352be0c80a05ecaa084aa321d66328997d3d1c9ecc26a9620b3 not found: ID does not exist" containerID="c3e8b48b6549e352be0c80a05ecaa084aa321d66328997d3d1c9ecc26a9620b3" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.836729 
5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3e8b48b6549e352be0c80a05ecaa084aa321d66328997d3d1c9ecc26a9620b3"} err="failed to get container status \"c3e8b48b6549e352be0c80a05ecaa084aa321d66328997d3d1c9ecc26a9620b3\": rpc error: code = NotFound desc = could not find container \"c3e8b48b6549e352be0c80a05ecaa084aa321d66328997d3d1c9ecc26a9620b3\": container with ID starting with c3e8b48b6549e352be0c80a05ecaa084aa321d66328997d3d1c9ecc26a9620b3 not found: ID does not exist" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.836741 5129 scope.go:117] "RemoveContainer" containerID="895e6dae613957366e41dc6feef7cbb7092cb532edb0343befa22d397196cafb" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.838037 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.838298 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.838410 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"895e6dae613957366e41dc6feef7cbb7092cb532edb0343befa22d397196cafb"} err="failed to get container status \"895e6dae613957366e41dc6feef7cbb7092cb532edb0343befa22d397196cafb\": rpc error: code = NotFound desc = could not find container \"895e6dae613957366e41dc6feef7cbb7092cb532edb0343befa22d397196cafb\": container with ID starting with 895e6dae613957366e41dc6feef7cbb7092cb532edb0343befa22d397196cafb not found: ID does not exist" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.838436 5129 scope.go:117] "RemoveContainer" containerID="c3e8b48b6549e352be0c80a05ecaa084aa321d66328997d3d1c9ecc26a9620b3" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.838476 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 14 07:22:11 
crc kubenswrapper[5129]: I0314 07:22:11.838666 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3e8b48b6549e352be0c80a05ecaa084aa321d66328997d3d1c9ecc26a9620b3"} err="failed to get container status \"c3e8b48b6549e352be0c80a05ecaa084aa321d66328997d3d1c9ecc26a9620b3\": rpc error: code = NotFound desc = could not find container \"c3e8b48b6549e352be0c80a05ecaa084aa321d66328997d3d1c9ecc26a9620b3\": container with ID starting with c3e8b48b6549e352be0c80a05ecaa084aa321d66328997d3d1c9ecc26a9620b3 not found: ID does not exist" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.842570 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.969592 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.969672 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-config-data\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.969695 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a3fad4b-8e44-471d-b262-27d6a7e05276-logs\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.969725 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-config-data-custom\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.969884 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.969979 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g2m5\" (UniqueName: \"kubernetes.io/projected/1a3fad4b-8e44-471d-b262-27d6a7e05276-kube-api-access-4g2m5\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.970112 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.970151 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-scripts\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:11 crc kubenswrapper[5129]: I0314 07:22:11.970200 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/1a3fad4b-8e44-471d-b262-27d6a7e05276-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.050202 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4" path="/var/lib/kubelet/pods/2cb8f0de-67ca-4a50-aeae-b1ef99ce40d4/volumes" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.051996 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66db65fb-a7cf-4b0f-bad9-11215f942f34" path="/var/lib/kubelet/pods/66db65fb-a7cf-4b0f-bad9-11215f942f34/volumes" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.054126 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ef31003-6429-4270-b29d-750a82d4c7fe" path="/var/lib/kubelet/pods/8ef31003-6429-4270-b29d-750a82d4c7fe/volumes" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.057207 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b43a32ae-63bf-4627-bbdc-deb131defd74" path="/var/lib/kubelet/pods/b43a32ae-63bf-4627-bbdc-deb131defd74/volumes" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.071931 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.071985 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g2m5\" (UniqueName: \"kubernetes.io/projected/1a3fad4b-8e44-471d-b262-27d6a7e05276-kube-api-access-4g2m5\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.072034 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.072051 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-scripts\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.072078 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a3fad4b-8e44-471d-b262-27d6a7e05276-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.072112 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.072136 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-config-data\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.072156 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a3fad4b-8e44-471d-b262-27d6a7e05276-logs\") pod \"cinder-api-0\" (UID: 
\"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.072183 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-config-data-custom\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.072559 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a3fad4b-8e44-471d-b262-27d6a7e05276-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.073264 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a3fad4b-8e44-471d-b262-27d6a7e05276-logs\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.076364 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-config-data-custom\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.078142 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.078848 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.079787 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.080630 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-scripts\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.090186 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-config-data\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.093556 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g2m5\" (UniqueName: \"kubernetes.io/projected/1a3fad4b-8e44-471d-b262-27d6a7e05276-kube-api-access-4g2m5\") pod \"cinder-api-0\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.156824 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.330942 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 14 07:22:12 crc kubenswrapper[5129]: W0314 07:22:12.649466 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a3fad4b_8e44_471d_b262_27d6a7e05276.slice/crio-3472c054b50f74444cfb9f99926ed91789bd6fb03b78b9180e9bafdaef83653d WatchSource:0}: Error finding container 3472c054b50f74444cfb9f99926ed91789bd6fb03b78b9180e9bafdaef83653d: Status 404 returned error can't find the container with id 3472c054b50f74444cfb9f99926ed91789bd6fb03b78b9180e9bafdaef83653d Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.654184 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.790935 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a3fad4b-8e44-471d-b262-27d6a7e05276","Type":"ContainerStarted","Data":"3472c054b50f74444cfb9f99926ed91789bd6fb03b78b9180e9bafdaef83653d"} Mar 14 07:22:12 crc kubenswrapper[5129]: I0314 07:22:12.793617 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc","Type":"ContainerStarted","Data":"026ce796c3ea7a332fd8a306f8d3aea9d566cc065d6db6597b9a30f2fb8388af"} Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.228120 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.487424 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-798779f645-jt8hz"] Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.487684 5129 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-798779f645-jt8hz" podUID="f8803f0b-1655-4721-a92e-8241f500d9a5" containerName="neutron-api" containerID="cri-o://39c6a62ea5b9ad99beda56d6e7b2e80859bc775547164da440ad044eca959f8e" gracePeriod=30 Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.487798 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-798779f645-jt8hz" podUID="f8803f0b-1655-4721-a92e-8241f500d9a5" containerName="neutron-httpd" containerID="cri-o://e7914ce04ee9148973aef1627d62c03d2644a8e703facc10c2922ec5de2807dc" gracePeriod=30 Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.516103 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7fd6bbc76c-9rshh"] Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.517568 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.525251 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.528148 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7fd6bbc76c-9rshh"] Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.600688 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-httpd-config\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.600992 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-ovndb-tls-certs\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " 
pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.601070 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-combined-ca-bundle\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.601111 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbdt5\" (UniqueName: \"kubernetes.io/projected/e4553e57-9090-44cb-a8af-7297e4c624c0-kube-api-access-sbdt5\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.601139 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-public-tls-certs\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.601172 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-config\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.601229 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-internal-tls-certs\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: 
\"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.710437 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-combined-ca-bundle\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.710519 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbdt5\" (UniqueName: \"kubernetes.io/projected/e4553e57-9090-44cb-a8af-7297e4c624c0-kube-api-access-sbdt5\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.710583 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-public-tls-certs\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.710684 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-config\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.716797 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-internal-tls-certs\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 
14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.717391 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-httpd-config\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.717483 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-ovndb-tls-certs\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.723450 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-combined-ca-bundle\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.725228 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-public-tls-certs\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.726811 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-config\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.733139 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-ovndb-tls-certs\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.733331 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-httpd-config\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.733727 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-internal-tls-certs\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.733819 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbdt5\" (UniqueName: \"kubernetes.io/projected/e4553e57-9090-44cb-a8af-7297e4c624c0-kube-api-access-sbdt5\") pod \"neutron-7fd6bbc76c-9rshh\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.807749 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a3fad4b-8e44-471d-b262-27d6a7e05276","Type":"ContainerStarted","Data":"b6c1654757c5b72874d5b9a46ca2254eeb71fc65f978afb406f43a49c77a25ae"} Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.809856 5129 generic.go:334] "Generic (PLEG): container finished" podID="f8803f0b-1655-4721-a92e-8241f500d9a5" containerID="e7914ce04ee9148973aef1627d62c03d2644a8e703facc10c2922ec5de2807dc" exitCode=0 Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.809900 5129 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-798779f645-jt8hz" event={"ID":"f8803f0b-1655-4721-a92e-8241f500d9a5","Type":"ContainerDied","Data":"e7914ce04ee9148973aef1627d62c03d2644a8e703facc10c2922ec5de2807dc"} Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.814318 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc","Type":"ContainerStarted","Data":"3c7bc269d0a30369279ed545b8ed2a866536a55d65895da7616a09aee3c3f0fb"} Mar 14 07:22:13 crc kubenswrapper[5129]: I0314 07:22:13.869044 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:14 crc kubenswrapper[5129]: I0314 07:22:14.399257 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7fd6bbc76c-9rshh"] Mar 14 07:22:14 crc kubenswrapper[5129]: W0314 07:22:14.402286 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4553e57_9090_44cb_a8af_7297e4c624c0.slice/crio-7222285490c57f1476b7a540feb24baf129594704555c0308a6c0ea9c393a0fe WatchSource:0}: Error finding container 7222285490c57f1476b7a540feb24baf129594704555c0308a6c0ea9c393a0fe: Status 404 returned error can't find the container with id 7222285490c57f1476b7a540feb24baf129594704555c0308a6c0ea9c393a0fe Mar 14 07:22:14 crc kubenswrapper[5129]: I0314 07:22:14.828429 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a3fad4b-8e44-471d-b262-27d6a7e05276","Type":"ContainerStarted","Data":"d055a298c80b1c78420262c5f6b1a8b08ee515621adade6064aa22f0860f7d0f"} Mar 14 07:22:14 crc kubenswrapper[5129]: I0314 07:22:14.829040 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 14 07:22:14 crc kubenswrapper[5129]: I0314 07:22:14.831022 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc","Type":"ContainerStarted","Data":"04b8fd8314078510468789115b21ff897749d3e2ae3d588707c6fa3c2db3e992"} Mar 14 07:22:14 crc kubenswrapper[5129]: I0314 07:22:14.832455 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fd6bbc76c-9rshh" event={"ID":"e4553e57-9090-44cb-a8af-7297e4c624c0","Type":"ContainerStarted","Data":"26d0f3fccd15d22f5d226b58a7b4b02bc6754da5235ba9f5ff39da154f4b4c5b"} Mar 14 07:22:14 crc kubenswrapper[5129]: I0314 07:22:14.832493 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fd6bbc76c-9rshh" event={"ID":"e4553e57-9090-44cb-a8af-7297e4c624c0","Type":"ContainerStarted","Data":"7222285490c57f1476b7a540feb24baf129594704555c0308a6c0ea9c393a0fe"} Mar 14 07:22:14 crc kubenswrapper[5129]: I0314 07:22:14.860255 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.860231803 podStartE2EDuration="3.860231803s" podCreationTimestamp="2026-03-14 07:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:14.849557227 +0000 UTC m=+1397.601472411" watchObservedRunningTime="2026-03-14 07:22:14.860231803 +0000 UTC m=+1397.612146987" Mar 14 07:22:15 crc kubenswrapper[5129]: I0314 07:22:15.822256 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-798779f645-jt8hz" podUID="f8803f0b-1655-4721-a92e-8241f500d9a5" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9696/\": dial tcp 10.217.0.154:9696: connect: connection refused" Mar 14 07:22:15 crc kubenswrapper[5129]: I0314 07:22:15.846419 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc","Type":"ContainerStarted","Data":"577a4dc0999e91f6f331ec853bc9022d5b7360d235462a8dafe1fed6c0003712"} Mar 14 07:22:15 crc kubenswrapper[5129]: I0314 07:22:15.846567 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 07:22:15 crc kubenswrapper[5129]: I0314 07:22:15.848901 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fd6bbc76c-9rshh" event={"ID":"e4553e57-9090-44cb-a8af-7297e4c624c0","Type":"ContainerStarted","Data":"bc3211d9096a638fa8c4213b5bf198de4c37facb10847c262d7a2dcc1726dfc9"} Mar 14 07:22:15 crc kubenswrapper[5129]: I0314 07:22:15.849259 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:15 crc kubenswrapper[5129]: I0314 07:22:15.888779 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.398459245 podStartE2EDuration="5.888750582s" podCreationTimestamp="2026-03-14 07:22:10 +0000 UTC" firstStartedPulling="2026-03-14 07:22:11.748774402 +0000 UTC m=+1394.500689586" lastFinishedPulling="2026-03-14 07:22:15.239065739 +0000 UTC m=+1397.990980923" observedRunningTime="2026-03-14 07:22:15.877116401 +0000 UTC m=+1398.629031585" watchObservedRunningTime="2026-03-14 07:22:15.888750582 +0000 UTC m=+1398.640665806" Mar 14 07:22:15 crc kubenswrapper[5129]: I0314 07:22:15.920516 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7fd6bbc76c-9rshh" podStartSLOduration=2.920492431 podStartE2EDuration="2.920492431s" podCreationTimestamp="2026-03-14 07:22:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:15.915324823 +0000 UTC m=+1398.667240037" watchObservedRunningTime="2026-03-14 07:22:15.920492431 +0000 UTC m=+1398.672407625" Mar 14 07:22:16 crc 
kubenswrapper[5129]: I0314 07:22:16.862694 5129 generic.go:334] "Generic (PLEG): container finished" podID="f8803f0b-1655-4721-a92e-8241f500d9a5" containerID="39c6a62ea5b9ad99beda56d6e7b2e80859bc775547164da440ad044eca959f8e" exitCode=0 Mar 14 07:22:16 crc kubenswrapper[5129]: I0314 07:22:16.862903 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-798779f645-jt8hz" event={"ID":"f8803f0b-1655-4721-a92e-8241f500d9a5","Type":"ContainerDied","Data":"39c6a62ea5b9ad99beda56d6e7b2e80859bc775547164da440ad044eca959f8e"} Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.134279 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.198750 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-public-tls-certs\") pod \"f8803f0b-1655-4721-a92e-8241f500d9a5\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.198843 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr8nz\" (UniqueName: \"kubernetes.io/projected/f8803f0b-1655-4721-a92e-8241f500d9a5-kube-api-access-wr8nz\") pod \"f8803f0b-1655-4721-a92e-8241f500d9a5\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.198890 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-httpd-config\") pod \"f8803f0b-1655-4721-a92e-8241f500d9a5\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.198954 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-config\") pod \"f8803f0b-1655-4721-a92e-8241f500d9a5\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.199035 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-combined-ca-bundle\") pod \"f8803f0b-1655-4721-a92e-8241f500d9a5\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.199065 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-internal-tls-certs\") pod \"f8803f0b-1655-4721-a92e-8241f500d9a5\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.199105 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-ovndb-tls-certs\") pod \"f8803f0b-1655-4721-a92e-8241f500d9a5\" (UID: \"f8803f0b-1655-4721-a92e-8241f500d9a5\") " Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.207696 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f8803f0b-1655-4721-a92e-8241f500d9a5" (UID: "f8803f0b-1655-4721-a92e-8241f500d9a5"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.213707 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8803f0b-1655-4721-a92e-8241f500d9a5-kube-api-access-wr8nz" (OuterVolumeSpecName: "kube-api-access-wr8nz") pod "f8803f0b-1655-4721-a92e-8241f500d9a5" (UID: "f8803f0b-1655-4721-a92e-8241f500d9a5"). InnerVolumeSpecName "kube-api-access-wr8nz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.258748 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-config" (OuterVolumeSpecName: "config") pod "f8803f0b-1655-4721-a92e-8241f500d9a5" (UID: "f8803f0b-1655-4721-a92e-8241f500d9a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.276446 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f8803f0b-1655-4721-a92e-8241f500d9a5" (UID: "f8803f0b-1655-4721-a92e-8241f500d9a5"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.302617 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr8nz\" (UniqueName: \"kubernetes.io/projected/f8803f0b-1655-4721-a92e-8241f500d9a5-kube-api-access-wr8nz\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.302691 5129 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.302702 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.302712 5129 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.310483 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f8803f0b-1655-4721-a92e-8241f500d9a5" (UID: "f8803f0b-1655-4721-a92e-8241f500d9a5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.319114 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8803f0b-1655-4721-a92e-8241f500d9a5" (UID: "f8803f0b-1655-4721-a92e-8241f500d9a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.324567 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f8803f0b-1655-4721-a92e-8241f500d9a5" (UID: "f8803f0b-1655-4721-a92e-8241f500d9a5"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.406032 5129 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.406067 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.406075 5129 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8803f0b-1655-4721-a92e-8241f500d9a5-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.409730 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.475724 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-xtd6w"] Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.475935 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" podUID="7dd8255e-0613-43d3-a721-eb5cde92ae4f" containerName="dnsmasq-dns" containerID="cri-o://53edb1146f5f8f489bc6613f5e07c4b08ed1f9ea90aafb42bc12fb7af35e5dfc" 
gracePeriod=10 Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.589641 5129 scope.go:117] "RemoveContainer" containerID="9ab487e892f5b469301d30f09c86d4c09d4838db765c227e622aae61b0503382" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.594746 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.633928 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:22:17 crc kubenswrapper[5129]: E0314 07:22:17.785188 5129 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dd8255e_0613_43d3_a721_eb5cde92ae4f.slice/crio-conmon-53edb1146f5f8f489bc6613f5e07c4b08ed1f9ea90aafb42bc12fb7af35e5dfc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dd8255e_0613_43d3_a721_eb5cde92ae4f.slice/crio-53edb1146f5f8f489bc6613f5e07c4b08ed1f9ea90aafb42bc12fb7af35e5dfc.scope\": RecentStats: unable to find data in memory cache]" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.891672 5129 generic.go:334] "Generic (PLEG): container finished" podID="7dd8255e-0613-43d3-a721-eb5cde92ae4f" containerID="53edb1146f5f8f489bc6613f5e07c4b08ed1f9ea90aafb42bc12fb7af35e5dfc" exitCode=0 Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.892091 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" event={"ID":"7dd8255e-0613-43d3-a721-eb5cde92ae4f","Type":"ContainerDied","Data":"53edb1146f5f8f489bc6613f5e07c4b08ed1f9ea90aafb42bc12fb7af35e5dfc"} Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.911089 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-798779f645-jt8hz" 
event={"ID":"f8803f0b-1655-4721-a92e-8241f500d9a5","Type":"ContainerDied","Data":"8054a637992c524ebc44788bf45661558d65c19eb31bfd95c92ef46f20cf3e49"} Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.911183 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-798779f645-jt8hz" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.911260 5129 scope.go:117] "RemoveContainer" containerID="e7914ce04ee9148973aef1627d62c03d2644a8e703facc10c2922ec5de2807dc" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.911864 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e85da6c4-a9a7-4d1c-baf9-67777bf925e4" containerName="cinder-scheduler" containerID="cri-o://54c4e5e103da85eef14350d71cc80bf6d594ca2c6cf104e4dc39b976a4b5ff8e" gracePeriod=30 Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.911917 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e85da6c4-a9a7-4d1c-baf9-67777bf925e4" containerName="probe" containerID="cri-o://e2021e39f00b32741f46431ba0263f5febc6ead8fc83d21ae5eeaeb05afe22b1" gracePeriod=30 Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.961693 5129 scope.go:117] "RemoveContainer" containerID="39c6a62ea5b9ad99beda56d6e7b2e80859bc775547164da440ad044eca959f8e" Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.967152 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-798779f645-jt8hz"] Mar 14 07:22:17 crc kubenswrapper[5129]: I0314 07:22:17.973366 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-798779f645-jt8hz"] Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.051121 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8803f0b-1655-4721-a92e-8241f500d9a5" path="/var/lib/kubelet/pods/f8803f0b-1655-4721-a92e-8241f500d9a5/volumes" Mar 14 07:22:18 crc 
kubenswrapper[5129]: I0314 07:22:18.070030 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.129201 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-ovsdbserver-nb\") pod \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.129246 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-dns-svc\") pod \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.129326 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-config\") pod \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.129507 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn8xj\" (UniqueName: \"kubernetes.io/projected/7dd8255e-0613-43d3-a721-eb5cde92ae4f-kube-api-access-sn8xj\") pod \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.129539 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-ovsdbserver-sb\") pod \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.130265 5129 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-dns-swift-storage-0\") pod \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\" (UID: \"7dd8255e-0613-43d3-a721-eb5cde92ae4f\") " Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.157016 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dd8255e-0613-43d3-a721-eb5cde92ae4f-kube-api-access-sn8xj" (OuterVolumeSpecName: "kube-api-access-sn8xj") pod "7dd8255e-0613-43d3-a721-eb5cde92ae4f" (UID: "7dd8255e-0613-43d3-a721-eb5cde92ae4f"). InnerVolumeSpecName "kube-api-access-sn8xj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.209390 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7dd8255e-0613-43d3-a721-eb5cde92ae4f" (UID: "7dd8255e-0613-43d3-a721-eb5cde92ae4f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.218066 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7dd8255e-0613-43d3-a721-eb5cde92ae4f" (UID: "7dd8255e-0613-43d3-a721-eb5cde92ae4f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.220282 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7dd8255e-0613-43d3-a721-eb5cde92ae4f" (UID: "7dd8255e-0613-43d3-a721-eb5cde92ae4f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.229079 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7dd8255e-0613-43d3-a721-eb5cde92ae4f" (UID: "7dd8255e-0613-43d3-a721-eb5cde92ae4f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.232691 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn8xj\" (UniqueName: \"kubernetes.io/projected/7dd8255e-0613-43d3-a721-eb5cde92ae4f-kube-api-access-sn8xj\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.232720 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.232731 5129 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.232741 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-ovsdbserver-nb\") on node \"crc\" 
DevicePath \"\"" Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.232749 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.233232 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-config" (OuterVolumeSpecName: "config") pod "7dd8255e-0613-43d3-a721-eb5cde92ae4f" (UID: "7dd8255e-0613-43d3-a721-eb5cde92ae4f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.334163 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dd8255e-0613-43d3-a721-eb5cde92ae4f-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.921237 5129 generic.go:334] "Generic (PLEG): container finished" podID="e85da6c4-a9a7-4d1c-baf9-67777bf925e4" containerID="e2021e39f00b32741f46431ba0263f5febc6ead8fc83d21ae5eeaeb05afe22b1" exitCode=0 Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.921304 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e85da6c4-a9a7-4d1c-baf9-67777bf925e4","Type":"ContainerDied","Data":"e2021e39f00b32741f46431ba0263f5febc6ead8fc83d21ae5eeaeb05afe22b1"} Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.923264 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" event={"ID":"7dd8255e-0613-43d3-a721-eb5cde92ae4f","Type":"ContainerDied","Data":"df11f17919d7149ea860ee5cf97d86549a8f9011446c8448b6031c76ecc27464"} Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.923302 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-xtd6w" Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.923371 5129 scope.go:117] "RemoveContainer" containerID="53edb1146f5f8f489bc6613f5e07c4b08ed1f9ea90aafb42bc12fb7af35e5dfc" Mar 14 07:22:18 crc kubenswrapper[5129]: I0314 07:22:18.944504 5129 scope.go:117] "RemoveContainer" containerID="929c627cb87f00956cc9d542f9760717e4d500d49f2dc7e682c12c51fce27b12" Mar 14 07:22:19 crc kubenswrapper[5129]: I0314 07:22:19.000505 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-xtd6w"] Mar 14 07:22:19 crc kubenswrapper[5129]: I0314 07:22:19.007022 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-xtd6w"] Mar 14 07:22:19 crc kubenswrapper[5129]: I0314 07:22:19.342817 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.051915 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dd8255e-0613-43d3-a721-eb5cde92ae4f" path="/var/lib/kubelet/pods/7dd8255e-0613-43d3-a721-eb5cde92ae4f/volumes" Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.767346 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.878700 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-scripts\") pod \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.878760 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-etc-machine-id\") pod \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.878890 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e85da6c4-a9a7-4d1c-baf9-67777bf925e4" (UID: "e85da6c4-a9a7-4d1c-baf9-67777bf925e4"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.878913 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-combined-ca-bundle\") pod \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.878954 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-config-data\") pod \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.879027 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-config-data-custom\") pod \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.879107 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g64zr\" (UniqueName: \"kubernetes.io/projected/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-kube-api-access-g64zr\") pod \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\" (UID: \"e85da6c4-a9a7-4d1c-baf9-67777bf925e4\") " Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.879586 5129 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.890217 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-config-data-custom" (OuterVolumeSpecName: 
"config-data-custom") pod "e85da6c4-a9a7-4d1c-baf9-67777bf925e4" (UID: "e85da6c4-a9a7-4d1c-baf9-67777bf925e4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.890255 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-scripts" (OuterVolumeSpecName: "scripts") pod "e85da6c4-a9a7-4d1c-baf9-67777bf925e4" (UID: "e85da6c4-a9a7-4d1c-baf9-67777bf925e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.890371 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-kube-api-access-g64zr" (OuterVolumeSpecName: "kube-api-access-g64zr") pod "e85da6c4-a9a7-4d1c-baf9-67777bf925e4" (UID: "e85da6c4-a9a7-4d1c-baf9-67777bf925e4"). InnerVolumeSpecName "kube-api-access-g64zr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.943676 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e85da6c4-a9a7-4d1c-baf9-67777bf925e4" (UID: "e85da6c4-a9a7-4d1c-baf9-67777bf925e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.954577 5129 generic.go:334] "Generic (PLEG): container finished" podID="e85da6c4-a9a7-4d1c-baf9-67777bf925e4" containerID="54c4e5e103da85eef14350d71cc80bf6d594ca2c6cf104e4dc39b976a4b5ff8e" exitCode=0 Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.954646 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e85da6c4-a9a7-4d1c-baf9-67777bf925e4","Type":"ContainerDied","Data":"54c4e5e103da85eef14350d71cc80bf6d594ca2c6cf104e4dc39b976a4b5ff8e"} Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.954721 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.954747 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e85da6c4-a9a7-4d1c-baf9-67777bf925e4","Type":"ContainerDied","Data":"5a0ead8315ddfc4701d9be88e2e8eba4e9c4f26fcf9b62fe6be8c6c7a64cf77d"} Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.954798 5129 scope.go:117] "RemoveContainer" containerID="e2021e39f00b32741f46431ba0263f5febc6ead8fc83d21ae5eeaeb05afe22b1" Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.976391 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-config-data" (OuterVolumeSpecName: "config-data") pod "e85da6c4-a9a7-4d1c-baf9-67777bf925e4" (UID: "e85da6c4-a9a7-4d1c-baf9-67777bf925e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.986969 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.986999 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.987009 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.987019 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g64zr\" (UniqueName: \"kubernetes.io/projected/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-kube-api-access-g64zr\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.987028 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85da6c4-a9a7-4d1c-baf9-67777bf925e4-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:20 crc kubenswrapper[5129]: I0314 07:22:20.990244 5129 scope.go:117] "RemoveContainer" containerID="54c4e5e103da85eef14350d71cc80bf6d594ca2c6cf104e4dc39b976a4b5ff8e" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.005458 5129 scope.go:117] "RemoveContainer" containerID="e2021e39f00b32741f46431ba0263f5febc6ead8fc83d21ae5eeaeb05afe22b1" Mar 14 07:22:21 crc kubenswrapper[5129]: E0314 07:22:21.005828 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e2021e39f00b32741f46431ba0263f5febc6ead8fc83d21ae5eeaeb05afe22b1\": container with ID starting with e2021e39f00b32741f46431ba0263f5febc6ead8fc83d21ae5eeaeb05afe22b1 not found: ID does not exist" containerID="e2021e39f00b32741f46431ba0263f5febc6ead8fc83d21ae5eeaeb05afe22b1" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.005865 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2021e39f00b32741f46431ba0263f5febc6ead8fc83d21ae5eeaeb05afe22b1"} err="failed to get container status \"e2021e39f00b32741f46431ba0263f5febc6ead8fc83d21ae5eeaeb05afe22b1\": rpc error: code = NotFound desc = could not find container \"e2021e39f00b32741f46431ba0263f5febc6ead8fc83d21ae5eeaeb05afe22b1\": container with ID starting with e2021e39f00b32741f46431ba0263f5febc6ead8fc83d21ae5eeaeb05afe22b1 not found: ID does not exist" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.005892 5129 scope.go:117] "RemoveContainer" containerID="54c4e5e103da85eef14350d71cc80bf6d594ca2c6cf104e4dc39b976a4b5ff8e" Mar 14 07:22:21 crc kubenswrapper[5129]: E0314 07:22:21.006228 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54c4e5e103da85eef14350d71cc80bf6d594ca2c6cf104e4dc39b976a4b5ff8e\": container with ID starting with 54c4e5e103da85eef14350d71cc80bf6d594ca2c6cf104e4dc39b976a4b5ff8e not found: ID does not exist" containerID="54c4e5e103da85eef14350d71cc80bf6d594ca2c6cf104e4dc39b976a4b5ff8e" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.006319 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54c4e5e103da85eef14350d71cc80bf6d594ca2c6cf104e4dc39b976a4b5ff8e"} err="failed to get container status \"54c4e5e103da85eef14350d71cc80bf6d594ca2c6cf104e4dc39b976a4b5ff8e\": rpc error: code = NotFound desc = could not find container \"54c4e5e103da85eef14350d71cc80bf6d594ca2c6cf104e4dc39b976a4b5ff8e\": container with ID 
starting with 54c4e5e103da85eef14350d71cc80bf6d594ca2c6cf104e4dc39b976a4b5ff8e not found: ID does not exist" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.158936 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.250063 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.293035 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.303905 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.314508 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.323217 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:22:21 crc kubenswrapper[5129]: E0314 07:22:21.323689 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd8255e-0613-43d3-a721-eb5cde92ae4f" containerName="init" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.323713 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd8255e-0613-43d3-a721-eb5cde92ae4f" containerName="init" Mar 14 07:22:21 crc kubenswrapper[5129]: E0314 07:22:21.323728 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85da6c4-a9a7-4d1c-baf9-67777bf925e4" containerName="probe" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.323737 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85da6c4-a9a7-4d1c-baf9-67777bf925e4" containerName="probe" Mar 14 07:22:21 crc kubenswrapper[5129]: E0314 07:22:21.323752 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f8803f0b-1655-4721-a92e-8241f500d9a5" containerName="neutron-api" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.323761 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8803f0b-1655-4721-a92e-8241f500d9a5" containerName="neutron-api" Mar 14 07:22:21 crc kubenswrapper[5129]: E0314 07:22:21.323774 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85da6c4-a9a7-4d1c-baf9-67777bf925e4" containerName="cinder-scheduler" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.323781 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85da6c4-a9a7-4d1c-baf9-67777bf925e4" containerName="cinder-scheduler" Mar 14 07:22:21 crc kubenswrapper[5129]: E0314 07:22:21.323817 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd8255e-0613-43d3-a721-eb5cde92ae4f" containerName="dnsmasq-dns" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.323827 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd8255e-0613-43d3-a721-eb5cde92ae4f" containerName="dnsmasq-dns" Mar 14 07:22:21 crc kubenswrapper[5129]: E0314 07:22:21.323841 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8803f0b-1655-4721-a92e-8241f500d9a5" containerName="neutron-httpd" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.323850 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8803f0b-1655-4721-a92e-8241f500d9a5" containerName="neutron-httpd" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.324043 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85da6c4-a9a7-4d1c-baf9-67777bf925e4" containerName="probe" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.324059 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd8255e-0613-43d3-a721-eb5cde92ae4f" containerName="dnsmasq-dns" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.324078 5129 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f8803f0b-1655-4721-a92e-8241f500d9a5" containerName="neutron-api" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.324088 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8803f0b-1655-4721-a92e-8241f500d9a5" containerName="neutron-httpd" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.324109 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85da6c4-a9a7-4d1c-baf9-67777bf925e4" containerName="cinder-scheduler" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.325115 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.332273 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.346048 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.388376 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-586cb48554-pw8mx"] Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.388810 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-586cb48554-pw8mx" podUID="abf3a56d-8229-4cb2-8c84-b8f12e11753f" containerName="placement-log" containerID="cri-o://bfdba54752345d810fef18c5d01e38f5e42ccb3533ed5c1793e550d7c0d7b044" gracePeriod=30 Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.389013 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-586cb48554-pw8mx" podUID="abf3a56d-8229-4cb2-8c84-b8f12e11753f" containerName="placement-api" containerID="cri-o://89be6468868206b8ef75f2a751b7422d7bb17faca56f1a6498091a0de034e89c" gracePeriod=30 Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.397401 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-config-data\") pod \"cinder-scheduler-0\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.397728 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czcpz\" (UniqueName: \"kubernetes.io/projected/7132e8d4-728a-4852-bcd9-833a9bd05878-kube-api-access-czcpz\") pod \"cinder-scheduler-0\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.397759 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.397786 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.397850 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-scripts\") pod \"cinder-scheduler-0\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.397904 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7132e8d4-728a-4852-bcd9-833a9bd05878-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.499510 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7132e8d4-728a-4852-bcd9-833a9bd05878-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.499628 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-config-data\") pod \"cinder-scheduler-0\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.499673 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czcpz\" (UniqueName: \"kubernetes.io/projected/7132e8d4-728a-4852-bcd9-833a9bd05878-kube-api-access-czcpz\") pod \"cinder-scheduler-0\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.499690 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.499708 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.499747 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-scripts\") pod \"cinder-scheduler-0\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.500700 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7132e8d4-728a-4852-bcd9-833a9bd05878-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.507178 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-scripts\") pod \"cinder-scheduler-0\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.508318 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-config-data\") pod \"cinder-scheduler-0\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.511412 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.517234 5129 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.517529 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czcpz\" (UniqueName: \"kubernetes.io/projected/7132e8d4-728a-4852-bcd9-833a9bd05878-kube-api-access-czcpz\") pod \"cinder-scheduler-0\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") " pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.648737 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 07:22:21 crc kubenswrapper[5129]: I0314 07:22:21.999517 5129 generic.go:334] "Generic (PLEG): container finished" podID="abf3a56d-8229-4cb2-8c84-b8f12e11753f" containerID="bfdba54752345d810fef18c5d01e38f5e42ccb3533ed5c1793e550d7c0d7b044" exitCode=143 Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:21.999970 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586cb48554-pw8mx" event={"ID":"abf3a56d-8229-4cb2-8c84-b8f12e11753f","Type":"ContainerDied","Data":"bfdba54752345d810fef18c5d01e38f5e42ccb3533ed5c1793e550d7c0d7b044"} Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.047417 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e85da6c4-a9a7-4d1c-baf9-67777bf925e4" path="/var/lib/kubelet/pods/e85da6c4-a9a7-4d1c-baf9-67777bf925e4/volumes" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.092118 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.291243 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.292504 5129 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.298245 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-pxln6" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.298425 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.298546 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.308798 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.421169 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7f518e1a-89ff-4a92-8eca-81092a6c6dab-openstack-config-secret\") pod \"openstackclient\" (UID: \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.421223 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd6mx\" (UniqueName: \"kubernetes.io/projected/7f518e1a-89ff-4a92-8eca-81092a6c6dab-kube-api-access-sd6mx\") pod \"openstackclient\" (UID: \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.421262 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7f518e1a-89ff-4a92-8eca-81092a6c6dab-openstack-config\") pod \"openstackclient\" (UID: \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 
07:22:22.421474 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f518e1a-89ff-4a92-8eca-81092a6c6dab-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.522941 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7f518e1a-89ff-4a92-8eca-81092a6c6dab-openstack-config-secret\") pod \"openstackclient\" (UID: \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.523011 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd6mx\" (UniqueName: \"kubernetes.io/projected/7f518e1a-89ff-4a92-8eca-81092a6c6dab-kube-api-access-sd6mx\") pod \"openstackclient\" (UID: \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.523068 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7f518e1a-89ff-4a92-8eca-81092a6c6dab-openstack-config\") pod \"openstackclient\" (UID: \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.523111 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f518e1a-89ff-4a92-8eca-81092a6c6dab-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.524205 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/7f518e1a-89ff-4a92-8eca-81092a6c6dab-openstack-config\") pod \"openstackclient\" (UID: \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.526542 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7f518e1a-89ff-4a92-8eca-81092a6c6dab-openstack-config-secret\") pod \"openstackclient\" (UID: \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.528170 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f518e1a-89ff-4a92-8eca-81092a6c6dab-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.541506 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd6mx\" (UniqueName: \"kubernetes.io/projected/7f518e1a-89ff-4a92-8eca-81092a6c6dab-kube-api-access-sd6mx\") pod \"openstackclient\" (UID: \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.571081 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.571759 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.581852 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.610183 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.611834 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.625310 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 14 07:22:22 crc kubenswrapper[5129]: E0314 07:22:22.713904 5129 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 14 07:22:22 crc kubenswrapper[5129]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_7f518e1a-89ff-4a92-8eca-81092a6c6dab_0(4ebfd63dfb2913155adf0c2f306515c23f02076e4de998f8b4ec1e18a868c043): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4ebfd63dfb2913155adf0c2f306515c23f02076e4de998f8b4ec1e18a868c043" Netns:"/var/run/netns/85515984-c952-4281-b2f5-3b6de8cd52cc" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=4ebfd63dfb2913155adf0c2f306515c23f02076e4de998f8b4ec1e18a868c043;K8S_POD_UID=7f518e1a-89ff-4a92-8eca-81092a6c6dab" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/7f518e1a-89ff-4a92-8eca-81092a6c6dab]: expected pod UID "7f518e1a-89ff-4a92-8eca-81092a6c6dab" but got "6dcacc6d-2066-4bbf-a65c-8ff457d6235b" from Kube API Mar 14 07:22:22 crc kubenswrapper[5129]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 14 07:22:22 crc kubenswrapper[5129]: > Mar 14 07:22:22 crc kubenswrapper[5129]: E0314 07:22:22.714032 5129 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 14 07:22:22 crc kubenswrapper[5129]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_7f518e1a-89ff-4a92-8eca-81092a6c6dab_0(4ebfd63dfb2913155adf0c2f306515c23f02076e4de998f8b4ec1e18a868c043): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4ebfd63dfb2913155adf0c2f306515c23f02076e4de998f8b4ec1e18a868c043" Netns:"/var/run/netns/85515984-c952-4281-b2f5-3b6de8cd52cc" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=4ebfd63dfb2913155adf0c2f306515c23f02076e4de998f8b4ec1e18a868c043;K8S_POD_UID=7f518e1a-89ff-4a92-8eca-81092a6c6dab" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/7f518e1a-89ff-4a92-8eca-81092a6c6dab]: expected pod UID "7f518e1a-89ff-4a92-8eca-81092a6c6dab" but got "6dcacc6d-2066-4bbf-a65c-8ff457d6235b" from Kube API Mar 14 07:22:22 crc kubenswrapper[5129]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 14 07:22:22 crc kubenswrapper[5129]: > pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.726701 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.726753 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f4vx\" (UniqueName: \"kubernetes.io/projected/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-kube-api-access-2f4vx\") pod \"openstackclient\" (UID: \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.726810 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-openstack-config\") pod \"openstackclient\" (UID: \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.727033 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-openstack-config-secret\") pod \"openstackclient\" (UID: \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\") " 
pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.828678 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.828724 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f4vx\" (UniqueName: \"kubernetes.io/projected/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-kube-api-access-2f4vx\") pod \"openstackclient\" (UID: \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.828763 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-openstack-config\") pod \"openstackclient\" (UID: \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.828816 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-openstack-config-secret\") pod \"openstackclient\" (UID: \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.830311 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-openstack-config\") pod \"openstackclient\" (UID: \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.833577 5129 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-openstack-config-secret\") pod \"openstackclient\" (UID: \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.835054 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\") " pod="openstack/openstackclient" Mar 14 07:22:22 crc kubenswrapper[5129]: I0314 07:22:22.845524 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f4vx\" (UniqueName: \"kubernetes.io/projected/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-kube-api-access-2f4vx\") pod \"openstackclient\" (UID: \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\") " pod="openstack/openstackclient" Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.020955 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.021175 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7132e8d4-728a-4852-bcd9-833a9bd05878","Type":"ContainerStarted","Data":"76f24021b697ad5485f9f5594cd9b1485876631247e65a9f1fe131e045674477"} Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.021378 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7132e8d4-728a-4852-bcd9-833a9bd05878","Type":"ContainerStarted","Data":"0a63e977059e2bebfc4d29b936c8bfb628ac660dc2cfb4ee7f87653bb9143760"} Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.024684 5129 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="7f518e1a-89ff-4a92-8eca-81092a6c6dab" podUID="6dcacc6d-2066-4bbf-a65c-8ff457d6235b" Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.033463 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.059904 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.133247 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd6mx\" (UniqueName: \"kubernetes.io/projected/7f518e1a-89ff-4a92-8eca-81092a6c6dab-kube-api-access-sd6mx\") pod \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\" (UID: \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\") " Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.133479 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f518e1a-89ff-4a92-8eca-81092a6c6dab-combined-ca-bundle\") pod \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\" (UID: \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\") " Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.133587 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7f518e1a-89ff-4a92-8eca-81092a6c6dab-openstack-config\") pod \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\" (UID: \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\") " Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.133671 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7f518e1a-89ff-4a92-8eca-81092a6c6dab-openstack-config-secret\") pod \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\" (UID: \"7f518e1a-89ff-4a92-8eca-81092a6c6dab\") " Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.142970 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f518e1a-89ff-4a92-8eca-81092a6c6dab-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "7f518e1a-89ff-4a92-8eca-81092a6c6dab" (UID: "7f518e1a-89ff-4a92-8eca-81092a6c6dab"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.149587 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f518e1a-89ff-4a92-8eca-81092a6c6dab-kube-api-access-sd6mx" (OuterVolumeSpecName: "kube-api-access-sd6mx") pod "7f518e1a-89ff-4a92-8eca-81092a6c6dab" (UID: "7f518e1a-89ff-4a92-8eca-81092a6c6dab"). InnerVolumeSpecName "kube-api-access-sd6mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.149710 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f518e1a-89ff-4a92-8eca-81092a6c6dab-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "7f518e1a-89ff-4a92-8eca-81092a6c6dab" (UID: "7f518e1a-89ff-4a92-8eca-81092a6c6dab"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.150364 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f518e1a-89ff-4a92-8eca-81092a6c6dab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f518e1a-89ff-4a92-8eca-81092a6c6dab" (UID: "7f518e1a-89ff-4a92-8eca-81092a6c6dab"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.235693 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f518e1a-89ff-4a92-8eca-81092a6c6dab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.235722 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7f518e1a-89ff-4a92-8eca-81092a6c6dab-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.235731 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7f518e1a-89ff-4a92-8eca-81092a6c6dab-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.235743 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd6mx\" (UniqueName: \"kubernetes.io/projected/7f518e1a-89ff-4a92-8eca-81092a6c6dab-kube-api-access-sd6mx\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.546295 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 14 07:22:23 crc kubenswrapper[5129]: I0314 07:22:23.996690 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 14 07:22:24 crc kubenswrapper[5129]: I0314 07:22:24.058718 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f518e1a-89ff-4a92-8eca-81092a6c6dab" path="/var/lib/kubelet/pods/7f518e1a-89ff-4a92-8eca-81092a6c6dab/volumes" Mar 14 07:22:24 crc kubenswrapper[5129]: I0314 07:22:24.079238 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"6dcacc6d-2066-4bbf-a65c-8ff457d6235b","Type":"ContainerStarted","Data":"a8bc1aebe224746624bb9e89e519b99b1e2e86c69cecfc4c354ca5fbeb41dffa"} Mar 14 07:22:24 crc kubenswrapper[5129]: I0314 07:22:24.097508 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 14 07:22:24 crc kubenswrapper[5129]: I0314 07:22:24.098702 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7132e8d4-728a-4852-bcd9-833a9bd05878","Type":"ContainerStarted","Data":"0b249c015ec3a36e3def27e75975febe3abb0dc9b079bfbd50022f9693560988"} Mar 14 07:22:24 crc kubenswrapper[5129]: I0314 07:22:24.121968 5129 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="7f518e1a-89ff-4a92-8eca-81092a6c6dab" podUID="6dcacc6d-2066-4bbf-a65c-8ff457d6235b" Mar 14 07:22:24 crc kubenswrapper[5129]: I0314 07:22:24.124038 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.124022545 podStartE2EDuration="3.124022545s" podCreationTimestamp="2026-03-14 07:22:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:24.117181771 +0000 UTC m=+1406.869096955" watchObservedRunningTime="2026-03-14 07:22:24.124022545 +0000 UTC m=+1406.875937729" Mar 14 07:22:24 crc kubenswrapper[5129]: I0314 07:22:24.953578 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.069566 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-internal-tls-certs\") pod \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.069740 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-combined-ca-bundle\") pod \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.069766 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-config-data\") pod \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.069791 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abf3a56d-8229-4cb2-8c84-b8f12e11753f-logs\") pod \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.069816 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbvgz\" (UniqueName: \"kubernetes.io/projected/abf3a56d-8229-4cb2-8c84-b8f12e11753f-kube-api-access-dbvgz\") pod \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.069860 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-scripts\") pod \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.069935 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-public-tls-certs\") pod \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\" (UID: \"abf3a56d-8229-4cb2-8c84-b8f12e11753f\") " Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.070321 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abf3a56d-8229-4cb2-8c84-b8f12e11753f-logs" (OuterVolumeSpecName: "logs") pod "abf3a56d-8229-4cb2-8c84-b8f12e11753f" (UID: "abf3a56d-8229-4cb2-8c84-b8f12e11753f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.084840 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-scripts" (OuterVolumeSpecName: "scripts") pod "abf3a56d-8229-4cb2-8c84-b8f12e11753f" (UID: "abf3a56d-8229-4cb2-8c84-b8f12e11753f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.094747 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abf3a56d-8229-4cb2-8c84-b8f12e11753f-kube-api-access-dbvgz" (OuterVolumeSpecName: "kube-api-access-dbvgz") pod "abf3a56d-8229-4cb2-8c84-b8f12e11753f" (UID: "abf3a56d-8229-4cb2-8c84-b8f12e11753f"). InnerVolumeSpecName "kube-api-access-dbvgz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.124885 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-config-data" (OuterVolumeSpecName: "config-data") pod "abf3a56d-8229-4cb2-8c84-b8f12e11753f" (UID: "abf3a56d-8229-4cb2-8c84-b8f12e11753f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.130350 5129 generic.go:334] "Generic (PLEG): container finished" podID="abf3a56d-8229-4cb2-8c84-b8f12e11753f" containerID="89be6468868206b8ef75f2a751b7422d7bb17faca56f1a6498091a0de034e89c" exitCode=0 Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.130685 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586cb48554-pw8mx" event={"ID":"abf3a56d-8229-4cb2-8c84-b8f12e11753f","Type":"ContainerDied","Data":"89be6468868206b8ef75f2a751b7422d7bb17faca56f1a6498091a0de034e89c"} Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.130743 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586cb48554-pw8mx" event={"ID":"abf3a56d-8229-4cb2-8c84-b8f12e11753f","Type":"ContainerDied","Data":"2052816c5c5effe5ed1b04b1c5af4261524c0a9b579bd45cef3f326de28bd443"} Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.130763 5129 scope.go:117] "RemoveContainer" containerID="89be6468868206b8ef75f2a751b7422d7bb17faca56f1a6498091a0de034e89c" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.130768 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-586cb48554-pw8mx" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.172132 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.173561 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.173685 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abf3a56d-8229-4cb2-8c84-b8f12e11753f-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.173711 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbvgz\" (UniqueName: \"kubernetes.io/projected/abf3a56d-8229-4cb2-8c84-b8f12e11753f-kube-api-access-dbvgz\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.176870 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abf3a56d-8229-4cb2-8c84-b8f12e11753f" (UID: "abf3a56d-8229-4cb2-8c84-b8f12e11753f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.190113 5129 scope.go:117] "RemoveContainer" containerID="bfdba54752345d810fef18c5d01e38f5e42ccb3533ed5c1793e550d7c0d7b044" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.194248 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "abf3a56d-8229-4cb2-8c84-b8f12e11753f" (UID: "abf3a56d-8229-4cb2-8c84-b8f12e11753f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.194136 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "abf3a56d-8229-4cb2-8c84-b8f12e11753f" (UID: "abf3a56d-8229-4cb2-8c84-b8f12e11753f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.207092 5129 scope.go:117] "RemoveContainer" containerID="89be6468868206b8ef75f2a751b7422d7bb17faca56f1a6498091a0de034e89c" Mar 14 07:22:25 crc kubenswrapper[5129]: E0314 07:22:25.207579 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89be6468868206b8ef75f2a751b7422d7bb17faca56f1a6498091a0de034e89c\": container with ID starting with 89be6468868206b8ef75f2a751b7422d7bb17faca56f1a6498091a0de034e89c not found: ID does not exist" containerID="89be6468868206b8ef75f2a751b7422d7bb17faca56f1a6498091a0de034e89c" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.207617 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89be6468868206b8ef75f2a751b7422d7bb17faca56f1a6498091a0de034e89c"} err="failed to get container status \"89be6468868206b8ef75f2a751b7422d7bb17faca56f1a6498091a0de034e89c\": rpc error: code = NotFound desc = could not find container \"89be6468868206b8ef75f2a751b7422d7bb17faca56f1a6498091a0de034e89c\": container with ID starting with 89be6468868206b8ef75f2a751b7422d7bb17faca56f1a6498091a0de034e89c not found: ID does not exist" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.207639 5129 scope.go:117] "RemoveContainer" containerID="bfdba54752345d810fef18c5d01e38f5e42ccb3533ed5c1793e550d7c0d7b044" Mar 14 07:22:25 crc kubenswrapper[5129]: E0314 07:22:25.207989 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfdba54752345d810fef18c5d01e38f5e42ccb3533ed5c1793e550d7c0d7b044\": container with ID starting with bfdba54752345d810fef18c5d01e38f5e42ccb3533ed5c1793e550d7c0d7b044 not found: ID does not exist" containerID="bfdba54752345d810fef18c5d01e38f5e42ccb3533ed5c1793e550d7c0d7b044" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.208036 
5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfdba54752345d810fef18c5d01e38f5e42ccb3533ed5c1793e550d7c0d7b044"} err="failed to get container status \"bfdba54752345d810fef18c5d01e38f5e42ccb3533ed5c1793e550d7c0d7b044\": rpc error: code = NotFound desc = could not find container \"bfdba54752345d810fef18c5d01e38f5e42ccb3533ed5c1793e550d7c0d7b044\": container with ID starting with bfdba54752345d810fef18c5d01e38f5e42ccb3533ed5c1793e550d7c0d7b044 not found: ID does not exist" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.280963 5129 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.280995 5129 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.281026 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf3a56d-8229-4cb2-8c84-b8f12e11753f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.461050 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-586cb48554-pw8mx"] Mar 14 07:22:25 crc kubenswrapper[5129]: I0314 07:22:25.471802 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-586cb48554-pw8mx"] Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.047345 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abf3a56d-8229-4cb2-8c84-b8f12e11753f" path="/var/lib/kubelet/pods/abf3a56d-8229-4cb2-8c84-b8f12e11753f/volumes" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.523136 5129 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7957bb5589-vf68m"] Mar 14 07:22:26 crc kubenswrapper[5129]: E0314 07:22:26.523781 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf3a56d-8229-4cb2-8c84-b8f12e11753f" containerName="placement-log" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.523798 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf3a56d-8229-4cb2-8c84-b8f12e11753f" containerName="placement-log" Mar 14 07:22:26 crc kubenswrapper[5129]: E0314 07:22:26.523824 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf3a56d-8229-4cb2-8c84-b8f12e11753f" containerName="placement-api" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.523830 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf3a56d-8229-4cb2-8c84-b8f12e11753f" containerName="placement-api" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.523982 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf3a56d-8229-4cb2-8c84-b8f12e11753f" containerName="placement-log" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.524012 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf3a56d-8229-4cb2-8c84-b8f12e11753f" containerName="placement-api" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.524840 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.527194 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.527422 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.537111 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.569290 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7957bb5589-vf68m"] Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.602089 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blm7t\" (UniqueName: \"kubernetes.io/projected/26087e66-e9e8-451d-80b4-d288468202f1-kube-api-access-blm7t\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.602145 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/26087e66-e9e8-451d-80b4-d288468202f1-etc-swift\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.602174 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26087e66-e9e8-451d-80b4-d288468202f1-run-httpd\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: 
I0314 07:22:26.602218 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26087e66-e9e8-451d-80b4-d288468202f1-log-httpd\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.602240 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-combined-ca-bundle\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.602263 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-public-tls-certs\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.602293 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-config-data\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.602336 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-internal-tls-certs\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 
07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.649872 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.704622 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26087e66-e9e8-451d-80b4-d288468202f1-log-httpd\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.704666 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-combined-ca-bundle\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.704691 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-public-tls-certs\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.704737 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-config-data\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.704796 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-internal-tls-certs\") pod 
\"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.704852 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blm7t\" (UniqueName: \"kubernetes.io/projected/26087e66-e9e8-451d-80b4-d288468202f1-kube-api-access-blm7t\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.704880 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/26087e66-e9e8-451d-80b4-d288468202f1-etc-swift\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.705001 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26087e66-e9e8-451d-80b4-d288468202f1-run-httpd\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.705780 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26087e66-e9e8-451d-80b4-d288468202f1-run-httpd\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.705809 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26087e66-e9e8-451d-80b4-d288468202f1-log-httpd\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " 
pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.708752 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-config-data\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.709099 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/26087e66-e9e8-451d-80b4-d288468202f1-etc-swift\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.710214 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-combined-ca-bundle\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.721290 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-internal-tls-certs\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.722054 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-public-tls-certs\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 
07:22:26.738584 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blm7t\" (UniqueName: \"kubernetes.io/projected/26087e66-e9e8-451d-80b4-d288468202f1-kube-api-access-blm7t\") pod \"swift-proxy-7957bb5589-vf68m\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") " pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:26 crc kubenswrapper[5129]: I0314 07:22:26.844171 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:27 crc kubenswrapper[5129]: I0314 07:22:27.355049 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7957bb5589-vf68m"] Mar 14 07:22:28 crc kubenswrapper[5129]: I0314 07:22:28.163393 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7957bb5589-vf68m" event={"ID":"26087e66-e9e8-451d-80b4-d288468202f1","Type":"ContainerStarted","Data":"827ff25380bea7a7b669c68f4f4faa1199ffe0abbccc374df9dfa9bf6a471dee"} Mar 14 07:22:28 crc kubenswrapper[5129]: I0314 07:22:28.164010 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7957bb5589-vf68m" event={"ID":"26087e66-e9e8-451d-80b4-d288468202f1","Type":"ContainerStarted","Data":"94ccaa2244dfc8d149d667d1f5c396aa3ecfa3171455d755f9e5a589e59121ff"} Mar 14 07:22:28 crc kubenswrapper[5129]: I0314 07:22:28.164032 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7957bb5589-vf68m" event={"ID":"26087e66-e9e8-451d-80b4-d288468202f1","Type":"ContainerStarted","Data":"c76099919cebb4d60d6551c9ef741a06bfa26a5b6753af84e79bff21e1b74618"} Mar 14 07:22:28 crc kubenswrapper[5129]: I0314 07:22:28.164078 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:28 crc kubenswrapper[5129]: I0314 07:22:28.164101 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 
14 07:22:28 crc kubenswrapper[5129]: I0314 07:22:28.191274 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7957bb5589-vf68m" podStartSLOduration=2.191249257 podStartE2EDuration="2.191249257s" podCreationTimestamp="2026-03-14 07:22:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:28.182469992 +0000 UTC m=+1410.934385176" watchObservedRunningTime="2026-03-14 07:22:28.191249257 +0000 UTC m=+1410.943164441" Mar 14 07:22:28 crc kubenswrapper[5129]: I0314 07:22:28.351169 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:28 crc kubenswrapper[5129]: I0314 07:22:28.351465 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerName="ceilometer-central-agent" containerID="cri-o://026ce796c3ea7a332fd8a306f8d3aea9d566cc065d6db6597b9a30f2fb8388af" gracePeriod=30 Mar 14 07:22:28 crc kubenswrapper[5129]: I0314 07:22:28.354041 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerName="proxy-httpd" containerID="cri-o://577a4dc0999e91f6f331ec853bc9022d5b7360d235462a8dafe1fed6c0003712" gracePeriod=30 Mar 14 07:22:28 crc kubenswrapper[5129]: I0314 07:22:28.354174 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerName="sg-core" containerID="cri-o://04b8fd8314078510468789115b21ff897749d3e2ae3d588707c6fa3c2db3e992" gracePeriod=30 Mar 14 07:22:28 crc kubenswrapper[5129]: I0314 07:22:28.354211 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" 
containerName="ceilometer-notification-agent" containerID="cri-o://3c7bc269d0a30369279ed545b8ed2a866536a55d65895da7616a09aee3c3f0fb" gracePeriod=30 Mar 14 07:22:28 crc kubenswrapper[5129]: I0314 07:22:28.365746 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.172:3000/\": EOF" Mar 14 07:22:29 crc kubenswrapper[5129]: I0314 07:22:29.180717 5129 generic.go:334] "Generic (PLEG): container finished" podID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerID="577a4dc0999e91f6f331ec853bc9022d5b7360d235462a8dafe1fed6c0003712" exitCode=0 Mar 14 07:22:29 crc kubenswrapper[5129]: I0314 07:22:29.180758 5129 generic.go:334] "Generic (PLEG): container finished" podID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerID="04b8fd8314078510468789115b21ff897749d3e2ae3d588707c6fa3c2db3e992" exitCode=2 Mar 14 07:22:29 crc kubenswrapper[5129]: I0314 07:22:29.180767 5129 generic.go:334] "Generic (PLEG): container finished" podID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerID="3c7bc269d0a30369279ed545b8ed2a866536a55d65895da7616a09aee3c3f0fb" exitCode=0 Mar 14 07:22:29 crc kubenswrapper[5129]: I0314 07:22:29.180773 5129 generic.go:334] "Generic (PLEG): container finished" podID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerID="026ce796c3ea7a332fd8a306f8d3aea9d566cc065d6db6597b9a30f2fb8388af" exitCode=0 Mar 14 07:22:29 crc kubenswrapper[5129]: I0314 07:22:29.180816 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc","Type":"ContainerDied","Data":"577a4dc0999e91f6f331ec853bc9022d5b7360d235462a8dafe1fed6c0003712"} Mar 14 07:22:29 crc kubenswrapper[5129]: I0314 07:22:29.180872 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc","Type":"ContainerDied","Data":"04b8fd8314078510468789115b21ff897749d3e2ae3d588707c6fa3c2db3e992"} Mar 14 07:22:29 crc kubenswrapper[5129]: I0314 07:22:29.180882 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc","Type":"ContainerDied","Data":"3c7bc269d0a30369279ed545b8ed2a866536a55d65895da7616a09aee3c3f0fb"} Mar 14 07:22:29 crc kubenswrapper[5129]: I0314 07:22:29.180891 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc","Type":"ContainerDied","Data":"026ce796c3ea7a332fd8a306f8d3aea9d566cc065d6db6597b9a30f2fb8388af"} Mar 14 07:22:31 crc kubenswrapper[5129]: I0314 07:22:31.867918 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.100893 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-8mzrf"] Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.102357 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-8mzrf" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.110514 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-8mzrf"] Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.190653 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-dwf9c"] Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.196872 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-dwf9c" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.216545 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-dwf9c"] Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.225458 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0796-account-create-update-nndld"] Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.226877 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0796-account-create-update-nndld" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.230829 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.234213 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0796-account-create-update-nndld"] Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.246512 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1acddb0d-6c0b-420c-9df5-f1b89d56b21e-operator-scripts\") pod \"nova-api-db-create-8mzrf\" (UID: \"1acddb0d-6c0b-420c-9df5-f1b89d56b21e\") " pod="openstack/nova-api-db-create-8mzrf" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.246547 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkpwj\" (UniqueName: \"kubernetes.io/projected/c7a67053-ce35-4d1a-bdee-7d89e882b4b1-kube-api-access-qkpwj\") pod \"nova-cell0-db-create-dwf9c\" (UID: \"c7a67053-ce35-4d1a-bdee-7d89e882b4b1\") " pod="openstack/nova-cell0-db-create-dwf9c" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.246623 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmj8w\" (UniqueName: 
\"kubernetes.io/projected/1acddb0d-6c0b-420c-9df5-f1b89d56b21e-kube-api-access-dmj8w\") pod \"nova-api-db-create-8mzrf\" (UID: \"1acddb0d-6c0b-420c-9df5-f1b89d56b21e\") " pod="openstack/nova-api-db-create-8mzrf" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.246702 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a67053-ce35-4d1a-bdee-7d89e882b4b1-operator-scripts\") pod \"nova-cell0-db-create-dwf9c\" (UID: \"c7a67053-ce35-4d1a-bdee-7d89e882b4b1\") " pod="openstack/nova-cell0-db-create-dwf9c" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.348169 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmj8w\" (UniqueName: \"kubernetes.io/projected/1acddb0d-6c0b-420c-9df5-f1b89d56b21e-kube-api-access-dmj8w\") pod \"nova-api-db-create-8mzrf\" (UID: \"1acddb0d-6c0b-420c-9df5-f1b89d56b21e\") " pod="openstack/nova-api-db-create-8mzrf" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.348489 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2hc8\" (UniqueName: \"kubernetes.io/projected/d6e40311-e1d7-4a06-895e-9681160e38da-kube-api-access-q2hc8\") pod \"nova-api-0796-account-create-update-nndld\" (UID: \"d6e40311-e1d7-4a06-895e-9681160e38da\") " pod="openstack/nova-api-0796-account-create-update-nndld" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.348514 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a67053-ce35-4d1a-bdee-7d89e882b4b1-operator-scripts\") pod \"nova-cell0-db-create-dwf9c\" (UID: \"c7a67053-ce35-4d1a-bdee-7d89e882b4b1\") " pod="openstack/nova-cell0-db-create-dwf9c" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.348553 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6e40311-e1d7-4a06-895e-9681160e38da-operator-scripts\") pod \"nova-api-0796-account-create-update-nndld\" (UID: \"d6e40311-e1d7-4a06-895e-9681160e38da\") " pod="openstack/nova-api-0796-account-create-update-nndld" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.348588 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1acddb0d-6c0b-420c-9df5-f1b89d56b21e-operator-scripts\") pod \"nova-api-db-create-8mzrf\" (UID: \"1acddb0d-6c0b-420c-9df5-f1b89d56b21e\") " pod="openstack/nova-api-db-create-8mzrf" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.348618 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkpwj\" (UniqueName: \"kubernetes.io/projected/c7a67053-ce35-4d1a-bdee-7d89e882b4b1-kube-api-access-qkpwj\") pod \"nova-cell0-db-create-dwf9c\" (UID: \"c7a67053-ce35-4d1a-bdee-7d89e882b4b1\") " pod="openstack/nova-cell0-db-create-dwf9c" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.349617 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a67053-ce35-4d1a-bdee-7d89e882b4b1-operator-scripts\") pod \"nova-cell0-db-create-dwf9c\" (UID: \"c7a67053-ce35-4d1a-bdee-7d89e882b4b1\") " pod="openstack/nova-cell0-db-create-dwf9c" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.349900 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1acddb0d-6c0b-420c-9df5-f1b89d56b21e-operator-scripts\") pod \"nova-api-db-create-8mzrf\" (UID: \"1acddb0d-6c0b-420c-9df5-f1b89d56b21e\") " pod="openstack/nova-api-db-create-8mzrf" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.374160 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmj8w\" 
(UniqueName: \"kubernetes.io/projected/1acddb0d-6c0b-420c-9df5-f1b89d56b21e-kube-api-access-dmj8w\") pod \"nova-api-db-create-8mzrf\" (UID: \"1acddb0d-6c0b-420c-9df5-f1b89d56b21e\") " pod="openstack/nova-api-db-create-8mzrf" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.383797 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-dsnv9"] Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.392651 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dsnv9" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.386082 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkpwj\" (UniqueName: \"kubernetes.io/projected/c7a67053-ce35-4d1a-bdee-7d89e882b4b1-kube-api-access-qkpwj\") pod \"nova-cell0-db-create-dwf9c\" (UID: \"c7a67053-ce35-4d1a-bdee-7d89e882b4b1\") " pod="openstack/nova-cell0-db-create-dwf9c" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.416736 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dsnv9"] Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.427759 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-8mzrf" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.435537 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-3b40-account-create-update-c46zq"] Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.440377 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3b40-account-create-update-c46zq" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.446196 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.450071 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq5gg\" (UniqueName: \"kubernetes.io/projected/937d0a55-2e9a-471b-b4d6-50cc8c4ddd33-kube-api-access-nq5gg\") pod \"nova-cell1-db-create-dsnv9\" (UID: \"937d0a55-2e9a-471b-b4d6-50cc8c4ddd33\") " pod="openstack/nova-cell1-db-create-dsnv9" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.450151 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/937d0a55-2e9a-471b-b4d6-50cc8c4ddd33-operator-scripts\") pod \"nova-cell1-db-create-dsnv9\" (UID: \"937d0a55-2e9a-471b-b4d6-50cc8c4ddd33\") " pod="openstack/nova-cell1-db-create-dsnv9" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.450182 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2hc8\" (UniqueName: \"kubernetes.io/projected/d6e40311-e1d7-4a06-895e-9681160e38da-kube-api-access-q2hc8\") pod \"nova-api-0796-account-create-update-nndld\" (UID: \"d6e40311-e1d7-4a06-895e-9681160e38da\") " pod="openstack/nova-api-0796-account-create-update-nndld" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.450238 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6e40311-e1d7-4a06-895e-9681160e38da-operator-scripts\") pod \"nova-api-0796-account-create-update-nndld\" (UID: \"d6e40311-e1d7-4a06-895e-9681160e38da\") " pod="openstack/nova-api-0796-account-create-update-nndld" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.451073 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6e40311-e1d7-4a06-895e-9681160e38da-operator-scripts\") pod \"nova-api-0796-account-create-update-nndld\" (UID: \"d6e40311-e1d7-4a06-895e-9681160e38da\") " pod="openstack/nova-api-0796-account-create-update-nndld" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.455151 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3b40-account-create-update-c46zq"] Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.472047 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2hc8\" (UniqueName: \"kubernetes.io/projected/d6e40311-e1d7-4a06-895e-9681160e38da-kube-api-access-q2hc8\") pod \"nova-api-0796-account-create-update-nndld\" (UID: \"d6e40311-e1d7-4a06-895e-9681160e38da\") " pod="openstack/nova-api-0796-account-create-update-nndld" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.553690 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/937d0a55-2e9a-471b-b4d6-50cc8c4ddd33-operator-scripts\") pod \"nova-cell1-db-create-dsnv9\" (UID: \"937d0a55-2e9a-471b-b4d6-50cc8c4ddd33\") " pod="openstack/nova-cell1-db-create-dsnv9" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.554023 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4c1ed9-1abf-442c-a30c-92249cfd9fe4-operator-scripts\") pod \"nova-cell0-3b40-account-create-update-c46zq\" (UID: \"2b4c1ed9-1abf-442c-a30c-92249cfd9fe4\") " pod="openstack/nova-cell0-3b40-account-create-update-c46zq" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.554055 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq5gg\" (UniqueName: 
\"kubernetes.io/projected/937d0a55-2e9a-471b-b4d6-50cc8c4ddd33-kube-api-access-nq5gg\") pod \"nova-cell1-db-create-dsnv9\" (UID: \"937d0a55-2e9a-471b-b4d6-50cc8c4ddd33\") " pod="openstack/nova-cell1-db-create-dsnv9" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.554075 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5hbr\" (UniqueName: \"kubernetes.io/projected/2b4c1ed9-1abf-442c-a30c-92249cfd9fe4-kube-api-access-c5hbr\") pod \"nova-cell0-3b40-account-create-update-c46zq\" (UID: \"2b4c1ed9-1abf-442c-a30c-92249cfd9fe4\") " pod="openstack/nova-cell0-3b40-account-create-update-c46zq" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.554735 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/937d0a55-2e9a-471b-b4d6-50cc8c4ddd33-operator-scripts\") pod \"nova-cell1-db-create-dsnv9\" (UID: \"937d0a55-2e9a-471b-b4d6-50cc8c4ddd33\") " pod="openstack/nova-cell1-db-create-dsnv9" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.582117 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq5gg\" (UniqueName: \"kubernetes.io/projected/937d0a55-2e9a-471b-b4d6-50cc8c4ddd33-kube-api-access-nq5gg\") pod \"nova-cell1-db-create-dsnv9\" (UID: \"937d0a55-2e9a-471b-b4d6-50cc8c4ddd33\") " pod="openstack/nova-cell1-db-create-dsnv9" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.589178 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dwf9c" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.600002 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0796-account-create-update-nndld" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.604069 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-168c-account-create-update-dbtdl"] Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.605297 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-168c-account-create-update-dbtdl" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.609503 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.611722 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-168c-account-create-update-dbtdl"] Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.632083 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.655436 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4c1ed9-1abf-442c-a30c-92249cfd9fe4-operator-scripts\") pod \"nova-cell0-3b40-account-create-update-c46zq\" (UID: \"2b4c1ed9-1abf-442c-a30c-92249cfd9fe4\") " pod="openstack/nova-cell0-3b40-account-create-update-c46zq" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.655818 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5hbr\" (UniqueName: \"kubernetes.io/projected/2b4c1ed9-1abf-442c-a30c-92249cfd9fe4-kube-api-access-c5hbr\") pod \"nova-cell0-3b40-account-create-update-c46zq\" (UID: \"2b4c1ed9-1abf-442c-a30c-92249cfd9fe4\") " pod="openstack/nova-cell0-3b40-account-create-update-c46zq" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.655883 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81a20ac3-5616-4b0b-9fd3-09ca4d863c24-operator-scripts\") pod \"nova-cell1-168c-account-create-update-dbtdl\" (UID: \"81a20ac3-5616-4b0b-9fd3-09ca4d863c24\") " pod="openstack/nova-cell1-168c-account-create-update-dbtdl" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.656069 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xmhn\" (UniqueName: \"kubernetes.io/projected/81a20ac3-5616-4b0b-9fd3-09ca4d863c24-kube-api-access-8xmhn\") pod \"nova-cell1-168c-account-create-update-dbtdl\" (UID: \"81a20ac3-5616-4b0b-9fd3-09ca4d863c24\") " pod="openstack/nova-cell1-168c-account-create-update-dbtdl" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.656923 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4c1ed9-1abf-442c-a30c-92249cfd9fe4-operator-scripts\") pod \"nova-cell0-3b40-account-create-update-c46zq\" (UID: \"2b4c1ed9-1abf-442c-a30c-92249cfd9fe4\") " pod="openstack/nova-cell0-3b40-account-create-update-c46zq" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.676298 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5hbr\" (UniqueName: \"kubernetes.io/projected/2b4c1ed9-1abf-442c-a30c-92249cfd9fe4-kube-api-access-c5hbr\") pod \"nova-cell0-3b40-account-create-update-c46zq\" (UID: \"2b4c1ed9-1abf-442c-a30c-92249cfd9fe4\") " pod="openstack/nova-cell0-3b40-account-create-update-c46zq" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.759323 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7zzj\" (UniqueName: \"kubernetes.io/projected/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-kube-api-access-h7zzj\") pod \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 
07:22:33.759513 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-log-httpd\") pod \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.759670 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-run-httpd\") pod \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.759728 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-scripts\") pod \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.759822 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-combined-ca-bundle\") pod \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.759844 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-sg-core-conf-yaml\") pod \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\" (UID: \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.759879 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-config-data\") pod \"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\" (UID: 
\"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc\") " Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.759968 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" (UID: "16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.760637 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81a20ac3-5616-4b0b-9fd3-09ca4d863c24-operator-scripts\") pod \"nova-cell1-168c-account-create-update-dbtdl\" (UID: \"81a20ac3-5616-4b0b-9fd3-09ca4d863c24\") " pod="openstack/nova-cell1-168c-account-create-update-dbtdl" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.760696 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" (UID: "16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.760757 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xmhn\" (UniqueName: \"kubernetes.io/projected/81a20ac3-5616-4b0b-9fd3-09ca4d863c24-kube-api-access-8xmhn\") pod \"nova-cell1-168c-account-create-update-dbtdl\" (UID: \"81a20ac3-5616-4b0b-9fd3-09ca4d863c24\") " pod="openstack/nova-cell1-168c-account-create-update-dbtdl" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.760854 5129 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.760869 5129 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.761495 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81a20ac3-5616-4b0b-9fd3-09ca4d863c24-operator-scripts\") pod \"nova-cell1-168c-account-create-update-dbtdl\" (UID: \"81a20ac3-5616-4b0b-9fd3-09ca4d863c24\") " pod="openstack/nova-cell1-168c-account-create-update-dbtdl" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.764449 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-scripts" (OuterVolumeSpecName: "scripts") pod "16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" (UID: "16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.767191 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-kube-api-access-h7zzj" (OuterVolumeSpecName: "kube-api-access-h7zzj") pod "16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" (UID: "16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc"). InnerVolumeSpecName "kube-api-access-h7zzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.781665 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xmhn\" (UniqueName: \"kubernetes.io/projected/81a20ac3-5616-4b0b-9fd3-09ca4d863c24-kube-api-access-8xmhn\") pod \"nova-cell1-168c-account-create-update-dbtdl\" (UID: \"81a20ac3-5616-4b0b-9fd3-09ca4d863c24\") " pod="openstack/nova-cell1-168c-account-create-update-dbtdl" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.796425 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" (UID: "16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.810099 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dsnv9" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.836998 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" (UID: "16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.840041 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3b40-account-create-update-c46zq" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.868179 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.868207 5129 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.868216 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7zzj\" (UniqueName: \"kubernetes.io/projected/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-kube-api-access-h7zzj\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.868226 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.890829 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-config-data" (OuterVolumeSpecName: "config-data") pod "16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" (UID: "16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.941287 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-8mzrf"] Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.941681 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-168c-account-create-update-dbtdl" Mar 14 07:22:33 crc kubenswrapper[5129]: W0314 07:22:33.964505 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1acddb0d_6c0b_420c_9df5_f1b89d56b21e.slice/crio-08e0c5f21048978b0d88127efe758a3b5adfde32aca9d89be30ceae95c8dd119 WatchSource:0}: Error finding container 08e0c5f21048978b0d88127efe758a3b5adfde32aca9d89be30ceae95c8dd119: Status 404 returned error can't find the container with id 08e0c5f21048978b0d88127efe758a3b5adfde32aca9d89be30ceae95c8dd119 Mar 14 07:22:33 crc kubenswrapper[5129]: I0314 07:22:33.969516 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.180250 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-dwf9c"] Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.200386 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0796-account-create-update-nndld"] Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.256279 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dwf9c" event={"ID":"c7a67053-ce35-4d1a-bdee-7d89e882b4b1","Type":"ContainerStarted","Data":"f6bb68ce140788dc51ca504ffa291de3291210541939b493a08f0e95280d08c5"} Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.262247 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstackclient" event={"ID":"6dcacc6d-2066-4bbf-a65c-8ff457d6235b","Type":"ContainerStarted","Data":"8d3e839655b386c621e99ce51cd89ad667b865776cb2c358ac6411cb869d4ee6"} Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.270185 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc","Type":"ContainerDied","Data":"1fab7382c4421818f90ce9aac8701bbf824a7472b1f09c761bfad83b2a19d54e"} Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.270239 5129 scope.go:117] "RemoveContainer" containerID="577a4dc0999e91f6f331ec853bc9022d5b7360d235462a8dafe1fed6c0003712" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.270411 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.275261 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0796-account-create-update-nndld" event={"ID":"d6e40311-e1d7-4a06-895e-9681160e38da","Type":"ContainerStarted","Data":"fe84c73a8aeb04b8751e9c96721e4a95a2a9e62eecefc3c2192f7c8f0ec2ea14"} Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.282085 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8mzrf" event={"ID":"1acddb0d-6c0b-420c-9df5-f1b89d56b21e","Type":"ContainerStarted","Data":"ca5d78ebc9e5c0ead9b825452ce86e555b271077f92d69b6cf145c6d03e2b989"} Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.282568 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8mzrf" event={"ID":"1acddb0d-6c0b-420c-9df5-f1b89d56b21e","Type":"ContainerStarted","Data":"08e0c5f21048978b0d88127efe758a3b5adfde32aca9d89be30ceae95c8dd119"} Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.287713 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.551626336 
podStartE2EDuration="12.287672813s" podCreationTimestamp="2026-03-14 07:22:22 +0000 UTC" firstStartedPulling="2026-03-14 07:22:23.552662328 +0000 UTC m=+1406.304577512" lastFinishedPulling="2026-03-14 07:22:33.288708815 +0000 UTC m=+1416.040623989" observedRunningTime="2026-03-14 07:22:34.279801813 +0000 UTC m=+1417.031717007" watchObservedRunningTime="2026-03-14 07:22:34.287672813 +0000 UTC m=+1417.039587997" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.314773 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-8mzrf" podStartSLOduration=1.314747867 podStartE2EDuration="1.314747867s" podCreationTimestamp="2026-03-14 07:22:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:34.302085049 +0000 UTC m=+1417.054000233" watchObservedRunningTime="2026-03-14 07:22:34.314747867 +0000 UTC m=+1417.066663051" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.339371 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dsnv9"] Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.423369 5129 scope.go:117] "RemoveContainer" containerID="04b8fd8314078510468789115b21ff897749d3e2ae3d588707c6fa3c2db3e992" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.446264 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.457421 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.471971 5129 scope.go:117] "RemoveContainer" containerID="3c7bc269d0a30369279ed545b8ed2a866536a55d65895da7616a09aee3c3f0fb" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.474086 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:34 crc kubenswrapper[5129]: E0314 
07:22:34.474517 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerName="ceilometer-notification-agent" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.474533 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerName="ceilometer-notification-agent" Mar 14 07:22:34 crc kubenswrapper[5129]: E0314 07:22:34.474546 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerName="proxy-httpd" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.474551 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerName="proxy-httpd" Mar 14 07:22:34 crc kubenswrapper[5129]: E0314 07:22:34.474562 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerName="sg-core" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.474568 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerName="sg-core" Mar 14 07:22:34 crc kubenswrapper[5129]: E0314 07:22:34.474613 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerName="ceilometer-central-agent" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.474619 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerName="ceilometer-central-agent" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.474779 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerName="ceilometer-central-agent" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.474794 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerName="sg-core" Mar 14 07:22:34 crc 
kubenswrapper[5129]: I0314 07:22:34.474808 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerName="ceilometer-notification-agent" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.474819 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" containerName="proxy-httpd" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.478340 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.480815 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.481002 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.501663 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3b40-account-create-update-c46zq"] Mar 14 07:22:34 crc kubenswrapper[5129]: W0314 07:22:34.515135 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b4c1ed9_1abf_442c_a30c_92249cfd9fe4.slice/crio-e41c0666d107da427cd6b16c31137dcfdbc9535b09982b6859242c16325ddc65 WatchSource:0}: Error finding container e41c0666d107da427cd6b16c31137dcfdbc9535b09982b6859242c16325ddc65: Status 404 returned error can't find the container with id e41c0666d107da427cd6b16c31137dcfdbc9535b09982b6859242c16325ddc65 Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.517208 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.539970 5129 scope.go:117] "RemoveContainer" containerID="026ce796c3ea7a332fd8a306f8d3aea9d566cc065d6db6597b9a30f2fb8388af" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 
07:22:34.547459 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-168c-account-create-update-dbtdl"] Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.599681 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3065de65-644e-4977-a64c-71aa308a7401-log-httpd\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.599746 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.599783 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-scripts\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.599814 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-config-data\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.599870 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55t7x\" (UniqueName: \"kubernetes.io/projected/3065de65-644e-4977-a64c-71aa308a7401-kube-api-access-55t7x\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 
14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.599895 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.599934 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3065de65-644e-4977-a64c-71aa308a7401-run-httpd\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.702109 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3065de65-644e-4977-a64c-71aa308a7401-log-httpd\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.702366 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.702472 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-scripts\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.702667 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-config-data\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.702814 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55t7x\" (UniqueName: \"kubernetes.io/projected/3065de65-644e-4977-a64c-71aa308a7401-kube-api-access-55t7x\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.702917 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.703037 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3065de65-644e-4977-a64c-71aa308a7401-run-httpd\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.703643 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3065de65-644e-4977-a64c-71aa308a7401-run-httpd\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.703898 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3065de65-644e-4977-a64c-71aa308a7401-log-httpd\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.709587 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.712123 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-config-data\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.712409 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-scripts\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.713288 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.727520 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55t7x\" (UniqueName: \"kubernetes.io/projected/3065de65-644e-4977-a64c-71aa308a7401-kube-api-access-55t7x\") pod \"ceilometer-0\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " pod="openstack/ceilometer-0" Mar 14 07:22:34 crc kubenswrapper[5129]: I0314 07:22:34.828115 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:35 crc kubenswrapper[5129]: I0314 07:22:35.030741 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:35 crc kubenswrapper[5129]: I0314 07:22:35.266978 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:35 crc kubenswrapper[5129]: I0314 07:22:35.295408 5129 generic.go:334] "Generic (PLEG): container finished" podID="937d0a55-2e9a-471b-b4d6-50cc8c4ddd33" containerID="3960bf45564cd7a5d0c3f2cfcfc3a0b700a9a46196ffca4ba11e5884fbce96d1" exitCode=0 Mar 14 07:22:35 crc kubenswrapper[5129]: I0314 07:22:35.295493 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dsnv9" event={"ID":"937d0a55-2e9a-471b-b4d6-50cc8c4ddd33","Type":"ContainerDied","Data":"3960bf45564cd7a5d0c3f2cfcfc3a0b700a9a46196ffca4ba11e5884fbce96d1"} Mar 14 07:22:35 crc kubenswrapper[5129]: I0314 07:22:35.295525 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dsnv9" event={"ID":"937d0a55-2e9a-471b-b4d6-50cc8c4ddd33","Type":"ContainerStarted","Data":"2af3f009dbacc5b84574c078dcaf543a5ac9527e436bdda959eabe4cef8d21fa"} Mar 14 07:22:35 crc kubenswrapper[5129]: I0314 07:22:35.297387 5129 generic.go:334] "Generic (PLEG): container finished" podID="c7a67053-ce35-4d1a-bdee-7d89e882b4b1" containerID="4f31926dace18ee6fa2886d83ca81ff1aa2ddb06023c8ecc5fd52e3b66425fc3" exitCode=0 Mar 14 07:22:35 crc kubenswrapper[5129]: I0314 07:22:35.297429 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dwf9c" event={"ID":"c7a67053-ce35-4d1a-bdee-7d89e882b4b1","Type":"ContainerDied","Data":"4f31926dace18ee6fa2886d83ca81ff1aa2ddb06023c8ecc5fd52e3b66425fc3"} Mar 14 07:22:35 crc kubenswrapper[5129]: I0314 07:22:35.298949 5129 generic.go:334] "Generic (PLEG): container finished" podID="2b4c1ed9-1abf-442c-a30c-92249cfd9fe4" 
containerID="a731b5c1d187c5f7593f08c748a82216a8d2c9ca3e2b7a83bff17ae61bc36cd3" exitCode=0 Mar 14 07:22:35 crc kubenswrapper[5129]: I0314 07:22:35.299021 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3b40-account-create-update-c46zq" event={"ID":"2b4c1ed9-1abf-442c-a30c-92249cfd9fe4","Type":"ContainerDied","Data":"a731b5c1d187c5f7593f08c748a82216a8d2c9ca3e2b7a83bff17ae61bc36cd3"} Mar 14 07:22:35 crc kubenswrapper[5129]: I0314 07:22:35.299046 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3b40-account-create-update-c46zq" event={"ID":"2b4c1ed9-1abf-442c-a30c-92249cfd9fe4","Type":"ContainerStarted","Data":"e41c0666d107da427cd6b16c31137dcfdbc9535b09982b6859242c16325ddc65"} Mar 14 07:22:35 crc kubenswrapper[5129]: I0314 07:22:35.307222 5129 generic.go:334] "Generic (PLEG): container finished" podID="d6e40311-e1d7-4a06-895e-9681160e38da" containerID="801f9c6b9b4609a23fdfd6c9e39ce8d8281239d0826281a0aafbad38b75cd0a6" exitCode=0 Mar 14 07:22:35 crc kubenswrapper[5129]: I0314 07:22:35.307315 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0796-account-create-update-nndld" event={"ID":"d6e40311-e1d7-4a06-895e-9681160e38da","Type":"ContainerDied","Data":"801f9c6b9b4609a23fdfd6c9e39ce8d8281239d0826281a0aafbad38b75cd0a6"} Mar 14 07:22:35 crc kubenswrapper[5129]: I0314 07:22:35.315068 5129 generic.go:334] "Generic (PLEG): container finished" podID="81a20ac3-5616-4b0b-9fd3-09ca4d863c24" containerID="4649aa78d1e37425574ecfe4a03fda95b21e45c63c426d03ab34b339d2cbed7b" exitCode=0 Mar 14 07:22:35 crc kubenswrapper[5129]: I0314 07:22:35.315156 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-168c-account-create-update-dbtdl" event={"ID":"81a20ac3-5616-4b0b-9fd3-09ca4d863c24","Type":"ContainerDied","Data":"4649aa78d1e37425574ecfe4a03fda95b21e45c63c426d03ab34b339d2cbed7b"} Mar 14 07:22:35 crc kubenswrapper[5129]: I0314 07:22:35.315191 5129 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-168c-account-create-update-dbtdl" event={"ID":"81a20ac3-5616-4b0b-9fd3-09ca4d863c24","Type":"ContainerStarted","Data":"db5de0eb25331e0def3dec75f4f4bf4738057072aebe2768ed1444527f14cc4b"} Mar 14 07:22:35 crc kubenswrapper[5129]: I0314 07:22:35.318185 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3065de65-644e-4977-a64c-71aa308a7401","Type":"ContainerStarted","Data":"4907b07defa1059e2ea33b60562276bee7cbbe12b0fb46d1c3f5b1706e0294d7"} Mar 14 07:22:35 crc kubenswrapper[5129]: I0314 07:22:35.319446 5129 generic.go:334] "Generic (PLEG): container finished" podID="1acddb0d-6c0b-420c-9df5-f1b89d56b21e" containerID="ca5d78ebc9e5c0ead9b825452ce86e555b271077f92d69b6cf145c6d03e2b989" exitCode=0 Mar 14 07:22:35 crc kubenswrapper[5129]: I0314 07:22:35.319862 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8mzrf" event={"ID":"1acddb0d-6c0b-420c-9df5-f1b89d56b21e","Type":"ContainerDied","Data":"ca5d78ebc9e5c0ead9b825452ce86e555b271077f92d69b6cf145c6d03e2b989"} Mar 14 07:22:36 crc kubenswrapper[5129]: I0314 07:22:36.063102 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc" path="/var/lib/kubelet/pods/16e4a3fd-270a-4dd0-b1ca-b6bf8893e4fc/volumes" Mar 14 07:22:36 crc kubenswrapper[5129]: I0314 07:22:36.328040 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3065de65-644e-4977-a64c-71aa308a7401","Type":"ContainerStarted","Data":"f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7"} Mar 14 07:22:36 crc kubenswrapper[5129]: I0314 07:22:36.787460 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0796-account-create-update-nndld" Mar 14 07:22:36 crc kubenswrapper[5129]: I0314 07:22:36.852188 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2hc8\" (UniqueName: \"kubernetes.io/projected/d6e40311-e1d7-4a06-895e-9681160e38da-kube-api-access-q2hc8\") pod \"d6e40311-e1d7-4a06-895e-9681160e38da\" (UID: \"d6e40311-e1d7-4a06-895e-9681160e38da\") " Mar 14 07:22:36 crc kubenswrapper[5129]: I0314 07:22:36.852434 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6e40311-e1d7-4a06-895e-9681160e38da-operator-scripts\") pod \"d6e40311-e1d7-4a06-895e-9681160e38da\" (UID: \"d6e40311-e1d7-4a06-895e-9681160e38da\") " Mar 14 07:22:36 crc kubenswrapper[5129]: I0314 07:22:36.852962 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e40311-e1d7-4a06-895e-9681160e38da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6e40311-e1d7-4a06-895e-9681160e38da" (UID: "d6e40311-e1d7-4a06-895e-9681160e38da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:36 crc kubenswrapper[5129]: I0314 07:22:36.858480 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:36 crc kubenswrapper[5129]: I0314 07:22:36.859006 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e40311-e1d7-4a06-895e-9681160e38da-kube-api-access-q2hc8" (OuterVolumeSpecName: "kube-api-access-q2hc8") pod "d6e40311-e1d7-4a06-895e-9681160e38da" (UID: "d6e40311-e1d7-4a06-895e-9681160e38da"). InnerVolumeSpecName "kube-api-access-q2hc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:36 crc kubenswrapper[5129]: I0314 07:22:36.863167 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:22:36 crc kubenswrapper[5129]: I0314 07:22:36.915414 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3b40-account-create-update-c46zq" Mar 14 07:22:36 crc kubenswrapper[5129]: I0314 07:22:36.954183 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2hc8\" (UniqueName: \"kubernetes.io/projected/d6e40311-e1d7-4a06-895e-9681160e38da-kube-api-access-q2hc8\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:36 crc kubenswrapper[5129]: I0314 07:22:36.954221 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6e40311-e1d7-4a06-895e-9681160e38da-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:36 crc kubenswrapper[5129]: I0314 07:22:36.984135 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-168c-account-create-update-dbtdl" Mar 14 07:22:36 crc kubenswrapper[5129]: I0314 07:22:36.991074 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-dwf9c" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.055833 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5hbr\" (UniqueName: \"kubernetes.io/projected/2b4c1ed9-1abf-442c-a30c-92249cfd9fe4-kube-api-access-c5hbr\") pod \"2b4c1ed9-1abf-442c-a30c-92249cfd9fe4\" (UID: \"2b4c1ed9-1abf-442c-a30c-92249cfd9fe4\") " Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.055879 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xmhn\" (UniqueName: \"kubernetes.io/projected/81a20ac3-5616-4b0b-9fd3-09ca4d863c24-kube-api-access-8xmhn\") pod \"81a20ac3-5616-4b0b-9fd3-09ca4d863c24\" (UID: \"81a20ac3-5616-4b0b-9fd3-09ca4d863c24\") " Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.055945 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81a20ac3-5616-4b0b-9fd3-09ca4d863c24-operator-scripts\") pod \"81a20ac3-5616-4b0b-9fd3-09ca4d863c24\" (UID: \"81a20ac3-5616-4b0b-9fd3-09ca4d863c24\") " Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.055997 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4c1ed9-1abf-442c-a30c-92249cfd9fe4-operator-scripts\") pod \"2b4c1ed9-1abf-442c-a30c-92249cfd9fe4\" (UID: \"2b4c1ed9-1abf-442c-a30c-92249cfd9fe4\") " Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.056045 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkpwj\" (UniqueName: \"kubernetes.io/projected/c7a67053-ce35-4d1a-bdee-7d89e882b4b1-kube-api-access-qkpwj\") pod \"c7a67053-ce35-4d1a-bdee-7d89e882b4b1\" (UID: \"c7a67053-ce35-4d1a-bdee-7d89e882b4b1\") " Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.056070 5129 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a67053-ce35-4d1a-bdee-7d89e882b4b1-operator-scripts\") pod \"c7a67053-ce35-4d1a-bdee-7d89e882b4b1\" (UID: \"c7a67053-ce35-4d1a-bdee-7d89e882b4b1\") " Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.056843 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81a20ac3-5616-4b0b-9fd3-09ca4d863c24-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81a20ac3-5616-4b0b-9fd3-09ca4d863c24" (UID: "81a20ac3-5616-4b0b-9fd3-09ca4d863c24"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.056868 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7a67053-ce35-4d1a-bdee-7d89e882b4b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7a67053-ce35-4d1a-bdee-7d89e882b4b1" (UID: "c7a67053-ce35-4d1a-bdee-7d89e882b4b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.057677 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b4c1ed9-1abf-442c-a30c-92249cfd9fe4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b4c1ed9-1abf-442c-a30c-92249cfd9fe4" (UID: "2b4c1ed9-1abf-442c-a30c-92249cfd9fe4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.058388 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81a20ac3-5616-4b0b-9fd3-09ca4d863c24-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.058413 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4c1ed9-1abf-442c-a30c-92249cfd9fe4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.058425 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a67053-ce35-4d1a-bdee-7d89e882b4b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.061231 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7a67053-ce35-4d1a-bdee-7d89e882b4b1-kube-api-access-qkpwj" (OuterVolumeSpecName: "kube-api-access-qkpwj") pod "c7a67053-ce35-4d1a-bdee-7d89e882b4b1" (UID: "c7a67053-ce35-4d1a-bdee-7d89e882b4b1"). InnerVolumeSpecName "kube-api-access-qkpwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.061385 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b4c1ed9-1abf-442c-a30c-92249cfd9fe4-kube-api-access-c5hbr" (OuterVolumeSpecName: "kube-api-access-c5hbr") pod "2b4c1ed9-1abf-442c-a30c-92249cfd9fe4" (UID: "2b4c1ed9-1abf-442c-a30c-92249cfd9fe4"). InnerVolumeSpecName "kube-api-access-c5hbr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.061849 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81a20ac3-5616-4b0b-9fd3-09ca4d863c24-kube-api-access-8xmhn" (OuterVolumeSpecName: "kube-api-access-8xmhn") pod "81a20ac3-5616-4b0b-9fd3-09ca4d863c24" (UID: "81a20ac3-5616-4b0b-9fd3-09ca4d863c24"). InnerVolumeSpecName "kube-api-access-8xmhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.065906 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-8mzrf" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.072285 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dsnv9" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.165264 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmj8w\" (UniqueName: \"kubernetes.io/projected/1acddb0d-6c0b-420c-9df5-f1b89d56b21e-kube-api-access-dmj8w\") pod \"1acddb0d-6c0b-420c-9df5-f1b89d56b21e\" (UID: \"1acddb0d-6c0b-420c-9df5-f1b89d56b21e\") " Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.165322 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/937d0a55-2e9a-471b-b4d6-50cc8c4ddd33-operator-scripts\") pod \"937d0a55-2e9a-471b-b4d6-50cc8c4ddd33\" (UID: \"937d0a55-2e9a-471b-b4d6-50cc8c4ddd33\") " Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.166413 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/937d0a55-2e9a-471b-b4d6-50cc8c4ddd33-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "937d0a55-2e9a-471b-b4d6-50cc8c4ddd33" (UID: "937d0a55-2e9a-471b-b4d6-50cc8c4ddd33"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.166457 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1acddb0d-6c0b-420c-9df5-f1b89d56b21e-operator-scripts\") pod \"1acddb0d-6c0b-420c-9df5-f1b89d56b21e\" (UID: \"1acddb0d-6c0b-420c-9df5-f1b89d56b21e\") " Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.166834 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq5gg\" (UniqueName: \"kubernetes.io/projected/937d0a55-2e9a-471b-b4d6-50cc8c4ddd33-kube-api-access-nq5gg\") pod \"937d0a55-2e9a-471b-b4d6-50cc8c4ddd33\" (UID: \"937d0a55-2e9a-471b-b4d6-50cc8c4ddd33\") " Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.167363 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1acddb0d-6c0b-420c-9df5-f1b89d56b21e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1acddb0d-6c0b-420c-9df5-f1b89d56b21e" (UID: "1acddb0d-6c0b-420c-9df5-f1b89d56b21e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.168086 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/937d0a55-2e9a-471b-b4d6-50cc8c4ddd33-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.168342 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5hbr\" (UniqueName: \"kubernetes.io/projected/2b4c1ed9-1abf-442c-a30c-92249cfd9fe4-kube-api-access-c5hbr\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.168422 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xmhn\" (UniqueName: \"kubernetes.io/projected/81a20ac3-5616-4b0b-9fd3-09ca4d863c24-kube-api-access-8xmhn\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.168491 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1acddb0d-6c0b-420c-9df5-f1b89d56b21e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.168633 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkpwj\" (UniqueName: \"kubernetes.io/projected/c7a67053-ce35-4d1a-bdee-7d89e882b4b1-kube-api-access-qkpwj\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.168929 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1acddb0d-6c0b-420c-9df5-f1b89d56b21e-kube-api-access-dmj8w" (OuterVolumeSpecName: "kube-api-access-dmj8w") pod "1acddb0d-6c0b-420c-9df5-f1b89d56b21e" (UID: "1acddb0d-6c0b-420c-9df5-f1b89d56b21e"). InnerVolumeSpecName "kube-api-access-dmj8w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.171430 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/937d0a55-2e9a-471b-b4d6-50cc8c4ddd33-kube-api-access-nq5gg" (OuterVolumeSpecName: "kube-api-access-nq5gg") pod "937d0a55-2e9a-471b-b4d6-50cc8c4ddd33" (UID: "937d0a55-2e9a-471b-b4d6-50cc8c4ddd33"). InnerVolumeSpecName "kube-api-access-nq5gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.270071 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq5gg\" (UniqueName: \"kubernetes.io/projected/937d0a55-2e9a-471b-b4d6-50cc8c4ddd33-kube-api-access-nq5gg\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.270103 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmj8w\" (UniqueName: \"kubernetes.io/projected/1acddb0d-6c0b-420c-9df5-f1b89d56b21e-kube-api-access-dmj8w\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.340129 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-168c-account-create-update-dbtdl" event={"ID":"81a20ac3-5616-4b0b-9fd3-09ca4d863c24","Type":"ContainerDied","Data":"db5de0eb25331e0def3dec75f4f4bf4738057072aebe2768ed1444527f14cc4b"} Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.340207 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db5de0eb25331e0def3dec75f4f4bf4738057072aebe2768ed1444527f14cc4b" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.340355 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-168c-account-create-update-dbtdl" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.342749 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3065de65-644e-4977-a64c-71aa308a7401","Type":"ContainerStarted","Data":"f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4"} Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.344928 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8mzrf" event={"ID":"1acddb0d-6c0b-420c-9df5-f1b89d56b21e","Type":"ContainerDied","Data":"08e0c5f21048978b0d88127efe758a3b5adfde32aca9d89be30ceae95c8dd119"} Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.344954 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08e0c5f21048978b0d88127efe758a3b5adfde32aca9d89be30ceae95c8dd119" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.345002 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-8mzrf" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.347931 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dsnv9" event={"ID":"937d0a55-2e9a-471b-b4d6-50cc8c4ddd33","Type":"ContainerDied","Data":"2af3f009dbacc5b84574c078dcaf543a5ac9527e436bdda959eabe4cef8d21fa"} Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.347971 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2af3f009dbacc5b84574c078dcaf543a5ac9527e436bdda959eabe4cef8d21fa" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.348207 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-dsnv9" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.349544 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dwf9c" event={"ID":"c7a67053-ce35-4d1a-bdee-7d89e882b4b1","Type":"ContainerDied","Data":"f6bb68ce140788dc51ca504ffa291de3291210541939b493a08f0e95280d08c5"} Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.349567 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6bb68ce140788dc51ca504ffa291de3291210541939b493a08f0e95280d08c5" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.349549 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dwf9c" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.353632 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3b40-account-create-update-c46zq" event={"ID":"2b4c1ed9-1abf-442c-a30c-92249cfd9fe4","Type":"ContainerDied","Data":"e41c0666d107da427cd6b16c31137dcfdbc9535b09982b6859242c16325ddc65"} Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.353663 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e41c0666d107da427cd6b16c31137dcfdbc9535b09982b6859242c16325ddc65" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.353706 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3b40-account-create-update-c46zq" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.363695 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0796-account-create-update-nndld" event={"ID":"d6e40311-e1d7-4a06-895e-9681160e38da","Type":"ContainerDied","Data":"fe84c73a8aeb04b8751e9c96721e4a95a2a9e62eecefc3c2192f7c8f0ec2ea14"} Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.363717 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0796-account-create-update-nndld" Mar 14 07:22:37 crc kubenswrapper[5129]: I0314 07:22:37.363732 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe84c73a8aeb04b8751e9c96721e4a95a2a9e62eecefc3c2192f7c8f0ec2ea14" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.371766 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3065de65-644e-4977-a64c-71aa308a7401","Type":"ContainerStarted","Data":"7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6"} Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.438179 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.438464 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="90a4650a-066d-455d-987d-a67b396fd4d9" containerName="glance-log" containerID="cri-o://caa8b38235def44e44e3cb70551fb1ad580918f03302884d0e6b1226ec93a775" gracePeriod=30 Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.438497 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="90a4650a-066d-455d-987d-a67b396fd4d9" containerName="glance-httpd" containerID="cri-o://af802b2d7421fed32265120f9b6ea9961c9329ba6323820ed2fe2213b264238c" gracePeriod=30 Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.602644 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-twj2g"] Mar 14 07:22:38 crc kubenswrapper[5129]: E0314 07:22:38.603029 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a67053-ce35-4d1a-bdee-7d89e882b4b1" containerName="mariadb-database-create" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.603046 5129 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c7a67053-ce35-4d1a-bdee-7d89e882b4b1" containerName="mariadb-database-create" Mar 14 07:22:38 crc kubenswrapper[5129]: E0314 07:22:38.603058 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81a20ac3-5616-4b0b-9fd3-09ca4d863c24" containerName="mariadb-account-create-update" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.603065 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="81a20ac3-5616-4b0b-9fd3-09ca4d863c24" containerName="mariadb-account-create-update" Mar 14 07:22:38 crc kubenswrapper[5129]: E0314 07:22:38.603079 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="937d0a55-2e9a-471b-b4d6-50cc8c4ddd33" containerName="mariadb-database-create" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.603084 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="937d0a55-2e9a-471b-b4d6-50cc8c4ddd33" containerName="mariadb-database-create" Mar 14 07:22:38 crc kubenswrapper[5129]: E0314 07:22:38.603094 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b4c1ed9-1abf-442c-a30c-92249cfd9fe4" containerName="mariadb-account-create-update" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.603100 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b4c1ed9-1abf-442c-a30c-92249cfd9fe4" containerName="mariadb-account-create-update" Mar 14 07:22:38 crc kubenswrapper[5129]: E0314 07:22:38.603110 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e40311-e1d7-4a06-895e-9681160e38da" containerName="mariadb-account-create-update" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.603116 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e40311-e1d7-4a06-895e-9681160e38da" containerName="mariadb-account-create-update" Mar 14 07:22:38 crc kubenswrapper[5129]: E0314 07:22:38.603130 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1acddb0d-6c0b-420c-9df5-f1b89d56b21e" containerName="mariadb-database-create" Mar 14 
07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.603137 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="1acddb0d-6c0b-420c-9df5-f1b89d56b21e" containerName="mariadb-database-create" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.603287 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e40311-e1d7-4a06-895e-9681160e38da" containerName="mariadb-account-create-update" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.603296 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="81a20ac3-5616-4b0b-9fd3-09ca4d863c24" containerName="mariadb-account-create-update" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.603307 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7a67053-ce35-4d1a-bdee-7d89e882b4b1" containerName="mariadb-database-create" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.603314 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="937d0a55-2e9a-471b-b4d6-50cc8c4ddd33" containerName="mariadb-database-create" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.603323 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b4c1ed9-1abf-442c-a30c-92249cfd9fe4" containerName="mariadb-account-create-update" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.603337 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="1acddb0d-6c0b-420c-9df5-f1b89d56b21e" containerName="mariadb-database-create" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.603873 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-twj2g" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.608941 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.609208 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.609257 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-q47ks" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.619114 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-twj2g"] Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.694203 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b3c5416-85f2-4109-892a-33079d1541d9-scripts\") pod \"nova-cell0-conductor-db-sync-twj2g\" (UID: \"3b3c5416-85f2-4109-892a-33079d1541d9\") " pod="openstack/nova-cell0-conductor-db-sync-twj2g" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.694521 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b3c5416-85f2-4109-892a-33079d1541d9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-twj2g\" (UID: \"3b3c5416-85f2-4109-892a-33079d1541d9\") " pod="openstack/nova-cell0-conductor-db-sync-twj2g" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.694570 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b3c5416-85f2-4109-892a-33079d1541d9-config-data\") pod \"nova-cell0-conductor-db-sync-twj2g\" (UID: \"3b3c5416-85f2-4109-892a-33079d1541d9\") " 
pod="openstack/nova-cell0-conductor-db-sync-twj2g" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.698721 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xftf6\" (UniqueName: \"kubernetes.io/projected/3b3c5416-85f2-4109-892a-33079d1541d9-kube-api-access-xftf6\") pod \"nova-cell0-conductor-db-sync-twj2g\" (UID: \"3b3c5416-85f2-4109-892a-33079d1541d9\") " pod="openstack/nova-cell0-conductor-db-sync-twj2g" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.800063 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b3c5416-85f2-4109-892a-33079d1541d9-config-data\") pod \"nova-cell0-conductor-db-sync-twj2g\" (UID: \"3b3c5416-85f2-4109-892a-33079d1541d9\") " pod="openstack/nova-cell0-conductor-db-sync-twj2g" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.800134 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xftf6\" (UniqueName: \"kubernetes.io/projected/3b3c5416-85f2-4109-892a-33079d1541d9-kube-api-access-xftf6\") pod \"nova-cell0-conductor-db-sync-twj2g\" (UID: \"3b3c5416-85f2-4109-892a-33079d1541d9\") " pod="openstack/nova-cell0-conductor-db-sync-twj2g" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.800211 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b3c5416-85f2-4109-892a-33079d1541d9-scripts\") pod \"nova-cell0-conductor-db-sync-twj2g\" (UID: \"3b3c5416-85f2-4109-892a-33079d1541d9\") " pod="openstack/nova-cell0-conductor-db-sync-twj2g" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.800241 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b3c5416-85f2-4109-892a-33079d1541d9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-twj2g\" (UID: 
\"3b3c5416-85f2-4109-892a-33079d1541d9\") " pod="openstack/nova-cell0-conductor-db-sync-twj2g" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.806678 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b3c5416-85f2-4109-892a-33079d1541d9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-twj2g\" (UID: \"3b3c5416-85f2-4109-892a-33079d1541d9\") " pod="openstack/nova-cell0-conductor-db-sync-twj2g" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.807969 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b3c5416-85f2-4109-892a-33079d1541d9-scripts\") pod \"nova-cell0-conductor-db-sync-twj2g\" (UID: \"3b3c5416-85f2-4109-892a-33079d1541d9\") " pod="openstack/nova-cell0-conductor-db-sync-twj2g" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.810509 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b3c5416-85f2-4109-892a-33079d1541d9-config-data\") pod \"nova-cell0-conductor-db-sync-twj2g\" (UID: \"3b3c5416-85f2-4109-892a-33079d1541d9\") " pod="openstack/nova-cell0-conductor-db-sync-twj2g" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.818004 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xftf6\" (UniqueName: \"kubernetes.io/projected/3b3c5416-85f2-4109-892a-33079d1541d9-kube-api-access-xftf6\") pod \"nova-cell0-conductor-db-sync-twj2g\" (UID: \"3b3c5416-85f2-4109-892a-33079d1541d9\") " pod="openstack/nova-cell0-conductor-db-sync-twj2g" Mar 14 07:22:38 crc kubenswrapper[5129]: I0314 07:22:38.920858 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-twj2g" Mar 14 07:22:39 crc kubenswrapper[5129]: I0314 07:22:39.374615 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-twj2g"] Mar 14 07:22:39 crc kubenswrapper[5129]: I0314 07:22:39.385382 5129 generic.go:334] "Generic (PLEG): container finished" podID="90a4650a-066d-455d-987d-a67b396fd4d9" containerID="caa8b38235def44e44e3cb70551fb1ad580918f03302884d0e6b1226ec93a775" exitCode=143 Mar 14 07:22:39 crc kubenswrapper[5129]: I0314 07:22:39.385424 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"90a4650a-066d-455d-987d-a67b396fd4d9","Type":"ContainerDied","Data":"caa8b38235def44e44e3cb70551fb1ad580918f03302884d0e6b1226ec93a775"} Mar 14 07:22:39 crc kubenswrapper[5129]: W0314 07:22:39.402144 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b3c5416_85f2_4109_892a_33079d1541d9.slice/crio-67751cd09f0b0f699d1d45740e7b8cc7e01b12ce8135f4d322dd550d4549d150 WatchSource:0}: Error finding container 67751cd09f0b0f699d1d45740e7b8cc7e01b12ce8135f4d322dd550d4549d150: Status 404 returned error can't find the container with id 67751cd09f0b0f699d1d45740e7b8cc7e01b12ce8135f4d322dd550d4549d150 Mar 14 07:22:40 crc kubenswrapper[5129]: I0314 07:22:40.398694 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-twj2g" event={"ID":"3b3c5416-85f2-4109-892a-33079d1541d9","Type":"ContainerStarted","Data":"67751cd09f0b0f699d1d45740e7b8cc7e01b12ce8135f4d322dd550d4549d150"} Mar 14 07:22:40 crc kubenswrapper[5129]: I0314 07:22:40.403223 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3065de65-644e-4977-a64c-71aa308a7401","Type":"ContainerStarted","Data":"da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f"} Mar 14 07:22:40 crc 
kubenswrapper[5129]: I0314 07:22:40.403393 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3065de65-644e-4977-a64c-71aa308a7401" containerName="ceilometer-central-agent" containerID="cri-o://f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7" gracePeriod=30 Mar 14 07:22:40 crc kubenswrapper[5129]: I0314 07:22:40.403422 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 07:22:40 crc kubenswrapper[5129]: I0314 07:22:40.403439 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3065de65-644e-4977-a64c-71aa308a7401" containerName="proxy-httpd" containerID="cri-o://da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f" gracePeriod=30 Mar 14 07:22:40 crc kubenswrapper[5129]: I0314 07:22:40.403495 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3065de65-644e-4977-a64c-71aa308a7401" containerName="ceilometer-notification-agent" containerID="cri-o://f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4" gracePeriod=30 Mar 14 07:22:40 crc kubenswrapper[5129]: I0314 07:22:40.403594 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3065de65-644e-4977-a64c-71aa308a7401" containerName="sg-core" containerID="cri-o://7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6" gracePeriod=30 Mar 14 07:22:40 crc kubenswrapper[5129]: I0314 07:22:40.430676 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.364778788 podStartE2EDuration="6.430659865s" podCreationTimestamp="2026-03-14 07:22:34 +0000 UTC" firstStartedPulling="2026-03-14 07:22:35.267661444 +0000 UTC m=+1418.019576628" lastFinishedPulling="2026-03-14 07:22:39.333542521 +0000 UTC m=+1422.085457705" 
observedRunningTime="2026-03-14 07:22:40.42933904 +0000 UTC m=+1423.181254234" watchObservedRunningTime="2026-03-14 07:22:40.430659865 +0000 UTC m=+1423.182575049" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.205144 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.348448 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-scripts\") pod \"3065de65-644e-4977-a64c-71aa308a7401\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.348500 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-config-data\") pod \"3065de65-644e-4977-a64c-71aa308a7401\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.348596 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-sg-core-conf-yaml\") pod \"3065de65-644e-4977-a64c-71aa308a7401\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.348725 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-combined-ca-bundle\") pod \"3065de65-644e-4977-a64c-71aa308a7401\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.348775 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3065de65-644e-4977-a64c-71aa308a7401-log-httpd\") pod 
\"3065de65-644e-4977-a64c-71aa308a7401\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.348817 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55t7x\" (UniqueName: \"kubernetes.io/projected/3065de65-644e-4977-a64c-71aa308a7401-kube-api-access-55t7x\") pod \"3065de65-644e-4977-a64c-71aa308a7401\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.348851 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3065de65-644e-4977-a64c-71aa308a7401-run-httpd\") pod \"3065de65-644e-4977-a64c-71aa308a7401\" (UID: \"3065de65-644e-4977-a64c-71aa308a7401\") " Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.349459 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3065de65-644e-4977-a64c-71aa308a7401-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3065de65-644e-4977-a64c-71aa308a7401" (UID: "3065de65-644e-4977-a64c-71aa308a7401"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.349631 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3065de65-644e-4977-a64c-71aa308a7401-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3065de65-644e-4977-a64c-71aa308a7401" (UID: "3065de65-644e-4977-a64c-71aa308a7401"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.355047 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3065de65-644e-4977-a64c-71aa308a7401-kube-api-access-55t7x" (OuterVolumeSpecName: "kube-api-access-55t7x") pod "3065de65-644e-4977-a64c-71aa308a7401" (UID: "3065de65-644e-4977-a64c-71aa308a7401"). InnerVolumeSpecName "kube-api-access-55t7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.373576 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-scripts" (OuterVolumeSpecName: "scripts") pod "3065de65-644e-4977-a64c-71aa308a7401" (UID: "3065de65-644e-4977-a64c-71aa308a7401"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.374369 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3065de65-644e-4977-a64c-71aa308a7401" (UID: "3065de65-644e-4977-a64c-71aa308a7401"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.416631 5129 generic.go:334] "Generic (PLEG): container finished" podID="3065de65-644e-4977-a64c-71aa308a7401" containerID="da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f" exitCode=0 Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.416671 5129 generic.go:334] "Generic (PLEG): container finished" podID="3065de65-644e-4977-a64c-71aa308a7401" containerID="7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6" exitCode=2 Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.416681 5129 generic.go:334] "Generic (PLEG): container finished" podID="3065de65-644e-4977-a64c-71aa308a7401" containerID="f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4" exitCode=0 Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.416689 5129 generic.go:334] "Generic (PLEG): container finished" podID="3065de65-644e-4977-a64c-71aa308a7401" containerID="f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7" exitCode=0 Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.416712 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3065de65-644e-4977-a64c-71aa308a7401","Type":"ContainerDied","Data":"da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f"} Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.416742 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3065de65-644e-4977-a64c-71aa308a7401","Type":"ContainerDied","Data":"7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6"} Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.416758 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3065de65-644e-4977-a64c-71aa308a7401","Type":"ContainerDied","Data":"f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4"} Mar 14 07:22:41 crc 
kubenswrapper[5129]: I0314 07:22:41.416769 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3065de65-644e-4977-a64c-71aa308a7401","Type":"ContainerDied","Data":"f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7"} Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.416781 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3065de65-644e-4977-a64c-71aa308a7401","Type":"ContainerDied","Data":"4907b07defa1059e2ea33b60562276bee7cbbe12b0fb46d1c3f5b1706e0294d7"} Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.416798 5129 scope.go:117] "RemoveContainer" containerID="da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.416960 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.426562 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3065de65-644e-4977-a64c-71aa308a7401" (UID: "3065de65-644e-4977-a64c-71aa308a7401"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.451344 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.451411 5129 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.451426 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.451439 5129 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3065de65-644e-4977-a64c-71aa308a7401-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.451453 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55t7x\" (UniqueName: \"kubernetes.io/projected/3065de65-644e-4977-a64c-71aa308a7401-kube-api-access-55t7x\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.451464 5129 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3065de65-644e-4977-a64c-71aa308a7401-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.451587 5129 scope.go:117] "RemoveContainer" containerID="7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.482012 5129 scope.go:117] "RemoveContainer" 
containerID="f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.483823 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-config-data" (OuterVolumeSpecName: "config-data") pod "3065de65-644e-4977-a64c-71aa308a7401" (UID: "3065de65-644e-4977-a64c-71aa308a7401"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.500617 5129 scope.go:117] "RemoveContainer" containerID="f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.528394 5129 scope.go:117] "RemoveContainer" containerID="da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f" Mar 14 07:22:41 crc kubenswrapper[5129]: E0314 07:22:41.528836 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f\": container with ID starting with da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f not found: ID does not exist" containerID="da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.528905 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f"} err="failed to get container status \"da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f\": rpc error: code = NotFound desc = could not find container \"da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f\": container with ID starting with da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f not found: ID does not exist" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.528956 5129 
scope.go:117] "RemoveContainer" containerID="7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6" Mar 14 07:22:41 crc kubenswrapper[5129]: E0314 07:22:41.529570 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6\": container with ID starting with 7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6 not found: ID does not exist" containerID="7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.529613 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6"} err="failed to get container status \"7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6\": rpc error: code = NotFound desc = could not find container \"7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6\": container with ID starting with 7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6 not found: ID does not exist" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.529635 5129 scope.go:117] "RemoveContainer" containerID="f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4" Mar 14 07:22:41 crc kubenswrapper[5129]: E0314 07:22:41.529894 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4\": container with ID starting with f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4 not found: ID does not exist" containerID="f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.529940 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4"} err="failed to get container status \"f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4\": rpc error: code = NotFound desc = could not find container \"f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4\": container with ID starting with f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4 not found: ID does not exist" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.529956 5129 scope.go:117] "RemoveContainer" containerID="f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7" Mar 14 07:22:41 crc kubenswrapper[5129]: E0314 07:22:41.530336 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7\": container with ID starting with f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7 not found: ID does not exist" containerID="f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.530376 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7"} err="failed to get container status \"f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7\": rpc error: code = NotFound desc = could not find container \"f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7\": container with ID starting with f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7 not found: ID does not exist" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.530389 5129 scope.go:117] "RemoveContainer" containerID="da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.530888 5129 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f"} err="failed to get container status \"da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f\": rpc error: code = NotFound desc = could not find container \"da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f\": container with ID starting with da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f not found: ID does not exist" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.530937 5129 scope.go:117] "RemoveContainer" containerID="7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.531424 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6"} err="failed to get container status \"7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6\": rpc error: code = NotFound desc = could not find container \"7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6\": container with ID starting with 7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6 not found: ID does not exist" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.531442 5129 scope.go:117] "RemoveContainer" containerID="f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.531685 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4"} err="failed to get container status \"f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4\": rpc error: code = NotFound desc = could not find container \"f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4\": container with ID starting with f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4 not 
found: ID does not exist" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.531704 5129 scope.go:117] "RemoveContainer" containerID="f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.531968 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7"} err="failed to get container status \"f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7\": rpc error: code = NotFound desc = could not find container \"f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7\": container with ID starting with f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7 not found: ID does not exist" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.531981 5129 scope.go:117] "RemoveContainer" containerID="da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.532467 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f"} err="failed to get container status \"da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f\": rpc error: code = NotFound desc = could not find container \"da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f\": container with ID starting with da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f not found: ID does not exist" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.532482 5129 scope.go:117] "RemoveContainer" containerID="7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.533350 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6"} err="failed to get 
container status \"7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6\": rpc error: code = NotFound desc = could not find container \"7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6\": container with ID starting with 7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6 not found: ID does not exist" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.533370 5129 scope.go:117] "RemoveContainer" containerID="f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.533647 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4"} err="failed to get container status \"f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4\": rpc error: code = NotFound desc = could not find container \"f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4\": container with ID starting with f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4 not found: ID does not exist" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.533660 5129 scope.go:117] "RemoveContainer" containerID="f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.533814 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7"} err="failed to get container status \"f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7\": rpc error: code = NotFound desc = could not find container \"f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7\": container with ID starting with f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7 not found: ID does not exist" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.533829 5129 scope.go:117] "RemoveContainer" 
containerID="da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.533981 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f"} err="failed to get container status \"da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f\": rpc error: code = NotFound desc = could not find container \"da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f\": container with ID starting with da33728a191ce7c43079a556a34f792ade459b7601ff7f7a42d60cbf2a6b1d2f not found: ID does not exist" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.533999 5129 scope.go:117] "RemoveContainer" containerID="7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.534182 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6"} err="failed to get container status \"7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6\": rpc error: code = NotFound desc = could not find container \"7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6\": container with ID starting with 7141b7c346660cdfb65aac6db294604b04e420a9d327fbd6e7723f750a7888b6 not found: ID does not exist" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.534199 5129 scope.go:117] "RemoveContainer" containerID="f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.534578 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4"} err="failed to get container status \"f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4\": rpc error: code = NotFound desc = could 
not find container \"f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4\": container with ID starting with f754f1909c18bd75cec909f56cb2d119431ab863c719cdf7807b9ff0150e03b4 not found: ID does not exist" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.534621 5129 scope.go:117] "RemoveContainer" containerID="f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.535185 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7"} err="failed to get container status \"f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7\": rpc error: code = NotFound desc = could not find container \"f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7\": container with ID starting with f5076878630c7aed7b8d58b90d7b455bb4c198e628fed849fe1941108c1da7b7 not found: ID does not exist" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.553159 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3065de65-644e-4977-a64c-71aa308a7401-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.818318 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.837377 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.848649 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:41 crc kubenswrapper[5129]: E0314 07:22:41.849050 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3065de65-644e-4977-a64c-71aa308a7401" containerName="ceilometer-central-agent" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.849061 5129 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="3065de65-644e-4977-a64c-71aa308a7401" containerName="ceilometer-central-agent" Mar 14 07:22:41 crc kubenswrapper[5129]: E0314 07:22:41.849079 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3065de65-644e-4977-a64c-71aa308a7401" containerName="sg-core" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.849084 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3065de65-644e-4977-a64c-71aa308a7401" containerName="sg-core" Mar 14 07:22:41 crc kubenswrapper[5129]: E0314 07:22:41.849105 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3065de65-644e-4977-a64c-71aa308a7401" containerName="proxy-httpd" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.849111 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3065de65-644e-4977-a64c-71aa308a7401" containerName="proxy-httpd" Mar 14 07:22:41 crc kubenswrapper[5129]: E0314 07:22:41.849127 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3065de65-644e-4977-a64c-71aa308a7401" containerName="ceilometer-notification-agent" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.849133 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3065de65-644e-4977-a64c-71aa308a7401" containerName="ceilometer-notification-agent" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.849298 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3065de65-644e-4977-a64c-71aa308a7401" containerName="ceilometer-notification-agent" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.849311 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3065de65-644e-4977-a64c-71aa308a7401" containerName="ceilometer-central-agent" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.849325 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3065de65-644e-4977-a64c-71aa308a7401" containerName="sg-core" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.849341 5129 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3065de65-644e-4977-a64c-71aa308a7401" containerName="proxy-httpd" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.865289 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.865377 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.869151 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.869313 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.961369 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.961737 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bb182a34-3807-465d-b706-929d0abe4904" containerName="glance-log" containerID="cri-o://0360e27fb95df7cd16afa2a836684c36cdef6ee7059adcb428f3ebdbf54b7f69" gracePeriod=30 Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.961863 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-log-httpd\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.961869 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bb182a34-3807-465d-b706-929d0abe4904" containerName="glance-httpd" 
containerID="cri-o://10f5dac652b667a1b23938a6a39a2c391a126cd5863bfdd75416622e28ddf506" gracePeriod=30 Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.961999 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-run-httpd\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.962176 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79dvc\" (UniqueName: \"kubernetes.io/projected/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-kube-api-access-79dvc\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.962300 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.962465 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.962744 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-config-data\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 
07:22:41 crc kubenswrapper[5129]: I0314 07:22:41.962793 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-scripts\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.047225 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3065de65-644e-4977-a64c-71aa308a7401" path="/var/lib/kubelet/pods/3065de65-644e-4977-a64c-71aa308a7401/volumes" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.065010 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-config-data\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.065059 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-scripts\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.065096 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-log-httpd\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.065138 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-run-httpd\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:42 
crc kubenswrapper[5129]: I0314 07:22:42.065162 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79dvc\" (UniqueName: \"kubernetes.io/projected/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-kube-api-access-79dvc\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.065191 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.065236 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.066681 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-log-httpd\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.066749 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-run-httpd\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.072681 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.073643 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-scripts\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.074545 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-config-data\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.079522 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.094302 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79dvc\" (UniqueName: \"kubernetes.io/projected/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-kube-api-access-79dvc\") pod \"ceilometer-0\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " pod="openstack/ceilometer-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.126225 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.126886 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.225157 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.269241 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-scripts\") pod \"90a4650a-066d-455d-987d-a67b396fd4d9\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.269312 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-config-data\") pod \"90a4650a-066d-455d-987d-a67b396fd4d9\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.269346 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-combined-ca-bundle\") pod \"90a4650a-066d-455d-987d-a67b396fd4d9\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.269421 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90a4650a-066d-455d-987d-a67b396fd4d9-httpd-run\") pod \"90a4650a-066d-455d-987d-a67b396fd4d9\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.269470 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90a4650a-066d-455d-987d-a67b396fd4d9-logs\") pod \"90a4650a-066d-455d-987d-a67b396fd4d9\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " Mar 14 07:22:42 crc 
kubenswrapper[5129]: I0314 07:22:42.269553 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-public-tls-certs\") pod \"90a4650a-066d-455d-987d-a67b396fd4d9\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.269591 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn4z8\" (UniqueName: \"kubernetes.io/projected/90a4650a-066d-455d-987d-a67b396fd4d9-kube-api-access-bn4z8\") pod \"90a4650a-066d-455d-987d-a67b396fd4d9\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.269643 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"90a4650a-066d-455d-987d-a67b396fd4d9\" (UID: \"90a4650a-066d-455d-987d-a67b396fd4d9\") " Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.270844 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90a4650a-066d-455d-987d-a67b396fd4d9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "90a4650a-066d-455d-987d-a67b396fd4d9" (UID: "90a4650a-066d-455d-987d-a67b396fd4d9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.273033 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "90a4650a-066d-455d-987d-a67b396fd4d9" (UID: "90a4650a-066d-455d-987d-a67b396fd4d9"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.273397 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90a4650a-066d-455d-987d-a67b396fd4d9-logs" (OuterVolumeSpecName: "logs") pod "90a4650a-066d-455d-987d-a67b396fd4d9" (UID: "90a4650a-066d-455d-987d-a67b396fd4d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.276311 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-scripts" (OuterVolumeSpecName: "scripts") pod "90a4650a-066d-455d-987d-a67b396fd4d9" (UID: "90a4650a-066d-455d-987d-a67b396fd4d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.278513 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90a4650a-066d-455d-987d-a67b396fd4d9-kube-api-access-bn4z8" (OuterVolumeSpecName: "kube-api-access-bn4z8") pod "90a4650a-066d-455d-987d-a67b396fd4d9" (UID: "90a4650a-066d-455d-987d-a67b396fd4d9"). InnerVolumeSpecName "kube-api-access-bn4z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.305840 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90a4650a-066d-455d-987d-a67b396fd4d9" (UID: "90a4650a-066d-455d-987d-a67b396fd4d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.328455 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "90a4650a-066d-455d-987d-a67b396fd4d9" (UID: "90a4650a-066d-455d-987d-a67b396fd4d9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.333657 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-config-data" (OuterVolumeSpecName: "config-data") pod "90a4650a-066d-455d-987d-a67b396fd4d9" (UID: "90a4650a-066d-455d-987d-a67b396fd4d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.372009 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.372273 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.372371 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.372435 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:42 crc kubenswrapper[5129]: 
I0314 07:22:42.372496 5129 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90a4650a-066d-455d-987d-a67b396fd4d9-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.372553 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90a4650a-066d-455d-987d-a67b396fd4d9-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.372627 5129 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90a4650a-066d-455d-987d-a67b396fd4d9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.372685 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn4z8\" (UniqueName: \"kubernetes.io/projected/90a4650a-066d-455d-987d-a67b396fd4d9-kube-api-access-bn4z8\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.395923 5129 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.430678 5129 generic.go:334] "Generic (PLEG): container finished" podID="90a4650a-066d-455d-987d-a67b396fd4d9" containerID="af802b2d7421fed32265120f9b6ea9961c9329ba6323820ed2fe2213b264238c" exitCode=0 Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.431721 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"90a4650a-066d-455d-987d-a67b396fd4d9","Type":"ContainerDied","Data":"af802b2d7421fed32265120f9b6ea9961c9329ba6323820ed2fe2213b264238c"} Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.431846 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"90a4650a-066d-455d-987d-a67b396fd4d9","Type":"ContainerDied","Data":"d7838e8a795c34cca10b06e361f98d01ce99725bee18229a7703a2ac15865a32"} Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.431944 5129 scope.go:117] "RemoveContainer" containerID="af802b2d7421fed32265120f9b6ea9961c9329ba6323820ed2fe2213b264238c" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.432179 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.460104 5129 generic.go:334] "Generic (PLEG): container finished" podID="bb182a34-3807-465d-b706-929d0abe4904" containerID="0360e27fb95df7cd16afa2a836684c36cdef6ee7059adcb428f3ebdbf54b7f69" exitCode=143 Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.460172 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb182a34-3807-465d-b706-929d0abe4904","Type":"ContainerDied","Data":"0360e27fb95df7cd16afa2a836684c36cdef6ee7059adcb428f3ebdbf54b7f69"} Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.486491 5129 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.491975 5129 scope.go:117] "RemoveContainer" containerID="caa8b38235def44e44e3cb70551fb1ad580918f03302884d0e6b1226ec93a775" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.494221 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.503333 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.520971 5129 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-external-api-0"] Mar 14 07:22:42 crc kubenswrapper[5129]: E0314 07:22:42.521351 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90a4650a-066d-455d-987d-a67b396fd4d9" containerName="glance-httpd" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.521367 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a4650a-066d-455d-987d-a67b396fd4d9" containerName="glance-httpd" Mar 14 07:22:42 crc kubenswrapper[5129]: E0314 07:22:42.521380 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90a4650a-066d-455d-987d-a67b396fd4d9" containerName="glance-log" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.521386 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a4650a-066d-455d-987d-a67b396fd4d9" containerName="glance-log" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.521566 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="90a4650a-066d-455d-987d-a67b396fd4d9" containerName="glance-log" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.521579 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="90a4650a-066d-455d-987d-a67b396fd4d9" containerName="glance-httpd" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.522536 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.528952 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.529159 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.530065 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.545924 5129 scope.go:117] "RemoveContainer" containerID="af802b2d7421fed32265120f9b6ea9961c9329ba6323820ed2fe2213b264238c" Mar 14 07:22:42 crc kubenswrapper[5129]: E0314 07:22:42.549780 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af802b2d7421fed32265120f9b6ea9961c9329ba6323820ed2fe2213b264238c\": container with ID starting with af802b2d7421fed32265120f9b6ea9961c9329ba6323820ed2fe2213b264238c not found: ID does not exist" containerID="af802b2d7421fed32265120f9b6ea9961c9329ba6323820ed2fe2213b264238c" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.549815 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af802b2d7421fed32265120f9b6ea9961c9329ba6323820ed2fe2213b264238c"} err="failed to get container status \"af802b2d7421fed32265120f9b6ea9961c9329ba6323820ed2fe2213b264238c\": rpc error: code = NotFound desc = could not find container \"af802b2d7421fed32265120f9b6ea9961c9329ba6323820ed2fe2213b264238c\": container with ID starting with af802b2d7421fed32265120f9b6ea9961c9329ba6323820ed2fe2213b264238c not found: ID does not exist" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.549842 5129 scope.go:117] "RemoveContainer" 
containerID="caa8b38235def44e44e3cb70551fb1ad580918f03302884d0e6b1226ec93a775" Mar 14 07:22:42 crc kubenswrapper[5129]: E0314 07:22:42.550504 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caa8b38235def44e44e3cb70551fb1ad580918f03302884d0e6b1226ec93a775\": container with ID starting with caa8b38235def44e44e3cb70551fb1ad580918f03302884d0e6b1226ec93a775 not found: ID does not exist" containerID="caa8b38235def44e44e3cb70551fb1ad580918f03302884d0e6b1226ec93a775" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.550548 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caa8b38235def44e44e3cb70551fb1ad580918f03302884d0e6b1226ec93a775"} err="failed to get container status \"caa8b38235def44e44e3cb70551fb1ad580918f03302884d0e6b1226ec93a775\": rpc error: code = NotFound desc = could not find container \"caa8b38235def44e44e3cb70551fb1ad580918f03302884d0e6b1226ec93a775\": container with ID starting with caa8b38235def44e44e3cb70551fb1ad580918f03302884d0e6b1226ec93a775 not found: ID does not exist" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.587988 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.588030 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc 
kubenswrapper[5129]: I0314 07:22:42.588054 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73a06d78-48be-4099-b7fa-be0557b6138e-logs\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.588085 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-config-data\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.588111 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-scripts\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.588131 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfzf8\" (UniqueName: \"kubernetes.io/projected/73a06d78-48be-4099-b7fa-be0557b6138e-kube-api-access-wfzf8\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.588147 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73a06d78-48be-4099-b7fa-be0557b6138e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 
07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.588215 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.682808 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:42 crc kubenswrapper[5129]: W0314 07:22:42.683256 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b8dd3c8_eafc_404e_b160_0e03cfd43ef5.slice/crio-7c212026b3c5c3a22bc9bee324f509eb7cf06fbb7792cc46e65a4797d90dc116 WatchSource:0}: Error finding container 7c212026b3c5c3a22bc9bee324f509eb7cf06fbb7792cc46e65a4797d90dc116: Status 404 returned error can't find the container with id 7c212026b3c5c3a22bc9bee324f509eb7cf06fbb7792cc46e65a4797d90dc116 Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.690036 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-config-data\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.690094 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-scripts\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.690122 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfzf8\" 
(UniqueName: \"kubernetes.io/projected/73a06d78-48be-4099-b7fa-be0557b6138e-kube-api-access-wfzf8\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.690143 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73a06d78-48be-4099-b7fa-be0557b6138e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.690237 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.690333 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.690356 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.690545 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/73a06d78-48be-4099-b7fa-be0557b6138e-logs\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.691118 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73a06d78-48be-4099-b7fa-be0557b6138e-logs\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.691184 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73a06d78-48be-4099-b7fa-be0557b6138e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.691269 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.697404 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-config-data\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.698028 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.700205 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-scripts\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.701894 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.713960 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfzf8\" (UniqueName: \"kubernetes.io/projected/73a06d78-48be-4099-b7fa-be0557b6138e-kube-api-access-wfzf8\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.742114 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " pod="openstack/glance-default-external-api-0" Mar 14 07:22:42 crc kubenswrapper[5129]: I0314 07:22:42.860526 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:22:43 crc kubenswrapper[5129]: I0314 07:22:43.471572 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5","Type":"ContainerStarted","Data":"7c212026b3c5c3a22bc9bee324f509eb7cf06fbb7792cc46e65a4797d90dc116"} Mar 14 07:22:43 crc kubenswrapper[5129]: I0314 07:22:43.499594 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:22:43 crc kubenswrapper[5129]: W0314 07:22:43.519359 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73a06d78_48be_4099_b7fa_be0557b6138e.slice/crio-8597ce79a857cf34460f8fa1817f53b9faa5fe60bf1de30cf299a12972307200 WatchSource:0}: Error finding container 8597ce79a857cf34460f8fa1817f53b9faa5fe60bf1de30cf299a12972307200: Status 404 returned error can't find the container with id 8597ce79a857cf34460f8fa1817f53b9faa5fe60bf1de30cf299a12972307200 Mar 14 07:22:43 crc kubenswrapper[5129]: I0314 07:22:43.891404 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:22:43 crc kubenswrapper[5129]: I0314 07:22:43.958230 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f8d656d4-bfw28"] Mar 14 07:22:43 crc kubenswrapper[5129]: I0314 07:22:43.958507 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f8d656d4-bfw28" podUID="d0a291cb-33cf-45e8-8a69-697f1503e4fb" containerName="neutron-api" containerID="cri-o://79c401fa0036c778172e1bdeb74a8a13f2286ff8e77fe730b93bf628fd2f40be" gracePeriod=30 Mar 14 07:22:43 crc kubenswrapper[5129]: I0314 07:22:43.959421 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f8d656d4-bfw28" podUID="d0a291cb-33cf-45e8-8a69-697f1503e4fb" 
containerName="neutron-httpd" containerID="cri-o://14e1f2bdf58c845f4d0ee2c1e34bb9aa30c8152615ec6a1270697b0b4d949a6b" gracePeriod=30 Mar 14 07:22:44 crc kubenswrapper[5129]: I0314 07:22:44.050952 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90a4650a-066d-455d-987d-a67b396fd4d9" path="/var/lib/kubelet/pods/90a4650a-066d-455d-987d-a67b396fd4d9/volumes" Mar 14 07:22:44 crc kubenswrapper[5129]: I0314 07:22:44.484956 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5","Type":"ContainerStarted","Data":"218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2"} Mar 14 07:22:44 crc kubenswrapper[5129]: I0314 07:22:44.486671 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"73a06d78-48be-4099-b7fa-be0557b6138e","Type":"ContainerStarted","Data":"8597ce79a857cf34460f8fa1817f53b9faa5fe60bf1de30cf299a12972307200"} Mar 14 07:22:44 crc kubenswrapper[5129]: I0314 07:22:44.491304 5129 generic.go:334] "Generic (PLEG): container finished" podID="d0a291cb-33cf-45e8-8a69-697f1503e4fb" containerID="14e1f2bdf58c845f4d0ee2c1e34bb9aa30c8152615ec6a1270697b0b4d949a6b" exitCode=0 Mar 14 07:22:44 crc kubenswrapper[5129]: I0314 07:22:44.491348 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f8d656d4-bfw28" event={"ID":"d0a291cb-33cf-45e8-8a69-697f1503e4fb","Type":"ContainerDied","Data":"14e1f2bdf58c845f4d0ee2c1e34bb9aa30c8152615ec6a1270697b0b4d949a6b"} Mar 14 07:22:45 crc kubenswrapper[5129]: I0314 07:22:45.503138 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"73a06d78-48be-4099-b7fa-be0557b6138e","Type":"ContainerStarted","Data":"a0297ed0b9af03c2d6e5fa02d749275ab59da2eca117907d6e9e8d9642cba071"} Mar 14 07:22:45 crc kubenswrapper[5129]: I0314 07:22:45.505928 5129 generic.go:334] "Generic (PLEG): container finished" 
podID="bb182a34-3807-465d-b706-929d0abe4904" containerID="10f5dac652b667a1b23938a6a39a2c391a126cd5863bfdd75416622e28ddf506" exitCode=0 Mar 14 07:22:45 crc kubenswrapper[5129]: I0314 07:22:45.505962 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb182a34-3807-465d-b706-929d0abe4904","Type":"ContainerDied","Data":"10f5dac652b667a1b23938a6a39a2c391a126cd5863bfdd75416622e28ddf506"} Mar 14 07:22:47 crc kubenswrapper[5129]: I0314 07:22:47.238757 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="bb182a34-3807-465d-b706-929d0abe4904" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9292/healthcheck\": dial tcp 10.217.0.156:9292: connect: connection refused" Mar 14 07:22:47 crc kubenswrapper[5129]: I0314 07:22:47.238774 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="bb182a34-3807-465d-b706-929d0abe4904" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.156:9292/healthcheck\": dial tcp 10.217.0.156:9292: connect: connection refused" Mar 14 07:22:50 crc kubenswrapper[5129]: I0314 07:22:50.562866 5129 generic.go:334] "Generic (PLEG): container finished" podID="d0a291cb-33cf-45e8-8a69-697f1503e4fb" containerID="79c401fa0036c778172e1bdeb74a8a13f2286ff8e77fe730b93bf628fd2f40be" exitCode=0 Mar 14 07:22:50 crc kubenswrapper[5129]: I0314 07:22:50.562932 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f8d656d4-bfw28" event={"ID":"d0a291cb-33cf-45e8-8a69-697f1503e4fb","Type":"ContainerDied","Data":"79c401fa0036c778172e1bdeb74a8a13f2286ff8e77fe730b93bf628fd2f40be"} Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.159968 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.273870 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-ovndb-tls-certs\") pod \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.273943 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-combined-ca-bundle\") pod \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.274027 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-config\") pod \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.274066 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x97mr\" (UniqueName: \"kubernetes.io/projected/d0a291cb-33cf-45e8-8a69-697f1503e4fb-kube-api-access-x97mr\") pod \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.274091 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-httpd-config\") pod \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\" (UID: \"d0a291cb-33cf-45e8-8a69-697f1503e4fb\") " Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.308573 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d0a291cb-33cf-45e8-8a69-697f1503e4fb-kube-api-access-x97mr" (OuterVolumeSpecName: "kube-api-access-x97mr") pod "d0a291cb-33cf-45e8-8a69-697f1503e4fb" (UID: "d0a291cb-33cf-45e8-8a69-697f1503e4fb"). InnerVolumeSpecName "kube-api-access-x97mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.308997 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d0a291cb-33cf-45e8-8a69-697f1503e4fb" (UID: "d0a291cb-33cf-45e8-8a69-697f1503e4fb"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.376484 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x97mr\" (UniqueName: \"kubernetes.io/projected/d0a291cb-33cf-45e8-8a69-697f1503e4fb-kube-api-access-x97mr\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.376519 5129 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.392667 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0a291cb-33cf-45e8-8a69-697f1503e4fb" (UID: "d0a291cb-33cf-45e8-8a69-697f1503e4fb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.433245 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d0a291cb-33cf-45e8-8a69-697f1503e4fb" (UID: "d0a291cb-33cf-45e8-8a69-697f1503e4fb"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.451974 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-config" (OuterVolumeSpecName: "config") pod "d0a291cb-33cf-45e8-8a69-697f1503e4fb" (UID: "d0a291cb-33cf-45e8-8a69-697f1503e4fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.478417 5129 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.478766 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.478865 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0a291cb-33cf-45e8-8a69-697f1503e4fb-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.574238 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5","Type":"ContainerStarted","Data":"ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1"} Mar 14 
07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.577888 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"73a06d78-48be-4099-b7fa-be0557b6138e","Type":"ContainerStarted","Data":"94ff5f18b233b64c1c783dbe228636e28477df4e6e43805ef408f8e1f99138ec"} Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.580089 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f8d656d4-bfw28" event={"ID":"d0a291cb-33cf-45e8-8a69-697f1503e4fb","Type":"ContainerDied","Data":"e6dd8fa37c015d99e06d39dc2d2c6bca96f12281515a6eeed438bd636ae71e71"} Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.580145 5129 scope.go:117] "RemoveContainer" containerID="14e1f2bdf58c845f4d0ee2c1e34bb9aa30c8152615ec6a1270697b0b4d949a6b" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.580292 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f8d656d4-bfw28" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.582131 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-twj2g" event={"ID":"3b3c5416-85f2-4109-892a-33079d1541d9","Type":"ContainerStarted","Data":"26b4aec00cbbff617806653a000ecc4903072ef2737594d30098d8439aeda080"} Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.610044 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.61002194 podStartE2EDuration="9.61002194s" podCreationTimestamp="2026-03-14 07:22:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:51.608550431 +0000 UTC m=+1434.360465625" watchObservedRunningTime="2026-03-14 07:22:51.61002194 +0000 UTC m=+1434.361937134" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.640288 5129 scope.go:117] "RemoveContainer" 
containerID="79c401fa0036c778172e1bdeb74a8a13f2286ff8e77fe730b93bf628fd2f40be" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.662543 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f8d656d4-bfw28"] Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.672653 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7f8d656d4-bfw28"] Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.685200 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-twj2g" podStartSLOduration=1.984810157 podStartE2EDuration="13.685180701s" podCreationTimestamp="2026-03-14 07:22:38 +0000 UTC" firstStartedPulling="2026-03-14 07:22:39.408735363 +0000 UTC m=+1422.160650537" lastFinishedPulling="2026-03-14 07:22:51.109105897 +0000 UTC m=+1433.861021081" observedRunningTime="2026-03-14 07:22:51.656173615 +0000 UTC m=+1434.408088799" watchObservedRunningTime="2026-03-14 07:22:51.685180701 +0000 UTC m=+1434.437095905" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.957104 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.990656 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-scripts\") pod \"bb182a34-3807-465d-b706-929d0abe4904\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.990707 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-internal-tls-certs\") pod \"bb182a34-3807-465d-b706-929d0abe4904\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.990804 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"bb182a34-3807-465d-b706-929d0abe4904\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.990834 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb182a34-3807-465d-b706-929d0abe4904-httpd-run\") pod \"bb182a34-3807-465d-b706-929d0abe4904\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.990848 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-config-data\") pod \"bb182a34-3807-465d-b706-929d0abe4904\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.990901 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-combined-ca-bundle\") pod \"bb182a34-3807-465d-b706-929d0abe4904\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.990969 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvbd4\" (UniqueName: \"kubernetes.io/projected/bb182a34-3807-465d-b706-929d0abe4904-kube-api-access-kvbd4\") pod \"bb182a34-3807-465d-b706-929d0abe4904\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.991035 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb182a34-3807-465d-b706-929d0abe4904-logs\") pod \"bb182a34-3807-465d-b706-929d0abe4904\" (UID: \"bb182a34-3807-465d-b706-929d0abe4904\") " Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.991683 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb182a34-3807-465d-b706-929d0abe4904-logs" (OuterVolumeSpecName: "logs") pod "bb182a34-3807-465d-b706-929d0abe4904" (UID: "bb182a34-3807-465d-b706-929d0abe4904"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.991707 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb182a34-3807-465d-b706-929d0abe4904-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bb182a34-3807-465d-b706-929d0abe4904" (UID: "bb182a34-3807-465d-b706-929d0abe4904"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:51 crc kubenswrapper[5129]: I0314 07:22:51.999714 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "bb182a34-3807-465d-b706-929d0abe4904" (UID: "bb182a34-3807-465d-b706-929d0abe4904"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.015489 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb182a34-3807-465d-b706-929d0abe4904-kube-api-access-kvbd4" (OuterVolumeSpecName: "kube-api-access-kvbd4") pod "bb182a34-3807-465d-b706-929d0abe4904" (UID: "bb182a34-3807-465d-b706-929d0abe4904"). InnerVolumeSpecName "kube-api-access-kvbd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.017060 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-scripts" (OuterVolumeSpecName: "scripts") pod "bb182a34-3807-465d-b706-929d0abe4904" (UID: "bb182a34-3807-465d-b706-929d0abe4904"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.066118 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a291cb-33cf-45e8-8a69-697f1503e4fb" path="/var/lib/kubelet/pods/d0a291cb-33cf-45e8-8a69-697f1503e4fb/volumes" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.066426 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb182a34-3807-465d-b706-929d0abe4904" (UID: "bb182a34-3807-465d-b706-929d0abe4904"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.096340 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-config-data" (OuterVolumeSpecName: "config-data") pod "bb182a34-3807-465d-b706-929d0abe4904" (UID: "bb182a34-3807-465d-b706-929d0abe4904"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.097592 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvbd4\" (UniqueName: \"kubernetes.io/projected/bb182a34-3807-465d-b706-929d0abe4904-kube-api-access-kvbd4\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.097689 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb182a34-3807-465d-b706-929d0abe4904-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.097701 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.097722 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.097732 5129 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb182a34-3807-465d-b706-929d0abe4904-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.097742 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-config-data\") 
on node \"crc\" DevicePath \"\"" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.097751 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.113941 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bb182a34-3807-465d-b706-929d0abe4904" (UID: "bb182a34-3807-465d-b706-929d0abe4904"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.148301 5129 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.199178 5129 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb182a34-3807-465d-b706-929d0abe4904-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.199213 5129 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.605547 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5","Type":"ContainerStarted","Data":"c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b"} Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.608171 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"bb182a34-3807-465d-b706-929d0abe4904","Type":"ContainerDied","Data":"7f5fafd24b6f41853d26333565ae435f18109848a05677376eb869366381cbfa"} Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.608241 5129 scope.go:117] "RemoveContainer" containerID="10f5dac652b667a1b23938a6a39a2c391a126cd5863bfdd75416622e28ddf506" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.608242 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.641497 5129 scope.go:117] "RemoveContainer" containerID="0360e27fb95df7cd16afa2a836684c36cdef6ee7059adcb428f3ebdbf54b7f69" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.650253 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.678812 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.692809 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:22:52 crc kubenswrapper[5129]: E0314 07:22:52.694059 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a291cb-33cf-45e8-8a69-697f1503e4fb" containerName="neutron-httpd" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.694145 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a291cb-33cf-45e8-8a69-697f1503e4fb" containerName="neutron-httpd" Mar 14 07:22:52 crc kubenswrapper[5129]: E0314 07:22:52.694232 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb182a34-3807-465d-b706-929d0abe4904" containerName="glance-httpd" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.694301 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb182a34-3807-465d-b706-929d0abe4904" containerName="glance-httpd" Mar 14 07:22:52 crc 
kubenswrapper[5129]: E0314 07:22:52.694359 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a291cb-33cf-45e8-8a69-697f1503e4fb" containerName="neutron-api" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.694416 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a291cb-33cf-45e8-8a69-697f1503e4fb" containerName="neutron-api" Mar 14 07:22:52 crc kubenswrapper[5129]: E0314 07:22:52.694479 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb182a34-3807-465d-b706-929d0abe4904" containerName="glance-log" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.694535 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb182a34-3807-465d-b706-929d0abe4904" containerName="glance-log" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.694772 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a291cb-33cf-45e8-8a69-697f1503e4fb" containerName="neutron-httpd" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.694845 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb182a34-3807-465d-b706-929d0abe4904" containerName="glance-httpd" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.694913 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a291cb-33cf-45e8-8a69-697f1503e4fb" containerName="neutron-api" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.694974 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb182a34-3807-465d-b706-929d0abe4904" containerName="glance-log" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.696046 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.700220 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.709584 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.727525 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.812786 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7wpl\" (UniqueName: \"kubernetes.io/projected/b0a5119c-8784-48e5-841a-654dc253f0d0-kube-api-access-v7wpl\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.812855 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.812965 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0a5119c-8784-48e5-841a-654dc253f0d0-logs\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.812998 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.813048 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.813072 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.813089 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.813111 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0a5119c-8784-48e5-841a-654dc253f0d0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.862488 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.862747 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.891999 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.903808 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.914822 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.914878 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.914903 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.914934 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0a5119c-8784-48e5-841a-654dc253f0d0-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.915014 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7wpl\" (UniqueName: \"kubernetes.io/projected/b0a5119c-8784-48e5-841a-654dc253f0d0-kube-api-access-v7wpl\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.915059 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.915150 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0a5119c-8784-48e5-841a-654dc253f0d0-logs\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.915183 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.915907 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0a5119c-8784-48e5-841a-654dc253f0d0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.915919 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0a5119c-8784-48e5-841a-654dc253f0d0-logs\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.915927 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.922252 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.922626 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.923212 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 
07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.935217 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.940258 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7wpl\" (UniqueName: \"kubernetes.io/projected/b0a5119c-8784-48e5-841a-654dc253f0d0-kube-api-access-v7wpl\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:52 crc kubenswrapper[5129]: I0314 07:22:52.977064 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " pod="openstack/glance-default-internal-api-0" Mar 14 07:22:53 crc kubenswrapper[5129]: I0314 07:22:53.016614 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:22:53 crc kubenswrapper[5129]: I0314 07:22:53.617649 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 14 07:22:53 crc kubenswrapper[5129]: I0314 07:22:53.618071 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 14 07:22:53 crc kubenswrapper[5129]: I0314 07:22:53.675163 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:22:53 crc kubenswrapper[5129]: W0314 07:22:53.687103 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0a5119c_8784_48e5_841a_654dc253f0d0.slice/crio-b183b39a72e933f206d3aff379c314c3a7df9dcb04f04bccb4ce2810cbc81c44 WatchSource:0}: Error finding container b183b39a72e933f206d3aff379c314c3a7df9dcb04f04bccb4ce2810cbc81c44: Status 404 returned error can't find the container with id b183b39a72e933f206d3aff379c314c3a7df9dcb04f04bccb4ce2810cbc81c44 Mar 14 07:22:54 crc kubenswrapper[5129]: I0314 07:22:54.048959 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb182a34-3807-465d-b706-929d0abe4904" path="/var/lib/kubelet/pods/bb182a34-3807-465d-b706-929d0abe4904/volumes" Mar 14 07:22:54 crc kubenswrapper[5129]: I0314 07:22:54.636456 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b0a5119c-8784-48e5-841a-654dc253f0d0","Type":"ContainerStarted","Data":"487da9597dbcf0671505d2cad76544a39daed693b3625176056d417c8e515636"} Mar 14 07:22:54 crc kubenswrapper[5129]: I0314 07:22:54.636791 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"b0a5119c-8784-48e5-841a-654dc253f0d0","Type":"ContainerStarted","Data":"b183b39a72e933f206d3aff379c314c3a7df9dcb04f04bccb4ce2810cbc81c44"} Mar 14 07:22:54 crc kubenswrapper[5129]: I0314 07:22:54.644293 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5","Type":"ContainerStarted","Data":"5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1"} Mar 14 07:22:54 crc kubenswrapper[5129]: I0314 07:22:54.644565 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerName="sg-core" containerID="cri-o://c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b" gracePeriod=30 Mar 14 07:22:54 crc kubenswrapper[5129]: I0314 07:22:54.644642 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerName="proxy-httpd" containerID="cri-o://5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1" gracePeriod=30 Mar 14 07:22:54 crc kubenswrapper[5129]: I0314 07:22:54.644584 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerName="ceilometer-central-agent" containerID="cri-o://218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2" gracePeriod=30 Mar 14 07:22:54 crc kubenswrapper[5129]: I0314 07:22:54.644656 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerName="ceilometer-notification-agent" containerID="cri-o://ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1" gracePeriod=30 Mar 14 07:22:54 crc kubenswrapper[5129]: I0314 07:22:54.713820 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.74484053 podStartE2EDuration="13.713797775s" podCreationTimestamp="2026-03-14 07:22:41 +0000 UTC" firstStartedPulling="2026-03-14 07:22:42.68641274 +0000 UTC m=+1425.438327924" lastFinishedPulling="2026-03-14 07:22:53.655369975 +0000 UTC m=+1436.407285169" observedRunningTime="2026-03-14 07:22:54.707708202 +0000 UTC m=+1437.459623396" watchObservedRunningTime="2026-03-14 07:22:54.713797775 +0000 UTC m=+1437.465712959" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.395242 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.473344 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-combined-ca-bundle\") pod \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.473406 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79dvc\" (UniqueName: \"kubernetes.io/projected/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-kube-api-access-79dvc\") pod \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.473435 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-run-httpd\") pod \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.473502 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-scripts\") pod 
\"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.473538 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-log-httpd\") pod \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.473634 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-config-data\") pod \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.473715 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-sg-core-conf-yaml\") pod \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\" (UID: \"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5\") " Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.474432 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" (UID: "0b8dd3c8-eafc-404e-b160-0e03cfd43ef5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.475138 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" (UID: "0b8dd3c8-eafc-404e-b160-0e03cfd43ef5"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.482745 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-scripts" (OuterVolumeSpecName: "scripts") pod "0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" (UID: "0b8dd3c8-eafc-404e-b160-0e03cfd43ef5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.484585 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-kube-api-access-79dvc" (OuterVolumeSpecName: "kube-api-access-79dvc") pod "0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" (UID: "0b8dd3c8-eafc-404e-b160-0e03cfd43ef5"). InnerVolumeSpecName "kube-api-access-79dvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.504992 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" (UID: "0b8dd3c8-eafc-404e-b160-0e03cfd43ef5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.567502 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" (UID: "0b8dd3c8-eafc-404e-b160-0e03cfd43ef5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.575691 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.575724 5129 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.575732 5129 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.575742 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.575753 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79dvc\" (UniqueName: \"kubernetes.io/projected/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-kube-api-access-79dvc\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.575761 5129 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.594315 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-config-data" (OuterVolumeSpecName: "config-data") pod "0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" (UID: "0b8dd3c8-eafc-404e-b160-0e03cfd43ef5"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.655558 5129 generic.go:334] "Generic (PLEG): container finished" podID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerID="5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1" exitCode=0 Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.655594 5129 generic.go:334] "Generic (PLEG): container finished" podID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerID="c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b" exitCode=2 Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.655624 5129 generic.go:334] "Generic (PLEG): container finished" podID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerID="ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1" exitCode=0 Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.655634 5129 generic.go:334] "Generic (PLEG): container finished" podID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerID="218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2" exitCode=0 Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.655683 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5","Type":"ContainerDied","Data":"5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1"} Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.655713 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5","Type":"ContainerDied","Data":"c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b"} Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.655729 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5","Type":"ContainerDied","Data":"ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1"} 
Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.655740 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5","Type":"ContainerDied","Data":"218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2"} Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.655751 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b8dd3c8-eafc-404e-b160-0e03cfd43ef5","Type":"ContainerDied","Data":"7c212026b3c5c3a22bc9bee324f509eb7cf06fbb7792cc46e65a4797d90dc116"} Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.655771 5129 scope.go:117] "RemoveContainer" containerID="5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.655915 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.663455 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b0a5119c-8784-48e5-841a-654dc253f0d0","Type":"ContainerStarted","Data":"51a6a8c222fecf3f936e6b32ca1163ecf7305b993569a3ab2d01e4f6863e6e3d"} Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.677550 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.686794 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.686777487 podStartE2EDuration="3.686777487s" podCreationTimestamp="2026-03-14 07:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:22:55.681773104 +0000 UTC 
m=+1438.433688288" watchObservedRunningTime="2026-03-14 07:22:55.686777487 +0000 UTC m=+1438.438692671" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.754451 5129 scope.go:117] "RemoveContainer" containerID="c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.757768 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.767926 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.788120 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:55 crc kubenswrapper[5129]: E0314 07:22:55.788520 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerName="sg-core" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.788542 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerName="sg-core" Mar 14 07:22:55 crc kubenswrapper[5129]: E0314 07:22:55.788554 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerName="ceilometer-notification-agent" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.788561 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerName="ceilometer-notification-agent" Mar 14 07:22:55 crc kubenswrapper[5129]: E0314 07:22:55.788586 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerName="proxy-httpd" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.788592 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerName="proxy-httpd" Mar 14 07:22:55 crc kubenswrapper[5129]: E0314 07:22:55.788621 5129 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerName="ceilometer-central-agent" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.788627 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerName="ceilometer-central-agent" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.788792 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerName="sg-core" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.788805 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerName="ceilometer-notification-agent" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.788820 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerName="proxy-httpd" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.788829 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" containerName="ceilometer-central-agent" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.790340 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.792780 5129 scope.go:117] "RemoveContainer" containerID="ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.793175 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.794091 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.806105 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.829839 5129 scope.go:117] "RemoveContainer" containerID="218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.855651 5129 scope.go:117] "RemoveContainer" containerID="5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1" Mar 14 07:22:55 crc kubenswrapper[5129]: E0314 07:22:55.856160 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1\": container with ID starting with 5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1 not found: ID does not exist" containerID="5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.856187 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1"} err="failed to get container status \"5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1\": rpc error: code = NotFound desc = could not find container \"5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1\": 
container with ID starting with 5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1 not found: ID does not exist" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.856206 5129 scope.go:117] "RemoveContainer" containerID="c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b" Mar 14 07:22:55 crc kubenswrapper[5129]: E0314 07:22:55.856567 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b\": container with ID starting with c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b not found: ID does not exist" containerID="c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.856585 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b"} err="failed to get container status \"c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b\": rpc error: code = NotFound desc = could not find container \"c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b\": container with ID starting with c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b not found: ID does not exist" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.856652 5129 scope.go:117] "RemoveContainer" containerID="ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1" Mar 14 07:22:55 crc kubenswrapper[5129]: E0314 07:22:55.856996 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1\": container with ID starting with ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1 not found: ID does not exist" 
containerID="ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.857014 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1"} err="failed to get container status \"ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1\": rpc error: code = NotFound desc = could not find container \"ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1\": container with ID starting with ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1 not found: ID does not exist" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.857026 5129 scope.go:117] "RemoveContainer" containerID="218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2" Mar 14 07:22:55 crc kubenswrapper[5129]: E0314 07:22:55.857354 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2\": container with ID starting with 218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2 not found: ID does not exist" containerID="218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.857371 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2"} err="failed to get container status \"218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2\": rpc error: code = NotFound desc = could not find container \"218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2\": container with ID starting with 218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2 not found: ID does not exist" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.857386 5129 scope.go:117] 
"RemoveContainer" containerID="5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.857735 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1"} err="failed to get container status \"5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1\": rpc error: code = NotFound desc = could not find container \"5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1\": container with ID starting with 5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1 not found: ID does not exist" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.857754 5129 scope.go:117] "RemoveContainer" containerID="c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.858080 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b"} err="failed to get container status \"c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b\": rpc error: code = NotFound desc = could not find container \"c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b\": container with ID starting with c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b not found: ID does not exist" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.858095 5129 scope.go:117] "RemoveContainer" containerID="ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.858390 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1"} err="failed to get container status \"ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1\": rpc error: code = 
NotFound desc = could not find container \"ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1\": container with ID starting with ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1 not found: ID does not exist" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.858418 5129 scope.go:117] "RemoveContainer" containerID="218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.858720 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2"} err="failed to get container status \"218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2\": rpc error: code = NotFound desc = could not find container \"218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2\": container with ID starting with 218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2 not found: ID does not exist" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.858736 5129 scope.go:117] "RemoveContainer" containerID="5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.859037 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1"} err="failed to get container status \"5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1\": rpc error: code = NotFound desc = could not find container \"5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1\": container with ID starting with 5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1 not found: ID does not exist" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.859053 5129 scope.go:117] "RemoveContainer" containerID="c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b" Mar 14 07:22:55 crc 
kubenswrapper[5129]: I0314 07:22:55.859343 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b"} err="failed to get container status \"c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b\": rpc error: code = NotFound desc = could not find container \"c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b\": container with ID starting with c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b not found: ID does not exist" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.859358 5129 scope.go:117] "RemoveContainer" containerID="ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.859691 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1"} err="failed to get container status \"ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1\": rpc error: code = NotFound desc = could not find container \"ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1\": container with ID starting with ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1 not found: ID does not exist" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.859846 5129 scope.go:117] "RemoveContainer" containerID="218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.860098 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2"} err="failed to get container status \"218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2\": rpc error: code = NotFound desc = could not find container \"218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2\": container 
with ID starting with 218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2 not found: ID does not exist" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.860125 5129 scope.go:117] "RemoveContainer" containerID="5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.860317 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1"} err="failed to get container status \"5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1\": rpc error: code = NotFound desc = could not find container \"5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1\": container with ID starting with 5af7ca9fd78afc2ed1147691c14f0a7e2ba90131b1882f0876ae8676c8ba77d1 not found: ID does not exist" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.860331 5129 scope.go:117] "RemoveContainer" containerID="c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.860550 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b"} err="failed to get container status \"c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b\": rpc error: code = NotFound desc = could not find container \"c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b\": container with ID starting with c741a598513a1117bf3d4aaed3a731303527de91307bc3b908d72d7b91018f4b not found: ID does not exist" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.860564 5129 scope.go:117] "RemoveContainer" containerID="ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.860840 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1"} err="failed to get container status \"ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1\": rpc error: code = NotFound desc = could not find container \"ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1\": container with ID starting with ff7ceb52c2705816c8037fd72a4d8640ab85869dd6b0f2a9a2144876ae950bd1 not found: ID does not exist" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.860856 5129 scope.go:117] "RemoveContainer" containerID="218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.861068 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2"} err="failed to get container status \"218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2\": rpc error: code = NotFound desc = could not find container \"218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2\": container with ID starting with 218a088394b0cceeffa9ffccede744c435cb9d51bc6138b66f818f54f74cc3f2 not found: ID does not exist" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.874850 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.882104 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snq5x\" (UniqueName: \"kubernetes.io/projected/d0e587b9-b290-49ee-8a37-e2361f851cce-kube-api-access-snq5x\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.882335 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-config-data\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.882437 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0e587b9-b290-49ee-8a37-e2361f851cce-run-httpd\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.882560 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-scripts\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.882630 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.882709 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.882861 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0e587b9-b290-49ee-8a37-e2361f851cce-log-httpd\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " 
pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.984278 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0e587b9-b290-49ee-8a37-e2361f851cce-log-httpd\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.984378 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snq5x\" (UniqueName: \"kubernetes.io/projected/d0e587b9-b290-49ee-8a37-e2361f851cce-kube-api-access-snq5x\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.984482 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-config-data\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.984506 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0e587b9-b290-49ee-8a37-e2361f851cce-run-httpd\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.984528 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-scripts\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.984552 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.984591 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.986042 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0e587b9-b290-49ee-8a37-e2361f851cce-log-httpd\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.987007 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0e587b9-b290-49ee-8a37-e2361f851cce-run-httpd\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:55 crc kubenswrapper[5129]: I0314 07:22:55.989634 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-scripts\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:56 crc kubenswrapper[5129]: I0314 07:22:56.000583 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:56 crc kubenswrapper[5129]: I0314 07:22:56.000947 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-config-data\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:56 crc kubenswrapper[5129]: I0314 07:22:56.001393 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:56 crc kubenswrapper[5129]: I0314 07:22:56.005087 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snq5x\" (UniqueName: \"kubernetes.io/projected/d0e587b9-b290-49ee-8a37-e2361f851cce-kube-api-access-snq5x\") pod \"ceilometer-0\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") " pod="openstack/ceilometer-0" Mar 14 07:22:56 crc kubenswrapper[5129]: I0314 07:22:56.064914 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b8dd3c8-eafc-404e-b160-0e03cfd43ef5" path="/var/lib/kubelet/pods/0b8dd3c8-eafc-404e-b160-0e03cfd43ef5/volumes" Mar 14 07:22:56 crc kubenswrapper[5129]: I0314 07:22:56.117395 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:22:56 crc kubenswrapper[5129]: I0314 07:22:56.565113 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:22:56 crc kubenswrapper[5129]: W0314 07:22:56.568530 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0e587b9_b290_49ee_8a37_e2361f851cce.slice/crio-55a0c0fdc935f2c37fa0d21d255e21e47a2330198fdfdef18b48a61132dc7275 WatchSource:0}: Error finding container 55a0c0fdc935f2c37fa0d21d255e21e47a2330198fdfdef18b48a61132dc7275: Status 404 returned error can't find the container with id 55a0c0fdc935f2c37fa0d21d255e21e47a2330198fdfdef18b48a61132dc7275 Mar 14 07:22:56 crc kubenswrapper[5129]: I0314 07:22:56.674099 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0e587b9-b290-49ee-8a37-e2361f851cce","Type":"ContainerStarted","Data":"55a0c0fdc935f2c37fa0d21d255e21e47a2330198fdfdef18b48a61132dc7275"} Mar 14 07:22:57 crc kubenswrapper[5129]: I0314 07:22:57.686350 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0e587b9-b290-49ee-8a37-e2361f851cce","Type":"ContainerStarted","Data":"f1ab5e3229d7acb95d3eaea771d1a7955c52b913499168ee1ea37df94246515b"} Mar 14 07:22:58 crc kubenswrapper[5129]: I0314 07:22:58.728932 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0e587b9-b290-49ee-8a37-e2361f851cce","Type":"ContainerStarted","Data":"65d47493973c79787014de7c5903612a5282e2795407a74d810c9c3fe5e32a51"} Mar 14 07:22:58 crc kubenswrapper[5129]: I0314 07:22:58.729482 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0e587b9-b290-49ee-8a37-e2361f851cce","Type":"ContainerStarted","Data":"f68dbdb9c34b9a6f0c6a384c150d198248ac13725efebfdfbe21cee430b64510"} Mar 14 07:23:01 crc kubenswrapper[5129]: I0314 
07:23:01.758005 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0e587b9-b290-49ee-8a37-e2361f851cce","Type":"ContainerStarted","Data":"b3fb74538ed34b936845fe007c5d7a0cf991bc11d8f7e9baa7027731601596c9"} Mar 14 07:23:01 crc kubenswrapper[5129]: I0314 07:23:01.759787 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 07:23:01 crc kubenswrapper[5129]: I0314 07:23:01.762033 5129 generic.go:334] "Generic (PLEG): container finished" podID="3b3c5416-85f2-4109-892a-33079d1541d9" containerID="26b4aec00cbbff617806653a000ecc4903072ef2737594d30098d8439aeda080" exitCode=0 Mar 14 07:23:01 crc kubenswrapper[5129]: I0314 07:23:01.762073 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-twj2g" event={"ID":"3b3c5416-85f2-4109-892a-33079d1541d9","Type":"ContainerDied","Data":"26b4aec00cbbff617806653a000ecc4903072ef2737594d30098d8439aeda080"} Mar 14 07:23:01 crc kubenswrapper[5129]: I0314 07:23:01.794891 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.580041515 podStartE2EDuration="6.794057815s" podCreationTimestamp="2026-03-14 07:22:55 +0000 UTC" firstStartedPulling="2026-03-14 07:22:56.570720559 +0000 UTC m=+1439.322635743" lastFinishedPulling="2026-03-14 07:23:00.784736849 +0000 UTC m=+1443.536652043" observedRunningTime="2026-03-14 07:23:01.785385972 +0000 UTC m=+1444.537301156" watchObservedRunningTime="2026-03-14 07:23:01.794057815 +0000 UTC m=+1444.545972999" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.016959 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.017254 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 14 07:23:03 crc 
kubenswrapper[5129]: I0314 07:23:03.059915 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.063402 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.136038 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-twj2g" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.224083 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xftf6\" (UniqueName: \"kubernetes.io/projected/3b3c5416-85f2-4109-892a-33079d1541d9-kube-api-access-xftf6\") pod \"3b3c5416-85f2-4109-892a-33079d1541d9\" (UID: \"3b3c5416-85f2-4109-892a-33079d1541d9\") " Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.224162 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b3c5416-85f2-4109-892a-33079d1541d9-config-data\") pod \"3b3c5416-85f2-4109-892a-33079d1541d9\" (UID: \"3b3c5416-85f2-4109-892a-33079d1541d9\") " Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.224278 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b3c5416-85f2-4109-892a-33079d1541d9-combined-ca-bundle\") pod \"3b3c5416-85f2-4109-892a-33079d1541d9\" (UID: \"3b3c5416-85f2-4109-892a-33079d1541d9\") " Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.224377 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b3c5416-85f2-4109-892a-33079d1541d9-scripts\") pod \"3b3c5416-85f2-4109-892a-33079d1541d9\" (UID: \"3b3c5416-85f2-4109-892a-33079d1541d9\") " Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 
07:23:03.229847 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b3c5416-85f2-4109-892a-33079d1541d9-scripts" (OuterVolumeSpecName: "scripts") pod "3b3c5416-85f2-4109-892a-33079d1541d9" (UID: "3b3c5416-85f2-4109-892a-33079d1541d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.230136 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b3c5416-85f2-4109-892a-33079d1541d9-kube-api-access-xftf6" (OuterVolumeSpecName: "kube-api-access-xftf6") pod "3b3c5416-85f2-4109-892a-33079d1541d9" (UID: "3b3c5416-85f2-4109-892a-33079d1541d9"). InnerVolumeSpecName "kube-api-access-xftf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.254323 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b3c5416-85f2-4109-892a-33079d1541d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b3c5416-85f2-4109-892a-33079d1541d9" (UID: "3b3c5416-85f2-4109-892a-33079d1541d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.254562 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b3c5416-85f2-4109-892a-33079d1541d9-config-data" (OuterVolumeSpecName: "config-data") pod "3b3c5416-85f2-4109-892a-33079d1541d9" (UID: "3b3c5416-85f2-4109-892a-33079d1541d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.327044 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b3c5416-85f2-4109-892a-33079d1541d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.327076 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b3c5416-85f2-4109-892a-33079d1541d9-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.327085 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xftf6\" (UniqueName: \"kubernetes.io/projected/3b3c5416-85f2-4109-892a-33079d1541d9-kube-api-access-xftf6\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.327097 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b3c5416-85f2-4109-892a-33079d1541d9-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.782803 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-twj2g" event={"ID":"3b3c5416-85f2-4109-892a-33079d1541d9","Type":"ContainerDied","Data":"67751cd09f0b0f699d1d45740e7b8cc7e01b12ce8135f4d322dd550d4549d150"} Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.783289 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67751cd09f0b0f699d1d45740e7b8cc7e01b12ce8135f4d322dd550d4549d150" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.782887 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-twj2g" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.798071 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.798149 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.904662 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 07:23:03 crc kubenswrapper[5129]: E0314 07:23:03.905111 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b3c5416-85f2-4109-892a-33079d1541d9" containerName="nova-cell0-conductor-db-sync" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.905128 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b3c5416-85f2-4109-892a-33079d1541d9" containerName="nova-cell0-conductor-db-sync" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.905286 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b3c5416-85f2-4109-892a-33079d1541d9" containerName="nova-cell0-conductor-db-sync" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.905921 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.909498 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-q47ks" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.909813 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.920955 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.938395 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/288de2f6-818d-4167-8511-76f958542fbd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"288de2f6-818d-4167-8511-76f958542fbd\") " pod="openstack/nova-cell0-conductor-0" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.938934 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/288de2f6-818d-4167-8511-76f958542fbd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"288de2f6-818d-4167-8511-76f958542fbd\") " pod="openstack/nova-cell0-conductor-0" Mar 14 07:23:03 crc kubenswrapper[5129]: I0314 07:23:03.939224 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmcqr\" (UniqueName: \"kubernetes.io/projected/288de2f6-818d-4167-8511-76f958542fbd-kube-api-access-kmcqr\") pod \"nova-cell0-conductor-0\" (UID: \"288de2f6-818d-4167-8511-76f958542fbd\") " pod="openstack/nova-cell0-conductor-0" Mar 14 07:23:04 crc kubenswrapper[5129]: I0314 07:23:04.040768 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/288de2f6-818d-4167-8511-76f958542fbd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"288de2f6-818d-4167-8511-76f958542fbd\") " pod="openstack/nova-cell0-conductor-0" Mar 14 07:23:04 crc kubenswrapper[5129]: I0314 07:23:04.040855 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmcqr\" (UniqueName: \"kubernetes.io/projected/288de2f6-818d-4167-8511-76f958542fbd-kube-api-access-kmcqr\") pod \"nova-cell0-conductor-0\" (UID: \"288de2f6-818d-4167-8511-76f958542fbd\") " pod="openstack/nova-cell0-conductor-0" Mar 14 07:23:04 crc kubenswrapper[5129]: I0314 07:23:04.040914 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/288de2f6-818d-4167-8511-76f958542fbd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"288de2f6-818d-4167-8511-76f958542fbd\") " pod="openstack/nova-cell0-conductor-0" Mar 14 07:23:04 crc kubenswrapper[5129]: I0314 07:23:04.056348 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/288de2f6-818d-4167-8511-76f958542fbd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"288de2f6-818d-4167-8511-76f958542fbd\") " pod="openstack/nova-cell0-conductor-0" Mar 14 07:23:04 crc kubenswrapper[5129]: I0314 07:23:04.056882 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmcqr\" (UniqueName: \"kubernetes.io/projected/288de2f6-818d-4167-8511-76f958542fbd-kube-api-access-kmcqr\") pod \"nova-cell0-conductor-0\" (UID: \"288de2f6-818d-4167-8511-76f958542fbd\") " pod="openstack/nova-cell0-conductor-0" Mar 14 07:23:04 crc kubenswrapper[5129]: I0314 07:23:04.060813 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/288de2f6-818d-4167-8511-76f958542fbd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"288de2f6-818d-4167-8511-76f958542fbd\") " pod="openstack/nova-cell0-conductor-0" Mar 14 07:23:04 crc kubenswrapper[5129]: I0314 07:23:04.276990 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 14 07:23:04 crc kubenswrapper[5129]: I0314 07:23:04.726129 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 07:23:04 crc kubenswrapper[5129]: W0314 07:23:04.729914 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod288de2f6_818d_4167_8511_76f958542fbd.slice/crio-bd03bffeeaeddba14cd269c20a41101d398671043ab76fd99636edd7cd4a2a76 WatchSource:0}: Error finding container bd03bffeeaeddba14cd269c20a41101d398671043ab76fd99636edd7cd4a2a76: Status 404 returned error can't find the container with id bd03bffeeaeddba14cd269c20a41101d398671043ab76fd99636edd7cd4a2a76 Mar 14 07:23:04 crc kubenswrapper[5129]: I0314 07:23:04.791103 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"288de2f6-818d-4167-8511-76f958542fbd","Type":"ContainerStarted","Data":"bd03bffeeaeddba14cd269c20a41101d398671043ab76fd99636edd7cd4a2a76"} Mar 14 07:23:05 crc kubenswrapper[5129]: I0314 07:23:05.638554 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 14 07:23:05 crc kubenswrapper[5129]: I0314 07:23:05.721317 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 14 07:23:05 crc kubenswrapper[5129]: I0314 07:23:05.801499 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"288de2f6-818d-4167-8511-76f958542fbd","Type":"ContainerStarted","Data":"c1fcdf4a873899697c71e1744e78008c9ad86529e509e2cf0e32fd619d7c1cb1"} Mar 14 07:23:05 crc kubenswrapper[5129]: I0314 
07:23:05.829070 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.829049254 podStartE2EDuration="2.829049254s" podCreationTimestamp="2026-03-14 07:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:05.816307983 +0000 UTC m=+1448.568223167" watchObservedRunningTime="2026-03-14 07:23:05.829049254 +0000 UTC m=+1448.580964438" Mar 14 07:23:06 crc kubenswrapper[5129]: I0314 07:23:06.808750 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 14 07:23:09 crc kubenswrapper[5129]: I0314 07:23:09.322752 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 14 07:23:09 crc kubenswrapper[5129]: I0314 07:23:09.914685 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mvf6f"] Mar 14 07:23:09 crc kubenswrapper[5129]: I0314 07:23:09.916043 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mvf6f" Mar 14 07:23:09 crc kubenswrapper[5129]: I0314 07:23:09.920791 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 14 07:23:09 crc kubenswrapper[5129]: I0314 07:23:09.921285 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 14 07:23:09 crc kubenswrapper[5129]: I0314 07:23:09.979823 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mvf6f"] Mar 14 07:23:09 crc kubenswrapper[5129]: I0314 07:23:09.981299 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ca802f-a617-4766-a97b-e8bafe556ce5-config-data\") pod \"nova-cell0-cell-mapping-mvf6f\" (UID: \"b3ca802f-a617-4766-a97b-e8bafe556ce5\") " pod="openstack/nova-cell0-cell-mapping-mvf6f" Mar 14 07:23:09 crc kubenswrapper[5129]: I0314 07:23:09.981350 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ca802f-a617-4766-a97b-e8bafe556ce5-scripts\") pod \"nova-cell0-cell-mapping-mvf6f\" (UID: \"b3ca802f-a617-4766-a97b-e8bafe556ce5\") " pod="openstack/nova-cell0-cell-mapping-mvf6f" Mar 14 07:23:09 crc kubenswrapper[5129]: I0314 07:23:09.981401 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ca802f-a617-4766-a97b-e8bafe556ce5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mvf6f\" (UID: \"b3ca802f-a617-4766-a97b-e8bafe556ce5\") " pod="openstack/nova-cell0-cell-mapping-mvf6f" Mar 14 07:23:09 crc kubenswrapper[5129]: I0314 07:23:09.981433 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bd5m\" (UniqueName: 
\"kubernetes.io/projected/b3ca802f-a617-4766-a97b-e8bafe556ce5-kube-api-access-6bd5m\") pod \"nova-cell0-cell-mapping-mvf6f\" (UID: \"b3ca802f-a617-4766-a97b-e8bafe556ce5\") " pod="openstack/nova-cell0-cell-mapping-mvf6f" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.083023 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ca802f-a617-4766-a97b-e8bafe556ce5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mvf6f\" (UID: \"b3ca802f-a617-4766-a97b-e8bafe556ce5\") " pod="openstack/nova-cell0-cell-mapping-mvf6f" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.083104 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bd5m\" (UniqueName: \"kubernetes.io/projected/b3ca802f-a617-4766-a97b-e8bafe556ce5-kube-api-access-6bd5m\") pod \"nova-cell0-cell-mapping-mvf6f\" (UID: \"b3ca802f-a617-4766-a97b-e8bafe556ce5\") " pod="openstack/nova-cell0-cell-mapping-mvf6f" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.084028 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ca802f-a617-4766-a97b-e8bafe556ce5-config-data\") pod \"nova-cell0-cell-mapping-mvf6f\" (UID: \"b3ca802f-a617-4766-a97b-e8bafe556ce5\") " pod="openstack/nova-cell0-cell-mapping-mvf6f" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.084104 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ca802f-a617-4766-a97b-e8bafe556ce5-scripts\") pod \"nova-cell0-cell-mapping-mvf6f\" (UID: \"b3ca802f-a617-4766-a97b-e8bafe556ce5\") " pod="openstack/nova-cell0-cell-mapping-mvf6f" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.094301 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b3ca802f-a617-4766-a97b-e8bafe556ce5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mvf6f\" (UID: \"b3ca802f-a617-4766-a97b-e8bafe556ce5\") " pod="openstack/nova-cell0-cell-mapping-mvf6f" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.094360 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ca802f-a617-4766-a97b-e8bafe556ce5-config-data\") pod \"nova-cell0-cell-mapping-mvf6f\" (UID: \"b3ca802f-a617-4766-a97b-e8bafe556ce5\") " pod="openstack/nova-cell0-cell-mapping-mvf6f" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.095364 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ca802f-a617-4766-a97b-e8bafe556ce5-scripts\") pod \"nova-cell0-cell-mapping-mvf6f\" (UID: \"b3ca802f-a617-4766-a97b-e8bafe556ce5\") " pod="openstack/nova-cell0-cell-mapping-mvf6f" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.105445 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.107779 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.113456 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.117117 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bd5m\" (UniqueName: \"kubernetes.io/projected/b3ca802f-a617-4766-a97b-e8bafe556ce5-kube-api-access-6bd5m\") pod \"nova-cell0-cell-mapping-mvf6f\" (UID: \"b3ca802f-a617-4766-a97b-e8bafe556ce5\") " pod="openstack/nova-cell0-cell-mapping-mvf6f" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.134326 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.137696 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.140932 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.150254 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.182053 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.186351 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743861a7-8a16-4a62-8339-1a02ec991d70-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"743861a7-8a16-4a62-8339-1a02ec991d70\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.186386 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5mz6\" (UniqueName: 
\"kubernetes.io/projected/743861a7-8a16-4a62-8339-1a02ec991d70-kube-api-access-n5mz6\") pod \"nova-scheduler-0\" (UID: \"743861a7-8a16-4a62-8339-1a02ec991d70\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.186439 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233658de-bdd1-4846-bfbd-69142d762c00-config-data\") pod \"nova-api-0\" (UID: \"233658de-bdd1-4846-bfbd-69142d762c00\") " pod="openstack/nova-api-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.186459 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233658de-bdd1-4846-bfbd-69142d762c00-logs\") pod \"nova-api-0\" (UID: \"233658de-bdd1-4846-bfbd-69142d762c00\") " pod="openstack/nova-api-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.186475 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233658de-bdd1-4846-bfbd-69142d762c00-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"233658de-bdd1-4846-bfbd-69142d762c00\") " pod="openstack/nova-api-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.186524 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743861a7-8a16-4a62-8339-1a02ec991d70-config-data\") pod \"nova-scheduler-0\" (UID: \"743861a7-8a16-4a62-8339-1a02ec991d70\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.186591 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrngp\" (UniqueName: \"kubernetes.io/projected/233658de-bdd1-4846-bfbd-69142d762c00-kube-api-access-zrngp\") pod \"nova-api-0\" (UID: 
\"233658de-bdd1-4846-bfbd-69142d762c00\") " pod="openstack/nova-api-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.247250 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mvf6f" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.269828 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.271697 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.275910 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.291798 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743861a7-8a16-4a62-8339-1a02ec991d70-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"743861a7-8a16-4a62-8339-1a02ec991d70\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.291847 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5mz6\" (UniqueName: \"kubernetes.io/projected/743861a7-8a16-4a62-8339-1a02ec991d70-kube-api-access-n5mz6\") pod \"nova-scheduler-0\" (UID: \"743861a7-8a16-4a62-8339-1a02ec991d70\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.291898 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233658de-bdd1-4846-bfbd-69142d762c00-config-data\") pod \"nova-api-0\" (UID: \"233658de-bdd1-4846-bfbd-69142d762c00\") " pod="openstack/nova-api-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.291927 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/233658de-bdd1-4846-bfbd-69142d762c00-logs\") pod \"nova-api-0\" (UID: \"233658de-bdd1-4846-bfbd-69142d762c00\") " pod="openstack/nova-api-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.291951 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233658de-bdd1-4846-bfbd-69142d762c00-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"233658de-bdd1-4846-bfbd-69142d762c00\") " pod="openstack/nova-api-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.292015 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743861a7-8a16-4a62-8339-1a02ec991d70-config-data\") pod \"nova-scheduler-0\" (UID: \"743861a7-8a16-4a62-8339-1a02ec991d70\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.292108 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrngp\" (UniqueName: \"kubernetes.io/projected/233658de-bdd1-4846-bfbd-69142d762c00-kube-api-access-zrngp\") pod \"nova-api-0\" (UID: \"233658de-bdd1-4846-bfbd-69142d762c00\") " pod="openstack/nova-api-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.301629 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233658de-bdd1-4846-bfbd-69142d762c00-logs\") pod \"nova-api-0\" (UID: \"233658de-bdd1-4846-bfbd-69142d762c00\") " pod="openstack/nova-api-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.306232 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233658de-bdd1-4846-bfbd-69142d762c00-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"233658de-bdd1-4846-bfbd-69142d762c00\") " pod="openstack/nova-api-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.313732 
5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743861a7-8a16-4a62-8339-1a02ec991d70-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"743861a7-8a16-4a62-8339-1a02ec991d70\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.314254 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743861a7-8a16-4a62-8339-1a02ec991d70-config-data\") pod \"nova-scheduler-0\" (UID: \"743861a7-8a16-4a62-8339-1a02ec991d70\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.314515 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233658de-bdd1-4846-bfbd-69142d762c00-config-data\") pod \"nova-api-0\" (UID: \"233658de-bdd1-4846-bfbd-69142d762c00\") " pod="openstack/nova-api-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.332036 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrngp\" (UniqueName: \"kubernetes.io/projected/233658de-bdd1-4846-bfbd-69142d762c00-kube-api-access-zrngp\") pod \"nova-api-0\" (UID: \"233658de-bdd1-4846-bfbd-69142d762c00\") " pod="openstack/nova-api-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.333493 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.334673 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.341000 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.358017 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.365668 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5mz6\" (UniqueName: \"kubernetes.io/projected/743861a7-8a16-4a62-8339-1a02ec991d70-kube-api-access-n5mz6\") pod \"nova-scheduler-0\" (UID: \"743861a7-8a16-4a62-8339-1a02ec991d70\") " pod="openstack/nova-scheduler-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.372952 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.394527 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.394623 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzp25\" (UniqueName: \"kubernetes.io/projected/61d5618c-9c6e-4837-a29e-132de4b39fb4-kube-api-access-kzp25\") pod \"nova-cell1-novncproxy-0\" (UID: \"61d5618c-9c6e-4837-a29e-132de4b39fb4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.394646 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjm9x\" (UniqueName: 
\"kubernetes.io/projected/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-kube-api-access-gjm9x\") pod \"nova-metadata-0\" (UID: \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.394695 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-logs\") pod \"nova-metadata-0\" (UID: \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.394718 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d5618c-9c6e-4837-a29e-132de4b39fb4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"61d5618c-9c6e-4837-a29e-132de4b39fb4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.394752 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-config-data\") pod \"nova-metadata-0\" (UID: \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.394773 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61d5618c-9c6e-4837-a29e-132de4b39fb4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"61d5618c-9c6e-4837-a29e-132de4b39fb4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.400262 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69b4446475-ttqvv"] Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.401830 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.413655 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-ttqvv"] Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.501475 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-logs\") pod \"nova-metadata-0\" (UID: \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.501809 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-ovsdbserver-sb\") pod \"dnsmasq-dns-69b4446475-ttqvv\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.501837 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d5618c-9c6e-4837-a29e-132de4b39fb4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"61d5618c-9c6e-4837-a29e-132de4b39fb4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.501867 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-dns-svc\") pod \"dnsmasq-dns-69b4446475-ttqvv\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.501890 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-dns-swift-storage-0\") pod \"dnsmasq-dns-69b4446475-ttqvv\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.501907 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-config-data\") pod \"nova-metadata-0\" (UID: \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.501929 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61d5618c-9c6e-4837-a29e-132de4b39fb4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"61d5618c-9c6e-4837-a29e-132de4b39fb4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.502094 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj7s2\" (UniqueName: \"kubernetes.io/projected/6622666d-efc0-49fe-84d3-d8b8113a2ee2-kube-api-access-bj7s2\") pod \"dnsmasq-dns-69b4446475-ttqvv\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.502119 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.502175 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-config\") pod 
\"dnsmasq-dns-69b4446475-ttqvv\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.502194 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-ovsdbserver-nb\") pod \"dnsmasq-dns-69b4446475-ttqvv\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.502212 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzp25\" (UniqueName: \"kubernetes.io/projected/61d5618c-9c6e-4837-a29e-132de4b39fb4-kube-api-access-kzp25\") pod \"nova-cell1-novncproxy-0\" (UID: \"61d5618c-9c6e-4837-a29e-132de4b39fb4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.502235 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjm9x\" (UniqueName: \"kubernetes.io/projected/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-kube-api-access-gjm9x\") pod \"nova-metadata-0\" (UID: \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.503709 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-logs\") pod \"nova-metadata-0\" (UID: \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.507808 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.507951 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d5618c-9c6e-4837-a29e-132de4b39fb4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"61d5618c-9c6e-4837-a29e-132de4b39fb4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.507951 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.509022 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61d5618c-9c6e-4837-a29e-132de4b39fb4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"61d5618c-9c6e-4837-a29e-132de4b39fb4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.510860 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-config-data\") pod \"nova-metadata-0\" (UID: \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.516988 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.520505 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjm9x\" (UniqueName: \"kubernetes.io/projected/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-kube-api-access-gjm9x\") pod \"nova-metadata-0\" (UID: \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.521644 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzp25\" (UniqueName: \"kubernetes.io/projected/61d5618c-9c6e-4837-a29e-132de4b39fb4-kube-api-access-kzp25\") pod \"nova-cell1-novncproxy-0\" (UID: \"61d5618c-9c6e-4837-a29e-132de4b39fb4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.605166 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-config\") pod \"dnsmasq-dns-69b4446475-ttqvv\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.605240 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-config\") pod \"dnsmasq-dns-69b4446475-ttqvv\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.605272 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-ovsdbserver-nb\") pod \"dnsmasq-dns-69b4446475-ttqvv\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 
07:23:10.605373 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-ovsdbserver-sb\") pod \"dnsmasq-dns-69b4446475-ttqvv\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.605418 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-dns-svc\") pod \"dnsmasq-dns-69b4446475-ttqvv\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.606578 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-ovsdbserver-sb\") pod \"dnsmasq-dns-69b4446475-ttqvv\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.607532 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-dns-svc\") pod \"dnsmasq-dns-69b4446475-ttqvv\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.607569 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-ovsdbserver-nb\") pod \"dnsmasq-dns-69b4446475-ttqvv\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.607664 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-dns-swift-storage-0\") pod \"dnsmasq-dns-69b4446475-ttqvv\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.608723 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-dns-swift-storage-0\") pod \"dnsmasq-dns-69b4446475-ttqvv\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.608935 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj7s2\" (UniqueName: \"kubernetes.io/projected/6622666d-efc0-49fe-84d3-d8b8113a2ee2-kube-api-access-bj7s2\") pod \"dnsmasq-dns-69b4446475-ttqvv\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.628765 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj7s2\" (UniqueName: \"kubernetes.io/projected/6622666d-efc0-49fe-84d3-d8b8113a2ee2-kube-api-access-bj7s2\") pod \"dnsmasq-dns-69b4446475-ttqvv\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.759407 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.774336 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.794300 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.807857 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mvf6f"] Mar 14 07:23:10 crc kubenswrapper[5129]: W0314 07:23:10.812416 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3ca802f_a617_4766_a97b_e8bafe556ce5.slice/crio-3d81b2604eedbac7b0946b70c9fb85dbedb407442347d0eab0f397262bc52f1d WatchSource:0}: Error finding container 3d81b2604eedbac7b0946b70c9fb85dbedb407442347d0eab0f397262bc52f1d: Status 404 returned error can't find the container with id 3d81b2604eedbac7b0946b70c9fb85dbedb407442347d0eab0f397262bc52f1d Mar 14 07:23:10 crc kubenswrapper[5129]: I0314 07:23:10.859286 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mvf6f" event={"ID":"b3ca802f-a617-4766-a97b-e8bafe556ce5","Type":"ContainerStarted","Data":"3d81b2604eedbac7b0946b70c9fb85dbedb407442347d0eab0f397262bc52f1d"} Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.014742 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jmzpz"] Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.016753 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jmzpz" Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.019815 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.019891 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.060625 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jmzpz"] Mar 14 07:23:11 crc kubenswrapper[5129]: W0314 07:23:11.092650 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod743861a7_8a16_4a62_8339_1a02ec991d70.slice/crio-79d8fdbb4a624cddd7d801f571b6bfc39abcc502782a953cc54dec02745f9ad7 WatchSource:0}: Error finding container 79d8fdbb4a624cddd7d801f571b6bfc39abcc502782a953cc54dec02745f9ad7: Status 404 returned error can't find the container with id 79d8fdbb4a624cddd7d801f571b6bfc39abcc502782a953cc54dec02745f9ad7 Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.119303 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.124176 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21298577-e14d-4394-9a86-f9e488d33659-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jmzpz\" (UID: \"21298577-e14d-4394-9a86-f9e488d33659\") " pod="openstack/nova-cell1-conductor-db-sync-jmzpz" Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.124595 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21298577-e14d-4394-9a86-f9e488d33659-config-data\") pod 
\"nova-cell1-conductor-db-sync-jmzpz\" (UID: \"21298577-e14d-4394-9a86-f9e488d33659\") " pod="openstack/nova-cell1-conductor-db-sync-jmzpz" Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.125001 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21298577-e14d-4394-9a86-f9e488d33659-scripts\") pod \"nova-cell1-conductor-db-sync-jmzpz\" (UID: \"21298577-e14d-4394-9a86-f9e488d33659\") " pod="openstack/nova-cell1-conductor-db-sync-jmzpz" Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.125058 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrb8x\" (UniqueName: \"kubernetes.io/projected/21298577-e14d-4394-9a86-f9e488d33659-kube-api-access-qrb8x\") pod \"nova-cell1-conductor-db-sync-jmzpz\" (UID: \"21298577-e14d-4394-9a86-f9e488d33659\") " pod="openstack/nova-cell1-conductor-db-sync-jmzpz" Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.224806 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.226673 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21298577-e14d-4394-9a86-f9e488d33659-scripts\") pod \"nova-cell1-conductor-db-sync-jmzpz\" (UID: \"21298577-e14d-4394-9a86-f9e488d33659\") " pod="openstack/nova-cell1-conductor-db-sync-jmzpz" Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.226730 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrb8x\" (UniqueName: \"kubernetes.io/projected/21298577-e14d-4394-9a86-f9e488d33659-kube-api-access-qrb8x\") pod \"nova-cell1-conductor-db-sync-jmzpz\" (UID: \"21298577-e14d-4394-9a86-f9e488d33659\") " pod="openstack/nova-cell1-conductor-db-sync-jmzpz" Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.226775 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21298577-e14d-4394-9a86-f9e488d33659-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jmzpz\" (UID: \"21298577-e14d-4394-9a86-f9e488d33659\") " pod="openstack/nova-cell1-conductor-db-sync-jmzpz" Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.226850 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21298577-e14d-4394-9a86-f9e488d33659-config-data\") pod \"nova-cell1-conductor-db-sync-jmzpz\" (UID: \"21298577-e14d-4394-9a86-f9e488d33659\") " pod="openstack/nova-cell1-conductor-db-sync-jmzpz" Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.239647 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21298577-e14d-4394-9a86-f9e488d33659-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jmzpz\" (UID: \"21298577-e14d-4394-9a86-f9e488d33659\") " pod="openstack/nova-cell1-conductor-db-sync-jmzpz" Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.247209 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21298577-e14d-4394-9a86-f9e488d33659-scripts\") pod \"nova-cell1-conductor-db-sync-jmzpz\" (UID: \"21298577-e14d-4394-9a86-f9e488d33659\") " pod="openstack/nova-cell1-conductor-db-sync-jmzpz" Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.251619 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21298577-e14d-4394-9a86-f9e488d33659-config-data\") pod \"nova-cell1-conductor-db-sync-jmzpz\" (UID: \"21298577-e14d-4394-9a86-f9e488d33659\") " pod="openstack/nova-cell1-conductor-db-sync-jmzpz" Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.252583 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qrb8x\" (UniqueName: \"kubernetes.io/projected/21298577-e14d-4394-9a86-f9e488d33659-kube-api-access-qrb8x\") pod \"nova-cell1-conductor-db-sync-jmzpz\" (UID: \"21298577-e14d-4394-9a86-f9e488d33659\") " pod="openstack/nova-cell1-conductor-db-sync-jmzpz" Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.354129 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.367139 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:23:11 crc kubenswrapper[5129]: W0314 07:23:11.370956 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61d5618c_9c6e_4837_a29e_132de4b39fb4.slice/crio-b4f600446ad9ade115ee60899571cc449305d37a122faf39908f60b8d1aae567 WatchSource:0}: Error finding container b4f600446ad9ade115ee60899571cc449305d37a122faf39908f60b8d1aae567: Status 404 returned error can't find the container with id b4f600446ad9ade115ee60899571cc449305d37a122faf39908f60b8d1aae567 Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.382394 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jmzpz" Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.534169 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-ttqvv"] Mar 14 07:23:11 crc kubenswrapper[5129]: W0314 07:23:11.535739 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6622666d_efc0_49fe_84d3_d8b8113a2ee2.slice/crio-6f4c55f3734dd8d051cff530ce688022fffc996d0e721c664d0a18bc4764d82d WatchSource:0}: Error finding container 6f4c55f3734dd8d051cff530ce688022fffc996d0e721c664d0a18bc4764d82d: Status 404 returned error can't find the container with id 6f4c55f3734dd8d051cff530ce688022fffc996d0e721c664d0a18bc4764d82d Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.871400 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"233658de-bdd1-4846-bfbd-69142d762c00","Type":"ContainerStarted","Data":"4ac77b7705811ff389a06246fa65be810d5010b7f4a0272275a4a9478cfae82a"} Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.873302 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mvf6f" event={"ID":"b3ca802f-a617-4766-a97b-e8bafe556ce5","Type":"ContainerStarted","Data":"4d62c3f1e4e66d656296bfd613aa4d41a73979018105522b96e8ba63503eb9d8"} Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.879833 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b0ac46d2-d376-41c8-a5c0-b88e24948a7b","Type":"ContainerStarted","Data":"9668e0ebc00f1a65be712f2b1e7a2dcd3d25b185da009b01eece639c41c0bf5b"} Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.881704 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"61d5618c-9c6e-4837-a29e-132de4b39fb4","Type":"ContainerStarted","Data":"b4f600446ad9ade115ee60899571cc449305d37a122faf39908f60b8d1aae567"} Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.885053 5129 generic.go:334] "Generic (PLEG): container finished" podID="6622666d-efc0-49fe-84d3-d8b8113a2ee2" containerID="f198537c8754f43f44effac2466caac39bf8f9d449d09322cf8548c5d6827c9a" exitCode=0 Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.885119 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-ttqvv" event={"ID":"6622666d-efc0-49fe-84d3-d8b8113a2ee2","Type":"ContainerDied","Data":"f198537c8754f43f44effac2466caac39bf8f9d449d09322cf8548c5d6827c9a"} Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.885146 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-ttqvv" event={"ID":"6622666d-efc0-49fe-84d3-d8b8113a2ee2","Type":"ContainerStarted","Data":"6f4c55f3734dd8d051cff530ce688022fffc996d0e721c664d0a18bc4764d82d"} Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.887728 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jmzpz"] Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.890848 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"743861a7-8a16-4a62-8339-1a02ec991d70","Type":"ContainerStarted","Data":"79d8fdbb4a624cddd7d801f571b6bfc39abcc502782a953cc54dec02745f9ad7"} Mar 14 07:23:11 crc kubenswrapper[5129]: I0314 07:23:11.894925 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mvf6f" podStartSLOduration=2.894905112 podStartE2EDuration="2.894905112s" podCreationTimestamp="2026-03-14 07:23:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:11.889456816 +0000 UTC m=+1454.641372010" 
watchObservedRunningTime="2026-03-14 07:23:11.894905112 +0000 UTC m=+1454.646820296" Mar 14 07:23:12 crc kubenswrapper[5129]: I0314 07:23:12.917200 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-ttqvv" event={"ID":"6622666d-efc0-49fe-84d3-d8b8113a2ee2","Type":"ContainerStarted","Data":"f9ae1e749252c55621298f2cb13073bee66ef1ee70765ca5bf47b8190a1976d9"} Mar 14 07:23:12 crc kubenswrapper[5129]: I0314 07:23:12.918746 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:12 crc kubenswrapper[5129]: I0314 07:23:12.922118 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jmzpz" event={"ID":"21298577-e14d-4394-9a86-f9e488d33659","Type":"ContainerStarted","Data":"3ad8b9fa221f117d804328f3d1d23c11e664b5ca221ea492eeb968a42da6af5c"} Mar 14 07:23:12 crc kubenswrapper[5129]: I0314 07:23:12.922150 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jmzpz" event={"ID":"21298577-e14d-4394-9a86-f9e488d33659","Type":"ContainerStarted","Data":"f1d9c9b1765f4ad1158efcdd82e9fd114d9e0f1783d29187b9a794377465ae2a"} Mar 14 07:23:12 crc kubenswrapper[5129]: I0314 07:23:12.942205 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69b4446475-ttqvv" podStartSLOduration=2.942186713 podStartE2EDuration="2.942186713s" podCreationTimestamp="2026-03-14 07:23:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:12.937673652 +0000 UTC m=+1455.689588856" watchObservedRunningTime="2026-03-14 07:23:12.942186713 +0000 UTC m=+1455.694101897" Mar 14 07:23:12 crc kubenswrapper[5129]: I0314 07:23:12.961938 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-jmzpz" 
podStartSLOduration=2.961916821 podStartE2EDuration="2.961916821s" podCreationTimestamp="2026-03-14 07:23:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:12.953168607 +0000 UTC m=+1455.705083791" watchObservedRunningTime="2026-03-14 07:23:12.961916821 +0000 UTC m=+1455.713832005" Mar 14 07:23:13 crc kubenswrapper[5129]: I0314 07:23:13.588786 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:23:13 crc kubenswrapper[5129]: I0314 07:23:13.650251 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:23:15 crc kubenswrapper[5129]: I0314 07:23:15.080488 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 14 07:23:15 crc kubenswrapper[5129]: I0314 07:23:15.948485 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"61d5618c-9c6e-4837-a29e-132de4b39fb4","Type":"ContainerStarted","Data":"0b9da3364ea01b83674a5a6e3b83350d9864f04d94c06411a5408eb52341cc40"} Mar 14 07:23:15 crc kubenswrapper[5129]: I0314 07:23:15.948549 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="61d5618c-9c6e-4837-a29e-132de4b39fb4" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0b9da3364ea01b83674a5a6e3b83350d9864f04d94c06411a5408eb52341cc40" gracePeriod=30 Mar 14 07:23:15 crc kubenswrapper[5129]: I0314 07:23:15.950623 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"743861a7-8a16-4a62-8339-1a02ec991d70","Type":"ContainerStarted","Data":"468a284437ca9315b5c15a3b465d012ec3c4b805e5ca596cf6de9cf9fbcadd1c"} Mar 14 07:23:15 crc kubenswrapper[5129]: I0314 07:23:15.952934 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"233658de-bdd1-4846-bfbd-69142d762c00","Type":"ContainerStarted","Data":"69b55afa4ee1f3efacdc9809a48cc52fe70ff527760dd98f532e08fec2c12140"} Mar 14 07:23:15 crc kubenswrapper[5129]: I0314 07:23:15.952971 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"233658de-bdd1-4846-bfbd-69142d762c00","Type":"ContainerStarted","Data":"2f2335b7d933152ee16b84997130fe5f7b51caf3a94ec4b5ab5113489e415c4c"} Mar 14 07:23:15 crc kubenswrapper[5129]: I0314 07:23:15.954829 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b0ac46d2-d376-41c8-a5c0-b88e24948a7b","Type":"ContainerStarted","Data":"744393b9dfe14f2f291da6833621cd71e230fae3adcc89cb434d136e3d2fe295"} Mar 14 07:23:15 crc kubenswrapper[5129]: I0314 07:23:15.954854 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b0ac46d2-d376-41c8-a5c0-b88e24948a7b","Type":"ContainerStarted","Data":"6acf083f2a67a1fe203e1676c8aca02464774c50503351cf26e62018455140ef"} Mar 14 07:23:15 crc kubenswrapper[5129]: I0314 07:23:15.954930 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b0ac46d2-d376-41c8-a5c0-b88e24948a7b" containerName="nova-metadata-log" containerID="cri-o://6acf083f2a67a1fe203e1676c8aca02464774c50503351cf26e62018455140ef" gracePeriod=30 Mar 14 07:23:15 crc kubenswrapper[5129]: I0314 07:23:15.954968 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b0ac46d2-d376-41c8-a5c0-b88e24948a7b" containerName="nova-metadata-metadata" containerID="cri-o://744393b9dfe14f2f291da6833621cd71e230fae3adcc89cb434d136e3d2fe295" gracePeriod=30 Mar 14 07:23:15 crc kubenswrapper[5129]: I0314 07:23:15.969251 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.696220063 
podStartE2EDuration="5.969233615s" podCreationTimestamp="2026-03-14 07:23:10 +0000 UTC" firstStartedPulling="2026-03-14 07:23:11.375724801 +0000 UTC m=+1454.127639985" lastFinishedPulling="2026-03-14 07:23:14.648738313 +0000 UTC m=+1457.400653537" observedRunningTime="2026-03-14 07:23:15.964647942 +0000 UTC m=+1458.716563136" watchObservedRunningTime="2026-03-14 07:23:15.969233615 +0000 UTC m=+1458.721148799" Mar 14 07:23:15 crc kubenswrapper[5129]: I0314 07:23:15.989342 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.557231903 podStartE2EDuration="5.989317202s" podCreationTimestamp="2026-03-14 07:23:10 +0000 UTC" firstStartedPulling="2026-03-14 07:23:11.224933596 +0000 UTC m=+1453.976848780" lastFinishedPulling="2026-03-14 07:23:14.657018895 +0000 UTC m=+1457.408934079" observedRunningTime="2026-03-14 07:23:15.984253276 +0000 UTC m=+1458.736168460" watchObservedRunningTime="2026-03-14 07:23:15.989317202 +0000 UTC m=+1458.741232386" Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.009461 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.727198561 podStartE2EDuration="6.00944276s" podCreationTimestamp="2026-03-14 07:23:10 +0000 UTC" firstStartedPulling="2026-03-14 07:23:11.366566146 +0000 UTC m=+1454.118481330" lastFinishedPulling="2026-03-14 07:23:14.648810345 +0000 UTC m=+1457.400725529" observedRunningTime="2026-03-14 07:23:16.002037652 +0000 UTC m=+1458.753952836" watchObservedRunningTime="2026-03-14 07:23:16.00944276 +0000 UTC m=+1458.761357934" Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.572262 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.600771 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.047245064 podStartE2EDuration="6.600749441s" podCreationTimestamp="2026-03-14 07:23:10 +0000 UTC" firstStartedPulling="2026-03-14 07:23:11.09537155 +0000 UTC m=+1453.847286734" lastFinishedPulling="2026-03-14 07:23:14.648875927 +0000 UTC m=+1457.400791111" observedRunningTime="2026-03-14 07:23:16.023298301 +0000 UTC m=+1458.775213485" watchObservedRunningTime="2026-03-14 07:23:16.600749441 +0000 UTC m=+1459.352664625" Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.639746 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-config-data\") pod \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\" (UID: \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\") " Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.639833 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-logs\") pod \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\" (UID: \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\") " Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.639898 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-combined-ca-bundle\") pod \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\" (UID: \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\") " Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.640047 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjm9x\" (UniqueName: \"kubernetes.io/projected/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-kube-api-access-gjm9x\") pod 
\"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\" (UID: \"b0ac46d2-d376-41c8-a5c0-b88e24948a7b\") " Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.640560 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-logs" (OuterVolumeSpecName: "logs") pod "b0ac46d2-d376-41c8-a5c0-b88e24948a7b" (UID: "b0ac46d2-d376-41c8-a5c0-b88e24948a7b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.644892 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-kube-api-access-gjm9x" (OuterVolumeSpecName: "kube-api-access-gjm9x") pod "b0ac46d2-d376-41c8-a5c0-b88e24948a7b" (UID: "b0ac46d2-d376-41c8-a5c0-b88e24948a7b"). InnerVolumeSpecName "kube-api-access-gjm9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.666965 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0ac46d2-d376-41c8-a5c0-b88e24948a7b" (UID: "b0ac46d2-d376-41c8-a5c0-b88e24948a7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.674483 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-config-data" (OuterVolumeSpecName: "config-data") pod "b0ac46d2-d376-41c8-a5c0-b88e24948a7b" (UID: "b0ac46d2-d376-41c8-a5c0-b88e24948a7b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.741664 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjm9x\" (UniqueName: \"kubernetes.io/projected/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-kube-api-access-gjm9x\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.741695 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.741705 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.741713 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ac46d2-d376-41c8-a5c0-b88e24948a7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.971197 5129 generic.go:334] "Generic (PLEG): container finished" podID="b0ac46d2-d376-41c8-a5c0-b88e24948a7b" containerID="744393b9dfe14f2f291da6833621cd71e230fae3adcc89cb434d136e3d2fe295" exitCode=0 Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.971239 5129 generic.go:334] "Generic (PLEG): container finished" podID="b0ac46d2-d376-41c8-a5c0-b88e24948a7b" containerID="6acf083f2a67a1fe203e1676c8aca02464774c50503351cf26e62018455140ef" exitCode=143 Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.971838 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b0ac46d2-d376-41c8-a5c0-b88e24948a7b","Type":"ContainerDied","Data":"744393b9dfe14f2f291da6833621cd71e230fae3adcc89cb434d136e3d2fe295"} Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.971869 5129 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b0ac46d2-d376-41c8-a5c0-b88e24948a7b","Type":"ContainerDied","Data":"6acf083f2a67a1fe203e1676c8aca02464774c50503351cf26e62018455140ef"} Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.971882 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b0ac46d2-d376-41c8-a5c0-b88e24948a7b","Type":"ContainerDied","Data":"9668e0ebc00f1a65be712f2b1e7a2dcd3d25b185da009b01eece639c41c0bf5b"} Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.971896 5129 scope.go:117] "RemoveContainer" containerID="744393b9dfe14f2f291da6833621cd71e230fae3adcc89cb434d136e3d2fe295" Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.971898 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:23:16 crc kubenswrapper[5129]: I0314 07:23:16.994454 5129 scope.go:117] "RemoveContainer" containerID="6acf083f2a67a1fe203e1676c8aca02464774c50503351cf26e62018455140ef" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.005322 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.015481 5129 scope.go:117] "RemoveContainer" containerID="744393b9dfe14f2f291da6833621cd71e230fae3adcc89cb434d136e3d2fe295" Mar 14 07:23:17 crc kubenswrapper[5129]: E0314 07:23:17.016155 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"744393b9dfe14f2f291da6833621cd71e230fae3adcc89cb434d136e3d2fe295\": container with ID starting with 744393b9dfe14f2f291da6833621cd71e230fae3adcc89cb434d136e3d2fe295 not found: ID does not exist" containerID="744393b9dfe14f2f291da6833621cd71e230fae3adcc89cb434d136e3d2fe295" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.016184 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"744393b9dfe14f2f291da6833621cd71e230fae3adcc89cb434d136e3d2fe295"} err="failed to get container status \"744393b9dfe14f2f291da6833621cd71e230fae3adcc89cb434d136e3d2fe295\": rpc error: code = NotFound desc = could not find container \"744393b9dfe14f2f291da6833621cd71e230fae3adcc89cb434d136e3d2fe295\": container with ID starting with 744393b9dfe14f2f291da6833621cd71e230fae3adcc89cb434d136e3d2fe295 not found: ID does not exist" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.016207 5129 scope.go:117] "RemoveContainer" containerID="6acf083f2a67a1fe203e1676c8aca02464774c50503351cf26e62018455140ef" Mar 14 07:23:17 crc kubenswrapper[5129]: E0314 07:23:17.016950 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6acf083f2a67a1fe203e1676c8aca02464774c50503351cf26e62018455140ef\": container with ID starting with 6acf083f2a67a1fe203e1676c8aca02464774c50503351cf26e62018455140ef not found: ID does not exist" containerID="6acf083f2a67a1fe203e1676c8aca02464774c50503351cf26e62018455140ef" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.016977 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6acf083f2a67a1fe203e1676c8aca02464774c50503351cf26e62018455140ef"} err="failed to get container status \"6acf083f2a67a1fe203e1676c8aca02464774c50503351cf26e62018455140ef\": rpc error: code = NotFound desc = could not find container \"6acf083f2a67a1fe203e1676c8aca02464774c50503351cf26e62018455140ef\": container with ID starting with 6acf083f2a67a1fe203e1676c8aca02464774c50503351cf26e62018455140ef not found: ID does not exist" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.016995 5129 scope.go:117] "RemoveContainer" containerID="744393b9dfe14f2f291da6833621cd71e230fae3adcc89cb434d136e3d2fe295" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.017190 5129 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"744393b9dfe14f2f291da6833621cd71e230fae3adcc89cb434d136e3d2fe295"} err="failed to get container status \"744393b9dfe14f2f291da6833621cd71e230fae3adcc89cb434d136e3d2fe295\": rpc error: code = NotFound desc = could not find container \"744393b9dfe14f2f291da6833621cd71e230fae3adcc89cb434d136e3d2fe295\": container with ID starting with 744393b9dfe14f2f291da6833621cd71e230fae3adcc89cb434d136e3d2fe295 not found: ID does not exist" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.017216 5129 scope.go:117] "RemoveContainer" containerID="6acf083f2a67a1fe203e1676c8aca02464774c50503351cf26e62018455140ef" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.017397 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6acf083f2a67a1fe203e1676c8aca02464774c50503351cf26e62018455140ef"} err="failed to get container status \"6acf083f2a67a1fe203e1676c8aca02464774c50503351cf26e62018455140ef\": rpc error: code = NotFound desc = could not find container \"6acf083f2a67a1fe203e1676c8aca02464774c50503351cf26e62018455140ef\": container with ID starting with 6acf083f2a67a1fe203e1676c8aca02464774c50503351cf26e62018455140ef not found: ID does not exist" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.020976 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.034281 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:23:17 crc kubenswrapper[5129]: E0314 07:23:17.034834 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ac46d2-d376-41c8-a5c0-b88e24948a7b" containerName="nova-metadata-metadata" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.034862 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ac46d2-d376-41c8-a5c0-b88e24948a7b" containerName="nova-metadata-metadata" Mar 14 07:23:17 crc 
kubenswrapper[5129]: E0314 07:23:17.034891 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ac46d2-d376-41c8-a5c0-b88e24948a7b" containerName="nova-metadata-log" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.034899 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ac46d2-d376-41c8-a5c0-b88e24948a7b" containerName="nova-metadata-log" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.035149 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ac46d2-d376-41c8-a5c0-b88e24948a7b" containerName="nova-metadata-log" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.035172 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ac46d2-d376-41c8-a5c0-b88e24948a7b" containerName="nova-metadata-metadata" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.036078 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.038263 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.038581 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.049950 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.148581 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93ede4e3-f6c3-4d3b-becd-c5493390e15b-logs\") pod \"nova-metadata-0\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.148678 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ede4e3-f6c3-4d3b-becd-c5493390e15b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.148754 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ede4e3-f6c3-4d3b-becd-c5493390e15b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.148804 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93ede4e3-f6c3-4d3b-becd-c5493390e15b-config-data\") pod \"nova-metadata-0\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.148819 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvbjw\" (UniqueName: \"kubernetes.io/projected/93ede4e3-f6c3-4d3b-becd-c5493390e15b-kube-api-access-dvbjw\") pod \"nova-metadata-0\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.250505 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ede4e3-f6c3-4d3b-becd-c5493390e15b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.250544 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/93ede4e3-f6c3-4d3b-becd-c5493390e15b-config-data\") pod \"nova-metadata-0\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.250559 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvbjw\" (UniqueName: \"kubernetes.io/projected/93ede4e3-f6c3-4d3b-becd-c5493390e15b-kube-api-access-dvbjw\") pod \"nova-metadata-0\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.250683 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93ede4e3-f6c3-4d3b-becd-c5493390e15b-logs\") pod \"nova-metadata-0\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.250730 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ede4e3-f6c3-4d3b-becd-c5493390e15b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.252010 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93ede4e3-f6c3-4d3b-becd-c5493390e15b-logs\") pod \"nova-metadata-0\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.254570 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ede4e3-f6c3-4d3b-becd-c5493390e15b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:17 
crc kubenswrapper[5129]: I0314 07:23:17.254648 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93ede4e3-f6c3-4d3b-becd-c5493390e15b-config-data\") pod \"nova-metadata-0\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.256219 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ede4e3-f6c3-4d3b-becd-c5493390e15b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.279200 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvbjw\" (UniqueName: \"kubernetes.io/projected/93ede4e3-f6c3-4d3b-becd-c5493390e15b-kube-api-access-dvbjw\") pod \"nova-metadata-0\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") " pod="openstack/nova-metadata-0" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.353134 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.840916 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:23:17 crc kubenswrapper[5129]: W0314 07:23:17.844075 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93ede4e3_f6c3_4d3b_becd_c5493390e15b.slice/crio-c8322e68ec30620e42a5e04f3b244c872f677f063d921955a5dce9de015d56d5 WatchSource:0}: Error finding container c8322e68ec30620e42a5e04f3b244c872f677f063d921955a5dce9de015d56d5: Status 404 returned error can't find the container with id c8322e68ec30620e42a5e04f3b244c872f677f063d921955a5dce9de015d56d5 Mar 14 07:23:17 crc kubenswrapper[5129]: I0314 07:23:17.982247 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93ede4e3-f6c3-4d3b-becd-c5493390e15b","Type":"ContainerStarted","Data":"c8322e68ec30620e42a5e04f3b244c872f677f063d921955a5dce9de015d56d5"} Mar 14 07:23:18 crc kubenswrapper[5129]: I0314 07:23:18.067276 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0ac46d2-d376-41c8-a5c0-b88e24948a7b" path="/var/lib/kubelet/pods/b0ac46d2-d376-41c8-a5c0-b88e24948a7b/volumes" Mar 14 07:23:19 crc kubenswrapper[5129]: I0314 07:23:19.000192 5129 generic.go:334] "Generic (PLEG): container finished" podID="21298577-e14d-4394-9a86-f9e488d33659" containerID="3ad8b9fa221f117d804328f3d1d23c11e664b5ca221ea492eeb968a42da6af5c" exitCode=0 Mar 14 07:23:19 crc kubenswrapper[5129]: I0314 07:23:19.000386 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jmzpz" event={"ID":"21298577-e14d-4394-9a86-f9e488d33659","Type":"ContainerDied","Data":"3ad8b9fa221f117d804328f3d1d23c11e664b5ca221ea492eeb968a42da6af5c"} Mar 14 07:23:19 crc kubenswrapper[5129]: I0314 07:23:19.003076 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"93ede4e3-f6c3-4d3b-becd-c5493390e15b","Type":"ContainerStarted","Data":"04f6ec1266c9b39336b806d1751efa5f4b0fb7b9eede3ccc0e677bccd550381c"} Mar 14 07:23:19 crc kubenswrapper[5129]: I0314 07:23:19.003100 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93ede4e3-f6c3-4d3b-becd-c5493390e15b","Type":"ContainerStarted","Data":"3feb6162dfdb96595c57a2cb639b8ee29465f500d167fa7d4ec340654847d8dd"} Mar 14 07:23:19 crc kubenswrapper[5129]: I0314 07:23:19.007204 5129 generic.go:334] "Generic (PLEG): container finished" podID="b3ca802f-a617-4766-a97b-e8bafe556ce5" containerID="4d62c3f1e4e66d656296bfd613aa4d41a73979018105522b96e8ba63503eb9d8" exitCode=0 Mar 14 07:23:19 crc kubenswrapper[5129]: I0314 07:23:19.007256 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mvf6f" event={"ID":"b3ca802f-a617-4766-a97b-e8bafe556ce5","Type":"ContainerDied","Data":"4d62c3f1e4e66d656296bfd613aa4d41a73979018105522b96e8ba63503eb9d8"} Mar 14 07:23:19 crc kubenswrapper[5129]: I0314 07:23:19.058411 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.058388708 podStartE2EDuration="2.058388708s" podCreationTimestamp="2026-03-14 07:23:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:19.051695319 +0000 UTC m=+1461.803610513" watchObservedRunningTime="2026-03-14 07:23:19.058388708 +0000 UTC m=+1461.810303902" Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.508031 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jmzpz" Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.509333 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.509686 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.515195 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mvf6f" Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.518271 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.519159 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.560217 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.632783 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21298577-e14d-4394-9a86-f9e488d33659-combined-ca-bundle\") pod \"21298577-e14d-4394-9a86-f9e488d33659\" (UID: \"21298577-e14d-4394-9a86-f9e488d33659\") " Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.632872 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bd5m\" (UniqueName: \"kubernetes.io/projected/b3ca802f-a617-4766-a97b-e8bafe556ce5-kube-api-access-6bd5m\") pod \"b3ca802f-a617-4766-a97b-e8bafe556ce5\" (UID: \"b3ca802f-a617-4766-a97b-e8bafe556ce5\") " Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.632912 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/b3ca802f-a617-4766-a97b-e8bafe556ce5-config-data\") pod \"b3ca802f-a617-4766-a97b-e8bafe556ce5\" (UID: \"b3ca802f-a617-4766-a97b-e8bafe556ce5\") " Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.632927 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ca802f-a617-4766-a97b-e8bafe556ce5-combined-ca-bundle\") pod \"b3ca802f-a617-4766-a97b-e8bafe556ce5\" (UID: \"b3ca802f-a617-4766-a97b-e8bafe556ce5\") " Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.632951 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrb8x\" (UniqueName: \"kubernetes.io/projected/21298577-e14d-4394-9a86-f9e488d33659-kube-api-access-qrb8x\") pod \"21298577-e14d-4394-9a86-f9e488d33659\" (UID: \"21298577-e14d-4394-9a86-f9e488d33659\") " Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.633001 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21298577-e14d-4394-9a86-f9e488d33659-scripts\") pod \"21298577-e14d-4394-9a86-f9e488d33659\" (UID: \"21298577-e14d-4394-9a86-f9e488d33659\") " Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.633043 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ca802f-a617-4766-a97b-e8bafe556ce5-scripts\") pod \"b3ca802f-a617-4766-a97b-e8bafe556ce5\" (UID: \"b3ca802f-a617-4766-a97b-e8bafe556ce5\") " Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.633182 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21298577-e14d-4394-9a86-f9e488d33659-config-data\") pod \"21298577-e14d-4394-9a86-f9e488d33659\" (UID: \"21298577-e14d-4394-9a86-f9e488d33659\") " Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.638623 5129 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3ca802f-a617-4766-a97b-e8bafe556ce5-kube-api-access-6bd5m" (OuterVolumeSpecName: "kube-api-access-6bd5m") pod "b3ca802f-a617-4766-a97b-e8bafe556ce5" (UID: "b3ca802f-a617-4766-a97b-e8bafe556ce5"). InnerVolumeSpecName "kube-api-access-6bd5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.640337 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21298577-e14d-4394-9a86-f9e488d33659-kube-api-access-qrb8x" (OuterVolumeSpecName: "kube-api-access-qrb8x") pod "21298577-e14d-4394-9a86-f9e488d33659" (UID: "21298577-e14d-4394-9a86-f9e488d33659"). InnerVolumeSpecName "kube-api-access-qrb8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.641437 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ca802f-a617-4766-a97b-e8bafe556ce5-scripts" (OuterVolumeSpecName: "scripts") pod "b3ca802f-a617-4766-a97b-e8bafe556ce5" (UID: "b3ca802f-a617-4766-a97b-e8bafe556ce5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.646863 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21298577-e14d-4394-9a86-f9e488d33659-scripts" (OuterVolumeSpecName: "scripts") pod "21298577-e14d-4394-9a86-f9e488d33659" (UID: "21298577-e14d-4394-9a86-f9e488d33659"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.669479 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ca802f-a617-4766-a97b-e8bafe556ce5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3ca802f-a617-4766-a97b-e8bafe556ce5" (UID: "b3ca802f-a617-4766-a97b-e8bafe556ce5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.669949 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21298577-e14d-4394-9a86-f9e488d33659-config-data" (OuterVolumeSpecName: "config-data") pod "21298577-e14d-4394-9a86-f9e488d33659" (UID: "21298577-e14d-4394-9a86-f9e488d33659"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.671274 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ca802f-a617-4766-a97b-e8bafe556ce5-config-data" (OuterVolumeSpecName: "config-data") pod "b3ca802f-a617-4766-a97b-e8bafe556ce5" (UID: "b3ca802f-a617-4766-a97b-e8bafe556ce5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.672964 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21298577-e14d-4394-9a86-f9e488d33659-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21298577-e14d-4394-9a86-f9e488d33659" (UID: "21298577-e14d-4394-9a86-f9e488d33659"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.735651 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bd5m\" (UniqueName: \"kubernetes.io/projected/b3ca802f-a617-4766-a97b-e8bafe556ce5-kube-api-access-6bd5m\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.735715 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ca802f-a617-4766-a97b-e8bafe556ce5-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.735738 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ca802f-a617-4766-a97b-e8bafe556ce5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.735756 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrb8x\" (UniqueName: \"kubernetes.io/projected/21298577-e14d-4394-9a86-f9e488d33659-kube-api-access-qrb8x\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.735772 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21298577-e14d-4394-9a86-f9e488d33659-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.735788 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ca802f-a617-4766-a97b-e8bafe556ce5-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.735804 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21298577-e14d-4394-9a86-f9e488d33659-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.735820 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21298577-e14d-4394-9a86-f9e488d33659-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.775094 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.796823 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69b4446475-ttqvv"
Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.883085 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-vh8bg"]
Mar 14 07:23:20 crc kubenswrapper[5129]: I0314 07:23:20.883438 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" podUID="21f24524-573b-4948-b866-2dc0828a860f" containerName="dnsmasq-dns" containerID="cri-o://cc33061d7eea06e4d1027500b33204f6b46c9cab6e12536bff0261c3fc7da2f0" gracePeriod=10
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.031027 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jmzpz"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.031015 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jmzpz" event={"ID":"21298577-e14d-4394-9a86-f9e488d33659","Type":"ContainerDied","Data":"f1d9c9b1765f4ad1158efcdd82e9fd114d9e0f1783d29187b9a794377465ae2a"}
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.031671 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1d9c9b1765f4ad1158efcdd82e9fd114d9e0f1783d29187b9a794377465ae2a"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.039749 5129 generic.go:334] "Generic (PLEG): container finished" podID="21f24524-573b-4948-b866-2dc0828a860f" containerID="cc33061d7eea06e4d1027500b33204f6b46c9cab6e12536bff0261c3fc7da2f0" exitCode=0
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.039856 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" event={"ID":"21f24524-573b-4948-b866-2dc0828a860f","Type":"ContainerDied","Data":"cc33061d7eea06e4d1027500b33204f6b46c9cab6e12536bff0261c3fc7da2f0"}
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.043883 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mvf6f"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.043887 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mvf6f" event={"ID":"b3ca802f-a617-4766-a97b-e8bafe556ce5","Type":"ContainerDied","Data":"3d81b2604eedbac7b0946b70c9fb85dbedb407442347d0eab0f397262bc52f1d"}
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.044019 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d81b2604eedbac7b0946b70c9fb85dbedb407442347d0eab0f397262bc52f1d"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.083310 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.138699 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 14 07:23:21 crc kubenswrapper[5129]: E0314 07:23:21.139265 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21298577-e14d-4394-9a86-f9e488d33659" containerName="nova-cell1-conductor-db-sync"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.139283 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="21298577-e14d-4394-9a86-f9e488d33659" containerName="nova-cell1-conductor-db-sync"
Mar 14 07:23:21 crc kubenswrapper[5129]: E0314 07:23:21.139297 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ca802f-a617-4766-a97b-e8bafe556ce5" containerName="nova-manage"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.139304 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ca802f-a617-4766-a97b-e8bafe556ce5" containerName="nova-manage"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.139520 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="21298577-e14d-4394-9a86-f9e488d33659" containerName="nova-cell1-conductor-db-sync"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.139534 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ca802f-a617-4766-a97b-e8bafe556ce5" containerName="nova-manage"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.140281 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.145025 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.173977 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.247041 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvwsc\" (UniqueName: \"kubernetes.io/projected/df407ca4-4a5d-404c-ab22-89bcde2439c4-kube-api-access-hvwsc\") pod \"nova-cell1-conductor-0\" (UID: \"df407ca4-4a5d-404c-ab22-89bcde2439c4\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.247086 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"df407ca4-4a5d-404c-ab22-89bcde2439c4\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.247163 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"df407ca4-4a5d-404c-ab22-89bcde2439c4\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.255519 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.278702 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.279043 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="93ede4e3-f6c3-4d3b-becd-c5493390e15b" containerName="nova-metadata-log" containerID="cri-o://3feb6162dfdb96595c57a2cb639b8ee29465f500d167fa7d4ec340654847d8dd" gracePeriod=30
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.279236 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="93ede4e3-f6c3-4d3b-becd-c5493390e15b" containerName="nova-metadata-metadata" containerID="cri-o://04f6ec1266c9b39336b806d1751efa5f4b0fb7b9eede3ccc0e677bccd550381c" gracePeriod=30
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.349799 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvwsc\" (UniqueName: \"kubernetes.io/projected/df407ca4-4a5d-404c-ab22-89bcde2439c4-kube-api-access-hvwsc\") pod \"nova-cell1-conductor-0\" (UID: \"df407ca4-4a5d-404c-ab22-89bcde2439c4\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.349859 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"df407ca4-4a5d-404c-ab22-89bcde2439c4\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.349969 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"df407ca4-4a5d-404c-ab22-89bcde2439c4\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.355488 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"df407ca4-4a5d-404c-ab22-89bcde2439c4\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.363321 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"df407ca4-4a5d-404c-ab22-89bcde2439c4\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.375836 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvwsc\" (UniqueName: \"kubernetes.io/projected/df407ca4-4a5d-404c-ab22-89bcde2439c4-kube-api-access-hvwsc\") pod \"nova-cell1-conductor-0\" (UID: \"df407ca4-4a5d-404c-ab22-89bcde2439c4\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.452941 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.457820 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.551911 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-dns-svc\") pod \"21f24524-573b-4948-b866-2dc0828a860f\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") "
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.552353 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-dns-swift-storage-0\") pod \"21f24524-573b-4948-b866-2dc0828a860f\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") "
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.552458 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-config\") pod \"21f24524-573b-4948-b866-2dc0828a860f\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") "
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.552506 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-ovsdbserver-sb\") pod \"21f24524-573b-4948-b866-2dc0828a860f\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") "
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.552553 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6xp4\" (UniqueName: \"kubernetes.io/projected/21f24524-573b-4948-b866-2dc0828a860f-kube-api-access-p6xp4\") pod \"21f24524-573b-4948-b866-2dc0828a860f\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") "
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.552574 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-ovsdbserver-nb\") pod \"21f24524-573b-4948-b866-2dc0828a860f\" (UID: \"21f24524-573b-4948-b866-2dc0828a860f\") "
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.580163 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21f24524-573b-4948-b866-2dc0828a860f-kube-api-access-p6xp4" (OuterVolumeSpecName: "kube-api-access-p6xp4") pod "21f24524-573b-4948-b866-2dc0828a860f" (UID: "21f24524-573b-4948-b866-2dc0828a860f"). InnerVolumeSpecName "kube-api-access-p6xp4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.590839 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="233658de-bdd1-4846-bfbd-69142d762c00" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.592097 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="233658de-bdd1-4846-bfbd-69142d762c00" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.617243 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "21f24524-573b-4948-b866-2dc0828a860f" (UID: "21f24524-573b-4948-b866-2dc0828a860f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.629958 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-config" (OuterVolumeSpecName: "config") pod "21f24524-573b-4948-b866-2dc0828a860f" (UID: "21f24524-573b-4948-b866-2dc0828a860f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.649056 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "21f24524-573b-4948-b866-2dc0828a860f" (UID: "21f24524-573b-4948-b866-2dc0828a860f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.651005 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "21f24524-573b-4948-b866-2dc0828a860f" (UID: "21f24524-573b-4948-b866-2dc0828a860f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.655592 5129 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.656752 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-config\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.656779 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.656804 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6xp4\" (UniqueName: \"kubernetes.io/projected/21f24524-573b-4948-b866-2dc0828a860f-kube-api-access-p6xp4\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.656818 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.656936 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "21f24524-573b-4948-b866-2dc0828a860f" (UID: "21f24524-573b-4948-b866-2dc0828a860f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.677565 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.758756 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21f24524-573b-4948-b866-2dc0828a860f-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.848335 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.961302 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ede4e3-f6c3-4d3b-becd-c5493390e15b-combined-ca-bundle\") pod \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") "
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.961538 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvbjw\" (UniqueName: \"kubernetes.io/projected/93ede4e3-f6c3-4d3b-becd-c5493390e15b-kube-api-access-dvbjw\") pod \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") "
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.961564 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93ede4e3-f6c3-4d3b-becd-c5493390e15b-logs\") pod \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") "
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.961617 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93ede4e3-f6c3-4d3b-becd-c5493390e15b-config-data\") pod \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") "
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.961665 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ede4e3-f6c3-4d3b-becd-c5493390e15b-nova-metadata-tls-certs\") pod \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\" (UID: \"93ede4e3-f6c3-4d3b-becd-c5493390e15b\") "
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.962127 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ede4e3-f6c3-4d3b-becd-c5493390e15b-logs" (OuterVolumeSpecName: "logs") pod "93ede4e3-f6c3-4d3b-becd-c5493390e15b" (UID: "93ede4e3-f6c3-4d3b-becd-c5493390e15b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.968070 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93ede4e3-f6c3-4d3b-becd-c5493390e15b-kube-api-access-dvbjw" (OuterVolumeSpecName: "kube-api-access-dvbjw") pod "93ede4e3-f6c3-4d3b-becd-c5493390e15b" (UID: "93ede4e3-f6c3-4d3b-becd-c5493390e15b"). InnerVolumeSpecName "kube-api-access-dvbjw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:23:21 crc kubenswrapper[5129]: I0314 07:23:21.991545 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93ede4e3-f6c3-4d3b-becd-c5493390e15b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93ede4e3-f6c3-4d3b-becd-c5493390e15b" (UID: "93ede4e3-f6c3-4d3b-becd-c5493390e15b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.015268 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 14 07:23:22 crc kubenswrapper[5129]: W0314 07:23:22.016353 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf407ca4_4a5d_404c_ab22_89bcde2439c4.slice/crio-4d2e504d56287a429d6548e6ffbfede164dc4e28142a036fe5d350c48f4a09b6 WatchSource:0}: Error finding container 4d2e504d56287a429d6548e6ffbfede164dc4e28142a036fe5d350c48f4a09b6: Status 404 returned error can't find the container with id 4d2e504d56287a429d6548e6ffbfede164dc4e28142a036fe5d350c48f4a09b6
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.022474 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93ede4e3-f6c3-4d3b-becd-c5493390e15b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "93ede4e3-f6c3-4d3b-becd-c5493390e15b" (UID: "93ede4e3-f6c3-4d3b-becd-c5493390e15b"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.021656 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93ede4e3-f6c3-4d3b-becd-c5493390e15b-config-data" (OuterVolumeSpecName: "config-data") pod "93ede4e3-f6c3-4d3b-becd-c5493390e15b" (UID: "93ede4e3-f6c3-4d3b-becd-c5493390e15b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.052892 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg" event={"ID":"21f24524-573b-4948-b866-2dc0828a860f","Type":"ContainerDied","Data":"f1e594c26072910157d861498a6a38e073ecc3ba4a314306dd150fe8eb6fbd14"}
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.052941 5129 scope.go:117] "RemoveContainer" containerID="cc33061d7eea06e4d1027500b33204f6b46c9cab6e12536bff0261c3fc7da2f0"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.053339 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-vh8bg"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.056010 5129 generic.go:334] "Generic (PLEG): container finished" podID="93ede4e3-f6c3-4d3b-becd-c5493390e15b" containerID="04f6ec1266c9b39336b806d1751efa5f4b0fb7b9eede3ccc0e677bccd550381c" exitCode=0
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.056045 5129 generic.go:334] "Generic (PLEG): container finished" podID="93ede4e3-f6c3-4d3b-becd-c5493390e15b" containerID="3feb6162dfdb96595c57a2cb639b8ee29465f500d167fa7d4ec340654847d8dd" exitCode=143
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.056104 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93ede4e3-f6c3-4d3b-becd-c5493390e15b","Type":"ContainerDied","Data":"04f6ec1266c9b39336b806d1751efa5f4b0fb7b9eede3ccc0e677bccd550381c"}
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.056114 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.056126 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93ede4e3-f6c3-4d3b-becd-c5493390e15b","Type":"ContainerDied","Data":"3feb6162dfdb96595c57a2cb639b8ee29465f500d167fa7d4ec340654847d8dd"}
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.056228 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93ede4e3-f6c3-4d3b-becd-c5493390e15b","Type":"ContainerDied","Data":"c8322e68ec30620e42a5e04f3b244c872f677f063d921955a5dce9de015d56d5"}
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.057004 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="233658de-bdd1-4846-bfbd-69142d762c00" containerName="nova-api-log" containerID="cri-o://2f2335b7d933152ee16b84997130fe5f7b51caf3a94ec4b5ab5113489e415c4c" gracePeriod=30
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.057253 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"df407ca4-4a5d-404c-ab22-89bcde2439c4","Type":"ContainerStarted","Data":"4d2e504d56287a429d6548e6ffbfede164dc4e28142a036fe5d350c48f4a09b6"}
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.057942 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="233658de-bdd1-4846-bfbd-69142d762c00" containerName="nova-api-api" containerID="cri-o://69b55afa4ee1f3efacdc9809a48cc52fe70ff527760dd98f532e08fec2c12140" gracePeriod=30
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.064406 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvbjw\" (UniqueName: \"kubernetes.io/projected/93ede4e3-f6c3-4d3b-becd-c5493390e15b-kube-api-access-dvbjw\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.064438 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93ede4e3-f6c3-4d3b-becd-c5493390e15b-logs\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.064451 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93ede4e3-f6c3-4d3b-becd-c5493390e15b-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.064464 5129 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ede4e3-f6c3-4d3b-becd-c5493390e15b-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.064475 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ede4e3-f6c3-4d3b-becd-c5493390e15b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.095092 5129 scope.go:117] "RemoveContainer" containerID="86b1d76f2f88d2f87fa1167fa9ed1d09d2e2515fe9194103ba75fd503968c821"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.135516 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-vh8bg"]
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.156691 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-vh8bg"]
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.169644 5129 scope.go:117] "RemoveContainer" containerID="04f6ec1266c9b39336b806d1751efa5f4b0fb7b9eede3ccc0e677bccd550381c"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.186758 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.202412 5129 scope.go:117] "RemoveContainer" containerID="3feb6162dfdb96595c57a2cb639b8ee29465f500d167fa7d4ec340654847d8dd"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.223927 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.236522 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.237465 5129 scope.go:117] "RemoveContainer" containerID="04f6ec1266c9b39336b806d1751efa5f4b0fb7b9eede3ccc0e677bccd550381c"
Mar 14 07:23:22 crc kubenswrapper[5129]: E0314 07:23:22.237671 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ede4e3-f6c3-4d3b-becd-c5493390e15b" containerName="nova-metadata-metadata"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.237694 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ede4e3-f6c3-4d3b-becd-c5493390e15b" containerName="nova-metadata-metadata"
Mar 14 07:23:22 crc kubenswrapper[5129]: E0314 07:23:22.237724 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f24524-573b-4948-b866-2dc0828a860f" containerName="init"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.237731 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f24524-573b-4948-b866-2dc0828a860f" containerName="init"
Mar 14 07:23:22 crc kubenswrapper[5129]: E0314 07:23:22.237742 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ede4e3-f6c3-4d3b-becd-c5493390e15b" containerName="nova-metadata-log"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.237748 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ede4e3-f6c3-4d3b-becd-c5493390e15b" containerName="nova-metadata-log"
Mar 14 07:23:22 crc kubenswrapper[5129]: E0314 07:23:22.237768 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f24524-573b-4948-b866-2dc0828a860f" containerName="dnsmasq-dns"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.237773 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f24524-573b-4948-b866-2dc0828a860f" containerName="dnsmasq-dns"
Mar 14 07:23:22 crc kubenswrapper[5129]: E0314 07:23:22.238208 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04f6ec1266c9b39336b806d1751efa5f4b0fb7b9eede3ccc0e677bccd550381c\": container with ID starting with 04f6ec1266c9b39336b806d1751efa5f4b0fb7b9eede3ccc0e677bccd550381c not found: ID does not exist" containerID="04f6ec1266c9b39336b806d1751efa5f4b0fb7b9eede3ccc0e677bccd550381c"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.238249 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f6ec1266c9b39336b806d1751efa5f4b0fb7b9eede3ccc0e677bccd550381c"} err="failed to get container status \"04f6ec1266c9b39336b806d1751efa5f4b0fb7b9eede3ccc0e677bccd550381c\": rpc error: code = NotFound desc = could not find container \"04f6ec1266c9b39336b806d1751efa5f4b0fb7b9eede3ccc0e677bccd550381c\": container with ID starting with 04f6ec1266c9b39336b806d1751efa5f4b0fb7b9eede3ccc0e677bccd550381c not found: ID does not exist"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.238270 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="93ede4e3-f6c3-4d3b-becd-c5493390e15b" containerName="nova-metadata-metadata"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.238293 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f24524-573b-4948-b866-2dc0828a860f" containerName="dnsmasq-dns"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.238307 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="93ede4e3-f6c3-4d3b-becd-c5493390e15b" containerName="nova-metadata-log"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.238277 5129 scope.go:117] "RemoveContainer" containerID="3feb6162dfdb96595c57a2cb639b8ee29465f500d167fa7d4ec340654847d8dd"
Mar 14 07:23:22 crc kubenswrapper[5129]: E0314 07:23:22.239417 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3feb6162dfdb96595c57a2cb639b8ee29465f500d167fa7d4ec340654847d8dd\": container with ID starting with 3feb6162dfdb96595c57a2cb639b8ee29465f500d167fa7d4ec340654847d8dd not found: ID does not exist" containerID="3feb6162dfdb96595c57a2cb639b8ee29465f500d167fa7d4ec340654847d8dd"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.239438 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3feb6162dfdb96595c57a2cb639b8ee29465f500d167fa7d4ec340654847d8dd"} err="failed to get container status \"3feb6162dfdb96595c57a2cb639b8ee29465f500d167fa7d4ec340654847d8dd\": rpc error: code = NotFound desc = could not find container \"3feb6162dfdb96595c57a2cb639b8ee29465f500d167fa7d4ec340654847d8dd\": container with ID starting with 3feb6162dfdb96595c57a2cb639b8ee29465f500d167fa7d4ec340654847d8dd not found: ID does not exist"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.239457 5129 scope.go:117] "RemoveContainer" containerID="04f6ec1266c9b39336b806d1751efa5f4b0fb7b9eede3ccc0e677bccd550381c"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.240510 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.240516 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f6ec1266c9b39336b806d1751efa5f4b0fb7b9eede3ccc0e677bccd550381c"} err="failed to get container status \"04f6ec1266c9b39336b806d1751efa5f4b0fb7b9eede3ccc0e677bccd550381c\": rpc error: code = NotFound desc = could not find container \"04f6ec1266c9b39336b806d1751efa5f4b0fb7b9eede3ccc0e677bccd550381c\": container with ID starting with 04f6ec1266c9b39336b806d1751efa5f4b0fb7b9eede3ccc0e677bccd550381c not found: ID does not exist"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.240536 5129 scope.go:117] "RemoveContainer" containerID="3feb6162dfdb96595c57a2cb639b8ee29465f500d167fa7d4ec340654847d8dd"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.241190 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3feb6162dfdb96595c57a2cb639b8ee29465f500d167fa7d4ec340654847d8dd"} err="failed to get container status \"3feb6162dfdb96595c57a2cb639b8ee29465f500d167fa7d4ec340654847d8dd\": rpc error: code = NotFound desc = could not find container \"3feb6162dfdb96595c57a2cb639b8ee29465f500d167fa7d4ec340654847d8dd\": container with ID starting with 3feb6162dfdb96595c57a2cb639b8ee29465f500d167fa7d4ec340654847d8dd not found: ID does not exist"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.243152 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.244455 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.247162 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.371705 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.371995 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.372123 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr5pn\" (UniqueName: \"kubernetes.io/projected/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-kube-api-access-zr5pn\") pod \"nova-metadata-0\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.372249 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-logs\") pod \"nova-metadata-0\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.372385 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-config-data\") pod \"nova-metadata-0\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " pod="openstack/nova-metadata-0"
Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.474745 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " pod="openstack/nova-metadata-0" Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.475461 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr5pn\" (UniqueName: \"kubernetes.io/projected/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-kube-api-access-zr5pn\") pod \"nova-metadata-0\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " pod="openstack/nova-metadata-0" Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.475672 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-logs\") pod \"nova-metadata-0\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " pod="openstack/nova-metadata-0" Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.475838 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-config-data\") pod \"nova-metadata-0\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " pod="openstack/nova-metadata-0" Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.476004 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " pod="openstack/nova-metadata-0" Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.476109 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-logs\") pod \"nova-metadata-0\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " 
pod="openstack/nova-metadata-0" Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.479423 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " pod="openstack/nova-metadata-0" Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.480185 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-config-data\") pod \"nova-metadata-0\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " pod="openstack/nova-metadata-0" Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.480693 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " pod="openstack/nova-metadata-0" Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.493817 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr5pn\" (UniqueName: \"kubernetes.io/projected/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-kube-api-access-zr5pn\") pod \"nova-metadata-0\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " pod="openstack/nova-metadata-0" Mar 14 07:23:22 crc kubenswrapper[5129]: I0314 07:23:22.558957 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:23:23 crc kubenswrapper[5129]: I0314 07:23:23.058867 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:23:23 crc kubenswrapper[5129]: I0314 07:23:23.066078 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"df407ca4-4a5d-404c-ab22-89bcde2439c4","Type":"ContainerStarted","Data":"d5f37f6fbaa362787227b822e969f3f95cfd2287989a1655515b437116197b4e"} Mar 14 07:23:23 crc kubenswrapper[5129]: I0314 07:23:23.066210 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 14 07:23:23 crc kubenswrapper[5129]: I0314 07:23:23.086800 5129 generic.go:334] "Generic (PLEG): container finished" podID="233658de-bdd1-4846-bfbd-69142d762c00" containerID="2f2335b7d933152ee16b84997130fe5f7b51caf3a94ec4b5ab5113489e415c4c" exitCode=143 Mar 14 07:23:23 crc kubenswrapper[5129]: I0314 07:23:23.087019 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="743861a7-8a16-4a62-8339-1a02ec991d70" containerName="nova-scheduler-scheduler" containerID="cri-o://468a284437ca9315b5c15a3b465d012ec3c4b805e5ca596cf6de9cf9fbcadd1c" gracePeriod=30 Mar 14 07:23:23 crc kubenswrapper[5129]: I0314 07:23:23.087084 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"233658de-bdd1-4846-bfbd-69142d762c00","Type":"ContainerDied","Data":"2f2335b7d933152ee16b84997130fe5f7b51caf3a94ec4b5ab5113489e415c4c"} Mar 14 07:23:23 crc kubenswrapper[5129]: W0314 07:23:23.089549 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3ba50bc_2f3a_4d00_bad8_639b7ad63ebe.slice/crio-bca274f8535dd4f7decf1cac753bc98f2ff2b0127f475f113b8af2eafad77788 WatchSource:0}: Error finding container 
bca274f8535dd4f7decf1cac753bc98f2ff2b0127f475f113b8af2eafad77788: Status 404 returned error can't find the container with id bca274f8535dd4f7decf1cac753bc98f2ff2b0127f475f113b8af2eafad77788 Mar 14 07:23:23 crc kubenswrapper[5129]: I0314 07:23:23.099911 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.099899153 podStartE2EDuration="2.099899153s" podCreationTimestamp="2026-03-14 07:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:23.094824117 +0000 UTC m=+1465.846739311" watchObservedRunningTime="2026-03-14 07:23:23.099899153 +0000 UTC m=+1465.851814337" Mar 14 07:23:24 crc kubenswrapper[5129]: I0314 07:23:24.057106 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21f24524-573b-4948-b866-2dc0828a860f" path="/var/lib/kubelet/pods/21f24524-573b-4948-b866-2dc0828a860f/volumes" Mar 14 07:23:24 crc kubenswrapper[5129]: I0314 07:23:24.058673 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93ede4e3-f6c3-4d3b-becd-c5493390e15b" path="/var/lib/kubelet/pods/93ede4e3-f6c3-4d3b-becd-c5493390e15b/volumes" Mar 14 07:23:24 crc kubenswrapper[5129]: I0314 07:23:24.100489 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe","Type":"ContainerStarted","Data":"bc868fcdb0b52e9bafc2d322391d1ed044b20ed25023bb0f71afe35f96d08fe9"} Mar 14 07:23:24 crc kubenswrapper[5129]: I0314 07:23:24.100534 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe","Type":"ContainerStarted","Data":"c403a0387349ace2fd626e02416fda92537e0233953e71e39549f7d2643de6a3"} Mar 14 07:23:24 crc kubenswrapper[5129]: I0314 07:23:24.100544 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe","Type":"ContainerStarted","Data":"bca274f8535dd4f7decf1cac753bc98f2ff2b0127f475f113b8af2eafad77788"} Mar 14 07:23:24 crc kubenswrapper[5129]: I0314 07:23:24.130691 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.130662493 podStartE2EDuration="2.130662493s" podCreationTimestamp="2026-03-14 07:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:24.119379671 +0000 UTC m=+1466.871294855" watchObservedRunningTime="2026-03-14 07:23:24.130662493 +0000 UTC m=+1466.882577677" Mar 14 07:23:25 crc kubenswrapper[5129]: E0314 07:23:25.519829 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="468a284437ca9315b5c15a3b465d012ec3c4b805e5ca596cf6de9cf9fbcadd1c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 07:23:25 crc kubenswrapper[5129]: E0314 07:23:25.522384 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="468a284437ca9315b5c15a3b465d012ec3c4b805e5ca596cf6de9cf9fbcadd1c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 07:23:25 crc kubenswrapper[5129]: E0314 07:23:25.524753 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="468a284437ca9315b5c15a3b465d012ec3c4b805e5ca596cf6de9cf9fbcadd1c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 07:23:25 crc kubenswrapper[5129]: E0314 07:23:25.524842 5129 prober.go:104] 
"Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="743861a7-8a16-4a62-8339-1a02ec991d70" containerName="nova-scheduler-scheduler" Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.140944 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.695794 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.762950 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5mz6\" (UniqueName: \"kubernetes.io/projected/743861a7-8a16-4a62-8339-1a02ec991d70-kube-api-access-n5mz6\") pod \"743861a7-8a16-4a62-8339-1a02ec991d70\" (UID: \"743861a7-8a16-4a62-8339-1a02ec991d70\") " Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.763041 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743861a7-8a16-4a62-8339-1a02ec991d70-combined-ca-bundle\") pod \"743861a7-8a16-4a62-8339-1a02ec991d70\" (UID: \"743861a7-8a16-4a62-8339-1a02ec991d70\") " Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.763156 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743861a7-8a16-4a62-8339-1a02ec991d70-config-data\") pod \"743861a7-8a16-4a62-8339-1a02ec991d70\" (UID: \"743861a7-8a16-4a62-8339-1a02ec991d70\") " Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.773425 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/743861a7-8a16-4a62-8339-1a02ec991d70-kube-api-access-n5mz6" (OuterVolumeSpecName: "kube-api-access-n5mz6") pod 
"743861a7-8a16-4a62-8339-1a02ec991d70" (UID: "743861a7-8a16-4a62-8339-1a02ec991d70"). InnerVolumeSpecName "kube-api-access-n5mz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.814488 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/743861a7-8a16-4a62-8339-1a02ec991d70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "743861a7-8a16-4a62-8339-1a02ec991d70" (UID: "743861a7-8a16-4a62-8339-1a02ec991d70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.830750 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/743861a7-8a16-4a62-8339-1a02ec991d70-config-data" (OuterVolumeSpecName: "config-data") pod "743861a7-8a16-4a62-8339-1a02ec991d70" (UID: "743861a7-8a16-4a62-8339-1a02ec991d70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.865740 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743861a7-8a16-4a62-8339-1a02ec991d70-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.865785 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5mz6\" (UniqueName: \"kubernetes.io/projected/743861a7-8a16-4a62-8339-1a02ec991d70-kube-api-access-n5mz6\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.865798 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743861a7-8a16-4a62-8339-1a02ec991d70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.888691 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.966694 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233658de-bdd1-4846-bfbd-69142d762c00-logs\") pod \"233658de-bdd1-4846-bfbd-69142d762c00\" (UID: \"233658de-bdd1-4846-bfbd-69142d762c00\") " Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.966817 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233658de-bdd1-4846-bfbd-69142d762c00-combined-ca-bundle\") pod \"233658de-bdd1-4846-bfbd-69142d762c00\" (UID: \"233658de-bdd1-4846-bfbd-69142d762c00\") " Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.966896 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrngp\" (UniqueName: \"kubernetes.io/projected/233658de-bdd1-4846-bfbd-69142d762c00-kube-api-access-zrngp\") pod \"233658de-bdd1-4846-bfbd-69142d762c00\" (UID: \"233658de-bdd1-4846-bfbd-69142d762c00\") " Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.967022 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233658de-bdd1-4846-bfbd-69142d762c00-config-data\") pod \"233658de-bdd1-4846-bfbd-69142d762c00\" (UID: \"233658de-bdd1-4846-bfbd-69142d762c00\") " Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.967262 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/233658de-bdd1-4846-bfbd-69142d762c00-logs" (OuterVolumeSpecName: "logs") pod "233658de-bdd1-4846-bfbd-69142d762c00" (UID: "233658de-bdd1-4846-bfbd-69142d762c00"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.967872 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233658de-bdd1-4846-bfbd-69142d762c00-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.970142 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/233658de-bdd1-4846-bfbd-69142d762c00-kube-api-access-zrngp" (OuterVolumeSpecName: "kube-api-access-zrngp") pod "233658de-bdd1-4846-bfbd-69142d762c00" (UID: "233658de-bdd1-4846-bfbd-69142d762c00"). InnerVolumeSpecName "kube-api-access-zrngp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.988819 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233658de-bdd1-4846-bfbd-69142d762c00-config-data" (OuterVolumeSpecName: "config-data") pod "233658de-bdd1-4846-bfbd-69142d762c00" (UID: "233658de-bdd1-4846-bfbd-69142d762c00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:26 crc kubenswrapper[5129]: I0314 07:23:26.996220 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233658de-bdd1-4846-bfbd-69142d762c00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "233658de-bdd1-4846-bfbd-69142d762c00" (UID: "233658de-bdd1-4846-bfbd-69142d762c00"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.070040 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233658de-bdd1-4846-bfbd-69142d762c00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.070073 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrngp\" (UniqueName: \"kubernetes.io/projected/233658de-bdd1-4846-bfbd-69142d762c00-kube-api-access-zrngp\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.070085 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233658de-bdd1-4846-bfbd-69142d762c00-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.144179 5129 generic.go:334] "Generic (PLEG): container finished" podID="743861a7-8a16-4a62-8339-1a02ec991d70" containerID="468a284437ca9315b5c15a3b465d012ec3c4b805e5ca596cf6de9cf9fbcadd1c" exitCode=0 Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.144210 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.144231 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"743861a7-8a16-4a62-8339-1a02ec991d70","Type":"ContainerDied","Data":"468a284437ca9315b5c15a3b465d012ec3c4b805e5ca596cf6de9cf9fbcadd1c"} Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.144650 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"743861a7-8a16-4a62-8339-1a02ec991d70","Type":"ContainerDied","Data":"79d8fdbb4a624cddd7d801f571b6bfc39abcc502782a953cc54dec02745f9ad7"} Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.144667 5129 scope.go:117] "RemoveContainer" containerID="468a284437ca9315b5c15a3b465d012ec3c4b805e5ca596cf6de9cf9fbcadd1c" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.147164 5129 generic.go:334] "Generic (PLEG): container finished" podID="233658de-bdd1-4846-bfbd-69142d762c00" containerID="69b55afa4ee1f3efacdc9809a48cc52fe70ff527760dd98f532e08fec2c12140" exitCode=0 Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.147189 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"233658de-bdd1-4846-bfbd-69142d762c00","Type":"ContainerDied","Data":"69b55afa4ee1f3efacdc9809a48cc52fe70ff527760dd98f532e08fec2c12140"} Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.147203 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"233658de-bdd1-4846-bfbd-69142d762c00","Type":"ContainerDied","Data":"4ac77b7705811ff389a06246fa65be810d5010b7f4a0272275a4a9478cfae82a"} Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.147436 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.180911 5129 scope.go:117] "RemoveContainer" containerID="468a284437ca9315b5c15a3b465d012ec3c4b805e5ca596cf6de9cf9fbcadd1c" Mar 14 07:23:27 crc kubenswrapper[5129]: E0314 07:23:27.181445 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"468a284437ca9315b5c15a3b465d012ec3c4b805e5ca596cf6de9cf9fbcadd1c\": container with ID starting with 468a284437ca9315b5c15a3b465d012ec3c4b805e5ca596cf6de9cf9fbcadd1c not found: ID does not exist" containerID="468a284437ca9315b5c15a3b465d012ec3c4b805e5ca596cf6de9cf9fbcadd1c" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.181484 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"468a284437ca9315b5c15a3b465d012ec3c4b805e5ca596cf6de9cf9fbcadd1c"} err="failed to get container status \"468a284437ca9315b5c15a3b465d012ec3c4b805e5ca596cf6de9cf9fbcadd1c\": rpc error: code = NotFound desc = could not find container \"468a284437ca9315b5c15a3b465d012ec3c4b805e5ca596cf6de9cf9fbcadd1c\": container with ID starting with 468a284437ca9315b5c15a3b465d012ec3c4b805e5ca596cf6de9cf9fbcadd1c not found: ID does not exist" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.181510 5129 scope.go:117] "RemoveContainer" containerID="69b55afa4ee1f3efacdc9809a48cc52fe70ff527760dd98f532e08fec2c12140" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.210066 5129 scope.go:117] "RemoveContainer" containerID="2f2335b7d933152ee16b84997130fe5f7b51caf3a94ec4b5ab5113489e415c4c" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.215075 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.230984 5129 scope.go:117] "RemoveContainer" containerID="69b55afa4ee1f3efacdc9809a48cc52fe70ff527760dd98f532e08fec2c12140" Mar 14 07:23:27 crc 
kubenswrapper[5129]: E0314 07:23:27.233081 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69b55afa4ee1f3efacdc9809a48cc52fe70ff527760dd98f532e08fec2c12140\": container with ID starting with 69b55afa4ee1f3efacdc9809a48cc52fe70ff527760dd98f532e08fec2c12140 not found: ID does not exist" containerID="69b55afa4ee1f3efacdc9809a48cc52fe70ff527760dd98f532e08fec2c12140" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.233118 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69b55afa4ee1f3efacdc9809a48cc52fe70ff527760dd98f532e08fec2c12140"} err="failed to get container status \"69b55afa4ee1f3efacdc9809a48cc52fe70ff527760dd98f532e08fec2c12140\": rpc error: code = NotFound desc = could not find container \"69b55afa4ee1f3efacdc9809a48cc52fe70ff527760dd98f532e08fec2c12140\": container with ID starting with 69b55afa4ee1f3efacdc9809a48cc52fe70ff527760dd98f532e08fec2c12140 not found: ID does not exist" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.233138 5129 scope.go:117] "RemoveContainer" containerID="2f2335b7d933152ee16b84997130fe5f7b51caf3a94ec4b5ab5113489e415c4c" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.233928 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:27 crc kubenswrapper[5129]: E0314 07:23:27.238091 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f2335b7d933152ee16b84997130fe5f7b51caf3a94ec4b5ab5113489e415c4c\": container with ID starting with 2f2335b7d933152ee16b84997130fe5f7b51caf3a94ec4b5ab5113489e415c4c not found: ID does not exist" containerID="2f2335b7d933152ee16b84997130fe5f7b51caf3a94ec4b5ab5113489e415c4c" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.238169 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2f2335b7d933152ee16b84997130fe5f7b51caf3a94ec4b5ab5113489e415c4c"} err="failed to get container status \"2f2335b7d933152ee16b84997130fe5f7b51caf3a94ec4b5ab5113489e415c4c\": rpc error: code = NotFound desc = could not find container \"2f2335b7d933152ee16b84997130fe5f7b51caf3a94ec4b5ab5113489e415c4c\": container with ID starting with 2f2335b7d933152ee16b84997130fe5f7b51caf3a94ec4b5ab5113489e415c4c not found: ID does not exist" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.247030 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.267072 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.270514 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:27 crc kubenswrapper[5129]: E0314 07:23:27.271005 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233658de-bdd1-4846-bfbd-69142d762c00" containerName="nova-api-log" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.271030 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="233658de-bdd1-4846-bfbd-69142d762c00" containerName="nova-api-log" Mar 14 07:23:27 crc kubenswrapper[5129]: E0314 07:23:27.271065 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233658de-bdd1-4846-bfbd-69142d762c00" containerName="nova-api-api" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.271074 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="233658de-bdd1-4846-bfbd-69142d762c00" containerName="nova-api-api" Mar 14 07:23:27 crc kubenswrapper[5129]: E0314 07:23:27.271094 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743861a7-8a16-4a62-8339-1a02ec991d70" containerName="nova-scheduler-scheduler" Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.271103 5129 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="743861a7-8a16-4a62-8339-1a02ec991d70" containerName="nova-scheduler-scheduler"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.271326 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="743861a7-8a16-4a62-8339-1a02ec991d70" containerName="nova-scheduler-scheduler"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.271358 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="233658de-bdd1-4846-bfbd-69142d762c00" containerName="nova-api-log"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.271373 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="233658de-bdd1-4846-bfbd-69142d762c00" containerName="nova-api-api"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.272701 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.274697 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.281378 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.283033 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.286897 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.296886 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.304669 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.377288 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdxp6\" (UniqueName: \"kubernetes.io/projected/e1522514-570d-4f82-81f3-15db416cca79-kube-api-access-jdxp6\") pod \"nova-scheduler-0\" (UID: \"e1522514-570d-4f82-81f3-15db416cca79\") " pod="openstack/nova-scheduler-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.377541 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1522514-570d-4f82-81f3-15db416cca79-config-data\") pod \"nova-scheduler-0\" (UID: \"e1522514-570d-4f82-81f3-15db416cca79\") " pod="openstack/nova-scheduler-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.377658 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjtzj\" (UniqueName: \"kubernetes.io/projected/6f01515f-9326-4177-86be-a0c4d37d25c8-kube-api-access-zjtzj\") pod \"nova-api-0\" (UID: \"6f01515f-9326-4177-86be-a0c4d37d25c8\") " pod="openstack/nova-api-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.377782 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1522514-570d-4f82-81f3-15db416cca79-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e1522514-570d-4f82-81f3-15db416cca79\") " pod="openstack/nova-scheduler-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.377900 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f01515f-9326-4177-86be-a0c4d37d25c8-logs\") pod \"nova-api-0\" (UID: \"6f01515f-9326-4177-86be-a0c4d37d25c8\") " pod="openstack/nova-api-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.377980 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f01515f-9326-4177-86be-a0c4d37d25c8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6f01515f-9326-4177-86be-a0c4d37d25c8\") " pod="openstack/nova-api-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.378063 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f01515f-9326-4177-86be-a0c4d37d25c8-config-data\") pod \"nova-api-0\" (UID: \"6f01515f-9326-4177-86be-a0c4d37d25c8\") " pod="openstack/nova-api-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.479472 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1522514-570d-4f82-81f3-15db416cca79-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e1522514-570d-4f82-81f3-15db416cca79\") " pod="openstack/nova-scheduler-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.479560 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f01515f-9326-4177-86be-a0c4d37d25c8-logs\") pod \"nova-api-0\" (UID: \"6f01515f-9326-4177-86be-a0c4d37d25c8\") " pod="openstack/nova-api-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.479587 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f01515f-9326-4177-86be-a0c4d37d25c8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6f01515f-9326-4177-86be-a0c4d37d25c8\") " pod="openstack/nova-api-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.479629 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f01515f-9326-4177-86be-a0c4d37d25c8-config-data\") pod \"nova-api-0\" (UID: \"6f01515f-9326-4177-86be-a0c4d37d25c8\") " pod="openstack/nova-api-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.479668 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdxp6\" (UniqueName: \"kubernetes.io/projected/e1522514-570d-4f82-81f3-15db416cca79-kube-api-access-jdxp6\") pod \"nova-scheduler-0\" (UID: \"e1522514-570d-4f82-81f3-15db416cca79\") " pod="openstack/nova-scheduler-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.479691 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1522514-570d-4f82-81f3-15db416cca79-config-data\") pod \"nova-scheduler-0\" (UID: \"e1522514-570d-4f82-81f3-15db416cca79\") " pod="openstack/nova-scheduler-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.479706 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjtzj\" (UniqueName: \"kubernetes.io/projected/6f01515f-9326-4177-86be-a0c4d37d25c8-kube-api-access-zjtzj\") pod \"nova-api-0\" (UID: \"6f01515f-9326-4177-86be-a0c4d37d25c8\") " pod="openstack/nova-api-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.480450 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f01515f-9326-4177-86be-a0c4d37d25c8-logs\") pod \"nova-api-0\" (UID: \"6f01515f-9326-4177-86be-a0c4d37d25c8\") " pod="openstack/nova-api-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.483164 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1522514-570d-4f82-81f3-15db416cca79-config-data\") pod \"nova-scheduler-0\" (UID: \"e1522514-570d-4f82-81f3-15db416cca79\") " pod="openstack/nova-scheduler-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.483215 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f01515f-9326-4177-86be-a0c4d37d25c8-config-data\") pod \"nova-api-0\" (UID: \"6f01515f-9326-4177-86be-a0c4d37d25c8\") " pod="openstack/nova-api-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.484259 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1522514-570d-4f82-81f3-15db416cca79-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e1522514-570d-4f82-81f3-15db416cca79\") " pod="openstack/nova-scheduler-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.484442 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f01515f-9326-4177-86be-a0c4d37d25c8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6f01515f-9326-4177-86be-a0c4d37d25c8\") " pod="openstack/nova-api-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.499245 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdxp6\" (UniqueName: \"kubernetes.io/projected/e1522514-570d-4f82-81f3-15db416cca79-kube-api-access-jdxp6\") pod \"nova-scheduler-0\" (UID: \"e1522514-570d-4f82-81f3-15db416cca79\") " pod="openstack/nova-scheduler-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.501522 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjtzj\" (UniqueName: \"kubernetes.io/projected/6f01515f-9326-4177-86be-a0c4d37d25c8-kube-api-access-zjtzj\") pod \"nova-api-0\" (UID: \"6f01515f-9326-4177-86be-a0c4d37d25c8\") " pod="openstack/nova-api-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.591510 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 14 07:23:27 crc kubenswrapper[5129]: I0314 07:23:27.603019 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 14 07:23:28 crc kubenswrapper[5129]: I0314 07:23:28.046333 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="233658de-bdd1-4846-bfbd-69142d762c00" path="/var/lib/kubelet/pods/233658de-bdd1-4846-bfbd-69142d762c00/volumes"
Mar 14 07:23:28 crc kubenswrapper[5129]: I0314 07:23:28.049626 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="743861a7-8a16-4a62-8339-1a02ec991d70" path="/var/lib/kubelet/pods/743861a7-8a16-4a62-8339-1a02ec991d70/volumes"
Mar 14 07:23:28 crc kubenswrapper[5129]: I0314 07:23:28.083089 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 14 07:23:28 crc kubenswrapper[5129]: W0314 07:23:28.085539 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f01515f_9326_4177_86be_a0c4d37d25c8.slice/crio-ebbd8ec2bfed37ad4514964c9e175d8fbfc81364b09dd503f3e5a9a98e5da045 WatchSource:0}: Error finding container ebbd8ec2bfed37ad4514964c9e175d8fbfc81364b09dd503f3e5a9a98e5da045: Status 404 returned error can't find the container with id ebbd8ec2bfed37ad4514964c9e175d8fbfc81364b09dd503f3e5a9a98e5da045
Mar 14 07:23:28 crc kubenswrapper[5129]: I0314 07:23:28.155437 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 14 07:23:28 crc kubenswrapper[5129]: W0314 07:23:28.155993 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1522514_570d_4f82_81f3_15db416cca79.slice/crio-1d684cea37554e980c6273e65ca1c6c9d94f62f664e8718c255bdf7069b65f59 WatchSource:0}: Error finding container 1d684cea37554e980c6273e65ca1c6c9d94f62f664e8718c255bdf7069b65f59: Status 404 returned error can't find the container with id 1d684cea37554e980c6273e65ca1c6c9d94f62f664e8718c255bdf7069b65f59
Mar 14 07:23:28 crc kubenswrapper[5129]: I0314 07:23:28.163118 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6f01515f-9326-4177-86be-a0c4d37d25c8","Type":"ContainerStarted","Data":"ebbd8ec2bfed37ad4514964c9e175d8fbfc81364b09dd503f3e5a9a98e5da045"}
Mar 14 07:23:29 crc kubenswrapper[5129]: I0314 07:23:29.176875 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6f01515f-9326-4177-86be-a0c4d37d25c8","Type":"ContainerStarted","Data":"9cb11ab7bd60eef674003221c4588ad35b5af15ece838388a845616ff46df3c4"}
Mar 14 07:23:29 crc kubenswrapper[5129]: I0314 07:23:29.176924 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6f01515f-9326-4177-86be-a0c4d37d25c8","Type":"ContainerStarted","Data":"05a7f002c3841fb0a42f18259294bf61ebfb9c2780f11bfec0a0c98003b29c23"}
Mar 14 07:23:29 crc kubenswrapper[5129]: I0314 07:23:29.179101 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e1522514-570d-4f82-81f3-15db416cca79","Type":"ContainerStarted","Data":"6a4ca1f337a22366ba5f7fd6ecbb3a61d4421deb0ae3d1dc4bd69ea0c8da7314"}
Mar 14 07:23:29 crc kubenswrapper[5129]: I0314 07:23:29.179170 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e1522514-570d-4f82-81f3-15db416cca79","Type":"ContainerStarted","Data":"1d684cea37554e980c6273e65ca1c6c9d94f62f664e8718c255bdf7069b65f59"}
Mar 14 07:23:29 crc kubenswrapper[5129]: I0314 07:23:29.197570 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.197551653 podStartE2EDuration="2.197551653s" podCreationTimestamp="2026-03-14 07:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:29.190654688 +0000 UTC m=+1471.942569882" watchObservedRunningTime="2026-03-14 07:23:29.197551653 +0000 UTC m=+1471.949466837"
Mar 14 07:23:29 crc kubenswrapper[5129]: I0314 07:23:29.209480 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.209459491 podStartE2EDuration="2.209459491s" podCreationTimestamp="2026-03-14 07:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:29.204360584 +0000 UTC m=+1471.956275798" watchObservedRunningTime="2026-03-14 07:23:29.209459491 +0000 UTC m=+1471.961374685"
Mar 14 07:23:29 crc kubenswrapper[5129]: I0314 07:23:29.668867 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 14 07:23:29 crc kubenswrapper[5129]: I0314 07:23:29.669309 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="99b46a46-8c64-44f3-b7d7-b07c09be258d" containerName="kube-state-metrics" containerID="cri-o://d4f66ee58bcc9a68ef28d7dc15a122f4e873c7523c8d8445de9c76a88aa880b0" gracePeriod=30
Mar 14 07:23:30 crc kubenswrapper[5129]: I0314 07:23:30.188360 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 14 07:23:30 crc kubenswrapper[5129]: I0314 07:23:30.189391 5129 generic.go:334] "Generic (PLEG): container finished" podID="99b46a46-8c64-44f3-b7d7-b07c09be258d" containerID="d4f66ee58bcc9a68ef28d7dc15a122f4e873c7523c8d8445de9c76a88aa880b0" exitCode=2
Mar 14 07:23:30 crc kubenswrapper[5129]: I0314 07:23:30.189427 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"99b46a46-8c64-44f3-b7d7-b07c09be258d","Type":"ContainerDied","Data":"d4f66ee58bcc9a68ef28d7dc15a122f4e873c7523c8d8445de9c76a88aa880b0"}
Mar 14 07:23:30 crc kubenswrapper[5129]: I0314 07:23:30.189466 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"99b46a46-8c64-44f3-b7d7-b07c09be258d","Type":"ContainerDied","Data":"d0044acc42d94bfe174400345b683041caad94b064e53d549876e206c543139a"}
Mar 14 07:23:30 crc kubenswrapper[5129]: I0314 07:23:30.189484 5129 scope.go:117] "RemoveContainer" containerID="d4f66ee58bcc9a68ef28d7dc15a122f4e873c7523c8d8445de9c76a88aa880b0"
Mar 14 07:23:30 crc kubenswrapper[5129]: I0314 07:23:30.221086 5129 scope.go:117] "RemoveContainer" containerID="d4f66ee58bcc9a68ef28d7dc15a122f4e873c7523c8d8445de9c76a88aa880b0"
Mar 14 07:23:30 crc kubenswrapper[5129]: E0314 07:23:30.221595 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f66ee58bcc9a68ef28d7dc15a122f4e873c7523c8d8445de9c76a88aa880b0\": container with ID starting with d4f66ee58bcc9a68ef28d7dc15a122f4e873c7523c8d8445de9c76a88aa880b0 not found: ID does not exist" containerID="d4f66ee58bcc9a68ef28d7dc15a122f4e873c7523c8d8445de9c76a88aa880b0"
Mar 14 07:23:30 crc kubenswrapper[5129]: I0314 07:23:30.221655 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f66ee58bcc9a68ef28d7dc15a122f4e873c7523c8d8445de9c76a88aa880b0"} err="failed to get container status \"d4f66ee58bcc9a68ef28d7dc15a122f4e873c7523c8d8445de9c76a88aa880b0\": rpc error: code = NotFound desc = could not find container \"d4f66ee58bcc9a68ef28d7dc15a122f4e873c7523c8d8445de9c76a88aa880b0\": container with ID starting with d4f66ee58bcc9a68ef28d7dc15a122f4e873c7523c8d8445de9c76a88aa880b0 not found: ID does not exist"
Mar 14 07:23:30 crc kubenswrapper[5129]: I0314 07:23:30.236483 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frx8j\" (UniqueName: \"kubernetes.io/projected/99b46a46-8c64-44f3-b7d7-b07c09be258d-kube-api-access-frx8j\") pod \"99b46a46-8c64-44f3-b7d7-b07c09be258d\" (UID: \"99b46a46-8c64-44f3-b7d7-b07c09be258d\") "
Mar 14 07:23:30 crc kubenswrapper[5129]: I0314 07:23:30.248402 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99b46a46-8c64-44f3-b7d7-b07c09be258d-kube-api-access-frx8j" (OuterVolumeSpecName: "kube-api-access-frx8j") pod "99b46a46-8c64-44f3-b7d7-b07c09be258d" (UID: "99b46a46-8c64-44f3-b7d7-b07c09be258d"). InnerVolumeSpecName "kube-api-access-frx8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:23:30 crc kubenswrapper[5129]: I0314 07:23:30.338287 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frx8j\" (UniqueName: \"kubernetes.io/projected/99b46a46-8c64-44f3-b7d7-b07c09be258d-kube-api-access-frx8j\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.198219 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.238729 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.252315 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.264167 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 14 07:23:31 crc kubenswrapper[5129]: E0314 07:23:31.264749 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99b46a46-8c64-44f3-b7d7-b07c09be258d" containerName="kube-state-metrics"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.264769 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b46a46-8c64-44f3-b7d7-b07c09be258d" containerName="kube-state-metrics"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.264984 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="99b46a46-8c64-44f3-b7d7-b07c09be258d" containerName="kube-state-metrics"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.265719 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.273141 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.273247 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.273995 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.357432 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1d59a327-2c1e-49e7-86b3-51e8b692545a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1d59a327-2c1e-49e7-86b3-51e8b692545a\") " pod="openstack/kube-state-metrics-0"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.357565 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d59a327-2c1e-49e7-86b3-51e8b692545a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1d59a327-2c1e-49e7-86b3-51e8b692545a\") " pod="openstack/kube-state-metrics-0"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.357696 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ldw5\" (UniqueName: \"kubernetes.io/projected/1d59a327-2c1e-49e7-86b3-51e8b692545a-kube-api-access-9ldw5\") pod \"kube-state-metrics-0\" (UID: \"1d59a327-2c1e-49e7-86b3-51e8b692545a\") " pod="openstack/kube-state-metrics-0"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.357895 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d59a327-2c1e-49e7-86b3-51e8b692545a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1d59a327-2c1e-49e7-86b3-51e8b692545a\") " pod="openstack/kube-state-metrics-0"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.459161 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d59a327-2c1e-49e7-86b3-51e8b692545a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1d59a327-2c1e-49e7-86b3-51e8b692545a\") " pod="openstack/kube-state-metrics-0"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.459559 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1d59a327-2c1e-49e7-86b3-51e8b692545a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1d59a327-2c1e-49e7-86b3-51e8b692545a\") " pod="openstack/kube-state-metrics-0"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.459643 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d59a327-2c1e-49e7-86b3-51e8b692545a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1d59a327-2c1e-49e7-86b3-51e8b692545a\") " pod="openstack/kube-state-metrics-0"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.459717 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ldw5\" (UniqueName: \"kubernetes.io/projected/1d59a327-2c1e-49e7-86b3-51e8b692545a-kube-api-access-9ldw5\") pod \"kube-state-metrics-0\" (UID: \"1d59a327-2c1e-49e7-86b3-51e8b692545a\") " pod="openstack/kube-state-metrics-0"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.465335 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1d59a327-2c1e-49e7-86b3-51e8b692545a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1d59a327-2c1e-49e7-86b3-51e8b692545a\") " pod="openstack/kube-state-metrics-0"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.467919 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d59a327-2c1e-49e7-86b3-51e8b692545a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1d59a327-2c1e-49e7-86b3-51e8b692545a\") " pod="openstack/kube-state-metrics-0"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.474457 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d59a327-2c1e-49e7-86b3-51e8b692545a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1d59a327-2c1e-49e7-86b3-51e8b692545a\") " pod="openstack/kube-state-metrics-0"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.477416 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ldw5\" (UniqueName: \"kubernetes.io/projected/1d59a327-2c1e-49e7-86b3-51e8b692545a-kube-api-access-9ldw5\") pod \"kube-state-metrics-0\" (UID: \"1d59a327-2c1e-49e7-86b3-51e8b692545a\") " pod="openstack/kube-state-metrics-0"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.489059 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.582264 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.582529 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerName="ceilometer-central-agent" containerID="cri-o://f1ab5e3229d7acb95d3eaea771d1a7955c52b913499168ee1ea37df94246515b" gracePeriod=30
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.582633 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerName="sg-core" containerID="cri-o://65d47493973c79787014de7c5903612a5282e2795407a74d810c9c3fe5e32a51" gracePeriod=30
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.582693 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerName="ceilometer-notification-agent" containerID="cri-o://f68dbdb9c34b9a6f0c6a384c150d198248ac13725efebfdfbe21cee430b64510" gracePeriod=30
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.582895 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerName="proxy-httpd" containerID="cri-o://b3fb74538ed34b936845fe007c5d7a0cf991bc11d8f7e9baa7027731601596c9" gracePeriod=30
Mar 14 07:23:31 crc kubenswrapper[5129]: I0314 07:23:31.592181 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 14 07:23:32 crc kubenswrapper[5129]: W0314 07:23:32.043297 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d59a327_2c1e_49e7_86b3_51e8b692545a.slice/crio-4b80b0e9d56829972a7062c820c2450513b45e755db21d1e322d878b8218ede6 WatchSource:0}: Error finding container 4b80b0e9d56829972a7062c820c2450513b45e755db21d1e322d878b8218ede6: Status 404 returned error can't find the container with id 4b80b0e9d56829972a7062c820c2450513b45e755db21d1e322d878b8218ede6
Mar 14 07:23:32 crc kubenswrapper[5129]: I0314 07:23:32.048102 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99b46a46-8c64-44f3-b7d7-b07c09be258d" path="/var/lib/kubelet/pods/99b46a46-8c64-44f3-b7d7-b07c09be258d/volumes"
Mar 14 07:23:32 crc kubenswrapper[5129]: I0314 07:23:32.048798 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 14 07:23:32 crc kubenswrapper[5129]: I0314 07:23:32.209743 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1d59a327-2c1e-49e7-86b3-51e8b692545a","Type":"ContainerStarted","Data":"4b80b0e9d56829972a7062c820c2450513b45e755db21d1e322d878b8218ede6"}
Mar 14 07:23:32 crc kubenswrapper[5129]: I0314 07:23:32.213115 5129 generic.go:334] "Generic (PLEG): container finished" podID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerID="b3fb74538ed34b936845fe007c5d7a0cf991bc11d8f7e9baa7027731601596c9" exitCode=0
Mar 14 07:23:32 crc kubenswrapper[5129]: I0314 07:23:32.213139 5129 generic.go:334] "Generic (PLEG): container finished" podID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerID="65d47493973c79787014de7c5903612a5282e2795407a74d810c9c3fe5e32a51" exitCode=2
Mar 14 07:23:32 crc kubenswrapper[5129]: I0314 07:23:32.213149 5129 generic.go:334] "Generic (PLEG): container finished" podID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerID="f1ab5e3229d7acb95d3eaea771d1a7955c52b913499168ee1ea37df94246515b" exitCode=0
Mar 14 07:23:32 crc kubenswrapper[5129]: I0314 07:23:32.213164 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0e587b9-b290-49ee-8a37-e2361f851cce","Type":"ContainerDied","Data":"b3fb74538ed34b936845fe007c5d7a0cf991bc11d8f7e9baa7027731601596c9"}
Mar 14 07:23:32 crc kubenswrapper[5129]: I0314 07:23:32.213180 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0e587b9-b290-49ee-8a37-e2361f851cce","Type":"ContainerDied","Data":"65d47493973c79787014de7c5903612a5282e2795407a74d810c9c3fe5e32a51"}
Mar 14 07:23:32 crc kubenswrapper[5129]: I0314 07:23:32.213190 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0e587b9-b290-49ee-8a37-e2361f851cce","Type":"ContainerDied","Data":"f1ab5e3229d7acb95d3eaea771d1a7955c52b913499168ee1ea37df94246515b"}
Mar 14 07:23:32 crc kubenswrapper[5129]: I0314 07:23:32.559304 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 14 07:23:32 crc kubenswrapper[5129]: I0314 07:23:32.559546 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 14 07:23:32 crc kubenswrapper[5129]: I0314 07:23:32.603940 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 14 07:23:33 crc kubenswrapper[5129]: I0314 07:23:33.224999 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1d59a327-2c1e-49e7-86b3-51e8b692545a","Type":"ContainerStarted","Data":"3fa18634f1bb9a6d07527df4350eeb4bdd0a91a0e97abb6e976b1194f1b93cd5"}
Mar 14 07:23:33 crc kubenswrapper[5129]: I0314 07:23:33.226119 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 14 07:23:33 crc kubenswrapper[5129]: I0314 07:23:33.251315 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.906266033 podStartE2EDuration="2.251294305s" podCreationTimestamp="2026-03-14 07:23:31 +0000 UTC" firstStartedPulling="2026-03-14 07:23:32.045729508 +0000 UTC m=+1474.797644692" lastFinishedPulling="2026-03-14 07:23:32.39075778 +0000 UTC m=+1475.142672964" observedRunningTime="2026-03-14 07:23:33.250423381 +0000 UTC m=+1476.002338565" watchObservedRunningTime="2026-03-14 07:23:33.251294305 +0000 UTC m=+1476.003209499"
Mar 14 07:23:33 crc kubenswrapper[5129]: I0314 07:23:33.575932 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 07:23:33 crc kubenswrapper[5129]: I0314 07:23:33.575938 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 07:23:34 crc kubenswrapper[5129]: I0314 07:23:34.985038 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.027088 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-config-data\") pod \"d0e587b9-b290-49ee-8a37-e2361f851cce\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") "
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.027342 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snq5x\" (UniqueName: \"kubernetes.io/projected/d0e587b9-b290-49ee-8a37-e2361f851cce-kube-api-access-snq5x\") pod \"d0e587b9-b290-49ee-8a37-e2361f851cce\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") "
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.027477 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-sg-core-conf-yaml\") pod \"d0e587b9-b290-49ee-8a37-e2361f851cce\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") "
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.027646 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0e587b9-b290-49ee-8a37-e2361f851cce-run-httpd\") pod \"d0e587b9-b290-49ee-8a37-e2361f851cce\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") "
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.027818 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-scripts\") pod \"d0e587b9-b290-49ee-8a37-e2361f851cce\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") "
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.027984 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-combined-ca-bundle\") pod \"d0e587b9-b290-49ee-8a37-e2361f851cce\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") "
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.028106 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0e587b9-b290-49ee-8a37-e2361f851cce-log-httpd\") pod \"d0e587b9-b290-49ee-8a37-e2361f851cce\" (UID: \"d0e587b9-b290-49ee-8a37-e2361f851cce\") "
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.029375 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0e587b9-b290-49ee-8a37-e2361f851cce-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d0e587b9-b290-49ee-8a37-e2361f851cce" (UID: "d0e587b9-b290-49ee-8a37-e2361f851cce"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.030696 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0e587b9-b290-49ee-8a37-e2361f851cce-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d0e587b9-b290-49ee-8a37-e2361f851cce" (UID: "d0e587b9-b290-49ee-8a37-e2361f851cce"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.035614 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0e587b9-b290-49ee-8a37-e2361f851cce-kube-api-access-snq5x" (OuterVolumeSpecName: "kube-api-access-snq5x") pod "d0e587b9-b290-49ee-8a37-e2361f851cce" (UID: "d0e587b9-b290-49ee-8a37-e2361f851cce"). InnerVolumeSpecName "kube-api-access-snq5x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.036172 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-scripts" (OuterVolumeSpecName: "scripts") pod "d0e587b9-b290-49ee-8a37-e2361f851cce" (UID: "d0e587b9-b290-49ee-8a37-e2361f851cce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.061714 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d0e587b9-b290-49ee-8a37-e2361f851cce" (UID: "d0e587b9-b290-49ee-8a37-e2361f851cce"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.119234 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0e587b9-b290-49ee-8a37-e2361f851cce" (UID: "d0e587b9-b290-49ee-8a37-e2361f851cce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.129850 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.129871 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.129883 5129 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0e587b9-b290-49ee-8a37-e2361f851cce-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.129894 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snq5x\" (UniqueName: \"kubernetes.io/projected/d0e587b9-b290-49ee-8a37-e2361f851cce-kube-api-access-snq5x\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.129902 5129 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.129911 5129 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0e587b9-b290-49ee-8a37-e2361f851cce-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.168858 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-config-data" (OuterVolumeSpecName: "config-data") pod "d0e587b9-b290-49ee-8a37-e2361f851cce" (UID: "d0e587b9-b290-49ee-8a37-e2361f851cce").
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.232250 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0e587b9-b290-49ee-8a37-e2361f851cce-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.248746 5129 generic.go:334] "Generic (PLEG): container finished" podID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerID="f68dbdb9c34b9a6f0c6a384c150d198248ac13725efebfdfbe21cee430b64510" exitCode=0 Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.248889 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0e587b9-b290-49ee-8a37-e2361f851cce","Type":"ContainerDied","Data":"f68dbdb9c34b9a6f0c6a384c150d198248ac13725efebfdfbe21cee430b64510"} Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.249294 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0e587b9-b290-49ee-8a37-e2361f851cce","Type":"ContainerDied","Data":"55a0c0fdc935f2c37fa0d21d255e21e47a2330198fdfdef18b48a61132dc7275"} Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.249380 5129 scope.go:117] "RemoveContainer" containerID="b3fb74538ed34b936845fe007c5d7a0cf991bc11d8f7e9baa7027731601596c9" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.249020 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.318171 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.325497 5129 scope.go:117] "RemoveContainer" containerID="65d47493973c79787014de7c5903612a5282e2795407a74d810c9c3fe5e32a51" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.328887 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.340561 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:35 crc kubenswrapper[5129]: E0314 07:23:35.341192 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerName="ceilometer-notification-agent" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.341271 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerName="ceilometer-notification-agent" Mar 14 07:23:35 crc kubenswrapper[5129]: E0314 07:23:35.341351 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerName="sg-core" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.341444 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerName="sg-core" Mar 14 07:23:35 crc kubenswrapper[5129]: E0314 07:23:35.341515 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerName="proxy-httpd" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.341576 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerName="proxy-httpd" Mar 14 07:23:35 crc kubenswrapper[5129]: E0314 07:23:35.341671 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerName="ceilometer-central-agent" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.341733 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerName="ceilometer-central-agent" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.341952 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerName="ceilometer-central-agent" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.342029 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerName="ceilometer-notification-agent" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.342097 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerName="sg-core" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.342167 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0e587b9-b290-49ee-8a37-e2361f851cce" containerName="proxy-httpd" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.343759 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.352109 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.354121 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.354357 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.354569 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.361909 5129 scope.go:117] "RemoveContainer" containerID="f68dbdb9c34b9a6f0c6a384c150d198248ac13725efebfdfbe21cee430b64510" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.387720 5129 scope.go:117] "RemoveContainer" containerID="f1ab5e3229d7acb95d3eaea771d1a7955c52b913499168ee1ea37df94246515b" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.408072 5129 scope.go:117] "RemoveContainer" containerID="b3fb74538ed34b936845fe007c5d7a0cf991bc11d8f7e9baa7027731601596c9" Mar 14 07:23:35 crc kubenswrapper[5129]: E0314 07:23:35.408585 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3fb74538ed34b936845fe007c5d7a0cf991bc11d8f7e9baa7027731601596c9\": container with ID starting with b3fb74538ed34b936845fe007c5d7a0cf991bc11d8f7e9baa7027731601596c9 not found: ID does not exist" containerID="b3fb74538ed34b936845fe007c5d7a0cf991bc11d8f7e9baa7027731601596c9" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.408685 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3fb74538ed34b936845fe007c5d7a0cf991bc11d8f7e9baa7027731601596c9"} err="failed to get container status 
\"b3fb74538ed34b936845fe007c5d7a0cf991bc11d8f7e9baa7027731601596c9\": rpc error: code = NotFound desc = could not find container \"b3fb74538ed34b936845fe007c5d7a0cf991bc11d8f7e9baa7027731601596c9\": container with ID starting with b3fb74538ed34b936845fe007c5d7a0cf991bc11d8f7e9baa7027731601596c9 not found: ID does not exist" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.408735 5129 scope.go:117] "RemoveContainer" containerID="65d47493973c79787014de7c5903612a5282e2795407a74d810c9c3fe5e32a51" Mar 14 07:23:35 crc kubenswrapper[5129]: E0314 07:23:35.409043 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65d47493973c79787014de7c5903612a5282e2795407a74d810c9c3fe5e32a51\": container with ID starting with 65d47493973c79787014de7c5903612a5282e2795407a74d810c9c3fe5e32a51 not found: ID does not exist" containerID="65d47493973c79787014de7c5903612a5282e2795407a74d810c9c3fe5e32a51" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.409065 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65d47493973c79787014de7c5903612a5282e2795407a74d810c9c3fe5e32a51"} err="failed to get container status \"65d47493973c79787014de7c5903612a5282e2795407a74d810c9c3fe5e32a51\": rpc error: code = NotFound desc = could not find container \"65d47493973c79787014de7c5903612a5282e2795407a74d810c9c3fe5e32a51\": container with ID starting with 65d47493973c79787014de7c5903612a5282e2795407a74d810c9c3fe5e32a51 not found: ID does not exist" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.409079 5129 scope.go:117] "RemoveContainer" containerID="f68dbdb9c34b9a6f0c6a384c150d198248ac13725efebfdfbe21cee430b64510" Mar 14 07:23:35 crc kubenswrapper[5129]: E0314 07:23:35.409364 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f68dbdb9c34b9a6f0c6a384c150d198248ac13725efebfdfbe21cee430b64510\": container with ID starting with f68dbdb9c34b9a6f0c6a384c150d198248ac13725efebfdfbe21cee430b64510 not found: ID does not exist" containerID="f68dbdb9c34b9a6f0c6a384c150d198248ac13725efebfdfbe21cee430b64510" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.409415 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f68dbdb9c34b9a6f0c6a384c150d198248ac13725efebfdfbe21cee430b64510"} err="failed to get container status \"f68dbdb9c34b9a6f0c6a384c150d198248ac13725efebfdfbe21cee430b64510\": rpc error: code = NotFound desc = could not find container \"f68dbdb9c34b9a6f0c6a384c150d198248ac13725efebfdfbe21cee430b64510\": container with ID starting with f68dbdb9c34b9a6f0c6a384c150d198248ac13725efebfdfbe21cee430b64510 not found: ID does not exist" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.409449 5129 scope.go:117] "RemoveContainer" containerID="f1ab5e3229d7acb95d3eaea771d1a7955c52b913499168ee1ea37df94246515b" Mar 14 07:23:35 crc kubenswrapper[5129]: E0314 07:23:35.409735 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1ab5e3229d7acb95d3eaea771d1a7955c52b913499168ee1ea37df94246515b\": container with ID starting with f1ab5e3229d7acb95d3eaea771d1a7955c52b913499168ee1ea37df94246515b not found: ID does not exist" containerID="f1ab5e3229d7acb95d3eaea771d1a7955c52b913499168ee1ea37df94246515b" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.409757 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1ab5e3229d7acb95d3eaea771d1a7955c52b913499168ee1ea37df94246515b"} err="failed to get container status \"f1ab5e3229d7acb95d3eaea771d1a7955c52b913499168ee1ea37df94246515b\": rpc error: code = NotFound desc = could not find container \"f1ab5e3229d7acb95d3eaea771d1a7955c52b913499168ee1ea37df94246515b\": container with ID 
starting with f1ab5e3229d7acb95d3eaea771d1a7955c52b913499168ee1ea37df94246515b not found: ID does not exist" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.435938 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.436013 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-scripts\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.436058 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.436098 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/333065cb-8979-4f3a-bef7-aea9960faed5-run-httpd\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.436164 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/333065cb-8979-4f3a-bef7-aea9960faed5-log-httpd\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc 
kubenswrapper[5129]: I0314 07:23:35.436196 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.436228 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-config-data\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.436264 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t9ml\" (UniqueName: \"kubernetes.io/projected/333065cb-8979-4f3a-bef7-aea9960faed5-kube-api-access-4t9ml\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.537008 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t9ml\" (UniqueName: \"kubernetes.io/projected/333065cb-8979-4f3a-bef7-aea9960faed5-kube-api-access-4t9ml\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.537116 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.537151 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-scripts\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.537170 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.537194 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/333065cb-8979-4f3a-bef7-aea9960faed5-run-httpd\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.537238 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/333065cb-8979-4f3a-bef7-aea9960faed5-log-httpd\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.537257 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.537284 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-config-data\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc 
kubenswrapper[5129]: I0314 07:23:35.537985 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/333065cb-8979-4f3a-bef7-aea9960faed5-log-httpd\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.538097 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/333065cb-8979-4f3a-bef7-aea9960faed5-run-httpd\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.542177 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.542671 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.542850 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-scripts\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.543149 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-config-data\") pod \"ceilometer-0\" (UID: 
\"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.544105 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.553081 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t9ml\" (UniqueName: \"kubernetes.io/projected/333065cb-8979-4f3a-bef7-aea9960faed5-kube-api-access-4t9ml\") pod \"ceilometer-0\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " pod="openstack/ceilometer-0" Mar 14 07:23:35 crc kubenswrapper[5129]: I0314 07:23:35.670914 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:23:36 crc kubenswrapper[5129]: I0314 07:23:36.046309 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0e587b9-b290-49ee-8a37-e2361f851cce" path="/var/lib/kubelet/pods/d0e587b9-b290-49ee-8a37-e2361f851cce/volumes" Mar 14 07:23:36 crc kubenswrapper[5129]: I0314 07:23:36.122156 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:36 crc kubenswrapper[5129]: W0314 07:23:36.136478 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod333065cb_8979_4f3a_bef7_aea9960faed5.slice/crio-33f1e822a34650ee37a8acbfb438c1a0691773095ebb51f93bb943b57046dd2a WatchSource:0}: Error finding container 33f1e822a34650ee37a8acbfb438c1a0691773095ebb51f93bb943b57046dd2a: Status 404 returned error can't find the container with id 33f1e822a34650ee37a8acbfb438c1a0691773095ebb51f93bb943b57046dd2a Mar 14 07:23:36 crc kubenswrapper[5129]: I0314 07:23:36.264668 5129 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"333065cb-8979-4f3a-bef7-aea9960faed5","Type":"ContainerStarted","Data":"33f1e822a34650ee37a8acbfb438c1a0691773095ebb51f93bb943b57046dd2a"} Mar 14 07:23:37 crc kubenswrapper[5129]: I0314 07:23:37.279842 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"333065cb-8979-4f3a-bef7-aea9960faed5","Type":"ContainerStarted","Data":"702852aa97619ff17b9aca405fa06fca1cdf0c0453611a14dd01bcea9b81beea"} Mar 14 07:23:37 crc kubenswrapper[5129]: I0314 07:23:37.592813 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 07:23:37 crc kubenswrapper[5129]: I0314 07:23:37.592871 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 07:23:37 crc kubenswrapper[5129]: I0314 07:23:37.603461 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 14 07:23:37 crc kubenswrapper[5129]: I0314 07:23:37.664152 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 14 07:23:38 crc kubenswrapper[5129]: I0314 07:23:38.291987 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"333065cb-8979-4f3a-bef7-aea9960faed5","Type":"ContainerStarted","Data":"67ebd3e0900cda153a2d7c86c88cc56e049e6b848fa7a51dc283e67ec1570d63"} Mar 14 07:23:38 crc kubenswrapper[5129]: I0314 07:23:38.292382 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"333065cb-8979-4f3a-bef7-aea9960faed5","Type":"ContainerStarted","Data":"d43d62fd05411abc521ab71d3e3bfd276a3089d7c7b33fda842f74fa6dfec765"} Mar 14 07:23:38 crc kubenswrapper[5129]: I0314 07:23:38.339413 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 14 07:23:38 crc kubenswrapper[5129]: I0314 
07:23:38.675793 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6f01515f-9326-4177-86be-a0c4d37d25c8" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 07:23:38 crc kubenswrapper[5129]: I0314 07:23:38.675860 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6f01515f-9326-4177-86be-a0c4d37d25c8" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 07:23:40 crc kubenswrapper[5129]: I0314 07:23:40.313425 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"333065cb-8979-4f3a-bef7-aea9960faed5","Type":"ContainerStarted","Data":"f2902aa5fe7fae1aa8da4bac45117a231c78db1e9fdc0bbe4629366a2ddd3ffd"} Mar 14 07:23:40 crc kubenswrapper[5129]: I0314 07:23:40.314086 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 07:23:40 crc kubenswrapper[5129]: I0314 07:23:40.337883 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.808085369 podStartE2EDuration="5.337862322s" podCreationTimestamp="2026-03-14 07:23:35 +0000 UTC" firstStartedPulling="2026-03-14 07:23:36.140058366 +0000 UTC m=+1478.891973550" lastFinishedPulling="2026-03-14 07:23:39.669835319 +0000 UTC m=+1482.421750503" observedRunningTime="2026-03-14 07:23:40.334196674 +0000 UTC m=+1483.086111858" watchObservedRunningTime="2026-03-14 07:23:40.337862322 +0000 UTC m=+1483.089777516" Mar 14 07:23:40 crc kubenswrapper[5129]: I0314 07:23:40.559776 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 07:23:40 crc kubenswrapper[5129]: I0314 07:23:40.559832 5129 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 07:23:41 crc kubenswrapper[5129]: I0314 07:23:41.604179 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 14 07:23:42 crc kubenswrapper[5129]: I0314 07:23:42.565651 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 14 07:23:42 crc kubenswrapper[5129]: I0314 07:23:42.575460 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 14 07:23:42 crc kubenswrapper[5129]: I0314 07:23:42.577044 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 14 07:23:43 crc kubenswrapper[5129]: I0314 07:23:43.360210 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 14 07:23:45 crc kubenswrapper[5129]: I0314 07:23:45.592639 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 07:23:45 crc kubenswrapper[5129]: I0314 07:23:45.593123 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 07:23:46 crc kubenswrapper[5129]: I0314 07:23:46.387030 5129 generic.go:334] "Generic (PLEG): container finished" podID="61d5618c-9c6e-4837-a29e-132de4b39fb4" containerID="0b9da3364ea01b83674a5a6e3b83350d9864f04d94c06411a5408eb52341cc40" exitCode=137 Mar 14 07:23:46 crc kubenswrapper[5129]: I0314 07:23:46.387122 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"61d5618c-9c6e-4837-a29e-132de4b39fb4","Type":"ContainerDied","Data":"0b9da3364ea01b83674a5a6e3b83350d9864f04d94c06411a5408eb52341cc40"} Mar 14 07:23:46 crc kubenswrapper[5129]: I0314 07:23:46.475864 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:46 crc kubenswrapper[5129]: I0314 07:23:46.655820 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61d5618c-9c6e-4837-a29e-132de4b39fb4-config-data\") pod \"61d5618c-9c6e-4837-a29e-132de4b39fb4\" (UID: \"61d5618c-9c6e-4837-a29e-132de4b39fb4\") " Mar 14 07:23:46 crc kubenswrapper[5129]: I0314 07:23:46.656699 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d5618c-9c6e-4837-a29e-132de4b39fb4-combined-ca-bundle\") pod \"61d5618c-9c6e-4837-a29e-132de4b39fb4\" (UID: \"61d5618c-9c6e-4837-a29e-132de4b39fb4\") " Mar 14 07:23:46 crc kubenswrapper[5129]: I0314 07:23:46.657722 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzp25\" (UniqueName: \"kubernetes.io/projected/61d5618c-9c6e-4837-a29e-132de4b39fb4-kube-api-access-kzp25\") pod \"61d5618c-9c6e-4837-a29e-132de4b39fb4\" (UID: \"61d5618c-9c6e-4837-a29e-132de4b39fb4\") " Mar 14 07:23:46 crc kubenswrapper[5129]: I0314 07:23:46.662344 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61d5618c-9c6e-4837-a29e-132de4b39fb4-kube-api-access-kzp25" (OuterVolumeSpecName: "kube-api-access-kzp25") pod "61d5618c-9c6e-4837-a29e-132de4b39fb4" (UID: "61d5618c-9c6e-4837-a29e-132de4b39fb4"). InnerVolumeSpecName "kube-api-access-kzp25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:46 crc kubenswrapper[5129]: I0314 07:23:46.693180 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61d5618c-9c6e-4837-a29e-132de4b39fb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61d5618c-9c6e-4837-a29e-132de4b39fb4" (UID: "61d5618c-9c6e-4837-a29e-132de4b39fb4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:46 crc kubenswrapper[5129]: I0314 07:23:46.698889 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61d5618c-9c6e-4837-a29e-132de4b39fb4-config-data" (OuterVolumeSpecName: "config-data") pod "61d5618c-9c6e-4837-a29e-132de4b39fb4" (UID: "61d5618c-9c6e-4837-a29e-132de4b39fb4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:46 crc kubenswrapper[5129]: I0314 07:23:46.760753 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzp25\" (UniqueName: \"kubernetes.io/projected/61d5618c-9c6e-4837-a29e-132de4b39fb4-kube-api-access-kzp25\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:46 crc kubenswrapper[5129]: I0314 07:23:46.760798 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61d5618c-9c6e-4837-a29e-132de4b39fb4-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:46 crc kubenswrapper[5129]: I0314 07:23:46.760808 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d5618c-9c6e-4837-a29e-132de4b39fb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.403692 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"61d5618c-9c6e-4837-a29e-132de4b39fb4","Type":"ContainerDied","Data":"b4f600446ad9ade115ee60899571cc449305d37a122faf39908f60b8d1aae567"} Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.403778 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.403797 5129 scope.go:117] "RemoveContainer" containerID="0b9da3364ea01b83674a5a6e3b83350d9864f04d94c06411a5408eb52341cc40" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.463965 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.489311 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.515487 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:23:47 crc kubenswrapper[5129]: E0314 07:23:47.515952 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61d5618c-9c6e-4837-a29e-132de4b39fb4" containerName="nova-cell1-novncproxy-novncproxy" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.515971 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="61d5618c-9c6e-4837-a29e-132de4b39fb4" containerName="nova-cell1-novncproxy-novncproxy" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.516215 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="61d5618c-9c6e-4837-a29e-132de4b39fb4" containerName="nova-cell1-novncproxy-novncproxy" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.516869 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.516954 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.543044 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.545143 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.545394 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.595716 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.596368 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.599956 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.679868 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.679962 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.680324 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.680450 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvtjl\" (UniqueName: \"kubernetes.io/projected/f7c2c75d-2b97-4e81-9701-486cee85dd93-kube-api-access-zvtjl\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.680497 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.782508 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.782863 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.782940 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zvtjl\" (UniqueName: \"kubernetes.io/projected/f7c2c75d-2b97-4e81-9701-486cee85dd93-kube-api-access-zvtjl\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.782981 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.783071 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.787922 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.789457 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.789522 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.793326 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.802759 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvtjl\" (UniqueName: \"kubernetes.io/projected/f7c2c75d-2b97-4e81-9701-486cee85dd93-kube-api-access-zvtjl\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:47 crc kubenswrapper[5129]: I0314 07:23:47.870546 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.054835 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61d5618c-9c6e-4837-a29e-132de4b39fb4" path="/var/lib/kubelet/pods/61d5618c-9c6e-4837-a29e-132de4b39fb4/volumes" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.377236 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:23:48 crc kubenswrapper[5129]: W0314 07:23:48.380722 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7c2c75d_2b97_4e81_9701_486cee85dd93.slice/crio-ca8e5265187f91cdf39d619d2ef915c6c6f0cd9b3c4a9b939bb3f60eb9b093e2 WatchSource:0}: Error finding container ca8e5265187f91cdf39d619d2ef915c6c6f0cd9b3c4a9b939bb3f60eb9b093e2: Status 404 returned error can't find the container with id ca8e5265187f91cdf39d619d2ef915c6c6f0cd9b3c4a9b939bb3f60eb9b093e2 Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.416791 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7c2c75d-2b97-4e81-9701-486cee85dd93","Type":"ContainerStarted","Data":"ca8e5265187f91cdf39d619d2ef915c6c6f0cd9b3c4a9b939bb3f60eb9b093e2"} Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.421020 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.609764 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-rmw6j"] Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.613234 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.624277 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-rmw6j"] Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.715733 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-ovsdbserver-nb\") pod \"dnsmasq-dns-fdb8f6449-rmw6j\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.715844 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-config\") pod \"dnsmasq-dns-fdb8f6449-rmw6j\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.716015 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-dns-swift-storage-0\") pod \"dnsmasq-dns-fdb8f6449-rmw6j\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.716166 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-dns-svc\") pod \"dnsmasq-dns-fdb8f6449-rmw6j\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.716213 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-ovsdbserver-sb\") pod \"dnsmasq-dns-fdb8f6449-rmw6j\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.716329 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv2pn\" (UniqueName: \"kubernetes.io/projected/9004e305-ef45-437a-a38c-b50c9d1f1ff7-kube-api-access-gv2pn\") pod \"dnsmasq-dns-fdb8f6449-rmw6j\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.817964 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-ovsdbserver-nb\") pod \"dnsmasq-dns-fdb8f6449-rmw6j\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.818079 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-config\") pod \"dnsmasq-dns-fdb8f6449-rmw6j\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.818125 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-dns-swift-storage-0\") pod \"dnsmasq-dns-fdb8f6449-rmw6j\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.818173 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-dns-svc\") pod \"dnsmasq-dns-fdb8f6449-rmw6j\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.818203 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-ovsdbserver-sb\") pod \"dnsmasq-dns-fdb8f6449-rmw6j\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.818234 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv2pn\" (UniqueName: \"kubernetes.io/projected/9004e305-ef45-437a-a38c-b50c9d1f1ff7-kube-api-access-gv2pn\") pod \"dnsmasq-dns-fdb8f6449-rmw6j\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.818966 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-ovsdbserver-nb\") pod \"dnsmasq-dns-fdb8f6449-rmw6j\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.819366 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-dns-svc\") pod \"dnsmasq-dns-fdb8f6449-rmw6j\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.820806 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-config\") pod 
\"dnsmasq-dns-fdb8f6449-rmw6j\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.821472 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-ovsdbserver-sb\") pod \"dnsmasq-dns-fdb8f6449-rmw6j\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.822210 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-dns-swift-storage-0\") pod \"dnsmasq-dns-fdb8f6449-rmw6j\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.836009 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv2pn\" (UniqueName: \"kubernetes.io/projected/9004e305-ef45-437a-a38c-b50c9d1f1ff7-kube-api-access-gv2pn\") pod \"dnsmasq-dns-fdb8f6449-rmw6j\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:48 crc kubenswrapper[5129]: I0314 07:23:48.943091 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:49 crc kubenswrapper[5129]: I0314 07:23:49.455840 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7c2c75d-2b97-4e81-9701-486cee85dd93","Type":"ContainerStarted","Data":"a0a68df17c077a14641c9e1331ad90f35cbd875eefb6a62fd05796b5756983a2"} Mar 14 07:23:49 crc kubenswrapper[5129]: I0314 07:23:49.472226 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.47220707 podStartE2EDuration="2.47220707s" podCreationTimestamp="2026-03-14 07:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:49.469139127 +0000 UTC m=+1492.221054321" watchObservedRunningTime="2026-03-14 07:23:49.47220707 +0000 UTC m=+1492.224122254" Mar 14 07:23:50 crc kubenswrapper[5129]: W0314 07:23:50.194170 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9004e305_ef45_437a_a38c_b50c9d1f1ff7.slice/crio-88fdc29c1e015191e1cd403fdfa3d15327812e9394854fd6ac94fe015610d680 WatchSource:0}: Error finding container 88fdc29c1e015191e1cd403fdfa3d15327812e9394854fd6ac94fe015610d680: Status 404 returned error can't find the container with id 88fdc29c1e015191e1cd403fdfa3d15327812e9394854fd6ac94fe015610d680 Mar 14 07:23:50 crc kubenswrapper[5129]: I0314 07:23:50.200569 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-rmw6j"] Mar 14 07:23:50 crc kubenswrapper[5129]: I0314 07:23:50.381887 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:50 crc kubenswrapper[5129]: I0314 07:23:50.382879 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="333065cb-8979-4f3a-bef7-aea9960faed5" containerName="ceilometer-central-agent" containerID="cri-o://702852aa97619ff17b9aca405fa06fca1cdf0c0453611a14dd01bcea9b81beea" gracePeriod=30 Mar 14 07:23:50 crc kubenswrapper[5129]: I0314 07:23:50.385840 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="333065cb-8979-4f3a-bef7-aea9960faed5" containerName="sg-core" containerID="cri-o://67ebd3e0900cda153a2d7c86c88cc56e049e6b848fa7a51dc283e67ec1570d63" gracePeriod=30 Mar 14 07:23:50 crc kubenswrapper[5129]: I0314 07:23:50.385941 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="333065cb-8979-4f3a-bef7-aea9960faed5" containerName="ceilometer-notification-agent" containerID="cri-o://d43d62fd05411abc521ab71d3e3bfd276a3089d7c7b33fda842f74fa6dfec765" gracePeriod=30 Mar 14 07:23:50 crc kubenswrapper[5129]: I0314 07:23:50.388659 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="333065cb-8979-4f3a-bef7-aea9960faed5" containerName="proxy-httpd" containerID="cri-o://f2902aa5fe7fae1aa8da4bac45117a231c78db1e9fdc0bbe4629366a2ddd3ffd" gracePeriod=30 Mar 14 07:23:50 crc kubenswrapper[5129]: I0314 07:23:50.402127 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 14 07:23:50 crc kubenswrapper[5129]: I0314 07:23:50.490409 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" event={"ID":"9004e305-ef45-437a-a38c-b50c9d1f1ff7","Type":"ContainerStarted","Data":"248d4f00157a530cd3ea73e75f32845dece93694cb179c040ed0222eb822593e"} Mar 14 07:23:50 crc kubenswrapper[5129]: I0314 07:23:50.490460 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" 
event={"ID":"9004e305-ef45-437a-a38c-b50c9d1f1ff7","Type":"ContainerStarted","Data":"88fdc29c1e015191e1cd403fdfa3d15327812e9394854fd6ac94fe015610d680"} Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.239974 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.506959 5129 generic.go:334] "Generic (PLEG): container finished" podID="333065cb-8979-4f3a-bef7-aea9960faed5" containerID="f2902aa5fe7fae1aa8da4bac45117a231c78db1e9fdc0bbe4629366a2ddd3ffd" exitCode=0 Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.507180 5129 generic.go:334] "Generic (PLEG): container finished" podID="333065cb-8979-4f3a-bef7-aea9960faed5" containerID="67ebd3e0900cda153a2d7c86c88cc56e049e6b848fa7a51dc283e67ec1570d63" exitCode=2 Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.507192 5129 generic.go:334] "Generic (PLEG): container finished" podID="333065cb-8979-4f3a-bef7-aea9960faed5" containerID="d43d62fd05411abc521ab71d3e3bfd276a3089d7c7b33fda842f74fa6dfec765" exitCode=0 Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.507200 5129 generic.go:334] "Generic (PLEG): container finished" podID="333065cb-8979-4f3a-bef7-aea9960faed5" containerID="702852aa97619ff17b9aca405fa06fca1cdf0c0453611a14dd01bcea9b81beea" exitCode=0 Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.507142 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"333065cb-8979-4f3a-bef7-aea9960faed5","Type":"ContainerDied","Data":"f2902aa5fe7fae1aa8da4bac45117a231c78db1e9fdc0bbe4629366a2ddd3ffd"} Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.507298 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"333065cb-8979-4f3a-bef7-aea9960faed5","Type":"ContainerDied","Data":"67ebd3e0900cda153a2d7c86c88cc56e049e6b848fa7a51dc283e67ec1570d63"} Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.507313 5129 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"333065cb-8979-4f3a-bef7-aea9960faed5","Type":"ContainerDied","Data":"d43d62fd05411abc521ab71d3e3bfd276a3089d7c7b33fda842f74fa6dfec765"} Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.507322 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"333065cb-8979-4f3a-bef7-aea9960faed5","Type":"ContainerDied","Data":"702852aa97619ff17b9aca405fa06fca1cdf0c0453611a14dd01bcea9b81beea"} Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.509373 5129 generic.go:334] "Generic (PLEG): container finished" podID="9004e305-ef45-437a-a38c-b50c9d1f1ff7" containerID="248d4f00157a530cd3ea73e75f32845dece93694cb179c040ed0222eb822593e" exitCode=0 Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.509533 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6f01515f-9326-4177-86be-a0c4d37d25c8" containerName="nova-api-log" containerID="cri-o://05a7f002c3841fb0a42f18259294bf61ebfb9c2780f11bfec0a0c98003b29c23" gracePeriod=30 Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.510359 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" event={"ID":"9004e305-ef45-437a-a38c-b50c9d1f1ff7","Type":"ContainerDied","Data":"248d4f00157a530cd3ea73e75f32845dece93694cb179c040ed0222eb822593e"} Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.510393 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" event={"ID":"9004e305-ef45-437a-a38c-b50c9d1f1ff7","Type":"ContainerStarted","Data":"df33d7fbb10361dc5cb6e14bde3e758c886880e563b624273a3f21cdeda670f9"} Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.510454 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6f01515f-9326-4177-86be-a0c4d37d25c8" containerName="nova-api-api" 
containerID="cri-o://9cb11ab7bd60eef674003221c4588ad35b5af15ece838388a845616ff46df3c4" gracePeriod=30 Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.510621 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.540708 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" podStartSLOduration=3.540689795 podStartE2EDuration="3.540689795s" podCreationTimestamp="2026-03-14 07:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:51.540024167 +0000 UTC m=+1494.291939351" watchObservedRunningTime="2026-03-14 07:23:51.540689795 +0000 UTC m=+1494.292604979" Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.592393 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.778106 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/333065cb-8979-4f3a-bef7-aea9960faed5-run-httpd\") pod \"333065cb-8979-4f3a-bef7-aea9960faed5\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.778203 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-ceilometer-tls-certs\") pod \"333065cb-8979-4f3a-bef7-aea9960faed5\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.778230 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-scripts\") pod 
\"333065cb-8979-4f3a-bef7-aea9960faed5\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.778260 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/333065cb-8979-4f3a-bef7-aea9960faed5-log-httpd\") pod \"333065cb-8979-4f3a-bef7-aea9960faed5\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.778307 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-sg-core-conf-yaml\") pod \"333065cb-8979-4f3a-bef7-aea9960faed5\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.778331 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-combined-ca-bundle\") pod \"333065cb-8979-4f3a-bef7-aea9960faed5\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.778359 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-config-data\") pod \"333065cb-8979-4f3a-bef7-aea9960faed5\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.778415 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t9ml\" (UniqueName: \"kubernetes.io/projected/333065cb-8979-4f3a-bef7-aea9960faed5-kube-api-access-4t9ml\") pod \"333065cb-8979-4f3a-bef7-aea9960faed5\" (UID: \"333065cb-8979-4f3a-bef7-aea9960faed5\") " Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.779070 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/333065cb-8979-4f3a-bef7-aea9960faed5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "333065cb-8979-4f3a-bef7-aea9960faed5" (UID: "333065cb-8979-4f3a-bef7-aea9960faed5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.779212 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/333065cb-8979-4f3a-bef7-aea9960faed5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "333065cb-8979-4f3a-bef7-aea9960faed5" (UID: "333065cb-8979-4f3a-bef7-aea9960faed5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.779373 5129 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/333065cb-8979-4f3a-bef7-aea9960faed5-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.779393 5129 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/333065cb-8979-4f3a-bef7-aea9960faed5-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.797840 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/333065cb-8979-4f3a-bef7-aea9960faed5-kube-api-access-4t9ml" (OuterVolumeSpecName: "kube-api-access-4t9ml") pod "333065cb-8979-4f3a-bef7-aea9960faed5" (UID: "333065cb-8979-4f3a-bef7-aea9960faed5"). InnerVolumeSpecName "kube-api-access-4t9ml". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.797835 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-scripts" (OuterVolumeSpecName: "scripts") pod "333065cb-8979-4f3a-bef7-aea9960faed5" (UID: "333065cb-8979-4f3a-bef7-aea9960faed5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.837214 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "333065cb-8979-4f3a-bef7-aea9960faed5" (UID: "333065cb-8979-4f3a-bef7-aea9960faed5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.871582 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "333065cb-8979-4f3a-bef7-aea9960faed5" (UID: "333065cb-8979-4f3a-bef7-aea9960faed5"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.881151 5129 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.881188 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.881201 5129 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.881213 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t9ml\" (UniqueName: \"kubernetes.io/projected/333065cb-8979-4f3a-bef7-aea9960faed5-kube-api-access-4t9ml\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.890447 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "333065cb-8979-4f3a-bef7-aea9960faed5" (UID: "333065cb-8979-4f3a-bef7-aea9960faed5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.915671 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-config-data" (OuterVolumeSpecName: "config-data") pod "333065cb-8979-4f3a-bef7-aea9960faed5" (UID: "333065cb-8979-4f3a-bef7-aea9960faed5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.982669 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:51 crc kubenswrapper[5129]: I0314 07:23:51.982697 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333065cb-8979-4f3a-bef7-aea9960faed5-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.082981 5129 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod93ede4e3-f6c3-4d3b-becd-c5493390e15b"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod93ede4e3-f6c3-4d3b-becd-c5493390e15b] : Timed out while waiting for systemd to remove kubepods-besteffort-pod93ede4e3_f6c3_4d3b_becd_c5493390e15b.slice"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.519829 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"333065cb-8979-4f3a-bef7-aea9960faed5","Type":"ContainerDied","Data":"33f1e822a34650ee37a8acbfb438c1a0691773095ebb51f93bb943b57046dd2a"}
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.520144 5129 scope.go:117] "RemoveContainer" containerID="f2902aa5fe7fae1aa8da4bac45117a231c78db1e9fdc0bbe4629366a2ddd3ffd"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.519894 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.521790 5129 generic.go:334] "Generic (PLEG): container finished" podID="6f01515f-9326-4177-86be-a0c4d37d25c8" containerID="05a7f002c3841fb0a42f18259294bf61ebfb9c2780f11bfec0a0c98003b29c23" exitCode=143
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.522137 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6f01515f-9326-4177-86be-a0c4d37d25c8","Type":"ContainerDied","Data":"05a7f002c3841fb0a42f18259294bf61ebfb9c2780f11bfec0a0c98003b29c23"}
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.555994 5129 scope.go:117] "RemoveContainer" containerID="67ebd3e0900cda153a2d7c86c88cc56e049e6b848fa7a51dc283e67ec1570d63"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.556804 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.576423 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.588930 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 14 07:23:52 crc kubenswrapper[5129]: E0314 07:23:52.589288 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="333065cb-8979-4f3a-bef7-aea9960faed5" containerName="ceilometer-central-agent"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.589305 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="333065cb-8979-4f3a-bef7-aea9960faed5" containerName="ceilometer-central-agent"
Mar 14 07:23:52 crc kubenswrapper[5129]: E0314 07:23:52.589318 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="333065cb-8979-4f3a-bef7-aea9960faed5" containerName="sg-core"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.589324 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="333065cb-8979-4f3a-bef7-aea9960faed5" containerName="sg-core"
Mar 14 07:23:52 crc kubenswrapper[5129]: E0314 07:23:52.589350 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="333065cb-8979-4f3a-bef7-aea9960faed5" containerName="ceilometer-notification-agent"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.589358 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="333065cb-8979-4f3a-bef7-aea9960faed5" containerName="ceilometer-notification-agent"
Mar 14 07:23:52 crc kubenswrapper[5129]: E0314 07:23:52.589366 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="333065cb-8979-4f3a-bef7-aea9960faed5" containerName="proxy-httpd"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.589371 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="333065cb-8979-4f3a-bef7-aea9960faed5" containerName="proxy-httpd"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.589556 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="333065cb-8979-4f3a-bef7-aea9960faed5" containerName="ceilometer-notification-agent"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.589575 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="333065cb-8979-4f3a-bef7-aea9960faed5" containerName="ceilometer-central-agent"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.589583 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="333065cb-8979-4f3a-bef7-aea9960faed5" containerName="sg-core"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.589594 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="333065cb-8979-4f3a-bef7-aea9960faed5" containerName="proxy-httpd"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.591221 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.594820 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.594999 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.601015 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.613742 5129 scope.go:117] "RemoveContainer" containerID="d43d62fd05411abc521ab71d3e3bfd276a3089d7c7b33fda842f74fa6dfec765"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.629518 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.654446 5129 scope.go:117] "RemoveContainer" containerID="702852aa97619ff17b9aca405fa06fca1cdf0c0453611a14dd01bcea9b81beea"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.669900 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 07:23:52 crc kubenswrapper[5129]: E0314 07:23:52.670690 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-4m7df log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="005b8fff-7ecc-4a8c-a036-1c5909a67ce9"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.694446 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-run-httpd\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.694522 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.694659 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-scripts\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.694737 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-log-httpd\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.694815 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-config-data\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.694856 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.694884 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.694906 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m7df\" (UniqueName: \"kubernetes.io/projected/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-kube-api-access-4m7df\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.796994 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-log-httpd\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.797749 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-log-httpd\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.799945 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-config-data\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.800134 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.800309 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.800453 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m7df\" (UniqueName: \"kubernetes.io/projected/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-kube-api-access-4m7df\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.800756 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-run-httpd\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.800961 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.801145 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-scripts\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.801054 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-run-httpd\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.805059 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.805381 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-config-data\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.806064 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.809213 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-scripts\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.821970 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.822977 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m7df\" (UniqueName: \"kubernetes.io/projected/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-kube-api-access-4m7df\") pod \"ceilometer-0\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") " pod="openstack/ceilometer-0"
Mar 14 07:23:52 crc kubenswrapper[5129]: I0314 07:23:52.870964 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.538400 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.557535 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.719386 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-config-data\") pod \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") "
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.719496 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-sg-core-conf-yaml\") pod \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") "
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.719560 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-log-httpd\") pod \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") "
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.719740 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-combined-ca-bundle\") pod \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") "
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.719832 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m7df\" (UniqueName: \"kubernetes.io/projected/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-kube-api-access-4m7df\") pod \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") "
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.719884 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-scripts\") pod \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") "
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.720018 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-ceilometer-tls-certs\") pod \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") "
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.720103 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-run-httpd\") pod \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\" (UID: \"005b8fff-7ecc-4a8c-a036-1c5909a67ce9\") "
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.720171 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "005b8fff-7ecc-4a8c-a036-1c5909a67ce9" (UID: "005b8fff-7ecc-4a8c-a036-1c5909a67ce9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.720526 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "005b8fff-7ecc-4a8c-a036-1c5909a67ce9" (UID: "005b8fff-7ecc-4a8c-a036-1c5909a67ce9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.720724 5129 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.720753 5129 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.726446 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-scripts" (OuterVolumeSpecName: "scripts") pod "005b8fff-7ecc-4a8c-a036-1c5909a67ce9" (UID: "005b8fff-7ecc-4a8c-a036-1c5909a67ce9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.726589 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-config-data" (OuterVolumeSpecName: "config-data") pod "005b8fff-7ecc-4a8c-a036-1c5909a67ce9" (UID: "005b8fff-7ecc-4a8c-a036-1c5909a67ce9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.726813 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "005b8fff-7ecc-4a8c-a036-1c5909a67ce9" (UID: "005b8fff-7ecc-4a8c-a036-1c5909a67ce9"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.728732 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-kube-api-access-4m7df" (OuterVolumeSpecName: "kube-api-access-4m7df") pod "005b8fff-7ecc-4a8c-a036-1c5909a67ce9" (UID: "005b8fff-7ecc-4a8c-a036-1c5909a67ce9"). InnerVolumeSpecName "kube-api-access-4m7df". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.729242 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "005b8fff-7ecc-4a8c-a036-1c5909a67ce9" (UID: "005b8fff-7ecc-4a8c-a036-1c5909a67ce9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.748993 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "005b8fff-7ecc-4a8c-a036-1c5909a67ce9" (UID: "005b8fff-7ecc-4a8c-a036-1c5909a67ce9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.823908 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.823980 5129 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.823996 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.824010 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m7df\" (UniqueName: \"kubernetes.io/projected/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-kube-api-access-4m7df\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.824023 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:53 crc kubenswrapper[5129]: I0314 07:23:53.824035 5129 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/005b8fff-7ecc-4a8c-a036-1c5909a67ce9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.054858 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="333065cb-8979-4f3a-bef7-aea9960faed5" path="/var/lib/kubelet/pods/333065cb-8979-4f3a-bef7-aea9960faed5/volumes"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.547035 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.615385 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.628238 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.639206 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.641894 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.646709 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.646719 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.647282 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.649650 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.742403 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-config-data\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.742499 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.742576 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-scripts\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.742802 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvd74\" (UniqueName: \"kubernetes.io/projected/b169049b-3ab6-4871-ae92-0876f27347e6-kube-api-access-pvd74\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.742904 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b169049b-3ab6-4871-ae92-0876f27347e6-run-httpd\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.743135 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b169049b-3ab6-4871-ae92-0876f27347e6-log-httpd\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.743324 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.743447 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.845413 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b169049b-3ab6-4871-ae92-0876f27347e6-log-httpd\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.846004 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.846065 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.846113 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-config-data\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.846149 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.846186 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-scripts\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.846265 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvd74\" (UniqueName: \"kubernetes.io/projected/b169049b-3ab6-4871-ae92-0876f27347e6-kube-api-access-pvd74\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.846324 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b169049b-3ab6-4871-ae92-0876f27347e6-run-httpd\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.847052 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b169049b-3ab6-4871-ae92-0876f27347e6-run-httpd\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.847060 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b169049b-3ab6-4871-ae92-0876f27347e6-log-httpd\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.852438 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.852812 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-scripts\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.853228 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.858165 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.859530 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-config-data\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " pod="openstack/ceilometer-0"
Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.868358 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvd74\" (UniqueName: \"kubernetes.io/projected/b169049b-3ab6-4871-ae92-0876f27347e6-kube-api-access-pvd74\") pod \"ceilometer-0\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\")
" pod="openstack/ceilometer-0" Mar 14 07:23:54 crc kubenswrapper[5129]: I0314 07:23:54.971106 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.161745 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.255608 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjtzj\" (UniqueName: \"kubernetes.io/projected/6f01515f-9326-4177-86be-a0c4d37d25c8-kube-api-access-zjtzj\") pod \"6f01515f-9326-4177-86be-a0c4d37d25c8\" (UID: \"6f01515f-9326-4177-86be-a0c4d37d25c8\") " Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.255750 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f01515f-9326-4177-86be-a0c4d37d25c8-logs\") pod \"6f01515f-9326-4177-86be-a0c4d37d25c8\" (UID: \"6f01515f-9326-4177-86be-a0c4d37d25c8\") " Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.255823 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f01515f-9326-4177-86be-a0c4d37d25c8-config-data\") pod \"6f01515f-9326-4177-86be-a0c4d37d25c8\" (UID: \"6f01515f-9326-4177-86be-a0c4d37d25c8\") " Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.255860 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f01515f-9326-4177-86be-a0c4d37d25c8-combined-ca-bundle\") pod \"6f01515f-9326-4177-86be-a0c4d37d25c8\" (UID: \"6f01515f-9326-4177-86be-a0c4d37d25c8\") " Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.256265 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f01515f-9326-4177-86be-a0c4d37d25c8-logs" 
(OuterVolumeSpecName: "logs") pod "6f01515f-9326-4177-86be-a0c4d37d25c8" (UID: "6f01515f-9326-4177-86be-a0c4d37d25c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.258092 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f01515f-9326-4177-86be-a0c4d37d25c8-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.261092 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f01515f-9326-4177-86be-a0c4d37d25c8-kube-api-access-zjtzj" (OuterVolumeSpecName: "kube-api-access-zjtzj") pod "6f01515f-9326-4177-86be-a0c4d37d25c8" (UID: "6f01515f-9326-4177-86be-a0c4d37d25c8"). InnerVolumeSpecName "kube-api-access-zjtzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.292120 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f01515f-9326-4177-86be-a0c4d37d25c8-config-data" (OuterVolumeSpecName: "config-data") pod "6f01515f-9326-4177-86be-a0c4d37d25c8" (UID: "6f01515f-9326-4177-86be-a0c4d37d25c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.295760 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f01515f-9326-4177-86be-a0c4d37d25c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f01515f-9326-4177-86be-a0c4d37d25c8" (UID: "6f01515f-9326-4177-86be-a0c4d37d25c8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.360664 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f01515f-9326-4177-86be-a0c4d37d25c8-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.360704 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f01515f-9326-4177-86be-a0c4d37d25c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.360717 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjtzj\" (UniqueName: \"kubernetes.io/projected/6f01515f-9326-4177-86be-a0c4d37d25c8-kube-api-access-zjtzj\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.477837 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:23:55 crc kubenswrapper[5129]: W0314 07:23:55.478290 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb169049b_3ab6_4871_ae92_0876f27347e6.slice/crio-04e6ba33b3a7d776ccd181f000fa1c90456cfbbf8b68856377b5cc35847eb22a WatchSource:0}: Error finding container 04e6ba33b3a7d776ccd181f000fa1c90456cfbbf8b68856377b5cc35847eb22a: Status 404 returned error can't find the container with id 04e6ba33b3a7d776ccd181f000fa1c90456cfbbf8b68856377b5cc35847eb22a Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.558163 5129 generic.go:334] "Generic (PLEG): container finished" podID="6f01515f-9326-4177-86be-a0c4d37d25c8" containerID="9cb11ab7bd60eef674003221c4588ad35b5af15ece838388a845616ff46df3c4" exitCode=0 Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.558205 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"6f01515f-9326-4177-86be-a0c4d37d25c8","Type":"ContainerDied","Data":"9cb11ab7bd60eef674003221c4588ad35b5af15ece838388a845616ff46df3c4"} Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.558262 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6f01515f-9326-4177-86be-a0c4d37d25c8","Type":"ContainerDied","Data":"ebbd8ec2bfed37ad4514964c9e175d8fbfc81364b09dd503f3e5a9a98e5da045"} Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.558289 5129 scope.go:117] "RemoveContainer" containerID="9cb11ab7bd60eef674003221c4588ad35b5af15ece838388a845616ff46df3c4" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.558220 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.559991 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b169049b-3ab6-4871-ae92-0876f27347e6","Type":"ContainerStarted","Data":"04e6ba33b3a7d776ccd181f000fa1c90456cfbbf8b68856377b5cc35847eb22a"} Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.587797 5129 scope.go:117] "RemoveContainer" containerID="05a7f002c3841fb0a42f18259294bf61ebfb9c2780f11bfec0a0c98003b29c23" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.590780 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.600741 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.612152 5129 scope.go:117] "RemoveContainer" containerID="9cb11ab7bd60eef674003221c4588ad35b5af15ece838388a845616ff46df3c4" Mar 14 07:23:55 crc kubenswrapper[5129]: E0314 07:23:55.612970 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9cb11ab7bd60eef674003221c4588ad35b5af15ece838388a845616ff46df3c4\": container with ID starting with 9cb11ab7bd60eef674003221c4588ad35b5af15ece838388a845616ff46df3c4 not found: ID does not exist" containerID="9cb11ab7bd60eef674003221c4588ad35b5af15ece838388a845616ff46df3c4" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.613024 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cb11ab7bd60eef674003221c4588ad35b5af15ece838388a845616ff46df3c4"} err="failed to get container status \"9cb11ab7bd60eef674003221c4588ad35b5af15ece838388a845616ff46df3c4\": rpc error: code = NotFound desc = could not find container \"9cb11ab7bd60eef674003221c4588ad35b5af15ece838388a845616ff46df3c4\": container with ID starting with 9cb11ab7bd60eef674003221c4588ad35b5af15ece838388a845616ff46df3c4 not found: ID does not exist" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.613052 5129 scope.go:117] "RemoveContainer" containerID="05a7f002c3841fb0a42f18259294bf61ebfb9c2780f11bfec0a0c98003b29c23" Mar 14 07:23:55 crc kubenswrapper[5129]: E0314 07:23:55.613337 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05a7f002c3841fb0a42f18259294bf61ebfb9c2780f11bfec0a0c98003b29c23\": container with ID starting with 05a7f002c3841fb0a42f18259294bf61ebfb9c2780f11bfec0a0c98003b29c23 not found: ID does not exist" containerID="05a7f002c3841fb0a42f18259294bf61ebfb9c2780f11bfec0a0c98003b29c23" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.613363 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05a7f002c3841fb0a42f18259294bf61ebfb9c2780f11bfec0a0c98003b29c23"} err="failed to get container status \"05a7f002c3841fb0a42f18259294bf61ebfb9c2780f11bfec0a0c98003b29c23\": rpc error: code = NotFound desc = could not find container \"05a7f002c3841fb0a42f18259294bf61ebfb9c2780f11bfec0a0c98003b29c23\": container with ID 
starting with 05a7f002c3841fb0a42f18259294bf61ebfb9c2780f11bfec0a0c98003b29c23 not found: ID does not exist" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.624821 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:55 crc kubenswrapper[5129]: E0314 07:23:55.625548 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f01515f-9326-4177-86be-a0c4d37d25c8" containerName="nova-api-api" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.625587 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f01515f-9326-4177-86be-a0c4d37d25c8" containerName="nova-api-api" Mar 14 07:23:55 crc kubenswrapper[5129]: E0314 07:23:55.625652 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f01515f-9326-4177-86be-a0c4d37d25c8" containerName="nova-api-log" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.625667 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f01515f-9326-4177-86be-a0c4d37d25c8" containerName="nova-api-log" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.626031 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f01515f-9326-4177-86be-a0c4d37d25c8" containerName="nova-api-api" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.626061 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f01515f-9326-4177-86be-a0c4d37d25c8" containerName="nova-api-log" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.629209 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.636012 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.636431 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.642075 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.642507 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.766929 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-config-data\") pod \"nova-api-0\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.766980 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-public-tls-certs\") pod \"nova-api-0\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.767057 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f98da9-4046-4200-bc8c-751a841e68b9-logs\") pod \"nova-api-0\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.767236 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.767429 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc77w\" (UniqueName: \"kubernetes.io/projected/99f98da9-4046-4200-bc8c-751a841e68b9-kube-api-access-nc77w\") pod \"nova-api-0\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.767663 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.869239 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-config-data\") pod \"nova-api-0\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.869283 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-public-tls-certs\") pod \"nova-api-0\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.869346 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f98da9-4046-4200-bc8c-751a841e68b9-logs\") pod \"nova-api-0\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " pod="openstack/nova-api-0" Mar 14 07:23:55 crc 
kubenswrapper[5129]: I0314 07:23:55.869365 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.869394 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc77w\" (UniqueName: \"kubernetes.io/projected/99f98da9-4046-4200-bc8c-751a841e68b9-kube-api-access-nc77w\") pod \"nova-api-0\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.869434 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.870701 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f98da9-4046-4200-bc8c-751a841e68b9-logs\") pod \"nova-api-0\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.873217 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.873264 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-config-data\") pod \"nova-api-0\" 
(UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.874350 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.874776 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-public-tls-certs\") pod \"nova-api-0\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.898878 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc77w\" (UniqueName: \"kubernetes.io/projected/99f98da9-4046-4200-bc8c-751a841e68b9-kube-api-access-nc77w\") pod \"nova-api-0\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " pod="openstack/nova-api-0" Mar 14 07:23:55 crc kubenswrapper[5129]: I0314 07:23:55.949479 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:23:56 crc kubenswrapper[5129]: I0314 07:23:56.075539 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="005b8fff-7ecc-4a8c-a036-1c5909a67ce9" path="/var/lib/kubelet/pods/005b8fff-7ecc-4a8c-a036-1c5909a67ce9/volumes" Mar 14 07:23:56 crc kubenswrapper[5129]: I0314 07:23:56.076884 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f01515f-9326-4177-86be-a0c4d37d25c8" path="/var/lib/kubelet/pods/6f01515f-9326-4177-86be-a0c4d37d25c8/volumes" Mar 14 07:23:56 crc kubenswrapper[5129]: W0314 07:23:56.435673 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99f98da9_4046_4200_bc8c_751a841e68b9.slice/crio-098ae4cfab766f38d675ed99fd8cabd24b0bb295a8ccee638ad02c64a31b6d77 WatchSource:0}: Error finding container 098ae4cfab766f38d675ed99fd8cabd24b0bb295a8ccee638ad02c64a31b6d77: Status 404 returned error can't find the container with id 098ae4cfab766f38d675ed99fd8cabd24b0bb295a8ccee638ad02c64a31b6d77 Mar 14 07:23:56 crc kubenswrapper[5129]: I0314 07:23:56.438983 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:23:56 crc kubenswrapper[5129]: I0314 07:23:56.571323 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"99f98da9-4046-4200-bc8c-751a841e68b9","Type":"ContainerStarted","Data":"098ae4cfab766f38d675ed99fd8cabd24b0bb295a8ccee638ad02c64a31b6d77"} Mar 14 07:23:56 crc kubenswrapper[5129]: I0314 07:23:56.573922 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b169049b-3ab6-4871-ae92-0876f27347e6","Type":"ContainerStarted","Data":"bcd78cfb2a66f431657cd0c11bcbda01e51d38d25273756f75b8f2c7eaef968d"} Mar 14 07:23:57 crc kubenswrapper[5129]: I0314 07:23:57.590908 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b169049b-3ab6-4871-ae92-0876f27347e6","Type":"ContainerStarted","Data":"17fc0c37982bdc4c17582bebd5c88160b0617e58d221490152bd767bb2b74b79"} Mar 14 07:23:57 crc kubenswrapper[5129]: I0314 07:23:57.591339 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b169049b-3ab6-4871-ae92-0876f27347e6","Type":"ContainerStarted","Data":"1b3cbe2b568039b97be778d27023c536419b0464ab0fc336670f1b9fee17852a"} Mar 14 07:23:57 crc kubenswrapper[5129]: I0314 07:23:57.594161 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"99f98da9-4046-4200-bc8c-751a841e68b9","Type":"ContainerStarted","Data":"e9477cfbc71ae599463d4c4b0d447a1dd5f3c86e9cc78a502adfbeda1f9627fd"} Mar 14 07:23:57 crc kubenswrapper[5129]: I0314 07:23:57.594211 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"99f98da9-4046-4200-bc8c-751a841e68b9","Type":"ContainerStarted","Data":"1461ca6435203ff818e6d04644df8c180a0bc2e4993a51540b816d10d9077054"} Mar 14 07:23:57 crc kubenswrapper[5129]: I0314 07:23:57.628065 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.628040126 podStartE2EDuration="2.628040126s" podCreationTimestamp="2026-03-14 07:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:23:57.622472344 +0000 UTC m=+1500.374387538" watchObservedRunningTime="2026-03-14 07:23:57.628040126 +0000 UTC m=+1500.379955310" Mar 14 07:23:57 crc kubenswrapper[5129]: I0314 07:23:57.871173 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:57 crc kubenswrapper[5129]: I0314 07:23:57.901039 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:58 crc kubenswrapper[5129]: I0314 
07:23:58.644551 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:23:58 crc kubenswrapper[5129]: I0314 07:23:58.827031 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-msq48"] Mar 14 07:23:58 crc kubenswrapper[5129]: I0314 07:23:58.828436 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-msq48" Mar 14 07:23:58 crc kubenswrapper[5129]: I0314 07:23:58.831190 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 14 07:23:58 crc kubenswrapper[5129]: I0314 07:23:58.831276 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 14 07:23:58 crc kubenswrapper[5129]: I0314 07:23:58.834514 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-msq48"] Mar 14 07:23:58 crc kubenswrapper[5129]: I0314 07:23:58.935724 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa95bffb-d8b1-484f-9802-37d9793ef659-config-data\") pod \"nova-cell1-cell-mapping-msq48\" (UID: \"aa95bffb-d8b1-484f-9802-37d9793ef659\") " pod="openstack/nova-cell1-cell-mapping-msq48" Mar 14 07:23:58 crc kubenswrapper[5129]: I0314 07:23:58.935813 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q26s\" (UniqueName: \"kubernetes.io/projected/aa95bffb-d8b1-484f-9802-37d9793ef659-kube-api-access-9q26s\") pod \"nova-cell1-cell-mapping-msq48\" (UID: \"aa95bffb-d8b1-484f-9802-37d9793ef659\") " pod="openstack/nova-cell1-cell-mapping-msq48" Mar 14 07:23:58 crc kubenswrapper[5129]: I0314 07:23:58.935949 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/aa95bffb-d8b1-484f-9802-37d9793ef659-scripts\") pod \"nova-cell1-cell-mapping-msq48\" (UID: \"aa95bffb-d8b1-484f-9802-37d9793ef659\") " pod="openstack/nova-cell1-cell-mapping-msq48" Mar 14 07:23:58 crc kubenswrapper[5129]: I0314 07:23:58.935983 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa95bffb-d8b1-484f-9802-37d9793ef659-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-msq48\" (UID: \"aa95bffb-d8b1-484f-9802-37d9793ef659\") " pod="openstack/nova-cell1-cell-mapping-msq48" Mar 14 07:23:58 crc kubenswrapper[5129]: I0314 07:23:58.946498 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.017205 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-ttqvv"] Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.018588 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69b4446475-ttqvv" podUID="6622666d-efc0-49fe-84d3-d8b8113a2ee2" containerName="dnsmasq-dns" containerID="cri-o://f9ae1e749252c55621298f2cb13073bee66ef1ee70765ca5bf47b8190a1976d9" gracePeriod=10 Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.036913 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa95bffb-d8b1-484f-9802-37d9793ef659-config-data\") pod \"nova-cell1-cell-mapping-msq48\" (UID: \"aa95bffb-d8b1-484f-9802-37d9793ef659\") " pod="openstack/nova-cell1-cell-mapping-msq48" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.036991 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q26s\" (UniqueName: \"kubernetes.io/projected/aa95bffb-d8b1-484f-9802-37d9793ef659-kube-api-access-9q26s\") pod 
\"nova-cell1-cell-mapping-msq48\" (UID: \"aa95bffb-d8b1-484f-9802-37d9793ef659\") " pod="openstack/nova-cell1-cell-mapping-msq48" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.037080 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa95bffb-d8b1-484f-9802-37d9793ef659-scripts\") pod \"nova-cell1-cell-mapping-msq48\" (UID: \"aa95bffb-d8b1-484f-9802-37d9793ef659\") " pod="openstack/nova-cell1-cell-mapping-msq48" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.037109 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa95bffb-d8b1-484f-9802-37d9793ef659-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-msq48\" (UID: \"aa95bffb-d8b1-484f-9802-37d9793ef659\") " pod="openstack/nova-cell1-cell-mapping-msq48" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.043710 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa95bffb-d8b1-484f-9802-37d9793ef659-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-msq48\" (UID: \"aa95bffb-d8b1-484f-9802-37d9793ef659\") " pod="openstack/nova-cell1-cell-mapping-msq48" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.045911 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa95bffb-d8b1-484f-9802-37d9793ef659-config-data\") pod \"nova-cell1-cell-mapping-msq48\" (UID: \"aa95bffb-d8b1-484f-9802-37d9793ef659\") " pod="openstack/nova-cell1-cell-mapping-msq48" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.048033 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa95bffb-d8b1-484f-9802-37d9793ef659-scripts\") pod \"nova-cell1-cell-mapping-msq48\" (UID: \"aa95bffb-d8b1-484f-9802-37d9793ef659\") " 
pod="openstack/nova-cell1-cell-mapping-msq48" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.059849 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q26s\" (UniqueName: \"kubernetes.io/projected/aa95bffb-d8b1-484f-9802-37d9793ef659-kube-api-access-9q26s\") pod \"nova-cell1-cell-mapping-msq48\" (UID: \"aa95bffb-d8b1-484f-9802-37d9793ef659\") " pod="openstack/nova-cell1-cell-mapping-msq48" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.159403 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-msq48" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.508957 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.645298 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b169049b-3ab6-4871-ae92-0876f27347e6","Type":"ContainerStarted","Data":"7d094db7ac27362ce41fb20cfbbfbd0cb58d32ff76fd8de1e1c4e8741569d84b"} Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.646745 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.651748 5129 generic.go:334] "Generic (PLEG): container finished" podID="6622666d-efc0-49fe-84d3-d8b8113a2ee2" containerID="f9ae1e749252c55621298f2cb13073bee66ef1ee70765ca5bf47b8190a1976d9" exitCode=0 Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.651795 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-ttqvv" event={"ID":"6622666d-efc0-49fe-84d3-d8b8113a2ee2","Type":"ContainerDied","Data":"f9ae1e749252c55621298f2cb13073bee66ef1ee70765ca5bf47b8190a1976d9"} Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.651803 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-ttqvv" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.651976 5129 scope.go:117] "RemoveContainer" containerID="f9ae1e749252c55621298f2cb13073bee66ef1ee70765ca5bf47b8190a1976d9" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.652318 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-ttqvv" event={"ID":"6622666d-efc0-49fe-84d3-d8b8113a2ee2","Type":"ContainerDied","Data":"6f4c55f3734dd8d051cff530ce688022fffc996d0e721c664d0a18bc4764d82d"} Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.654145 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-ovsdbserver-sb\") pod \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.654216 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj7s2\" (UniqueName: \"kubernetes.io/projected/6622666d-efc0-49fe-84d3-d8b8113a2ee2-kube-api-access-bj7s2\") pod \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.654254 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-config\") pod \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.654337 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-ovsdbserver-nb\") pod \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " Mar 14 07:23:59 crc 
kubenswrapper[5129]: I0314 07:23:59.654374 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-dns-swift-storage-0\") pod \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.654407 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-dns-svc\") pod \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\" (UID: \"6622666d-efc0-49fe-84d3-d8b8113a2ee2\") " Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.667976 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6622666d-efc0-49fe-84d3-d8b8113a2ee2-kube-api-access-bj7s2" (OuterVolumeSpecName: "kube-api-access-bj7s2") pod "6622666d-efc0-49fe-84d3-d8b8113a2ee2" (UID: "6622666d-efc0-49fe-84d3-d8b8113a2ee2"). InnerVolumeSpecName "kube-api-access-bj7s2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:23:59 crc kubenswrapper[5129]: W0314 07:23:59.684924 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa95bffb_d8b1_484f_9802_37d9793ef659.slice/crio-6d12033587a934db72e0b613a31214764a52f75b13b9c2a65e5f565dc5dd6e7d WatchSource:0}: Error finding container 6d12033587a934db72e0b613a31214764a52f75b13b9c2a65e5f565dc5dd6e7d: Status 404 returned error can't find the container with id 6d12033587a934db72e0b613a31214764a52f75b13b9c2a65e5f565dc5dd6e7d Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.686236 5129 scope.go:117] "RemoveContainer" containerID="f198537c8754f43f44effac2466caac39bf8f9d449d09322cf8548c5d6827c9a" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.694918 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.324317395 podStartE2EDuration="5.694899547s" podCreationTimestamp="2026-03-14 07:23:54 +0000 UTC" firstStartedPulling="2026-03-14 07:23:55.481554743 +0000 UTC m=+1498.233469927" lastFinishedPulling="2026-03-14 07:23:58.852136895 +0000 UTC m=+1501.604052079" observedRunningTime="2026-03-14 07:23:59.670084579 +0000 UTC m=+1502.421999783" watchObservedRunningTime="2026-03-14 07:23:59.694899547 +0000 UTC m=+1502.446814741" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.695722 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-msq48"] Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.714948 5129 scope.go:117] "RemoveContainer" containerID="f9ae1e749252c55621298f2cb13073bee66ef1ee70765ca5bf47b8190a1976d9" Mar 14 07:23:59 crc kubenswrapper[5129]: E0314 07:23:59.715598 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9ae1e749252c55621298f2cb13073bee66ef1ee70765ca5bf47b8190a1976d9\": container 
with ID starting with f9ae1e749252c55621298f2cb13073bee66ef1ee70765ca5bf47b8190a1976d9 not found: ID does not exist" containerID="f9ae1e749252c55621298f2cb13073bee66ef1ee70765ca5bf47b8190a1976d9" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.715683 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9ae1e749252c55621298f2cb13073bee66ef1ee70765ca5bf47b8190a1976d9"} err="failed to get container status \"f9ae1e749252c55621298f2cb13073bee66ef1ee70765ca5bf47b8190a1976d9\": rpc error: code = NotFound desc = could not find container \"f9ae1e749252c55621298f2cb13073bee66ef1ee70765ca5bf47b8190a1976d9\": container with ID starting with f9ae1e749252c55621298f2cb13073bee66ef1ee70765ca5bf47b8190a1976d9 not found: ID does not exist" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.715713 5129 scope.go:117] "RemoveContainer" containerID="f198537c8754f43f44effac2466caac39bf8f9d449d09322cf8548c5d6827c9a" Mar 14 07:23:59 crc kubenswrapper[5129]: E0314 07:23:59.716440 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f198537c8754f43f44effac2466caac39bf8f9d449d09322cf8548c5d6827c9a\": container with ID starting with f198537c8754f43f44effac2466caac39bf8f9d449d09322cf8548c5d6827c9a not found: ID does not exist" containerID="f198537c8754f43f44effac2466caac39bf8f9d449d09322cf8548c5d6827c9a" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.716468 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f198537c8754f43f44effac2466caac39bf8f9d449d09322cf8548c5d6827c9a"} err="failed to get container status \"f198537c8754f43f44effac2466caac39bf8f9d449d09322cf8548c5d6827c9a\": rpc error: code = NotFound desc = could not find container \"f198537c8754f43f44effac2466caac39bf8f9d449d09322cf8548c5d6827c9a\": container with ID starting with f198537c8754f43f44effac2466caac39bf8f9d449d09322cf8548c5d6827c9a not 
found: ID does not exist" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.726571 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6622666d-efc0-49fe-84d3-d8b8113a2ee2" (UID: "6622666d-efc0-49fe-84d3-d8b8113a2ee2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.727116 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6622666d-efc0-49fe-84d3-d8b8113a2ee2" (UID: "6622666d-efc0-49fe-84d3-d8b8113a2ee2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.738727 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6622666d-efc0-49fe-84d3-d8b8113a2ee2" (UID: "6622666d-efc0-49fe-84d3-d8b8113a2ee2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.742172 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6622666d-efc0-49fe-84d3-d8b8113a2ee2" (UID: "6622666d-efc0-49fe-84d3-d8b8113a2ee2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.742552 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-config" (OuterVolumeSpecName: "config") pod "6622666d-efc0-49fe-84d3-d8b8113a2ee2" (UID: "6622666d-efc0-49fe-84d3-d8b8113a2ee2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.757002 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj7s2\" (UniqueName: \"kubernetes.io/projected/6622666d-efc0-49fe-84d3-d8b8113a2ee2-kube-api-access-bj7s2\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.757031 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.757041 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.757050 5129 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.757060 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.757068 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/6622666d-efc0-49fe-84d3-d8b8113a2ee2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.982779 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-ttqvv"] Mar 14 07:23:59 crc kubenswrapper[5129]: I0314 07:23:59.993374 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-ttqvv"] Mar 14 07:24:00 crc kubenswrapper[5129]: I0314 07:24:00.048070 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6622666d-efc0-49fe-84d3-d8b8113a2ee2" path="/var/lib/kubelet/pods/6622666d-efc0-49fe-84d3-d8b8113a2ee2/volumes" Mar 14 07:24:00 crc kubenswrapper[5129]: I0314 07:24:00.135825 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557884-67wtn"] Mar 14 07:24:00 crc kubenswrapper[5129]: E0314 07:24:00.136313 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6622666d-efc0-49fe-84d3-d8b8113a2ee2" containerName="init" Mar 14 07:24:00 crc kubenswrapper[5129]: I0314 07:24:00.136334 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6622666d-efc0-49fe-84d3-d8b8113a2ee2" containerName="init" Mar 14 07:24:00 crc kubenswrapper[5129]: E0314 07:24:00.136356 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6622666d-efc0-49fe-84d3-d8b8113a2ee2" containerName="dnsmasq-dns" Mar 14 07:24:00 crc kubenswrapper[5129]: I0314 07:24:00.136365 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6622666d-efc0-49fe-84d3-d8b8113a2ee2" containerName="dnsmasq-dns" Mar 14 07:24:00 crc kubenswrapper[5129]: I0314 07:24:00.136607 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6622666d-efc0-49fe-84d3-d8b8113a2ee2" containerName="dnsmasq-dns" Mar 14 07:24:00 crc kubenswrapper[5129]: I0314 07:24:00.137356 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557884-67wtn" Mar 14 07:24:00 crc kubenswrapper[5129]: I0314 07:24:00.139233 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:24:00 crc kubenswrapper[5129]: I0314 07:24:00.139237 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:24:00 crc kubenswrapper[5129]: I0314 07:24:00.140382 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:24:00 crc kubenswrapper[5129]: I0314 07:24:00.144884 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557884-67wtn"] Mar 14 07:24:00 crc kubenswrapper[5129]: I0314 07:24:00.269318 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfmcl\" (UniqueName: \"kubernetes.io/projected/57fff26f-ddfc-4ac7-a184-6453a398b6d2-kube-api-access-qfmcl\") pod \"auto-csr-approver-29557884-67wtn\" (UID: \"57fff26f-ddfc-4ac7-a184-6453a398b6d2\") " pod="openshift-infra/auto-csr-approver-29557884-67wtn" Mar 14 07:24:00 crc kubenswrapper[5129]: I0314 07:24:00.371499 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfmcl\" (UniqueName: \"kubernetes.io/projected/57fff26f-ddfc-4ac7-a184-6453a398b6d2-kube-api-access-qfmcl\") pod \"auto-csr-approver-29557884-67wtn\" (UID: \"57fff26f-ddfc-4ac7-a184-6453a398b6d2\") " pod="openshift-infra/auto-csr-approver-29557884-67wtn" Mar 14 07:24:00 crc kubenswrapper[5129]: I0314 07:24:00.389921 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfmcl\" (UniqueName: \"kubernetes.io/projected/57fff26f-ddfc-4ac7-a184-6453a398b6d2-kube-api-access-qfmcl\") pod \"auto-csr-approver-29557884-67wtn\" (UID: \"57fff26f-ddfc-4ac7-a184-6453a398b6d2\") " 
pod="openshift-infra/auto-csr-approver-29557884-67wtn" Mar 14 07:24:00 crc kubenswrapper[5129]: I0314 07:24:00.456980 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557884-67wtn" Mar 14 07:24:00 crc kubenswrapper[5129]: I0314 07:24:00.669213 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-msq48" event={"ID":"aa95bffb-d8b1-484f-9802-37d9793ef659","Type":"ContainerStarted","Data":"f34be114ca921e0d7a9c3d7e6dafcb312a64592a1b2a5df49c39e4250824fd0f"} Mar 14 07:24:00 crc kubenswrapper[5129]: I0314 07:24:00.669547 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-msq48" event={"ID":"aa95bffb-d8b1-484f-9802-37d9793ef659","Type":"ContainerStarted","Data":"6d12033587a934db72e0b613a31214764a52f75b13b9c2a65e5f565dc5dd6e7d"} Mar 14 07:24:00 crc kubenswrapper[5129]: I0314 07:24:00.689446 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-msq48" podStartSLOduration=2.68942526 podStartE2EDuration="2.68942526s" podCreationTimestamp="2026-03-14 07:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:24:00.682886642 +0000 UTC m=+1503.434801836" watchObservedRunningTime="2026-03-14 07:24:00.68942526 +0000 UTC m=+1503.441340454" Mar 14 07:24:00 crc kubenswrapper[5129]: W0314 07:24:00.922269 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57fff26f_ddfc_4ac7_a184_6453a398b6d2.slice/crio-ea44d009049a1a5882d177e4179911f187da44821efe3f365118714d9c45fb28 WatchSource:0}: Error finding container ea44d009049a1a5882d177e4179911f187da44821efe3f365118714d9c45fb28: Status 404 returned error can't find the container with id ea44d009049a1a5882d177e4179911f187da44821efe3f365118714d9c45fb28 Mar 14 
07:24:00 crc kubenswrapper[5129]: I0314 07:24:00.931215 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557884-67wtn"] Mar 14 07:24:01 crc kubenswrapper[5129]: I0314 07:24:01.699170 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557884-67wtn" event={"ID":"57fff26f-ddfc-4ac7-a184-6453a398b6d2","Type":"ContainerStarted","Data":"ea44d009049a1a5882d177e4179911f187da44821efe3f365118714d9c45fb28"} Mar 14 07:24:02 crc kubenswrapper[5129]: I0314 07:24:02.713870 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557884-67wtn" event={"ID":"57fff26f-ddfc-4ac7-a184-6453a398b6d2","Type":"ContainerDied","Data":"2656cf6975a47fcc3760e2d72b9120d77f55fb73efb37f6c295fa023565203e8"} Mar 14 07:24:02 crc kubenswrapper[5129]: I0314 07:24:02.713683 5129 generic.go:334] "Generic (PLEG): container finished" podID="57fff26f-ddfc-4ac7-a184-6453a398b6d2" containerID="2656cf6975a47fcc3760e2d72b9120d77f55fb73efb37f6c295fa023565203e8" exitCode=0 Mar 14 07:24:04 crc kubenswrapper[5129]: I0314 07:24:04.157757 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557884-67wtn" Mar 14 07:24:04 crc kubenswrapper[5129]: I0314 07:24:04.254651 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfmcl\" (UniqueName: \"kubernetes.io/projected/57fff26f-ddfc-4ac7-a184-6453a398b6d2-kube-api-access-qfmcl\") pod \"57fff26f-ddfc-4ac7-a184-6453a398b6d2\" (UID: \"57fff26f-ddfc-4ac7-a184-6453a398b6d2\") " Mar 14 07:24:04 crc kubenswrapper[5129]: I0314 07:24:04.260424 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57fff26f-ddfc-4ac7-a184-6453a398b6d2-kube-api-access-qfmcl" (OuterVolumeSpecName: "kube-api-access-qfmcl") pod "57fff26f-ddfc-4ac7-a184-6453a398b6d2" (UID: "57fff26f-ddfc-4ac7-a184-6453a398b6d2"). 
InnerVolumeSpecName "kube-api-access-qfmcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:04 crc kubenswrapper[5129]: I0314 07:24:04.356469 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfmcl\" (UniqueName: \"kubernetes.io/projected/57fff26f-ddfc-4ac7-a184-6453a398b6d2-kube-api-access-qfmcl\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:04 crc kubenswrapper[5129]: I0314 07:24:04.747544 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557884-67wtn" event={"ID":"57fff26f-ddfc-4ac7-a184-6453a398b6d2","Type":"ContainerDied","Data":"ea44d009049a1a5882d177e4179911f187da44821efe3f365118714d9c45fb28"} Mar 14 07:24:04 crc kubenswrapper[5129]: I0314 07:24:04.747635 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea44d009049a1a5882d177e4179911f187da44821efe3f365118714d9c45fb28" Mar 14 07:24:04 crc kubenswrapper[5129]: I0314 07:24:04.747713 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557884-67wtn" Mar 14 07:24:04 crc kubenswrapper[5129]: I0314 07:24:04.752838 5129 generic.go:334] "Generic (PLEG): container finished" podID="aa95bffb-d8b1-484f-9802-37d9793ef659" containerID="f34be114ca921e0d7a9c3d7e6dafcb312a64592a1b2a5df49c39e4250824fd0f" exitCode=0 Mar 14 07:24:04 crc kubenswrapper[5129]: I0314 07:24:04.752883 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-msq48" event={"ID":"aa95bffb-d8b1-484f-9802-37d9793ef659","Type":"ContainerDied","Data":"f34be114ca921e0d7a9c3d7e6dafcb312a64592a1b2a5df49c39e4250824fd0f"} Mar 14 07:24:05 crc kubenswrapper[5129]: I0314 07:24:05.284052 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557878-7lvhr"] Mar 14 07:24:05 crc kubenswrapper[5129]: I0314 07:24:05.293742 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557878-7lvhr"] Mar 14 07:24:05 crc kubenswrapper[5129]: I0314 07:24:05.950922 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 07:24:05 crc kubenswrapper[5129]: I0314 07:24:05.951013 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.048659 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed5d2dbd-b414-462c-9fb1-432593876d05" path="/var/lib/kubelet/pods/ed5d2dbd-b414-462c-9fb1-432593876d05/volumes" Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.163123 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-msq48" Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.205865 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa95bffb-d8b1-484f-9802-37d9793ef659-config-data\") pod \"aa95bffb-d8b1-484f-9802-37d9793ef659\" (UID: \"aa95bffb-d8b1-484f-9802-37d9793ef659\") " Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.206084 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa95bffb-d8b1-484f-9802-37d9793ef659-combined-ca-bundle\") pod \"aa95bffb-d8b1-484f-9802-37d9793ef659\" (UID: \"aa95bffb-d8b1-484f-9802-37d9793ef659\") " Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.206225 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q26s\" (UniqueName: \"kubernetes.io/projected/aa95bffb-d8b1-484f-9802-37d9793ef659-kube-api-access-9q26s\") pod \"aa95bffb-d8b1-484f-9802-37d9793ef659\" (UID: \"aa95bffb-d8b1-484f-9802-37d9793ef659\") " Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.206303 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa95bffb-d8b1-484f-9802-37d9793ef659-scripts\") pod \"aa95bffb-d8b1-484f-9802-37d9793ef659\" (UID: \"aa95bffb-d8b1-484f-9802-37d9793ef659\") " Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.211416 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa95bffb-d8b1-484f-9802-37d9793ef659-scripts" (OuterVolumeSpecName: "scripts") pod "aa95bffb-d8b1-484f-9802-37d9793ef659" (UID: "aa95bffb-d8b1-484f-9802-37d9793ef659"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.211788 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa95bffb-d8b1-484f-9802-37d9793ef659-kube-api-access-9q26s" (OuterVolumeSpecName: "kube-api-access-9q26s") pod "aa95bffb-d8b1-484f-9802-37d9793ef659" (UID: "aa95bffb-d8b1-484f-9802-37d9793ef659"). InnerVolumeSpecName "kube-api-access-9q26s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.231956 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa95bffb-d8b1-484f-9802-37d9793ef659-config-data" (OuterVolumeSpecName: "config-data") pod "aa95bffb-d8b1-484f-9802-37d9793ef659" (UID: "aa95bffb-d8b1-484f-9802-37d9793ef659"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.235867 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa95bffb-d8b1-484f-9802-37d9793ef659-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa95bffb-d8b1-484f-9802-37d9793ef659" (UID: "aa95bffb-d8b1-484f-9802-37d9793ef659"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.308675 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q26s\" (UniqueName: \"kubernetes.io/projected/aa95bffb-d8b1-484f-9802-37d9793ef659-kube-api-access-9q26s\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.308705 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa95bffb-d8b1-484f-9802-37d9793ef659-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.308715 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa95bffb-d8b1-484f-9802-37d9793ef659-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.308723 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa95bffb-d8b1-484f-9802-37d9793ef659-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.773169 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-msq48" event={"ID":"aa95bffb-d8b1-484f-9802-37d9793ef659","Type":"ContainerDied","Data":"6d12033587a934db72e0b613a31214764a52f75b13b9c2a65e5f565dc5dd6e7d"} Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.773424 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d12033587a934db72e0b613a31214764a52f75b13b9c2a65e5f565dc5dd6e7d" Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.773435 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-msq48" Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.970904 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="99f98da9-4046-4200-bc8c-751a841e68b9" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 07:24:06 crc kubenswrapper[5129]: I0314 07:24:06.970917 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="99f98da9-4046-4200-bc8c-751a841e68b9" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 07:24:07 crc kubenswrapper[5129]: I0314 07:24:07.084770 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:24:07 crc kubenswrapper[5129]: I0314 07:24:07.085130 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="99f98da9-4046-4200-bc8c-751a841e68b9" containerName="nova-api-log" containerID="cri-o://e9477cfbc71ae599463d4c4b0d447a1dd5f3c86e9cc78a502adfbeda1f9627fd" gracePeriod=30 Mar 14 07:24:07 crc kubenswrapper[5129]: I0314 07:24:07.085245 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="99f98da9-4046-4200-bc8c-751a841e68b9" containerName="nova-api-api" containerID="cri-o://1461ca6435203ff818e6d04644df8c180a0bc2e4993a51540b816d10d9077054" gracePeriod=30 Mar 14 07:24:07 crc kubenswrapper[5129]: I0314 07:24:07.109506 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:24:07 crc kubenswrapper[5129]: I0314 07:24:07.109863 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e1522514-570d-4f82-81f3-15db416cca79" 
containerName="nova-scheduler-scheduler" containerID="cri-o://6a4ca1f337a22366ba5f7fd6ecbb3a61d4421deb0ae3d1dc4bd69ea0c8da7314" gracePeriod=30 Mar 14 07:24:07 crc kubenswrapper[5129]: I0314 07:24:07.166898 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:24:07 crc kubenswrapper[5129]: I0314 07:24:07.167240 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe" containerName="nova-metadata-log" containerID="cri-o://c403a0387349ace2fd626e02416fda92537e0233953e71e39549f7d2643de6a3" gracePeriod=30 Mar 14 07:24:07 crc kubenswrapper[5129]: I0314 07:24:07.167861 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe" containerName="nova-metadata-metadata" containerID="cri-o://bc868fcdb0b52e9bafc2d322391d1ed044b20ed25023bb0f71afe35f96d08fe9" gracePeriod=30 Mar 14 07:24:07 crc kubenswrapper[5129]: E0314 07:24:07.605844 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6a4ca1f337a22366ba5f7fd6ecbb3a61d4421deb0ae3d1dc4bd69ea0c8da7314" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 07:24:07 crc kubenswrapper[5129]: E0314 07:24:07.608338 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6a4ca1f337a22366ba5f7fd6ecbb3a61d4421deb0ae3d1dc4bd69ea0c8da7314" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 07:24:07 crc kubenswrapper[5129]: E0314 07:24:07.610033 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="6a4ca1f337a22366ba5f7fd6ecbb3a61d4421deb0ae3d1dc4bd69ea0c8da7314" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 07:24:07 crc kubenswrapper[5129]: E0314 07:24:07.610073 5129 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="e1522514-570d-4f82-81f3-15db416cca79" containerName="nova-scheduler-scheduler" Mar 14 07:24:07 crc kubenswrapper[5129]: I0314 07:24:07.789696 5129 generic.go:334] "Generic (PLEG): container finished" podID="99f98da9-4046-4200-bc8c-751a841e68b9" containerID="e9477cfbc71ae599463d4c4b0d447a1dd5f3c86e9cc78a502adfbeda1f9627fd" exitCode=143 Mar 14 07:24:07 crc kubenswrapper[5129]: I0314 07:24:07.789805 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"99f98da9-4046-4200-bc8c-751a841e68b9","Type":"ContainerDied","Data":"e9477cfbc71ae599463d4c4b0d447a1dd5f3c86e9cc78a502adfbeda1f9627fd"} Mar 14 07:24:07 crc kubenswrapper[5129]: I0314 07:24:07.792864 5129 generic.go:334] "Generic (PLEG): container finished" podID="f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe" containerID="c403a0387349ace2fd626e02416fda92537e0233953e71e39549f7d2643de6a3" exitCode=143 Mar 14 07:24:07 crc kubenswrapper[5129]: I0314 07:24:07.792919 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe","Type":"ContainerDied","Data":"c403a0387349ace2fd626e02416fda92537e0233953e71e39549f7d2643de6a3"} Mar 14 07:24:10 crc kubenswrapper[5129]: E0314 07:24:10.524461 5129 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3ba50bc_2f3a_4d00_bad8_639b7ad63ebe.slice/crio-bc868fcdb0b52e9bafc2d322391d1ed044b20ed25023bb0f71afe35f96d08fe9.scope\": RecentStats: unable to find data in memory cache]" Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.817348 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.826959 5129 generic.go:334] "Generic (PLEG): container finished" podID="f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe" containerID="bc868fcdb0b52e9bafc2d322391d1ed044b20ed25023bb0f71afe35f96d08fe9" exitCode=0 Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.827004 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe","Type":"ContainerDied","Data":"bc868fcdb0b52e9bafc2d322391d1ed044b20ed25023bb0f71afe35f96d08fe9"} Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.827041 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe","Type":"ContainerDied","Data":"bca274f8535dd4f7decf1cac753bc98f2ff2b0127f475f113b8af2eafad77788"} Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.827065 5129 scope.go:117] "RemoveContainer" containerID="bc868fcdb0b52e9bafc2d322391d1ed044b20ed25023bb0f71afe35f96d08fe9" Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.827090 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.882169 5129 scope.go:117] "RemoveContainer" containerID="c403a0387349ace2fd626e02416fda92537e0233953e71e39549f7d2643de6a3" Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.901029 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-logs\") pod \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.901117 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-combined-ca-bundle\") pod \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.901305 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-config-data\") pod \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.901394 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-nova-metadata-tls-certs\") pod \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\" (UID: \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.901455 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr5pn\" (UniqueName: \"kubernetes.io/projected/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-kube-api-access-zr5pn\") pod \"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\" (UID: 
\"f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe\") " Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.902549 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-logs" (OuterVolumeSpecName: "logs") pod "f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe" (UID: "f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.917143 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-kube-api-access-zr5pn" (OuterVolumeSpecName: "kube-api-access-zr5pn") pod "f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe" (UID: "f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe"). InnerVolumeSpecName "kube-api-access-zr5pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.918064 5129 scope.go:117] "RemoveContainer" containerID="bc868fcdb0b52e9bafc2d322391d1ed044b20ed25023bb0f71afe35f96d08fe9" Mar 14 07:24:10 crc kubenswrapper[5129]: E0314 07:24:10.920041 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc868fcdb0b52e9bafc2d322391d1ed044b20ed25023bb0f71afe35f96d08fe9\": container with ID starting with bc868fcdb0b52e9bafc2d322391d1ed044b20ed25023bb0f71afe35f96d08fe9 not found: ID does not exist" containerID="bc868fcdb0b52e9bafc2d322391d1ed044b20ed25023bb0f71afe35f96d08fe9" Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.920100 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc868fcdb0b52e9bafc2d322391d1ed044b20ed25023bb0f71afe35f96d08fe9"} err="failed to get container status \"bc868fcdb0b52e9bafc2d322391d1ed044b20ed25023bb0f71afe35f96d08fe9\": rpc error: code = NotFound desc = could not find container 
\"bc868fcdb0b52e9bafc2d322391d1ed044b20ed25023bb0f71afe35f96d08fe9\": container with ID starting with bc868fcdb0b52e9bafc2d322391d1ed044b20ed25023bb0f71afe35f96d08fe9 not found: ID does not exist" Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.920130 5129 scope.go:117] "RemoveContainer" containerID="c403a0387349ace2fd626e02416fda92537e0233953e71e39549f7d2643de6a3" Mar 14 07:24:10 crc kubenswrapper[5129]: E0314 07:24:10.920578 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c403a0387349ace2fd626e02416fda92537e0233953e71e39549f7d2643de6a3\": container with ID starting with c403a0387349ace2fd626e02416fda92537e0233953e71e39549f7d2643de6a3 not found: ID does not exist" containerID="c403a0387349ace2fd626e02416fda92537e0233953e71e39549f7d2643de6a3" Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.920641 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c403a0387349ace2fd626e02416fda92537e0233953e71e39549f7d2643de6a3"} err="failed to get container status \"c403a0387349ace2fd626e02416fda92537e0233953e71e39549f7d2643de6a3\": rpc error: code = NotFound desc = could not find container \"c403a0387349ace2fd626e02416fda92537e0233953e71e39549f7d2643de6a3\": container with ID starting with c403a0387349ace2fd626e02416fda92537e0233953e71e39549f7d2643de6a3 not found: ID does not exist" Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.941875 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-config-data" (OuterVolumeSpecName: "config-data") pod "f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe" (UID: "f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.946174 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe" (UID: "f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:10 crc kubenswrapper[5129]: I0314 07:24:10.994398 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe" (UID: "f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.003434 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.003463 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.003477 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.003489 5129 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-nova-metadata-tls-certs\") on 
node \"crc\" DevicePath \"\"" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.003502 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr5pn\" (UniqueName: \"kubernetes.io/projected/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe-kube-api-access-zr5pn\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.192395 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.201360 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.215888 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:24:11 crc kubenswrapper[5129]: E0314 07:24:11.216543 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe" containerName="nova-metadata-log" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.216640 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe" containerName="nova-metadata-log" Mar 14 07:24:11 crc kubenswrapper[5129]: E0314 07:24:11.216722 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa95bffb-d8b1-484f-9802-37d9793ef659" containerName="nova-manage" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.216784 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa95bffb-d8b1-484f-9802-37d9793ef659" containerName="nova-manage" Mar 14 07:24:11 crc kubenswrapper[5129]: E0314 07:24:11.216849 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57fff26f-ddfc-4ac7-a184-6453a398b6d2" containerName="oc" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.217070 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fff26f-ddfc-4ac7-a184-6453a398b6d2" containerName="oc" Mar 14 07:24:11 crc kubenswrapper[5129]: E0314 07:24:11.217151 5129 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe" containerName="nova-metadata-metadata" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.217218 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe" containerName="nova-metadata-metadata" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.217480 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe" containerName="nova-metadata-log" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.217555 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa95bffb-d8b1-484f-9802-37d9793ef659" containerName="nova-manage" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.217640 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="57fff26f-ddfc-4ac7-a184-6453a398b6d2" containerName="oc" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.217729 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe" containerName="nova-metadata-metadata" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.218942 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.225689 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.226239 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.239695 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.312752 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12ef33b-b86d-4c80-8f19-385ff5a93fee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " pod="openstack/nova-metadata-0" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.312803 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkwjd\" (UniqueName: \"kubernetes.io/projected/f12ef33b-b86d-4c80-8f19-385ff5a93fee-kube-api-access-xkwjd\") pod \"nova-metadata-0\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " pod="openstack/nova-metadata-0" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.312965 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f12ef33b-b86d-4c80-8f19-385ff5a93fee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " pod="openstack/nova-metadata-0" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.313034 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f12ef33b-b86d-4c80-8f19-385ff5a93fee-config-data\") pod \"nova-metadata-0\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " pod="openstack/nova-metadata-0" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.313101 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f12ef33b-b86d-4c80-8f19-385ff5a93fee-logs\") pod \"nova-metadata-0\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " pod="openstack/nova-metadata-0" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.414706 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12ef33b-b86d-4c80-8f19-385ff5a93fee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " pod="openstack/nova-metadata-0" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.414768 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkwjd\" (UniqueName: \"kubernetes.io/projected/f12ef33b-b86d-4c80-8f19-385ff5a93fee-kube-api-access-xkwjd\") pod \"nova-metadata-0\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " pod="openstack/nova-metadata-0" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.414880 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f12ef33b-b86d-4c80-8f19-385ff5a93fee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " pod="openstack/nova-metadata-0" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.414941 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f12ef33b-b86d-4c80-8f19-385ff5a93fee-config-data\") pod \"nova-metadata-0\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " 
pod="openstack/nova-metadata-0" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.414996 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f12ef33b-b86d-4c80-8f19-385ff5a93fee-logs\") pod \"nova-metadata-0\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " pod="openstack/nova-metadata-0" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.415490 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f12ef33b-b86d-4c80-8f19-385ff5a93fee-logs\") pod \"nova-metadata-0\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " pod="openstack/nova-metadata-0" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.422573 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f12ef33b-b86d-4c80-8f19-385ff5a93fee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " pod="openstack/nova-metadata-0" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.426102 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f12ef33b-b86d-4c80-8f19-385ff5a93fee-config-data\") pod \"nova-metadata-0\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " pod="openstack/nova-metadata-0" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.427075 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12ef33b-b86d-4c80-8f19-385ff5a93fee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " pod="openstack/nova-metadata-0" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.467251 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkwjd\" (UniqueName: 
\"kubernetes.io/projected/f12ef33b-b86d-4c80-8f19-385ff5a93fee-kube-api-access-xkwjd\") pod \"nova-metadata-0\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " pod="openstack/nova-metadata-0" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.539854 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.842505 5129 generic.go:334] "Generic (PLEG): container finished" podID="e1522514-570d-4f82-81f3-15db416cca79" containerID="6a4ca1f337a22366ba5f7fd6ecbb3a61d4421deb0ae3d1dc4bd69ea0c8da7314" exitCode=0 Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.842941 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e1522514-570d-4f82-81f3-15db416cca79","Type":"ContainerDied","Data":"6a4ca1f337a22366ba5f7fd6ecbb3a61d4421deb0ae3d1dc4bd69ea0c8da7314"} Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.845320 5129 generic.go:334] "Generic (PLEG): container finished" podID="99f98da9-4046-4200-bc8c-751a841e68b9" containerID="1461ca6435203ff818e6d04644df8c180a0bc2e4993a51540b816d10d9077054" exitCode=0 Mar 14 07:24:11 crc kubenswrapper[5129]: I0314 07:24:11.845370 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"99f98da9-4046-4200-bc8c-751a841e68b9","Type":"ContainerDied","Data":"1461ca6435203ff818e6d04644df8c180a0bc2e4993a51540b816d10d9077054"} Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.032633 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.048877 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe" path="/var/lib/kubelet/pods/f3ba50bc-2f3a-4d00-bad8-639b7ad63ebe/volumes" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.049273 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.090525 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.127385 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1522514-570d-4f82-81f3-15db416cca79-config-data\") pod \"e1522514-570d-4f82-81f3-15db416cca79\" (UID: \"e1522514-570d-4f82-81f3-15db416cca79\") " Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.127496 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-public-tls-certs\") pod \"99f98da9-4046-4200-bc8c-751a841e68b9\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.127522 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f98da9-4046-4200-bc8c-751a841e68b9-logs\") pod \"99f98da9-4046-4200-bc8c-751a841e68b9\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.127582 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1522514-570d-4f82-81f3-15db416cca79-combined-ca-bundle\") pod \"e1522514-570d-4f82-81f3-15db416cca79\" (UID: \"e1522514-570d-4f82-81f3-15db416cca79\") " Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.127647 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-internal-tls-certs\") pod \"99f98da9-4046-4200-bc8c-751a841e68b9\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " Mar 14 07:24:12 
crc kubenswrapper[5129]: I0314 07:24:12.127667 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-combined-ca-bundle\") pod \"99f98da9-4046-4200-bc8c-751a841e68b9\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.127743 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-config-data\") pod \"99f98da9-4046-4200-bc8c-751a841e68b9\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.127800 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdxp6\" (UniqueName: \"kubernetes.io/projected/e1522514-570d-4f82-81f3-15db416cca79-kube-api-access-jdxp6\") pod \"e1522514-570d-4f82-81f3-15db416cca79\" (UID: \"e1522514-570d-4f82-81f3-15db416cca79\") " Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.128137 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc77w\" (UniqueName: \"kubernetes.io/projected/99f98da9-4046-4200-bc8c-751a841e68b9-kube-api-access-nc77w\") pod \"99f98da9-4046-4200-bc8c-751a841e68b9\" (UID: \"99f98da9-4046-4200-bc8c-751a841e68b9\") " Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.136776 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99f98da9-4046-4200-bc8c-751a841e68b9-logs" (OuterVolumeSpecName: "logs") pod "99f98da9-4046-4200-bc8c-751a841e68b9" (UID: "99f98da9-4046-4200-bc8c-751a841e68b9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.141965 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99f98da9-4046-4200-bc8c-751a841e68b9-kube-api-access-nc77w" (OuterVolumeSpecName: "kube-api-access-nc77w") pod "99f98da9-4046-4200-bc8c-751a841e68b9" (UID: "99f98da9-4046-4200-bc8c-751a841e68b9"). InnerVolumeSpecName "kube-api-access-nc77w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.152631 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1522514-570d-4f82-81f3-15db416cca79-kube-api-access-jdxp6" (OuterVolumeSpecName: "kube-api-access-jdxp6") pod "e1522514-570d-4f82-81f3-15db416cca79" (UID: "e1522514-570d-4f82-81f3-15db416cca79"). InnerVolumeSpecName "kube-api-access-jdxp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.171292 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1522514-570d-4f82-81f3-15db416cca79-config-data" (OuterVolumeSpecName: "config-data") pod "e1522514-570d-4f82-81f3-15db416cca79" (UID: "e1522514-570d-4f82-81f3-15db416cca79"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.173017 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99f98da9-4046-4200-bc8c-751a841e68b9" (UID: "99f98da9-4046-4200-bc8c-751a841e68b9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.193475 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1522514-570d-4f82-81f3-15db416cca79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1522514-570d-4f82-81f3-15db416cca79" (UID: "e1522514-570d-4f82-81f3-15db416cca79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.212665 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-config-data" (OuterVolumeSpecName: "config-data") pod "99f98da9-4046-4200-bc8c-751a841e68b9" (UID: "99f98da9-4046-4200-bc8c-751a841e68b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.221037 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "99f98da9-4046-4200-bc8c-751a841e68b9" (UID: "99f98da9-4046-4200-bc8c-751a841e68b9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.230416 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.230458 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdxp6\" (UniqueName: \"kubernetes.io/projected/e1522514-570d-4f82-81f3-15db416cca79-kube-api-access-jdxp6\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.230476 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc77w\" (UniqueName: \"kubernetes.io/projected/99f98da9-4046-4200-bc8c-751a841e68b9-kube-api-access-nc77w\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.230489 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1522514-570d-4f82-81f3-15db416cca79-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.230502 5129 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.230514 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f98da9-4046-4200-bc8c-751a841e68b9-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.230526 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1522514-570d-4f82-81f3-15db416cca79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.230538 5129 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.236493 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "99f98da9-4046-4200-bc8c-751a841e68b9" (UID: "99f98da9-4046-4200-bc8c-751a841e68b9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.331449 5129 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f98da9-4046-4200-bc8c-751a841e68b9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.857782 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"99f98da9-4046-4200-bc8c-751a841e68b9","Type":"ContainerDied","Data":"098ae4cfab766f38d675ed99fd8cabd24b0bb295a8ccee638ad02c64a31b6d77"} Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.858051 5129 scope.go:117] "RemoveContainer" containerID="1461ca6435203ff818e6d04644df8c180a0bc2e4993a51540b816d10d9077054" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.857811 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.863283 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f12ef33b-b86d-4c80-8f19-385ff5a93fee","Type":"ContainerStarted","Data":"1424db197590997e7d0bf3ba8eb822d66a08b3447321a7763baa443a1590f95f"} Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.863319 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f12ef33b-b86d-4c80-8f19-385ff5a93fee","Type":"ContainerStarted","Data":"73a607e1cd40ac4f3d4872afe8c005319bf76b10db5cdba8a1d09fc3f83dead5"} Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.863330 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f12ef33b-b86d-4c80-8f19-385ff5a93fee","Type":"ContainerStarted","Data":"5d38b16e5304399ab733905799449c8e4299c7051910f1436062b73d0af4a1f4"} Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.864901 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e1522514-570d-4f82-81f3-15db416cca79","Type":"ContainerDied","Data":"1d684cea37554e980c6273e65ca1c6c9d94f62f664e8718c255bdf7069b65f59"} Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.864992 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.892875 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.892854946 podStartE2EDuration="1.892854946s" podCreationTimestamp="2026-03-14 07:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:24:12.890353508 +0000 UTC m=+1515.642268692" watchObservedRunningTime="2026-03-14 07:24:12.892854946 +0000 UTC m=+1515.644770130" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.896425 5129 scope.go:117] "RemoveContainer" containerID="e9477cfbc71ae599463d4c4b0d447a1dd5f3c86e9cc78a502adfbeda1f9627fd" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.939507 5129 scope.go:117] "RemoveContainer" containerID="6a4ca1f337a22366ba5f7fd6ecbb3a61d4421deb0ae3d1dc4bd69ea0c8da7314" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.943740 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.963665 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.975096 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.990049 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.997564 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:24:12 crc kubenswrapper[5129]: E0314 07:24:12.997925 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f98da9-4046-4200-bc8c-751a841e68b9" containerName="nova-api-api" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.997940 5129 
state_mem.go:107] "Deleted CPUSet assignment" podUID="99f98da9-4046-4200-bc8c-751a841e68b9" containerName="nova-api-api" Mar 14 07:24:12 crc kubenswrapper[5129]: E0314 07:24:12.997961 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1522514-570d-4f82-81f3-15db416cca79" containerName="nova-scheduler-scheduler" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.997968 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1522514-570d-4f82-81f3-15db416cca79" containerName="nova-scheduler-scheduler" Mar 14 07:24:12 crc kubenswrapper[5129]: E0314 07:24:12.997988 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f98da9-4046-4200-bc8c-751a841e68b9" containerName="nova-api-log" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.997993 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f98da9-4046-4200-bc8c-751a841e68b9" containerName="nova-api-log" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.998153 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1522514-570d-4f82-81f3-15db416cca79" containerName="nova-scheduler-scheduler" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.998179 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="99f98da9-4046-4200-bc8c-751a841e68b9" containerName="nova-api-api" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.998189 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="99f98da9-4046-4200-bc8c-751a841e68b9" containerName="nova-api-log" Mar 14 07:24:12 crc kubenswrapper[5129]: I0314 07:24:12.998725 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.002834 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.005364 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.012475 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.015365 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.021475 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.022506 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.022559 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.022517 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.047144 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f796158b-a0d2-4077-9c18-b91a594343fb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f796158b-a0d2-4077-9c18-b91a594343fb\") " pod="openstack/nova-scheduler-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.047186 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b15058f-0936-4bb9-ad72-1c27661b4b82-logs\") pod \"nova-api-0\" 
(UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.047229 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-public-tls-certs\") pod \"nova-api-0\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.047300 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh7sn\" (UniqueName: \"kubernetes.io/projected/f796158b-a0d2-4077-9c18-b91a594343fb-kube-api-access-kh7sn\") pod \"nova-scheduler-0\" (UID: \"f796158b-a0d2-4077-9c18-b91a594343fb\") " pod="openstack/nova-scheduler-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.047331 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtwkk\" (UniqueName: \"kubernetes.io/projected/6b15058f-0936-4bb9-ad72-1c27661b4b82-kube-api-access-vtwkk\") pod \"nova-api-0\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.047358 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.047382 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f796158b-a0d2-4077-9c18-b91a594343fb-config-data\") pod \"nova-scheduler-0\" (UID: \"f796158b-a0d2-4077-9c18-b91a594343fb\") " pod="openstack/nova-scheduler-0" Mar 14 
07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.047420 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-config-data\") pod \"nova-api-0\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.047437 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.149126 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f796158b-a0d2-4077-9c18-b91a594343fb-config-data\") pod \"nova-scheduler-0\" (UID: \"f796158b-a0d2-4077-9c18-b91a594343fb\") " pod="openstack/nova-scheduler-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.149229 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-config-data\") pod \"nova-api-0\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.149255 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.149381 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f796158b-a0d2-4077-9c18-b91a594343fb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f796158b-a0d2-4077-9c18-b91a594343fb\") " pod="openstack/nova-scheduler-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.149406 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b15058f-0936-4bb9-ad72-1c27661b4b82-logs\") pod \"nova-api-0\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.149430 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-public-tls-certs\") pod \"nova-api-0\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.149479 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh7sn\" (UniqueName: \"kubernetes.io/projected/f796158b-a0d2-4077-9c18-b91a594343fb-kube-api-access-kh7sn\") pod \"nova-scheduler-0\" (UID: \"f796158b-a0d2-4077-9c18-b91a594343fb\") " pod="openstack/nova-scheduler-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.149510 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtwkk\" (UniqueName: \"kubernetes.io/projected/6b15058f-0936-4bb9-ad72-1c27661b4b82-kube-api-access-vtwkk\") pod \"nova-api-0\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.149537 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " pod="openstack/nova-api-0" Mar 14 07:24:13 
crc kubenswrapper[5129]: I0314 07:24:13.153558 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b15058f-0936-4bb9-ad72-1c27661b4b82-logs\") pod \"nova-api-0\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.155437 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f796158b-a0d2-4077-9c18-b91a594343fb-config-data\") pod \"nova-scheduler-0\" (UID: \"f796158b-a0d2-4077-9c18-b91a594343fb\") " pod="openstack/nova-scheduler-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.155967 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.156796 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-public-tls-certs\") pod \"nova-api-0\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.157260 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f796158b-a0d2-4077-9c18-b91a594343fb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f796158b-a0d2-4077-9c18-b91a594343fb\") " pod="openstack/nova-scheduler-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.169067 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-config-data\") pod \"nova-api-0\" (UID: 
\"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.170669 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtwkk\" (UniqueName: \"kubernetes.io/projected/6b15058f-0936-4bb9-ad72-1c27661b4b82-kube-api-access-vtwkk\") pod \"nova-api-0\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.171446 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.174997 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh7sn\" (UniqueName: \"kubernetes.io/projected/f796158b-a0d2-4077-9c18-b91a594343fb-kube-api-access-kh7sn\") pod \"nova-scheduler-0\" (UID: \"f796158b-a0d2-4077-9c18-b91a594343fb\") " pod="openstack/nova-scheduler-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.329800 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.341159 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.822034 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:24:13 crc kubenswrapper[5129]: W0314 07:24:13.824025 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf796158b_a0d2_4077_9c18_b91a594343fb.slice/crio-b93662d300685763fab932a3b4cac96bb349ed3b3f9fa619c9bc10e2ecd7f47d WatchSource:0}: Error finding container b93662d300685763fab932a3b4cac96bb349ed3b3f9fa619c9bc10e2ecd7f47d: Status 404 returned error can't find the container with id b93662d300685763fab932a3b4cac96bb349ed3b3f9fa619c9bc10e2ecd7f47d Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.877526 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f796158b-a0d2-4077-9c18-b91a594343fb","Type":"ContainerStarted","Data":"b93662d300685763fab932a3b4cac96bb349ed3b3f9fa619c9bc10e2ecd7f47d"} Mar 14 07:24:13 crc kubenswrapper[5129]: I0314 07:24:13.950263 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:24:14 crc kubenswrapper[5129]: I0314 07:24:14.051874 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99f98da9-4046-4200-bc8c-751a841e68b9" path="/var/lib/kubelet/pods/99f98da9-4046-4200-bc8c-751a841e68b9/volumes" Mar 14 07:24:14 crc kubenswrapper[5129]: I0314 07:24:14.053350 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1522514-570d-4f82-81f3-15db416cca79" path="/var/lib/kubelet/pods/e1522514-570d-4f82-81f3-15db416cca79/volumes" Mar 14 07:24:14 crc kubenswrapper[5129]: I0314 07:24:14.891255 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f796158b-a0d2-4077-9c18-b91a594343fb","Type":"ContainerStarted","Data":"a8b30b499fcbe76c6c54c38cfe35751ce9f67a27193f239a4c3a4e773018fd63"} 
Mar 14 07:24:14 crc kubenswrapper[5129]: I0314 07:24:14.893579 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6b15058f-0936-4bb9-ad72-1c27661b4b82","Type":"ContainerStarted","Data":"739b061c527066552e175c0fbe0488722ce25d75896eb047b0bd920e05d3e6cb"} Mar 14 07:24:14 crc kubenswrapper[5129]: I0314 07:24:14.893736 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6b15058f-0936-4bb9-ad72-1c27661b4b82","Type":"ContainerStarted","Data":"57f2511f2dd582a994211348b57df514b82da119c5796e9b0a9f36eaf93af695"} Mar 14 07:24:14 crc kubenswrapper[5129]: I0314 07:24:14.893759 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6b15058f-0936-4bb9-ad72-1c27661b4b82","Type":"ContainerStarted","Data":"f210d80113855c4276894d756c20dedc8d8795cd00378886a3e838d17749a333"} Mar 14 07:24:14 crc kubenswrapper[5129]: I0314 07:24:14.911239 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.911210893 podStartE2EDuration="2.911210893s" podCreationTimestamp="2026-03-14 07:24:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:24:14.910029511 +0000 UTC m=+1517.661944695" watchObservedRunningTime="2026-03-14 07:24:14.911210893 +0000 UTC m=+1517.663126107" Mar 14 07:24:14 crc kubenswrapper[5129]: I0314 07:24:14.936588 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.936562245 podStartE2EDuration="2.936562245s" podCreationTimestamp="2026-03-14 07:24:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 07:24:14.929216414 +0000 UTC m=+1517.681131608" watchObservedRunningTime="2026-03-14 07:24:14.936562245 +0000 UTC 
m=+1517.688477429" Mar 14 07:24:18 crc kubenswrapper[5129]: I0314 07:24:18.298433 5129 scope.go:117] "RemoveContainer" containerID="be3ea10d03d65f61c2755445872d28b8dc27d34ae0972e6e5b9c64b7297f6126" Mar 14 07:24:18 crc kubenswrapper[5129]: I0314 07:24:18.330027 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 14 07:24:19 crc kubenswrapper[5129]: I0314 07:24:19.574357 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:24:19 crc kubenswrapper[5129]: I0314 07:24:19.574783 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:24:21 crc kubenswrapper[5129]: I0314 07:24:21.540162 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 14 07:24:21 crc kubenswrapper[5129]: I0314 07:24:21.540511 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 14 07:24:22 crc kubenswrapper[5129]: I0314 07:24:22.555791 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f12ef33b-b86d-4c80-8f19-385ff5a93fee" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 07:24:22 crc kubenswrapper[5129]: I0314 07:24:22.555806 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="f12ef33b-b86d-4c80-8f19-385ff5a93fee" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 07:24:23 crc kubenswrapper[5129]: I0314 07:24:23.330735 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 14 07:24:23 crc kubenswrapper[5129]: I0314 07:24:23.342240 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 07:24:23 crc kubenswrapper[5129]: I0314 07:24:23.342321 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 07:24:23 crc kubenswrapper[5129]: I0314 07:24:23.380421 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 14 07:24:24 crc kubenswrapper[5129]: I0314 07:24:24.023292 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 14 07:24:24 crc kubenswrapper[5129]: I0314 07:24:24.357899 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6b15058f-0936-4bb9-ad72-1c27661b4b82" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.215:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 07:24:24 crc kubenswrapper[5129]: I0314 07:24:24.358272 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6b15058f-0936-4bb9-ad72-1c27661b4b82" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.215:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 07:24:24 crc kubenswrapper[5129]: I0314 07:24:24.980392 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 14 07:24:29 crc kubenswrapper[5129]: 
I0314 07:24:29.540183 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 07:24:29 crc kubenswrapper[5129]: I0314 07:24:29.540533 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 07:24:31 crc kubenswrapper[5129]: I0314 07:24:31.342107 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 07:24:31 crc kubenswrapper[5129]: I0314 07:24:31.342447 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 07:24:31 crc kubenswrapper[5129]: I0314 07:24:31.545328 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 14 07:24:31 crc kubenswrapper[5129]: I0314 07:24:31.551769 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 14 07:24:31 crc kubenswrapper[5129]: I0314 07:24:31.552878 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 14 07:24:32 crc kubenswrapper[5129]: I0314 07:24:32.075927 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 14 07:24:33 crc kubenswrapper[5129]: I0314 07:24:33.351748 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 07:24:33 crc kubenswrapper[5129]: I0314 07:24:33.359286 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 07:24:33 crc kubenswrapper[5129]: I0314 07:24:33.359722 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 07:24:34 crc kubenswrapper[5129]: I0314 07:24:34.101111 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 07:24:49 crc kubenswrapper[5129]: I0314 07:24:49.574724 
5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:24:49 crc kubenswrapper[5129]: I0314 07:24:49.575311 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:24:53 crc kubenswrapper[5129]: I0314 07:24:53.551539 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 14 07:24:53 crc kubenswrapper[5129]: I0314 07:24:53.552272 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="6dcacc6d-2066-4bbf-a65c-8ff457d6235b" containerName="openstackclient" containerID="cri-o://8d3e839655b386c621e99ce51cd89ad667b865776cb2c358ac6411cb869d4ee6" gracePeriod=2 Mar 14 07:24:53 crc kubenswrapper[5129]: I0314 07:24:53.633165 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 14 07:24:53 crc kubenswrapper[5129]: I0314 07:24:53.897284 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c51b-account-create-update-7gs2v"] Mar 14 07:24:53 crc kubenswrapper[5129]: I0314 07:24:53.958472 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c51b-account-create-update-7gs2v"] Mar 14 07:24:53 crc kubenswrapper[5129]: I0314 07:24:53.988729 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-48dp4"] Mar 14 07:24:53 crc kubenswrapper[5129]: E0314 07:24:53.989273 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6dcacc6d-2066-4bbf-a65c-8ff457d6235b" containerName="openstackclient" Mar 14 07:24:53 crc kubenswrapper[5129]: I0314 07:24:53.989293 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dcacc6d-2066-4bbf-a65c-8ff457d6235b" containerName="openstackclient" Mar 14 07:24:53 crc kubenswrapper[5129]: I0314 07:24:53.989464 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dcacc6d-2066-4bbf-a65c-8ff457d6235b" containerName="openstackclient" Mar 14 07:24:53 crc kubenswrapper[5129]: I0314 07:24:53.990108 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-48dp4" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:53.995657 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.009464 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-c51b-account-create-update-crv5x"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.010737 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-c51b-account-create-update-crv5x" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.018693 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c51b-account-create-update-crv5x"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.024178 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.026628 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.027017 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="d992f450-3800-45e2-abf4-41597a15f0c3" containerName="openstack-network-exporter" containerID="cri-o://9e3b177f1620bb347c28bfeccd9f33da12c8938fc3fa807f4c5eaa40e5094650" gracePeriod=300 Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.058726 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="534a50f3-4cdf-4532-9d16-d61c1792d403" path="/var/lib/kubelet/pods/534a50f3-4cdf-4532-9d16-d61c1792d403/volumes" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.059315 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-48dp4"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.061028 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.158561 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de15208-b058-4b80-886d-6b93469faac0-operator-scripts\") pod \"barbican-c51b-account-create-update-crv5x\" (UID: \"2de15208-b058-4b80-886d-6b93469faac0\") " pod="openstack/barbican-c51b-account-create-update-crv5x" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 
07:24:54.158664 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-477hl\" (UniqueName: \"kubernetes.io/projected/b77f36b2-be7b-43cb-ada4-74f524396018-kube-api-access-477hl\") pod \"root-account-create-update-48dp4\" (UID: \"b77f36b2-be7b-43cb-ada4-74f524396018\") " pod="openstack/root-account-create-update-48dp4" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.158764 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p7cq\" (UniqueName: \"kubernetes.io/projected/2de15208-b058-4b80-886d-6b93469faac0-kube-api-access-2p7cq\") pod \"barbican-c51b-account-create-update-crv5x\" (UID: \"2de15208-b058-4b80-886d-6b93469faac0\") " pod="openstack/barbican-c51b-account-create-update-crv5x" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.158810 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b77f36b2-be7b-43cb-ada4-74f524396018-operator-scripts\") pod \"root-account-create-update-48dp4\" (UID: \"b77f36b2-be7b-43cb-ada4-74f524396018\") " pod="openstack/root-account-create-update-48dp4" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.194152 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-5dxcc"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.225810 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="d992f450-3800-45e2-abf4-41597a15f0c3" containerName="ovsdbserver-sb" containerID="cri-o://27a037ff17952e87a1fb3f9495575e1bd75902ebd3287783d0d5ba6c1a9e147c" gracePeriod=300 Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.261558 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2de15208-b058-4b80-886d-6b93469faac0-operator-scripts\") pod \"barbican-c51b-account-create-update-crv5x\" (UID: \"2de15208-b058-4b80-886d-6b93469faac0\") " pod="openstack/barbican-c51b-account-create-update-crv5x" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.261901 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-477hl\" (UniqueName: \"kubernetes.io/projected/b77f36b2-be7b-43cb-ada4-74f524396018-kube-api-access-477hl\") pod \"root-account-create-update-48dp4\" (UID: \"b77f36b2-be7b-43cb-ada4-74f524396018\") " pod="openstack/root-account-create-update-48dp4" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.261961 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p7cq\" (UniqueName: \"kubernetes.io/projected/2de15208-b058-4b80-886d-6b93469faac0-kube-api-access-2p7cq\") pod \"barbican-c51b-account-create-update-crv5x\" (UID: \"2de15208-b058-4b80-886d-6b93469faac0\") " pod="openstack/barbican-c51b-account-create-update-crv5x" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.261995 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b77f36b2-be7b-43cb-ada4-74f524396018-operator-scripts\") pod \"root-account-create-update-48dp4\" (UID: \"b77f36b2-be7b-43cb-ada4-74f524396018\") " pod="openstack/root-account-create-update-48dp4" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.263564 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b77f36b2-be7b-43cb-ada4-74f524396018-operator-scripts\") pod \"root-account-create-update-48dp4\" (UID: \"b77f36b2-be7b-43cb-ada4-74f524396018\") " pod="openstack/root-account-create-update-48dp4" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.263827 5129 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de15208-b058-4b80-886d-6b93469faac0-operator-scripts\") pod \"barbican-c51b-account-create-update-crv5x\" (UID: \"2de15208-b058-4b80-886d-6b93469faac0\") " pod="openstack/barbican-c51b-account-create-update-crv5x" Mar 14 07:24:54 crc kubenswrapper[5129]: E0314 07:24:54.272695 5129 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 14 07:24:54 crc kubenswrapper[5129]: E0314 07:24:54.272779 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-config-data podName:d291cef2-24d8-4ae6-aa4f-dfa8e782db15 nodeName:}" failed. No retries permitted until 2026-03-14 07:24:54.772758801 +0000 UTC m=+1557.524673985 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-config-data") pod "rabbitmq-server-0" (UID: "d291cef2-24d8-4ae6-aa4f-dfa8e782db15") : configmap "rabbitmq-config-data" not found Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.277854 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b690-account-create-update-x9m2c"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.300663 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-5dxcc"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.312793 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-477hl\" (UniqueName: \"kubernetes.io/projected/b77f36b2-be7b-43cb-ada4-74f524396018-kube-api-access-477hl\") pod \"root-account-create-update-48dp4\" (UID: \"b77f36b2-be7b-43cb-ada4-74f524396018\") " pod="openstack/root-account-create-update-48dp4" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.332037 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-48dp4" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.337185 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p7cq\" (UniqueName: \"kubernetes.io/projected/2de15208-b058-4b80-886d-6b93469faac0-kube-api-access-2p7cq\") pod \"barbican-c51b-account-create-update-crv5x\" (UID: \"2de15208-b058-4b80-886d-6b93469faac0\") " pod="openstack/barbican-c51b-account-create-update-crv5x" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.340060 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-b8r27"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.356332 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c51b-account-create-update-crv5x" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.390743 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-b8r27"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.456650 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-x5pc5"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.532690 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-x5pc5"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.572764 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f3f5-account-create-update-t48fl"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.583884 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b690-account-create-update-x9m2c"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.649027 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f3f5-account-create-update-t48fl"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.657665 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 
14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.658374 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="72ad7859-e34b-4393-b696-548fd7ac8e1d" containerName="ovn-northd" containerID="cri-o://d0cdce7e2d0eb6e6c377ac56ed26b1e0c8ae3d516df683f25a4cfa59b208e395" gracePeriod=30 Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.658878 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="72ad7859-e34b-4393-b696-548fd7ac8e1d" containerName="openstack-network-exporter" containerID="cri-o://c43534dc2a23fd4cd80aec67aba6aa51e1ac7a2b3c68810b70ff6a152f4d2e66" gracePeriod=30 Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.666834 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-wcl8h"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.688459 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-wcl8h"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.708672 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.743451 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-llsfr"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.756292 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-llsfr"] Mar 14 07:24:54 crc kubenswrapper[5129]: E0314 07:24:54.782182 5129 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 14 07:24:54 crc kubenswrapper[5129]: E0314 07:24:54.782242 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-config-data podName:f6261e6b-f331-4dcb-8380-167e8f547e1b nodeName:}" failed. 
No retries permitted until 2026-03-14 07:24:55.282225146 +0000 UTC m=+1558.034140330 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "f6261e6b-f331-4dcb-8380-167e8f547e1b") : configmap "rabbitmq-cell1-config-data" not found Mar 14 07:24:54 crc kubenswrapper[5129]: E0314 07:24:54.782579 5129 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 14 07:24:54 crc kubenswrapper[5129]: E0314 07:24:54.782648 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-config-data podName:d291cef2-24d8-4ae6-aa4f-dfa8e782db15 nodeName:}" failed. No retries permitted until 2026-03-14 07:24:55.782631197 +0000 UTC m=+1558.534546381 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-config-data") pod "rabbitmq-server-0" (UID: "d291cef2-24d8-4ae6-aa4f-dfa8e782db15") : configmap "rabbitmq-config-data" not found Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.803480 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-168c-account-create-update-bgp52"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.804593 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-168c-account-create-update-bgp52" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.808513 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.827820 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-2lvwr"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.855823 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0796-account-create-update-crlmm"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.866579 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0796-account-create-update-crlmm" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.874502 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.877633 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9bsx2"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.883762 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1be17304-8fa4-4ec4-893d-3094d703893a-operator-scripts\") pod \"nova-cell1-168c-account-create-update-bgp52\" (UID: \"1be17304-8fa4-4ec4-893d-3094d703893a\") " pod="openstack/nova-cell1-168c-account-create-update-bgp52" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.883860 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h7c4\" (UniqueName: \"kubernetes.io/projected/1be17304-8fa4-4ec4-893d-3094d703893a-kube-api-access-5h7c4\") pod \"nova-cell1-168c-account-create-update-bgp52\" (UID: \"1be17304-8fa4-4ec4-893d-3094d703893a\") " pod="openstack/nova-cell1-168c-account-create-update-bgp52" Mar 14 
07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.924810 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-3b40-account-create-update-nrtft"] Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.932069 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3b40-account-create-update-nrtft" Mar 14 07:24:54 crc kubenswrapper[5129]: I0314 07:24:54.946527 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.005485 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-2lvwr"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.041242 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf7c35ff-a6ef-4686-aa16-cc65d46d3527-operator-scripts\") pod \"nova-cell0-3b40-account-create-update-nrtft\" (UID: \"bf7c35ff-a6ef-4686-aa16-cc65d46d3527\") " pod="openstack/nova-cell0-3b40-account-create-update-nrtft" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.041410 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1be17304-8fa4-4ec4-893d-3094d703893a-operator-scripts\") pod \"nova-cell1-168c-account-create-update-bgp52\" (UID: \"1be17304-8fa4-4ec4-893d-3094d703893a\") " pod="openstack/nova-cell1-168c-account-create-update-bgp52" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.045212 5129 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" secret="" err="secret \"barbican-barbican-dockercfg-5zr48\" not found" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.051923 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1be17304-8fa4-4ec4-893d-3094d703893a-operator-scripts\") pod \"nova-cell1-168c-account-create-update-bgp52\" (UID: \"1be17304-8fa4-4ec4-893d-3094d703893a\") " pod="openstack/nova-cell1-168c-account-create-update-bgp52" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.078213 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76qmf\" (UniqueName: \"kubernetes.io/projected/87ce93e5-3bef-4e45-a23a-4164fb7aed7a-kube-api-access-76qmf\") pod \"nova-api-0796-account-create-update-crlmm\" (UID: \"87ce93e5-3bef-4e45-a23a-4164fb7aed7a\") " pod="openstack/nova-api-0796-account-create-update-crlmm" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.078364 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h7c4\" (UniqueName: \"kubernetes.io/projected/1be17304-8fa4-4ec4-893d-3094d703893a-kube-api-access-5h7c4\") pod \"nova-cell1-168c-account-create-update-bgp52\" (UID: \"1be17304-8fa4-4ec4-893d-3094d703893a\") " pod="openstack/nova-cell1-168c-account-create-update-bgp52" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.078515 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87ce93e5-3bef-4e45-a23a-4164fb7aed7a-operator-scripts\") pod \"nova-api-0796-account-create-update-crlmm\" (UID: \"87ce93e5-3bef-4e45-a23a-4164fb7aed7a\") " pod="openstack/nova-api-0796-account-create-update-crlmm" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.078665 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxh5q\" (UniqueName: \"kubernetes.io/projected/bf7c35ff-a6ef-4686-aa16-cc65d46d3527-kube-api-access-pxh5q\") pod \"nova-cell0-3b40-account-create-update-nrtft\" (UID: \"bf7c35ff-a6ef-4686-aa16-cc65d46d3527\") " pod="openstack/nova-cell0-3b40-account-create-update-nrtft" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.106233 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-168c-account-create-update-bgp52"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.131802 5129 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/swift-storage-0" secret="" err="secret \"swift-swift-dockercfg-g7j4t\" not found" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.158454 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3b40-account-create-update-nrtft"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.166157 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0796-account-create-update-crlmm"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.178474 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h7c4\" (UniqueName: \"kubernetes.io/projected/1be17304-8fa4-4ec4-893d-3094d703893a-kube-api-access-5h7c4\") pod \"nova-cell1-168c-account-create-update-bgp52\" (UID: \"1be17304-8fa4-4ec4-893d-3094d703893a\") " pod="openstack/nova-cell1-168c-account-create-update-bgp52" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.196061 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-p2z68"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.196512 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-p2z68" podUID="9d934f07-49c3-4356-ae16-0c35f0935625" 
containerName="openstack-network-exporter" containerID="cri-o://252a4f363d08c5b6cdff2d3573a9472b02174c2666aea18e478d986cc1704a09" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.215595 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-66xz6"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.245768 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76qmf\" (UniqueName: \"kubernetes.io/projected/87ce93e5-3bef-4e45-a23a-4164fb7aed7a-kube-api-access-76qmf\") pod \"nova-api-0796-account-create-update-crlmm\" (UID: \"87ce93e5-3bef-4e45-a23a-4164fb7aed7a\") " pod="openstack/nova-api-0796-account-create-update-crlmm" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.245861 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87ce93e5-3bef-4e45-a23a-4164fb7aed7a-operator-scripts\") pod \"nova-api-0796-account-create-update-crlmm\" (UID: \"87ce93e5-3bef-4e45-a23a-4164fb7aed7a\") " pod="openstack/nova-api-0796-account-create-update-crlmm" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.245911 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxh5q\" (UniqueName: \"kubernetes.io/projected/bf7c35ff-a6ef-4686-aa16-cc65d46d3527-kube-api-access-pxh5q\") pod \"nova-cell0-3b40-account-create-update-nrtft\" (UID: \"bf7c35ff-a6ef-4686-aa16-cc65d46d3527\") " pod="openstack/nova-cell0-3b40-account-create-update-nrtft" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.246003 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf7c35ff-a6ef-4686-aa16-cc65d46d3527-operator-scripts\") pod \"nova-cell0-3b40-account-create-update-nrtft\" (UID: \"bf7c35ff-a6ef-4686-aa16-cc65d46d3527\") " 
pod="openstack/nova-cell0-3b40-account-create-update-nrtft" Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.246167 5129 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.246219 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data podName:30ca9513-5ae9-4520-8012-3c941786ce2a nodeName:}" failed. No retries permitted until 2026-03-14 07:24:55.746203769 +0000 UTC m=+1558.498118953 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data") pod "barbican-keystone-listener-54888cd7bb-schqh" (UID: "30ca9513-5ae9-4520-8012-3c941786ce2a") : secret "barbican-config-data" not found Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.246353 5129 secret.go:188] Couldn't get secret openstack/barbican-keystone-listener-config-data: secret "barbican-keystone-listener-config-data" not found Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.246380 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data-custom podName:30ca9513-5ae9-4520-8012-3c941786ce2a nodeName:}" failed. No retries permitted until 2026-03-14 07:24:55.746372123 +0000 UTC m=+1558.498287297 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data-custom") pod "barbican-keystone-listener-54888cd7bb-schqh" (UID: "30ca9513-5ae9-4520-8012-3c941786ce2a") : secret "barbican-keystone-listener-config-data" not found Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.245999 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-66xz6"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.253004 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87ce93e5-3bef-4e45-a23a-4164fb7aed7a-operator-scripts\") pod \"nova-api-0796-account-create-update-crlmm\" (UID: \"87ce93e5-3bef-4e45-a23a-4164fb7aed7a\") " pod="openstack/nova-api-0796-account-create-update-crlmm" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.254330 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf7c35ff-a6ef-4686-aa16-cc65d46d3527-operator-scripts\") pod \"nova-cell0-3b40-account-create-update-nrtft\" (UID: \"bf7c35ff-a6ef-4686-aa16-cc65d46d3527\") " pod="openstack/nova-cell0-3b40-account-create-update-nrtft" Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.254990 5129 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.255013 5129 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.255021 5129 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.255032 5129 projected.go:194] Error preparing data for projected volume etc-swift for pod 
openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.255380 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift podName:2c0c6778-6f35-4243-9e82-ca3c8f3968fc nodeName:}" failed. No retries permitted until 2026-03-14 07:24:55.755350128 +0000 UTC m=+1558.507265312 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift") pod "swift-storage-0" (UID: "2c0c6778-6f35-4243-9e82-ca3c8f3968fc") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.274294 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxh5q\" (UniqueName: \"kubernetes.io/projected/bf7c35ff-a6ef-4686-aa16-cc65d46d3527-kube-api-access-pxh5q\") pod \"nova-cell0-3b40-account-create-update-nrtft\" (UID: \"bf7c35ff-a6ef-4686-aa16-cc65d46d3527\") " pod="openstack/nova-cell0-3b40-account-create-update-nrtft" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.283459 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76qmf\" (UniqueName: \"kubernetes.io/projected/87ce93e5-3bef-4e45-a23a-4164fb7aed7a-kube-api-access-76qmf\") pod \"nova-api-0796-account-create-update-crlmm\" (UID: \"87ce93e5-3bef-4e45-a23a-4164fb7aed7a\") " pod="openstack/nova-api-0796-account-create-update-crlmm" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.294647 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-cfdh9"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.299783 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-75b4586cb8-pfpxj"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.300095 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-75b4586cb8-pfpxj" podUID="8da79e9b-0c3f-4d66-9813-08116725c6a4" containerName="placement-log" containerID="cri-o://c8a2eb3d81e166b08f15ce66729e9485cc211076230f60d2b3882494534f0f34" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.300254 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-75b4586cb8-pfpxj" podUID="8da79e9b-0c3f-4d66-9813-08116725c6a4" containerName="placement-api" containerID="cri-o://19c89364d2b386b626087226ef1de11773409433b1f6b4165c7de722f1c08207" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.325289 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.325762 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7132e8d4-728a-4852-bcd9-833a9bd05878" containerName="cinder-scheduler" containerID="cri-o://76f24021b697ad5485f9f5594cd9b1485876631247e65a9f1fe131e045674477" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.326233 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7132e8d4-728a-4852-bcd9-833a9bd05878" containerName="probe" containerID="cri-o://0b249c015ec3a36e3def27e75975febe3abb0dc9b079bfbd50022f9693560988" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.338042 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.338508 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" 
containerName="openstack-network-exporter" containerID="cri-o://ae523da99b25c8f299915e49323351372ae554b450e86eb9aa2d7dd42d03c9b4" gracePeriod=300 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.349025 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.349206 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1a3fad4b-8e44-471d-b262-27d6a7e05276" containerName="cinder-api-log" containerID="cri-o://b6c1654757c5b72874d5b9a46ca2254eeb71fc65f978afb406f43a49c77a25ae" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.349476 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1a3fad4b-8e44-471d-b262-27d6a7e05276" containerName="cinder-api" containerID="cri-o://d055a298c80b1c78420262c5f6b1a8b08ee515621adade6064aa22f0860f7d0f" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.352231 5129 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.352268 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-config-data podName:f6261e6b-f331-4dcb-8380-167e8f547e1b nodeName:}" failed. No retries permitted until 2026-03-14 07:24:56.352255723 +0000 UTC m=+1559.104170907 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "f6261e6b-f331-4dcb-8380-167e8f547e1b") : configmap "rabbitmq-cell1-config-data" not found Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.356924 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3b40-account-create-update-nrtft" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.361837 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-rmw6j"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.365365 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" podUID="9004e305-ef45-437a-a38c-b50c9d1f1ff7" containerName="dnsmasq-dns" containerID="cri-o://df33d7fbb10361dc5cb6e14bde3e758c886880e563b624273a3f21cdeda670f9" gracePeriod=10 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.388004 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0796-account-create-update-nndld"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.410157 5129 generic.go:334] "Generic (PLEG): container finished" podID="72ad7859-e34b-4393-b696-548fd7ac8e1d" containerID="c43534dc2a23fd4cd80aec67aba6aa51e1ac7a2b3c68810b70ff6a152f4d2e66" exitCode=2 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.410582 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"72ad7859-e34b-4393-b696-548fd7ac8e1d","Type":"ContainerDied","Data":"c43534dc2a23fd4cd80aec67aba6aa51e1ac7a2b3c68810b70ff6a152f4d2e66"} Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.418466 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d992f450-3800-45e2-abf4-41597a15f0c3/ovsdbserver-sb/0.log" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.418538 5129 generic.go:334] "Generic (PLEG): container finished" podID="d992f450-3800-45e2-abf4-41597a15f0c3" containerID="9e3b177f1620bb347c28bfeccd9f33da12c8938fc3fa807f4c5eaa40e5094650" exitCode=2 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.418554 5129 generic.go:334] "Generic (PLEG): container finished" podID="d992f450-3800-45e2-abf4-41597a15f0c3" 
containerID="27a037ff17952e87a1fb3f9495575e1bd75902ebd3287783d0d5ba6c1a9e147c" exitCode=143 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.418573 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d992f450-3800-45e2-abf4-41597a15f0c3","Type":"ContainerDied","Data":"9e3b177f1620bb347c28bfeccd9f33da12c8938fc3fa807f4c5eaa40e5094650"} Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.418612 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d992f450-3800-45e2-abf4-41597a15f0c3","Type":"ContainerDied","Data":"27a037ff17952e87a1fb3f9495575e1bd75902ebd3287783d0d5ba6c1a9e147c"} Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.447494 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-168c-account-create-update-bgp52" Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.450594 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 27a037ff17952e87a1fb3f9495575e1bd75902ebd3287783d0d5ba6c1a9e147c is running failed: container process not found" containerID="27a037ff17952e87a1fb3f9495575e1bd75902ebd3287783d0d5ba6c1a9e147c" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.450956 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 27a037ff17952e87a1fb3f9495575e1bd75902ebd3287783d0d5ba6c1a9e147c is running failed: container process not found" containerID="27a037ff17952e87a1fb3f9495575e1bd75902ebd3287783d0d5ba6c1a9e147c" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.451653 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID 
of 27a037ff17952e87a1fb3f9495575e1bd75902ebd3287783d0d5ba6c1a9e147c is running failed: container process not found" containerID="27a037ff17952e87a1fb3f9495575e1bd75902ebd3287783d0d5ba6c1a9e147c" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.451731 5129 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 27a037ff17952e87a1fb3f9495575e1bd75902ebd3287783d0d5ba6c1a9e147c is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="d992f450-3800-45e2-abf4-41597a15f0c3" containerName="ovsdbserver-sb" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.456701 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0796-account-create-update-nndld"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.510003 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" containerName="ovsdbserver-nb" containerID="cri-o://dab8a5ebc5177ef76172886ed36e2482dbb0fd550653e8a3c821777fa2ad984e" gracePeriod=300 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.547126 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0796-account-create-update-crlmm" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.561758 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.562233 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="account-server" containerID="cri-o://89fd8247fe552c4e8c71705aad2c39088f4d80948c9776ee701e127517dfcd2e" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.562578 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="swift-recon-cron" containerID="cri-o://68f079c63627945e154f1a8ae1e17bd4c418f8de737ec76b1962ceb2dce0568b" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.562636 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="rsync" containerID="cri-o://8dfc0ae842b54e831f58e889aa510e0e67ccf64cc21775a958610f8a34d354da" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.562667 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-expirer" containerID="cri-o://387ec0f7caa0f65ef81af98a02901cb03b71d081ad6cad271dfe418e7919ac2a" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.562699 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-updater" containerID="cri-o://1439671428e36f062ecec5e844057f831307ced7145f7e08d2685ea64612ca03" gracePeriod=30 Mar 14 07:24:55 crc 
kubenswrapper[5129]: I0314 07:24:55.562728 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-auditor" containerID="cri-o://21d34fb8c8ae530f947029ee8db5c31bb1fff7d650c0adaa7c67025d5674e1a9" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.562754 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-replicator" containerID="cri-o://d28de17046ef16ad3ac1f241f5db591353ee653132756124d068067d81af0578" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.562796 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-server" containerID="cri-o://3e9b50ce398570eef3d701e17b2478e6a2800a85b0238429627e5231237c15ba" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.562824 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="container-updater" containerID="cri-o://de4e9cf22659332ab4a321d87b2be0419c0f5ba9584d4883e24f44b8cd760689" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.562855 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="container-auditor" containerID="cri-o://3059f4424e061a65f5c7669f2d08d51ae956e9986395c5c1a40f1a6c63f29fe4" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.562882 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="container-replicator" 
containerID="cri-o://1bfacbb3c7f4e07f82826af5ace3d723b967a5f7ed7522a9988f754dcb73cd76" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.562910 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="container-server" containerID="cri-o://ea8689b6701500735737f4995bd8adb2dcef261718f8b5bd911de435e0c3a742" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.562937 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="account-reaper" containerID="cri-o://92720726c9e708458b5da5758e478b9521c17f64ffaa2789a7359d2da3f264d0" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.562963 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="account-auditor" containerID="cri-o://f1efa38796d71cf5fcb64262b9513b8dbb5d0472b2f9b02765e6507ec1e1670d" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.562990 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="account-replicator" containerID="cri-o://89c397e9fa15d1498f1dfb3620f6d0877fdad87ff4762961a8804fdb1503a9fc" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.586053 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-3b40-account-create-update-c46zq"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.604129 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-3b40-account-create-update-c46zq"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.606970 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-db-create-lqhkw"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.635544 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-168c-account-create-update-dbtdl"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.643036 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-lqhkw"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.652456 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-168c-account-create-update-dbtdl"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.668579 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9e6d-account-create-update-l4gqw"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.709262 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9e6d-account-create-update-l4gqw"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.739570 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7fd6bbc76c-9rshh"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.739841 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7fd6bbc76c-9rshh" podUID="e4553e57-9090-44cb-a8af-7297e4c624c0" containerName="neutron-api" containerID="cri-o://26d0f3fccd15d22f5d226b58a7b4b02bc6754da5235ba9f5ff39da154f4b4c5b" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.740249 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7fd6bbc76c-9rshh" podUID="e4553e57-9090-44cb-a8af-7297e4c624c0" containerName="neutron-httpd" containerID="cri-o://bc3211d9096a638fa8c4213b5bf198de4c37facb10847c262d7a2dcc1726dfc9" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.758421 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-nhsz7"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 
07:24:55.771399 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-nhsz7"] Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.791057 5129 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.791128 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data podName:30ca9513-5ae9-4520-8012-3c941786ce2a nodeName:}" failed. No retries permitted until 2026-03-14 07:24:56.791112981 +0000 UTC m=+1559.543028165 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data") pod "barbican-keystone-listener-54888cd7bb-schqh" (UID: "30ca9513-5ae9-4520-8012-3c941786ce2a") : secret "barbican-config-data" not found Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.791480 5129 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.793881 5129 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.793897 5129 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.793908 5129 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.793952 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift 
podName:2c0c6778-6f35-4243-9e82-ca3c8f3968fc nodeName:}" failed. No retries permitted until 2026-03-14 07:24:56.793941048 +0000 UTC m=+1559.545856232 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift") pod "swift-storage-0" (UID: "2c0c6778-6f35-4243-9e82-ca3c8f3968fc") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.794090 5129 secret.go:188] Couldn't get secret openstack/barbican-keystone-listener-config-data: secret "barbican-keystone-listener-config-data" not found Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.794134 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data-custom podName:30ca9513-5ae9-4520-8012-3c941786ce2a nodeName:}" failed. No retries permitted until 2026-03-14 07:24:56.794109393 +0000 UTC m=+1559.546024577 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data-custom") pod "barbican-keystone-listener-54888cd7bb-schqh" (UID: "30ca9513-5ae9-4520-8012-3c941786ce2a") : secret "barbican-keystone-listener-config-data" not found Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.794168 5129 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 14 07:24:55 crc kubenswrapper[5129]: E0314 07:24:55.794188 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-config-data podName:d291cef2-24d8-4ae6-aa4f-dfa8e782db15 nodeName:}" failed. No retries permitted until 2026-03-14 07:24:57.794181945 +0000 UTC m=+1560.546097129 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-config-data") pod "rabbitmq-server-0" (UID: "d291cef2-24d8-4ae6-aa4f-dfa8e782db15") : configmap "rabbitmq-config-data" not found Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.798084 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-8js4z"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.832485 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-8js4z"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.896955 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-msq48"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.902064 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d992f450-3800-45e2-abf4-41597a15f0c3/ovsdbserver-sb/0.log" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.902134 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.918739 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-msq48"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.938401 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.938894 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b0a5119c-8784-48e5-841a-654dc253f0d0" containerName="glance-log" containerID="cri-o://487da9597dbcf0671505d2cad76544a39daed693b3625176056d417c8e515636" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.939343 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b0a5119c-8784-48e5-841a-654dc253f0d0" containerName="glance-httpd" containerID="cri-o://51a6a8c222fecf3f936e6b32ca1163ecf7305b993569a3ab2d01e4f6863e6e3d" gracePeriod=30 Mar 14 07:24:55 crc kubenswrapper[5129]: I0314 07:24:55.994573 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mvf6f"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:55.999149 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d992f450-3800-45e2-abf4-41597a15f0c3-scripts\") pod \"d992f450-3800-45e2-abf4-41597a15f0c3\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:55.999192 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"d992f450-3800-45e2-abf4-41597a15f0c3\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 
07:24:55.999230 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d992f450-3800-45e2-abf4-41597a15f0c3-ovsdb-rundir\") pod \"d992f450-3800-45e2-abf4-41597a15f0c3\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:55.999261 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d992f450-3800-45e2-abf4-41597a15f0c3-ovsdbserver-sb-tls-certs\") pod \"d992f450-3800-45e2-abf4-41597a15f0c3\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:55.999285 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8chx\" (UniqueName: \"kubernetes.io/projected/d992f450-3800-45e2-abf4-41597a15f0c3-kube-api-access-n8chx\") pod \"d992f450-3800-45e2-abf4-41597a15f0c3\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:55.999318 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d992f450-3800-45e2-abf4-41597a15f0c3-combined-ca-bundle\") pod \"d992f450-3800-45e2-abf4-41597a15f0c3\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:55.999398 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d992f450-3800-45e2-abf4-41597a15f0c3-metrics-certs-tls-certs\") pod \"d992f450-3800-45e2-abf4-41597a15f0c3\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:55.999453 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d992f450-3800-45e2-abf4-41597a15f0c3-config\") pod \"d992f450-3800-45e2-abf4-41597a15f0c3\" (UID: \"d992f450-3800-45e2-abf4-41597a15f0c3\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.001283 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d992f450-3800-45e2-abf4-41597a15f0c3-config" (OuterVolumeSpecName: "config") pod "d992f450-3800-45e2-abf4-41597a15f0c3" (UID: "d992f450-3800-45e2-abf4-41597a15f0c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.001826 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d992f450-3800-45e2-abf4-41597a15f0c3-scripts" (OuterVolumeSpecName: "scripts") pod "d992f450-3800-45e2-abf4-41597a15f0c3" (UID: "d992f450-3800-45e2-abf4-41597a15f0c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.014035 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d992f450-3800-45e2-abf4-41597a15f0c3-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "d992f450-3800-45e2-abf4-41597a15f0c3" (UID: "d992f450-3800-45e2-abf4-41597a15f0c3"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.055576 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d992f450-3800-45e2-abf4-41597a15f0c3-kube-api-access-n8chx" (OuterVolumeSpecName: "kube-api-access-n8chx") pod "d992f450-3800-45e2-abf4-41597a15f0c3" (UID: "d992f450-3800-45e2-abf4-41597a15f0c3"). InnerVolumeSpecName "kube-api-access-n8chx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.055692 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "d992f450-3800-45e2-abf4-41597a15f0c3" (UID: "d992f450-3800-45e2-abf4-41597a15f0c3"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.097116 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="189b1a66-f6a7-4db2-bb51-b22d1725a41b" path="/var/lib/kubelet/pods/189b1a66-f6a7-4db2-bb51-b22d1725a41b/volumes" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.097788 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22cf633c-cf29-4f88-9ef2-693aee84d48d" path="/var/lib/kubelet/pods/22cf633c-cf29-4f88-9ef2-693aee84d48d/volumes" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.100675 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b4c1ed9-1abf-442c-a30c-92249cfd9fe4" path="/var/lib/kubelet/pods/2b4c1ed9-1abf-442c-a30c-92249cfd9fe4/volumes" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.101544 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c8203ff-5259-4d83-a96b-362be3884609" path="/var/lib/kubelet/pods/2c8203ff-5259-4d83-a96b-362be3884609/volumes" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.102395 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b45e803-7558-488a-bc9b-39982239b9a5" path="/var/lib/kubelet/pods/3b45e803-7558-488a-bc9b-39982239b9a5/volumes" Mar 14 07:24:56 crc kubenswrapper[5129]: E0314 07:24:56.103917 5129 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 07:24:56 crc kubenswrapper[5129]: container 
&Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 14 07:24:56 crc kubenswrapper[5129]: Mar 14 07:24:56 crc kubenswrapper[5129]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 14 07:24:56 crc kubenswrapper[5129]: Mar 14 07:24:56 crc kubenswrapper[5129]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 14 07:24:56 crc kubenswrapper[5129]: Mar 14 07:24:56 crc kubenswrapper[5129]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 14 07:24:56 crc kubenswrapper[5129]: Mar 14 07:24:56 crc kubenswrapper[5129]: if [ -n "barbican" ]; then Mar 14 07:24:56 crc kubenswrapper[5129]: GRANT_DATABASE="barbican" Mar 14 07:24:56 crc kubenswrapper[5129]: else Mar 14 07:24:56 crc kubenswrapper[5129]: GRANT_DATABASE="*" Mar 14 07:24:56 crc kubenswrapper[5129]: fi Mar 14 07:24:56 crc kubenswrapper[5129]: Mar 14 07:24:56 crc kubenswrapper[5129]: # going for maximum compatibility here: Mar 14 07:24:56 crc kubenswrapper[5129]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 14 07:24:56 crc kubenswrapper[5129]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 14 07:24:56 crc kubenswrapper[5129]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 14 07:24:56 crc kubenswrapper[5129]: # support updates Mar 14 07:24:56 crc kubenswrapper[5129]: Mar 14 07:24:56 crc kubenswrapper[5129]: $MYSQL_CMD < logger="UnhandledError" Mar 14 07:24:56 crc kubenswrapper[5129]: E0314 07:24:56.106296 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-c51b-account-create-update-crv5x" podUID="2de15208-b058-4b80-886d-6b93469faac0" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.111647 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4545cab3-7bdb-4e55-95b6-cbb057d2bbf9" path="/var/lib/kubelet/pods/4545cab3-7bdb-4e55-95b6-cbb057d2bbf9/volumes" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.114140 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7" path="/var/lib/kubelet/pods/5ef04cbd-e3a5-406a-a7aa-f58dc2d5feb7/volumes" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.114536 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-cfdh9" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovsdb-server" containerID="cri-o://d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" gracePeriod=30 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.114959 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f4ee137-42c5-4c71-943f-767cd4c43b5b" path="/var/lib/kubelet/pods/7f4ee137-42c5-4c71-943f-767cd4c43b5b/volumes" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.115589 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81a20ac3-5616-4b0b-9fd3-09ca4d863c24" path="/var/lib/kubelet/pods/81a20ac3-5616-4b0b-9fd3-09ca4d863c24/volumes" Mar 14 07:24:56 
crc kubenswrapper[5129]: I0314 07:24:56.142641 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa95bffb-d8b1-484f-9802-37d9793ef659" path="/var/lib/kubelet/pods/aa95bffb-d8b1-484f-9802-37d9793ef659/volumes" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.142774 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d992f450-3800-45e2-abf4-41597a15f0c3-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.142814 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d992f450-3800-45e2-abf4-41597a15f0c3-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.142843 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.142866 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d992f450-3800-45e2-abf4-41597a15f0c3-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.142879 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8chx\" (UniqueName: \"kubernetes.io/projected/d992f450-3800-45e2-abf4-41597a15f0c3-kube-api-access-n8chx\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.144155 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba7d80f5-edb1-4774-8072-6af87a16888f" path="/var/lib/kubelet/pods/ba7d80f5-edb1-4774-8072-6af87a16888f/volumes" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.145587 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="babd1361-ef0f-4921-905f-cbb1af24beea" 
path="/var/lib/kubelet/pods/babd1361-ef0f-4921-905f-cbb1af24beea/volumes" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.147475 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf76f7dd-0392-4e68-a441-c982f7055f24" path="/var/lib/kubelet/pods/bf76f7dd-0392-4e68-a441-c982f7055f24/volumes" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.148167 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e40311-e1d7-4a06-895e-9681160e38da" path="/var/lib/kubelet/pods/d6e40311-e1d7-4a06-895e-9681160e38da/volumes" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.149376 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd17de58-336d-485a-aef2-dbab072a4007" path="/var/lib/kubelet/pods/dd17de58-336d-485a-aef2-dbab072a4007/volumes" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.150002 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb" path="/var/lib/kubelet/pods/e2c82cd2-eaef-43f5-b14d-f892c3c8d0fb/volumes" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.151012 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1170613-23db-4ba5-9e73-d74a8f68fa8e" path="/var/lib/kubelet/pods/f1170613-23db-4ba5-9e73-d74a8f68fa8e/volumes" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.294694 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-cfdh9" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovs-vswitchd" containerID="cri-o://dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" gracePeriod=30 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.301220 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d992f450-3800-45e2-abf4-41597a15f0c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d992f450-3800-45e2-abf4-41597a15f0c3" (UID: 
"d992f450-3800-45e2-abf4-41597a15f0c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.329103 5129 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.362660 5129 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.362698 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d992f450-3800-45e2-abf4-41597a15f0c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: E0314 07:24:56.362802 5129 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 14 07:24:56 crc kubenswrapper[5129]: E0314 07:24:56.362918 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-config-data podName:f6261e6b-f331-4dcb-8380-167e8f547e1b nodeName:}" failed. No retries permitted until 2026-03-14 07:24:58.362878746 +0000 UTC m=+1561.114793930 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "f6261e6b-f331-4dcb-8380-167e8f547e1b") : configmap "rabbitmq-cell1-config-data" not found Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.381758 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d992f450-3800-45e2-abf4-41597a15f0c3-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d992f450-3800-45e2-abf4-41597a15f0c3" (UID: "d992f450-3800-45e2-abf4-41597a15f0c3"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.435347 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7/ovsdbserver-nb/0.log" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.435760 5129 generic.go:334] "Generic (PLEG): container finished" podID="f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" containerID="ae523da99b25c8f299915e49323351372ae554b450e86eb9aa2d7dd42d03c9b4" exitCode=2 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.435777 5129 generic.go:334] "Generic (PLEG): container finished" podID="f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" containerID="dab8a5ebc5177ef76172886ed36e2482dbb0fd550653e8a3c821777fa2ad984e" exitCode=143 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.443342 5129 generic.go:334] "Generic (PLEG): container finished" podID="b0a5119c-8784-48e5-841a-654dc253f0d0" containerID="487da9597dbcf0671505d2cad76544a39daed693b3625176056d417c8e515636" exitCode=143 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.464633 5129 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d992f450-3800-45e2-abf4-41597a15f0c3-metrics-certs-tls-certs\") on node \"crc\" DevicePath 
\"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.472980 5129 generic.go:334] "Generic (PLEG): container finished" podID="9004e305-ef45-437a-a38c-b50c9d1f1ff7" containerID="df33d7fbb10361dc5cb6e14bde3e758c886880e563b624273a3f21cdeda670f9" exitCode=0 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.485647 5129 generic.go:334] "Generic (PLEG): container finished" podID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" exitCode=0 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.488285 5129 generic.go:334] "Generic (PLEG): container finished" podID="1a3fad4b-8e44-471d-b262-27d6a7e05276" containerID="b6c1654757c5b72874d5b9a46ca2254eeb71fc65f978afb406f43a49c77a25ae" exitCode=143 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.491133 5129 generic.go:334] "Generic (PLEG): container finished" podID="6dcacc6d-2066-4bbf-a65c-8ff457d6235b" containerID="8d3e839655b386c621e99ce51cd89ad667b865776cb2c358ac6411cb869d4ee6" exitCode=137 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.493115 5129 generic.go:334] "Generic (PLEG): container finished" podID="e4553e57-9090-44cb-a8af-7297e4c624c0" containerID="bc3211d9096a638fa8c4213b5bf198de4c37facb10847c262d7a2dcc1726dfc9" exitCode=0 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.496995 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-p2z68_9d934f07-49c3-4356-ae16-0c35f0935625/openstack-network-exporter/0.log" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.497050 5129 generic.go:334] "Generic (PLEG): container finished" podID="9d934f07-49c3-4356-ae16-0c35f0935625" containerID="252a4f363d08c5b6cdff2d3573a9472b02174c2666aea18e478d986cc1704a09" exitCode=2 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.515738 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d992f450-3800-45e2-abf4-41597a15f0c3-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "d992f450-3800-45e2-abf4-41597a15f0c3" (UID: "d992f450-3800-45e2-abf4-41597a15f0c3"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520540 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7","Type":"ContainerDied","Data":"ae523da99b25c8f299915e49323351372ae554b450e86eb9aa2d7dd42d03c9b4"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520571 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7","Type":"ContainerDied","Data":"dab8a5ebc5177ef76172886ed36e2482dbb0fd550653e8a3c821777fa2ad984e"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520584 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b0a5119c-8784-48e5-841a-654dc253f0d0","Type":"ContainerDied","Data":"487da9597dbcf0671505d2cad76544a39daed693b3625176056d417c8e515636"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520617 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mvf6f"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520636 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520656 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-48dp4"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520668 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" 
event={"ID":"9004e305-ef45-437a-a38c-b50c9d1f1ff7","Type":"ContainerDied","Data":"df33d7fbb10361dc5cb6e14bde3e758c886880e563b624273a3f21cdeda670f9"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520680 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c51b-account-create-update-crv5x"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520689 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-ee9b-account-create-update-xxxtd"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520697 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" event={"ID":"9004e305-ef45-437a-a38c-b50c9d1f1ff7","Type":"ContainerDied","Data":"88fdc29c1e015191e1cd403fdfa3d15327812e9394854fd6ac94fe015610d680"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520707 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88fdc29c1e015191e1cd403fdfa3d15327812e9394854fd6ac94fe015610d680" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520719 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-ee9b-account-create-update-xxxtd"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520729 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cfdh9" event={"ID":"a67a2a4b-fa14-43cf-983b-45df5afc8e4e","Type":"ContainerDied","Data":"d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520740 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7957bb5589-vf68m"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520751 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520761 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"1a3fad4b-8e44-471d-b262-27d6a7e05276","Type":"ContainerDied","Data":"b6c1654757c5b72874d5b9a46ca2254eeb71fc65f978afb406f43a49c77a25ae"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520775 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fd6bbc76c-9rshh" event={"ID":"e4553e57-9090-44cb-a8af-7297e4c624c0","Type":"ContainerDied","Data":"bc3211d9096a638fa8c4213b5bf198de4c37facb10847c262d7a2dcc1726dfc9"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520785 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p2z68" event={"ID":"9d934f07-49c3-4356-ae16-0c35f0935625","Type":"ContainerDied","Data":"252a4f363d08c5b6cdff2d3573a9472b02174c2666aea18e478d986cc1704a09"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520795 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p2z68" event={"ID":"9d934f07-49c3-4356-ae16-0c35f0935625","Type":"ContainerDied","Data":"c9654dbe22fe7af31a336802f519f1aa82324ec7af4b67905af69fa1889657f2"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520804 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9654dbe22fe7af31a336802f519f1aa82324ec7af4b67905af69fa1889657f2" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520812 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-fwnrp"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520823 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-fwnrp"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520833 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-7ggl5"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520845 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-7ggl5"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520853 
5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-696c7b8d5f-j2w5q"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520863 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c51b-account-create-update-crv5x"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520878 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-54888cd7bb-schqh"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520890 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6cc754bc48-djssr"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.520955 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="73a06d78-48be-4099-b7fa-be0557b6138e" containerName="glance-log" containerID="cri-o://a0297ed0b9af03c2d6e5fa02d749275ab59da2eca117907d6e9e8d9642cba071" gracePeriod=30 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.521070 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6cc754bc48-djssr" podUID="9d7fc10c-3f26-4459-9577-e7f09371a44b" containerName="barbican-api-log" containerID="cri-o://b7fedc334b9aeda9f3eff633b2bb5ad2a6353604791c1e6f2adeb911962cfdac" gracePeriod=30 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.521249 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7957bb5589-vf68m" podUID="26087e66-e9e8-451d-80b4-d288468202f1" containerName="proxy-httpd" containerID="cri-o://94ccaa2244dfc8d149d667d1f5c396aa3ecfa3171455d755f9e5a589e59121ff" gracePeriod=30 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.521659 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6cc754bc48-djssr" podUID="9d7fc10c-3f26-4459-9577-e7f09371a44b" containerName="barbican-api" 
containerID="cri-o://f0d88a61613b0d796589b600c918f3c42969fe8081ec6149ed9e97446ae73149" gracePeriod=30 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.521725 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="73a06d78-48be-4099-b7fa-be0557b6138e" containerName="glance-httpd" containerID="cri-o://94ff5f18b233b64c1c783dbe228636e28477df4e6e43805ef408f8e1f99138ec" gracePeriod=30 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.522008 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7957bb5589-vf68m" podUID="26087e66-e9e8-451d-80b4-d288468202f1" containerName="proxy-server" containerID="cri-o://827ff25380bea7a7b669c68f4f4faa1199ffe0abbccc374df9dfa9bf6a471dee" gracePeriod=30 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.537750 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" podUID="30ca9513-5ae9-4520-8012-3c941786ce2a" containerName="barbican-keystone-listener-log" containerID="cri-o://d76db94856217343e1382a613c04bd6a275d213a9da38e6ea54fa7e5058c934f" gracePeriod=30 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.537883 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-696c7b8d5f-j2w5q" podUID="233604f0-adda-4669-b868-b96791d98bca" containerName="barbican-worker-log" containerID="cri-o://d22ea494cb98d26b0c0cd83af3d1839afa617384538263f11bce0e451a9ccbff" gracePeriod=30 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.537919 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" podUID="30ca9513-5ae9-4520-8012-3c941786ce2a" containerName="barbican-keystone-listener" containerID="cri-o://ee25cb26a5aa1f0bb656d11fce4519f1563d782dbc7cf1fd218ebf2964d004ce" gracePeriod=30 Mar 14 07:24:56 crc 
kubenswrapper[5129]: I0314 07:24:56.538038 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-696c7b8d5f-j2w5q" podUID="233604f0-adda-4669-b868-b96791d98bca" containerName="barbican-worker" containerID="cri-o://3617459b9e0934de4b0cdcd527b0ea212329a10cd6b69d43c83978c12d92190e" gracePeriod=30 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.571888 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d992f450-3800-45e2-abf4-41597a15f0c3-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.573486 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0796-account-create-update-crlmm"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.577758 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-p2z68_9d934f07-49c3-4356-ae16-0c35f0935625/openstack-network-exporter/0.log" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.577823 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-p2z68" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.587070 5129 generic.go:334] "Generic (PLEG): container finished" podID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerID="8dfc0ae842b54e831f58e889aa510e0e67ccf64cc21775a958610f8a34d354da" exitCode=0 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594536 5129 generic.go:334] "Generic (PLEG): container finished" podID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerID="387ec0f7caa0f65ef81af98a02901cb03b71d081ad6cad271dfe418e7919ac2a" exitCode=0 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594576 5129 generic.go:334] "Generic (PLEG): container finished" podID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerID="1439671428e36f062ecec5e844057f831307ced7145f7e08d2685ea64612ca03" exitCode=0 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594583 5129 generic.go:334] "Generic (PLEG): container finished" podID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerID="21d34fb8c8ae530f947029ee8db5c31bb1fff7d650c0adaa7c67025d5674e1a9" exitCode=0 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594609 5129 generic.go:334] "Generic (PLEG): container finished" podID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerID="d28de17046ef16ad3ac1f241f5db591353ee653132756124d068067d81af0578" exitCode=0 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594617 5129 generic.go:334] "Generic (PLEG): container finished" podID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerID="3e9b50ce398570eef3d701e17b2478e6a2800a85b0238429627e5231237c15ba" exitCode=0 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594624 5129 generic.go:334] "Generic (PLEG): container finished" podID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerID="de4e9cf22659332ab4a321d87b2be0419c0f5ba9584d4883e24f44b8cd760689" exitCode=0 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594630 5129 generic.go:334] "Generic (PLEG): container finished" 
podID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerID="3059f4424e061a65f5c7669f2d08d51ae956e9986395c5c1a40f1a6c63f29fe4" exitCode=0 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594637 5129 generic.go:334] "Generic (PLEG): container finished" podID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerID="1bfacbb3c7f4e07f82826af5ace3d723b967a5f7ed7522a9988f754dcb73cd76" exitCode=0 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594643 5129 generic.go:334] "Generic (PLEG): container finished" podID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerID="ea8689b6701500735737f4995bd8adb2dcef261718f8b5bd911de435e0c3a742" exitCode=0 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594652 5129 generic.go:334] "Generic (PLEG): container finished" podID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerID="92720726c9e708458b5da5758e478b9521c17f64ffaa2789a7359d2da3f264d0" exitCode=0 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594658 5129 generic.go:334] "Generic (PLEG): container finished" podID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerID="f1efa38796d71cf5fcb64262b9513b8dbb5d0472b2f9b02765e6507ec1e1670d" exitCode=0 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594665 5129 generic.go:334] "Generic (PLEG): container finished" podID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerID="89c397e9fa15d1498f1dfb3620f6d0877fdad87ff4762961a8804fdb1503a9fc" exitCode=0 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594671 5129 generic.go:334] "Generic (PLEG): container finished" podID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerID="89fd8247fe552c4e8c71705aad2c39088f4d80948c9776ee701e127517dfcd2e" exitCode=0 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.589313 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerDied","Data":"8dfc0ae842b54e831f58e889aa510e0e67ccf64cc21775a958610f8a34d354da"} Mar 14 07:24:56 crc 
kubenswrapper[5129]: I0314 07:24:56.594811 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerDied","Data":"387ec0f7caa0f65ef81af98a02901cb03b71d081ad6cad271dfe418e7919ac2a"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594830 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerDied","Data":"1439671428e36f062ecec5e844057f831307ced7145f7e08d2685ea64612ca03"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594842 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerDied","Data":"21d34fb8c8ae530f947029ee8db5c31bb1fff7d650c0adaa7c67025d5674e1a9"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594850 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerDied","Data":"d28de17046ef16ad3ac1f241f5db591353ee653132756124d068067d81af0578"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594859 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerDied","Data":"3e9b50ce398570eef3d701e17b2478e6a2800a85b0238429627e5231237c15ba"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594869 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerDied","Data":"de4e9cf22659332ab4a321d87b2be0419c0f5ba9584d4883e24f44b8cd760689"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594879 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerDied","Data":"3059f4424e061a65f5c7669f2d08d51ae956e9986395c5c1a40f1a6c63f29fe4"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594888 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerDied","Data":"1bfacbb3c7f4e07f82826af5ace3d723b967a5f7ed7522a9988f754dcb73cd76"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594896 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerDied","Data":"ea8689b6701500735737f4995bd8adb2dcef261718f8b5bd911de435e0c3a742"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594906 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerDied","Data":"92720726c9e708458b5da5758e478b9521c17f64ffaa2789a7359d2da3f264d0"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594914 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerDied","Data":"f1efa38796d71cf5fcb64262b9513b8dbb5d0472b2f9b02765e6507ec1e1670d"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594922 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerDied","Data":"89c397e9fa15d1498f1dfb3620f6d0877fdad87ff4762961a8804fdb1503a9fc"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.594932 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerDied","Data":"89fd8247fe552c4e8c71705aad2c39088f4d80948c9776ee701e127517dfcd2e"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 
07:24:56.599581 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-168c-account-create-update-bgp52"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.604147 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-48dp4" event={"ID":"b77f36b2-be7b-43cb-ada4-74f524396018","Type":"ContainerStarted","Data":"daa4ba320ebd546664278d84833e3177535913a5398cfaaa0deb972d033e7f3b"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.604539 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.605771 5129 generic.go:334] "Generic (PLEG): container finished" podID="8da79e9b-0c3f-4d66-9813-08116725c6a4" containerID="c8a2eb3d81e166b08f15ce66729e9485cc211076230f60d2b3882494534f0f34" exitCode=143 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.605809 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75b4586cb8-pfpxj" event={"ID":"8da79e9b-0c3f-4d66-9813-08116725c6a4","Type":"ContainerDied","Data":"c8a2eb3d81e166b08f15ce66729e9485cc211076230f60d2b3882494534f0f34"} Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.612566 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c51b-account-create-update-crv5x" event={"ID":"2de15208-b058-4b80-886d-6b93469faac0","Type":"ContainerStarted","Data":"2f241bb99f034ae426d36fd3e9d916cdbcb4dda97344b82a4ef4ca508a6ff78c"} Mar 14 07:24:56 crc kubenswrapper[5129]: E0314 07:24:56.614268 5129 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 07:24:56 crc kubenswrapper[5129]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 14 07:24:56 crc kubenswrapper[5129]: Mar 14 07:24:56 crc kubenswrapper[5129]: 
MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 14 07:24:56 crc kubenswrapper[5129]: Mar 14 07:24:56 crc kubenswrapper[5129]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 14 07:24:56 crc kubenswrapper[5129]: Mar 14 07:24:56 crc kubenswrapper[5129]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 14 07:24:56 crc kubenswrapper[5129]: Mar 14 07:24:56 crc kubenswrapper[5129]: if [ -n "barbican" ]; then Mar 14 07:24:56 crc kubenswrapper[5129]: GRANT_DATABASE="barbican" Mar 14 07:24:56 crc kubenswrapper[5129]: else Mar 14 07:24:56 crc kubenswrapper[5129]: GRANT_DATABASE="*" Mar 14 07:24:56 crc kubenswrapper[5129]: fi Mar 14 07:24:56 crc kubenswrapper[5129]: Mar 14 07:24:56 crc kubenswrapper[5129]: # going for maximum compatibility here: Mar 14 07:24:56 crc kubenswrapper[5129]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 14 07:24:56 crc kubenswrapper[5129]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 14 07:24:56 crc kubenswrapper[5129]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 14 07:24:56 crc kubenswrapper[5129]: # support updates Mar 14 07:24:56 crc kubenswrapper[5129]: Mar 14 07:24:56 crc kubenswrapper[5129]: $MYSQL_CMD < logger="UnhandledError" Mar 14 07:24:56 crc kubenswrapper[5129]: E0314 07:24:56.615805 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-c51b-account-create-update-crv5x" podUID="2de15208-b058-4b80-886d-6b93469faac0" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.632851 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.634478 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f12ef33b-b86d-4c80-8f19-385ff5a93fee" containerName="nova-metadata-log" containerID="cri-o://73a607e1cd40ac4f3d4872afe8c005319bf76b10db5cdba8a1d09fc3f83dead5" gracePeriod=30 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.634641 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f12ef33b-b86d-4c80-8f19-385ff5a93fee" containerName="nova-metadata-metadata" containerID="cri-o://1424db197590997e7d0bf3ba8eb822d66a08b3447321a7763baa443a1590f95f" gracePeriod=30 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.647006 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d992f450-3800-45e2-abf4-41597a15f0c3/ovsdbserver-sb/0.log" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.647060 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d992f450-3800-45e2-abf4-41597a15f0c3","Type":"ContainerDied","Data":"1fb694959f161f9b05e2ba82059447ae12e5957805fb401864e8376aa812141b"} 
Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.647094 5129 scope.go:117] "RemoveContainer" containerID="9e3b177f1620bb347c28bfeccd9f33da12c8938fc3fa807f4c5eaa40e5094650" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.647241 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.653204 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-8mzrf"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.662141 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="d291cef2-24d8-4ae6-aa4f-dfa8e782db15" containerName="rabbitmq" containerID="cri-o://dec65be26b8c4e1660c706156e48f471c694d12966264082fc806b6ecd1fc6c1" gracePeriod=604800 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.662447 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.666264 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-8mzrf"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.672894 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-dwf9c"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.674701 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-dns-svc\") pod \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.674973 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-config\") pod \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\" (UID: 
\"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.675012 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d934f07-49c3-4356-ae16-0c35f0935625-config\") pod \"9d934f07-49c3-4356-ae16-0c35f0935625\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.675121 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-dns-swift-storage-0\") pod \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.676233 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9d934f07-49c3-4356-ae16-0c35f0935625-ovn-rundir\") pod \"9d934f07-49c3-4356-ae16-0c35f0935625\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.679335 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phz5b\" (UniqueName: \"kubernetes.io/projected/9d934f07-49c3-4356-ae16-0c35f0935625-kube-api-access-phz5b\") pod \"9d934f07-49c3-4356-ae16-0c35f0935625\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.680016 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-ovsdbserver-nb\") pod \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.680116 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv2pn\" (UniqueName: 
\"kubernetes.io/projected/9004e305-ef45-437a-a38c-b50c9d1f1ff7-kube-api-access-gv2pn\") pod \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.680173 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d934f07-49c3-4356-ae16-0c35f0935625-metrics-certs-tls-certs\") pod \"9d934f07-49c3-4356-ae16-0c35f0935625\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.680212 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-ovsdbserver-sb\") pod \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\" (UID: \"9004e305-ef45-437a-a38c-b50c9d1f1ff7\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.680294 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d934f07-49c3-4356-ae16-0c35f0935625-combined-ca-bundle\") pod \"9d934f07-49c3-4356-ae16-0c35f0935625\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.680337 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9d934f07-49c3-4356-ae16-0c35f0935625-ovs-rundir\") pod \"9d934f07-49c3-4356-ae16-0c35f0935625\" (UID: \"9d934f07-49c3-4356-ae16-0c35f0935625\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.681754 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d934f07-49c3-4356-ae16-0c35f0935625-config" (OuterVolumeSpecName: "config") pod "9d934f07-49c3-4356-ae16-0c35f0935625" (UID: "9d934f07-49c3-4356-ae16-0c35f0935625"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.686453 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-dwf9c"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.688867 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d934f07-49c3-4356-ae16-0c35f0935625-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "9d934f07-49c3-4356-ae16-0c35f0935625" (UID: "9d934f07-49c3-4356-ae16-0c35f0935625"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.689260 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d934f07-49c3-4356-ae16-0c35f0935625-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "9d934f07-49c3-4356-ae16-0c35f0935625" (UID: "9d934f07-49c3-4356-ae16-0c35f0935625"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.693803 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7/ovsdbserver-nb/0.log" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.693966 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.700194 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.710448 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-dsnv9"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.721049 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-dsnv9"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.722130 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d934f07-49c3-4356-ae16-0c35f0935625-kube-api-access-phz5b" (OuterVolumeSpecName: "kube-api-access-phz5b") pod "9d934f07-49c3-4356-ae16-0c35f0935625" (UID: "9d934f07-49c3-4356-ae16-0c35f0935625"). InnerVolumeSpecName "kube-api-access-phz5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.728884 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9004e305-ef45-437a-a38c-b50c9d1f1ff7-kube-api-access-gv2pn" (OuterVolumeSpecName: "kube-api-access-gv2pn") pod "9004e305-ef45-437a-a38c-b50c9d1f1ff7" (UID: "9004e305-ef45-437a-a38c-b50c9d1f1ff7"). InnerVolumeSpecName "kube-api-access-gv2pn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.732261 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-3b40-account-create-update-nrtft"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.739893 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.740138 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6b15058f-0936-4bb9-ad72-1c27661b4b82" containerName="nova-api-log" containerID="cri-o://57f2511f2dd582a994211348b57df514b82da119c5796e9b0a9f36eaf93af695" gracePeriod=30 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.740283 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6b15058f-0936-4bb9-ad72-1c27661b4b82" containerName="nova-api-api" containerID="cri-o://739b061c527066552e175c0fbe0488722ce25d75896eb047b0bd920e05d3e6cb" gracePeriod=30 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.748532 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.748803 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="f7c2c75d-2b97-4e81-9701-486cee85dd93" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a0a68df17c077a14641c9e1331ad90f35cbd875eefb6a62fd05796b5756983a2" gracePeriod=30 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.755940 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.781846 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-combined-ca-bundle\") pod \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\" (UID: \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.781957 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-openstack-config\") pod \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\" (UID: \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.781977 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-ovsdbserver-nb-tls-certs\") pod \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.782009 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-config\") pod \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.782036 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.782072 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-ovsdb-rundir\") pod \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.782113 5129 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f4vx\" (UniqueName: \"kubernetes.io/projected/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-kube-api-access-2f4vx\") pod \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\" (UID: \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.782157 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ptn6\" (UniqueName: \"kubernetes.io/projected/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-kube-api-access-7ptn6\") pod \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.782200 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-openstack-config-secret\") pod \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\" (UID: \"6dcacc6d-2066-4bbf-a65c-8ff457d6235b\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.782215 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-metrics-certs-tls-certs\") pod \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.782246 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-scripts\") pod \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.782309 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-combined-ca-bundle\") pod \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\" (UID: \"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7\") " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.783055 5129 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9d934f07-49c3-4356-ae16-0c35f0935625-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.783069 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phz5b\" (UniqueName: \"kubernetes.io/projected/9d934f07-49c3-4356-ae16-0c35f0935625-kube-api-access-phz5b\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.783079 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv2pn\" (UniqueName: \"kubernetes.io/projected/9004e305-ef45-437a-a38c-b50c9d1f1ff7-kube-api-access-gv2pn\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.783087 5129 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9d934f07-49c3-4356-ae16-0c35f0935625-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.783095 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d934f07-49c3-4356-ae16-0c35f0935625-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.788735 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.801705 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-config" (OuterVolumeSpecName: "config") pod "f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" (UID: "f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.801768 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.801847 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" (UID: "f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.812592 5129 scope.go:117] "RemoveContainer" containerID="27a037ff17952e87a1fb3f9495575e1bd75902ebd3287783d0d5ba6c1a9e147c" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.818508 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.818788 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f796158b-a0d2-4077-9c18-b91a594343fb" containerName="nova-scheduler-scheduler" containerID="cri-o://a8b30b499fcbe76c6c54c38cfe35751ce9f67a27193f239a4c3a4e773018fd63" gracePeriod=30 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.821316 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-kube-api-access-2f4vx" (OuterVolumeSpecName: "kube-api-access-2f4vx") pod "6dcacc6d-2066-4bbf-a65c-8ff457d6235b" (UID: "6dcacc6d-2066-4bbf-a65c-8ff457d6235b"). InnerVolumeSpecName "kube-api-access-2f4vx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.825220 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-scripts" (OuterVolumeSpecName: "scripts") pod "f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" (UID: "f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.845472 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-7957bb5589-vf68m" podUID="26087e66-e9e8-451d-80b4-d288468202f1" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.178:8080/healthcheck\": dial tcp 10.217.0.178:8080: connect: connection refused" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.845557 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-7957bb5589-vf68m" podUID="26087e66-e9e8-451d-80b4-d288468202f1" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.178:8080/healthcheck\": dial tcp 10.217.0.178:8080: connect: connection refused" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.877511 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" (UID: "f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.879241 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-twj2g"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.893799 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.893841 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.893853 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.893863 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f4vx\" (UniqueName: \"kubernetes.io/projected/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-kube-api-access-2f4vx\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.893873 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: E0314 07:24:56.894227 5129 secret.go:188] Couldn't get secret openstack/barbican-keystone-listener-config-data: secret "barbican-keystone-listener-config-data" not found Mar 14 07:24:56 crc kubenswrapper[5129]: E0314 07:24:56.894248 5129 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Mar 14 07:24:56 crc kubenswrapper[5129]: E0314 07:24:56.894314 5129 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data-custom podName:30ca9513-5ae9-4520-8012-3c941786ce2a nodeName:}" failed. No retries permitted until 2026-03-14 07:24:58.894290169 +0000 UTC m=+1561.646205363 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data-custom") pod "barbican-keystone-listener-54888cd7bb-schqh" (UID: "30ca9513-5ae9-4520-8012-3c941786ce2a") : secret "barbican-keystone-listener-config-data" not found Mar 14 07:24:56 crc kubenswrapper[5129]: E0314 07:24:56.894337 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data podName:30ca9513-5ae9-4520-8012-3c941786ce2a nodeName:}" failed. No retries permitted until 2026-03-14 07:24:58.89432602 +0000 UTC m=+1561.646241304 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data") pod "barbican-keystone-listener-54888cd7bb-schqh" (UID: "30ca9513-5ae9-4520-8012-3c941786ce2a") : secret "barbican-config-data" not found Mar 14 07:24:56 crc kubenswrapper[5129]: E0314 07:24:56.894348 5129 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Mar 14 07:24:56 crc kubenswrapper[5129]: E0314 07:24:56.894360 5129 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Mar 14 07:24:56 crc kubenswrapper[5129]: E0314 07:24:56.894367 5129 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 07:24:56 crc kubenswrapper[5129]: E0314 07:24:56.894379 5129 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap 
"swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 14 07:24:56 crc kubenswrapper[5129]: E0314 07:24:56.894415 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift podName:2c0c6778-6f35-4243-9e82-ca3c8f3968fc nodeName:}" failed. No retries permitted until 2026-03-14 07:24:58.894398423 +0000 UTC m=+1561.646313607 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift") pod "swift-storage-0" (UID: "2c0c6778-6f35-4243-9e82-ca3c8f3968fc") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.894758 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-kube-api-access-7ptn6" (OuterVolumeSpecName: "kube-api-access-7ptn6") pod "f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" (UID: "f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7"). InnerVolumeSpecName "kube-api-access-7ptn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.917512 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-twj2g"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.936767 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.937211 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="288de2f6-818d-4167-8511-76f958542fbd" containerName="nova-cell0-conductor-conductor" containerID="cri-o://c1fcdf4a873899697c71e1744e78008c9ad86529e509e2cf0e32fd619d7c1cb1" gracePeriod=30 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.946330 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="f6261e6b-f331-4dcb-8380-167e8f547e1b" containerName="rabbitmq" containerID="cri-o://277f280a8838bdfe5fc5653338c9173353a6393548f25eb001dedc1540db7dbb" gracePeriod=604800 Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.948702 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-168c-account-create-update-bgp52"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.951237 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "6dcacc6d-2066-4bbf-a65c-8ff457d6235b" (UID: "6dcacc6d-2066-4bbf-a65c-8ff457d6235b"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.956598 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-3b40-account-create-update-nrtft"] Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.981765 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6dcacc6d-2066-4bbf-a65c-8ff457d6235b" (UID: "6dcacc6d-2066-4bbf-a65c-8ff457d6235b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.996077 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ptn6\" (UniqueName: \"kubernetes.io/projected/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-kube-api-access-7ptn6\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.996109 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:56 crc kubenswrapper[5129]: I0314 07:24:56.996120 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.042518 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jmzpz"] Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.057866 5129 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.058194 5129 kubelet_pods.go:1007] 
"Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/nova-cell1-conductor-0" secret="" err="secret \"nova-nova-dockercfg-q47ks\" not found" Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.069292 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.069363 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jmzpz"] Mar 14 07:24:57 crc kubenswrapper[5129]: E0314 07:24:57.076642 5129 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 07:24:57 crc kubenswrapper[5129]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 14 07:24:57 crc kubenswrapper[5129]: Mar 14 07:24:57 crc kubenswrapper[5129]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 14 07:24:57 crc kubenswrapper[5129]: Mar 14 07:24:57 crc kubenswrapper[5129]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 14 07:24:57 crc kubenswrapper[5129]: Mar 14 07:24:57 crc kubenswrapper[5129]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 14 07:24:57 crc kubenswrapper[5129]: Mar 14 07:24:57 crc kubenswrapper[5129]: if [ -n "nova_cell0" ]; then Mar 14 07:24:57 crc kubenswrapper[5129]: GRANT_DATABASE="nova_cell0" Mar 14 07:24:57 crc kubenswrapper[5129]: else Mar 14 07:24:57 crc kubenswrapper[5129]: GRANT_DATABASE="*" Mar 14 07:24:57 crc kubenswrapper[5129]: fi Mar 14 07:24:57 crc kubenswrapper[5129]: Mar 14 07:24:57 crc kubenswrapper[5129]: # going for maximum compatibility here: Mar 14 07:24:57 crc kubenswrapper[5129]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 14 07:24:57 crc kubenswrapper[5129]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 14 07:24:57 crc kubenswrapper[5129]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 14 07:24:57 crc kubenswrapper[5129]: # support updates Mar 14 07:24:57 crc kubenswrapper[5129]: Mar 14 07:24:57 crc kubenswrapper[5129]: $MYSQL_CMD < logger="UnhandledError" Mar 14 07:24:57 crc kubenswrapper[5129]: E0314 07:24:57.079426 5129 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 07:24:57 crc kubenswrapper[5129]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 14 07:24:57 crc kubenswrapper[5129]: Mar 14 07:24:57 crc kubenswrapper[5129]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 14 07:24:57 crc kubenswrapper[5129]: Mar 14 07:24:57 crc kubenswrapper[5129]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 14 07:24:57 crc kubenswrapper[5129]: Mar 14 07:24:57 crc kubenswrapper[5129]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 14 07:24:57 crc kubenswrapper[5129]: Mar 14 07:24:57 crc kubenswrapper[5129]: if [ -n "nova_cell1" ]; then Mar 14 07:24:57 crc kubenswrapper[5129]: GRANT_DATABASE="nova_cell1" Mar 14 07:24:57 crc kubenswrapper[5129]: else Mar 14 07:24:57 crc kubenswrapper[5129]: GRANT_DATABASE="*" Mar 14 07:24:57 crc kubenswrapper[5129]: fi Mar 14 07:24:57 crc kubenswrapper[5129]: Mar 14 07:24:57 crc kubenswrapper[5129]: # going for maximum compatibility here: Mar 14 07:24:57 crc kubenswrapper[5129]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 14 07:24:57 crc kubenswrapper[5129]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 14 07:24:57 crc kubenswrapper[5129]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 14 07:24:57 crc kubenswrapper[5129]: # support updates Mar 14 07:24:57 crc kubenswrapper[5129]: Mar 14 07:24:57 crc kubenswrapper[5129]: $MYSQL_CMD < logger="UnhandledError" Mar 14 07:24:57 crc kubenswrapper[5129]: E0314 07:24:57.079878 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-3b40-account-create-update-nrtft" podUID="bf7c35ff-a6ef-4686-aa16-cc65d46d3527" Mar 14 07:24:57 crc kubenswrapper[5129]: E0314 07:24:57.080812 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-168c-account-create-update-bgp52" podUID="1be17304-8fa4-4ec4-893d-3094d703893a" Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.097976 5129 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:57 crc kubenswrapper[5129]: E0314 07:24:57.098119 5129 secret.go:188] Couldn't get secret openstack/nova-cell1-conductor-config-data: secret "nova-cell1-conductor-config-data" not found Mar 14 07:24:57 crc kubenswrapper[5129]: E0314 07:24:57.098166 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-config-data podName:df407ca4-4a5d-404c-ab22-89bcde2439c4 nodeName:}" failed. No retries permitted until 2026-03-14 07:24:57.598151093 +0000 UTC m=+1560.350066267 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-config-data") pod "nova-cell1-conductor-0" (UID: "df407ca4-4a5d-404c-ab22-89bcde2439c4") : secret "nova-cell1-conductor-config-data" not found Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.125531 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9004e305-ef45-437a-a38c-b50c9d1f1ff7" (UID: "9004e305-ef45-437a-a38c-b50c9d1f1ff7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.128894 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9004e305-ef45-437a-a38c-b50c9d1f1ff7" (UID: "9004e305-ef45-437a-a38c-b50c9d1f1ff7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.131382 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-config" (OuterVolumeSpecName: "config") pod "9004e305-ef45-437a-a38c-b50c9d1f1ff7" (UID: "9004e305-ef45-437a-a38c-b50c9d1f1ff7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.157104 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0796-account-create-update-crlmm"] Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.164727 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" (UID: "f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.186064 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" (UID: "f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.203494 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.203534 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-config\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.203547 5129 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.203559 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.203570 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:57 crc kubenswrapper[5129]: E0314 07:24:57.204397 5129 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 14 07:24:57 crc kubenswrapper[5129]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash
Mar 14 07:24:57 crc kubenswrapper[5129]:
Mar 14 07:24:57 crc kubenswrapper[5129]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Mar 14 07:24:57 crc kubenswrapper[5129]:
Mar 14 07:24:57 crc kubenswrapper[5129]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Mar 14 07:24:57 crc kubenswrapper[5129]:
Mar 14 07:24:57 crc kubenswrapper[5129]: MYSQL_CMD="mysql -h -u root -P 3306"
Mar 14 07:24:57 crc kubenswrapper[5129]:
Mar 14 07:24:57 crc kubenswrapper[5129]: if [ -n "nova_api" ]; then
Mar 14 07:24:57 crc kubenswrapper[5129]: GRANT_DATABASE="nova_api"
Mar 14 07:24:57 crc kubenswrapper[5129]: else
Mar 14 07:24:57 crc kubenswrapper[5129]: GRANT_DATABASE="*"
Mar 14 07:24:57 crc kubenswrapper[5129]: fi
Mar 14 07:24:57 crc kubenswrapper[5129]:
Mar 14 07:24:57 crc kubenswrapper[5129]: # going for maximum compatibility here:
Mar 14 07:24:57 crc kubenswrapper[5129]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Mar 14 07:24:57 crc kubenswrapper[5129]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Mar 14 07:24:57 crc kubenswrapper[5129]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Mar 14 07:24:57 crc kubenswrapper[5129]: # support updates
Mar 14 07:24:57 crc kubenswrapper[5129]:
Mar 14 07:24:57 crc kubenswrapper[5129]: $MYSQL_CMD < logger="UnhandledError"
Mar 14 07:24:57 crc kubenswrapper[5129]: E0314 07:24:57.206106 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-0796-account-create-update-crlmm" podUID="87ce93e5-3bef-4e45-a23a-4164fb7aed7a"
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.216317 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9004e305-ef45-437a-a38c-b50c9d1f1ff7" (UID: "9004e305-ef45-437a-a38c-b50c9d1f1ff7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.225657 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9004e305-ef45-437a-a38c-b50c9d1f1ff7" (UID: "9004e305-ef45-437a-a38c-b50c9d1f1ff7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.238795 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" (UID: "f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.241571 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6dcacc6d-2066-4bbf-a65c-8ff457d6235b" (UID: "6dcacc6d-2066-4bbf-a65c-8ff457d6235b"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.246425 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d934f07-49c3-4356-ae16-0c35f0935625-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d934f07-49c3-4356-ae16-0c35f0935625" (UID: "9d934f07-49c3-4356-ae16-0c35f0935625"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.251452 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d934f07-49c3-4356-ae16-0c35f0935625-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "9d934f07-49c3-4356-ae16-0c35f0935625" (UID: "9d934f07-49c3-4356-ae16-0c35f0935625"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.305550 5129 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d934f07-49c3-4356-ae16-0c35f0935625-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.305588 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.305615 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d934f07-49c3-4356-ae16-0c35f0935625-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.305630 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9004e305-ef45-437a-a38c-b50c9d1f1ff7-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.305643 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6dcacc6d-2066-4bbf-a65c-8ff457d6235b-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.305654 5129 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.419672 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="54d01ce1-11e5-4fc7-a120-44d9d3407142" containerName="galera" containerID="cri-o://347a7b8e1e7a603f4afb767a2a29b743978ea8a3bd63e76edd3e8e66b4e8268c" gracePeriod=30
Mar 14 07:24:57 crc kubenswrapper[5129]: E0314 07:24:57.609439 5129 secret.go:188] Couldn't get secret openstack/nova-cell1-conductor-config-data: secret "nova-cell1-conductor-config-data" not found
Mar 14 07:24:57 crc kubenswrapper[5129]: E0314 07:24:57.609895 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-config-data podName:df407ca4-4a5d-404c-ab22-89bcde2439c4 nodeName:}" failed. No retries permitted until 2026-03-14 07:24:58.60987316 +0000 UTC m=+1561.361788344 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-config-data") pod "nova-cell1-conductor-0" (UID: "df407ca4-4a5d-404c-ab22-89bcde2439c4") : secret "nova-cell1-conductor-config-data" not found
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.656722 5129 generic.go:334] "Generic (PLEG): container finished" podID="b77f36b2-be7b-43cb-ada4-74f524396018" containerID="b6d52b75e1402ca8a13b76e96d5866748554a53c9657a25e1a3efcb0e8b1cb51" exitCode=1
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.657324 5129 scope.go:117] "RemoveContainer" containerID="b6d52b75e1402ca8a13b76e96d5866748554a53c9657a25e1a3efcb0e8b1cb51"
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.657762 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-48dp4" event={"ID":"b77f36b2-be7b-43cb-ada4-74f524396018","Type":"ContainerDied","Data":"b6d52b75e1402ca8a13b76e96d5866748554a53c9657a25e1a3efcb0e8b1cb51"}
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.663832 5129 generic.go:334] "Generic (PLEG): container finished" podID="7132e8d4-728a-4852-bcd9-833a9bd05878" containerID="0b249c015ec3a36e3def27e75975febe3abb0dc9b079bfbd50022f9693560988" exitCode=0
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.663857 5129 generic.go:334] "Generic (PLEG): container finished" podID="7132e8d4-728a-4852-bcd9-833a9bd05878" containerID="76f24021b697ad5485f9f5594cd9b1485876631247e65a9f1fe131e045674477" exitCode=0
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.663896 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7132e8d4-728a-4852-bcd9-833a9bd05878","Type":"ContainerDied","Data":"0b249c015ec3a36e3def27e75975febe3abb0dc9b079bfbd50022f9693560988"}
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.663920 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7132e8d4-728a-4852-bcd9-833a9bd05878","Type":"ContainerDied","Data":"76f24021b697ad5485f9f5594cd9b1485876631247e65a9f1fe131e045674477"}
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.665632 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3b40-account-create-update-nrtft" event={"ID":"bf7c35ff-a6ef-4686-aa16-cc65d46d3527","Type":"ContainerStarted","Data":"b737a0013fcb46739a94169afbfeb5ed3e6add1943daf8d02759145ac3a5ffb8"}
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.670202 5129 generic.go:334] "Generic (PLEG): container finished" podID="30ca9513-5ae9-4520-8012-3c941786ce2a" containerID="d76db94856217343e1382a613c04bd6a275d213a9da38e6ea54fa7e5058c934f" exitCode=143
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.670301 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" event={"ID":"30ca9513-5ae9-4520-8012-3c941786ce2a","Type":"ContainerDied","Data":"d76db94856217343e1382a613c04bd6a275d213a9da38e6ea54fa7e5058c934f"}
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.672688 5129 generic.go:334] "Generic (PLEG): container finished" podID="26087e66-e9e8-451d-80b4-d288468202f1" containerID="827ff25380bea7a7b669c68f4f4faa1199ffe0abbccc374df9dfa9bf6a471dee" exitCode=0
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.672697 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7957bb5589-vf68m" event={"ID":"26087e66-e9e8-451d-80b4-d288468202f1","Type":"ContainerDied","Data":"827ff25380bea7a7b669c68f4f4faa1199ffe0abbccc374df9dfa9bf6a471dee"}
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.672735 5129 generic.go:334] "Generic (PLEG): container finished" podID="26087e66-e9e8-451d-80b4-d288468202f1" containerID="94ccaa2244dfc8d149d667d1f5c396aa3ecfa3171455d755f9e5a589e59121ff" exitCode=0
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.672780 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7957bb5589-vf68m" event={"ID":"26087e66-e9e8-451d-80b4-d288468202f1","Type":"ContainerDied","Data":"94ccaa2244dfc8d149d667d1f5c396aa3ecfa3171455d755f9e5a589e59121ff"}
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.684629 5129 generic.go:334] "Generic (PLEG): container finished" podID="9d7fc10c-3f26-4459-9577-e7f09371a44b" containerID="b7fedc334b9aeda9f3eff633b2bb5ad2a6353604791c1e6f2adeb911962cfdac" exitCode=143
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.684697 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cc754bc48-djssr" event={"ID":"9d7fc10c-3f26-4459-9577-e7f09371a44b","Type":"ContainerDied","Data":"b7fedc334b9aeda9f3eff633b2bb5ad2a6353604791c1e6f2adeb911962cfdac"}
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.689755 5129 generic.go:334] "Generic (PLEG): container finished" podID="73a06d78-48be-4099-b7fa-be0557b6138e" containerID="a0297ed0b9af03c2d6e5fa02d749275ab59da2eca117907d6e9e8d9642cba071" exitCode=143
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.689834 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"73a06d78-48be-4099-b7fa-be0557b6138e","Type":"ContainerDied","Data":"a0297ed0b9af03c2d6e5fa02d749275ab59da2eca117907d6e9e8d9642cba071"}
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.692502 5129 generic.go:334] "Generic (PLEG): container finished" podID="f12ef33b-b86d-4c80-8f19-385ff5a93fee" containerID="73a607e1cd40ac4f3d4872afe8c005319bf76b10db5cdba8a1d09fc3f83dead5" exitCode=143
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.692538 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f12ef33b-b86d-4c80-8f19-385ff5a93fee","Type":"ContainerDied","Data":"73a607e1cd40ac4f3d4872afe8c005319bf76b10db5cdba8a1d09fc3f83dead5"}
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.693615 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0796-account-create-update-crlmm" event={"ID":"87ce93e5-3bef-4e45-a23a-4164fb7aed7a","Type":"ContainerStarted","Data":"4b0646115a7e327581c0e282ecd08d7966eb6a5884f3e6062a5a2b29a1f8806f"}
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.697167 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-168c-account-create-update-bgp52" event={"ID":"1be17304-8fa4-4ec4-893d-3094d703893a","Type":"ContainerStarted","Data":"5f223912778663c7e9c241ee776b319fa419c957bd65e96525729545131fdf5f"}
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.727228 5129 generic.go:334] "Generic (PLEG): container finished" podID="233604f0-adda-4669-b868-b96791d98bca" containerID="d22ea494cb98d26b0c0cd83af3d1839afa617384538263f11bce0e451a9ccbff" exitCode=143
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.727314 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-696c7b8d5f-j2w5q" event={"ID":"233604f0-adda-4669-b868-b96791d98bca","Type":"ContainerDied","Data":"d22ea494cb98d26b0c0cd83af3d1839afa617384538263f11bce0e451a9ccbff"}
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.730884 5129 generic.go:334] "Generic (PLEG): container finished" podID="6b15058f-0936-4bb9-ad72-1c27661b4b82" containerID="57f2511f2dd582a994211348b57df514b82da119c5796e9b0a9f36eaf93af695" exitCode=143
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.730928 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6b15058f-0936-4bb9-ad72-1c27661b4b82","Type":"ContainerDied","Data":"57f2511f2dd582a994211348b57df514b82da119c5796e9b0a9f36eaf93af695"}
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.744030 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7/ovsdbserver-nb/0.log"
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.744135 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7","Type":"ContainerDied","Data":"0ec6ffd0dcd362ce56136bf653ba0a2b11b2c895fb574f364bb8252a5efab785"}
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.744175 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.744210 5129 scope.go:117] "RemoveContainer" containerID="ae523da99b25c8f299915e49323351372ae554b450e86eb9aa2d7dd42d03c9b4"
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.759264 5129 generic.go:334] "Generic (PLEG): container finished" podID="f7c2c75d-2b97-4e81-9701-486cee85dd93" containerID="a0a68df17c077a14641c9e1331ad90f35cbd875eefb6a62fd05796b5756983a2" exitCode=0
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.759324 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7c2c75d-2b97-4e81-9701-486cee85dd93","Type":"ContainerDied","Data":"a0a68df17c077a14641c9e1331ad90f35cbd875eefb6a62fd05796b5756983a2"}
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.776554 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-p2z68"
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.776809 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.776886 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-rmw6j"
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.784790 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="df407ca4-4a5d-404c-ab22-89bcde2439c4" containerName="nova-cell1-conductor-conductor" containerID="cri-o://d5f37f6fbaa362787227b822e969f3f95cfd2287989a1655515b437116197b4e" gracePeriod=30
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.795794 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.802694 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 14 07:24:57 crc kubenswrapper[5129]: E0314 07:24:57.823548 5129 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Mar 14 07:24:57 crc kubenswrapper[5129]: E0314 07:24:57.823861 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-config-data podName:d291cef2-24d8-4ae6-aa4f-dfa8e782db15 nodeName:}" failed. No retries permitted until 2026-03-14 07:25:01.82384715 +0000 UTC m=+1564.575762334 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-config-data") pod "rabbitmq-server-0" (UID: "d291cef2-24d8-4ae6-aa4f-dfa8e782db15") : configmap "rabbitmq-config-data" not found
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.827882 5129 scope.go:117] "RemoveContainer" containerID="dab8a5ebc5177ef76172886ed36e2482dbb0fd550653e8a3c821777fa2ad984e"
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.888090 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-p2z68"]
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.888110 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.895998 5129 scope.go:117] "RemoveContainer" containerID="8d3e839655b386c621e99ce51cd89ad667b865776cb2c358ac6411cb869d4ee6"
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.903359 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-p2z68"]
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.915019 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-rmw6j"]
Mar 14 07:24:57 crc kubenswrapper[5129]: I0314 07:24:57.922732 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-rmw6j"]
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.029054 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-scripts\") pod \"7132e8d4-728a-4852-bcd9-833a9bd05878\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.029211 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7132e8d4-728a-4852-bcd9-833a9bd05878-etc-machine-id\") pod \"7132e8d4-728a-4852-bcd9-833a9bd05878\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.029339 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7132e8d4-728a-4852-bcd9-833a9bd05878-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7132e8d4-728a-4852-bcd9-833a9bd05878" (UID: "7132e8d4-728a-4852-bcd9-833a9bd05878"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.029381 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czcpz\" (UniqueName: \"kubernetes.io/projected/7132e8d4-728a-4852-bcd9-833a9bd05878-kube-api-access-czcpz\") pod \"7132e8d4-728a-4852-bcd9-833a9bd05878\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.032160 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-combined-ca-bundle\") pod \"7132e8d4-728a-4852-bcd9-833a9bd05878\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.032266 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-config-data-custom\") pod \"7132e8d4-728a-4852-bcd9-833a9bd05878\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.032368 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-config-data\") pod \"7132e8d4-728a-4852-bcd9-833a9bd05878\" (UID: \"7132e8d4-728a-4852-bcd9-833a9bd05878\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.033719 5129 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7132e8d4-728a-4852-bcd9-833a9bd05878-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.048472 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7132e8d4-728a-4852-bcd9-833a9bd05878" (UID: "7132e8d4-728a-4852-bcd9-833a9bd05878"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.050028 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1acddb0d-6c0b-420c-9df5-f1b89d56b21e" path="/var/lib/kubelet/pods/1acddb0d-6c0b-420c-9df5-f1b89d56b21e/volumes"
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.050559 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21298577-e14d-4394-9a86-f9e488d33659" path="/var/lib/kubelet/pods/21298577-e14d-4394-9a86-f9e488d33659/volumes"
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.051026 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7132e8d4-728a-4852-bcd9-833a9bd05878-kube-api-access-czcpz" (OuterVolumeSpecName: "kube-api-access-czcpz") pod "7132e8d4-728a-4852-bcd9-833a9bd05878" (UID: "7132e8d4-728a-4852-bcd9-833a9bd05878"). InnerVolumeSpecName "kube-api-access-czcpz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.051295 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b3c5416-85f2-4109-892a-33079d1541d9" path="/var/lib/kubelet/pods/3b3c5416-85f2-4109-892a-33079d1541d9/volumes"
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.052335 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="410747b6-e92a-4253-a502-6e43b9e6048b" path="/var/lib/kubelet/pods/410747b6-e92a-4253-a502-6e43b9e6048b/volumes"
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.053484 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dcacc6d-2066-4bbf-a65c-8ff457d6235b" path="/var/lib/kubelet/pods/6dcacc6d-2066-4bbf-a65c-8ff457d6235b/volumes"
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.054005 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9004e305-ef45-437a-a38c-b50c9d1f1ff7" path="/var/lib/kubelet/pods/9004e305-ef45-437a-a38c-b50c9d1f1ff7/volumes"
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.054525 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="937d0a55-2e9a-471b-b4d6-50cc8c4ddd33" path="/var/lib/kubelet/pods/937d0a55-2e9a-471b-b4d6-50cc8c4ddd33/volumes"
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.062113 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d934f07-49c3-4356-ae16-0c35f0935625" path="/var/lib/kubelet/pods/9d934f07-49c3-4356-ae16-0c35f0935625/volumes"
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.062892 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9190972-0c7d-4d51-a42e-089c06e17395" path="/var/lib/kubelet/pods/a9190972-0c7d-4d51-a42e-089c06e17395/volumes"
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.065015 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3ca802f-a617-4766-a97b-e8bafe556ce5" path="/var/lib/kubelet/pods/b3ca802f-a617-4766-a97b-e8bafe556ce5/volumes"
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.066777 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7957bb5589-vf68m"
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.067525 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1bbd954-27ab-473e-b15f-42a06ad72887" path="/var/lib/kubelet/pods/c1bbd954-27ab-473e-b15f-42a06ad72887/volumes"
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.068095 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7a67053-ce35-4d1a-bdee-7d89e882b4b1" path="/var/lib/kubelet/pods/c7a67053-ce35-4d1a-bdee-7d89e882b4b1/volumes"
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.068510 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-scripts" (OuterVolumeSpecName: "scripts") pod "7132e8d4-728a-4852-bcd9-833a9bd05878" (UID: "7132e8d4-728a-4852-bcd9-833a9bd05878"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.071079 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d992f450-3800-45e2-abf4-41597a15f0c3" path="/var/lib/kubelet/pods/d992f450-3800-45e2-abf4-41597a15f0c3/volumes"
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.071764 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" path="/var/lib/kubelet/pods/f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7/volumes"
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.093555 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.134677 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvtjl\" (UniqueName: \"kubernetes.io/projected/f7c2c75d-2b97-4e81-9701-486cee85dd93-kube-api-access-zvtjl\") pod \"f7c2c75d-2b97-4e81-9701-486cee85dd93\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.134715 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-vencrypt-tls-certs\") pod \"f7c2c75d-2b97-4e81-9701-486cee85dd93\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.134787 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-config-data\") pod \"26087e66-e9e8-451d-80b4-d288468202f1\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.134924 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blm7t\" (UniqueName: \"kubernetes.io/projected/26087e66-e9e8-451d-80b4-d288468202f1-kube-api-access-blm7t\") pod \"26087e66-e9e8-451d-80b4-d288468202f1\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.134977 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26087e66-e9e8-451d-80b4-d288468202f1-log-httpd\") pod \"26087e66-e9e8-451d-80b4-d288468202f1\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.135009 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-combined-ca-bundle\") pod \"f7c2c75d-2b97-4e81-9701-486cee85dd93\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.135032 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26087e66-e9e8-451d-80b4-d288468202f1-run-httpd\") pod \"26087e66-e9e8-451d-80b4-d288468202f1\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.135071 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-internal-tls-certs\") pod \"26087e66-e9e8-451d-80b4-d288468202f1\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.135100 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-combined-ca-bundle\") pod \"26087e66-e9e8-451d-80b4-d288468202f1\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.135148 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-config-data\") pod \"f7c2c75d-2b97-4e81-9701-486cee85dd93\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.135169 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/26087e66-e9e8-451d-80b4-d288468202f1-etc-swift\") pod \"26087e66-e9e8-451d-80b4-d288468202f1\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.135209 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-nova-novncproxy-tls-certs\") pod \"f7c2c75d-2b97-4e81-9701-486cee85dd93\" (UID: \"f7c2c75d-2b97-4e81-9701-486cee85dd93\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.135253 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-public-tls-certs\") pod \"26087e66-e9e8-451d-80b4-d288468202f1\" (UID: \"26087e66-e9e8-451d-80b4-d288468202f1\") "
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.135405 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26087e66-e9e8-451d-80b4-d288468202f1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "26087e66-e9e8-451d-80b4-d288468202f1" (UID: "26087e66-e9e8-451d-80b4-d288468202f1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.135850 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26087e66-e9e8-451d-80b4-d288468202f1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "26087e66-e9e8-451d-80b4-d288468202f1" (UID: "26087e66-e9e8-451d-80b4-d288468202f1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.137054 5129 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26087e66-e9e8-451d-80b4-d288468202f1-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.137071 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.137080 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czcpz\" (UniqueName: \"kubernetes.io/projected/7132e8d4-728a-4852-bcd9-833a9bd05878-kube-api-access-czcpz\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.137090 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.137099 5129 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26087e66-e9e8-451d-80b4-d288468202f1-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.143558 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c2c75d-2b97-4e81-9701-486cee85dd93-kube-api-access-zvtjl" (OuterVolumeSpecName: "kube-api-access-zvtjl") pod "f7c2c75d-2b97-4e81-9701-486cee85dd93" (UID: "f7c2c75d-2b97-4e81-9701-486cee85dd93"). InnerVolumeSpecName "kube-api-access-zvtjl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.144755 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26087e66-e9e8-451d-80b4-d288468202f1-kube-api-access-blm7t" (OuterVolumeSpecName: "kube-api-access-blm7t") pod "26087e66-e9e8-451d-80b4-d288468202f1" (UID: "26087e66-e9e8-451d-80b4-d288468202f1"). InnerVolumeSpecName "kube-api-access-blm7t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.149002 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26087e66-e9e8-451d-80b4-d288468202f1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "26087e66-e9e8-451d-80b4-d288468202f1" (UID: "26087e66-e9e8-451d-80b4-d288468202f1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.168969 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7132e8d4-728a-4852-bcd9-833a9bd05878" (UID: "7132e8d4-728a-4852-bcd9-833a9bd05878"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.203343 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7c2c75d-2b97-4e81-9701-486cee85dd93" (UID: "f7c2c75d-2b97-4e81-9701-486cee85dd93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.248397 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.249109 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blm7t\" (UniqueName: \"kubernetes.io/projected/26087e66-e9e8-451d-80b4-d288468202f1-kube-api-access-blm7t\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.249148 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.249163 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/26087e66-e9e8-451d-80b4-d288468202f1-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.249178 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvtjl\" (UniqueName: \"kubernetes.io/projected/f7c2c75d-2b97-4e81-9701-486cee85dd93-kube-api-access-zvtjl\") on node \"crc\" DevicePath \"\""
Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.256149 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "f7c2c75d-2b97-4e81-9701-486cee85dd93" (UID: "f7c2c75d-2b97-4e81-9701-486cee85dd93"). InnerVolumeSpecName "vencrypt-tls-certs".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.276180 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26087e66-e9e8-451d-80b4-d288468202f1" (UID: "26087e66-e9e8-451d-80b4-d288468202f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.329282 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-config-data" (OuterVolumeSpecName: "config-data") pod "26087e66-e9e8-451d-80b4-d288468202f1" (UID: "26087e66-e9e8-451d-80b4-d288468202f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[5129]: E0314 07:24:58.333788 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a8b30b499fcbe76c6c54c38cfe35751ce9f67a27193f239a4c3a4e773018fd63" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 07:24:58 crc kubenswrapper[5129]: E0314 07:24:58.335384 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a8b30b499fcbe76c6c54c38cfe35751ce9f67a27193f239a4c3a4e773018fd63" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 07:24:58 crc kubenswrapper[5129]: E0314 07:24:58.340775 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="a8b30b499fcbe76c6c54c38cfe35751ce9f67a27193f239a4c3a4e773018fd63" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 07:24:58 crc kubenswrapper[5129]: E0314 07:24:58.340846 5129 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f796158b-a0d2-4077-9c18-b91a594343fb" containerName="nova-scheduler-scheduler" Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.350700 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.350733 5129 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.350742 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.355858 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-config-data" (OuterVolumeSpecName: "config-data") pod "f7c2c75d-2b97-4e81-9701-486cee85dd93" (UID: "f7c2c75d-2b97-4e81-9701-486cee85dd93"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.433720 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "f7c2c75d-2b97-4e81-9701-486cee85dd93" (UID: "f7c2c75d-2b97-4e81-9701-486cee85dd93"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.450632 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "26087e66-e9e8-451d-80b4-d288468202f1" (UID: "26087e66-e9e8-451d-80b4-d288468202f1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.454053 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "26087e66-e9e8-451d-80b4-d288468202f1" (UID: "26087e66-e9e8-451d-80b4-d288468202f1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[5129]: E0314 07:24:58.455651 5129 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 14 07:24:58 crc kubenswrapper[5129]: E0314 07:24:58.455727 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-config-data podName:f6261e6b-f331-4dcb-8380-167e8f547e1b nodeName:}" failed. No retries permitted until 2026-03-14 07:25:02.455708195 +0000 UTC m=+1565.207623379 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "f6261e6b-f331-4dcb-8380-167e8f547e1b") : configmap "rabbitmq-cell1-config-data" not found Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.455281 5129 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.456081 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.456092 5129 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c2c75d-2b97-4e81-9701-486cee85dd93-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.456102 5129 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26087e66-e9e8-451d-80b4-d288468202f1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.536307 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-config-data" (OuterVolumeSpecName: "config-data") pod "7132e8d4-728a-4852-bcd9-833a9bd05878" (UID: "7132e8d4-728a-4852-bcd9-833a9bd05878"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.559359 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="1a3fad4b-8e44-471d-b262-27d6a7e05276" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.173:8776/healthcheck\": read tcp 10.217.0.2:38114->10.217.0.173:8776: read: connection reset by peer" Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.559979 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7132e8d4-728a-4852-bcd9-833a9bd05878-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:58 crc kubenswrapper[5129]: E0314 07:24:58.661802 5129 secret.go:188] Couldn't get secret openstack/nova-cell1-conductor-config-data: secret "nova-cell1-conductor-config-data" not found Mar 14 07:24:58 crc kubenswrapper[5129]: E0314 07:24:58.661874 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-config-data podName:df407ca4-4a5d-404c-ab22-89bcde2439c4 nodeName:}" failed. No retries permitted until 2026-03-14 07:25:00.661855901 +0000 UTC m=+1563.413771085 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-config-data") pod "nova-cell1-conductor-0" (UID: "df407ca4-4a5d-404c-ab22-89bcde2439c4") : secret "nova-cell1-conductor-config-data" not found Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.765139 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.765395 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b169049b-3ab6-4871-ae92-0876f27347e6" containerName="ceilometer-central-agent" containerID="cri-o://bcd78cfb2a66f431657cd0c11bcbda01e51d38d25273756f75b8f2c7eaef968d" gracePeriod=30 Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.765757 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b169049b-3ab6-4871-ae92-0876f27347e6" containerName="proxy-httpd" containerID="cri-o://7d094db7ac27362ce41fb20cfbbfbd0cb58d32ff76fd8de1e1c4e8741569d84b" gracePeriod=30 Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.765988 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b169049b-3ab6-4871-ae92-0876f27347e6" containerName="sg-core" containerID="cri-o://17fc0c37982bdc4c17582bebd5c88160b0617e58d221490152bd767bb2b74b79" gracePeriod=30 Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.766067 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b169049b-3ab6-4871-ae92-0876f27347e6" containerName="ceilometer-notification-agent" containerID="cri-o://1b3cbe2b568039b97be778d27023c536419b0464ab0fc336670f1b9fee17852a" gracePeriod=30 Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.805821 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 07:24:58 crc 
kubenswrapper[5129]: I0314 07:24:58.806130 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1d59a327-2c1e-49e7-86b3-51e8b692545a" containerName="kube-state-metrics" containerID="cri-o://3fa18634f1bb9a6d07527df4350eeb4bdd0a91a0e97abb6e976b1194f1b93cd5" gracePeriod=30 Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.870099 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0796-account-create-update-crlmm" Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.878585 5129 generic.go:334] "Generic (PLEG): container finished" podID="1a3fad4b-8e44-471d-b262-27d6a7e05276" containerID="d055a298c80b1c78420262c5f6b1a8b08ee515621adade6064aa22f0860f7d0f" exitCode=0 Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.878699 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a3fad4b-8e44-471d-b262-27d6a7e05276","Type":"ContainerDied","Data":"d055a298c80b1c78420262c5f6b1a8b08ee515621adade6064aa22f0860f7d0f"} Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.943822 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-c51b-account-create-update-crv5x" Mar 14 07:24:58 crc kubenswrapper[5129]: E0314 07:24:58.950267 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0cdce7e2d0eb6e6c377ac56ed26b1e0c8ae3d516df683f25a4cfa59b208e395" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.950761 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7c2c75d-2b97-4e81-9701-486cee85dd93","Type":"ContainerDied","Data":"ca8e5265187f91cdf39d619d2ef915c6c6f0cd9b3c4a9b939bb3f60eb9b093e2"} Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.951030 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.951105 5129 scope.go:117] "RemoveContainer" containerID="a0a68df17c077a14641c9e1331ad90f35cbd875eefb6a62fd05796b5756983a2" Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.953283 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-168c-account-create-update-bgp52" Mar 14 07:24:58 crc kubenswrapper[5129]: E0314 07:24:58.992953 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0cdce7e2d0eb6e6c377ac56ed26b1e0c8ae3d516df683f25a4cfa59b208e395" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.993086 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0796-account-create-update-crlmm" event={"ID":"87ce93e5-3bef-4e45-a23a-4164fb7aed7a","Type":"ContainerDied","Data":"4b0646115a7e327581c0e282ecd08d7966eb6a5884f3e6062a5a2b29a1f8806f"} Mar 14 07:24:58 crc kubenswrapper[5129]: I0314 07:24:58.993172 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0796-account-create-update-crlmm" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:58.999052 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5ea2-account-create-update-pzlww"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.004083 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1be17304-8fa4-4ec4-893d-3094d703893a-operator-scripts\") pod \"1be17304-8fa4-4ec4-893d-3094d703893a\" (UID: \"1be17304-8fa4-4ec4-893d-3094d703893a\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.004174 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de15208-b058-4b80-886d-6b93469faac0-operator-scripts\") pod \"2de15208-b058-4b80-886d-6b93469faac0\" (UID: \"2de15208-b058-4b80-886d-6b93469faac0\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.004389 5129 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-5h7c4\" (UniqueName: \"kubernetes.io/projected/1be17304-8fa4-4ec4-893d-3094d703893a-kube-api-access-5h7c4\") pod \"1be17304-8fa4-4ec4-893d-3094d703893a\" (UID: \"1be17304-8fa4-4ec4-893d-3094d703893a\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.004416 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76qmf\" (UniqueName: \"kubernetes.io/projected/87ce93e5-3bef-4e45-a23a-4164fb7aed7a-kube-api-access-76qmf\") pod \"87ce93e5-3bef-4e45-a23a-4164fb7aed7a\" (UID: \"87ce93e5-3bef-4e45-a23a-4164fb7aed7a\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.004445 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87ce93e5-3bef-4e45-a23a-4164fb7aed7a-operator-scripts\") pod \"87ce93e5-3bef-4e45-a23a-4164fb7aed7a\" (UID: \"87ce93e5-3bef-4e45-a23a-4164fb7aed7a\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.004475 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p7cq\" (UniqueName: \"kubernetes.io/projected/2de15208-b058-4b80-886d-6b93469faac0-kube-api-access-2p7cq\") pod \"2de15208-b058-4b80-886d-6b93469faac0\" (UID: \"2de15208-b058-4b80-886d-6b93469faac0\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.004890 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1be17304-8fa4-4ec4-893d-3094d703893a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1be17304-8fa4-4ec4-893d-3094d703893a" (UID: "1be17304-8fa4-4ec4-893d-3094d703893a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.006858 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de15208-b058-4b80-886d-6b93469faac0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2de15208-b058-4b80-886d-6b93469faac0" (UID: "2de15208-b058-4b80-886d-6b93469faac0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.007024 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87ce93e5-3bef-4e45-a23a-4164fb7aed7a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87ce93e5-3bef-4e45-a23a-4164fb7aed7a" (UID: "87ce93e5-3bef-4e45-a23a-4164fb7aed7a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.007075 5129 generic.go:334] "Generic (PLEG): container finished" podID="288de2f6-818d-4167-8511-76f958542fbd" containerID="c1fcdf4a873899697c71e1744e78008c9ad86529e509e2cf0e32fd619d7c1cb1" exitCode=0 Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.007154 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"288de2f6-818d-4167-8511-76f958542fbd","Type":"ContainerDied","Data":"c1fcdf4a873899697c71e1744e78008c9ad86529e509e2cf0e32fd619d7c1cb1"} Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.007969 5129 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.008014 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data podName:30ca9513-5ae9-4520-8012-3c941786ce2a nodeName:}" failed. 
No retries permitted until 2026-03-14 07:25:03.007998659 +0000 UTC m=+1565.759913843 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data") pod "barbican-keystone-listener-54888cd7bb-schqh" (UID: "30ca9513-5ae9-4520-8012-3c941786ce2a") : secret "barbican-config-data" not found Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.008485 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.008939 5129 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.008963 5129 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.008972 5129 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.008982 5129 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.009028 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift podName:2c0c6778-6f35-4243-9e82-ca3c8f3968fc nodeName:}" failed. No retries permitted until 2026-03-14 07:25:03.009012656 +0000 UTC m=+1565.760927840 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift") pod "swift-storage-0" (UID: "2c0c6778-6f35-4243-9e82-ca3c8f3968fc") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.009281 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="659ee685-6b83-4af2-bd2e-e5ce9372e408" containerName="memcached" containerID="cri-o://babbc10ebff9f1149bd8ac464ac8ab29bfe228ddf2812b62e0b09490fb06eb73" gracePeriod=30 Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.009557 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1be17304-8fa4-4ec4-893d-3094d703893a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.009631 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de15208-b058-4b80-886d-6b93469faac0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.009648 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87ce93e5-3bef-4e45-a23a-4164fb7aed7a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.009963 5129 secret.go:188] Couldn't get secret openstack/barbican-keystone-listener-config-data: secret "barbican-keystone-listener-config-data" not found Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.010133 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data-custom podName:30ca9513-5ae9-4520-8012-3c941786ce2a nodeName:}" failed. 
No retries permitted until 2026-03-14 07:25:03.010004054 +0000 UTC m=+1565.761919288 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data-custom") pod "barbican-keystone-listener-54888cd7bb-schqh" (UID: "30ca9513-5ae9-4520-8012-3c941786ce2a") : secret "barbican-keystone-listener-config-data" not found Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.010289 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0cdce7e2d0eb6e6c377ac56ed26b1e0c8ae3d516df683f25a4cfa59b208e395" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.010345 5129 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="72ad7859-e34b-4393-b696-548fd7ac8e1d" containerName="ovn-northd" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.012547 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87ce93e5-3bef-4e45-a23a-4164fb7aed7a-kube-api-access-76qmf" (OuterVolumeSpecName: "kube-api-access-76qmf") pod "87ce93e5-3bef-4e45-a23a-4164fb7aed7a" (UID: "87ce93e5-3bef-4e45-a23a-4164fb7aed7a"). InnerVolumeSpecName "kube-api-access-76qmf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.012841 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1be17304-8fa4-4ec4-893d-3094d703893a-kube-api-access-5h7c4" (OuterVolumeSpecName: "kube-api-access-5h7c4") pod "1be17304-8fa4-4ec4-893d-3094d703893a" (UID: "1be17304-8fa4-4ec4-893d-3094d703893a"). InnerVolumeSpecName "kube-api-access-5h7c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.014248 5129 generic.go:334] "Generic (PLEG): container finished" podID="8da79e9b-0c3f-4d66-9813-08116725c6a4" containerID="19c89364d2b386b626087226ef1de11773409433b1f6b4165c7de722f1c08207" exitCode=0 Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.014304 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75b4586cb8-pfpxj" event={"ID":"8da79e9b-0c3f-4d66-9813-08116725c6a4","Type":"ContainerDied","Data":"19c89364d2b386b626087226ef1de11773409433b1f6b4165c7de722f1c08207"} Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.017890 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de15208-b058-4b80-886d-6b93469faac0-kube-api-access-2p7cq" (OuterVolumeSpecName: "kube-api-access-2p7cq") pod "2de15208-b058-4b80-886d-6b93469faac0" (UID: "2de15208-b058-4b80-886d-6b93469faac0"). InnerVolumeSpecName "kube-api-access-2p7cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.018903 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5ea2-account-create-update-pzlww"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.021946 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3b40-account-create-update-nrtft" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.033130 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5ea2-account-create-update-4bdm8"] Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.033679 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c2c75d-2b97-4e81-9701-486cee85dd93" containerName="nova-cell1-novncproxy-novncproxy" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.033703 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c2c75d-2b97-4e81-9701-486cee85dd93" containerName="nova-cell1-novncproxy-novncproxy" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.033718 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" containerName="ovsdbserver-nb" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.033726 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" containerName="ovsdbserver-nb" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.033749 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d934f07-49c3-4356-ae16-0c35f0935625" containerName="openstack-network-exporter" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.033758 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d934f07-49c3-4356-ae16-0c35f0935625" containerName="openstack-network-exporter" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.033781 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d992f450-3800-45e2-abf4-41597a15f0c3" containerName="ovsdbserver-sb" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.033790 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d992f450-3800-45e2-abf4-41597a15f0c3" containerName="ovsdbserver-sb" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.033803 5129 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d992f450-3800-45e2-abf4-41597a15f0c3" containerName="openstack-network-exporter" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.033810 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d992f450-3800-45e2-abf4-41597a15f0c3" containerName="openstack-network-exporter" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.033839 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9004e305-ef45-437a-a38c-b50c9d1f1ff7" containerName="dnsmasq-dns" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.033847 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="9004e305-ef45-437a-a38c-b50c9d1f1ff7" containerName="dnsmasq-dns" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.033861 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7132e8d4-728a-4852-bcd9-833a9bd05878" containerName="cinder-scheduler" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.033869 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7132e8d4-728a-4852-bcd9-833a9bd05878" containerName="cinder-scheduler" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.033884 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26087e66-e9e8-451d-80b4-d288468202f1" containerName="proxy-httpd" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.033892 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="26087e66-e9e8-451d-80b4-d288468202f1" containerName="proxy-httpd" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.033901 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" containerName="openstack-network-exporter" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.033909 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" containerName="openstack-network-exporter" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.033922 5129 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7132e8d4-728a-4852-bcd9-833a9bd05878" containerName="probe" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.033929 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7132e8d4-728a-4852-bcd9-833a9bd05878" containerName="probe" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.033940 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26087e66-e9e8-451d-80b4-d288468202f1" containerName="proxy-server" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.033949 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="26087e66-e9e8-451d-80b4-d288468202f1" containerName="proxy-server" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.033957 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9004e305-ef45-437a-a38c-b50c9d1f1ff7" containerName="init" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.033965 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="9004e305-ef45-437a-a38c-b50c9d1f1ff7" containerName="init" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.034168 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d992f450-3800-45e2-abf4-41597a15f0c3" containerName="ovsdbserver-sb" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.034181 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" containerName="openstack-network-exporter" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.034198 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="26087e66-e9e8-451d-80b4-d288468202f1" containerName="proxy-server" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.034213 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c2c75d-2b97-4e81-9701-486cee85dd93" containerName="nova-cell1-novncproxy-novncproxy" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.034225 5129 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7132e8d4-728a-4852-bcd9-833a9bd05878" containerName="probe" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.034247 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="7132e8d4-728a-4852-bcd9-833a9bd05878" containerName="cinder-scheduler" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.034259 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d992f450-3800-45e2-abf4-41597a15f0c3" containerName="openstack-network-exporter" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.034271 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="26087e66-e9e8-451d-80b4-d288468202f1" containerName="proxy-httpd" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.034280 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="9004e305-ef45-437a-a38c-b50c9d1f1ff7" containerName="dnsmasq-dns" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.034290 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2172c89-1ef3-451d-abc8-2ee8ac3bb4a7" containerName="ovsdbserver-nb" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.034305 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d934f07-49c3-4356-ae16-0c35f0935625" containerName="openstack-network-exporter" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.039896 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-168c-account-create-update-bgp52" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.047135 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-168c-account-create-update-bgp52" event={"ID":"1be17304-8fa4-4ec4-893d-3094d703893a","Type":"ContainerDied","Data":"5f223912778663c7e9c241ee776b319fa419c957bd65e96525729545131fdf5f"} Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.047181 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-4wxfh"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.049753 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5ea2-account-create-update-4bdm8" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.051228 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-4wxfh"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.060490 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.064325 5129 generic.go:334] "Generic (PLEG): container finished" podID="b77f36b2-be7b-43cb-ada4-74f524396018" containerID="75b46a090025ba91094ce980bb19db3328ed7635e0272c7d2b864af79aa4e4e3" exitCode=1 Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.064456 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-48dp4" event={"ID":"b77f36b2-be7b-43cb-ada4-74f524396018","Type":"ContainerDied","Data":"75b46a090025ba91094ce980bb19db3328ed7635e0272c7d2b864af79aa4e4e3"} Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.064502 5129 scope.go:117] "RemoveContainer" containerID="b6d52b75e1402ca8a13b76e96d5866748554a53c9657a25e1a3efcb0e8b1cb51" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.069566 5129 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-48dp4" secret="" err="secret \"galera-openstack-dockercfg-gq57p\" not found" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.069631 5129 scope.go:117] "RemoveContainer" containerID="75b46a090025ba91094ce980bb19db3328ed7635e0272c7d2b864af79aa4e4e3" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.069832 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-48dp4_openstack(b77f36b2-be7b-43cb-ada4-74f524396018)\"" pod="openstack/root-account-create-update-48dp4" podUID="b77f36b2-be7b-43cb-ada4-74f524396018" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.090319 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3b40-account-create-update-nrtft" event={"ID":"bf7c35ff-a6ef-4686-aa16-cc65d46d3527","Type":"ContainerDied","Data":"b737a0013fcb46739a94169afbfeb5ed3e6add1943daf8d02759145ac3a5ffb8"} Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.090463 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3b40-account-create-update-nrtft" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.093773 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-rkl9q"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.094036 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.106277 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7957bb5589-vf68m" event={"ID":"26087e66-e9e8-451d-80b4-d288468202f1","Type":"ContainerDied","Data":"c76099919cebb4d60d6551c9ef741a06bfa26a5b6753af84e79bff21e1b74618"} Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.106317 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7957bb5589-vf68m" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.111884 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf7c35ff-a6ef-4686-aa16-cc65d46d3527-operator-scripts\") pod \"bf7c35ff-a6ef-4686-aa16-cc65d46d3527\" (UID: \"bf7c35ff-a6ef-4686-aa16-cc65d46d3527\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.111927 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"54d01ce1-11e5-4fc7-a120-44d9d3407142\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.112074 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d01ce1-11e5-4fc7-a120-44d9d3407142-operator-scripts\") pod \"54d01ce1-11e5-4fc7-a120-44d9d3407142\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.112103 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54d01ce1-11e5-4fc7-a120-44d9d3407142-combined-ca-bundle\") pod \"54d01ce1-11e5-4fc7-a120-44d9d3407142\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " Mar 14 07:24:59 crc 
kubenswrapper[5129]: I0314 07:24:59.112346 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/54d01ce1-11e5-4fc7-a120-44d9d3407142-config-data-default\") pod \"54d01ce1-11e5-4fc7-a120-44d9d3407142\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.112485 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4mph\" (UniqueName: \"kubernetes.io/projected/54d01ce1-11e5-4fc7-a120-44d9d3407142-kube-api-access-k4mph\") pod \"54d01ce1-11e5-4fc7-a120-44d9d3407142\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.112512 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/54d01ce1-11e5-4fc7-a120-44d9d3407142-config-data-generated\") pod \"54d01ce1-11e5-4fc7-a120-44d9d3407142\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.112527 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/54d01ce1-11e5-4fc7-a120-44d9d3407142-galera-tls-certs\") pod \"54d01ce1-11e5-4fc7-a120-44d9d3407142\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.112646 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/54d01ce1-11e5-4fc7-a120-44d9d3407142-kolla-config\") pod \"54d01ce1-11e5-4fc7-a120-44d9d3407142\" (UID: \"54d01ce1-11e5-4fc7-a120-44d9d3407142\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.112675 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxh5q\" (UniqueName: 
\"kubernetes.io/projected/bf7c35ff-a6ef-4686-aa16-cc65d46d3527-kube-api-access-pxh5q\") pod \"bf7c35ff-a6ef-4686-aa16-cc65d46d3527\" (UID: \"bf7c35ff-a6ef-4686-aa16-cc65d46d3527\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.113090 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzjx5\" (UniqueName: \"kubernetes.io/projected/a8b77a9e-f83e-417f-88f7-c378046c4c54-kube-api-access-bzjx5\") pod \"keystone-5ea2-account-create-update-4bdm8\" (UID: \"a8b77a9e-f83e-417f-88f7-c378046c4c54\") " pod="openstack/keystone-5ea2-account-create-update-4bdm8" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.113169 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8b77a9e-f83e-417f-88f7-c378046c4c54-operator-scripts\") pod \"keystone-5ea2-account-create-update-4bdm8\" (UID: \"a8b77a9e-f83e-417f-88f7-c378046c4c54\") " pod="openstack/keystone-5ea2-account-create-update-4bdm8" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.113264 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h7c4\" (UniqueName: \"kubernetes.io/projected/1be17304-8fa4-4ec4-893d-3094d703893a-kube-api-access-5h7c4\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.113280 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76qmf\" (UniqueName: \"kubernetes.io/projected/87ce93e5-3bef-4e45-a23a-4164fb7aed7a-kube-api-access-76qmf\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.113290 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p7cq\" (UniqueName: \"kubernetes.io/projected/2de15208-b058-4b80-886d-6b93469faac0-kube-api-access-2p7cq\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.114578 5129 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf7c35ff-a6ef-4686-aa16-cc65d46d3527-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf7c35ff-a6ef-4686-aa16-cc65d46d3527" (UID: "bf7c35ff-a6ef-4686-aa16-cc65d46d3527"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.125226 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54d01ce1-11e5-4fc7-a120-44d9d3407142-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "54d01ce1-11e5-4fc7-a120-44d9d3407142" (UID: "54d01ce1-11e5-4fc7-a120-44d9d3407142"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.125656 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54d01ce1-11e5-4fc7-a120-44d9d3407142-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54d01ce1-11e5-4fc7-a120-44d9d3407142" (UID: "54d01ce1-11e5-4fc7-a120-44d9d3407142"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.126041 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54d01ce1-11e5-4fc7-a120-44d9d3407142-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "54d01ce1-11e5-4fc7-a120-44d9d3407142" (UID: "54d01ce1-11e5-4fc7-a120-44d9d3407142"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.126193 5129 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.126244 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b77f36b2-be7b-43cb-ada4-74f524396018-operator-scripts podName:b77f36b2-be7b-43cb-ada4-74f524396018 nodeName:}" failed. No retries permitted until 2026-03-14 07:24:59.626227375 +0000 UTC m=+1562.378142559 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/b77f36b2-be7b-43cb-ada4-74f524396018-operator-scripts") pod "root-account-create-update-48dp4" (UID: "b77f36b2-be7b-43cb-ada4-74f524396018") : configmap "openstack-scripts" not found Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.127359 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54d01ce1-11e5-4fc7-a120-44d9d3407142-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "54d01ce1-11e5-4fc7-a120-44d9d3407142" (UID: "54d01ce1-11e5-4fc7-a120-44d9d3407142"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.135327 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5ea2-account-create-update-4bdm8"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.135376 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf7c35ff-a6ef-4686-aa16-cc65d46d3527-kube-api-access-pxh5q" (OuterVolumeSpecName: "kube-api-access-pxh5q") pod "bf7c35ff-a6ef-4686-aa16-cc65d46d3527" (UID: "bf7c35ff-a6ef-4686-aa16-cc65d46d3527"). InnerVolumeSpecName "kube-api-access-pxh5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.138832 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54d01ce1-11e5-4fc7-a120-44d9d3407142-kube-api-access-k4mph" (OuterVolumeSpecName: "kube-api-access-k4mph") pod "54d01ce1-11e5-4fc7-a120-44d9d3407142" (UID: "54d01ce1-11e5-4fc7-a120-44d9d3407142"). InnerVolumeSpecName "kube-api-access-k4mph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.144420 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-rkl9q"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.148468 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "54d01ce1-11e5-4fc7-a120-44d9d3407142" (UID: "54d01ce1-11e5-4fc7-a120-44d9d3407142"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.153635 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.153996 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7132e8d4-728a-4852-bcd9-833a9bd05878","Type":"ContainerDied","Data":"0a63e977059e2bebfc4d29b936c8bfb628ac660dc2cfb4ee7f87653bb9143760"} Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.162844 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c51b-account-create-update-crv5x" event={"ID":"2de15208-b058-4b80-886d-6b93469faac0","Type":"ContainerDied","Data":"2f241bb99f034ae426d36fd3e9d916cdbcb4dda97344b82a4ef4ca508a6ff78c"} Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.162904 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c51b-account-create-update-crv5x" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.200313 5129 generic.go:334] "Generic (PLEG): container finished" podID="54d01ce1-11e5-4fc7-a120-44d9d3407142" containerID="347a7b8e1e7a603f4afb767a2a29b743978ea8a3bd63e76edd3e8e66b4e8268c" exitCode=0 Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.200351 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"54d01ce1-11e5-4fc7-a120-44d9d3407142","Type":"ContainerDied","Data":"347a7b8e1e7a603f4afb767a2a29b743978ea8a3bd63e76edd3e8e66b4e8268c"} Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.200374 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"54d01ce1-11e5-4fc7-a120-44d9d3407142","Type":"ContainerDied","Data":"d285c2fb5c214972150d8fcfeedc394a9c958449f997ac0d0e5f4e88262012ec"} Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.200487 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.201747 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-754fd75497-x4zwc"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.202139 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-754fd75497-x4zwc" podUID="be987b8a-a47d-46a9-bce9-6969473125ff" containerName="keystone-api" containerID="cri-o://953eea29bdc354c617411e73cad623b7f0a9af66f593b7fba568db94fda9d685" gracePeriod=30 Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.219274 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.221053 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzjx5\" (UniqueName: \"kubernetes.io/projected/a8b77a9e-f83e-417f-88f7-c378046c4c54-kube-api-access-bzjx5\") pod \"keystone-5ea2-account-create-update-4bdm8\" (UID: \"a8b77a9e-f83e-417f-88f7-c378046c4c54\") " pod="openstack/keystone-5ea2-account-create-update-4bdm8" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.221216 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8b77a9e-f83e-417f-88f7-c378046c4c54-operator-scripts\") pod \"keystone-5ea2-account-create-update-4bdm8\" (UID: \"a8b77a9e-f83e-417f-88f7-c378046c4c54\") " pod="openstack/keystone-5ea2-account-create-update-4bdm8" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.221384 5129 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/54d01ce1-11e5-4fc7-a120-44d9d3407142-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.221417 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxh5q\" 
(UniqueName: \"kubernetes.io/projected/bf7c35ff-a6ef-4686-aa16-cc65d46d3527-kube-api-access-pxh5q\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.221429 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf7c35ff-a6ef-4686-aa16-cc65d46d3527-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.221448 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.221458 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d01ce1-11e5-4fc7-a120-44d9d3407142-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.221466 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/54d01ce1-11e5-4fc7-a120-44d9d3407142-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.221496 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4mph\" (UniqueName: \"kubernetes.io/projected/54d01ce1-11e5-4fc7-a120-44d9d3407142-kube-api-access-k4mph\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.221505 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/54d01ce1-11e5-4fc7-a120-44d9d3407142-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.221462 5129 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 
07:24:59.221577 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a8b77a9e-f83e-417f-88f7-c378046c4c54-operator-scripts podName:a8b77a9e-f83e-417f-88f7-c378046c4c54 nodeName:}" failed. No retries permitted until 2026-03-14 07:24:59.721544307 +0000 UTC m=+1562.473459491 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a8b77a9e-f83e-417f-88f7-c378046c4c54-operator-scripts") pod "keystone-5ea2-account-create-update-4bdm8" (UID: "a8b77a9e-f83e-417f-88f7-c378046c4c54") : configmap "openstack-scripts" not found Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.235072 5129 projected.go:194] Error preparing data for projected volume kube-api-access-bzjx5 for pod openstack/keystone-5ea2-account-create-update-4bdm8: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.235139 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8b77a9e-f83e-417f-88f7-c378046c4c54-kube-api-access-bzjx5 podName:a8b77a9e-f83e-417f-88f7-c378046c4c54 nodeName:}" failed. No retries permitted until 2026-03-14 07:24:59.735120838 +0000 UTC m=+1562.487036022 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-bzjx5" (UniqueName: "kubernetes.io/projected/a8b77a9e-f83e-417f-88f7-c378046c4c54-kube-api-access-bzjx5") pod "keystone-5ea2-account-create-update-4bdm8" (UID: "a8b77a9e-f83e-417f-88f7-c378046c4c54") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.242868 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54d01ce1-11e5-4fc7-a120-44d9d3407142-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54d01ce1-11e5-4fc7-a120-44d9d3407142" (UID: "54d01ce1-11e5-4fc7-a120-44d9d3407142"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.277813 5129 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.277846 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1fcdf4a873899697c71e1744e78008c9ad86529e509e2cf0e32fd619d7c1cb1 is running failed: container process not found" containerID="c1fcdf4a873899697c71e1744e78008c9ad86529e509e2cf0e32fd619d7c1cb1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.279237 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1fcdf4a873899697c71e1744e78008c9ad86529e509e2cf0e32fd619d7c1cb1 is running failed: container process not found" containerID="c1fcdf4a873899697c71e1744e78008c9ad86529e509e2cf0e32fd619d7c1cb1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.280514 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1fcdf4a873899697c71e1744e78008c9ad86529e509e2cf0e32fd619d7c1cb1 is running failed: container process not found" containerID="c1fcdf4a873899697c71e1744e78008c9ad86529e509e2cf0e32fd619d7c1cb1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.280939 5129 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1fcdf4a873899697c71e1744e78008c9ad86529e509e2cf0e32fd619d7c1cb1 is running failed: container 
process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="288de2f6-818d-4167-8511-76f958542fbd" containerName="nova-cell0-conductor-conductor" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.284556 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-mqldm"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.298923 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-mqldm"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.317234 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5ea2-account-create-update-4bdm8"] Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.317897 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-bzjx5 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-5ea2-account-create-update-4bdm8" podUID="a8b77a9e-f83e-417f-88f7-c378046c4c54" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.318207 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54d01ce1-11e5-4fc7-a120-44d9d3407142-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "54d01ce1-11e5-4fc7-a120-44d9d3407142" (UID: "54d01ce1-11e5-4fc7-a120-44d9d3407142"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.324378 5129 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/54d01ce1-11e5-4fc7-a120-44d9d3407142-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.324408 5129 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.324422 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54d01ce1-11e5-4fc7-a120-44d9d3407142-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.351756 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-48dp4"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.385749 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9bsx2" podUID="b6a688fe-1537-4ed7-a1ae-2070ba6b1219" containerName="ovn-controller" probeResult="failure" output=< Mar 14 07:24:59 crc kubenswrapper[5129]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Mar 14 07:24:59 crc kubenswrapper[5129]: > Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.389363 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.397328 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.410374 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-168c-account-create-update-bgp52"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 
07:24:59.417618 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-168c-account-create-update-bgp52"] Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.421891 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.422343 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.422563 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.422591 5129 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-cfdh9" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" 
containerName="ovsdb-server" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.425759 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.431261 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.432821 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.432856 5129 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-cfdh9" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovs-vswitchd" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.450856 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.467467 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.482007 5129 scope.go:117] "RemoveContainer" containerID="827ff25380bea7a7b669c68f4f4faa1199ffe0abbccc374df9dfa9bf6a471dee" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.513739 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7957bb5589-vf68m"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.513849 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-7957bb5589-vf68m"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.534706 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="eca83e14-f023-4dbd-b646-c9fc5a9e177e" containerName="galera" containerID="cri-o://d5cc3a1871dfed563ca9fb888eec58509f6dd98c1be35ad885efedeb3d7c677f" gracePeriod=30 Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.535175 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.571801 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.596189 5129 scope.go:117] "RemoveContainer" containerID="94ccaa2244dfc8d149d667d1f5c396aa3ecfa3171455d755f9e5a589e59121ff" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.631206 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a3fad4b-8e44-471d-b262-27d6a7e05276-etc-machine-id\") pod \"1a3fad4b-8e44-471d-b262-27d6a7e05276\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.631270 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-internal-tls-certs\") pod \"1a3fad4b-8e44-471d-b262-27d6a7e05276\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.631308 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a3fad4b-8e44-471d-b262-27d6a7e05276-logs\") pod \"1a3fad4b-8e44-471d-b262-27d6a7e05276\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.631391 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-combined-ca-bundle\") pod \"1a3fad4b-8e44-471d-b262-27d6a7e05276\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.631463 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhsj7\" (UniqueName: \"kubernetes.io/projected/8da79e9b-0c3f-4d66-9813-08116725c6a4-kube-api-access-bhsj7\") pod \"8da79e9b-0c3f-4d66-9813-08116725c6a4\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.631542 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-scripts\") pod \"1a3fad4b-8e44-471d-b262-27d6a7e05276\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.631595 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-internal-tls-certs\") pod \"8da79e9b-0c3f-4d66-9813-08116725c6a4\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.631662 
5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-public-tls-certs\") pod \"1a3fad4b-8e44-471d-b262-27d6a7e05276\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.631710 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-scripts\") pod \"8da79e9b-0c3f-4d66-9813-08116725c6a4\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.631736 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-public-tls-certs\") pod \"8da79e9b-0c3f-4d66-9813-08116725c6a4\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.631831 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-config-data\") pod \"1a3fad4b-8e44-471d-b262-27d6a7e05276\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.631876 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-config-data-custom\") pod \"1a3fad4b-8e44-471d-b262-27d6a7e05276\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.631904 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-combined-ca-bundle\") pod \"8da79e9b-0c3f-4d66-9813-08116725c6a4\" 
(UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.631970 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8da79e9b-0c3f-4d66-9813-08116725c6a4-logs\") pod \"8da79e9b-0c3f-4d66-9813-08116725c6a4\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.631992 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-config-data\") pod \"8da79e9b-0c3f-4d66-9813-08116725c6a4\" (UID: \"8da79e9b-0c3f-4d66-9813-08116725c6a4\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.632038 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g2m5\" (UniqueName: \"kubernetes.io/projected/1a3fad4b-8e44-471d-b262-27d6a7e05276-kube-api-access-4g2m5\") pod \"1a3fad4b-8e44-471d-b262-27d6a7e05276\" (UID: \"1a3fad4b-8e44-471d-b262-27d6a7e05276\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.632378 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a3fad4b-8e44-471d-b262-27d6a7e05276-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1a3fad4b-8e44-471d-b262-27d6a7e05276" (UID: "1a3fad4b-8e44-471d-b262-27d6a7e05276"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.636349 5129 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a3fad4b-8e44-471d-b262-27d6a7e05276-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.636467 5129 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.636548 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b77f36b2-be7b-43cb-ada4-74f524396018-operator-scripts podName:b77f36b2-be7b-43cb-ada4-74f524396018 nodeName:}" failed. No retries permitted until 2026-03-14 07:25:00.636531203 +0000 UTC m=+1563.388446387 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/b77f36b2-be7b-43cb-ada4-74f524396018-operator-scripts") pod "root-account-create-update-48dp4" (UID: "b77f36b2-be7b-43cb-ada4-74f524396018") : configmap "openstack-scripts" not found Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.643094 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c51b-account-create-update-crv5x"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.643749 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a3fad4b-8e44-471d-b262-27d6a7e05276-logs" (OuterVolumeSpecName: "logs") pod "1a3fad4b-8e44-471d-b262-27d6a7e05276" (UID: "1a3fad4b-8e44-471d-b262-27d6a7e05276"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.663932 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8da79e9b-0c3f-4d66-9813-08116725c6a4-logs" (OuterVolumeSpecName: "logs") pod "8da79e9b-0c3f-4d66-9813-08116725c6a4" (UID: "8da79e9b-0c3f-4d66-9813-08116725c6a4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.675683 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8da79e9b-0c3f-4d66-9813-08116725c6a4-kube-api-access-bhsj7" (OuterVolumeSpecName: "kube-api-access-bhsj7") pod "8da79e9b-0c3f-4d66-9813-08116725c6a4" (UID: "8da79e9b-0c3f-4d66-9813-08116725c6a4"). InnerVolumeSpecName "kube-api-access-bhsj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.682415 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1a3fad4b-8e44-471d-b262-27d6a7e05276" (UID: "1a3fad4b-8e44-471d-b262-27d6a7e05276"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.682794 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-scripts" (OuterVolumeSpecName: "scripts") pod "1a3fad4b-8e44-471d-b262-27d6a7e05276" (UID: "1a3fad4b-8e44-471d-b262-27d6a7e05276"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.690263 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-scripts" (OuterVolumeSpecName: "scripts") pod "8da79e9b-0c3f-4d66-9813-08116725c6a4" (UID: "8da79e9b-0c3f-4d66-9813-08116725c6a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.699169 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c51b-account-create-update-crv5x"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.704019 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a3fad4b-8e44-471d-b262-27d6a7e05276-kube-api-access-4g2m5" (OuterVolumeSpecName: "kube-api-access-4g2m5") pod "1a3fad4b-8e44-471d-b262-27d6a7e05276" (UID: "1a3fad4b-8e44-471d-b262-27d6a7e05276"). InnerVolumeSpecName "kube-api-access-4g2m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.731305 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.737056 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1d59a327-2c1e-49e7-86b3-51e8b692545a-kube-state-metrics-tls-config\") pod \"1d59a327-2c1e-49e7-86b3-51e8b692545a\" (UID: \"1d59a327-2c1e-49e7-86b3-51e8b692545a\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.737128 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ldw5\" (UniqueName: \"kubernetes.io/projected/1d59a327-2c1e-49e7-86b3-51e8b692545a-kube-api-access-9ldw5\") pod \"1d59a327-2c1e-49e7-86b3-51e8b692545a\" (UID: \"1d59a327-2c1e-49e7-86b3-51e8b692545a\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.737245 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d59a327-2c1e-49e7-86b3-51e8b692545a-combined-ca-bundle\") pod \"1d59a327-2c1e-49e7-86b3-51e8b692545a\" (UID: \"1d59a327-2c1e-49e7-86b3-51e8b692545a\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.737335 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d59a327-2c1e-49e7-86b3-51e8b692545a-kube-state-metrics-tls-certs\") pod \"1d59a327-2c1e-49e7-86b3-51e8b692545a\" (UID: \"1d59a327-2c1e-49e7-86b3-51e8b692545a\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.737640 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzjx5\" (UniqueName: \"kubernetes.io/projected/a8b77a9e-f83e-417f-88f7-c378046c4c54-kube-api-access-bzjx5\") pod \"keystone-5ea2-account-create-update-4bdm8\" (UID: \"a8b77a9e-f83e-417f-88f7-c378046c4c54\") " pod="openstack/keystone-5ea2-account-create-update-4bdm8" Mar 14 07:24:59 crc 
kubenswrapper[5129]: I0314 07:24:59.737713 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8b77a9e-f83e-417f-88f7-c378046c4c54-operator-scripts\") pod \"keystone-5ea2-account-create-update-4bdm8\" (UID: \"a8b77a9e-f83e-417f-88f7-c378046c4c54\") " pod="openstack/keystone-5ea2-account-create-update-4bdm8" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.737792 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.737803 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8da79e9b-0c3f-4d66-9813-08116725c6a4-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.737812 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g2m5\" (UniqueName: \"kubernetes.io/projected/1a3fad4b-8e44-471d-b262-27d6a7e05276-kube-api-access-4g2m5\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.737822 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a3fad4b-8e44-471d-b262-27d6a7e05276-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.737831 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhsj7\" (UniqueName: \"kubernetes.io/projected/8da79e9b-0c3f-4d66-9813-08116725c6a4-kube-api-access-bhsj7\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.737839 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 
crc kubenswrapper[5129]: I0314 07:24:59.737846 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.737900 5129 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.737938 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a8b77a9e-f83e-417f-88f7-c378046c4c54-operator-scripts podName:a8b77a9e-f83e-417f-88f7-c378046c4c54 nodeName:}" failed. No retries permitted until 2026-03-14 07:25:00.737924531 +0000 UTC m=+1563.489839715 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a8b77a9e-f83e-417f-88f7-c378046c4c54-operator-scripts") pod "keystone-5ea2-account-create-update-4bdm8" (UID: "a8b77a9e-f83e-417f-88f7-c378046c4c54") : configmap "openstack-scripts" not found Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.739084 5129 scope.go:117] "RemoveContainer" containerID="0b249c015ec3a36e3def27e75975febe3abb0dc9b079bfbd50022f9693560988" Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.746079 5129 projected.go:194] Error preparing data for projected volume kube-api-access-bzjx5 for pod openstack/keystone-5ea2-account-create-update-4bdm8: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 14 07:24:59 crc kubenswrapper[5129]: E0314 07:24:59.746220 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8b77a9e-f83e-417f-88f7-c378046c4c54-kube-api-access-bzjx5 podName:a8b77a9e-f83e-417f-88f7-c378046c4c54 nodeName:}" failed. No retries permitted until 2026-03-14 07:25:00.746180425 +0000 UTC m=+1563.498095609 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-bzjx5" (UniqueName: "kubernetes.io/projected/a8b77a9e-f83e-417f-88f7-c378046c4c54-kube-api-access-bzjx5") pod "keystone-5ea2-account-create-update-4bdm8" (UID: "a8b77a9e-f83e-417f-88f7-c378046c4c54") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.751462 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.751942 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d59a327-2c1e-49e7-86b3-51e8b692545a-kube-api-access-9ldw5" (OuterVolumeSpecName: "kube-api-access-9ldw5") pod "1d59a327-2c1e-49e7-86b3-51e8b692545a" (UID: "1d59a327-2c1e-49e7-86b3-51e8b692545a"). InnerVolumeSpecName "kube-api-access-9ldw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.753806 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6cc754bc48-djssr" podUID="9d7fc10c-3f26-4459-9577-e7f09371a44b" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:51914->10.217.0.167:9311: read: connection reset by peer" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.753921 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6cc754bc48-djssr" podUID="9d7fc10c-3f26-4459-9577-e7f09371a44b" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:51916->10.217.0.167:9311: read: connection reset by peer" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.775145 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-config-data" 
(OuterVolumeSpecName: "config-data") pod "8da79e9b-0c3f-4d66-9813-08116725c6a4" (UID: "8da79e9b-0c3f-4d66-9813-08116725c6a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.780065 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-config-data" (OuterVolumeSpecName: "config-data") pod "1a3fad4b-8e44-471d-b262-27d6a7e05276" (UID: "1a3fad4b-8e44-471d-b262-27d6a7e05276"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.788073 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1a3fad4b-8e44-471d-b262-27d6a7e05276" (UID: "1a3fad4b-8e44-471d-b262-27d6a7e05276"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.790482 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d59a327-2c1e-49e7-86b3-51e8b692545a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d59a327-2c1e-49e7-86b3-51e8b692545a" (UID: "1d59a327-2c1e-49e7-86b3-51e8b692545a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.790685 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0796-account-create-update-crlmm"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.791534 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d59a327-2c1e-49e7-86b3-51e8b692545a-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "1d59a327-2c1e-49e7-86b3-51e8b692545a" (UID: "1d59a327-2c1e-49e7-86b3-51e8b692545a"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.816392 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0796-account-create-update-crlmm"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.826116 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8da79e9b-0c3f-4d66-9813-08116725c6a4" (UID: "8da79e9b-0c3f-4d66-9813-08116725c6a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.826177 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d59a327-2c1e-49e7-86b3-51e8b692545a-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "1d59a327-2c1e-49e7-86b3-51e8b692545a" (UID: "1d59a327-2c1e-49e7-86b3-51e8b692545a"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.832199 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-3b40-account-create-update-nrtft"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.839381 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmcqr\" (UniqueName: \"kubernetes.io/projected/288de2f6-818d-4167-8511-76f958542fbd-kube-api-access-kmcqr\") pod \"288de2f6-818d-4167-8511-76f958542fbd\" (UID: \"288de2f6-818d-4167-8511-76f958542fbd\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.839435 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/288de2f6-818d-4167-8511-76f958542fbd-config-data\") pod \"288de2f6-818d-4167-8511-76f958542fbd\" (UID: \"288de2f6-818d-4167-8511-76f958542fbd\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.839840 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/288de2f6-818d-4167-8511-76f958542fbd-combined-ca-bundle\") pod \"288de2f6-818d-4167-8511-76f958542fbd\" (UID: \"288de2f6-818d-4167-8511-76f958542fbd\") " Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.840249 5129 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1d59a327-2c1e-49e7-86b3-51e8b692545a-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.840260 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ldw5\" (UniqueName: \"kubernetes.io/projected/1d59a327-2c1e-49e7-86b3-51e8b692545a-kube-api-access-9ldw5\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.840269 5129 reconciler_common.go:293] "Volume detached for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.840278 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.840286 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.840293 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d59a327-2c1e-49e7-86b3-51e8b692545a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.840301 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.840309 5129 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d59a327-2c1e-49e7-86b3-51e8b692545a-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.842769 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-3b40-account-create-update-nrtft"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.849874 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.859382 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a3fad4b-8e44-471d-b262-27d6a7e05276" (UID: "1a3fad4b-8e44-471d-b262-27d6a7e05276"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.872460 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/288de2f6-818d-4167-8511-76f958542fbd-kube-api-access-kmcqr" (OuterVolumeSpecName: "kube-api-access-kmcqr") pod "288de2f6-818d-4167-8511-76f958542fbd" (UID: "288de2f6-818d-4167-8511-76f958542fbd"). InnerVolumeSpecName "kube-api-access-kmcqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.874204 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.883874 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/288de2f6-818d-4167-8511-76f958542fbd-config-data" (OuterVolumeSpecName: "config-data") pod "288de2f6-818d-4167-8511-76f958542fbd" (UID: "288de2f6-818d-4167-8511-76f958542fbd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.891852 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/288de2f6-818d-4167-8511-76f958542fbd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "288de2f6-818d-4167-8511-76f958542fbd" (UID: "288de2f6-818d-4167-8511-76f958542fbd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.897541 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1a3fad4b-8e44-471d-b262-27d6a7e05276" (UID: "1a3fad4b-8e44-471d-b262-27d6a7e05276"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.915681 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8da79e9b-0c3f-4d66-9813-08116725c6a4" (UID: "8da79e9b-0c3f-4d66-9813-08116725c6a4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.948148 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/288de2f6-818d-4167-8511-76f958542fbd-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.948181 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/288de2f6-818d-4167-8511-76f958542fbd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.948193 5129 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.948202 5129 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-internal-tls-certs\") on node \"crc\" 
DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.948213 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3fad4b-8e44-471d-b262-27d6a7e05276-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.948443 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmcqr\" (UniqueName: \"kubernetes.io/projected/288de2f6-818d-4167-8511-76f958542fbd-kube-api-access-kmcqr\") on node \"crc\" DevicePath \"\"" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.948815 5129 scope.go:117] "RemoveContainer" containerID="76f24021b697ad5485f9f5594cd9b1485876631247e65a9f1fe131e045674477" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.956644 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8da79e9b-0c3f-4d66-9813-08116725c6a4" (UID: "8da79e9b-0c3f-4d66-9813-08116725c6a4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:24:59 crc kubenswrapper[5129]: I0314 07:24:59.985868 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:24:59.999446 5129 scope.go:117] "RemoveContainer" containerID="347a7b8e1e7a603f4afb767a2a29b743978ea8a3bd63e76edd3e8e66b4e8268c" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.045220 5129 scope.go:117] "RemoveContainer" containerID="2fdf9b5c6b6465e84cb1682373d72f4eee6de798eebaff1e1b5c40f811253937" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.049856 5129 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8da79e9b-0c3f-4d66-9813-08116725c6a4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.053102 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1be17304-8fa4-4ec4-893d-3094d703893a" path="/var/lib/kubelet/pods/1be17304-8fa4-4ec4-893d-3094d703893a/volumes" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.053664 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26087e66-e9e8-451d-80b4-d288468202f1" path="/var/lib/kubelet/pods/26087e66-e9e8-451d-80b4-d288468202f1/volumes" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.054357 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de15208-b058-4b80-886d-6b93469faac0" path="/var/lib/kubelet/pods/2de15208-b058-4b80-886d-6b93469faac0/volumes" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.054726 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cb44baa-73cf-418a-9cb5-4e7fa092524c" path="/var/lib/kubelet/pods/4cb44baa-73cf-418a-9cb5-4e7fa092524c/volumes" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.055801 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54d01ce1-11e5-4fc7-a120-44d9d3407142" path="/var/lib/kubelet/pods/54d01ce1-11e5-4fc7-a120-44d9d3407142/volumes" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 
07:25:00.056322 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="680430ff-d825-4af7-bb3d-d8cfe32f1e4f" path="/var/lib/kubelet/pods/680430ff-d825-4af7-bb3d-d8cfe32f1e4f/volumes" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.057408 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68b5edf2-4ac6-4377-8bde-268655955533" path="/var/lib/kubelet/pods/68b5edf2-4ac6-4377-8bde-268655955533/volumes" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.059771 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7132e8d4-728a-4852-bcd9-833a9bd05878" path="/var/lib/kubelet/pods/7132e8d4-728a-4852-bcd9-833a9bd05878/volumes" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.060271 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87ce93e5-3bef-4e45-a23a-4164fb7aed7a" path="/var/lib/kubelet/pods/87ce93e5-3bef-4e45-a23a-4164fb7aed7a/volumes" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.061103 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf7c35ff-a6ef-4686-aa16-cc65d46d3527" path="/var/lib/kubelet/pods/bf7c35ff-a6ef-4686-aa16-cc65d46d3527/volumes" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.061456 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9fd78eb-874d-4fbd-b8b3-7e23a32503b5" path="/var/lib/kubelet/pods/e9fd78eb-874d-4fbd-b8b3-7e23a32503b5/volumes" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.061971 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7c2c75d-2b97-4e81-9701-486cee85dd93" path="/var/lib/kubelet/pods/f7c2c75d-2b97-4e81-9701-486cee85dd93/volumes" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.079548 5129 scope.go:117] "RemoveContainer" containerID="347a7b8e1e7a603f4afb767a2a29b743978ea8a3bd63e76edd3e8e66b4e8268c" Mar 14 07:25:00 crc kubenswrapper[5129]: E0314 07:25:00.080912 5129 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"347a7b8e1e7a603f4afb767a2a29b743978ea8a3bd63e76edd3e8e66b4e8268c\": container with ID starting with 347a7b8e1e7a603f4afb767a2a29b743978ea8a3bd63e76edd3e8e66b4e8268c not found: ID does not exist" containerID="347a7b8e1e7a603f4afb767a2a29b743978ea8a3bd63e76edd3e8e66b4e8268c" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.080957 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"347a7b8e1e7a603f4afb767a2a29b743978ea8a3bd63e76edd3e8e66b4e8268c"} err="failed to get container status \"347a7b8e1e7a603f4afb767a2a29b743978ea8a3bd63e76edd3e8e66b4e8268c\": rpc error: code = NotFound desc = could not find container \"347a7b8e1e7a603f4afb767a2a29b743978ea8a3bd63e76edd3e8e66b4e8268c\": container with ID starting with 347a7b8e1e7a603f4afb767a2a29b743978ea8a3bd63e76edd3e8e66b4e8268c not found: ID does not exist" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.080984 5129 scope.go:117] "RemoveContainer" containerID="2fdf9b5c6b6465e84cb1682373d72f4eee6de798eebaff1e1b5c40f811253937" Mar 14 07:25:00 crc kubenswrapper[5129]: E0314 07:25:00.081776 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fdf9b5c6b6465e84cb1682373d72f4eee6de798eebaff1e1b5c40f811253937\": container with ID starting with 2fdf9b5c6b6465e84cb1682373d72f4eee6de798eebaff1e1b5c40f811253937 not found: ID does not exist" containerID="2fdf9b5c6b6465e84cb1682373d72f4eee6de798eebaff1e1b5c40f811253937" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.081801 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fdf9b5c6b6465e84cb1682373d72f4eee6de798eebaff1e1b5c40f811253937"} err="failed to get container status \"2fdf9b5c6b6465e84cb1682373d72f4eee6de798eebaff1e1b5c40f811253937\": rpc error: code = NotFound desc = could not find container 
\"2fdf9b5c6b6465e84cb1682373d72f4eee6de798eebaff1e1b5c40f811253937\": container with ID starting with 2fdf9b5c6b6465e84cb1682373d72f4eee6de798eebaff1e1b5c40f811253937 not found: ID does not exist" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.150669 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0a5119c-8784-48e5-841a-654dc253f0d0-logs\") pod \"b0a5119c-8784-48e5-841a-654dc253f0d0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.150722 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-combined-ca-bundle\") pod \"b0a5119c-8784-48e5-841a-654dc253f0d0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.150759 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0a5119c-8784-48e5-841a-654dc253f0d0-httpd-run\") pod \"b0a5119c-8784-48e5-841a-654dc253f0d0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.150784 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"b0a5119c-8784-48e5-841a-654dc253f0d0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.150837 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-internal-tls-certs\") pod \"b0a5119c-8784-48e5-841a-654dc253f0d0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.150869 5129 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-config-data\") pod \"b0a5119c-8784-48e5-841a-654dc253f0d0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.150885 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7wpl\" (UniqueName: \"kubernetes.io/projected/b0a5119c-8784-48e5-841a-654dc253f0d0-kube-api-access-v7wpl\") pod \"b0a5119c-8784-48e5-841a-654dc253f0d0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.150899 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-scripts\") pod \"b0a5119c-8784-48e5-841a-654dc253f0d0\" (UID: \"b0a5119c-8784-48e5-841a-654dc253f0d0\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.153097 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0a5119c-8784-48e5-841a-654dc253f0d0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b0a5119c-8784-48e5-841a-654dc253f0d0" (UID: "b0a5119c-8784-48e5-841a-654dc253f0d0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.153219 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0a5119c-8784-48e5-841a-654dc253f0d0-logs" (OuterVolumeSpecName: "logs") pod "b0a5119c-8784-48e5-841a-654dc253f0d0" (UID: "b0a5119c-8784-48e5-841a-654dc253f0d0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.155585 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-scripts" (OuterVolumeSpecName: "scripts") pod "b0a5119c-8784-48e5-841a-654dc253f0d0" (UID: "b0a5119c-8784-48e5-841a-654dc253f0d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.156184 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "b0a5119c-8784-48e5-841a-654dc253f0d0" (UID: "b0a5119c-8784-48e5-841a-654dc253f0d0"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.158171 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a5119c-8784-48e5-841a-654dc253f0d0-kube-api-access-v7wpl" (OuterVolumeSpecName: "kube-api-access-v7wpl") pod "b0a5119c-8784-48e5-841a-654dc253f0d0" (UID: "b0a5119c-8784-48e5-841a-654dc253f0d0"). InnerVolumeSpecName "kube-api-access-v7wpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.179977 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0a5119c-8784-48e5-841a-654dc253f0d0" (UID: "b0a5119c-8784-48e5-841a-654dc253f0d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.238119 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b0a5119c-8784-48e5-841a-654dc253f0d0" (UID: "b0a5119c-8784-48e5-841a-654dc253f0d0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.252920 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0a5119c-8784-48e5-841a-654dc253f0d0-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.252950 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.252960 5129 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0a5119c-8784-48e5-841a-654dc253f0d0-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.252980 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.252991 5129 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.252999 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7wpl\" (UniqueName: 
\"kubernetes.io/projected/b0a5119c-8784-48e5-841a-654dc253f0d0-kube-api-access-v7wpl\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.253007 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.257903 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a3fad4b-8e44-471d-b262-27d6a7e05276","Type":"ContainerDied","Data":"3472c054b50f74444cfb9f99926ed91789bd6fb03b78b9180e9bafdaef83653d"} Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.257951 5129 scope.go:117] "RemoveContainer" containerID="d055a298c80b1c78420262c5f6b1a8b08ee515621adade6064aa22f0860f7d0f" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.258149 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 14 07:25:00 crc kubenswrapper[5129]: E0314 07:25:00.276963 5129 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.278558 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-config-data" (OuterVolumeSpecName: "config-data") pod "b0a5119c-8784-48e5-841a-654dc253f0d0" (UID: "b0a5119c-8784-48e5-841a-654dc253f0d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.280423 5129 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.339081 5129 generic.go:334] "Generic (PLEG): container finished" podID="9d7fc10c-3f26-4459-9577-e7f09371a44b" containerID="f0d88a61613b0d796589b600c918f3c42969fe8081ec6149ed9e97446ae73149" exitCode=0 Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.339148 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cc754bc48-djssr" event={"ID":"9d7fc10c-3f26-4459-9577-e7f09371a44b","Type":"ContainerDied","Data":"f0d88a61613b0d796589b600c918f3c42969fe8081ec6149ed9e97446ae73149"} Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.358205 5129 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.358247 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a5119c-8784-48e5-841a-654dc253f0d0-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.367401 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.374234 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.375543 5129 scope.go:117] "RemoveContainer" containerID="b6c1654757c5b72874d5b9a46ca2254eeb71fc65f978afb406f43a49c77a25ae" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.392349 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.392364 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75b4586cb8-pfpxj" event={"ID":"8da79e9b-0c3f-4d66-9813-08116725c6a4","Type":"ContainerDied","Data":"e05aad8860f6b477225eadd8146b46e6b4b7e8c4a703290e40abe4f403f0a7bf"} Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.392349 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-75b4586cb8-pfpxj" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.426196 5129 generic.go:334] "Generic (PLEG): container finished" podID="73a06d78-48be-4099-b7fa-be0557b6138e" containerID="94ff5f18b233b64c1c783dbe228636e28477df4e6e43805ef408f8e1f99138ec" exitCode=0 Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.426235 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"73a06d78-48be-4099-b7fa-be0557b6138e","Type":"ContainerDied","Data":"94ff5f18b233b64c1c783dbe228636e28477df4e6e43805ef408f8e1f99138ec"} Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.427009 5129 scope.go:117] "RemoveContainer" containerID="19c89364d2b386b626087226ef1de11773409433b1f6b4165c7de722f1c08207" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.429978 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"288de2f6-818d-4167-8511-76f958542fbd","Type":"ContainerDied","Data":"bd03bffeeaeddba14cd269c20a41101d398671043ab76fd99636edd7cd4a2a76"} Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.430071 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.437026 5129 generic.go:334] "Generic (PLEG): container finished" podID="b169049b-3ab6-4871-ae92-0876f27347e6" containerID="7d094db7ac27362ce41fb20cfbbfbd0cb58d32ff76fd8de1e1c4e8741569d84b" exitCode=0 Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.437054 5129 generic.go:334] "Generic (PLEG): container finished" podID="b169049b-3ab6-4871-ae92-0876f27347e6" containerID="17fc0c37982bdc4c17582bebd5c88160b0617e58d221490152bd767bb2b74b79" exitCode=2 Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.437062 5129 generic.go:334] "Generic (PLEG): container finished" podID="b169049b-3ab6-4871-ae92-0876f27347e6" containerID="bcd78cfb2a66f431657cd0c11bcbda01e51d38d25273756f75b8f2c7eaef968d" exitCode=0 Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.437098 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b169049b-3ab6-4871-ae92-0876f27347e6","Type":"ContainerDied","Data":"7d094db7ac27362ce41fb20cfbbfbd0cb58d32ff76fd8de1e1c4e8741569d84b"} Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.437121 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b169049b-3ab6-4871-ae92-0876f27347e6","Type":"ContainerDied","Data":"17fc0c37982bdc4c17582bebd5c88160b0617e58d221490152bd767bb2b74b79"} Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.437131 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b169049b-3ab6-4871-ae92-0876f27347e6","Type":"ContainerDied","Data":"bcd78cfb2a66f431657cd0c11bcbda01e51d38d25273756f75b8f2c7eaef968d"} Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.440103 5129 generic.go:334] "Generic (PLEG): container finished" podID="1d59a327-2c1e-49e7-86b3-51e8b692545a" containerID="3fa18634f1bb9a6d07527df4350eeb4bdd0a91a0e97abb6e976b1194f1b93cd5" exitCode=2 Mar 14 07:25:00 crc 
kubenswrapper[5129]: I0314 07:25:00.440152 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1d59a327-2c1e-49e7-86b3-51e8b692545a","Type":"ContainerDied","Data":"3fa18634f1bb9a6d07527df4350eeb4bdd0a91a0e97abb6e976b1194f1b93cd5"} Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.440173 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1d59a327-2c1e-49e7-86b3-51e8b692545a","Type":"ContainerDied","Data":"4b80b0e9d56829972a7062c820c2450513b45e755db21d1e322d878b8218ede6"} Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.440219 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.452728 5129 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-48dp4" secret="" err="secret \"galera-openstack-dockercfg-gq57p\" not found" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.452918 5129 scope.go:117] "RemoveContainer" containerID="75b46a090025ba91094ce980bb19db3328ed7635e0272c7d2b864af79aa4e4e3" Mar 14 07:25:00 crc kubenswrapper[5129]: E0314 07:25:00.453436 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-48dp4_openstack(b77f36b2-be7b-43cb-ada4-74f524396018)\"" pod="openstack/root-account-create-update-48dp4" podUID="b77f36b2-be7b-43cb-ada4-74f524396018" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.465884 5129 generic.go:334] "Generic (PLEG): container finished" podID="f12ef33b-b86d-4c80-8f19-385ff5a93fee" containerID="1424db197590997e7d0bf3ba8eb822d66a08b3447321a7763baa443a1590f95f" exitCode=0 Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 
07:25:00.465975 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f12ef33b-b86d-4c80-8f19-385ff5a93fee","Type":"ContainerDied","Data":"1424db197590997e7d0bf3ba8eb822d66a08b3447321a7763baa443a1590f95f"} Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.475451 5129 generic.go:334] "Generic (PLEG): container finished" podID="b0a5119c-8784-48e5-841a-654dc253f0d0" containerID="51a6a8c222fecf3f936e6b32ca1163ecf7305b993569a3ab2d01e4f6863e6e3d" exitCode=0 Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.475889 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b0a5119c-8784-48e5-841a-654dc253f0d0","Type":"ContainerDied","Data":"51a6a8c222fecf3f936e6b32ca1163ecf7305b993569a3ab2d01e4f6863e6e3d"} Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.475940 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.475967 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b0a5119c-8784-48e5-841a-654dc253f0d0","Type":"ContainerDied","Data":"b183b39a72e933f206d3aff379c314c3a7df9dcb04f04bccb4ce2810cbc81c44"} Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.478530 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5ea2-account-create-update-4bdm8" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.484352 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-75b4586cb8-pfpxj"] Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.493406 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-75b4586cb8-pfpxj"] Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.570934 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-config-data\") pod \"9d7fc10c-3f26-4459-9577-e7f09371a44b\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.570979 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-internal-tls-certs\") pod \"9d7fc10c-3f26-4459-9577-e7f09371a44b\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.571031 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-public-tls-certs\") pod \"9d7fc10c-3f26-4459-9577-e7f09371a44b\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.571067 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d7fc10c-3f26-4459-9577-e7f09371a44b-logs\") pod \"9d7fc10c-3f26-4459-9577-e7f09371a44b\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.571098 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-config-data-custom\") pod \"9d7fc10c-3f26-4459-9577-e7f09371a44b\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.571116 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-combined-ca-bundle\") pod \"9d7fc10c-3f26-4459-9577-e7f09371a44b\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.571158 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77vpv\" (UniqueName: \"kubernetes.io/projected/9d7fc10c-3f26-4459-9577-e7f09371a44b-kube-api-access-77vpv\") pod \"9d7fc10c-3f26-4459-9577-e7f09371a44b\" (UID: \"9d7fc10c-3f26-4459-9577-e7f09371a44b\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.572649 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d7fc10c-3f26-4459-9577-e7f09371a44b-logs" (OuterVolumeSpecName: "logs") pod "9d7fc10c-3f26-4459-9577-e7f09371a44b" (UID: "9d7fc10c-3f26-4459-9577-e7f09371a44b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.594658 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9d7fc10c-3f26-4459-9577-e7f09371a44b" (UID: "9d7fc10c-3f26-4459-9577-e7f09371a44b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.614502 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d7fc10c-3f26-4459-9577-e7f09371a44b-kube-api-access-77vpv" (OuterVolumeSpecName: "kube-api-access-77vpv") pod "9d7fc10c-3f26-4459-9577-e7f09371a44b" (UID: "9d7fc10c-3f26-4459-9577-e7f09371a44b"). InnerVolumeSpecName "kube-api-access-77vpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.634686 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d7fc10c-3f26-4459-9577-e7f09371a44b" (UID: "9d7fc10c-3f26-4459-9577-e7f09371a44b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.654670 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-config-data" (OuterVolumeSpecName: "config-data") pod "9d7fc10c-3f26-4459-9577-e7f09371a44b" (UID: "9d7fc10c-3f26-4459-9577-e7f09371a44b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.660009 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9d7fc10c-3f26-4459-9577-e7f09371a44b" (UID: "9d7fc10c-3f26-4459-9577-e7f09371a44b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.664441 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9d7fc10c-3f26-4459-9577-e7f09371a44b" (UID: "9d7fc10c-3f26-4459-9577-e7f09371a44b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.674658 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d7fc10c-3f26-4459-9577-e7f09371a44b-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.674697 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.674711 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.674722 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77vpv\" (UniqueName: \"kubernetes.io/projected/9d7fc10c-3f26-4459-9577-e7f09371a44b-kube-api-access-77vpv\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.674768 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.674780 5129 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.674790 5129 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d7fc10c-3f26-4459-9577-e7f09371a44b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:00 crc kubenswrapper[5129]: E0314 07:25:00.674723 5129 secret.go:188] Couldn't get secret openstack/nova-cell1-conductor-config-data: secret "nova-cell1-conductor-config-data" not found Mar 14 07:25:00 crc kubenswrapper[5129]: E0314 07:25:00.674974 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-config-data podName:df407ca4-4a5d-404c-ab22-89bcde2439c4 nodeName:}" failed. No retries permitted until 2026-03-14 07:25:04.674950175 +0000 UTC m=+1567.426865409 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-config-data") pod "nova-cell1-conductor-0" (UID: "df407ca4-4a5d-404c-ab22-89bcde2439c4") : secret "nova-cell1-conductor-config-data" not found Mar 14 07:25:00 crc kubenswrapper[5129]: E0314 07:25:00.674763 5129 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 14 07:25:00 crc kubenswrapper[5129]: E0314 07:25:00.675873 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b77f36b2-be7b-43cb-ada4-74f524396018-operator-scripts podName:b77f36b2-be7b-43cb-ada4-74f524396018 nodeName:}" failed. No retries permitted until 2026-03-14 07:25:02.675861419 +0000 UTC m=+1565.427776603 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/b77f36b2-be7b-43cb-ada4-74f524396018-operator-scripts") pod "root-account-create-update-48dp4" (UID: "b77f36b2-be7b-43cb-ada4-74f524396018") : configmap "openstack-scripts" not found Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.700259 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.720323 5129 scope.go:117] "RemoveContainer" containerID="c8a2eb3d81e166b08f15ce66729e9485cc211076230f60d2b3882494534f0f34" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.782476 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f12ef33b-b86d-4c80-8f19-385ff5a93fee-nova-metadata-tls-certs\") pod \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.782588 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f12ef33b-b86d-4c80-8f19-385ff5a93fee-logs\") pod \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.782638 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkwjd\" (UniqueName: \"kubernetes.io/projected/f12ef33b-b86d-4c80-8f19-385ff5a93fee-kube-api-access-xkwjd\") pod \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.782653 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f12ef33b-b86d-4c80-8f19-385ff5a93fee-config-data\") pod 
\"f12ef33b-b86d-4c80-8f19-385ff5a93fee\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.782732 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12ef33b-b86d-4c80-8f19-385ff5a93fee-combined-ca-bundle\") pod \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\" (UID: \"f12ef33b-b86d-4c80-8f19-385ff5a93fee\") " Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.783388 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzjx5\" (UniqueName: \"kubernetes.io/projected/a8b77a9e-f83e-417f-88f7-c378046c4c54-kube-api-access-bzjx5\") pod \"keystone-5ea2-account-create-update-4bdm8\" (UID: \"a8b77a9e-f83e-417f-88f7-c378046c4c54\") " pod="openstack/keystone-5ea2-account-create-update-4bdm8" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.783459 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8b77a9e-f83e-417f-88f7-c378046c4c54-operator-scripts\") pod \"keystone-5ea2-account-create-update-4bdm8\" (UID: \"a8b77a9e-f83e-417f-88f7-c378046c4c54\") " pod="openstack/keystone-5ea2-account-create-update-4bdm8" Mar 14 07:25:00 crc kubenswrapper[5129]: E0314 07:25:00.783674 5129 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 14 07:25:00 crc kubenswrapper[5129]: E0314 07:25:00.783715 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a8b77a9e-f83e-417f-88f7-c378046c4c54-operator-scripts podName:a8b77a9e-f83e-417f-88f7-c378046c4c54 nodeName:}" failed. No retries permitted until 2026-03-14 07:25:02.783702973 +0000 UTC m=+1565.535618157 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a8b77a9e-f83e-417f-88f7-c378046c4c54-operator-scripts") pod "keystone-5ea2-account-create-update-4bdm8" (UID: "a8b77a9e-f83e-417f-88f7-c378046c4c54") : configmap "openstack-scripts" not found Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.786426 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f12ef33b-b86d-4c80-8f19-385ff5a93fee-logs" (OuterVolumeSpecName: "logs") pod "f12ef33b-b86d-4c80-8f19-385ff5a93fee" (UID: "f12ef33b-b86d-4c80-8f19-385ff5a93fee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:00 crc kubenswrapper[5129]: E0314 07:25:00.788110 5129 projected.go:194] Error preparing data for projected volume kube-api-access-bzjx5 for pod openstack/keystone-5ea2-account-create-update-4bdm8: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 14 07:25:00 crc kubenswrapper[5129]: E0314 07:25:00.789772 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8b77a9e-f83e-417f-88f7-c378046c4c54-kube-api-access-bzjx5 podName:a8b77a9e-f83e-417f-88f7-c378046c4c54 nodeName:}" failed. No retries permitted until 2026-03-14 07:25:02.789753088 +0000 UTC m=+1565.541668272 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-bzjx5" (UniqueName: "kubernetes.io/projected/a8b77a9e-f83e-417f-88f7-c378046c4c54-kube-api-access-bzjx5") pod "keystone-5ea2-account-create-update-4bdm8" (UID: "a8b77a9e-f83e-417f-88f7-c378046c4c54") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.792440 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.806324 5129 scope.go:117] "RemoveContainer" containerID="c1fcdf4a873899697c71e1744e78008c9ad86529e509e2cf0e32fd619d7c1cb1" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.809191 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.812307 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f12ef33b-b86d-4c80-8f19-385ff5a93fee-kube-api-access-xkwjd" (OuterVolumeSpecName: "kube-api-access-xkwjd") pod "f12ef33b-b86d-4c80-8f19-385ff5a93fee" (UID: "f12ef33b-b86d-4c80-8f19-385ff5a93fee"). InnerVolumeSpecName "kube-api-access-xkwjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.813322 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5ea2-account-create-update-4bdm8" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.814697 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.822506 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.829523 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.837380 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.847953 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.850085 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12ef33b-b86d-4c80-8f19-385ff5a93fee-config-data" (OuterVolumeSpecName: "config-data") pod "f12ef33b-b86d-4c80-8f19-385ff5a93fee" (UID: "f12ef33b-b86d-4c80-8f19-385ff5a93fee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:00 crc kubenswrapper[5129]: I0314 07:25:00.851156 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12ef33b-b86d-4c80-8f19-385ff5a93fee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f12ef33b-b86d-4c80-8f19-385ff5a93fee" (UID: "f12ef33b-b86d-4c80-8f19-385ff5a93fee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.887382 5129 scope.go:117] "RemoveContainer" containerID="3fa18634f1bb9a6d07527df4350eeb4bdd0a91a0e97abb6e976b1194f1b93cd5" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.891481 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73a06d78-48be-4099-b7fa-be0557b6138e-httpd-run\") pod \"73a06d78-48be-4099-b7fa-be0557b6138e\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.891531 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"73a06d78-48be-4099-b7fa-be0557b6138e\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.891563 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-scripts\") pod \"73a06d78-48be-4099-b7fa-be0557b6138e\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.891660 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfzf8\" (UniqueName: \"kubernetes.io/projected/73a06d78-48be-4099-b7fa-be0557b6138e-kube-api-access-wfzf8\") pod \"73a06d78-48be-4099-b7fa-be0557b6138e\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.891722 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-config-data\") pod \"73a06d78-48be-4099-b7fa-be0557b6138e\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 
07:25:00.891782 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-combined-ca-bundle\") pod \"73a06d78-48be-4099-b7fa-be0557b6138e\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.891883 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-public-tls-certs\") pod \"73a06d78-48be-4099-b7fa-be0557b6138e\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.891924 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73a06d78-48be-4099-b7fa-be0557b6138e-logs\") pod \"73a06d78-48be-4099-b7fa-be0557b6138e\" (UID: \"73a06d78-48be-4099-b7fa-be0557b6138e\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.892284 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12ef33b-b86d-4c80-8f19-385ff5a93fee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.892295 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f12ef33b-b86d-4c80-8f19-385ff5a93fee-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.892304 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkwjd\" (UniqueName: \"kubernetes.io/projected/f12ef33b-b86d-4c80-8f19-385ff5a93fee-kube-api-access-xkwjd\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.892314 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f12ef33b-b86d-4c80-8f19-385ff5a93fee-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.894945 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a06d78-48be-4099-b7fa-be0557b6138e-logs" (OuterVolumeSpecName: "logs") pod "73a06d78-48be-4099-b7fa-be0557b6138e" (UID: "73a06d78-48be-4099-b7fa-be0557b6138e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.895707 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a06d78-48be-4099-b7fa-be0557b6138e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "73a06d78-48be-4099-b7fa-be0557b6138e" (UID: "73a06d78-48be-4099-b7fa-be0557b6138e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.919367 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "73a06d78-48be-4099-b7fa-be0557b6138e" (UID: "73a06d78-48be-4099-b7fa-be0557b6138e"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.921212 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-scripts" (OuterVolumeSpecName: "scripts") pod "73a06d78-48be-4099-b7fa-be0557b6138e" (UID: "73a06d78-48be-4099-b7fa-be0557b6138e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.921315 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a06d78-48be-4099-b7fa-be0557b6138e-kube-api-access-wfzf8" (OuterVolumeSpecName: "kube-api-access-wfzf8") pod "73a06d78-48be-4099-b7fa-be0557b6138e" (UID: "73a06d78-48be-4099-b7fa-be0557b6138e"). InnerVolumeSpecName "kube-api-access-wfzf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.928042 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12ef33b-b86d-4c80-8f19-385ff5a93fee-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f12ef33b-b86d-4c80-8f19-385ff5a93fee" (UID: "f12ef33b-b86d-4c80-8f19-385ff5a93fee"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.929259 5129 scope.go:117] "RemoveContainer" containerID="3fa18634f1bb9a6d07527df4350eeb4bdd0a91a0e97abb6e976b1194f1b93cd5" Mar 14 07:25:01 crc kubenswrapper[5129]: E0314 07:25:00.929762 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fa18634f1bb9a6d07527df4350eeb4bdd0a91a0e97abb6e976b1194f1b93cd5\": container with ID starting with 3fa18634f1bb9a6d07527df4350eeb4bdd0a91a0e97abb6e976b1194f1b93cd5 not found: ID does not exist" containerID="3fa18634f1bb9a6d07527df4350eeb4bdd0a91a0e97abb6e976b1194f1b93cd5" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.929791 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fa18634f1bb9a6d07527df4350eeb4bdd0a91a0e97abb6e976b1194f1b93cd5"} err="failed to get container status \"3fa18634f1bb9a6d07527df4350eeb4bdd0a91a0e97abb6e976b1194f1b93cd5\": rpc error: code = NotFound desc = could 
not find container \"3fa18634f1bb9a6d07527df4350eeb4bdd0a91a0e97abb6e976b1194f1b93cd5\": container with ID starting with 3fa18634f1bb9a6d07527df4350eeb4bdd0a91a0e97abb6e976b1194f1b93cd5 not found: ID does not exist" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.929820 5129 scope.go:117] "RemoveContainer" containerID="51a6a8c222fecf3f936e6b32ca1163ecf7305b993569a3ab2d01e4f6863e6e3d" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.935371 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73a06d78-48be-4099-b7fa-be0557b6138e" (UID: "73a06d78-48be-4099-b7fa-be0557b6138e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.952553 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-config-data" (OuterVolumeSpecName: "config-data") pod "73a06d78-48be-4099-b7fa-be0557b6138e" (UID: "73a06d78-48be-4099-b7fa-be0557b6138e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.988948 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "73a06d78-48be-4099-b7fa-be0557b6138e" (UID: "73a06d78-48be-4099-b7fa-be0557b6138e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.995593 5129 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.995624 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73a06d78-48be-4099-b7fa-be0557b6138e-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.995633 5129 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73a06d78-48be-4099-b7fa-be0557b6138e-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.995657 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.995665 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.995674 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfzf8\" (UniqueName: \"kubernetes.io/projected/73a06d78-48be-4099-b7fa-be0557b6138e-kube-api-access-wfzf8\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.995683 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.995691 5129 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a06d78-48be-4099-b7fa-be0557b6138e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:00.995699 5129 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f12ef33b-b86d-4c80-8f19-385ff5a93fee-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.020802 5129 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.057469 5129 scope.go:117] "RemoveContainer" containerID="487da9597dbcf0671505d2cad76544a39daed693b3625176056d417c8e515636" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.087053 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.087565 5129 scope.go:117] "RemoveContainer" containerID="51a6a8c222fecf3f936e6b32ca1163ecf7305b993569a3ab2d01e4f6863e6e3d" Mar 14 07:25:01 crc kubenswrapper[5129]: E0314 07:25:01.090100 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51a6a8c222fecf3f936e6b32ca1163ecf7305b993569a3ab2d01e4f6863e6e3d\": container with ID starting with 51a6a8c222fecf3f936e6b32ca1163ecf7305b993569a3ab2d01e4f6863e6e3d not found: ID does not exist" containerID="51a6a8c222fecf3f936e6b32ca1163ecf7305b993569a3ab2d01e4f6863e6e3d" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.090144 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51a6a8c222fecf3f936e6b32ca1163ecf7305b993569a3ab2d01e4f6863e6e3d"} err="failed to get container status 
\"51a6a8c222fecf3f936e6b32ca1163ecf7305b993569a3ab2d01e4f6863e6e3d\": rpc error: code = NotFound desc = could not find container \"51a6a8c222fecf3f936e6b32ca1163ecf7305b993569a3ab2d01e4f6863e6e3d\": container with ID starting with 51a6a8c222fecf3f936e6b32ca1163ecf7305b993569a3ab2d01e4f6863e6e3d not found: ID does not exist" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.090164 5129 scope.go:117] "RemoveContainer" containerID="487da9597dbcf0671505d2cad76544a39daed693b3625176056d417c8e515636" Mar 14 07:25:01 crc kubenswrapper[5129]: E0314 07:25:01.090984 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"487da9597dbcf0671505d2cad76544a39daed693b3625176056d417c8e515636\": container with ID starting with 487da9597dbcf0671505d2cad76544a39daed693b3625176056d417c8e515636 not found: ID does not exist" containerID="487da9597dbcf0671505d2cad76544a39daed693b3625176056d417c8e515636" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.091019 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"487da9597dbcf0671505d2cad76544a39daed693b3625176056d417c8e515636"} err="failed to get container status \"487da9597dbcf0671505d2cad76544a39daed693b3625176056d417c8e515636\": rpc error: code = NotFound desc = could not find container \"487da9597dbcf0671505d2cad76544a39daed693b3625176056d417c8e515636\": container with ID starting with 487da9597dbcf0671505d2cad76544a39daed693b3625176056d417c8e515636 not found: ID does not exist" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.101986 5129 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.148011 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.151756 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.205541 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233604f0-adda-4669-b868-b96791d98bca-logs\") pod \"233604f0-adda-4669-b868-b96791d98bca\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.205662 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-sg-core-conf-yaml\") pod \"b169049b-3ab6-4871-ae92-0876f27347e6\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.205735 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-combined-ca-bundle\") pod \"b169049b-3ab6-4871-ae92-0876f27347e6\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.205778 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzhdp\" (UniqueName: \"kubernetes.io/projected/30ca9513-5ae9-4520-8012-3c941786ce2a-kube-api-access-kzhdp\") pod \"30ca9513-5ae9-4520-8012-3c941786ce2a\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.205835 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/233604f0-adda-4669-b868-b96791d98bca-config-data-custom\") pod \"233604f0-adda-4669-b868-b96791d98bca\" (UID: 
\"233604f0-adda-4669-b868-b96791d98bca\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.205854 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6sqx\" (UniqueName: \"kubernetes.io/projected/233604f0-adda-4669-b868-b96791d98bca-kube-api-access-t6sqx\") pod \"233604f0-adda-4669-b868-b96791d98bca\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.205916 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data-custom\") pod \"30ca9513-5ae9-4520-8012-3c941786ce2a\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.205938 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-scripts\") pod \"b169049b-3ab6-4871-ae92-0876f27347e6\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.205993 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233604f0-adda-4669-b868-b96791d98bca-combined-ca-bundle\") pod \"233604f0-adda-4669-b868-b96791d98bca\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.206000 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/233604f0-adda-4669-b868-b96791d98bca-logs" (OuterVolumeSpecName: "logs") pod "233604f0-adda-4669-b868-b96791d98bca" (UID: "233604f0-adda-4669-b868-b96791d98bca"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.206021 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-ceilometer-tls-certs\") pod \"b169049b-3ab6-4871-ae92-0876f27347e6\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.206067 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233604f0-adda-4669-b868-b96791d98bca-config-data\") pod \"233604f0-adda-4669-b868-b96791d98bca\" (UID: \"233604f0-adda-4669-b868-b96791d98bca\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.206172 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30ca9513-5ae9-4520-8012-3c941786ce2a-logs\") pod \"30ca9513-5ae9-4520-8012-3c941786ce2a\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.206243 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b169049b-3ab6-4871-ae92-0876f27347e6-log-httpd\") pod \"b169049b-3ab6-4871-ae92-0876f27347e6\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.206261 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvd74\" (UniqueName: \"kubernetes.io/projected/b169049b-3ab6-4871-ae92-0876f27347e6-kube-api-access-pvd74\") pod \"b169049b-3ab6-4871-ae92-0876f27347e6\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.206315 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b169049b-3ab6-4871-ae92-0876f27347e6-run-httpd\") pod \"b169049b-3ab6-4871-ae92-0876f27347e6\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.206407 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-config-data\") pod \"b169049b-3ab6-4871-ae92-0876f27347e6\" (UID: \"b169049b-3ab6-4871-ae92-0876f27347e6\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.206433 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data\") pod \"30ca9513-5ae9-4520-8012-3c941786ce2a\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.206478 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-combined-ca-bundle\") pod \"30ca9513-5ae9-4520-8012-3c941786ce2a\" (UID: \"30ca9513-5ae9-4520-8012-3c941786ce2a\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.207073 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233604f0-adda-4669-b868-b96791d98bca-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.216719 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233604f0-adda-4669-b868-b96791d98bca-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "233604f0-adda-4669-b868-b96791d98bca" (UID: "233604f0-adda-4669-b868-b96791d98bca"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.217574 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30ca9513-5ae9-4520-8012-3c941786ce2a-logs" (OuterVolumeSpecName: "logs") pod "30ca9513-5ae9-4520-8012-3c941786ce2a" (UID: "30ca9513-5ae9-4520-8012-3c941786ce2a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.219712 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b169049b-3ab6-4871-ae92-0876f27347e6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b169049b-3ab6-4871-ae92-0876f27347e6" (UID: "b169049b-3ab6-4871-ae92-0876f27347e6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.219952 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b169049b-3ab6-4871-ae92-0876f27347e6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b169049b-3ab6-4871-ae92-0876f27347e6" (UID: "b169049b-3ab6-4871-ae92-0876f27347e6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.221475 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30ca9513-5ae9-4520-8012-3c941786ce2a-kube-api-access-kzhdp" (OuterVolumeSpecName: "kube-api-access-kzhdp") pod "30ca9513-5ae9-4520-8012-3c941786ce2a" (UID: "30ca9513-5ae9-4520-8012-3c941786ce2a"). InnerVolumeSpecName "kube-api-access-kzhdp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.231061 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/233604f0-adda-4669-b868-b96791d98bca-kube-api-access-t6sqx" (OuterVolumeSpecName: "kube-api-access-t6sqx") pod "233604f0-adda-4669-b868-b96791d98bca" (UID: "233604f0-adda-4669-b868-b96791d98bca"). InnerVolumeSpecName "kube-api-access-t6sqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.231168 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "30ca9513-5ae9-4520-8012-3c941786ce2a" (UID: "30ca9513-5ae9-4520-8012-3c941786ce2a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.231434 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-scripts" (OuterVolumeSpecName: "scripts") pod "b169049b-3ab6-4871-ae92-0876f27347e6" (UID: "b169049b-3ab6-4871-ae92-0876f27347e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.231585 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b169049b-3ab6-4871-ae92-0876f27347e6-kube-api-access-pvd74" (OuterVolumeSpecName: "kube-api-access-pvd74") pod "b169049b-3ab6-4871-ae92-0876f27347e6" (UID: "b169049b-3ab6-4871-ae92-0876f27347e6"). InnerVolumeSpecName "kube-api-access-pvd74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.285266 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30ca9513-5ae9-4520-8012-3c941786ce2a" (UID: "30ca9513-5ae9-4520-8012-3c941786ce2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.308798 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b169049b-3ab6-4871-ae92-0876f27347e6" (UID: "b169049b-3ab6-4871-ae92-0876f27347e6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.309233 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30ca9513-5ae9-4520-8012-3c941786ce2a-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.309263 5129 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b169049b-3ab6-4871-ae92-0876f27347e6-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.309276 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvd74\" (UniqueName: \"kubernetes.io/projected/b169049b-3ab6-4871-ae92-0876f27347e6-kube-api-access-pvd74\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.309288 5129 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b169049b-3ab6-4871-ae92-0876f27347e6-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 
14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.309299 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.309309 5129 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.309373 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzhdp\" (UniqueName: \"kubernetes.io/projected/30ca9513-5ae9-4520-8012-3c941786ce2a-kube-api-access-kzhdp\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.309385 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/233604f0-adda-4669-b868-b96791d98bca-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.309396 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6sqx\" (UniqueName: \"kubernetes.io/projected/233604f0-adda-4669-b868-b96791d98bca-kube-api-access-t6sqx\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.309406 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.309416 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.319625 5129 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data" (OuterVolumeSpecName: "config-data") pod "30ca9513-5ae9-4520-8012-3c941786ce2a" (UID: "30ca9513-5ae9-4520-8012-3c941786ce2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.335798 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233604f0-adda-4669-b868-b96791d98bca-config-data" (OuterVolumeSpecName: "config-data") pod "233604f0-adda-4669-b868-b96791d98bca" (UID: "233604f0-adda-4669-b868-b96791d98bca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.337486 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233604f0-adda-4669-b868-b96791d98bca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "233604f0-adda-4669-b868-b96791d98bca" (UID: "233604f0-adda-4669-b868-b96791d98bca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.402818 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-config-data" (OuterVolumeSpecName: "config-data") pod "b169049b-3ab6-4871-ae92-0876f27347e6" (UID: "b169049b-3ab6-4871-ae92-0876f27347e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.410582 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233604f0-adda-4669-b868-b96791d98bca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.410629 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233604f0-adda-4669-b868-b96791d98bca-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.410640 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.410649 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ca9513-5ae9-4520-8012-3c941786ce2a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.414214 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b169049b-3ab6-4871-ae92-0876f27347e6" (UID: "b169049b-3ab6-4871-ae92-0876f27347e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.418110 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b169049b-3ab6-4871-ae92-0876f27347e6" (UID: "b169049b-3ab6-4871-ae92-0876f27347e6"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.437260 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.458967 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_72ad7859-e34b-4393-b696-548fd7ac8e1d/ovn-northd/0.log" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.459109 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 14 07:25:01 crc kubenswrapper[5129]: E0314 07:25:01.463778 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5f37f6fbaa362787227b822e969f3f95cfd2287989a1655515b437116197b4e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 14 07:25:01 crc kubenswrapper[5129]: E0314 07:25:01.468270 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5f37f6fbaa362787227b822e969f3f95cfd2287989a1655515b437116197b4e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 14 07:25:01 crc kubenswrapper[5129]: E0314 07:25:01.469509 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5f37f6fbaa362787227b822e969f3f95cfd2287989a1655515b437116197b4e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 14 07:25:01 crc kubenswrapper[5129]: E0314 07:25:01.469582 5129 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , 
stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="df407ca4-4a5d-404c-ab22-89bcde2439c4" containerName="nova-cell1-conductor-conductor" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.494905 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f12ef33b-b86d-4c80-8f19-385ff5a93fee","Type":"ContainerDied","Data":"5d38b16e5304399ab733905799449c8e4299c7051910f1436062b73d0af4a1f4"} Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.494967 5129 scope.go:117] "RemoveContainer" containerID="1424db197590997e7d0bf3ba8eb822d66a08b3447321a7763baa443a1590f95f" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.495135 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.502298 5129 generic.go:334] "Generic (PLEG): container finished" podID="30ca9513-5ae9-4520-8012-3c941786ce2a" containerID="ee25cb26a5aa1f0bb656d11fce4519f1563d782dbc7cf1fd218ebf2964d004ce" exitCode=0 Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.502375 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" event={"ID":"30ca9513-5ae9-4520-8012-3c941786ce2a","Type":"ContainerDied","Data":"ee25cb26a5aa1f0bb656d11fce4519f1563d782dbc7cf1fd218ebf2964d004ce"} Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.502406 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" event={"ID":"30ca9513-5ae9-4520-8012-3c941786ce2a","Type":"ContainerDied","Data":"c80dc4bf162b474b3071f41b2f9c48160db5cdc8397f4bb35b1410cb11c1ab51"} Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.502511 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-54888cd7bb-schqh" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.507247 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"73a06d78-48be-4099-b7fa-be0557b6138e","Type":"ContainerDied","Data":"8597ce79a857cf34460f8fa1817f53b9faa5fe60bf1de30cf299a12972307200"} Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.507371 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.511421 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_72ad7859-e34b-4393-b696-548fd7ac8e1d/ovn-northd/0.log" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.511502 5129 generic.go:334] "Generic (PLEG): container finished" podID="72ad7859-e34b-4393-b696-548fd7ac8e1d" containerID="d0cdce7e2d0eb6e6c377ac56ed26b1e0c8ae3d516df683f25a4cfa59b208e395" exitCode=139 Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.511647 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.511701 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"72ad7859-e34b-4393-b696-548fd7ac8e1d","Type":"ContainerDied","Data":"d0cdce7e2d0eb6e6c377ac56ed26b1e0c8ae3d516df683f25a4cfa59b208e395"} Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.511733 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"72ad7859-e34b-4393-b696-548fd7ac8e1d","Type":"ContainerDied","Data":"d5ea91c26a8f8e3e2cdddf83fca8f1d7c2ee1497148b94cdde0cc7199396cb35"} Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.514232 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/659ee685-6b83-4af2-bd2e-e5ce9372e408-memcached-tls-certs\") pod \"659ee685-6b83-4af2-bd2e-e5ce9372e408\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.514321 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659ee685-6b83-4af2-bd2e-e5ce9372e408-combined-ca-bundle\") pod \"659ee685-6b83-4af2-bd2e-e5ce9372e408\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.514363 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ad7859-e34b-4393-b696-548fd7ac8e1d-metrics-certs-tls-certs\") pod \"72ad7859-e34b-4393-b696-548fd7ac8e1d\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.514408 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ad7859-e34b-4393-b696-548fd7ac8e1d-combined-ca-bundle\") 
pod \"72ad7859-e34b-4393-b696-548fd7ac8e1d\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.514488 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72ad7859-e34b-4393-b696-548fd7ac8e1d-config\") pod \"72ad7859-e34b-4393-b696-548fd7ac8e1d\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.514523 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/659ee685-6b83-4af2-bd2e-e5ce9372e408-kolla-config\") pod \"659ee685-6b83-4af2-bd2e-e5ce9372e408\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.514588 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72ad7859-e34b-4393-b696-548fd7ac8e1d-scripts\") pod \"72ad7859-e34b-4393-b696-548fd7ac8e1d\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.514924 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/72ad7859-e34b-4393-b696-548fd7ac8e1d-ovn-rundir\") pod \"72ad7859-e34b-4393-b696-548fd7ac8e1d\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.514983 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv4l6\" (UniqueName: \"kubernetes.io/projected/72ad7859-e34b-4393-b696-548fd7ac8e1d-kube-api-access-qv4l6\") pod \"72ad7859-e34b-4393-b696-548fd7ac8e1d\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.515051 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-wl5dm\" (UniqueName: \"kubernetes.io/projected/659ee685-6b83-4af2-bd2e-e5ce9372e408-kube-api-access-wl5dm\") pod \"659ee685-6b83-4af2-bd2e-e5ce9372e408\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.515085 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ad7859-e34b-4393-b696-548fd7ac8e1d-ovn-northd-tls-certs\") pod \"72ad7859-e34b-4393-b696-548fd7ac8e1d\" (UID: \"72ad7859-e34b-4393-b696-548fd7ac8e1d\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.515141 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/659ee685-6b83-4af2-bd2e-e5ce9372e408-config-data\") pod \"659ee685-6b83-4af2-bd2e-e5ce9372e408\" (UID: \"659ee685-6b83-4af2-bd2e-e5ce9372e408\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.515628 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.515653 5129 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b169049b-3ab6-4871-ae92-0876f27347e6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.516359 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/659ee685-6b83-4af2-bd2e-e5ce9372e408-config-data" (OuterVolumeSpecName: "config-data") pod "659ee685-6b83-4af2-bd2e-e5ce9372e408" (UID: "659ee685-6b83-4af2-bd2e-e5ce9372e408"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.517849 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72ad7859-e34b-4393-b696-548fd7ac8e1d-config" (OuterVolumeSpecName: "config") pod "72ad7859-e34b-4393-b696-548fd7ac8e1d" (UID: "72ad7859-e34b-4393-b696-548fd7ac8e1d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.521809 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/659ee685-6b83-4af2-bd2e-e5ce9372e408-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "659ee685-6b83-4af2-bd2e-e5ce9372e408" (UID: "659ee685-6b83-4af2-bd2e-e5ce9372e408"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.536178 5129 scope.go:117] "RemoveContainer" containerID="73a607e1cd40ac4f3d4872afe8c005319bf76b10db5cdba8a1d09fc3f83dead5" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.539316 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/659ee685-6b83-4af2-bd2e-e5ce9372e408-kube-api-access-wl5dm" (OuterVolumeSpecName: "kube-api-access-wl5dm") pod "659ee685-6b83-4af2-bd2e-e5ce9372e408" (UID: "659ee685-6b83-4af2-bd2e-e5ce9372e408"). InnerVolumeSpecName "kube-api-access-wl5dm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.539855 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72ad7859-e34b-4393-b696-548fd7ac8e1d-scripts" (OuterVolumeSpecName: "scripts") pod "72ad7859-e34b-4393-b696-548fd7ac8e1d" (UID: "72ad7859-e34b-4393-b696-548fd7ac8e1d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.540078 5129 generic.go:334] "Generic (PLEG): container finished" podID="659ee685-6b83-4af2-bd2e-e5ce9372e408" containerID="babbc10ebff9f1149bd8ac464ac8ab29bfe228ddf2812b62e0b09490fb06eb73" exitCode=0 Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.540149 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"659ee685-6b83-4af2-bd2e-e5ce9372e408","Type":"ContainerDied","Data":"babbc10ebff9f1149bd8ac464ac8ab29bfe228ddf2812b62e0b09490fb06eb73"} Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.540181 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"659ee685-6b83-4af2-bd2e-e5ce9372e408","Type":"ContainerDied","Data":"e2e2589920cfdc3b915a03e0c59ca4d3599190e1d0272f7ed29588b81322f69e"} Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.540208 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.551734 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72ad7859-e34b-4393-b696-548fd7ac8e1d-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "72ad7859-e34b-4393-b696-548fd7ac8e1d" (UID: "72ad7859-e34b-4393-b696-548fd7ac8e1d"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.556259 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.557569 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.560498 5129 scope.go:117] "RemoveContainer" containerID="ee25cb26a5aa1f0bb656d11fce4519f1563d782dbc7cf1fd218ebf2964d004ce" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.562173 5129 generic.go:334] "Generic (PLEG): container finished" podID="233604f0-adda-4669-b868-b96791d98bca" containerID="3617459b9e0934de4b0cdcd527b0ea212329a10cd6b69d43c83978c12d92190e" exitCode=0 Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.562216 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-696c7b8d5f-j2w5q" event={"ID":"233604f0-adda-4669-b868-b96791d98bca","Type":"ContainerDied","Data":"3617459b9e0934de4b0cdcd527b0ea212329a10cd6b69d43c83978c12d92190e"} Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.562236 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-696c7b8d5f-j2w5q" event={"ID":"233604f0-adda-4669-b868-b96791d98bca","Type":"ContainerDied","Data":"f7af0415a56cb4e739d7f5fdc29c7672f0c3842490e43ef212b01081a0025a67"} Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.562315 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-696c7b8d5f-j2w5q" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.568198 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-54888cd7bb-schqh"] Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.569672 5129 generic.go:334] "Generic (PLEG): container finished" podID="6b15058f-0936-4bb9-ad72-1c27661b4b82" containerID="739b061c527066552e175c0fbe0488722ce25d75896eb047b0bd920e05d3e6cb" exitCode=0 Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.569736 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6b15058f-0936-4bb9-ad72-1c27661b4b82","Type":"ContainerDied","Data":"739b061c527066552e175c0fbe0488722ce25d75896eb047b0bd920e05d3e6cb"} Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.576259 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72ad7859-e34b-4393-b696-548fd7ac8e1d-kube-api-access-qv4l6" (OuterVolumeSpecName: "kube-api-access-qv4l6") pod "72ad7859-e34b-4393-b696-548fd7ac8e1d" (UID: "72ad7859-e34b-4393-b696-548fd7ac8e1d"). InnerVolumeSpecName "kube-api-access-qv4l6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.580077 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cc754bc48-djssr" event={"ID":"9d7fc10c-3f26-4459-9577-e7f09371a44b","Type":"ContainerDied","Data":"415248023f335d941d9c0d17ead22257eaba6c308138cb3f4b361ef356ebbed0"} Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.580154 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6cc754bc48-djssr" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.586406 5129 scope.go:117] "RemoveContainer" containerID="d76db94856217343e1382a613c04bd6a275d213a9da38e6ea54fa7e5058c934f" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.589262 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-54888cd7bb-schqh"] Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.594645 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/659ee685-6b83-4af2-bd2e-e5ce9372e408-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "659ee685-6b83-4af2-bd2e-e5ce9372e408" (UID: "659ee685-6b83-4af2-bd2e-e5ce9372e408"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.597749 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ad7859-e34b-4393-b696-548fd7ac8e1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72ad7859-e34b-4393-b696-548fd7ac8e1d" (UID: "72ad7859-e34b-4393-b696-548fd7ac8e1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.598395 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-696c7b8d5f-j2w5q"] Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.609102 5129 generic.go:334] "Generic (PLEG): container finished" podID="b169049b-3ab6-4871-ae92-0876f27347e6" containerID="1b3cbe2b568039b97be778d27023c536419b0464ab0fc336670f1b9fee17852a" exitCode=0 Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.609501 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.610748 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b169049b-3ab6-4871-ae92-0876f27347e6","Type":"ContainerDied","Data":"1b3cbe2b568039b97be778d27023c536419b0464ab0fc336670f1b9fee17852a"} Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.610793 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b169049b-3ab6-4871-ae92-0876f27347e6","Type":"ContainerDied","Data":"04e6ba33b3a7d776ccd181f000fa1c90456cfbbf8b68856377b5cc35847eb22a"} Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.610801 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5ea2-account-create-update-4bdm8" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.617447 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659ee685-6b83-4af2-bd2e-e5ce9372e408-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.617481 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ad7859-e34b-4393-b696-548fd7ac8e1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.617494 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72ad7859-e34b-4393-b696-548fd7ac8e1d-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.617504 5129 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/659ee685-6b83-4af2-bd2e-e5ce9372e408-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.617514 5129 reconciler_common.go:293] 
"Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72ad7859-e34b-4393-b696-548fd7ac8e1d-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.617525 5129 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/72ad7859-e34b-4393-b696-548fd7ac8e1d-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.617536 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv4l6\" (UniqueName: \"kubernetes.io/projected/72ad7859-e34b-4393-b696-548fd7ac8e1d-kube-api-access-qv4l6\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.617549 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl5dm\" (UniqueName: \"kubernetes.io/projected/659ee685-6b83-4af2-bd2e-e5ce9372e408-kube-api-access-wl5dm\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.617559 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/659ee685-6b83-4af2-bd2e-e5ce9372e408-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.617583 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-696c7b8d5f-j2w5q"] Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.629867 5129 scope.go:117] "RemoveContainer" containerID="ee25cb26a5aa1f0bb656d11fce4519f1563d782dbc7cf1fd218ebf2964d004ce" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.631536 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:25:01 crc kubenswrapper[5129]: E0314 07:25:01.632883 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ee25cb26a5aa1f0bb656d11fce4519f1563d782dbc7cf1fd218ebf2964d004ce\": container with ID starting with ee25cb26a5aa1f0bb656d11fce4519f1563d782dbc7cf1fd218ebf2964d004ce not found: ID does not exist" containerID="ee25cb26a5aa1f0bb656d11fce4519f1563d782dbc7cf1fd218ebf2964d004ce" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.632923 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee25cb26a5aa1f0bb656d11fce4519f1563d782dbc7cf1fd218ebf2964d004ce"} err="failed to get container status \"ee25cb26a5aa1f0bb656d11fce4519f1563d782dbc7cf1fd218ebf2964d004ce\": rpc error: code = NotFound desc = could not find container \"ee25cb26a5aa1f0bb656d11fce4519f1563d782dbc7cf1fd218ebf2964d004ce\": container with ID starting with ee25cb26a5aa1f0bb656d11fce4519f1563d782dbc7cf1fd218ebf2964d004ce not found: ID does not exist" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.632947 5129 scope.go:117] "RemoveContainer" containerID="d76db94856217343e1382a613c04bd6a275d213a9da38e6ea54fa7e5058c934f" Mar 14 07:25:01 crc kubenswrapper[5129]: E0314 07:25:01.636822 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d76db94856217343e1382a613c04bd6a275d213a9da38e6ea54fa7e5058c934f\": container with ID starting with d76db94856217343e1382a613c04bd6a275d213a9da38e6ea54fa7e5058c934f not found: ID does not exist" containerID="d76db94856217343e1382a613c04bd6a275d213a9da38e6ea54fa7e5058c934f" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.636851 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d76db94856217343e1382a613c04bd6a275d213a9da38e6ea54fa7e5058c934f"} err="failed to get container status \"d76db94856217343e1382a613c04bd6a275d213a9da38e6ea54fa7e5058c934f\": rpc error: code = NotFound desc = could not find container \"d76db94856217343e1382a613c04bd6a275d213a9da38e6ea54fa7e5058c934f\": container with ID 
starting with d76db94856217343e1382a613c04bd6a275d213a9da38e6ea54fa7e5058c934f not found: ID does not exist" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.636867 5129 scope.go:117] "RemoveContainer" containerID="94ff5f18b233b64c1c783dbe228636e28477df4e6e43805ef408f8e1f99138ec" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.639881 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.661964 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ad7859-e34b-4393-b696-548fd7ac8e1d-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "72ad7859-e34b-4393-b696-548fd7ac8e1d" (UID: "72ad7859-e34b-4393-b696-548fd7ac8e1d"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.664656 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.664899 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/659ee685-6b83-4af2-bd2e-e5ce9372e408-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "659ee685-6b83-4af2-bd2e-e5ce9372e408" (UID: "659ee685-6b83-4af2-bd2e-e5ce9372e408"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.670368 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.696782 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.704425 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ad7859-e34b-4393-b696-548fd7ac8e1d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "72ad7859-e34b-4393-b696-548fd7ac8e1d" (UID: "72ad7859-e34b-4393-b696-548fd7ac8e1d"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.707912 5129 scope.go:117] "RemoveContainer" containerID="a0297ed0b9af03c2d6e5fa02d749275ab59da2eca117907d6e9e8d9642cba071" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.719749 5129 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ad7859-e34b-4393-b696-548fd7ac8e1d-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.719780 5129 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/659ee685-6b83-4af2-bd2e-e5ce9372e408-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.719790 5129 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ad7859-e34b-4393-b696-548fd7ac8e1d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.735251 5129 scope.go:117] "RemoveContainer" containerID="c43534dc2a23fd4cd80aec67aba6aa51e1ac7a2b3c68810b70ff6a152f4d2e66" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.747588 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5ea2-account-create-update-4bdm8"] Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.754948 5129 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/keystone-5ea2-account-create-update-4bdm8"] Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.768443 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6cc754bc48-djssr"] Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.775335 5129 scope.go:117] "RemoveContainer" containerID="d0cdce7e2d0eb6e6c377ac56ed26b1e0c8ae3d516df683f25a4cfa59b208e395" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.776061 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6cc754bc48-djssr"] Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.799643 5129 scope.go:117] "RemoveContainer" containerID="c43534dc2a23fd4cd80aec67aba6aa51e1ac7a2b3c68810b70ff6a152f4d2e66" Mar 14 07:25:01 crc kubenswrapper[5129]: E0314 07:25:01.800031 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c43534dc2a23fd4cd80aec67aba6aa51e1ac7a2b3c68810b70ff6a152f4d2e66\": container with ID starting with c43534dc2a23fd4cd80aec67aba6aa51e1ac7a2b3c68810b70ff6a152f4d2e66 not found: ID does not exist" containerID="c43534dc2a23fd4cd80aec67aba6aa51e1ac7a2b3c68810b70ff6a152f4d2e66" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.800072 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c43534dc2a23fd4cd80aec67aba6aa51e1ac7a2b3c68810b70ff6a152f4d2e66"} err="failed to get container status \"c43534dc2a23fd4cd80aec67aba6aa51e1ac7a2b3c68810b70ff6a152f4d2e66\": rpc error: code = NotFound desc = could not find container \"c43534dc2a23fd4cd80aec67aba6aa51e1ac7a2b3c68810b70ff6a152f4d2e66\": container with ID starting with c43534dc2a23fd4cd80aec67aba6aa51e1ac7a2b3c68810b70ff6a152f4d2e66 not found: ID does not exist" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.800098 5129 scope.go:117] "RemoveContainer" containerID="d0cdce7e2d0eb6e6c377ac56ed26b1e0c8ae3d516df683f25a4cfa59b208e395" 
Mar 14 07:25:01 crc kubenswrapper[5129]: E0314 07:25:01.800731 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0cdce7e2d0eb6e6c377ac56ed26b1e0c8ae3d516df683f25a4cfa59b208e395\": container with ID starting with d0cdce7e2d0eb6e6c377ac56ed26b1e0c8ae3d516df683f25a4cfa59b208e395 not found: ID does not exist" containerID="d0cdce7e2d0eb6e6c377ac56ed26b1e0c8ae3d516df683f25a4cfa59b208e395" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.800760 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0cdce7e2d0eb6e6c377ac56ed26b1e0c8ae3d516df683f25a4cfa59b208e395"} err="failed to get container status \"d0cdce7e2d0eb6e6c377ac56ed26b1e0c8ae3d516df683f25a4cfa59b208e395\": rpc error: code = NotFound desc = could not find container \"d0cdce7e2d0eb6e6c377ac56ed26b1e0c8ae3d516df683f25a4cfa59b208e395\": container with ID starting with d0cdce7e2d0eb6e6c377ac56ed26b1e0c8ae3d516df683f25a4cfa59b208e395 not found: ID does not exist" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.800774 5129 scope.go:117] "RemoveContainer" containerID="babbc10ebff9f1149bd8ac464ac8ab29bfe228ddf2812b62e0b09490fb06eb73" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.820006 5129 scope.go:117] "RemoveContainer" containerID="babbc10ebff9f1149bd8ac464ac8ab29bfe228ddf2812b62e0b09490fb06eb73" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.820118 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-combined-ca-bundle\") pod \"6b15058f-0936-4bb9-ad72-1c27661b4b82\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.820156 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-internal-tls-certs\") pod \"6b15058f-0936-4bb9-ad72-1c27661b4b82\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.820192 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtwkk\" (UniqueName: \"kubernetes.io/projected/6b15058f-0936-4bb9-ad72-1c27661b4b82-kube-api-access-vtwkk\") pod \"6b15058f-0936-4bb9-ad72-1c27661b4b82\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.820214 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-public-tls-certs\") pod \"6b15058f-0936-4bb9-ad72-1c27661b4b82\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.820281 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-config-data\") pod \"6b15058f-0936-4bb9-ad72-1c27661b4b82\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.820312 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b15058f-0936-4bb9-ad72-1c27661b4b82-logs\") pod \"6b15058f-0936-4bb9-ad72-1c27661b4b82\" (UID: \"6b15058f-0936-4bb9-ad72-1c27661b4b82\") " Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.820665 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzjx5\" (UniqueName: \"kubernetes.io/projected/a8b77a9e-f83e-417f-88f7-c378046c4c54-kube-api-access-bzjx5\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.820677 5129 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8b77a9e-f83e-417f-88f7-c378046c4c54-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.821040 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b15058f-0936-4bb9-ad72-1c27661b4b82-logs" (OuterVolumeSpecName: "logs") pod "6b15058f-0936-4bb9-ad72-1c27661b4b82" (UID: "6b15058f-0936-4bb9-ad72-1c27661b4b82"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: E0314 07:25:01.821344 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"babbc10ebff9f1149bd8ac464ac8ab29bfe228ddf2812b62e0b09490fb06eb73\": container with ID starting with babbc10ebff9f1149bd8ac464ac8ab29bfe228ddf2812b62e0b09490fb06eb73 not found: ID does not exist" containerID="babbc10ebff9f1149bd8ac464ac8ab29bfe228ddf2812b62e0b09490fb06eb73" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.821397 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"babbc10ebff9f1149bd8ac464ac8ab29bfe228ddf2812b62e0b09490fb06eb73"} err="failed to get container status \"babbc10ebff9f1149bd8ac464ac8ab29bfe228ddf2812b62e0b09490fb06eb73\": rpc error: code = NotFound desc = could not find container \"babbc10ebff9f1149bd8ac464ac8ab29bfe228ddf2812b62e0b09490fb06eb73\": container with ID starting with babbc10ebff9f1149bd8ac464ac8ab29bfe228ddf2812b62e0b09490fb06eb73 not found: ID does not exist" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.821433 5129 scope.go:117] "RemoveContainer" containerID="3617459b9e0934de4b0cdcd527b0ea212329a10cd6b69d43c83978c12d92190e" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.871970 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 
07:25:01.875845 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b15058f-0936-4bb9-ad72-1c27661b4b82-kube-api-access-vtwkk" (OuterVolumeSpecName: "kube-api-access-vtwkk") pod "6b15058f-0936-4bb9-ad72-1c27661b4b82" (UID: "6b15058f-0936-4bb9-ad72-1c27661b4b82"). InnerVolumeSpecName "kube-api-access-vtwkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.879053 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b15058f-0936-4bb9-ad72-1c27661b4b82" (UID: "6b15058f-0936-4bb9-ad72-1c27661b4b82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.886504 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6b15058f-0936-4bb9-ad72-1c27661b4b82" (UID: "6b15058f-0936-4bb9-ad72-1c27661b4b82"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.886593 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-config-data" (OuterVolumeSpecName: "config-data") pod "6b15058f-0936-4bb9-ad72-1c27661b4b82" (UID: "6b15058f-0936-4bb9-ad72-1c27661b4b82"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.889905 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.909057 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6b15058f-0936-4bb9-ad72-1c27661b4b82" (UID: "6b15058f-0936-4bb9-ad72-1c27661b4b82"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.931188 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.931219 5129 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.931229 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtwkk\" (UniqueName: \"kubernetes.io/projected/6b15058f-0936-4bb9-ad72-1c27661b4b82-kube-api-access-vtwkk\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.931238 5129 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.931249 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b15058f-0936-4bb9-ad72-1c27661b4b82-config-data\") on node \"crc\" DevicePath \"\"" 
Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.931256 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b15058f-0936-4bb9-ad72-1c27661b4b82-logs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:01 crc kubenswrapper[5129]: E0314 07:25:01.931329 5129 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 14 07:25:01 crc kubenswrapper[5129]: E0314 07:25:01.931376 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-config-data podName:d291cef2-24d8-4ae6-aa4f-dfa8e782db15 nodeName:}" failed. No retries permitted until 2026-03-14 07:25:09.931361166 +0000 UTC m=+1572.683276350 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-config-data") pod "rabbitmq-server-0" (UID: "d291cef2-24d8-4ae6-aa4f-dfa8e782db15") : configmap "rabbitmq-config-data" not found Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.937973 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.945016 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Mar 14 07:25:01 crc kubenswrapper[5129]: I0314 07:25:01.956862 5129 scope.go:117] "RemoveContainer" containerID="d22ea494cb98d26b0c0cd83af3d1839afa617384538263f11bce0e451a9ccbff" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.006246 5129 scope.go:117] "RemoveContainer" containerID="3617459b9e0934de4b0cdcd527b0ea212329a10cd6b69d43c83978c12d92190e" Mar 14 07:25:02 crc kubenswrapper[5129]: E0314 07:25:02.006839 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3617459b9e0934de4b0cdcd527b0ea212329a10cd6b69d43c83978c12d92190e\": container with ID 
starting with 3617459b9e0934de4b0cdcd527b0ea212329a10cd6b69d43c83978c12d92190e not found: ID does not exist" containerID="3617459b9e0934de4b0cdcd527b0ea212329a10cd6b69d43c83978c12d92190e" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.006887 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3617459b9e0934de4b0cdcd527b0ea212329a10cd6b69d43c83978c12d92190e"} err="failed to get container status \"3617459b9e0934de4b0cdcd527b0ea212329a10cd6b69d43c83978c12d92190e\": rpc error: code = NotFound desc = could not find container \"3617459b9e0934de4b0cdcd527b0ea212329a10cd6b69d43c83978c12d92190e\": container with ID starting with 3617459b9e0934de4b0cdcd527b0ea212329a10cd6b69d43c83978c12d92190e not found: ID does not exist" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.006914 5129 scope.go:117] "RemoveContainer" containerID="d22ea494cb98d26b0c0cd83af3d1839afa617384538263f11bce0e451a9ccbff" Mar 14 07:25:02 crc kubenswrapper[5129]: E0314 07:25:02.007339 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d22ea494cb98d26b0c0cd83af3d1839afa617384538263f11bce0e451a9ccbff\": container with ID starting with d22ea494cb98d26b0c0cd83af3d1839afa617384538263f11bce0e451a9ccbff not found: ID does not exist" containerID="d22ea494cb98d26b0c0cd83af3d1839afa617384538263f11bce0e451a9ccbff" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.007401 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d22ea494cb98d26b0c0cd83af3d1839afa617384538263f11bce0e451a9ccbff"} err="failed to get container status \"d22ea494cb98d26b0c0cd83af3d1839afa617384538263f11bce0e451a9ccbff\": rpc error: code = NotFound desc = could not find container \"d22ea494cb98d26b0c0cd83af3d1839afa617384538263f11bce0e451a9ccbff\": container with ID starting with d22ea494cb98d26b0c0cd83af3d1839afa617384538263f11bce0e451a9ccbff not found: 
ID does not exist" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.007428 5129 scope.go:117] "RemoveContainer" containerID="f0d88a61613b0d796589b600c918f3c42969fe8081ec6149ed9e97446ae73149" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.032691 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-48dp4" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.062890 5129 scope.go:117] "RemoveContainer" containerID="b7fedc334b9aeda9f3eff633b2bb5ad2a6353604791c1e6f2adeb911962cfdac" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.067280 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a3fad4b-8e44-471d-b262-27d6a7e05276" path="/var/lib/kubelet/pods/1a3fad4b-8e44-471d-b262-27d6a7e05276/volumes" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.067956 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d59a327-2c1e-49e7-86b3-51e8b692545a" path="/var/lib/kubelet/pods/1d59a327-2c1e-49e7-86b3-51e8b692545a/volumes" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.068448 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="233604f0-adda-4669-b868-b96791d98bca" path="/var/lib/kubelet/pods/233604f0-adda-4669-b868-b96791d98bca/volumes" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.069488 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="288de2f6-818d-4167-8511-76f958542fbd" path="/var/lib/kubelet/pods/288de2f6-818d-4167-8511-76f958542fbd/volumes" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.072252 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30ca9513-5ae9-4520-8012-3c941786ce2a" path="/var/lib/kubelet/pods/30ca9513-5ae9-4520-8012-3c941786ce2a/volumes" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.072843 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="659ee685-6b83-4af2-bd2e-e5ce9372e408" 
path="/var/lib/kubelet/pods/659ee685-6b83-4af2-bd2e-e5ce9372e408/volumes" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.075104 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72ad7859-e34b-4393-b696-548fd7ac8e1d" path="/var/lib/kubelet/pods/72ad7859-e34b-4393-b696-548fd7ac8e1d/volumes" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.075700 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a06d78-48be-4099-b7fa-be0557b6138e" path="/var/lib/kubelet/pods/73a06d78-48be-4099-b7fa-be0557b6138e/volumes" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.076929 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8da79e9b-0c3f-4d66-9813-08116725c6a4" path="/var/lib/kubelet/pods/8da79e9b-0c3f-4d66-9813-08116725c6a4/volumes" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.077497 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d7fc10c-3f26-4459-9577-e7f09371a44b" path="/var/lib/kubelet/pods/9d7fc10c-3f26-4459-9577-e7f09371a44b/volumes" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.077914 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8b77a9e-f83e-417f-88f7-c378046c4c54" path="/var/lib/kubelet/pods/a8b77a9e-f83e-417f-88f7-c378046c4c54/volumes" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.078248 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0a5119c-8784-48e5-841a-654dc253f0d0" path="/var/lib/kubelet/pods/b0a5119c-8784-48e5-841a-654dc253f0d0/volumes" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.079312 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b169049b-3ab6-4871-ae92-0876f27347e6" path="/var/lib/kubelet/pods/b169049b-3ab6-4871-ae92-0876f27347e6/volumes" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.079980 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f12ef33b-b86d-4c80-8f19-385ff5a93fee" 
path="/var/lib/kubelet/pods/f12ef33b-b86d-4c80-8f19-385ff5a93fee/volumes" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.093478 5129 scope.go:117] "RemoveContainer" containerID="7d094db7ac27362ce41fb20cfbbfbd0cb58d32ff76fd8de1e1c4e8741569d84b" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.113382 5129 scope.go:117] "RemoveContainer" containerID="17fc0c37982bdc4c17582bebd5c88160b0617e58d221490152bd767bb2b74b79" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.133085 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-477hl\" (UniqueName: \"kubernetes.io/projected/b77f36b2-be7b-43cb-ada4-74f524396018-kube-api-access-477hl\") pod \"b77f36b2-be7b-43cb-ada4-74f524396018\" (UID: \"b77f36b2-be7b-43cb-ada4-74f524396018\") " Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.133238 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b77f36b2-be7b-43cb-ada4-74f524396018-operator-scripts\") pod \"b77f36b2-be7b-43cb-ada4-74f524396018\" (UID: \"b77f36b2-be7b-43cb-ada4-74f524396018\") " Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.135479 5129 scope.go:117] "RemoveContainer" containerID="1b3cbe2b568039b97be778d27023c536419b0464ab0fc336670f1b9fee17852a" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.135872 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b77f36b2-be7b-43cb-ada4-74f524396018-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b77f36b2-be7b-43cb-ada4-74f524396018" (UID: "b77f36b2-be7b-43cb-ada4-74f524396018"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.136327 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b77f36b2-be7b-43cb-ada4-74f524396018-kube-api-access-477hl" (OuterVolumeSpecName: "kube-api-access-477hl") pod "b77f36b2-be7b-43cb-ada4-74f524396018" (UID: "b77f36b2-be7b-43cb-ada4-74f524396018"). InnerVolumeSpecName "kube-api-access-477hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.238571 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b77f36b2-be7b-43cb-ada4-74f524396018-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.238633 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-477hl\" (UniqueName: \"kubernetes.io/projected/b77f36b2-be7b-43cb-ada4-74f524396018-kube-api-access-477hl\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.265488 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.274066 5129 scope.go:117] "RemoveContainer" containerID="bcd78cfb2a66f431657cd0c11bcbda01e51d38d25273756f75b8f2c7eaef968d" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.302300 5129 scope.go:117] "RemoveContainer" containerID="7d094db7ac27362ce41fb20cfbbfbd0cb58d32ff76fd8de1e1c4e8741569d84b" Mar 14 07:25:02 crc kubenswrapper[5129]: E0314 07:25:02.302711 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d094db7ac27362ce41fb20cfbbfbd0cb58d32ff76fd8de1e1c4e8741569d84b\": container with ID starting with 7d094db7ac27362ce41fb20cfbbfbd0cb58d32ff76fd8de1e1c4e8741569d84b not found: ID does not exist" containerID="7d094db7ac27362ce41fb20cfbbfbd0cb58d32ff76fd8de1e1c4e8741569d84b" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.302750 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d094db7ac27362ce41fb20cfbbfbd0cb58d32ff76fd8de1e1c4e8741569d84b"} err="failed to get container status \"7d094db7ac27362ce41fb20cfbbfbd0cb58d32ff76fd8de1e1c4e8741569d84b\": rpc error: code = NotFound desc = could not find container \"7d094db7ac27362ce41fb20cfbbfbd0cb58d32ff76fd8de1e1c4e8741569d84b\": container with ID starting with 7d094db7ac27362ce41fb20cfbbfbd0cb58d32ff76fd8de1e1c4e8741569d84b not found: ID does not exist" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.302780 5129 scope.go:117] "RemoveContainer" containerID="17fc0c37982bdc4c17582bebd5c88160b0617e58d221490152bd767bb2b74b79" Mar 14 07:25:02 crc kubenswrapper[5129]: E0314 07:25:02.303007 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17fc0c37982bdc4c17582bebd5c88160b0617e58d221490152bd767bb2b74b79\": container with ID starting with 
17fc0c37982bdc4c17582bebd5c88160b0617e58d221490152bd767bb2b74b79 not found: ID does not exist" containerID="17fc0c37982bdc4c17582bebd5c88160b0617e58d221490152bd767bb2b74b79" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.303024 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17fc0c37982bdc4c17582bebd5c88160b0617e58d221490152bd767bb2b74b79"} err="failed to get container status \"17fc0c37982bdc4c17582bebd5c88160b0617e58d221490152bd767bb2b74b79\": rpc error: code = NotFound desc = could not find container \"17fc0c37982bdc4c17582bebd5c88160b0617e58d221490152bd767bb2b74b79\": container with ID starting with 17fc0c37982bdc4c17582bebd5c88160b0617e58d221490152bd767bb2b74b79 not found: ID does not exist" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.303035 5129 scope.go:117] "RemoveContainer" containerID="1b3cbe2b568039b97be778d27023c536419b0464ab0fc336670f1b9fee17852a" Mar 14 07:25:02 crc kubenswrapper[5129]: E0314 07:25:02.303339 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b3cbe2b568039b97be778d27023c536419b0464ab0fc336670f1b9fee17852a\": container with ID starting with 1b3cbe2b568039b97be778d27023c536419b0464ab0fc336670f1b9fee17852a not found: ID does not exist" containerID="1b3cbe2b568039b97be778d27023c536419b0464ab0fc336670f1b9fee17852a" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.303359 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b3cbe2b568039b97be778d27023c536419b0464ab0fc336670f1b9fee17852a"} err="failed to get container status \"1b3cbe2b568039b97be778d27023c536419b0464ab0fc336670f1b9fee17852a\": rpc error: code = NotFound desc = could not find container \"1b3cbe2b568039b97be778d27023c536419b0464ab0fc336670f1b9fee17852a\": container with ID starting with 1b3cbe2b568039b97be778d27023c536419b0464ab0fc336670f1b9fee17852a not found: ID does not 
exist" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.303371 5129 scope.go:117] "RemoveContainer" containerID="bcd78cfb2a66f431657cd0c11bcbda01e51d38d25273756f75b8f2c7eaef968d" Mar 14 07:25:02 crc kubenswrapper[5129]: E0314 07:25:02.303589 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcd78cfb2a66f431657cd0c11bcbda01e51d38d25273756f75b8f2c7eaef968d\": container with ID starting with bcd78cfb2a66f431657cd0c11bcbda01e51d38d25273756f75b8f2c7eaef968d not found: ID does not exist" containerID="bcd78cfb2a66f431657cd0c11bcbda01e51d38d25273756f75b8f2c7eaef968d" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.303618 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcd78cfb2a66f431657cd0c11bcbda01e51d38d25273756f75b8f2c7eaef968d"} err="failed to get container status \"bcd78cfb2a66f431657cd0c11bcbda01e51d38d25273756f75b8f2c7eaef968d\": rpc error: code = NotFound desc = could not find container \"bcd78cfb2a66f431657cd0c11bcbda01e51d38d25273756f75b8f2c7eaef968d\": container with ID starting with bcd78cfb2a66f431657cd0c11bcbda01e51d38d25273756f75b8f2c7eaef968d not found: ID does not exist" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.339283 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca83e14-f023-4dbd-b646-c9fc5a9e177e-operator-scripts\") pod \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.339394 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/eca83e14-f023-4dbd-b646-c9fc5a9e177e-config-data-default\") pod \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " Mar 14 07:25:02 crc 
kubenswrapper[5129]: I0314 07:25:02.339452 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eca83e14-f023-4dbd-b646-c9fc5a9e177e-kolla-config\") pod \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.339490 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/eca83e14-f023-4dbd-b646-c9fc5a9e177e-config-data-generated\") pod \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.339540 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.339612 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca83e14-f023-4dbd-b646-c9fc5a9e177e-galera-tls-certs\") pod \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.339652 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbr7t\" (UniqueName: \"kubernetes.io/projected/eca83e14-f023-4dbd-b646-c9fc5a9e177e-kube-api-access-mbr7t\") pod \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.339690 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eca83e14-f023-4dbd-b646-c9fc5a9e177e-combined-ca-bundle\") pod \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\" (UID: \"eca83e14-f023-4dbd-b646-c9fc5a9e177e\") " Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.340043 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca83e14-f023-4dbd-b646-c9fc5a9e177e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "eca83e14-f023-4dbd-b646-c9fc5a9e177e" (UID: "eca83e14-f023-4dbd-b646-c9fc5a9e177e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.340584 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca83e14-f023-4dbd-b646-c9fc5a9e177e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eca83e14-f023-4dbd-b646-c9fc5a9e177e" (UID: "eca83e14-f023-4dbd-b646-c9fc5a9e177e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.340923 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca83e14-f023-4dbd-b646-c9fc5a9e177e-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "eca83e14-f023-4dbd-b646-c9fc5a9e177e" (UID: "eca83e14-f023-4dbd-b646-c9fc5a9e177e"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.342398 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eca83e14-f023-4dbd-b646-c9fc5a9e177e-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "eca83e14-f023-4dbd-b646-c9fc5a9e177e" (UID: "eca83e14-f023-4dbd-b646-c9fc5a9e177e"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.344902 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca83e14-f023-4dbd-b646-c9fc5a9e177e-kube-api-access-mbr7t" (OuterVolumeSpecName: "kube-api-access-mbr7t") pod "eca83e14-f023-4dbd-b646-c9fc5a9e177e" (UID: "eca83e14-f023-4dbd-b646-c9fc5a9e177e"). InnerVolumeSpecName "kube-api-access-mbr7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.356366 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "eca83e14-f023-4dbd-b646-c9fc5a9e177e" (UID: "eca83e14-f023-4dbd-b646-c9fc5a9e177e"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.408205 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eca83e14-f023-4dbd-b646-c9fc5a9e177e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eca83e14-f023-4dbd-b646-c9fc5a9e177e" (UID: "eca83e14-f023-4dbd-b646-c9fc5a9e177e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.432721 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eca83e14-f023-4dbd-b646-c9fc5a9e177e-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "eca83e14-f023-4dbd-b646-c9fc5a9e177e" (UID: "eca83e14-f023-4dbd-b646-c9fc5a9e177e"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.442274 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/eca83e14-f023-4dbd-b646-c9fc5a9e177e-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.442313 5129 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eca83e14-f023-4dbd-b646-c9fc5a9e177e-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.442326 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/eca83e14-f023-4dbd-b646-c9fc5a9e177e-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.442361 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.442375 5129 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca83e14-f023-4dbd-b646-c9fc5a9e177e-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.442388 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbr7t\" (UniqueName: \"kubernetes.io/projected/eca83e14-f023-4dbd-b646-c9fc5a9e177e-kube-api-access-mbr7t\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.442398 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca83e14-f023-4dbd-b646-c9fc5a9e177e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:02 crc kubenswrapper[5129]: 
I0314 07:25:02.442409 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca83e14-f023-4dbd-b646-c9fc5a9e177e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.461901 5129 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.544229 5129 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:02 crc kubenswrapper[5129]: E0314 07:25:02.544289 5129 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 14 07:25:02 crc kubenswrapper[5129]: E0314 07:25:02.544356 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-config-data podName:f6261e6b-f331-4dcb-8380-167e8f547e1b nodeName:}" failed. No retries permitted until 2026-03-14 07:25:10.544340705 +0000 UTC m=+1573.296255889 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "f6261e6b-f331-4dcb-8380-167e8f547e1b") : configmap "rabbitmq-cell1-config-data" not found Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.646284 5129 generic.go:334] "Generic (PLEG): container finished" podID="eca83e14-f023-4dbd-b646-c9fc5a9e177e" containerID="d5cc3a1871dfed563ca9fb888eec58509f6dd98c1be35ad885efedeb3d7c677f" exitCode=0 Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.646396 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"eca83e14-f023-4dbd-b646-c9fc5a9e177e","Type":"ContainerDied","Data":"d5cc3a1871dfed563ca9fb888eec58509f6dd98c1be35ad885efedeb3d7c677f"} Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.646465 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"eca83e14-f023-4dbd-b646-c9fc5a9e177e","Type":"ContainerDied","Data":"42e109de26a9b91954d843c71030ca80112572c8f504742616b8e1b0269362a0"} Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.646497 5129 scope.go:117] "RemoveContainer" containerID="d5cc3a1871dfed563ca9fb888eec58509f6dd98c1be35ad885efedeb3d7c677f" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.646411 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.658805 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.659246 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="f6261e6b-f331-4dcb-8380-167e8f547e1b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.659695 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6b15058f-0936-4bb9-ad72-1c27661b4b82","Type":"ContainerDied","Data":"f210d80113855c4276894d756c20dedc8d8795cd00378886a3e838d17749a333"} Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.674970 5129 generic.go:334] "Generic (PLEG): container finished" podID="be987b8a-a47d-46a9-bce9-6969473125ff" containerID="953eea29bdc354c617411e73cad623b7f0a9af66f593b7fba568db94fda9d685" exitCode=0 Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.675028 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-754fd75497-x4zwc" event={"ID":"be987b8a-a47d-46a9-bce9-6969473125ff","Type":"ContainerDied","Data":"953eea29bdc354c617411e73cad623b7f0a9af66f593b7fba568db94fda9d685"} Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.690773 5129 scope.go:117] "RemoveContainer" containerID="629316b26a984d1db8aef6f3acb03e45b17d93b3f59eac2ee52a07abfa37eaac" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.704135 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.712848 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.720696 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.723542 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-48dp4" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.723556 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-48dp4" event={"ID":"b77f36b2-be7b-43cb-ada4-74f524396018","Type":"ContainerDied","Data":"daa4ba320ebd546664278d84833e3177535913a5398cfaaa0deb972d033e7f3b"} Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.728768 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.743220 5129 scope.go:117] "RemoveContainer" containerID="d5cc3a1871dfed563ca9fb888eec58509f6dd98c1be35ad885efedeb3d7c677f" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.743582 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:25:02 crc kubenswrapper[5129]: E0314 07:25:02.744354 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5cc3a1871dfed563ca9fb888eec58509f6dd98c1be35ad885efedeb3d7c677f\": container with ID starting with d5cc3a1871dfed563ca9fb888eec58509f6dd98c1be35ad885efedeb3d7c677f not found: ID does not exist" containerID="d5cc3a1871dfed563ca9fb888eec58509f6dd98c1be35ad885efedeb3d7c677f" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.744393 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5cc3a1871dfed563ca9fb888eec58509f6dd98c1be35ad885efedeb3d7c677f"} err="failed to get container status \"d5cc3a1871dfed563ca9fb888eec58509f6dd98c1be35ad885efedeb3d7c677f\": rpc error: code = NotFound desc = could not find container \"d5cc3a1871dfed563ca9fb888eec58509f6dd98c1be35ad885efedeb3d7c677f\": container with ID starting with d5cc3a1871dfed563ca9fb888eec58509f6dd98c1be35ad885efedeb3d7c677f not found: ID does not exist" Mar 14 07:25:02 crc 
kubenswrapper[5129]: I0314 07:25:02.744416 5129 scope.go:117] "RemoveContainer" containerID="629316b26a984d1db8aef6f3acb03e45b17d93b3f59eac2ee52a07abfa37eaac" Mar 14 07:25:02 crc kubenswrapper[5129]: E0314 07:25:02.744810 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"629316b26a984d1db8aef6f3acb03e45b17d93b3f59eac2ee52a07abfa37eaac\": container with ID starting with 629316b26a984d1db8aef6f3acb03e45b17d93b3f59eac2ee52a07abfa37eaac not found: ID does not exist" containerID="629316b26a984d1db8aef6f3acb03e45b17d93b3f59eac2ee52a07abfa37eaac" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.744846 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629316b26a984d1db8aef6f3acb03e45b17d93b3f59eac2ee52a07abfa37eaac"} err="failed to get container status \"629316b26a984d1db8aef6f3acb03e45b17d93b3f59eac2ee52a07abfa37eaac\": rpc error: code = NotFound desc = could not find container \"629316b26a984d1db8aef6f3acb03e45b17d93b3f59eac2ee52a07abfa37eaac\": container with ID starting with 629316b26a984d1db8aef6f3acb03e45b17d93b3f59eac2ee52a07abfa37eaac not found: ID does not exist" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.744873 5129 scope.go:117] "RemoveContainer" containerID="739b061c527066552e175c0fbe0488722ce25d75896eb047b0bd920e05d3e6cb" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.768640 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-48dp4"] Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.777985 5129 scope.go:117] "RemoveContainer" containerID="57f2511f2dd582a994211348b57df514b82da119c5796e9b0a9f36eaf93af695" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.796677 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-48dp4"] Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.816834 5129 scope.go:117] 
"RemoveContainer" containerID="75b46a090025ba91094ce980bb19db3328ed7635e0272c7d2b864af79aa4e4e3" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.853123 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-public-tls-certs\") pod \"be987b8a-a47d-46a9-bce9-6969473125ff\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.853170 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-internal-tls-certs\") pod \"be987b8a-a47d-46a9-bce9-6969473125ff\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.853235 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-scripts\") pod \"be987b8a-a47d-46a9-bce9-6969473125ff\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.853257 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-credential-keys\") pod \"be987b8a-a47d-46a9-bce9-6969473125ff\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.853301 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-config-data\") pod \"be987b8a-a47d-46a9-bce9-6969473125ff\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.853319 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-fernet-keys\") pod \"be987b8a-a47d-46a9-bce9-6969473125ff\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.853342 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-combined-ca-bundle\") pod \"be987b8a-a47d-46a9-bce9-6969473125ff\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.853429 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dg7g\" (UniqueName: \"kubernetes.io/projected/be987b8a-a47d-46a9-bce9-6969473125ff-kube-api-access-9dg7g\") pod \"be987b8a-a47d-46a9-bce9-6969473125ff\" (UID: \"be987b8a-a47d-46a9-bce9-6969473125ff\") " Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.858103 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "be987b8a-a47d-46a9-bce9-6969473125ff" (UID: "be987b8a-a47d-46a9-bce9-6969473125ff"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.858893 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-scripts" (OuterVolumeSpecName: "scripts") pod "be987b8a-a47d-46a9-bce9-6969473125ff" (UID: "be987b8a-a47d-46a9-bce9-6969473125ff"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.861087 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be987b8a-a47d-46a9-bce9-6969473125ff-kube-api-access-9dg7g" (OuterVolumeSpecName: "kube-api-access-9dg7g") pod "be987b8a-a47d-46a9-bce9-6969473125ff" (UID: "be987b8a-a47d-46a9-bce9-6969473125ff"). InnerVolumeSpecName "kube-api-access-9dg7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.862380 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "be987b8a-a47d-46a9-bce9-6969473125ff" (UID: "be987b8a-a47d-46a9-bce9-6969473125ff"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.872070 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="f7c2c75d-2b97-4e81-9701-486cee85dd93" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.206:6080/vnc_lite.html\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.878783 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-config-data" (OuterVolumeSpecName: "config-data") pod "be987b8a-a47d-46a9-bce9-6969473125ff" (UID: "be987b8a-a47d-46a9-bce9-6969473125ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.879402 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be987b8a-a47d-46a9-bce9-6969473125ff" (UID: "be987b8a-a47d-46a9-bce9-6969473125ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.900696 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "be987b8a-a47d-46a9-bce9-6969473125ff" (UID: "be987b8a-a47d-46a9-bce9-6969473125ff"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.905945 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "be987b8a-a47d-46a9-bce9-6969473125ff" (UID: "be987b8a-a47d-46a9-bce9-6969473125ff"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.925235 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="d291cef2-24d8-4ae6-aa4f-dfa8e782db15" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.955509 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.955532 5129 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.955541 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.955552 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dg7g\" (UniqueName: \"kubernetes.io/projected/be987b8a-a47d-46a9-bce9-6969473125ff-kube-api-access-9dg7g\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.955560 5129 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.955568 5129 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:02 crc 
kubenswrapper[5129]: I0314 07:25:02.955576 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:02 crc kubenswrapper[5129]: I0314 07:25:02.955584 5129 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be987b8a-a47d-46a9-bce9-6969473125ff-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: E0314 07:25:03.056670 5129 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Mar 14 07:25:03 crc kubenswrapper[5129]: E0314 07:25:03.056706 5129 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Mar 14 07:25:03 crc kubenswrapper[5129]: E0314 07:25:03.056714 5129 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 07:25:03 crc kubenswrapper[5129]: E0314 07:25:03.056725 5129 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 14 07:25:03 crc kubenswrapper[5129]: E0314 07:25:03.056783 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift podName:2c0c6778-6f35-4243-9e82-ca3c8f3968fc nodeName:}" failed. No retries permitted until 2026-03-14 07:25:11.056766611 +0000 UTC m=+1573.808681795 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift") pod "swift-storage-0" (UID: "2c0c6778-6f35-4243-9e82-ca3c8f3968fc") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 14 07:25:03 crc kubenswrapper[5129]: E0314 07:25:03.332243 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a8b30b499fcbe76c6c54c38cfe35751ce9f67a27193f239a4c3a4e773018fd63" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 07:25:03 crc kubenswrapper[5129]: E0314 07:25:03.334327 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a8b30b499fcbe76c6c54c38cfe35751ce9f67a27193f239a4c3a4e773018fd63" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 07:25:03 crc kubenswrapper[5129]: E0314 07:25:03.337397 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a8b30b499fcbe76c6c54c38cfe35751ce9f67a27193f239a4c3a4e773018fd63" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 07:25:03 crc kubenswrapper[5129]: E0314 07:25:03.337484 5129 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f796158b-a0d2-4077-9c18-b91a594343fb" containerName="nova-scheduler-scheduler" Mar 14 07:25:03 crc kubenswrapper[5129]: E0314 07:25:03.353818 5129 handlers.go:78] "Exec lifecycle hook for 
Container in Pod failed" err=< Mar 14 07:25:03 crc kubenswrapper[5129]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-14T07:24:56Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 14 07:25:03 crc kubenswrapper[5129]: /etc/init.d/functions: line 589: 379 Alarm clock "$@" Mar 14 07:25:03 crc kubenswrapper[5129]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-9bsx2" message=< Mar 14 07:25:03 crc kubenswrapper[5129]: Exiting ovn-controller (1) [FAILED] Mar 14 07:25:03 crc kubenswrapper[5129]: Killing ovn-controller (1) [ OK ] Mar 14 07:25:03 crc kubenswrapper[5129]: Killing ovn-controller (1) with SIGKILL [ OK ] Mar 14 07:25:03 crc kubenswrapper[5129]: 2026-03-14T07:24:56Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 14 07:25:03 crc kubenswrapper[5129]: /etc/init.d/functions: line 589: 379 Alarm clock "$@" Mar 14 07:25:03 crc kubenswrapper[5129]: > Mar 14 07:25:03 crc kubenswrapper[5129]: E0314 07:25:03.353869 5129 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 14 07:25:03 crc kubenswrapper[5129]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-14T07:24:56Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 14 07:25:03 crc kubenswrapper[5129]: /etc/init.d/functions: line 589: 379 Alarm clock "$@" Mar 14 07:25:03 crc kubenswrapper[5129]: > pod="openstack/ovn-controller-9bsx2" podUID="b6a688fe-1537-4ed7-a1ae-2070ba6b1219" containerName="ovn-controller" containerID="cri-o://59e2a48189e053c185dad60e343a538636c21342c31aa09e71839cd2b4df3a28" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.354227 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-9bsx2" podUID="b6a688fe-1537-4ed7-a1ae-2070ba6b1219" containerName="ovn-controller" 
containerID="cri-o://59e2a48189e053c185dad60e343a538636c21342c31aa09e71839cd2b4df3a28" gracePeriod=22 Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.471202 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.561077 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670415 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-config-data\") pod \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670474 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-erlang-cookie\") pod \"f6261e6b-f331-4dcb-8380-167e8f547e1b\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670512 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6261e6b-f331-4dcb-8380-167e8f547e1b-pod-info\") pod \"f6261e6b-f331-4dcb-8380-167e8f547e1b\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670529 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"f6261e6b-f331-4dcb-8380-167e8f547e1b\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670544 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-server-conf\") pod \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670562 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7zg6\" (UniqueName: \"kubernetes.io/projected/f6261e6b-f331-4dcb-8380-167e8f547e1b-kube-api-access-m7zg6\") pod \"f6261e6b-f331-4dcb-8380-167e8f547e1b\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670582 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-plugins-conf\") pod \"f6261e6b-f331-4dcb-8380-167e8f547e1b\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670624 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndhbj\" (UniqueName: \"kubernetes.io/projected/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-kube-api-access-ndhbj\") pod \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670653 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-erlang-cookie-secret\") pod \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670674 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-erlang-cookie\") pod \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\" (UID: 
\"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670691 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-tls\") pod \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670719 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-plugins-conf\") pod \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670738 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-tls\") pod \"f6261e6b-f331-4dcb-8380-167e8f547e1b\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670754 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670792 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-pod-info\") pod \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670828 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-server-conf\") pod \"f6261e6b-f331-4dcb-8380-167e8f547e1b\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670848 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-config-data\") pod \"f6261e6b-f331-4dcb-8380-167e8f547e1b\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670889 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6261e6b-f331-4dcb-8380-167e8f547e1b-erlang-cookie-secret\") pod \"f6261e6b-f331-4dcb-8380-167e8f547e1b\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670922 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-plugins\") pod \"f6261e6b-f331-4dcb-8380-167e8f547e1b\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670939 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-confd\") pod \"f6261e6b-f331-4dcb-8380-167e8f547e1b\" (UID: \"f6261e6b-f331-4dcb-8380-167e8f547e1b\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.670958 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-plugins\") pod \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " Mar 14 07:25:03 crc kubenswrapper[5129]: 
I0314 07:25:03.670976 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-confd\") pod \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\" (UID: \"d291cef2-24d8-4ae6-aa4f-dfa8e782db15\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.675160 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f6261e6b-f331-4dcb-8380-167e8f547e1b" (UID: "f6261e6b-f331-4dcb-8380-167e8f547e1b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.678714 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6261e6b-f331-4dcb-8380-167e8f547e1b-kube-api-access-m7zg6" (OuterVolumeSpecName: "kube-api-access-m7zg6") pod "f6261e6b-f331-4dcb-8380-167e8f547e1b" (UID: "f6261e6b-f331-4dcb-8380-167e8f547e1b"). InnerVolumeSpecName "kube-api-access-m7zg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.679244 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d291cef2-24d8-4ae6-aa4f-dfa8e782db15" (UID: "d291cef2-24d8-4ae6-aa4f-dfa8e782db15"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.679401 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f6261e6b-f331-4dcb-8380-167e8f547e1b" (UID: "f6261e6b-f331-4dcb-8380-167e8f547e1b"). 
InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.679670 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f6261e6b-f331-4dcb-8380-167e8f547e1b" (UID: "f6261e6b-f331-4dcb-8380-167e8f547e1b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.680070 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f6261e6b-f331-4dcb-8380-167e8f547e1b" (UID: "f6261e6b-f331-4dcb-8380-167e8f547e1b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.680677 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d291cef2-24d8-4ae6-aa4f-dfa8e782db15" (UID: "d291cef2-24d8-4ae6-aa4f-dfa8e782db15"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.681392 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d291cef2-24d8-4ae6-aa4f-dfa8e782db15" (UID: "d291cef2-24d8-4ae6-aa4f-dfa8e782db15"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.681807 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d291cef2-24d8-4ae6-aa4f-dfa8e782db15" (UID: "d291cef2-24d8-4ae6-aa4f-dfa8e782db15"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.682797 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6261e6b-f331-4dcb-8380-167e8f547e1b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f6261e6b-f331-4dcb-8380-167e8f547e1b" (UID: "f6261e6b-f331-4dcb-8380-167e8f547e1b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.683223 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d291cef2-24d8-4ae6-aa4f-dfa8e782db15" (UID: "d291cef2-24d8-4ae6-aa4f-dfa8e782db15"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.685014 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "f6261e6b-f331-4dcb-8380-167e8f547e1b" (UID: "f6261e6b-f331-4dcb-8380-167e8f547e1b"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.685031 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-pod-info" (OuterVolumeSpecName: "pod-info") pod "d291cef2-24d8-4ae6-aa4f-dfa8e782db15" (UID: "d291cef2-24d8-4ae6-aa4f-dfa8e782db15"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.685391 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-kube-api-access-ndhbj" (OuterVolumeSpecName: "kube-api-access-ndhbj") pod "d291cef2-24d8-4ae6-aa4f-dfa8e782db15" (UID: "d291cef2-24d8-4ae6-aa4f-dfa8e782db15"). InnerVolumeSpecName "kube-api-access-ndhbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.689337 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f6261e6b-f331-4dcb-8380-167e8f547e1b-pod-info" (OuterVolumeSpecName: "pod-info") pod "f6261e6b-f331-4dcb-8380-167e8f547e1b" (UID: "f6261e6b-f331-4dcb-8380-167e8f547e1b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.691804 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-config-data" (OuterVolumeSpecName: "config-data") pod "d291cef2-24d8-4ae6-aa4f-dfa8e782db15" (UID: "d291cef2-24d8-4ae6-aa4f-dfa8e782db15"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.692547 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "d291cef2-24d8-4ae6-aa4f-dfa8e782db15" (UID: "d291cef2-24d8-4ae6-aa4f-dfa8e782db15"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.697048 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-config-data" (OuterVolumeSpecName: "config-data") pod "f6261e6b-f331-4dcb-8380-167e8f547e1b" (UID: "f6261e6b-f331-4dcb-8380-167e8f547e1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.706067 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-server-conf" (OuterVolumeSpecName: "server-conf") pod "d291cef2-24d8-4ae6-aa4f-dfa8e782db15" (UID: "d291cef2-24d8-4ae6-aa4f-dfa8e782db15"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.708207 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9bsx2_b6a688fe-1537-4ed7-a1ae-2070ba6b1219/ovn-controller/0.log" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.708287 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9bsx2" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.719216 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-server-conf" (OuterVolumeSpecName: "server-conf") pod "f6261e6b-f331-4dcb-8380-167e8f547e1b" (UID: "f6261e6b-f331-4dcb-8380-167e8f547e1b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.742080 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-754fd75497-x4zwc" event={"ID":"be987b8a-a47d-46a9-bce9-6969473125ff","Type":"ContainerDied","Data":"19ff2bc2e9aacd6fda05c8222e2579a06b5c003b8d219db120364b26800f5d13"} Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.742128 5129 scope.go:117] "RemoveContainer" containerID="953eea29bdc354c617411e73cad623b7f0a9af66f593b7fba568db94fda9d685" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.742208 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-754fd75497-x4zwc" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.749542 5129 generic.go:334] "Generic (PLEG): container finished" podID="f6261e6b-f331-4dcb-8380-167e8f547e1b" containerID="277f280a8838bdfe5fc5653338c9173353a6393548f25eb001dedc1540db7dbb" exitCode=0 Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.749595 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f6261e6b-f331-4dcb-8380-167e8f547e1b","Type":"ContainerDied","Data":"277f280a8838bdfe5fc5653338c9173353a6393548f25eb001dedc1540db7dbb"} Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.749632 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f6261e6b-f331-4dcb-8380-167e8f547e1b","Type":"ContainerDied","Data":"600515935a33ea69c5e4fbd0d79d4b199a7f7e89cf5337f3055894057821f3c4"} Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.749689 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.753018 5129 generic.go:334] "Generic (PLEG): container finished" podID="d291cef2-24d8-4ae6-aa4f-dfa8e782db15" containerID="dec65be26b8c4e1660c706156e48f471c694d12966264082fc806b6ecd1fc6c1" exitCode=0 Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.753049 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d291cef2-24d8-4ae6-aa4f-dfa8e782db15","Type":"ContainerDied","Data":"dec65be26b8c4e1660c706156e48f471c694d12966264082fc806b6ecd1fc6c1"} Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.753238 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d291cef2-24d8-4ae6-aa4f-dfa8e782db15","Type":"ContainerDied","Data":"680615c8f16f3b261768db9c4ddfd8b7d973d879da64bbb56cb94b530c17a8bb"} Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.753066 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.754618 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9bsx2_b6a688fe-1537-4ed7-a1ae-2070ba6b1219/ovn-controller/0.log" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.754649 5129 generic.go:334] "Generic (PLEG): container finished" podID="b6a688fe-1537-4ed7-a1ae-2070ba6b1219" containerID="59e2a48189e053c185dad60e343a538636c21342c31aa09e71839cd2b4df3a28" exitCode=137 Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.754665 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9bsx2" event={"ID":"b6a688fe-1537-4ed7-a1ae-2070ba6b1219","Type":"ContainerDied","Data":"59e2a48189e053c185dad60e343a538636c21342c31aa09e71839cd2b4df3a28"} Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.754679 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9bsx2" event={"ID":"b6a688fe-1537-4ed7-a1ae-2070ba6b1219","Type":"ContainerDied","Data":"73e2eb2a448314f0941c9b89f3971d708261829a3793e4b70a331517faf92a07"} Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.757111 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9bsx2" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.761527 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f6261e6b-f331-4dcb-8380-167e8f547e1b" (UID: "f6261e6b-f331-4dcb-8380-167e8f547e1b"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.763171 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d291cef2-24d8-4ae6-aa4f-dfa8e782db15" (UID: "d291cef2-24d8-4ae6-aa4f-dfa8e782db15"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.774696 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-754fd75497-x4zwc"] Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775517 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndhbj\" (UniqueName: \"kubernetes.io/projected/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-kube-api-access-ndhbj\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775536 5129 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775545 5129 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775555 5129 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775566 5129 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-plugins-conf\") on 
node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775574 5129 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775596 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775619 5129 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-pod-info\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775628 5129 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-server-conf\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775635 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775643 5129 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6261e6b-f331-4dcb-8380-167e8f547e1b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775653 5129 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775661 5129 reconciler_common.go:293] "Volume detached for 
volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775669 5129 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775677 5129 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775684 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775692 5129 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6261e6b-f331-4dcb-8380-167e8f547e1b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775700 5129 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6261e6b-f331-4dcb-8380-167e8f547e1b-pod-info\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775713 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775722 5129 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d291cef2-24d8-4ae6-aa4f-dfa8e782db15-server-conf\") on node \"crc\" 
DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775731 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7zg6\" (UniqueName: \"kubernetes.io/projected/f6261e6b-f331-4dcb-8380-167e8f547e1b-kube-api-access-m7zg6\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.775740 5129 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6261e6b-f331-4dcb-8380-167e8f547e1b-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.778371 5129 scope.go:117] "RemoveContainer" containerID="277f280a8838bdfe5fc5653338c9173353a6393548f25eb001dedc1540db7dbb" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.779467 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-754fd75497-x4zwc"] Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.790508 5129 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.793783 5129 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.798924 5129 scope.go:117] "RemoveContainer" containerID="50c855d9f68ff01d5db8cb50e2191c259ba79b1ab08bfa29c635893e5624e8d0" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.818723 5129 scope.go:117] "RemoveContainer" containerID="277f280a8838bdfe5fc5653338c9173353a6393548f25eb001dedc1540db7dbb" Mar 14 07:25:03 crc kubenswrapper[5129]: E0314 07:25:03.818998 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"277f280a8838bdfe5fc5653338c9173353a6393548f25eb001dedc1540db7dbb\": container with ID 
starting with 277f280a8838bdfe5fc5653338c9173353a6393548f25eb001dedc1540db7dbb not found: ID does not exist" containerID="277f280a8838bdfe5fc5653338c9173353a6393548f25eb001dedc1540db7dbb" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.819030 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"277f280a8838bdfe5fc5653338c9173353a6393548f25eb001dedc1540db7dbb"} err="failed to get container status \"277f280a8838bdfe5fc5653338c9173353a6393548f25eb001dedc1540db7dbb\": rpc error: code = NotFound desc = could not find container \"277f280a8838bdfe5fc5653338c9173353a6393548f25eb001dedc1540db7dbb\": container with ID starting with 277f280a8838bdfe5fc5653338c9173353a6393548f25eb001dedc1540db7dbb not found: ID does not exist" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.819047 5129 scope.go:117] "RemoveContainer" containerID="50c855d9f68ff01d5db8cb50e2191c259ba79b1ab08bfa29c635893e5624e8d0" Mar 14 07:25:03 crc kubenswrapper[5129]: E0314 07:25:03.819482 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50c855d9f68ff01d5db8cb50e2191c259ba79b1ab08bfa29c635893e5624e8d0\": container with ID starting with 50c855d9f68ff01d5db8cb50e2191c259ba79b1ab08bfa29c635893e5624e8d0 not found: ID does not exist" containerID="50c855d9f68ff01d5db8cb50e2191c259ba79b1ab08bfa29c635893e5624e8d0" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.819554 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50c855d9f68ff01d5db8cb50e2191c259ba79b1ab08bfa29c635893e5624e8d0"} err="failed to get container status \"50c855d9f68ff01d5db8cb50e2191c259ba79b1ab08bfa29c635893e5624e8d0\": rpc error: code = NotFound desc = could not find container \"50c855d9f68ff01d5db8cb50e2191c259ba79b1ab08bfa29c635893e5624e8d0\": container with ID starting with 50c855d9f68ff01d5db8cb50e2191c259ba79b1ab08bfa29c635893e5624e8d0 not found: 
ID does not exist" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.819585 5129 scope.go:117] "RemoveContainer" containerID="dec65be26b8c4e1660c706156e48f471c694d12966264082fc806b6ecd1fc6c1" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.837503 5129 scope.go:117] "RemoveContainer" containerID="6503e2aa7dd95bfb4855a65e7d0dea939ea6ca565ff7c365cb8197e3cf19b122" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.858837 5129 scope.go:117] "RemoveContainer" containerID="dec65be26b8c4e1660c706156e48f471c694d12966264082fc806b6ecd1fc6c1" Mar 14 07:25:03 crc kubenswrapper[5129]: E0314 07:25:03.859499 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dec65be26b8c4e1660c706156e48f471c694d12966264082fc806b6ecd1fc6c1\": container with ID starting with dec65be26b8c4e1660c706156e48f471c694d12966264082fc806b6ecd1fc6c1 not found: ID does not exist" containerID="dec65be26b8c4e1660c706156e48f471c694d12966264082fc806b6ecd1fc6c1" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.859545 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dec65be26b8c4e1660c706156e48f471c694d12966264082fc806b6ecd1fc6c1"} err="failed to get container status \"dec65be26b8c4e1660c706156e48f471c694d12966264082fc806b6ecd1fc6c1\": rpc error: code = NotFound desc = could not find container \"dec65be26b8c4e1660c706156e48f471c694d12966264082fc806b6ecd1fc6c1\": container with ID starting with dec65be26b8c4e1660c706156e48f471c694d12966264082fc806b6ecd1fc6c1 not found: ID does not exist" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.859570 5129 scope.go:117] "RemoveContainer" containerID="6503e2aa7dd95bfb4855a65e7d0dea939ea6ca565ff7c365cb8197e3cf19b122" Mar 14 07:25:03 crc kubenswrapper[5129]: E0314 07:25:03.859910 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6503e2aa7dd95bfb4855a65e7d0dea939ea6ca565ff7c365cb8197e3cf19b122\": container with ID starting with 6503e2aa7dd95bfb4855a65e7d0dea939ea6ca565ff7c365cb8197e3cf19b122 not found: ID does not exist" containerID="6503e2aa7dd95bfb4855a65e7d0dea939ea6ca565ff7c365cb8197e3cf19b122" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.859949 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6503e2aa7dd95bfb4855a65e7d0dea939ea6ca565ff7c365cb8197e3cf19b122"} err="failed to get container status \"6503e2aa7dd95bfb4855a65e7d0dea939ea6ca565ff7c365cb8197e3cf19b122\": rpc error: code = NotFound desc = could not find container \"6503e2aa7dd95bfb4855a65e7d0dea939ea6ca565ff7c365cb8197e3cf19b122\": container with ID starting with 6503e2aa7dd95bfb4855a65e7d0dea939ea6ca565ff7c365cb8197e3cf19b122 not found: ID does not exist" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.859972 5129 scope.go:117] "RemoveContainer" containerID="59e2a48189e053c185dad60e343a538636c21342c31aa09e71839cd2b4df3a28" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.876401 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-scripts\") pod \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.877071 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-var-run-ovn\") pod \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.877194 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf7kq\" (UniqueName: 
\"kubernetes.io/projected/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-kube-api-access-xf7kq\") pod \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.877323 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-ovn-controller-tls-certs\") pod \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.877695 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-var-run\") pod \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.877855 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b6a688fe-1537-4ed7-a1ae-2070ba6b1219" (UID: "b6a688fe-1537-4ed7-a1ae-2070ba6b1219"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.877866 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-var-log-ovn\") pod \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.877931 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-combined-ca-bundle\") pod \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\" (UID: \"b6a688fe-1537-4ed7-a1ae-2070ba6b1219\") " Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.878130 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b6a688fe-1537-4ed7-a1ae-2070ba6b1219" (UID: "b6a688fe-1537-4ed7-a1ae-2070ba6b1219"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.878507 5129 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.878588 5129 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.878672 5129 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.878732 5129 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.878810 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-var-run" (OuterVolumeSpecName: "var-run") pod "b6a688fe-1537-4ed7-a1ae-2070ba6b1219" (UID: "b6a688fe-1537-4ed7-a1ae-2070ba6b1219"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.879891 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-scripts" (OuterVolumeSpecName: "scripts") pod "b6a688fe-1537-4ed7-a1ae-2070ba6b1219" (UID: "b6a688fe-1537-4ed7-a1ae-2070ba6b1219"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.883513 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-kube-api-access-xf7kq" (OuterVolumeSpecName: "kube-api-access-xf7kq") pod "b6a688fe-1537-4ed7-a1ae-2070ba6b1219" (UID: "b6a688fe-1537-4ed7-a1ae-2070ba6b1219"). InnerVolumeSpecName "kube-api-access-xf7kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.890377 5129 scope.go:117] "RemoveContainer" containerID="59e2a48189e053c185dad60e343a538636c21342c31aa09e71839cd2b4df3a28" Mar 14 07:25:03 crc kubenswrapper[5129]: E0314 07:25:03.890900 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59e2a48189e053c185dad60e343a538636c21342c31aa09e71839cd2b4df3a28\": container with ID starting with 59e2a48189e053c185dad60e343a538636c21342c31aa09e71839cd2b4df3a28 not found: ID does not exist" containerID="59e2a48189e053c185dad60e343a538636c21342c31aa09e71839cd2b4df3a28" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.890931 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59e2a48189e053c185dad60e343a538636c21342c31aa09e71839cd2b4df3a28"} err="failed to get container status \"59e2a48189e053c185dad60e343a538636c21342c31aa09e71839cd2b4df3a28\": rpc error: code = NotFound desc = could not find container \"59e2a48189e053c185dad60e343a538636c21342c31aa09e71839cd2b4df3a28\": container with ID starting with 59e2a48189e053c185dad60e343a538636c21342c31aa09e71839cd2b4df3a28 not found: ID does not exist" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.903660 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "b6a688fe-1537-4ed7-a1ae-2070ba6b1219" (UID: "b6a688fe-1537-4ed7-a1ae-2070ba6b1219"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.952942 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "b6a688fe-1537-4ed7-a1ae-2070ba6b1219" (UID: "b6a688fe-1537-4ed7-a1ae-2070ba6b1219"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.979799 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.979832 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.979844 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf7kq\" (UniqueName: \"kubernetes.io/projected/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-kube-api-access-xf7kq\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.979856 5129 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:03 crc kubenswrapper[5129]: I0314 07:25:03.979864 5129 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6a688fe-1537-4ed7-a1ae-2070ba6b1219-var-run\") on node 
\"crc\" DevicePath \"\"" Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.053751 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b15058f-0936-4bb9-ad72-1c27661b4b82" path="/var/lib/kubelet/pods/6b15058f-0936-4bb9-ad72-1c27661b4b82/volumes" Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.054559 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b77f36b2-be7b-43cb-ada4-74f524396018" path="/var/lib/kubelet/pods/b77f36b2-be7b-43cb-ada4-74f524396018/volumes" Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.055031 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be987b8a-a47d-46a9-bce9-6969473125ff" path="/var/lib/kubelet/pods/be987b8a-a47d-46a9-bce9-6969473125ff/volumes" Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.056073 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eca83e14-f023-4dbd-b646-c9fc5a9e177e" path="/var/lib/kubelet/pods/eca83e14-f023-4dbd-b646-c9fc5a9e177e/volumes" Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.105541 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.113427 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.154177 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9bsx2"] Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.163591 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9bsx2"] Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.169860 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.175069 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 07:25:04 crc 
kubenswrapper[5129]: E0314 07:25:04.421117 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:25:04 crc kubenswrapper[5129]: E0314 07:25:04.421667 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:25:04 crc kubenswrapper[5129]: E0314 07:25:04.422145 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:25:04 crc kubenswrapper[5129]: E0314 07:25:04.422223 5129 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-cfdh9" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovsdb-server" Mar 14 07:25:04 crc kubenswrapper[5129]: E0314 07:25:04.422566 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:25:04 crc kubenswrapper[5129]: E0314 07:25:04.426045 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:25:04 crc kubenswrapper[5129]: E0314 07:25:04.428181 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:25:04 crc kubenswrapper[5129]: E0314 07:25:04.428274 5129 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-cfdh9" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovs-vswitchd" Mar 14 07:25:04 crc kubenswrapper[5129]: E0314 07:25:04.693440 5129 secret.go:188] Couldn't get secret openstack/nova-cell1-conductor-config-data: secret "nova-cell1-conductor-config-data" not found Mar 14 07:25:04 crc kubenswrapper[5129]: E0314 07:25:04.693620 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-config-data podName:df407ca4-4a5d-404c-ab22-89bcde2439c4 nodeName:}" failed. 
No retries permitted until 2026-03-14 07:25:12.693524723 +0000 UTC m=+1575.445439937 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-config-data") pod "nova-cell1-conductor-0" (UID: "df407ca4-4a5d-404c-ab22-89bcde2439c4") : secret "nova-cell1-conductor-config-data" not found Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.774407 5129 generic.go:334] "Generic (PLEG): container finished" podID="df407ca4-4a5d-404c-ab22-89bcde2439c4" containerID="d5f37f6fbaa362787227b822e969f3f95cfd2287989a1655515b437116197b4e" exitCode=0 Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.774915 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"df407ca4-4a5d-404c-ab22-89bcde2439c4","Type":"ContainerDied","Data":"d5f37f6fbaa362787227b822e969f3f95cfd2287989a1655515b437116197b4e"} Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.779395 5129 generic.go:334] "Generic (PLEG): container finished" podID="f796158b-a0d2-4077-9c18-b91a594343fb" containerID="a8b30b499fcbe76c6c54c38cfe35751ce9f67a27193f239a4c3a4e773018fd63" exitCode=0 Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.779523 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f796158b-a0d2-4077-9c18-b91a594343fb","Type":"ContainerDied","Data":"a8b30b499fcbe76c6c54c38cfe35751ce9f67a27193f239a4c3a4e773018fd63"} Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.915580 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.994946 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.997911 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-combined-ca-bundle\") pod \"df407ca4-4a5d-404c-ab22-89bcde2439c4\" (UID: \"df407ca4-4a5d-404c-ab22-89bcde2439c4\") " Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.997971 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvwsc\" (UniqueName: \"kubernetes.io/projected/df407ca4-4a5d-404c-ab22-89bcde2439c4-kube-api-access-hvwsc\") pod \"df407ca4-4a5d-404c-ab22-89bcde2439c4\" (UID: \"df407ca4-4a5d-404c-ab22-89bcde2439c4\") " Mar 14 07:25:04 crc kubenswrapper[5129]: I0314 07:25:04.998073 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-config-data\") pod \"df407ca4-4a5d-404c-ab22-89bcde2439c4\" (UID: \"df407ca4-4a5d-404c-ab22-89bcde2439c4\") " Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.003717 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df407ca4-4a5d-404c-ab22-89bcde2439c4-kube-api-access-hvwsc" (OuterVolumeSpecName: "kube-api-access-hvwsc") pod "df407ca4-4a5d-404c-ab22-89bcde2439c4" (UID: "df407ca4-4a5d-404c-ab22-89bcde2439c4"). InnerVolumeSpecName "kube-api-access-hvwsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.022684 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df407ca4-4a5d-404c-ab22-89bcde2439c4" (UID: "df407ca4-4a5d-404c-ab22-89bcde2439c4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.033517 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-config-data" (OuterVolumeSpecName: "config-data") pod "df407ca4-4a5d-404c-ab22-89bcde2439c4" (UID: "df407ca4-4a5d-404c-ab22-89bcde2439c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.099641 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f796158b-a0d2-4077-9c18-b91a594343fb-combined-ca-bundle\") pod \"f796158b-a0d2-4077-9c18-b91a594343fb\" (UID: \"f796158b-a0d2-4077-9c18-b91a594343fb\") " Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.099698 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh7sn\" (UniqueName: \"kubernetes.io/projected/f796158b-a0d2-4077-9c18-b91a594343fb-kube-api-access-kh7sn\") pod \"f796158b-a0d2-4077-9c18-b91a594343fb\" (UID: \"f796158b-a0d2-4077-9c18-b91a594343fb\") " Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.099811 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f796158b-a0d2-4077-9c18-b91a594343fb-config-data\") pod \"f796158b-a0d2-4077-9c18-b91a594343fb\" (UID: \"f796158b-a0d2-4077-9c18-b91a594343fb\") " Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.100085 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.100101 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/df407ca4-4a5d-404c-ab22-89bcde2439c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.100112 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvwsc\" (UniqueName: \"kubernetes.io/projected/df407ca4-4a5d-404c-ab22-89bcde2439c4-kube-api-access-hvwsc\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.107252 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f796158b-a0d2-4077-9c18-b91a594343fb-kube-api-access-kh7sn" (OuterVolumeSpecName: "kube-api-access-kh7sn") pod "f796158b-a0d2-4077-9c18-b91a594343fb" (UID: "f796158b-a0d2-4077-9c18-b91a594343fb"). InnerVolumeSpecName "kube-api-access-kh7sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.118286 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f796158b-a0d2-4077-9c18-b91a594343fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f796158b-a0d2-4077-9c18-b91a594343fb" (UID: "f796158b-a0d2-4077-9c18-b91a594343fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.118569 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f796158b-a0d2-4077-9c18-b91a594343fb-config-data" (OuterVolumeSpecName: "config-data") pod "f796158b-a0d2-4077-9c18-b91a594343fb" (UID: "f796158b-a0d2-4077-9c18-b91a594343fb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.216713 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f796158b-a0d2-4077-9c18-b91a594343fb-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.216750 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f796158b-a0d2-4077-9c18-b91a594343fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.216765 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh7sn\" (UniqueName: \"kubernetes.io/projected/f796158b-a0d2-4077-9c18-b91a594343fb-kube-api-access-kh7sn\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.795168 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"df407ca4-4a5d-404c-ab22-89bcde2439c4","Type":"ContainerDied","Data":"4d2e504d56287a429d6548e6ffbfede164dc4e28142a036fe5d350c48f4a09b6"} Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.795530 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.795665 5129 scope.go:117] "RemoveContainer" containerID="d5f37f6fbaa362787227b822e969f3f95cfd2287989a1655515b437116197b4e" Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.798477 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f796158b-a0d2-4077-9c18-b91a594343fb","Type":"ContainerDied","Data":"b93662d300685763fab932a3b4cac96bb349ed3b3f9fa619c9bc10e2ecd7f47d"} Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.798842 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.819699 5129 scope.go:117] "RemoveContainer" containerID="a8b30b499fcbe76c6c54c38cfe35751ce9f67a27193f239a4c3a4e773018fd63" Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.856957 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.866322 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.873520 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 14 07:25:05 crc kubenswrapper[5129]: I0314 07:25:05.882661 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 14 07:25:06 crc kubenswrapper[5129]: I0314 07:25:06.046506 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6a688fe-1537-4ed7-a1ae-2070ba6b1219" path="/var/lib/kubelet/pods/b6a688fe-1537-4ed7-a1ae-2070ba6b1219/volumes" Mar 14 07:25:06 crc kubenswrapper[5129]: I0314 07:25:06.047947 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d291cef2-24d8-4ae6-aa4f-dfa8e782db15" path="/var/lib/kubelet/pods/d291cef2-24d8-4ae6-aa4f-dfa8e782db15/volumes" Mar 14 07:25:06 crc kubenswrapper[5129]: I0314 07:25:06.049304 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df407ca4-4a5d-404c-ab22-89bcde2439c4" path="/var/lib/kubelet/pods/df407ca4-4a5d-404c-ab22-89bcde2439c4/volumes" Mar 14 07:25:06 crc kubenswrapper[5129]: I0314 07:25:06.050493 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6261e6b-f331-4dcb-8380-167e8f547e1b" path="/var/lib/kubelet/pods/f6261e6b-f331-4dcb-8380-167e8f547e1b/volumes" Mar 14 07:25:06 crc kubenswrapper[5129]: I0314 07:25:06.052494 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f796158b-a0d2-4077-9c18-b91a594343fb" path="/var/lib/kubelet/pods/f796158b-a0d2-4077-9c18-b91a594343fb/volumes" Mar 14 07:25:06 crc kubenswrapper[5129]: I0314 07:25:06.106158 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/memcached-0" podUID="659ee685-6b83-4af2-bd2e-e5ce9372e408" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.108:11211: i/o timeout" Mar 14 07:25:09 crc kubenswrapper[5129]: E0314 07:25:09.422641 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:25:09 crc kubenswrapper[5129]: E0314 07:25:09.422677 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:25:09 crc kubenswrapper[5129]: E0314 07:25:09.426260 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:25:09 crc kubenswrapper[5129]: E0314 07:25:09.428018 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:25:09 crc kubenswrapper[5129]: E0314 07:25:09.428336 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:25:09 crc kubenswrapper[5129]: E0314 07:25:09.428456 5129 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-cfdh9" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovsdb-server" Mar 14 07:25:09 crc kubenswrapper[5129]: E0314 07:25:09.429632 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:25:09 crc kubenswrapper[5129]: E0314 07:25:09.429700 5129 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-cfdh9" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovs-vswitchd" Mar 14 07:25:11 crc kubenswrapper[5129]: E0314 07:25:11.127780 5129 projected.go:288] Couldn't get configMap 
openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Mar 14 07:25:11 crc kubenswrapper[5129]: E0314 07:25:11.128286 5129 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Mar 14 07:25:11 crc kubenswrapper[5129]: E0314 07:25:11.128317 5129 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 07:25:11 crc kubenswrapper[5129]: E0314 07:25:11.128346 5129 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 14 07:25:11 crc kubenswrapper[5129]: E0314 07:25:11.128443 5129 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift podName:2c0c6778-6f35-4243-9e82-ca3c8f3968fc nodeName:}" failed. No retries permitted until 2026-03-14 07:25:27.12841204 +0000 UTC m=+1589.880327274 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift") pod "swift-storage-0" (UID: "2c0c6778-6f35-4243-9e82-ca3c8f3968fc") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 14 07:25:11 crc kubenswrapper[5129]: I0314 07:25:11.880159 5129 generic.go:334] "Generic (PLEG): container finished" podID="e4553e57-9090-44cb-a8af-7297e4c624c0" containerID="26d0f3fccd15d22f5d226b58a7b4b02bc6754da5235ba9f5ff39da154f4b4c5b" exitCode=0 Mar 14 07:25:11 crc kubenswrapper[5129]: I0314 07:25:11.880229 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fd6bbc76c-9rshh" event={"ID":"e4553e57-9090-44cb-a8af-7297e4c624c0","Type":"ContainerDied","Data":"26d0f3fccd15d22f5d226b58a7b4b02bc6754da5235ba9f5ff39da154f4b4c5b"} Mar 14 07:25:11 crc kubenswrapper[5129]: I0314 07:25:11.880268 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fd6bbc76c-9rshh" event={"ID":"e4553e57-9090-44cb-a8af-7297e4c624c0","Type":"ContainerDied","Data":"7222285490c57f1476b7a540feb24baf129594704555c0308a6c0ea9c393a0fe"} Mar 14 07:25:11 crc kubenswrapper[5129]: I0314 07:25:11.880279 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7222285490c57f1476b7a540feb24baf129594704555c0308a6c0ea9c393a0fe" Mar 14 07:25:11 crc kubenswrapper[5129]: I0314 07:25:11.905635 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.040016 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-combined-ca-bundle\") pod \"e4553e57-9090-44cb-a8af-7297e4c624c0\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.040107 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-internal-tls-certs\") pod \"e4553e57-9090-44cb-a8af-7297e4c624c0\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.040126 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-ovndb-tls-certs\") pod \"e4553e57-9090-44cb-a8af-7297e4c624c0\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.040156 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-public-tls-certs\") pod \"e4553e57-9090-44cb-a8af-7297e4c624c0\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.040193 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbdt5\" (UniqueName: \"kubernetes.io/projected/e4553e57-9090-44cb-a8af-7297e4c624c0-kube-api-access-sbdt5\") pod \"e4553e57-9090-44cb-a8af-7297e4c624c0\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.040348 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-config\") pod \"e4553e57-9090-44cb-a8af-7297e4c624c0\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.040389 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-httpd-config\") pod \"e4553e57-9090-44cb-a8af-7297e4c624c0\" (UID: \"e4553e57-9090-44cb-a8af-7297e4c624c0\") " Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.045271 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4553e57-9090-44cb-a8af-7297e4c624c0-kube-api-access-sbdt5" (OuterVolumeSpecName: "kube-api-access-sbdt5") pod "e4553e57-9090-44cb-a8af-7297e4c624c0" (UID: "e4553e57-9090-44cb-a8af-7297e4c624c0"). InnerVolumeSpecName "kube-api-access-sbdt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.056855 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e4553e57-9090-44cb-a8af-7297e4c624c0" (UID: "e4553e57-9090-44cb-a8af-7297e4c624c0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.079192 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4553e57-9090-44cb-a8af-7297e4c624c0" (UID: "e4553e57-9090-44cb-a8af-7297e4c624c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.085239 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e4553e57-9090-44cb-a8af-7297e4c624c0" (UID: "e4553e57-9090-44cb-a8af-7297e4c624c0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.087448 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-config" (OuterVolumeSpecName: "config") pod "e4553e57-9090-44cb-a8af-7297e4c624c0" (UID: "e4553e57-9090-44cb-a8af-7297e4c624c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.098579 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e4553e57-9090-44cb-a8af-7297e4c624c0" (UID: "e4553e57-9090-44cb-a8af-7297e4c624c0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.119383 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e4553e57-9090-44cb-a8af-7297e4c624c0" (UID: "e4553e57-9090-44cb-a8af-7297e4c624c0"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.141691 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.141818 5129 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.142018 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.142060 5129 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.142072 5129 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.142081 5129 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4553e57-9090-44cb-a8af-7297e4c624c0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.142093 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbdt5\" (UniqueName: \"kubernetes.io/projected/e4553e57-9090-44cb-a8af-7297e4c624c0-kube-api-access-sbdt5\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.891790 5129 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7fd6bbc76c-9rshh" Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.944743 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7fd6bbc76c-9rshh"] Mar 14 07:25:12 crc kubenswrapper[5129]: I0314 07:25:12.955523 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7fd6bbc76c-9rshh"] Mar 14 07:25:14 crc kubenswrapper[5129]: I0314 07:25:14.051667 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4553e57-9090-44cb-a8af-7297e4c624c0" path="/var/lib/kubelet/pods/e4553e57-9090-44cb-a8af-7297e4c624c0/volumes" Mar 14 07:25:14 crc kubenswrapper[5129]: E0314 07:25:14.420256 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:25:14 crc kubenswrapper[5129]: E0314 07:25:14.421080 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:25:14 crc kubenswrapper[5129]: E0314 07:25:14.421696 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" 
containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:25:14 crc kubenswrapper[5129]: E0314 07:25:14.421752 5129 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-cfdh9" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovsdb-server" Mar 14 07:25:14 crc kubenswrapper[5129]: E0314 07:25:14.422390 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:25:14 crc kubenswrapper[5129]: E0314 07:25:14.423953 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:25:14 crc kubenswrapper[5129]: E0314 07:25:14.425464 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:25:14 crc kubenswrapper[5129]: E0314 07:25:14.425526 5129 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-cfdh9" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovs-vswitchd" Mar 14 07:25:19 crc kubenswrapper[5129]: E0314 07:25:19.420847 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:25:19 crc kubenswrapper[5129]: E0314 07:25:19.422639 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:25:19 crc kubenswrapper[5129]: E0314 07:25:19.422715 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:25:19 crc kubenswrapper[5129]: E0314 07:25:19.423899 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:25:19 crc kubenswrapper[5129]: E0314 07:25:19.423943 5129 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-cfdh9" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovsdb-server" Mar 14 07:25:19 crc kubenswrapper[5129]: E0314 07:25:19.424655 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:25:19 crc kubenswrapper[5129]: E0314 07:25:19.426826 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:25:19 crc kubenswrapper[5129]: E0314 07:25:19.426860 5129 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-cfdh9" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovs-vswitchd" Mar 14 07:25:19 crc kubenswrapper[5129]: I0314 07:25:19.574442 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:25:19 crc kubenswrapper[5129]: I0314 07:25:19.574536 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:25:19 crc kubenswrapper[5129]: I0314 07:25:19.574654 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:25:19 crc kubenswrapper[5129]: I0314 07:25:19.575542 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:25:19 crc kubenswrapper[5129]: I0314 07:25:19.575683 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" gracePeriod=600 Mar 14 07:25:19 crc kubenswrapper[5129]: E0314 07:25:19.708841 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:25:19 crc 
kubenswrapper[5129]: I0314 07:25:19.969916 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" exitCode=0 Mar 14 07:25:19 crc kubenswrapper[5129]: I0314 07:25:19.969994 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3"} Mar 14 07:25:19 crc kubenswrapper[5129]: I0314 07:25:19.970090 5129 scope.go:117] "RemoveContainer" containerID="2bfe21ee59c696834fdd6604caeed60a263cff0e96aef565e246c7dd2d9b99d9" Mar 14 07:25:19 crc kubenswrapper[5129]: I0314 07:25:19.971027 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:25:19 crc kubenswrapper[5129]: E0314 07:25:19.971508 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:25:24 crc kubenswrapper[5129]: E0314 07:25:24.421509 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:25:24 crc kubenswrapper[5129]: E0314 07:25:24.422411 5129 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:25:24 crc kubenswrapper[5129]: E0314 07:25:24.422972 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 14 07:25:24 crc kubenswrapper[5129]: E0314 07:25:24.423018 5129 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-cfdh9" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovsdb-server" Mar 14 07:25:24 crc kubenswrapper[5129]: E0314 07:25:24.423043 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:25:24 crc kubenswrapper[5129]: E0314 07:25:24.424699 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:25:24 crc kubenswrapper[5129]: E0314 07:25:24.426575 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 14 07:25:24 crc kubenswrapper[5129]: E0314 07:25:24.426800 5129 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-cfdh9" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovs-vswitchd" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.038222 5129 generic.go:334] "Generic (PLEG): container finished" podID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerID="68f079c63627945e154f1a8ae1e17bd4c418f8de737ec76b1962ceb2dce0568b" exitCode=137 Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.049689 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerDied","Data":"68f079c63627945e154f1a8ae1e17bd4c418f8de737ec76b1962ceb2dce0568b"} Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.205974 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.373874 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-combined-ca-bundle\") pod \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.373936 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift\") pod \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.373970 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42ppz\" (UniqueName: \"kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-kube-api-access-42ppz\") pod \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.374020 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-cache\") pod \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.374052 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.374083 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-lock\") pod \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\" (UID: \"2c0c6778-6f35-4243-9e82-ca3c8f3968fc\") " Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.375199 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-cache" (OuterVolumeSpecName: "cache") pod "2c0c6778-6f35-4243-9e82-ca3c8f3968fc" (UID: "2c0c6778-6f35-4243-9e82-ca3c8f3968fc"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.375284 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-lock" (OuterVolumeSpecName: "lock") pod "2c0c6778-6f35-4243-9e82-ca3c8f3968fc" (UID: "2c0c6778-6f35-4243-9e82-ca3c8f3968fc"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.375315 5129 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-cache\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.379035 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "swift") pod "2c0c6778-6f35-4243-9e82-ca3c8f3968fc" (UID: "2c0c6778-6f35-4243-9e82-ca3c8f3968fc"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.379411 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2c0c6778-6f35-4243-9e82-ca3c8f3968fc" (UID: "2c0c6778-6f35-4243-9e82-ca3c8f3968fc"). 
InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.389786 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-kube-api-access-42ppz" (OuterVolumeSpecName: "kube-api-access-42ppz") pod "2c0c6778-6f35-4243-9e82-ca3c8f3968fc" (UID: "2c0c6778-6f35-4243-9e82-ca3c8f3968fc"). InnerVolumeSpecName "kube-api-access-42ppz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.478574 5129 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-lock\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.478624 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.478639 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42ppz\" (UniqueName: \"kubernetes.io/projected/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-kube-api-access-42ppz\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.478671 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.496265 5129 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.583085 5129 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.686066 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c0c6778-6f35-4243-9e82-ca3c8f3968fc" (UID: "2c0c6778-6f35-4243-9e82-ca3c8f3968fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.785351 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0c6778-6f35-4243-9e82-ca3c8f3968fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.841958 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cfdh9_a67a2a4b-fa14-43cf-983b-45df5afc8e4e/ovs-vswitchd/0.log" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.842621 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.886434 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-var-log\") pod \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.886539 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-scripts\") pod \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.886547 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-var-log" (OuterVolumeSpecName: "var-log") pod "a67a2a4b-fa14-43cf-983b-45df5afc8e4e" (UID: "a67a2a4b-fa14-43cf-983b-45df5afc8e4e"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.886569 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-var-run\") pod \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.886613 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-var-run" (OuterVolumeSpecName: "var-run") pod "a67a2a4b-fa14-43cf-983b-45df5afc8e4e" (UID: "a67a2a4b-fa14-43cf-983b-45df5afc8e4e"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.886664 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwl68\" (UniqueName: \"kubernetes.io/projected/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-kube-api-access-hwl68\") pod \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.886713 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-var-lib\") pod \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.886767 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-etc-ovs\") pod \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\" (UID: \"a67a2a4b-fa14-43cf-983b-45df5afc8e4e\") " Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.886975 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-var-lib" (OuterVolumeSpecName: "var-lib") pod "a67a2a4b-fa14-43cf-983b-45df5afc8e4e" (UID: "a67a2a4b-fa14-43cf-983b-45df5afc8e4e"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.887039 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "a67a2a4b-fa14-43cf-983b-45df5afc8e4e" (UID: "a67a2a4b-fa14-43cf-983b-45df5afc8e4e"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.887329 5129 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-var-log\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.887345 5129 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-var-run\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.887356 5129 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-var-lib\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.887367 5129 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.887514 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-scripts" (OuterVolumeSpecName: "scripts") pod "a67a2a4b-fa14-43cf-983b-45df5afc8e4e" (UID: "a67a2a4b-fa14-43cf-983b-45df5afc8e4e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.889988 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-kube-api-access-hwl68" (OuterVolumeSpecName: "kube-api-access-hwl68") pod "a67a2a4b-fa14-43cf-983b-45df5afc8e4e" (UID: "a67a2a4b-fa14-43cf-983b-45df5afc8e4e"). InnerVolumeSpecName "kube-api-access-hwl68". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.987962 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:26 crc kubenswrapper[5129]: I0314 07:25:26.987994 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwl68\" (UniqueName: \"kubernetes.io/projected/a67a2a4b-fa14-43cf-983b-45df5afc8e4e-kube-api-access-hwl68\") on node \"crc\" DevicePath \"\"" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.057095 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c0c6778-6f35-4243-9e82-ca3c8f3968fc","Type":"ContainerDied","Data":"1a8ac9864a22197bd62d032b41808e93c4f88caf9eb643f52eb3ddfe8310a45c"} Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.057179 5129 scope.go:117] "RemoveContainer" containerID="68f079c63627945e154f1a8ae1e17bd4c418f8de737ec76b1962ceb2dce0568b" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.057304 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.059516 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cfdh9_a67a2a4b-fa14-43cf-983b-45df5afc8e4e/ovs-vswitchd/0.log" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.060269 5129 generic.go:334] "Generic (PLEG): container finished" podID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" exitCode=137 Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.060307 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cfdh9" event={"ID":"a67a2a4b-fa14-43cf-983b-45df5afc8e4e","Type":"ContainerDied","Data":"dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62"} Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.060332 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cfdh9" event={"ID":"a67a2a4b-fa14-43cf-983b-45df5afc8e4e","Type":"ContainerDied","Data":"dbd0a143bc3475d85459ab0b07ed8da56f8674c1d3e4aee9e8f813091b079dff"} Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.060404 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-cfdh9" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.098839 5129 scope.go:117] "RemoveContainer" containerID="8dfc0ae842b54e831f58e889aa510e0e67ccf64cc21775a958610f8a34d354da" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.128024 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-cfdh9"] Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.128806 5129 scope.go:117] "RemoveContainer" containerID="387ec0f7caa0f65ef81af98a02901cb03b71d081ad6cad271dfe418e7919ac2a" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.137962 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-cfdh9"] Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.146225 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.153474 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.160133 5129 scope.go:117] "RemoveContainer" containerID="1439671428e36f062ecec5e844057f831307ced7145f7e08d2685ea64612ca03" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.179593 5129 scope.go:117] "RemoveContainer" containerID="21d34fb8c8ae530f947029ee8db5c31bb1fff7d650c0adaa7c67025d5674e1a9" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.194701 5129 scope.go:117] "RemoveContainer" containerID="d28de17046ef16ad3ac1f241f5db591353ee653132756124d068067d81af0578" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.209035 5129 scope.go:117] "RemoveContainer" containerID="3e9b50ce398570eef3d701e17b2478e6a2800a85b0238429627e5231237c15ba" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.229410 5129 scope.go:117] "RemoveContainer" containerID="de4e9cf22659332ab4a321d87b2be0419c0f5ba9584d4883e24f44b8cd760689" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 
07:25:27.245973 5129 scope.go:117] "RemoveContainer" containerID="3059f4424e061a65f5c7669f2d08d51ae956e9986395c5c1a40f1a6c63f29fe4" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.281622 5129 scope.go:117] "RemoveContainer" containerID="1bfacbb3c7f4e07f82826af5ace3d723b967a5f7ed7522a9988f754dcb73cd76" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.304647 5129 scope.go:117] "RemoveContainer" containerID="ea8689b6701500735737f4995bd8adb2dcef261718f8b5bd911de435e0c3a742" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.326590 5129 scope.go:117] "RemoveContainer" containerID="92720726c9e708458b5da5758e478b9521c17f64ffaa2789a7359d2da3f264d0" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.347959 5129 scope.go:117] "RemoveContainer" containerID="f1efa38796d71cf5fcb64262b9513b8dbb5d0472b2f9b02765e6507ec1e1670d" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.378284 5129 scope.go:117] "RemoveContainer" containerID="89c397e9fa15d1498f1dfb3620f6d0877fdad87ff4762961a8804fdb1503a9fc" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.395810 5129 scope.go:117] "RemoveContainer" containerID="89fd8247fe552c4e8c71705aad2c39088f4d80948c9776ee701e127517dfcd2e" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.429229 5129 scope.go:117] "RemoveContainer" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.467588 5129 scope.go:117] "RemoveContainer" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.488048 5129 scope.go:117] "RemoveContainer" containerID="0beb9f26c6263c32221d7a8d342037e7bb146fd310587dc1747cd887a2bb31fc" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.523676 5129 scope.go:117] "RemoveContainer" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" Mar 14 07:25:27 crc kubenswrapper[5129]: E0314 07:25:27.524127 5129 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62\": container with ID starting with dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62 not found: ID does not exist" containerID="dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.524156 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62"} err="failed to get container status \"dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62\": rpc error: code = NotFound desc = could not find container \"dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62\": container with ID starting with dd96254ebc2feb88ee38c73bfb8169c409c3b6e5dc64656654be1bb781498a62 not found: ID does not exist" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.524174 5129 scope.go:117] "RemoveContainer" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" Mar 14 07:25:27 crc kubenswrapper[5129]: E0314 07:25:27.524624 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef\": container with ID starting with d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef not found: ID does not exist" containerID="d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.524682 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef"} err="failed to get container status \"d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef\": rpc error: code = NotFound 
desc = could not find container \"d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef\": container with ID starting with d8ba41b96a292fa47c7e8167d9547aae273f1ee0b8bad6aa6f5e08eba62c56ef not found: ID does not exist" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.524717 5129 scope.go:117] "RemoveContainer" containerID="0beb9f26c6263c32221d7a8d342037e7bb146fd310587dc1747cd887a2bb31fc" Mar 14 07:25:27 crc kubenswrapper[5129]: E0314 07:25:27.525060 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0beb9f26c6263c32221d7a8d342037e7bb146fd310587dc1747cd887a2bb31fc\": container with ID starting with 0beb9f26c6263c32221d7a8d342037e7bb146fd310587dc1747cd887a2bb31fc not found: ID does not exist" containerID="0beb9f26c6263c32221d7a8d342037e7bb146fd310587dc1747cd887a2bb31fc" Mar 14 07:25:27 crc kubenswrapper[5129]: I0314 07:25:27.525083 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0beb9f26c6263c32221d7a8d342037e7bb146fd310587dc1747cd887a2bb31fc"} err="failed to get container status \"0beb9f26c6263c32221d7a8d342037e7bb146fd310587dc1747cd887a2bb31fc\": rpc error: code = NotFound desc = could not find container \"0beb9f26c6263c32221d7a8d342037e7bb146fd310587dc1747cd887a2bb31fc\": container with ID starting with 0beb9f26c6263c32221d7a8d342037e7bb146fd310587dc1747cd887a2bb31fc not found: ID does not exist" Mar 14 07:25:28 crc kubenswrapper[5129]: I0314 07:25:28.053779 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" path="/var/lib/kubelet/pods/2c0c6778-6f35-4243-9e82-ca3c8f3968fc/volumes" Mar 14 07:25:28 crc kubenswrapper[5129]: I0314 07:25:28.058441 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" path="/var/lib/kubelet/pods/a67a2a4b-fa14-43cf-983b-45df5afc8e4e/volumes" Mar 14 07:25:31 crc 
kubenswrapper[5129]: I0314 07:25:31.036292 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:25:31 crc kubenswrapper[5129]: E0314 07:25:31.036919 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:25:46 crc kubenswrapper[5129]: I0314 07:25:46.036591 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:25:46 crc kubenswrapper[5129]: E0314 07:25:46.037856 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.145307 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557886-8z79x"] Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147209 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be987b8a-a47d-46a9-bce9-6969473125ff" containerName="keystone-api" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147245 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="be987b8a-a47d-46a9-bce9-6969473125ff" containerName="keystone-api" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147255 5129 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f796158b-a0d2-4077-9c18-b91a594343fb" containerName="nova-scheduler-scheduler" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147263 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f796158b-a0d2-4077-9c18-b91a594343fb" containerName="nova-scheduler-scheduler" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147271 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f12ef33b-b86d-4c80-8f19-385ff5a93fee" containerName="nova-metadata-log" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147278 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f12ef33b-b86d-4c80-8f19-385ff5a93fee" containerName="nova-metadata-log" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147296 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d01ce1-11e5-4fc7-a120-44d9d3407142" containerName="galera" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147302 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d01ce1-11e5-4fc7-a120-44d9d3407142" containerName="galera" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147310 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ad7859-e34b-4393-b696-548fd7ac8e1d" containerName="ovn-northd" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147315 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ad7859-e34b-4393-b696-548fd7ac8e1d" containerName="ovn-northd" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147323 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8da79e9b-0c3f-4d66-9813-08116725c6a4" containerName="placement-log" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147330 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8da79e9b-0c3f-4d66-9813-08116725c6a4" containerName="placement-log" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147337 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovsdb-server" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147343 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovsdb-server" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147353 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d291cef2-24d8-4ae6-aa4f-dfa8e782db15" containerName="rabbitmq" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147359 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d291cef2-24d8-4ae6-aa4f-dfa8e782db15" containerName="rabbitmq" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147367 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovs-vswitchd" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147372 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovs-vswitchd" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147379 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="swift-recon-cron" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147385 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="swift-recon-cron" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147393 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f12ef33b-b86d-4c80-8f19-385ff5a93fee" containerName="nova-metadata-metadata" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147398 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f12ef33b-b86d-4c80-8f19-385ff5a93fee" containerName="nova-metadata-metadata" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147410 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eca83e14-f023-4dbd-b646-c9fc5a9e177e" containerName="galera" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147415 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca83e14-f023-4dbd-b646-c9fc5a9e177e" containerName="galera" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147425 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="container-replicator" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147430 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="container-replicator" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147461 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="rsync" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147471 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="rsync" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147484 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233604f0-adda-4669-b868-b96791d98bca" containerName="barbican-worker-log" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147491 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="233604f0-adda-4669-b868-b96791d98bca" containerName="barbican-worker-log" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147498 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8da79e9b-0c3f-4d66-9813-08116725c6a4" containerName="placement-api" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147504 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8da79e9b-0c3f-4d66-9813-08116725c6a4" containerName="placement-api" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147513 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="container-updater" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147519 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="container-updater" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147525 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7fc10c-3f26-4459-9577-e7f09371a44b" containerName="barbican-api-log" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147531 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7fc10c-3f26-4459-9577-e7f09371a44b" containerName="barbican-api-log" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147541 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659ee685-6b83-4af2-bd2e-e5ce9372e408" containerName="memcached" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147547 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="659ee685-6b83-4af2-bd2e-e5ce9372e408" containerName="memcached" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147557 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df407ca4-4a5d-404c-ab22-89bcde2439c4" containerName="nova-cell1-conductor-conductor" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147562 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="df407ca4-4a5d-404c-ab22-89bcde2439c4" containerName="nova-cell1-conductor-conductor" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147570 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d59a327-2c1e-49e7-86b3-51e8b692545a" containerName="kube-state-metrics" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147575 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d59a327-2c1e-49e7-86b3-51e8b692545a" containerName="kube-state-metrics" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147584 5129 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="account-server" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147590 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="account-server" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147618 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b169049b-3ab6-4871-ae92-0876f27347e6" containerName="ceilometer-notification-agent" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147625 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b169049b-3ab6-4871-ae92-0876f27347e6" containerName="ceilometer-notification-agent" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147635 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="container-server" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147641 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="container-server" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147650 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b77f36b2-be7b-43cb-ada4-74f524396018" containerName="mariadb-account-create-update" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147656 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b77f36b2-be7b-43cb-ada4-74f524396018" containerName="mariadb-account-create-update" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147666 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b169049b-3ab6-4871-ae92-0876f27347e6" containerName="ceilometer-central-agent" Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147672 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b169049b-3ab6-4871-ae92-0876f27347e6" containerName="ceilometer-central-agent" Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 
07:26:00.147681 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-replicator"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147686 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-replicator"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147696 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d01ce1-11e5-4fc7-a120-44d9d3407142" containerName="mysql-bootstrap"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147703 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d01ce1-11e5-4fc7-a120-44d9d3407142" containerName="mysql-bootstrap"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147712 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b169049b-3ab6-4871-ae92-0876f27347e6" containerName="proxy-httpd"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147717 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b169049b-3ab6-4871-ae92-0876f27347e6" containerName="proxy-httpd"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147726 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ca9513-5ae9-4520-8012-3c941786ce2a" containerName="barbican-keystone-listener"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147732 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ca9513-5ae9-4520-8012-3c941786ce2a" containerName="barbican-keystone-listener"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147740 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a5119c-8784-48e5-841a-654dc253f0d0" containerName="glance-httpd"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147746 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a5119c-8784-48e5-841a-654dc253f0d0" containerName="glance-httpd"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147756 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4553e57-9090-44cb-a8af-7297e4c624c0" containerName="neutron-api"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147790 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4553e57-9090-44cb-a8af-7297e4c624c0" containerName="neutron-api"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147796 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b15058f-0936-4bb9-ad72-1c27661b4b82" containerName="nova-api-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147802 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b15058f-0936-4bb9-ad72-1c27661b4b82" containerName="nova-api-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147809 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4553e57-9090-44cb-a8af-7297e4c624c0" containerName="neutron-httpd"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147815 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4553e57-9090-44cb-a8af-7297e4c624c0" containerName="neutron-httpd"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147824 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovsdb-server-init"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147829 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovsdb-server-init"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147836 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a06d78-48be-4099-b7fa-be0557b6138e" containerName="glance-httpd"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147842 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a06d78-48be-4099-b7fa-be0557b6138e" containerName="glance-httpd"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147850 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a06d78-48be-4099-b7fa-be0557b6138e" containerName="glance-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147856 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a06d78-48be-4099-b7fa-be0557b6138e" containerName="glance-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147865 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b77f36b2-be7b-43cb-ada4-74f524396018" containerName="mariadb-account-create-update"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147870 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b77f36b2-be7b-43cb-ada4-74f524396018" containerName="mariadb-account-create-update"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147878 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6261e6b-f331-4dcb-8380-167e8f547e1b" containerName="setup-container"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147884 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6261e6b-f331-4dcb-8380-167e8f547e1b" containerName="setup-container"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147893 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-expirer"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147899 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-expirer"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147905 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6261e6b-f331-4dcb-8380-167e8f547e1b" containerName="rabbitmq"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147910 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6261e6b-f331-4dcb-8380-167e8f547e1b" containerName="rabbitmq"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147920 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="account-replicator"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147927 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="account-replicator"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147937 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="container-auditor"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147944 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="container-auditor"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147952 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a3fad4b-8e44-471d-b262-27d6a7e05276" containerName="cinder-api"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147959 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3fad4b-8e44-471d-b262-27d6a7e05276" containerName="cinder-api"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147970 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a688fe-1537-4ed7-a1ae-2070ba6b1219" containerName="ovn-controller"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147976 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a688fe-1537-4ed7-a1ae-2070ba6b1219" containerName="ovn-controller"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.147989 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7fc10c-3f26-4459-9577-e7f09371a44b" containerName="barbican-api"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.147997 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7fc10c-3f26-4459-9577-e7f09371a44b" containerName="barbican-api"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.148005 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-server"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148012 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-server"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.148023 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-updater"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148030 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-updater"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.148044 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a3fad4b-8e44-471d-b262-27d6a7e05276" containerName="cinder-api-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148052 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3fad4b-8e44-471d-b262-27d6a7e05276" containerName="cinder-api-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.148065 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca83e14-f023-4dbd-b646-c9fc5a9e177e" containerName="mysql-bootstrap"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148073 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca83e14-f023-4dbd-b646-c9fc5a9e177e" containerName="mysql-bootstrap"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.148083 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ad7859-e34b-4393-b696-548fd7ac8e1d" containerName="openstack-network-exporter"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148090 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ad7859-e34b-4393-b696-548fd7ac8e1d" containerName="openstack-network-exporter"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.148103 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="288de2f6-818d-4167-8511-76f958542fbd" containerName="nova-cell0-conductor-conductor"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148110 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="288de2f6-818d-4167-8511-76f958542fbd" containerName="nova-cell0-conductor-conductor"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.148123 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a5119c-8784-48e5-841a-654dc253f0d0" containerName="glance-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148130 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a5119c-8784-48e5-841a-654dc253f0d0" containerName="glance-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.148138 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b15058f-0936-4bb9-ad72-1c27661b4b82" containerName="nova-api-api"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148148 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b15058f-0936-4bb9-ad72-1c27661b4b82" containerName="nova-api-api"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.148155 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ca9513-5ae9-4520-8012-3c941786ce2a" containerName="barbican-keystone-listener-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148160 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ca9513-5ae9-4520-8012-3c941786ce2a" containerName="barbican-keystone-listener-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.148169 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b169049b-3ab6-4871-ae92-0876f27347e6" containerName="sg-core"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148174 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b169049b-3ab6-4871-ae92-0876f27347e6" containerName="sg-core"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.148182 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d291cef2-24d8-4ae6-aa4f-dfa8e782db15" containerName="setup-container"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148188 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d291cef2-24d8-4ae6-aa4f-dfa8e782db15" containerName="setup-container"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.148196 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="account-auditor"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148201 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="account-auditor"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.148211 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-auditor"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148217 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-auditor"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.148226 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233604f0-adda-4669-b868-b96791d98bca" containerName="barbican-worker"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148232 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="233604f0-adda-4669-b868-b96791d98bca" containerName="barbican-worker"
Mar 14 07:26:00 crc kubenswrapper[5129]: E0314 07:26:00.148240 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="account-reaper"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148246 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="account-reaper"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148384 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="233604f0-adda-4669-b868-b96791d98bca" containerName="barbican-worker"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148394 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f12ef33b-b86d-4c80-8f19-385ff5a93fee" containerName="nova-metadata-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148403 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="container-updater"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148410 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-auditor"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148418 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="df407ca4-4a5d-404c-ab22-89bcde2439c4" containerName="nova-cell1-conductor-conductor"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148427 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8da79e9b-0c3f-4d66-9813-08116725c6a4" containerName="placement-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148433 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d59a327-2c1e-49e7-86b3-51e8b692545a" containerName="kube-state-metrics"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148441 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b169049b-3ab6-4871-ae92-0876f27347e6" containerName="ceilometer-notification-agent"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148451 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="30ca9513-5ae9-4520-8012-3c941786ce2a" containerName="barbican-keystone-listener-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148459 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="30ca9513-5ae9-4520-8012-3c941786ce2a" containerName="barbican-keystone-listener"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148469 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b77f36b2-be7b-43cb-ada4-74f524396018" containerName="mariadb-account-create-update"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148475 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6261e6b-f331-4dcb-8380-167e8f547e1b" containerName="rabbitmq"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148484 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="account-server"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148493 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b169049b-3ab6-4871-ae92-0876f27347e6" containerName="sg-core"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148502 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f796158b-a0d2-4077-9c18-b91a594343fb" containerName="nova-scheduler-scheduler"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148511 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="72ad7859-e34b-4393-b696-548fd7ac8e1d" containerName="openstack-network-exporter"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148522 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="288de2f6-818d-4167-8511-76f958542fbd" containerName="nova-cell0-conductor-conductor"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148533 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovs-vswitchd"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148544 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b169049b-3ab6-4871-ae92-0876f27347e6" containerName="proxy-httpd"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148551 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b15058f-0936-4bb9-ad72-1c27661b4b82" containerName="nova-api-api"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148559 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="eca83e14-f023-4dbd-b646-c9fc5a9e177e" containerName="galera"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148569 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="account-reaper"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148578 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="659ee685-6b83-4af2-bd2e-e5ce9372e408" containerName="memcached"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148588 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="rsync"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148594 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="be987b8a-a47d-46a9-bce9-6969473125ff" containerName="keystone-api"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148619 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f12ef33b-b86d-4c80-8f19-385ff5a93fee" containerName="nova-metadata-metadata"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148628 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6a688fe-1537-4ed7-a1ae-2070ba6b1219" containerName="ovn-controller"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148634 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-server"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148640 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="swift-recon-cron"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148647 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4553e57-9090-44cb-a8af-7297e4c624c0" containerName="neutron-httpd"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148657 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7fc10c-3f26-4459-9577-e7f09371a44b" containerName="barbican-api"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148665 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a06d78-48be-4099-b7fa-be0557b6138e" containerName="glance-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148673 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d291cef2-24d8-4ae6-aa4f-dfa8e782db15" containerName="rabbitmq"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148682 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="container-server"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148691 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a5119c-8784-48e5-841a-654dc253f0d0" containerName="glance-httpd"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148699 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8da79e9b-0c3f-4d66-9813-08116725c6a4" containerName="placement-api"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148706 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="a67a2a4b-fa14-43cf-983b-45df5afc8e4e" containerName="ovsdb-server"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148712 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="233604f0-adda-4669-b868-b96791d98bca" containerName="barbican-worker-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148721 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="account-auditor"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148730 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-updater"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148739 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a3fad4b-8e44-471d-b262-27d6a7e05276" containerName="cinder-api-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148747 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="account-replicator"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148753 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a3fad4b-8e44-471d-b262-27d6a7e05276" containerName="cinder-api"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148762 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b15058f-0936-4bb9-ad72-1c27661b4b82" containerName="nova-api-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148768 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-replicator"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148773 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a5119c-8784-48e5-841a-654dc253f0d0" containerName="glance-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148783 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a06d78-48be-4099-b7fa-be0557b6138e" containerName="glance-httpd"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148790 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="72ad7859-e34b-4393-b696-548fd7ac8e1d" containerName="ovn-northd"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148797 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7fc10c-3f26-4459-9577-e7f09371a44b" containerName="barbican-api-log"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148806 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="object-expirer"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148815 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4553e57-9090-44cb-a8af-7297e4c624c0" containerName="neutron-api"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148823 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b169049b-3ab6-4871-ae92-0876f27347e6" containerName="ceilometer-central-agent"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148829 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="container-replicator"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148837 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="54d01ce1-11e5-4fc7-a120-44d9d3407142" containerName="galera"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.148844 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0c6778-6f35-4243-9e82-ca3c8f3968fc" containerName="container-auditor"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.149323 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557886-8z79x"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.151269 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.151408 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.152091 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.159871 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557886-8z79x"]
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.302225 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np8nz\" (UniqueName: \"kubernetes.io/projected/865753e9-fbb2-475f-9b60-c598f5199ecf-kube-api-access-np8nz\") pod \"auto-csr-approver-29557886-8z79x\" (UID: \"865753e9-fbb2-475f-9b60-c598f5199ecf\") " pod="openshift-infra/auto-csr-approver-29557886-8z79x"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.404028 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np8nz\" (UniqueName: \"kubernetes.io/projected/865753e9-fbb2-475f-9b60-c598f5199ecf-kube-api-access-np8nz\") pod \"auto-csr-approver-29557886-8z79x\" (UID: \"865753e9-fbb2-475f-9b60-c598f5199ecf\") " pod="openshift-infra/auto-csr-approver-29557886-8z79x"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.425293 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np8nz\" (UniqueName: \"kubernetes.io/projected/865753e9-fbb2-475f-9b60-c598f5199ecf-kube-api-access-np8nz\") pod \"auto-csr-approver-29557886-8z79x\" (UID: \"865753e9-fbb2-475f-9b60-c598f5199ecf\") " pod="openshift-infra/auto-csr-approver-29557886-8z79x"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.470850 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557886-8z79x"
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.908172 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 14 07:26:00 crc kubenswrapper[5129]: I0314 07:26:00.917207 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557886-8z79x"]
Mar 14 07:26:01 crc kubenswrapper[5129]: I0314 07:26:01.037247 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3"
Mar 14 07:26:01 crc kubenswrapper[5129]: E0314 07:26:01.037666 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:26:01 crc kubenswrapper[5129]: I0314 07:26:01.396836 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557886-8z79x" event={"ID":"865753e9-fbb2-475f-9b60-c598f5199ecf","Type":"ContainerStarted","Data":"0743aa4f71968d7a06639f094a088cefa6808fc13a5b273bdb5c1a517c98a593"}
Mar 14 07:26:02 crc kubenswrapper[5129]: I0314 07:26:02.406234 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557886-8z79x" event={"ID":"865753e9-fbb2-475f-9b60-c598f5199ecf","Type":"ContainerStarted","Data":"546f73617960a1c5914f46c5b758402ea474b5b6eb6eef57c8eafae71f1308c0"}
Mar 14 07:26:02 crc kubenswrapper[5129]: I0314 07:26:02.420381 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557886-8z79x" podStartSLOduration=1.338549886 podStartE2EDuration="2.420363422s" podCreationTimestamp="2026-03-14 07:26:00 +0000 UTC" firstStartedPulling="2026-03-14 07:26:00.907879042 +0000 UTC m=+1623.659794226" lastFinishedPulling="2026-03-14 07:26:01.989692538 +0000 UTC m=+1624.741607762" observedRunningTime="2026-03-14 07:26:02.418122771 +0000 UTC m=+1625.170037955" watchObservedRunningTime="2026-03-14 07:26:02.420363422 +0000 UTC m=+1625.172278606"
Mar 14 07:26:03 crc kubenswrapper[5129]: I0314 07:26:03.417000 5129 generic.go:334] "Generic (PLEG): container finished" podID="865753e9-fbb2-475f-9b60-c598f5199ecf" containerID="546f73617960a1c5914f46c5b758402ea474b5b6eb6eef57c8eafae71f1308c0" exitCode=0
Mar 14 07:26:03 crc kubenswrapper[5129]: I0314 07:26:03.417054 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557886-8z79x" event={"ID":"865753e9-fbb2-475f-9b60-c598f5199ecf","Type":"ContainerDied","Data":"546f73617960a1c5914f46c5b758402ea474b5b6eb6eef57c8eafae71f1308c0"}
Mar 14 07:26:04 crc kubenswrapper[5129]: I0314 07:26:04.768897 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557886-8z79x"
Mar 14 07:26:04 crc kubenswrapper[5129]: I0314 07:26:04.774839 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np8nz\" (UniqueName: \"kubernetes.io/projected/865753e9-fbb2-475f-9b60-c598f5199ecf-kube-api-access-np8nz\") pod \"865753e9-fbb2-475f-9b60-c598f5199ecf\" (UID: \"865753e9-fbb2-475f-9b60-c598f5199ecf\") "
Mar 14 07:26:04 crc kubenswrapper[5129]: I0314 07:26:04.783184 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/865753e9-fbb2-475f-9b60-c598f5199ecf-kube-api-access-np8nz" (OuterVolumeSpecName: "kube-api-access-np8nz") pod "865753e9-fbb2-475f-9b60-c598f5199ecf" (UID: "865753e9-fbb2-475f-9b60-c598f5199ecf"). InnerVolumeSpecName "kube-api-access-np8nz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:26:04 crc kubenswrapper[5129]: I0314 07:26:04.876328 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np8nz\" (UniqueName: \"kubernetes.io/projected/865753e9-fbb2-475f-9b60-c598f5199ecf-kube-api-access-np8nz\") on node \"crc\" DevicePath \"\""
Mar 14 07:26:05 crc kubenswrapper[5129]: I0314 07:26:05.437682 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557886-8z79x" event={"ID":"865753e9-fbb2-475f-9b60-c598f5199ecf","Type":"ContainerDied","Data":"0743aa4f71968d7a06639f094a088cefa6808fc13a5b273bdb5c1a517c98a593"}
Mar 14 07:26:05 crc kubenswrapper[5129]: I0314 07:26:05.437732 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0743aa4f71968d7a06639f094a088cefa6808fc13a5b273bdb5c1a517c98a593"
Mar 14 07:26:05 crc kubenswrapper[5129]: I0314 07:26:05.437789 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557886-8z79x"
Mar 14 07:26:05 crc kubenswrapper[5129]: I0314 07:26:05.490291 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557880-jmwkx"]
Mar 14 07:26:05 crc kubenswrapper[5129]: I0314 07:26:05.495863 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557880-jmwkx"]
Mar 14 07:26:06 crc kubenswrapper[5129]: I0314 07:26:06.047697 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acac1a33-44b5-4200-b5c1-91a2339283b9" path="/var/lib/kubelet/pods/acac1a33-44b5-4200-b5c1-91a2339283b9/volumes"
Mar 14 07:26:12 crc kubenswrapper[5129]: I0314 07:26:12.036058 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3"
Mar 14 07:26:12 crc kubenswrapper[5129]: E0314 07:26:12.036643 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:26:19 crc kubenswrapper[5129]: I0314 07:26:19.257226 5129 scope.go:117] "RemoveContainer" containerID="eed30ff2173035b830ab6b839b3e1b27368ac2ee4887106fc76a47c2d18dd2ca"
Mar 14 07:26:23 crc kubenswrapper[5129]: I0314 07:26:23.036718 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3"
Mar 14 07:26:23 crc kubenswrapper[5129]: E0314 07:26:23.037741 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:26:36 crc kubenswrapper[5129]: I0314 07:26:36.036359 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3"
Mar 14 07:26:36 crc kubenswrapper[5129]: E0314 07:26:36.037229 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:26:49 crc kubenswrapper[5129]: I0314 07:26:49.036271 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3"
Mar 14 07:26:49 crc kubenswrapper[5129]: E0314 07:26:49.037333 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:27:02 crc kubenswrapper[5129]: I0314 07:27:02.036777 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3"
Mar 14 07:27:02 crc kubenswrapper[5129]: E0314 07:27:02.038437 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:27:15 crc kubenswrapper[5129]: I0314 07:27:15.037112 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3"
Mar 14 07:27:15 crc kubenswrapper[5129]: E0314 07:27:15.037830 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.378012 5129 scope.go:117] "RemoveContainer" containerID="280e70bcc439b2c014146604829b8ba14a084f751f5b8f4de23c94c4c3333070"
Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.430176 5129 scope.go:117] "RemoveContainer" containerID="05a12761bd380c5b63fa4611c9eefc9182b51f8797441885cb0882e5f1428152"
Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.450929 5129 scope.go:117] "RemoveContainer" containerID="94622b0d2d04ac62d54bff88e740d79ee2b30b8094d80870f9821e12c72f54b6"
Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.495557 5129 scope.go:117] "RemoveContainer" containerID="62bd5a629f85e5a8e103427ea5018ceb216c75ee329d681eba7351287bdeff37"
Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.512897 5129 scope.go:117] "RemoveContainer" containerID="3db82d92800e77d3b7ece09cef6001114bf8a07f1cf5ac6893920c938bc328a9"
Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.535049 5129 scope.go:117] "RemoveContainer" containerID="60729269e9e74cce75eaa29022a5770b1873da001478e25f920eb77f5ad7805a"
Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.562334 5129 scope.go:117] "RemoveContainer" containerID="a7b578ca7c856174b66ad424c57c60fbf8244f1fe4ddbf628dbbf6c90301a784" Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.582292 5129 scope.go:117] "RemoveContainer" containerID="37ea063e6c3a1ea8642e41cfcdf5d7f9ece1f10520882047ed8dde1ce66791ee" Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.600227 5129 scope.go:117] "RemoveContainer" containerID="4f8425c516d3f71ead14b88caaec29687a3c0127db216c8109ec15e81d091767" Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.629912 5129 scope.go:117] "RemoveContainer" containerID="b854816ffdb413b1f325152b1eea18a46d450c72ee8805a2b0a3db09e84d6a28" Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.648873 5129 scope.go:117] "RemoveContainer" containerID="b9a8d6affbde66a3c9a08a97803076b6f677cdee9cb6d3732d5df6ebc8f3d2ac" Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.667841 5129 scope.go:117] "RemoveContainer" containerID="793f4948eea9bbde654020a67c725e8c5e272cc69a7a8ca7eabd48ce098086fb" Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.683285 5129 scope.go:117] "RemoveContainer" containerID="f7d6ee3f639fd61f9fb30fbfb9f3da3ff3a7d2253225681de5df251f37862907" Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.704490 5129 scope.go:117] "RemoveContainer" containerID="252a4f363d08c5b6cdff2d3573a9472b02174c2666aea18e478d986cc1704a09" Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.730702 5129 scope.go:117] "RemoveContainer" containerID="a51d9b57d69ec5670560c0fbc28cb54dabfd328b623f7c1ef7fa2ab822811ff1" Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.754560 5129 scope.go:117] "RemoveContainer" containerID="e8ade8ce19ee6a7c918aab839e11b18d21502a596836bd2580c22973f1d8e9d4" Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.779766 5129 scope.go:117] "RemoveContainer" containerID="7dfeaff5aa0d4fc1216ce7e841aba0740560b55e377ce234c07c519df6a88449" Mar 14 07:27:19 crc 
kubenswrapper[5129]: I0314 07:27:19.802160 5129 scope.go:117] "RemoveContainer" containerID="66af8e6e95a3ad382ac1cd343ee394ecbadd2bfbe11f4bccced2734d290357c4" Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.827197 5129 scope.go:117] "RemoveContainer" containerID="6b11ff7c689d36d9d7972ebdf5768535187d1d46df224de6748d31adcb31b8af" Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.855944 5129 scope.go:117] "RemoveContainer" containerID="32afa5447dc7817f3bfebeca62cca531b5c3a36aacad13e6ca1180d089257887" Mar 14 07:27:19 crc kubenswrapper[5129]: I0314 07:27:19.879530 5129 scope.go:117] "RemoveContainer" containerID="74d5ec83fe89138910b7e8dfccdd1837156e7d6178650fedbb4f80fe74cdc1ff" Mar 14 07:27:26 crc kubenswrapper[5129]: I0314 07:27:26.036293 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:27:26 crc kubenswrapper[5129]: E0314 07:27:26.037233 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:27:41 crc kubenswrapper[5129]: I0314 07:27:41.037063 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:27:41 crc kubenswrapper[5129]: E0314 07:27:41.038382 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:27:52 crc kubenswrapper[5129]: I0314 07:27:52.036790 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:27:52 crc kubenswrapper[5129]: E0314 07:27:52.037498 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:28:00 crc kubenswrapper[5129]: I0314 07:28:00.152326 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557888-ktwdm"] Mar 14 07:28:00 crc kubenswrapper[5129]: E0314 07:28:00.153513 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="865753e9-fbb2-475f-9b60-c598f5199ecf" containerName="oc" Mar 14 07:28:00 crc kubenswrapper[5129]: I0314 07:28:00.153535 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="865753e9-fbb2-475f-9b60-c598f5199ecf" containerName="oc" Mar 14 07:28:00 crc kubenswrapper[5129]: I0314 07:28:00.153805 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b77f36b2-be7b-43cb-ada4-74f524396018" containerName="mariadb-account-create-update" Mar 14 07:28:00 crc kubenswrapper[5129]: I0314 07:28:00.153857 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="865753e9-fbb2-475f-9b60-c598f5199ecf" containerName="oc" Mar 14 07:28:00 crc kubenswrapper[5129]: I0314 07:28:00.154562 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557888-ktwdm" Mar 14 07:28:00 crc kubenswrapper[5129]: I0314 07:28:00.158175 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:28:00 crc kubenswrapper[5129]: I0314 07:28:00.159055 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:28:00 crc kubenswrapper[5129]: I0314 07:28:00.159469 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:28:00 crc kubenswrapper[5129]: I0314 07:28:00.169356 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557888-ktwdm"] Mar 14 07:28:00 crc kubenswrapper[5129]: I0314 07:28:00.202206 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qm8r\" (UniqueName: \"kubernetes.io/projected/6636a98f-5cd3-4141-81c8-93128c0dce7b-kube-api-access-4qm8r\") pod \"auto-csr-approver-29557888-ktwdm\" (UID: \"6636a98f-5cd3-4141-81c8-93128c0dce7b\") " pod="openshift-infra/auto-csr-approver-29557888-ktwdm" Mar 14 07:28:00 crc kubenswrapper[5129]: I0314 07:28:00.303319 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qm8r\" (UniqueName: \"kubernetes.io/projected/6636a98f-5cd3-4141-81c8-93128c0dce7b-kube-api-access-4qm8r\") pod \"auto-csr-approver-29557888-ktwdm\" (UID: \"6636a98f-5cd3-4141-81c8-93128c0dce7b\") " pod="openshift-infra/auto-csr-approver-29557888-ktwdm" Mar 14 07:28:00 crc kubenswrapper[5129]: I0314 07:28:00.325914 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qm8r\" (UniqueName: \"kubernetes.io/projected/6636a98f-5cd3-4141-81c8-93128c0dce7b-kube-api-access-4qm8r\") pod \"auto-csr-approver-29557888-ktwdm\" (UID: \"6636a98f-5cd3-4141-81c8-93128c0dce7b\") " 
pod="openshift-infra/auto-csr-approver-29557888-ktwdm" Mar 14 07:28:00 crc kubenswrapper[5129]: I0314 07:28:00.480990 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557888-ktwdm" Mar 14 07:28:00 crc kubenswrapper[5129]: I0314 07:28:00.917973 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557888-ktwdm"] Mar 14 07:28:01 crc kubenswrapper[5129]: I0314 07:28:01.570490 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557888-ktwdm" event={"ID":"6636a98f-5cd3-4141-81c8-93128c0dce7b","Type":"ContainerStarted","Data":"fdad79271c91834a0105b6c9f581d892a761967a4be21630888cefa0d827255d"} Mar 14 07:28:02 crc kubenswrapper[5129]: I0314 07:28:02.581711 5129 generic.go:334] "Generic (PLEG): container finished" podID="6636a98f-5cd3-4141-81c8-93128c0dce7b" containerID="5dde55797b5380580f1237f698461c0b7685a5c3287904143dc18e06be309f1a" exitCode=0 Mar 14 07:28:02 crc kubenswrapper[5129]: I0314 07:28:02.581803 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557888-ktwdm" event={"ID":"6636a98f-5cd3-4141-81c8-93128c0dce7b","Type":"ContainerDied","Data":"5dde55797b5380580f1237f698461c0b7685a5c3287904143dc18e06be309f1a"} Mar 14 07:28:03 crc kubenswrapper[5129]: I0314 07:28:03.922223 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557888-ktwdm" Mar 14 07:28:03 crc kubenswrapper[5129]: I0314 07:28:03.959279 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qm8r\" (UniqueName: \"kubernetes.io/projected/6636a98f-5cd3-4141-81c8-93128c0dce7b-kube-api-access-4qm8r\") pod \"6636a98f-5cd3-4141-81c8-93128c0dce7b\" (UID: \"6636a98f-5cd3-4141-81c8-93128c0dce7b\") " Mar 14 07:28:03 crc kubenswrapper[5129]: I0314 07:28:03.965890 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6636a98f-5cd3-4141-81c8-93128c0dce7b-kube-api-access-4qm8r" (OuterVolumeSpecName: "kube-api-access-4qm8r") pod "6636a98f-5cd3-4141-81c8-93128c0dce7b" (UID: "6636a98f-5cd3-4141-81c8-93128c0dce7b"). InnerVolumeSpecName "kube-api-access-4qm8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:28:04 crc kubenswrapper[5129]: I0314 07:28:04.060474 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qm8r\" (UniqueName: \"kubernetes.io/projected/6636a98f-5cd3-4141-81c8-93128c0dce7b-kube-api-access-4qm8r\") on node \"crc\" DevicePath \"\"" Mar 14 07:28:04 crc kubenswrapper[5129]: I0314 07:28:04.607061 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557888-ktwdm" event={"ID":"6636a98f-5cd3-4141-81c8-93128c0dce7b","Type":"ContainerDied","Data":"fdad79271c91834a0105b6c9f581d892a761967a4be21630888cefa0d827255d"} Mar 14 07:28:04 crc kubenswrapper[5129]: I0314 07:28:04.607154 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdad79271c91834a0105b6c9f581d892a761967a4be21630888cefa0d827255d" Mar 14 07:28:04 crc kubenswrapper[5129]: I0314 07:28:04.607175 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557888-ktwdm" Mar 14 07:28:05 crc kubenswrapper[5129]: I0314 07:28:05.015684 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557882-j24cn"] Mar 14 07:28:05 crc kubenswrapper[5129]: I0314 07:28:05.023885 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557882-j24cn"] Mar 14 07:28:05 crc kubenswrapper[5129]: I0314 07:28:05.036153 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:28:05 crc kubenswrapper[5129]: E0314 07:28:05.036363 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:28:06 crc kubenswrapper[5129]: I0314 07:28:06.047851 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67606449-22cf-4aed-82df-32cece6daffb" path="/var/lib/kubelet/pods/67606449-22cf-4aed-82df-32cece6daffb/volumes" Mar 14 07:28:19 crc kubenswrapper[5129]: I0314 07:28:19.036282 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:28:19 crc kubenswrapper[5129]: E0314 07:28:19.037107 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:28:20 crc kubenswrapper[5129]: I0314 07:28:20.273183 5129 scope.go:117] "RemoveContainer" containerID="6d5f8bf87e692aabeac10226f1e8e09b6e6634992f400961227fd04fb3fa5a3d" Mar 14 07:28:20 crc kubenswrapper[5129]: I0314 07:28:20.328397 5129 scope.go:117] "RemoveContainer" containerID="b6129aad28f25a4355bc45fcf76cc3d1877902860adc2871b519d96f22803f4c" Mar 14 07:28:20 crc kubenswrapper[5129]: I0314 07:28:20.367102 5129 scope.go:117] "RemoveContainer" containerID="26d0f3fccd15d22f5d226b58a7b4b02bc6754da5235ba9f5ff39da154f4b4c5b" Mar 14 07:28:20 crc kubenswrapper[5129]: I0314 07:28:20.389705 5129 scope.go:117] "RemoveContainer" containerID="52bd0a20548a621f45a9955bd8dc359566b8030a3da89b698db926a9a9020962" Mar 14 07:28:20 crc kubenswrapper[5129]: I0314 07:28:20.430091 5129 scope.go:117] "RemoveContainer" containerID="54d3e699c738bc194b2a33f855142c97e9cee53bfb0ad24709038db8332022df" Mar 14 07:28:20 crc kubenswrapper[5129]: I0314 07:28:20.457721 5129 scope.go:117] "RemoveContainer" containerID="688b601624978ad581d2874e16b4fd640acb7ecdf38357ac8796f3c33fdb6a93" Mar 14 07:28:20 crc kubenswrapper[5129]: I0314 07:28:20.499746 5129 scope.go:117] "RemoveContainer" containerID="029512079db9793f0a83f9994180a737873eded1aea3b898b28d36c92dc6e03d" Mar 14 07:28:20 crc kubenswrapper[5129]: I0314 07:28:20.530981 5129 scope.go:117] "RemoveContainer" containerID="58636ddf9056d5cf6b9725a0fc761ea443236adbe98622fdee0c49765d0989de" Mar 14 07:28:20 crc kubenswrapper[5129]: I0314 07:28:20.582318 5129 scope.go:117] "RemoveContainer" containerID="3962e27294b5cc1e11ab43a51823ea69248516d08e03f0c0c3df80ff9127b020" Mar 14 07:28:20 crc kubenswrapper[5129]: I0314 07:28:20.628203 5129 scope.go:117] "RemoveContainer" containerID="bc3211d9096a638fa8c4213b5bf198de4c37facb10847c262d7a2dcc1726dfc9" Mar 14 07:28:20 crc kubenswrapper[5129]: I0314 07:28:20.644048 5129 scope.go:117] "RemoveContainer" 
containerID="071d4c470a706315a4ab4d2dbcecacc5edcf67f5816419489da0c1c9e353496f" Mar 14 07:28:33 crc kubenswrapper[5129]: I0314 07:28:33.036873 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:28:33 crc kubenswrapper[5129]: E0314 07:28:33.037697 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:28:46 crc kubenswrapper[5129]: I0314 07:28:46.037134 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:28:46 crc kubenswrapper[5129]: E0314 07:28:46.037973 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:28:58 crc kubenswrapper[5129]: I0314 07:28:58.041133 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:28:58 crc kubenswrapper[5129]: E0314 07:28:58.041820 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:29:11 crc kubenswrapper[5129]: I0314 07:29:11.037020 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:29:11 crc kubenswrapper[5129]: E0314 07:29:11.038390 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:29:20 crc kubenswrapper[5129]: I0314 07:29:20.806397 5129 scope.go:117] "RemoveContainer" containerID="3960bf45564cd7a5d0c3f2cfcfc3a0b700a9a46196ffca4ba11e5884fbce96d1" Mar 14 07:29:20 crc kubenswrapper[5129]: I0314 07:29:20.855498 5129 scope.go:117] "RemoveContainer" containerID="4649aa78d1e37425574ecfe4a03fda95b21e45c63c426d03ab34b339d2cbed7b" Mar 14 07:29:20 crc kubenswrapper[5129]: I0314 07:29:20.897829 5129 scope.go:117] "RemoveContainer" containerID="801f9c6b9b4609a23fdfd6c9e39ce8d8281239d0826281a0aafbad38b75cd0a6" Mar 14 07:29:20 crc kubenswrapper[5129]: I0314 07:29:20.936800 5129 scope.go:117] "RemoveContainer" containerID="a731b5c1d187c5f7593f08c748a82216a8d2c9ca3e2b7a83bff17ae61bc36cd3" Mar 14 07:29:20 crc kubenswrapper[5129]: I0314 07:29:20.965482 5129 scope.go:117] "RemoveContainer" containerID="4d62c3f1e4e66d656296bfd613aa4d41a73979018105522b96e8ba63503eb9d8" Mar 14 07:29:21 crc kubenswrapper[5129]: I0314 07:29:21.014821 5129 scope.go:117] "RemoveContainer" containerID="3ad8b9fa221f117d804328f3d1d23c11e664b5ca221ea492eeb968a42da6af5c" Mar 14 07:29:21 crc kubenswrapper[5129]: I0314 07:29:21.058756 5129 scope.go:117] "RemoveContainer" 
containerID="26b4aec00cbbff617806653a000ecc4903072ef2737594d30098d8439aeda080" Mar 14 07:29:21 crc kubenswrapper[5129]: I0314 07:29:21.083963 5129 scope.go:117] "RemoveContainer" containerID="ca5d78ebc9e5c0ead9b825452ce86e555b271077f92d69b6cf145c6d03e2b989" Mar 14 07:29:21 crc kubenswrapper[5129]: I0314 07:29:21.103005 5129 scope.go:117] "RemoveContainer" containerID="4f31926dace18ee6fa2886d83ca81ff1aa2ddb06023c8ecc5fd52e3b66425fc3" Mar 14 07:29:25 crc kubenswrapper[5129]: I0314 07:29:25.037958 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:29:25 crc kubenswrapper[5129]: E0314 07:29:25.039075 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:29:39 crc kubenswrapper[5129]: I0314 07:29:39.037375 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:29:39 crc kubenswrapper[5129]: E0314 07:29:39.038529 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:29:50 crc kubenswrapper[5129]: I0314 07:29:50.037526 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:29:50 crc 
kubenswrapper[5129]: E0314 07:29:50.038718 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.153820 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557890-njt7z"] Mar 14 07:30:00 crc kubenswrapper[5129]: E0314 07:30:00.155151 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6636a98f-5cd3-4141-81c8-93128c0dce7b" containerName="oc" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.155181 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6636a98f-5cd3-4141-81c8-93128c0dce7b" containerName="oc" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.155476 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6636a98f-5cd3-4141-81c8-93128c0dce7b" containerName="oc" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.156372 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557890-njt7z" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.158722 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.158775 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.160218 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd"] Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.160412 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.161061 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.162275 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.162689 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.192944 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd"] Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.213201 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557890-njt7z"] Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.254169 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7srsw\" (UniqueName: 
\"kubernetes.io/projected/f4e86f46-497a-42a5-b15f-fbb484545a18-kube-api-access-7srsw\") pod \"collect-profiles-29557890-glgsd\" (UID: \"f4e86f46-497a-42a5-b15f-fbb484545a18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.254241 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4e86f46-497a-42a5-b15f-fbb484545a18-config-volume\") pod \"collect-profiles-29557890-glgsd\" (UID: \"f4e86f46-497a-42a5-b15f-fbb484545a18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.254265 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4e86f46-497a-42a5-b15f-fbb484545a18-secret-volume\") pod \"collect-profiles-29557890-glgsd\" (UID: \"f4e86f46-497a-42a5-b15f-fbb484545a18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.254393 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5sjw\" (UniqueName: \"kubernetes.io/projected/e22ffa3b-1f27-4da4-9740-ec824287f399-kube-api-access-j5sjw\") pod \"auto-csr-approver-29557890-njt7z\" (UID: \"e22ffa3b-1f27-4da4-9740-ec824287f399\") " pod="openshift-infra/auto-csr-approver-29557890-njt7z" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.355526 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7srsw\" (UniqueName: \"kubernetes.io/projected/f4e86f46-497a-42a5-b15f-fbb484545a18-kube-api-access-7srsw\") pod \"collect-profiles-29557890-glgsd\" (UID: \"f4e86f46-497a-42a5-b15f-fbb484545a18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd" 
Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.355916 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4e86f46-497a-42a5-b15f-fbb484545a18-config-volume\") pod \"collect-profiles-29557890-glgsd\" (UID: \"f4e86f46-497a-42a5-b15f-fbb484545a18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.355951 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4e86f46-497a-42a5-b15f-fbb484545a18-secret-volume\") pod \"collect-profiles-29557890-glgsd\" (UID: \"f4e86f46-497a-42a5-b15f-fbb484545a18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.355988 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5sjw\" (UniqueName: \"kubernetes.io/projected/e22ffa3b-1f27-4da4-9740-ec824287f399-kube-api-access-j5sjw\") pod \"auto-csr-approver-29557890-njt7z\" (UID: \"e22ffa3b-1f27-4da4-9740-ec824287f399\") " pod="openshift-infra/auto-csr-approver-29557890-njt7z" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.356800 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4e86f46-497a-42a5-b15f-fbb484545a18-config-volume\") pod \"collect-profiles-29557890-glgsd\" (UID: \"f4e86f46-497a-42a5-b15f-fbb484545a18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.361403 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4e86f46-497a-42a5-b15f-fbb484545a18-secret-volume\") pod \"collect-profiles-29557890-glgsd\" (UID: \"f4e86f46-497a-42a5-b15f-fbb484545a18\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.379127 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5sjw\" (UniqueName: \"kubernetes.io/projected/e22ffa3b-1f27-4da4-9740-ec824287f399-kube-api-access-j5sjw\") pod \"auto-csr-approver-29557890-njt7z\" (UID: \"e22ffa3b-1f27-4da4-9740-ec824287f399\") " pod="openshift-infra/auto-csr-approver-29557890-njt7z" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.382361 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7srsw\" (UniqueName: \"kubernetes.io/projected/f4e86f46-497a-42a5-b15f-fbb484545a18-kube-api-access-7srsw\") pod \"collect-profiles-29557890-glgsd\" (UID: \"f4e86f46-497a-42a5-b15f-fbb484545a18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.490746 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557890-njt7z" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.502650 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd" Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.912227 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557890-njt7z"] Mar 14 07:30:00 crc kubenswrapper[5129]: W0314 07:30:00.919142 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode22ffa3b_1f27_4da4_9740_ec824287f399.slice/crio-c1f2ec13b88669a73f8b2cef45ba14bb8eff619b629f5e4255f875a87fe1d313 WatchSource:0}: Error finding container c1f2ec13b88669a73f8b2cef45ba14bb8eff619b629f5e4255f875a87fe1d313: Status 404 returned error can't find the container with id c1f2ec13b88669a73f8b2cef45ba14bb8eff619b629f5e4255f875a87fe1d313 Mar 14 07:30:00 crc kubenswrapper[5129]: I0314 07:30:00.985521 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd"] Mar 14 07:30:00 crc kubenswrapper[5129]: W0314 07:30:00.986783 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4e86f46_497a_42a5_b15f_fbb484545a18.slice/crio-114addf9accf1bf6de53257089d32e308d6692c6532274aa51911ec18629e054 WatchSource:0}: Error finding container 114addf9accf1bf6de53257089d32e308d6692c6532274aa51911ec18629e054: Status 404 returned error can't find the container with id 114addf9accf1bf6de53257089d32e308d6692c6532274aa51911ec18629e054 Mar 14 07:30:01 crc kubenswrapper[5129]: I0314 07:30:01.671595 5129 generic.go:334] "Generic (PLEG): container finished" podID="f4e86f46-497a-42a5-b15f-fbb484545a18" containerID="5ffbb904f7121a9609eb49eed71b135d7cfec0a840e955f26cde1e2667cd8115" exitCode=0 Mar 14 07:30:01 crc kubenswrapper[5129]: I0314 07:30:01.671677 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd" 
event={"ID":"f4e86f46-497a-42a5-b15f-fbb484545a18","Type":"ContainerDied","Data":"5ffbb904f7121a9609eb49eed71b135d7cfec0a840e955f26cde1e2667cd8115"} Mar 14 07:30:01 crc kubenswrapper[5129]: I0314 07:30:01.671967 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd" event={"ID":"f4e86f46-497a-42a5-b15f-fbb484545a18","Type":"ContainerStarted","Data":"114addf9accf1bf6de53257089d32e308d6692c6532274aa51911ec18629e054"} Mar 14 07:30:01 crc kubenswrapper[5129]: I0314 07:30:01.673226 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557890-njt7z" event={"ID":"e22ffa3b-1f27-4da4-9740-ec824287f399","Type":"ContainerStarted","Data":"c1f2ec13b88669a73f8b2cef45ba14bb8eff619b629f5e4255f875a87fe1d313"} Mar 14 07:30:02 crc kubenswrapper[5129]: I0314 07:30:02.681841 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557890-njt7z" event={"ID":"e22ffa3b-1f27-4da4-9740-ec824287f399","Type":"ContainerStarted","Data":"bde2e6fc1e221c741b06c8376fb66c51765a55b595a1732802b52d3f9f3e09f9"} Mar 14 07:30:02 crc kubenswrapper[5129]: I0314 07:30:02.700754 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557890-njt7z" podStartSLOduration=1.410658748 podStartE2EDuration="2.700727374s" podCreationTimestamp="2026-03-14 07:30:00 +0000 UTC" firstStartedPulling="2026-03-14 07:30:00.921736808 +0000 UTC m=+1863.673652022" lastFinishedPulling="2026-03-14 07:30:02.211805434 +0000 UTC m=+1864.963720648" observedRunningTime="2026-03-14 07:30:02.699848631 +0000 UTC m=+1865.451763835" watchObservedRunningTime="2026-03-14 07:30:02.700727374 +0000 UTC m=+1865.452642558" Mar 14 07:30:02 crc kubenswrapper[5129]: I0314 07:30:02.991141 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd" Mar 14 07:30:03 crc kubenswrapper[5129]: I0314 07:30:03.098869 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4e86f46-497a-42a5-b15f-fbb484545a18-config-volume\") pod \"f4e86f46-497a-42a5-b15f-fbb484545a18\" (UID: \"f4e86f46-497a-42a5-b15f-fbb484545a18\") " Mar 14 07:30:03 crc kubenswrapper[5129]: I0314 07:30:03.098981 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4e86f46-497a-42a5-b15f-fbb484545a18-secret-volume\") pod \"f4e86f46-497a-42a5-b15f-fbb484545a18\" (UID: \"f4e86f46-497a-42a5-b15f-fbb484545a18\") " Mar 14 07:30:03 crc kubenswrapper[5129]: I0314 07:30:03.099047 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7srsw\" (UniqueName: \"kubernetes.io/projected/f4e86f46-497a-42a5-b15f-fbb484545a18-kube-api-access-7srsw\") pod \"f4e86f46-497a-42a5-b15f-fbb484545a18\" (UID: \"f4e86f46-497a-42a5-b15f-fbb484545a18\") " Mar 14 07:30:03 crc kubenswrapper[5129]: I0314 07:30:03.099359 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e86f46-497a-42a5-b15f-fbb484545a18-config-volume" (OuterVolumeSpecName: "config-volume") pod "f4e86f46-497a-42a5-b15f-fbb484545a18" (UID: "f4e86f46-497a-42a5-b15f-fbb484545a18"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:30:03 crc kubenswrapper[5129]: I0314 07:30:03.099594 5129 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4e86f46-497a-42a5-b15f-fbb484545a18-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:30:03 crc kubenswrapper[5129]: I0314 07:30:03.105311 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e86f46-497a-42a5-b15f-fbb484545a18-kube-api-access-7srsw" (OuterVolumeSpecName: "kube-api-access-7srsw") pod "f4e86f46-497a-42a5-b15f-fbb484545a18" (UID: "f4e86f46-497a-42a5-b15f-fbb484545a18"). InnerVolumeSpecName "kube-api-access-7srsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:30:03 crc kubenswrapper[5129]: I0314 07:30:03.106277 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e86f46-497a-42a5-b15f-fbb484545a18-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f4e86f46-497a-42a5-b15f-fbb484545a18" (UID: "f4e86f46-497a-42a5-b15f-fbb484545a18"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:30:03 crc kubenswrapper[5129]: I0314 07:30:03.201369 5129 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4e86f46-497a-42a5-b15f-fbb484545a18-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:30:03 crc kubenswrapper[5129]: I0314 07:30:03.201926 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7srsw\" (UniqueName: \"kubernetes.io/projected/f4e86f46-497a-42a5-b15f-fbb484545a18-kube-api-access-7srsw\") on node \"crc\" DevicePath \"\"" Mar 14 07:30:03 crc kubenswrapper[5129]: I0314 07:30:03.694329 5129 generic.go:334] "Generic (PLEG): container finished" podID="e22ffa3b-1f27-4da4-9740-ec824287f399" containerID="bde2e6fc1e221c741b06c8376fb66c51765a55b595a1732802b52d3f9f3e09f9" exitCode=0 Mar 14 07:30:03 crc kubenswrapper[5129]: I0314 07:30:03.694574 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557890-njt7z" event={"ID":"e22ffa3b-1f27-4da4-9740-ec824287f399","Type":"ContainerDied","Data":"bde2e6fc1e221c741b06c8376fb66c51765a55b595a1732802b52d3f9f3e09f9"} Mar 14 07:30:03 crc kubenswrapper[5129]: I0314 07:30:03.699974 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd" event={"ID":"f4e86f46-497a-42a5-b15f-fbb484545a18","Type":"ContainerDied","Data":"114addf9accf1bf6de53257089d32e308d6692c6532274aa51911ec18629e054"} Mar 14 07:30:03 crc kubenswrapper[5129]: I0314 07:30:03.700016 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="114addf9accf1bf6de53257089d32e308d6692c6532274aa51911ec18629e054" Mar 14 07:30:03 crc kubenswrapper[5129]: I0314 07:30:03.700165 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd" Mar 14 07:30:04 crc kubenswrapper[5129]: I0314 07:30:04.038054 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:30:04 crc kubenswrapper[5129]: E0314 07:30:04.038338 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:30:05 crc kubenswrapper[5129]: I0314 07:30:05.025359 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557890-njt7z" Mar 14 07:30:05 crc kubenswrapper[5129]: I0314 07:30:05.134362 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5sjw\" (UniqueName: \"kubernetes.io/projected/e22ffa3b-1f27-4da4-9740-ec824287f399-kube-api-access-j5sjw\") pod \"e22ffa3b-1f27-4da4-9740-ec824287f399\" (UID: \"e22ffa3b-1f27-4da4-9740-ec824287f399\") " Mar 14 07:30:05 crc kubenswrapper[5129]: I0314 07:30:05.141483 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e22ffa3b-1f27-4da4-9740-ec824287f399-kube-api-access-j5sjw" (OuterVolumeSpecName: "kube-api-access-j5sjw") pod "e22ffa3b-1f27-4da4-9740-ec824287f399" (UID: "e22ffa3b-1f27-4da4-9740-ec824287f399"). InnerVolumeSpecName "kube-api-access-j5sjw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:30:05 crc kubenswrapper[5129]: I0314 07:30:05.236078 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5sjw\" (UniqueName: \"kubernetes.io/projected/e22ffa3b-1f27-4da4-9740-ec824287f399-kube-api-access-j5sjw\") on node \"crc\" DevicePath \"\"" Mar 14 07:30:05 crc kubenswrapper[5129]: I0314 07:30:05.719065 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557890-njt7z" event={"ID":"e22ffa3b-1f27-4da4-9740-ec824287f399","Type":"ContainerDied","Data":"c1f2ec13b88669a73f8b2cef45ba14bb8eff619b629f5e4255f875a87fe1d313"} Mar 14 07:30:05 crc kubenswrapper[5129]: I0314 07:30:05.719381 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1f2ec13b88669a73f8b2cef45ba14bb8eff619b629f5e4255f875a87fe1d313" Mar 14 07:30:05 crc kubenswrapper[5129]: I0314 07:30:05.719128 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557890-njt7z" Mar 14 07:30:05 crc kubenswrapper[5129]: I0314 07:30:05.769114 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557884-67wtn"] Mar 14 07:30:05 crc kubenswrapper[5129]: I0314 07:30:05.773810 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557884-67wtn"] Mar 14 07:30:06 crc kubenswrapper[5129]: I0314 07:30:06.060458 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57fff26f-ddfc-4ac7-a184-6453a398b6d2" path="/var/lib/kubelet/pods/57fff26f-ddfc-4ac7-a184-6453a398b6d2/volumes" Mar 14 07:30:17 crc kubenswrapper[5129]: I0314 07:30:17.037239 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:30:17 crc kubenswrapper[5129]: E0314 07:30:17.038238 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:30:21 crc kubenswrapper[5129]: I0314 07:30:21.233743 5129 scope.go:117] "RemoveContainer" containerID="f34be114ca921e0d7a9c3d7e6dafcb312a64592a1b2a5df49c39e4250824fd0f" Mar 14 07:30:21 crc kubenswrapper[5129]: I0314 07:30:21.285772 5129 scope.go:117] "RemoveContainer" containerID="248d4f00157a530cd3ea73e75f32845dece93694cb179c040ed0222eb822593e" Mar 14 07:30:21 crc kubenswrapper[5129]: I0314 07:30:21.307717 5129 scope.go:117] "RemoveContainer" containerID="2656cf6975a47fcc3760e2d72b9120d77f55fb73efb37f6c295fa023565203e8" Mar 14 07:30:21 crc kubenswrapper[5129]: I0314 07:30:21.359469 5129 scope.go:117] "RemoveContainer" containerID="df33d7fbb10361dc5cb6e14bde3e758c886880e563b624273a3f21cdeda670f9" Mar 14 07:30:28 crc kubenswrapper[5129]: I0314 07:30:28.043404 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:30:28 crc kubenswrapper[5129]: I0314 07:30:28.903994 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"c38d36f06eb4b757f36b25fb37469413b65aad11bf50477a16b03ccdf9567177"} Mar 14 07:32:00 crc kubenswrapper[5129]: I0314 07:32:00.154746 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557892-l6dww"] Mar 14 07:32:00 crc kubenswrapper[5129]: E0314 07:32:00.155443 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e86f46-497a-42a5-b15f-fbb484545a18" containerName="collect-profiles" Mar 14 07:32:00 crc 
kubenswrapper[5129]: I0314 07:32:00.155455 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e86f46-497a-42a5-b15f-fbb484545a18" containerName="collect-profiles" Mar 14 07:32:00 crc kubenswrapper[5129]: E0314 07:32:00.155465 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22ffa3b-1f27-4da4-9740-ec824287f399" containerName="oc" Mar 14 07:32:00 crc kubenswrapper[5129]: I0314 07:32:00.155471 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22ffa3b-1f27-4da4-9740-ec824287f399" containerName="oc" Mar 14 07:32:00 crc kubenswrapper[5129]: I0314 07:32:00.155596 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4e86f46-497a-42a5-b15f-fbb484545a18" containerName="collect-profiles" Mar 14 07:32:00 crc kubenswrapper[5129]: I0314 07:32:00.155624 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22ffa3b-1f27-4da4-9740-ec824287f399" containerName="oc" Mar 14 07:32:00 crc kubenswrapper[5129]: I0314 07:32:00.156017 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557892-l6dww" Mar 14 07:32:00 crc kubenswrapper[5129]: I0314 07:32:00.158235 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:32:00 crc kubenswrapper[5129]: I0314 07:32:00.161794 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:32:00 crc kubenswrapper[5129]: I0314 07:32:00.167205 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:32:00 crc kubenswrapper[5129]: I0314 07:32:00.183409 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557892-l6dww"] Mar 14 07:32:00 crc kubenswrapper[5129]: I0314 07:32:00.232942 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpn76\" (UniqueName: \"kubernetes.io/projected/26c85724-56b2-4975-bed6-37ccc7d8a8ad-kube-api-access-lpn76\") pod \"auto-csr-approver-29557892-l6dww\" (UID: \"26c85724-56b2-4975-bed6-37ccc7d8a8ad\") " pod="openshift-infra/auto-csr-approver-29557892-l6dww" Mar 14 07:32:00 crc kubenswrapper[5129]: I0314 07:32:00.334657 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpn76\" (UniqueName: \"kubernetes.io/projected/26c85724-56b2-4975-bed6-37ccc7d8a8ad-kube-api-access-lpn76\") pod \"auto-csr-approver-29557892-l6dww\" (UID: \"26c85724-56b2-4975-bed6-37ccc7d8a8ad\") " pod="openshift-infra/auto-csr-approver-29557892-l6dww" Mar 14 07:32:00 crc kubenswrapper[5129]: I0314 07:32:00.358014 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpn76\" (UniqueName: \"kubernetes.io/projected/26c85724-56b2-4975-bed6-37ccc7d8a8ad-kube-api-access-lpn76\") pod \"auto-csr-approver-29557892-l6dww\" (UID: \"26c85724-56b2-4975-bed6-37ccc7d8a8ad\") " 
pod="openshift-infra/auto-csr-approver-29557892-l6dww" Mar 14 07:32:00 crc kubenswrapper[5129]: I0314 07:32:00.469826 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557892-l6dww" Mar 14 07:32:00 crc kubenswrapper[5129]: I0314 07:32:00.743428 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557892-l6dww"] Mar 14 07:32:00 crc kubenswrapper[5129]: I0314 07:32:00.754788 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:32:01 crc kubenswrapper[5129]: I0314 07:32:01.717379 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557892-l6dww" event={"ID":"26c85724-56b2-4975-bed6-37ccc7d8a8ad","Type":"ContainerStarted","Data":"0f92bbd4b2fba26f31d6c49d3ccbdb29362804ce06b22d3d8f3f8b8438371005"} Mar 14 07:32:02 crc kubenswrapper[5129]: I0314 07:32:02.728223 5129 generic.go:334] "Generic (PLEG): container finished" podID="26c85724-56b2-4975-bed6-37ccc7d8a8ad" containerID="7ea8bdbda0cd0bcc1e69c75947cd5755fd87b18950a0b063992d1bfbd733c467" exitCode=0 Mar 14 07:32:02 crc kubenswrapper[5129]: I0314 07:32:02.728279 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557892-l6dww" event={"ID":"26c85724-56b2-4975-bed6-37ccc7d8a8ad","Type":"ContainerDied","Data":"7ea8bdbda0cd0bcc1e69c75947cd5755fd87b18950a0b063992d1bfbd733c467"} Mar 14 07:32:04 crc kubenswrapper[5129]: I0314 07:32:04.066584 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557892-l6dww" Mar 14 07:32:04 crc kubenswrapper[5129]: I0314 07:32:04.206478 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpn76\" (UniqueName: \"kubernetes.io/projected/26c85724-56b2-4975-bed6-37ccc7d8a8ad-kube-api-access-lpn76\") pod \"26c85724-56b2-4975-bed6-37ccc7d8a8ad\" (UID: \"26c85724-56b2-4975-bed6-37ccc7d8a8ad\") " Mar 14 07:32:04 crc kubenswrapper[5129]: I0314 07:32:04.215028 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26c85724-56b2-4975-bed6-37ccc7d8a8ad-kube-api-access-lpn76" (OuterVolumeSpecName: "kube-api-access-lpn76") pod "26c85724-56b2-4975-bed6-37ccc7d8a8ad" (UID: "26c85724-56b2-4975-bed6-37ccc7d8a8ad"). InnerVolumeSpecName "kube-api-access-lpn76". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:32:04 crc kubenswrapper[5129]: I0314 07:32:04.308313 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpn76\" (UniqueName: \"kubernetes.io/projected/26c85724-56b2-4975-bed6-37ccc7d8a8ad-kube-api-access-lpn76\") on node \"crc\" DevicePath \"\"" Mar 14 07:32:04 crc kubenswrapper[5129]: I0314 07:32:04.747115 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557892-l6dww" event={"ID":"26c85724-56b2-4975-bed6-37ccc7d8a8ad","Type":"ContainerDied","Data":"0f92bbd4b2fba26f31d6c49d3ccbdb29362804ce06b22d3d8f3f8b8438371005"} Mar 14 07:32:04 crc kubenswrapper[5129]: I0314 07:32:04.747189 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f92bbd4b2fba26f31d6c49d3ccbdb29362804ce06b22d3d8f3f8b8438371005" Mar 14 07:32:04 crc kubenswrapper[5129]: I0314 07:32:04.747242 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557892-l6dww" Mar 14 07:32:05 crc kubenswrapper[5129]: I0314 07:32:05.139014 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557886-8z79x"] Mar 14 07:32:05 crc kubenswrapper[5129]: I0314 07:32:05.145683 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557886-8z79x"] Mar 14 07:32:06 crc kubenswrapper[5129]: I0314 07:32:06.050063 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="865753e9-fbb2-475f-9b60-c598f5199ecf" path="/var/lib/kubelet/pods/865753e9-fbb2-475f-9b60-c598f5199ecf/volumes" Mar 14 07:32:21 crc kubenswrapper[5129]: I0314 07:32:21.439929 5129 scope.go:117] "RemoveContainer" containerID="546f73617960a1c5914f46c5b758402ea474b5b6eb6eef57c8eafae71f1308c0" Mar 14 07:32:49 crc kubenswrapper[5129]: I0314 07:32:49.574090 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:32:49 crc kubenswrapper[5129]: I0314 07:32:49.574836 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.248458 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g54fl"] Mar 14 07:33:07 crc kubenswrapper[5129]: E0314 07:33:07.249237 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c85724-56b2-4975-bed6-37ccc7d8a8ad" containerName="oc" Mar 14 07:33:07 crc 
kubenswrapper[5129]: I0314 07:33:07.249249 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c85724-56b2-4975-bed6-37ccc7d8a8ad" containerName="oc" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.249378 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="26c85724-56b2-4975-bed6-37ccc7d8a8ad" containerName="oc" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.250279 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g54fl" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.275228 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g54fl"] Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.344709 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ae20eb-9b80-490b-a3de-acce8d9345cd-utilities\") pod \"certified-operators-g54fl\" (UID: \"b2ae20eb-9b80-490b-a3de-acce8d9345cd\") " pod="openshift-marketplace/certified-operators-g54fl" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.344770 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ae20eb-9b80-490b-a3de-acce8d9345cd-catalog-content\") pod \"certified-operators-g54fl\" (UID: \"b2ae20eb-9b80-490b-a3de-acce8d9345cd\") " pod="openshift-marketplace/certified-operators-g54fl" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.344882 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhszx\" (UniqueName: \"kubernetes.io/projected/b2ae20eb-9b80-490b-a3de-acce8d9345cd-kube-api-access-qhszx\") pod \"certified-operators-g54fl\" (UID: \"b2ae20eb-9b80-490b-a3de-acce8d9345cd\") " pod="openshift-marketplace/certified-operators-g54fl" Mar 14 07:33:07 crc 
kubenswrapper[5129]: I0314 07:33:07.446221 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ae20eb-9b80-490b-a3de-acce8d9345cd-catalog-content\") pod \"certified-operators-g54fl\" (UID: \"b2ae20eb-9b80-490b-a3de-acce8d9345cd\") " pod="openshift-marketplace/certified-operators-g54fl" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.446292 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhszx\" (UniqueName: \"kubernetes.io/projected/b2ae20eb-9b80-490b-a3de-acce8d9345cd-kube-api-access-qhszx\") pod \"certified-operators-g54fl\" (UID: \"b2ae20eb-9b80-490b-a3de-acce8d9345cd\") " pod="openshift-marketplace/certified-operators-g54fl" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.446353 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ae20eb-9b80-490b-a3de-acce8d9345cd-utilities\") pod \"certified-operators-g54fl\" (UID: \"b2ae20eb-9b80-490b-a3de-acce8d9345cd\") " pod="openshift-marketplace/certified-operators-g54fl" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.446824 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ae20eb-9b80-490b-a3de-acce8d9345cd-utilities\") pod \"certified-operators-g54fl\" (UID: \"b2ae20eb-9b80-490b-a3de-acce8d9345cd\") " pod="openshift-marketplace/certified-operators-g54fl" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.447078 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ae20eb-9b80-490b-a3de-acce8d9345cd-catalog-content\") pod \"certified-operators-g54fl\" (UID: \"b2ae20eb-9b80-490b-a3de-acce8d9345cd\") " pod="openshift-marketplace/certified-operators-g54fl" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.453242 
5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6lw2v"] Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.455462 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6lw2v" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.470687 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhszx\" (UniqueName: \"kubernetes.io/projected/b2ae20eb-9b80-490b-a3de-acce8d9345cd-kube-api-access-qhszx\") pod \"certified-operators-g54fl\" (UID: \"b2ae20eb-9b80-490b-a3de-acce8d9345cd\") " pod="openshift-marketplace/certified-operators-g54fl" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.471191 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6lw2v"] Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.547988 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/088f4180-432b-4959-acee-c1f8df7ded5e-catalog-content\") pod \"community-operators-6lw2v\" (UID: \"088f4180-432b-4959-acee-c1f8df7ded5e\") " pod="openshift-marketplace/community-operators-6lw2v" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.548056 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/088f4180-432b-4959-acee-c1f8df7ded5e-utilities\") pod \"community-operators-6lw2v\" (UID: \"088f4180-432b-4959-acee-c1f8df7ded5e\") " pod="openshift-marketplace/community-operators-6lw2v" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.548142 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfgqp\" (UniqueName: \"kubernetes.io/projected/088f4180-432b-4959-acee-c1f8df7ded5e-kube-api-access-wfgqp\") pod 
\"community-operators-6lw2v\" (UID: \"088f4180-432b-4959-acee-c1f8df7ded5e\") " pod="openshift-marketplace/community-operators-6lw2v" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.567719 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g54fl" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.649054 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/088f4180-432b-4959-acee-c1f8df7ded5e-catalog-content\") pod \"community-operators-6lw2v\" (UID: \"088f4180-432b-4959-acee-c1f8df7ded5e\") " pod="openshift-marketplace/community-operators-6lw2v" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.649289 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/088f4180-432b-4959-acee-c1f8df7ded5e-utilities\") pod \"community-operators-6lw2v\" (UID: \"088f4180-432b-4959-acee-c1f8df7ded5e\") " pod="openshift-marketplace/community-operators-6lw2v" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.649324 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfgqp\" (UniqueName: \"kubernetes.io/projected/088f4180-432b-4959-acee-c1f8df7ded5e-kube-api-access-wfgqp\") pod \"community-operators-6lw2v\" (UID: \"088f4180-432b-4959-acee-c1f8df7ded5e\") " pod="openshift-marketplace/community-operators-6lw2v" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.649838 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/088f4180-432b-4959-acee-c1f8df7ded5e-catalog-content\") pod \"community-operators-6lw2v\" (UID: \"088f4180-432b-4959-acee-c1f8df7ded5e\") " pod="openshift-marketplace/community-operators-6lw2v" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.649902 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/088f4180-432b-4959-acee-c1f8df7ded5e-utilities\") pod \"community-operators-6lw2v\" (UID: \"088f4180-432b-4959-acee-c1f8df7ded5e\") " pod="openshift-marketplace/community-operators-6lw2v" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.666350 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfgqp\" (UniqueName: \"kubernetes.io/projected/088f4180-432b-4959-acee-c1f8df7ded5e-kube-api-access-wfgqp\") pod \"community-operators-6lw2v\" (UID: \"088f4180-432b-4959-acee-c1f8df7ded5e\") " pod="openshift-marketplace/community-operators-6lw2v" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.805915 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6lw2v" Mar 14 07:33:07 crc kubenswrapper[5129]: I0314 07:33:07.879025 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g54fl"] Mar 14 07:33:07 crc kubenswrapper[5129]: W0314 07:33:07.896004 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2ae20eb_9b80_490b_a3de_acce8d9345cd.slice/crio-de74f92bb0673d83173dfc91a3484660d504540300c56d296f579c7e1903c597 WatchSource:0}: Error finding container de74f92bb0673d83173dfc91a3484660d504540300c56d296f579c7e1903c597: Status 404 returned error can't find the container with id de74f92bb0673d83173dfc91a3484660d504540300c56d296f579c7e1903c597 Mar 14 07:33:08 crc kubenswrapper[5129]: I0314 07:33:08.268727 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6lw2v"] Mar 14 07:33:08 crc kubenswrapper[5129]: W0314 07:33:08.271921 5129 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod088f4180_432b_4959_acee_c1f8df7ded5e.slice/crio-552450e972d690ffd2d6bed75db84345cd42dbb721d0d22f5e76b4c33a60c7d2 WatchSource:0}: Error finding container 552450e972d690ffd2d6bed75db84345cd42dbb721d0d22f5e76b4c33a60c7d2: Status 404 returned error can't find the container with id 552450e972d690ffd2d6bed75db84345cd42dbb721d0d22f5e76b4c33a60c7d2 Mar 14 07:33:08 crc kubenswrapper[5129]: I0314 07:33:08.315790 5129 generic.go:334] "Generic (PLEG): container finished" podID="b2ae20eb-9b80-490b-a3de-acce8d9345cd" containerID="cda08928ed4834ba98b50c424755a2fdf829a41a2202fb93316cb666424a6000" exitCode=0 Mar 14 07:33:08 crc kubenswrapper[5129]: I0314 07:33:08.315847 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g54fl" event={"ID":"b2ae20eb-9b80-490b-a3de-acce8d9345cd","Type":"ContainerDied","Data":"cda08928ed4834ba98b50c424755a2fdf829a41a2202fb93316cb666424a6000"} Mar 14 07:33:08 crc kubenswrapper[5129]: I0314 07:33:08.315909 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g54fl" event={"ID":"b2ae20eb-9b80-490b-a3de-acce8d9345cd","Type":"ContainerStarted","Data":"de74f92bb0673d83173dfc91a3484660d504540300c56d296f579c7e1903c597"} Mar 14 07:33:08 crc kubenswrapper[5129]: I0314 07:33:08.317553 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lw2v" event={"ID":"088f4180-432b-4959-acee-c1f8df7ded5e","Type":"ContainerStarted","Data":"552450e972d690ffd2d6bed75db84345cd42dbb721d0d22f5e76b4c33a60c7d2"} Mar 14 07:33:09 crc kubenswrapper[5129]: I0314 07:33:09.332281 5129 generic.go:334] "Generic (PLEG): container finished" podID="088f4180-432b-4959-acee-c1f8df7ded5e" containerID="c487c3b0ed96535a2fa39f29348a53c2818104ad8e7df2e4dea984b767e3c86a" exitCode=0 Mar 14 07:33:09 crc kubenswrapper[5129]: I0314 07:33:09.332378 5129 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-6lw2v" event={"ID":"088f4180-432b-4959-acee-c1f8df7ded5e","Type":"ContainerDied","Data":"c487c3b0ed96535a2fa39f29348a53c2818104ad8e7df2e4dea984b767e3c86a"} Mar 14 07:33:09 crc kubenswrapper[5129]: I0314 07:33:09.336636 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g54fl" event={"ID":"b2ae20eb-9b80-490b-a3de-acce8d9345cd","Type":"ContainerStarted","Data":"734fe29832ad21a15b6c4e3b56ead326cdf98b043585d5548de6f2f505ccbc61"} Mar 14 07:33:09 crc kubenswrapper[5129]: I0314 07:33:09.874947 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2z2pd"] Mar 14 07:33:09 crc kubenswrapper[5129]: I0314 07:33:09.877432 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2z2pd" Mar 14 07:33:09 crc kubenswrapper[5129]: I0314 07:33:09.888137 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2z2pd"] Mar 14 07:33:09 crc kubenswrapper[5129]: I0314 07:33:09.991128 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1e56a68-e86b-4abd-9dc7-dbcde1ece547-utilities\") pod \"redhat-marketplace-2z2pd\" (UID: \"b1e56a68-e86b-4abd-9dc7-dbcde1ece547\") " pod="openshift-marketplace/redhat-marketplace-2z2pd" Mar 14 07:33:09 crc kubenswrapper[5129]: I0314 07:33:09.991488 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1e56a68-e86b-4abd-9dc7-dbcde1ece547-catalog-content\") pod \"redhat-marketplace-2z2pd\" (UID: \"b1e56a68-e86b-4abd-9dc7-dbcde1ece547\") " pod="openshift-marketplace/redhat-marketplace-2z2pd" Mar 14 07:33:09 crc kubenswrapper[5129]: I0314 07:33:09.991830 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mb2r\" (UniqueName: \"kubernetes.io/projected/b1e56a68-e86b-4abd-9dc7-dbcde1ece547-kube-api-access-2mb2r\") pod \"redhat-marketplace-2z2pd\" (UID: \"b1e56a68-e86b-4abd-9dc7-dbcde1ece547\") " pod="openshift-marketplace/redhat-marketplace-2z2pd" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.092793 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1e56a68-e86b-4abd-9dc7-dbcde1ece547-catalog-content\") pod \"redhat-marketplace-2z2pd\" (UID: \"b1e56a68-e86b-4abd-9dc7-dbcde1ece547\") " pod="openshift-marketplace/redhat-marketplace-2z2pd" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.093099 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mb2r\" (UniqueName: \"kubernetes.io/projected/b1e56a68-e86b-4abd-9dc7-dbcde1ece547-kube-api-access-2mb2r\") pod \"redhat-marketplace-2z2pd\" (UID: \"b1e56a68-e86b-4abd-9dc7-dbcde1ece547\") " pod="openshift-marketplace/redhat-marketplace-2z2pd" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.093160 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1e56a68-e86b-4abd-9dc7-dbcde1ece547-utilities\") pod \"redhat-marketplace-2z2pd\" (UID: \"b1e56a68-e86b-4abd-9dc7-dbcde1ece547\") " pod="openshift-marketplace/redhat-marketplace-2z2pd" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.093575 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1e56a68-e86b-4abd-9dc7-dbcde1ece547-utilities\") pod \"redhat-marketplace-2z2pd\" (UID: \"b1e56a68-e86b-4abd-9dc7-dbcde1ece547\") " pod="openshift-marketplace/redhat-marketplace-2z2pd" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.093581 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1e56a68-e86b-4abd-9dc7-dbcde1ece547-catalog-content\") pod \"redhat-marketplace-2z2pd\" (UID: \"b1e56a68-e86b-4abd-9dc7-dbcde1ece547\") " pod="openshift-marketplace/redhat-marketplace-2z2pd" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.119376 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mb2r\" (UniqueName: \"kubernetes.io/projected/b1e56a68-e86b-4abd-9dc7-dbcde1ece547-kube-api-access-2mb2r\") pod \"redhat-marketplace-2z2pd\" (UID: \"b1e56a68-e86b-4abd-9dc7-dbcde1ece547\") " pod="openshift-marketplace/redhat-marketplace-2z2pd" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.229949 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2z2pd" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.357281 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lw2v" event={"ID":"088f4180-432b-4959-acee-c1f8df7ded5e","Type":"ContainerStarted","Data":"dd029fcf57833890ae4f437f60a2c6db4631d1e497fbc8cfef6fa110762457bc"} Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.359306 5129 generic.go:334] "Generic (PLEG): container finished" podID="b2ae20eb-9b80-490b-a3de-acce8d9345cd" containerID="734fe29832ad21a15b6c4e3b56ead326cdf98b043585d5548de6f2f505ccbc61" exitCode=0 Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.359341 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g54fl" event={"ID":"b2ae20eb-9b80-490b-a3de-acce8d9345cd","Type":"ContainerDied","Data":"734fe29832ad21a15b6c4e3b56ead326cdf98b043585d5548de6f2f505ccbc61"} Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.359360 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g54fl" 
event={"ID":"b2ae20eb-9b80-490b-a3de-acce8d9345cd","Type":"ContainerStarted","Data":"aaa1d6915c9c51ef9b637b54a426f7a4c0687671665e48815b2d4d55cfecf4c6"} Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.452074 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g54fl" podStartSLOduration=2.01360951 podStartE2EDuration="3.452049165s" podCreationTimestamp="2026-03-14 07:33:07 +0000 UTC" firstStartedPulling="2026-03-14 07:33:08.317709291 +0000 UTC m=+2051.069624475" lastFinishedPulling="2026-03-14 07:33:09.756148906 +0000 UTC m=+2052.508064130" observedRunningTime="2026-03-14 07:33:10.398861942 +0000 UTC m=+2053.150777146" watchObservedRunningTime="2026-03-14 07:33:10.452049165 +0000 UTC m=+2053.203964349" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.452581 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-89xtm"] Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.457147 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-89xtm" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.463774 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-89xtm"] Mar 14 07:33:10 crc kubenswrapper[5129]: W0314 07:33:10.482509 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1e56a68_e86b_4abd_9dc7_dbcde1ece547.slice/crio-d37ed38c7c5fdb7a485d491e426f5f8bada149ca566b9b5785d19b8718afcf38 WatchSource:0}: Error finding container d37ed38c7c5fdb7a485d491e426f5f8bada149ca566b9b5785d19b8718afcf38: Status 404 returned error can't find the container with id d37ed38c7c5fdb7a485d491e426f5f8bada149ca566b9b5785d19b8718afcf38 Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.485416 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2z2pd"] Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.502882 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh2lh\" (UniqueName: \"kubernetes.io/projected/6921e59a-7b09-4511-aa9f-a16489e25d31-kube-api-access-hh2lh\") pod \"redhat-operators-89xtm\" (UID: \"6921e59a-7b09-4511-aa9f-a16489e25d31\") " pod="openshift-marketplace/redhat-operators-89xtm" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.502943 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6921e59a-7b09-4511-aa9f-a16489e25d31-catalog-content\") pod \"redhat-operators-89xtm\" (UID: \"6921e59a-7b09-4511-aa9f-a16489e25d31\") " pod="openshift-marketplace/redhat-operators-89xtm" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.503005 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6921e59a-7b09-4511-aa9f-a16489e25d31-utilities\") pod \"redhat-operators-89xtm\" (UID: \"6921e59a-7b09-4511-aa9f-a16489e25d31\") " pod="openshift-marketplace/redhat-operators-89xtm" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.604550 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6921e59a-7b09-4511-aa9f-a16489e25d31-utilities\") pod \"redhat-operators-89xtm\" (UID: \"6921e59a-7b09-4511-aa9f-a16489e25d31\") " pod="openshift-marketplace/redhat-operators-89xtm" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.604655 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh2lh\" (UniqueName: \"kubernetes.io/projected/6921e59a-7b09-4511-aa9f-a16489e25d31-kube-api-access-hh2lh\") pod \"redhat-operators-89xtm\" (UID: \"6921e59a-7b09-4511-aa9f-a16489e25d31\") " pod="openshift-marketplace/redhat-operators-89xtm" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.604688 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6921e59a-7b09-4511-aa9f-a16489e25d31-catalog-content\") pod \"redhat-operators-89xtm\" (UID: \"6921e59a-7b09-4511-aa9f-a16489e25d31\") " pod="openshift-marketplace/redhat-operators-89xtm" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.605199 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6921e59a-7b09-4511-aa9f-a16489e25d31-catalog-content\") pod \"redhat-operators-89xtm\" (UID: \"6921e59a-7b09-4511-aa9f-a16489e25d31\") " pod="openshift-marketplace/redhat-operators-89xtm" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.605199 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6921e59a-7b09-4511-aa9f-a16489e25d31-utilities\") pod \"redhat-operators-89xtm\" (UID: \"6921e59a-7b09-4511-aa9f-a16489e25d31\") " pod="openshift-marketplace/redhat-operators-89xtm" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.621895 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh2lh\" (UniqueName: \"kubernetes.io/projected/6921e59a-7b09-4511-aa9f-a16489e25d31-kube-api-access-hh2lh\") pod \"redhat-operators-89xtm\" (UID: \"6921e59a-7b09-4511-aa9f-a16489e25d31\") " pod="openshift-marketplace/redhat-operators-89xtm" Mar 14 07:33:10 crc kubenswrapper[5129]: I0314 07:33:10.785936 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-89xtm" Mar 14 07:33:11 crc kubenswrapper[5129]: I0314 07:33:11.219222 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-89xtm"] Mar 14 07:33:11 crc kubenswrapper[5129]: I0314 07:33:11.366783 5129 generic.go:334] "Generic (PLEG): container finished" podID="b1e56a68-e86b-4abd-9dc7-dbcde1ece547" containerID="3c6fb92d73b54e0dfacd619404c4b9f1128efadfe2bfa3617ebb0f945268cfc5" exitCode=0 Mar 14 07:33:11 crc kubenswrapper[5129]: I0314 07:33:11.366838 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2z2pd" event={"ID":"b1e56a68-e86b-4abd-9dc7-dbcde1ece547","Type":"ContainerDied","Data":"3c6fb92d73b54e0dfacd619404c4b9f1128efadfe2bfa3617ebb0f945268cfc5"} Mar 14 07:33:11 crc kubenswrapper[5129]: I0314 07:33:11.366905 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2z2pd" event={"ID":"b1e56a68-e86b-4abd-9dc7-dbcde1ece547","Type":"ContainerStarted","Data":"d37ed38c7c5fdb7a485d491e426f5f8bada149ca566b9b5785d19b8718afcf38"} Mar 14 07:33:11 crc kubenswrapper[5129]: I0314 07:33:11.369183 5129 generic.go:334] "Generic (PLEG): container finished" 
podID="088f4180-432b-4959-acee-c1f8df7ded5e" containerID="dd029fcf57833890ae4f437f60a2c6db4631d1e497fbc8cfef6fa110762457bc" exitCode=0 Mar 14 07:33:11 crc kubenswrapper[5129]: I0314 07:33:11.369273 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lw2v" event={"ID":"088f4180-432b-4959-acee-c1f8df7ded5e","Type":"ContainerDied","Data":"dd029fcf57833890ae4f437f60a2c6db4631d1e497fbc8cfef6fa110762457bc"} Mar 14 07:33:11 crc kubenswrapper[5129]: I0314 07:33:11.372697 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89xtm" event={"ID":"6921e59a-7b09-4511-aa9f-a16489e25d31","Type":"ContainerStarted","Data":"972221d93c171016b884815977e2e888126a81b1a8bd2f9863b630fcfcc2aebe"} Mar 14 07:33:11 crc kubenswrapper[5129]: I0314 07:33:11.372727 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89xtm" event={"ID":"6921e59a-7b09-4511-aa9f-a16489e25d31","Type":"ContainerStarted","Data":"9f2d85f286fee9b1a01e37534e9ae5736f8cb35601b39a685703c555e59f98b1"} Mar 14 07:33:12 crc kubenswrapper[5129]: I0314 07:33:12.382817 5129 generic.go:334] "Generic (PLEG): container finished" podID="b1e56a68-e86b-4abd-9dc7-dbcde1ece547" containerID="646647e9a0404209b57c3311967b39858c8a632a39a3c11dc23a2466438c70e7" exitCode=0 Mar 14 07:33:12 crc kubenswrapper[5129]: I0314 07:33:12.382896 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2z2pd" event={"ID":"b1e56a68-e86b-4abd-9dc7-dbcde1ece547","Type":"ContainerDied","Data":"646647e9a0404209b57c3311967b39858c8a632a39a3c11dc23a2466438c70e7"} Mar 14 07:33:12 crc kubenswrapper[5129]: I0314 07:33:12.387464 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lw2v" event={"ID":"088f4180-432b-4959-acee-c1f8df7ded5e","Type":"ContainerStarted","Data":"731f59a11390d3c45ef0eccfca9a9e481afd5f3c0fcf9250ba3cfbe0b5d5ac8d"} 
Mar 14 07:33:12 crc kubenswrapper[5129]: I0314 07:33:12.389191 5129 generic.go:334] "Generic (PLEG): container finished" podID="6921e59a-7b09-4511-aa9f-a16489e25d31" containerID="972221d93c171016b884815977e2e888126a81b1a8bd2f9863b630fcfcc2aebe" exitCode=0 Mar 14 07:33:12 crc kubenswrapper[5129]: I0314 07:33:12.389242 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89xtm" event={"ID":"6921e59a-7b09-4511-aa9f-a16489e25d31","Type":"ContainerDied","Data":"972221d93c171016b884815977e2e888126a81b1a8bd2f9863b630fcfcc2aebe"} Mar 14 07:33:12 crc kubenswrapper[5129]: I0314 07:33:12.456691 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6lw2v" podStartSLOduration=3.029435811 podStartE2EDuration="5.45667388s" podCreationTimestamp="2026-03-14 07:33:07 +0000 UTC" firstStartedPulling="2026-03-14 07:33:09.334101146 +0000 UTC m=+2052.086016330" lastFinishedPulling="2026-03-14 07:33:11.761339215 +0000 UTC m=+2054.513254399" observedRunningTime="2026-03-14 07:33:12.455664232 +0000 UTC m=+2055.207579436" watchObservedRunningTime="2026-03-14 07:33:12.45667388 +0000 UTC m=+2055.208589064" Mar 14 07:33:13 crc kubenswrapper[5129]: I0314 07:33:13.401382 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2z2pd" event={"ID":"b1e56a68-e86b-4abd-9dc7-dbcde1ece547","Type":"ContainerStarted","Data":"e50719558b274a9b565c6f484e3922a036b79562093399b1e59106e4e4a1901b"} Mar 14 07:33:13 crc kubenswrapper[5129]: I0314 07:33:13.431578 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2z2pd" podStartSLOduration=3.022265875 podStartE2EDuration="4.431558328s" podCreationTimestamp="2026-03-14 07:33:09 +0000 UTC" firstStartedPulling="2026-03-14 07:33:11.369429203 +0000 UTC m=+2054.121344397" lastFinishedPulling="2026-03-14 07:33:12.778721676 +0000 UTC m=+2055.530636850" 
observedRunningTime="2026-03-14 07:33:13.424194508 +0000 UTC m=+2056.176109712" watchObservedRunningTime="2026-03-14 07:33:13.431558328 +0000 UTC m=+2056.183473522" Mar 14 07:33:14 crc kubenswrapper[5129]: I0314 07:33:14.409536 5129 generic.go:334] "Generic (PLEG): container finished" podID="6921e59a-7b09-4511-aa9f-a16489e25d31" containerID="3bf455f90d5f698866df5a8427254033e87f7f99a798a5bf07fb0fcb8ec54f1b" exitCode=0 Mar 14 07:33:14 crc kubenswrapper[5129]: I0314 07:33:14.409621 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89xtm" event={"ID":"6921e59a-7b09-4511-aa9f-a16489e25d31","Type":"ContainerDied","Data":"3bf455f90d5f698866df5a8427254033e87f7f99a798a5bf07fb0fcb8ec54f1b"} Mar 14 07:33:15 crc kubenswrapper[5129]: I0314 07:33:15.421577 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89xtm" event={"ID":"6921e59a-7b09-4511-aa9f-a16489e25d31","Type":"ContainerStarted","Data":"17c91eac4f3a4e26231248859ab69dd1e6f0fc9a582cc25673b308ce3ec1e20e"} Mar 14 07:33:15 crc kubenswrapper[5129]: I0314 07:33:15.442181 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-89xtm" podStartSLOduration=2.932800496 podStartE2EDuration="5.442159924s" podCreationTimestamp="2026-03-14 07:33:10 +0000 UTC" firstStartedPulling="2026-03-14 07:33:12.390938036 +0000 UTC m=+2055.142853220" lastFinishedPulling="2026-03-14 07:33:14.900297454 +0000 UTC m=+2057.652212648" observedRunningTime="2026-03-14 07:33:15.437111917 +0000 UTC m=+2058.189027131" watchObservedRunningTime="2026-03-14 07:33:15.442159924 +0000 UTC m=+2058.194075118" Mar 14 07:33:17 crc kubenswrapper[5129]: I0314 07:33:17.568164 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g54fl" Mar 14 07:33:17 crc kubenswrapper[5129]: I0314 07:33:17.568510 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-g54fl" Mar 14 07:33:17 crc kubenswrapper[5129]: I0314 07:33:17.609931 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g54fl" Mar 14 07:33:17 crc kubenswrapper[5129]: I0314 07:33:17.807086 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6lw2v" Mar 14 07:33:17 crc kubenswrapper[5129]: I0314 07:33:17.807150 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6lw2v" Mar 14 07:33:17 crc kubenswrapper[5129]: I0314 07:33:17.843413 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6lw2v" Mar 14 07:33:18 crc kubenswrapper[5129]: I0314 07:33:18.490224 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6lw2v" Mar 14 07:33:18 crc kubenswrapper[5129]: I0314 07:33:18.492285 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g54fl" Mar 14 07:33:19 crc kubenswrapper[5129]: I0314 07:33:19.574038 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:33:19 crc kubenswrapper[5129]: I0314 07:33:19.574121 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:33:20 crc 
kubenswrapper[5129]: I0314 07:33:20.230847 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2z2pd" Mar 14 07:33:20 crc kubenswrapper[5129]: I0314 07:33:20.233769 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2z2pd" Mar 14 07:33:20 crc kubenswrapper[5129]: I0314 07:33:20.243379 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g54fl"] Mar 14 07:33:20 crc kubenswrapper[5129]: I0314 07:33:20.280717 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2z2pd" Mar 14 07:33:20 crc kubenswrapper[5129]: I0314 07:33:20.454037 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g54fl" podUID="b2ae20eb-9b80-490b-a3de-acce8d9345cd" containerName="registry-server" containerID="cri-o://aaa1d6915c9c51ef9b637b54a426f7a4c0687671665e48815b2d4d55cfecf4c6" gracePeriod=2 Mar 14 07:33:20 crc kubenswrapper[5129]: I0314 07:33:20.494405 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2z2pd" Mar 14 07:33:20 crc kubenswrapper[5129]: I0314 07:33:20.786105 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-89xtm" Mar 14 07:33:20 crc kubenswrapper[5129]: I0314 07:33:20.786561 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-89xtm" Mar 14 07:33:20 crc kubenswrapper[5129]: I0314 07:33:20.844070 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6lw2v"] Mar 14 07:33:20 crc kubenswrapper[5129]: I0314 07:33:20.844338 5129 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-6lw2v" podUID="088f4180-432b-4959-acee-c1f8df7ded5e" containerName="registry-server" containerID="cri-o://731f59a11390d3c45ef0eccfca9a9e481afd5f3c0fcf9250ba3cfbe0b5d5ac8d" gracePeriod=2 Mar 14 07:33:21 crc kubenswrapper[5129]: I0314 07:33:21.835819 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-89xtm" podUID="6921e59a-7b09-4511-aa9f-a16489e25d31" containerName="registry-server" probeResult="failure" output=< Mar 14 07:33:21 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 07:33:21 crc kubenswrapper[5129]: > Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.399407 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6lw2v" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.420278 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/088f4180-432b-4959-acee-c1f8df7ded5e-catalog-content\") pod \"088f4180-432b-4959-acee-c1f8df7ded5e\" (UID: \"088f4180-432b-4959-acee-c1f8df7ded5e\") " Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.420924 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfgqp\" (UniqueName: \"kubernetes.io/projected/088f4180-432b-4959-acee-c1f8df7ded5e-kube-api-access-wfgqp\") pod \"088f4180-432b-4959-acee-c1f8df7ded5e\" (UID: \"088f4180-432b-4959-acee-c1f8df7ded5e\") " Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.420978 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/088f4180-432b-4959-acee-c1f8df7ded5e-utilities\") pod \"088f4180-432b-4959-acee-c1f8df7ded5e\" (UID: \"088f4180-432b-4959-acee-c1f8df7ded5e\") " Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.423067 5129 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/088f4180-432b-4959-acee-c1f8df7ded5e-utilities" (OuterVolumeSpecName: "utilities") pod "088f4180-432b-4959-acee-c1f8df7ded5e" (UID: "088f4180-432b-4959-acee-c1f8df7ded5e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.428414 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/088f4180-432b-4959-acee-c1f8df7ded5e-kube-api-access-wfgqp" (OuterVolumeSpecName: "kube-api-access-wfgqp") pod "088f4180-432b-4959-acee-c1f8df7ded5e" (UID: "088f4180-432b-4959-acee-c1f8df7ded5e"). InnerVolumeSpecName "kube-api-access-wfgqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.471891 5129 generic.go:334] "Generic (PLEG): container finished" podID="088f4180-432b-4959-acee-c1f8df7ded5e" containerID="731f59a11390d3c45ef0eccfca9a9e481afd5f3c0fcf9250ba3cfbe0b5d5ac8d" exitCode=0 Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.471960 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lw2v" event={"ID":"088f4180-432b-4959-acee-c1f8df7ded5e","Type":"ContainerDied","Data":"731f59a11390d3c45ef0eccfca9a9e481afd5f3c0fcf9250ba3cfbe0b5d5ac8d"} Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.471968 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6lw2v" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.471986 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lw2v" event={"ID":"088f4180-432b-4959-acee-c1f8df7ded5e","Type":"ContainerDied","Data":"552450e972d690ffd2d6bed75db84345cd42dbb721d0d22f5e76b4c33a60c7d2"} Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.472004 5129 scope.go:117] "RemoveContainer" containerID="731f59a11390d3c45ef0eccfca9a9e481afd5f3c0fcf9250ba3cfbe0b5d5ac8d" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.477038 5129 generic.go:334] "Generic (PLEG): container finished" podID="b2ae20eb-9b80-490b-a3de-acce8d9345cd" containerID="aaa1d6915c9c51ef9b637b54a426f7a4c0687671665e48815b2d4d55cfecf4c6" exitCode=0 Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.477921 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g54fl" event={"ID":"b2ae20eb-9b80-490b-a3de-acce8d9345cd","Type":"ContainerDied","Data":"aaa1d6915c9c51ef9b637b54a426f7a4c0687671665e48815b2d4d55cfecf4c6"} Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.500768 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/088f4180-432b-4959-acee-c1f8df7ded5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "088f4180-432b-4959-acee-c1f8df7ded5e" (UID: "088f4180-432b-4959-acee-c1f8df7ded5e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.501530 5129 scope.go:117] "RemoveContainer" containerID="dd029fcf57833890ae4f437f60a2c6db4631d1e497fbc8cfef6fa110762457bc" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.525064 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/088f4180-432b-4959-acee-c1f8df7ded5e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.525108 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfgqp\" (UniqueName: \"kubernetes.io/projected/088f4180-432b-4959-acee-c1f8df7ded5e-kube-api-access-wfgqp\") on node \"crc\" DevicePath \"\"" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.525124 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/088f4180-432b-4959-acee-c1f8df7ded5e-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.532534 5129 scope.go:117] "RemoveContainer" containerID="c487c3b0ed96535a2fa39f29348a53c2818104ad8e7df2e4dea984b767e3c86a" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.555207 5129 scope.go:117] "RemoveContainer" containerID="731f59a11390d3c45ef0eccfca9a9e481afd5f3c0fcf9250ba3cfbe0b5d5ac8d" Mar 14 07:33:22 crc kubenswrapper[5129]: E0314 07:33:22.559974 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"731f59a11390d3c45ef0eccfca9a9e481afd5f3c0fcf9250ba3cfbe0b5d5ac8d\": container with ID starting with 731f59a11390d3c45ef0eccfca9a9e481afd5f3c0fcf9250ba3cfbe0b5d5ac8d not found: ID does not exist" containerID="731f59a11390d3c45ef0eccfca9a9e481afd5f3c0fcf9250ba3cfbe0b5d5ac8d" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.560025 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"731f59a11390d3c45ef0eccfca9a9e481afd5f3c0fcf9250ba3cfbe0b5d5ac8d"} err="failed to get container status \"731f59a11390d3c45ef0eccfca9a9e481afd5f3c0fcf9250ba3cfbe0b5d5ac8d\": rpc error: code = NotFound desc = could not find container \"731f59a11390d3c45ef0eccfca9a9e481afd5f3c0fcf9250ba3cfbe0b5d5ac8d\": container with ID starting with 731f59a11390d3c45ef0eccfca9a9e481afd5f3c0fcf9250ba3cfbe0b5d5ac8d not found: ID does not exist" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.560055 5129 scope.go:117] "RemoveContainer" containerID="dd029fcf57833890ae4f437f60a2c6db4631d1e497fbc8cfef6fa110762457bc" Mar 14 07:33:22 crc kubenswrapper[5129]: E0314 07:33:22.560444 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd029fcf57833890ae4f437f60a2c6db4631d1e497fbc8cfef6fa110762457bc\": container with ID starting with dd029fcf57833890ae4f437f60a2c6db4631d1e497fbc8cfef6fa110762457bc not found: ID does not exist" containerID="dd029fcf57833890ae4f437f60a2c6db4631d1e497fbc8cfef6fa110762457bc" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.560488 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd029fcf57833890ae4f437f60a2c6db4631d1e497fbc8cfef6fa110762457bc"} err="failed to get container status \"dd029fcf57833890ae4f437f60a2c6db4631d1e497fbc8cfef6fa110762457bc\": rpc error: code = NotFound desc = could not find container \"dd029fcf57833890ae4f437f60a2c6db4631d1e497fbc8cfef6fa110762457bc\": container with ID starting with dd029fcf57833890ae4f437f60a2c6db4631d1e497fbc8cfef6fa110762457bc not found: ID does not exist" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.560518 5129 scope.go:117] "RemoveContainer" containerID="c487c3b0ed96535a2fa39f29348a53c2818104ad8e7df2e4dea984b767e3c86a" Mar 14 07:33:22 crc kubenswrapper[5129]: E0314 07:33:22.561047 5129 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c487c3b0ed96535a2fa39f29348a53c2818104ad8e7df2e4dea984b767e3c86a\": container with ID starting with c487c3b0ed96535a2fa39f29348a53c2818104ad8e7df2e4dea984b767e3c86a not found: ID does not exist" containerID="c487c3b0ed96535a2fa39f29348a53c2818104ad8e7df2e4dea984b767e3c86a" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.561082 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c487c3b0ed96535a2fa39f29348a53c2818104ad8e7df2e4dea984b767e3c86a"} err="failed to get container status \"c487c3b0ed96535a2fa39f29348a53c2818104ad8e7df2e4dea984b767e3c86a\": rpc error: code = NotFound desc = could not find container \"c487c3b0ed96535a2fa39f29348a53c2818104ad8e7df2e4dea984b767e3c86a\": container with ID starting with c487c3b0ed96535a2fa39f29348a53c2818104ad8e7df2e4dea984b767e3c86a not found: ID does not exist" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.636142 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g54fl" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.801087 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6lw2v"] Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.805489 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6lw2v"] Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.828318 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ae20eb-9b80-490b-a3de-acce8d9345cd-catalog-content\") pod \"b2ae20eb-9b80-490b-a3de-acce8d9345cd\" (UID: \"b2ae20eb-9b80-490b-a3de-acce8d9345cd\") " Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.828423 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhszx\" (UniqueName: \"kubernetes.io/projected/b2ae20eb-9b80-490b-a3de-acce8d9345cd-kube-api-access-qhszx\") pod \"b2ae20eb-9b80-490b-a3de-acce8d9345cd\" (UID: \"b2ae20eb-9b80-490b-a3de-acce8d9345cd\") " Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.828703 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ae20eb-9b80-490b-a3de-acce8d9345cd-utilities\") pod \"b2ae20eb-9b80-490b-a3de-acce8d9345cd\" (UID: \"b2ae20eb-9b80-490b-a3de-acce8d9345cd\") " Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.829522 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2ae20eb-9b80-490b-a3de-acce8d9345cd-utilities" (OuterVolumeSpecName: "utilities") pod "b2ae20eb-9b80-490b-a3de-acce8d9345cd" (UID: "b2ae20eb-9b80-490b-a3de-acce8d9345cd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.833823 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2ae20eb-9b80-490b-a3de-acce8d9345cd-kube-api-access-qhszx" (OuterVolumeSpecName: "kube-api-access-qhszx") pod "b2ae20eb-9b80-490b-a3de-acce8d9345cd" (UID: "b2ae20eb-9b80-490b-a3de-acce8d9345cd"). InnerVolumeSpecName "kube-api-access-qhszx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.879773 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2ae20eb-9b80-490b-a3de-acce8d9345cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2ae20eb-9b80-490b-a3de-acce8d9345cd" (UID: "b2ae20eb-9b80-490b-a3de-acce8d9345cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.929915 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ae20eb-9b80-490b-a3de-acce8d9345cd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.929951 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhszx\" (UniqueName: \"kubernetes.io/projected/b2ae20eb-9b80-490b-a3de-acce8d9345cd-kube-api-access-qhszx\") on node \"crc\" DevicePath \"\"" Mar 14 07:33:22 crc kubenswrapper[5129]: I0314 07:33:22.929962 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ae20eb-9b80-490b-a3de-acce8d9345cd-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:33:23 crc kubenswrapper[5129]: I0314 07:33:23.489719 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g54fl" 
event={"ID":"b2ae20eb-9b80-490b-a3de-acce8d9345cd","Type":"ContainerDied","Data":"de74f92bb0673d83173dfc91a3484660d504540300c56d296f579c7e1903c597"} Mar 14 07:33:23 crc kubenswrapper[5129]: I0314 07:33:23.489770 5129 scope.go:117] "RemoveContainer" containerID="aaa1d6915c9c51ef9b637b54a426f7a4c0687671665e48815b2d4d55cfecf4c6" Mar 14 07:33:23 crc kubenswrapper[5129]: I0314 07:33:23.489883 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g54fl" Mar 14 07:33:23 crc kubenswrapper[5129]: I0314 07:33:23.504475 5129 scope.go:117] "RemoveContainer" containerID="734fe29832ad21a15b6c4e3b56ead326cdf98b043585d5548de6f2f505ccbc61" Mar 14 07:33:23 crc kubenswrapper[5129]: I0314 07:33:23.523977 5129 scope.go:117] "RemoveContainer" containerID="cda08928ed4834ba98b50c424755a2fdf829a41a2202fb93316cb666424a6000" Mar 14 07:33:23 crc kubenswrapper[5129]: I0314 07:33:23.582551 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g54fl"] Mar 14 07:33:23 crc kubenswrapper[5129]: I0314 07:33:23.589369 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g54fl"] Mar 14 07:33:24 crc kubenswrapper[5129]: I0314 07:33:24.052181 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="088f4180-432b-4959-acee-c1f8df7ded5e" path="/var/lib/kubelet/pods/088f4180-432b-4959-acee-c1f8df7ded5e/volumes" Mar 14 07:33:24 crc kubenswrapper[5129]: I0314 07:33:24.053689 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2ae20eb-9b80-490b-a3de-acce8d9345cd" path="/var/lib/kubelet/pods/b2ae20eb-9b80-490b-a3de-acce8d9345cd/volumes" Mar 14 07:33:24 crc kubenswrapper[5129]: I0314 07:33:24.853296 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2z2pd"] Mar 14 07:33:24 crc kubenswrapper[5129]: I0314 07:33:24.855374 5129 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2z2pd" podUID="b1e56a68-e86b-4abd-9dc7-dbcde1ece547" containerName="registry-server" containerID="cri-o://e50719558b274a9b565c6f484e3922a036b79562093399b1e59106e4e4a1901b" gracePeriod=2 Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.288029 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2z2pd" Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.466448 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1e56a68-e86b-4abd-9dc7-dbcde1ece547-utilities\") pod \"b1e56a68-e86b-4abd-9dc7-dbcde1ece547\" (UID: \"b1e56a68-e86b-4abd-9dc7-dbcde1ece547\") " Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.466562 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mb2r\" (UniqueName: \"kubernetes.io/projected/b1e56a68-e86b-4abd-9dc7-dbcde1ece547-kube-api-access-2mb2r\") pod \"b1e56a68-e86b-4abd-9dc7-dbcde1ece547\" (UID: \"b1e56a68-e86b-4abd-9dc7-dbcde1ece547\") " Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.466723 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1e56a68-e86b-4abd-9dc7-dbcde1ece547-catalog-content\") pod \"b1e56a68-e86b-4abd-9dc7-dbcde1ece547\" (UID: \"b1e56a68-e86b-4abd-9dc7-dbcde1ece547\") " Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.468286 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1e56a68-e86b-4abd-9dc7-dbcde1ece547-utilities" (OuterVolumeSpecName: "utilities") pod "b1e56a68-e86b-4abd-9dc7-dbcde1ece547" (UID: "b1e56a68-e86b-4abd-9dc7-dbcde1ece547"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.474899 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1e56a68-e86b-4abd-9dc7-dbcde1ece547-kube-api-access-2mb2r" (OuterVolumeSpecName: "kube-api-access-2mb2r") pod "b1e56a68-e86b-4abd-9dc7-dbcde1ece547" (UID: "b1e56a68-e86b-4abd-9dc7-dbcde1ece547"). InnerVolumeSpecName "kube-api-access-2mb2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.498402 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1e56a68-e86b-4abd-9dc7-dbcde1ece547-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1e56a68-e86b-4abd-9dc7-dbcde1ece547" (UID: "b1e56a68-e86b-4abd-9dc7-dbcde1ece547"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.508978 5129 generic.go:334] "Generic (PLEG): container finished" podID="b1e56a68-e86b-4abd-9dc7-dbcde1ece547" containerID="e50719558b274a9b565c6f484e3922a036b79562093399b1e59106e4e4a1901b" exitCode=0 Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.509024 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2z2pd" event={"ID":"b1e56a68-e86b-4abd-9dc7-dbcde1ece547","Type":"ContainerDied","Data":"e50719558b274a9b565c6f484e3922a036b79562093399b1e59106e4e4a1901b"} Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.509047 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2z2pd" Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.509064 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2z2pd" event={"ID":"b1e56a68-e86b-4abd-9dc7-dbcde1ece547","Type":"ContainerDied","Data":"d37ed38c7c5fdb7a485d491e426f5f8bada149ca566b9b5785d19b8718afcf38"} Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.509083 5129 scope.go:117] "RemoveContainer" containerID="e50719558b274a9b565c6f484e3922a036b79562093399b1e59106e4e4a1901b" Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.530285 5129 scope.go:117] "RemoveContainer" containerID="646647e9a0404209b57c3311967b39858c8a632a39a3c11dc23a2466438c70e7" Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.544197 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2z2pd"] Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.549446 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2z2pd"] Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.564650 5129 scope.go:117] "RemoveContainer" containerID="3c6fb92d73b54e0dfacd619404c4b9f1128efadfe2bfa3617ebb0f945268cfc5" Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.569026 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1e56a68-e86b-4abd-9dc7-dbcde1ece547-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.569056 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mb2r\" (UniqueName: \"kubernetes.io/projected/b1e56a68-e86b-4abd-9dc7-dbcde1ece547-kube-api-access-2mb2r\") on node \"crc\" DevicePath \"\"" Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.569070 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b1e56a68-e86b-4abd-9dc7-dbcde1ece547-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.580520 5129 scope.go:117] "RemoveContainer" containerID="e50719558b274a9b565c6f484e3922a036b79562093399b1e59106e4e4a1901b" Mar 14 07:33:25 crc kubenswrapper[5129]: E0314 07:33:25.580961 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e50719558b274a9b565c6f484e3922a036b79562093399b1e59106e4e4a1901b\": container with ID starting with e50719558b274a9b565c6f484e3922a036b79562093399b1e59106e4e4a1901b not found: ID does not exist" containerID="e50719558b274a9b565c6f484e3922a036b79562093399b1e59106e4e4a1901b" Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.581000 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e50719558b274a9b565c6f484e3922a036b79562093399b1e59106e4e4a1901b"} err="failed to get container status \"e50719558b274a9b565c6f484e3922a036b79562093399b1e59106e4e4a1901b\": rpc error: code = NotFound desc = could not find container \"e50719558b274a9b565c6f484e3922a036b79562093399b1e59106e4e4a1901b\": container with ID starting with e50719558b274a9b565c6f484e3922a036b79562093399b1e59106e4e4a1901b not found: ID does not exist" Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.581044 5129 scope.go:117] "RemoveContainer" containerID="646647e9a0404209b57c3311967b39858c8a632a39a3c11dc23a2466438c70e7" Mar 14 07:33:25 crc kubenswrapper[5129]: E0314 07:33:25.581417 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"646647e9a0404209b57c3311967b39858c8a632a39a3c11dc23a2466438c70e7\": container with ID starting with 646647e9a0404209b57c3311967b39858c8a632a39a3c11dc23a2466438c70e7 not found: ID does not exist" containerID="646647e9a0404209b57c3311967b39858c8a632a39a3c11dc23a2466438c70e7" Mar 
14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.581456 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"646647e9a0404209b57c3311967b39858c8a632a39a3c11dc23a2466438c70e7"} err="failed to get container status \"646647e9a0404209b57c3311967b39858c8a632a39a3c11dc23a2466438c70e7\": rpc error: code = NotFound desc = could not find container \"646647e9a0404209b57c3311967b39858c8a632a39a3c11dc23a2466438c70e7\": container with ID starting with 646647e9a0404209b57c3311967b39858c8a632a39a3c11dc23a2466438c70e7 not found: ID does not exist" Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.581476 5129 scope.go:117] "RemoveContainer" containerID="3c6fb92d73b54e0dfacd619404c4b9f1128efadfe2bfa3617ebb0f945268cfc5" Mar 14 07:33:25 crc kubenswrapper[5129]: E0314 07:33:25.581867 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c6fb92d73b54e0dfacd619404c4b9f1128efadfe2bfa3617ebb0f945268cfc5\": container with ID starting with 3c6fb92d73b54e0dfacd619404c4b9f1128efadfe2bfa3617ebb0f945268cfc5 not found: ID does not exist" containerID="3c6fb92d73b54e0dfacd619404c4b9f1128efadfe2bfa3617ebb0f945268cfc5" Mar 14 07:33:25 crc kubenswrapper[5129]: I0314 07:33:25.581897 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c6fb92d73b54e0dfacd619404c4b9f1128efadfe2bfa3617ebb0f945268cfc5"} err="failed to get container status \"3c6fb92d73b54e0dfacd619404c4b9f1128efadfe2bfa3617ebb0f945268cfc5\": rpc error: code = NotFound desc = could not find container \"3c6fb92d73b54e0dfacd619404c4b9f1128efadfe2bfa3617ebb0f945268cfc5\": container with ID starting with 3c6fb92d73b54e0dfacd619404c4b9f1128efadfe2bfa3617ebb0f945268cfc5 not found: ID does not exist" Mar 14 07:33:26 crc kubenswrapper[5129]: I0314 07:33:26.049043 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b1e56a68-e86b-4abd-9dc7-dbcde1ece547" path="/var/lib/kubelet/pods/b1e56a68-e86b-4abd-9dc7-dbcde1ece547/volumes" Mar 14 07:33:30 crc kubenswrapper[5129]: I0314 07:33:30.865950 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-89xtm" Mar 14 07:33:30 crc kubenswrapper[5129]: I0314 07:33:30.948961 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-89xtm" Mar 14 07:33:33 crc kubenswrapper[5129]: I0314 07:33:33.862002 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-89xtm"] Mar 14 07:33:33 crc kubenswrapper[5129]: I0314 07:33:33.862771 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-89xtm" podUID="6921e59a-7b09-4511-aa9f-a16489e25d31" containerName="registry-server" containerID="cri-o://17c91eac4f3a4e26231248859ab69dd1e6f0fc9a582cc25673b308ce3ec1e20e" gracePeriod=2 Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.347140 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-89xtm" Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.401412 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6921e59a-7b09-4511-aa9f-a16489e25d31-utilities\") pod \"6921e59a-7b09-4511-aa9f-a16489e25d31\" (UID: \"6921e59a-7b09-4511-aa9f-a16489e25d31\") " Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.401529 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6921e59a-7b09-4511-aa9f-a16489e25d31-catalog-content\") pod \"6921e59a-7b09-4511-aa9f-a16489e25d31\" (UID: \"6921e59a-7b09-4511-aa9f-a16489e25d31\") " Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.401564 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh2lh\" (UniqueName: \"kubernetes.io/projected/6921e59a-7b09-4511-aa9f-a16489e25d31-kube-api-access-hh2lh\") pod \"6921e59a-7b09-4511-aa9f-a16489e25d31\" (UID: \"6921e59a-7b09-4511-aa9f-a16489e25d31\") " Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.402507 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6921e59a-7b09-4511-aa9f-a16489e25d31-utilities" (OuterVolumeSpecName: "utilities") pod "6921e59a-7b09-4511-aa9f-a16489e25d31" (UID: "6921e59a-7b09-4511-aa9f-a16489e25d31"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.403563 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6921e59a-7b09-4511-aa9f-a16489e25d31-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.408457 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6921e59a-7b09-4511-aa9f-a16489e25d31-kube-api-access-hh2lh" (OuterVolumeSpecName: "kube-api-access-hh2lh") pod "6921e59a-7b09-4511-aa9f-a16489e25d31" (UID: "6921e59a-7b09-4511-aa9f-a16489e25d31"). InnerVolumeSpecName "kube-api-access-hh2lh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.505729 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh2lh\" (UniqueName: \"kubernetes.io/projected/6921e59a-7b09-4511-aa9f-a16489e25d31-kube-api-access-hh2lh\") on node \"crc\" DevicePath \"\"" Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.538661 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6921e59a-7b09-4511-aa9f-a16489e25d31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6921e59a-7b09-4511-aa9f-a16489e25d31" (UID: "6921e59a-7b09-4511-aa9f-a16489e25d31"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.592330 5129 generic.go:334] "Generic (PLEG): container finished" podID="6921e59a-7b09-4511-aa9f-a16489e25d31" containerID="17c91eac4f3a4e26231248859ab69dd1e6f0fc9a582cc25673b308ce3ec1e20e" exitCode=0 Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.592385 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89xtm" event={"ID":"6921e59a-7b09-4511-aa9f-a16489e25d31","Type":"ContainerDied","Data":"17c91eac4f3a4e26231248859ab69dd1e6f0fc9a582cc25673b308ce3ec1e20e"} Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.592397 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-89xtm" Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.592417 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89xtm" event={"ID":"6921e59a-7b09-4511-aa9f-a16489e25d31","Type":"ContainerDied","Data":"9f2d85f286fee9b1a01e37534e9ae5736f8cb35601b39a685703c555e59f98b1"} Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.592440 5129 scope.go:117] "RemoveContainer" containerID="17c91eac4f3a4e26231248859ab69dd1e6f0fc9a582cc25673b308ce3ec1e20e" Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.606818 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6921e59a-7b09-4511-aa9f-a16489e25d31-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.627707 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-89xtm"] Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.628059 5129 scope.go:117] "RemoveContainer" containerID="3bf455f90d5f698866df5a8427254033e87f7f99a798a5bf07fb0fcb8ec54f1b" Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 
07:33:34.636141 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-89xtm"] Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.670352 5129 scope.go:117] "RemoveContainer" containerID="972221d93c171016b884815977e2e888126a81b1a8bd2f9863b630fcfcc2aebe" Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.685426 5129 scope.go:117] "RemoveContainer" containerID="17c91eac4f3a4e26231248859ab69dd1e6f0fc9a582cc25673b308ce3ec1e20e" Mar 14 07:33:34 crc kubenswrapper[5129]: E0314 07:33:34.685844 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17c91eac4f3a4e26231248859ab69dd1e6f0fc9a582cc25673b308ce3ec1e20e\": container with ID starting with 17c91eac4f3a4e26231248859ab69dd1e6f0fc9a582cc25673b308ce3ec1e20e not found: ID does not exist" containerID="17c91eac4f3a4e26231248859ab69dd1e6f0fc9a582cc25673b308ce3ec1e20e" Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.685913 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17c91eac4f3a4e26231248859ab69dd1e6f0fc9a582cc25673b308ce3ec1e20e"} err="failed to get container status \"17c91eac4f3a4e26231248859ab69dd1e6f0fc9a582cc25673b308ce3ec1e20e\": rpc error: code = NotFound desc = could not find container \"17c91eac4f3a4e26231248859ab69dd1e6f0fc9a582cc25673b308ce3ec1e20e\": container with ID starting with 17c91eac4f3a4e26231248859ab69dd1e6f0fc9a582cc25673b308ce3ec1e20e not found: ID does not exist" Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.685950 5129 scope.go:117] "RemoveContainer" containerID="3bf455f90d5f698866df5a8427254033e87f7f99a798a5bf07fb0fcb8ec54f1b" Mar 14 07:33:34 crc kubenswrapper[5129]: E0314 07:33:34.686361 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bf455f90d5f698866df5a8427254033e87f7f99a798a5bf07fb0fcb8ec54f1b\": container with ID 
starting with 3bf455f90d5f698866df5a8427254033e87f7f99a798a5bf07fb0fcb8ec54f1b not found: ID does not exist" containerID="3bf455f90d5f698866df5a8427254033e87f7f99a798a5bf07fb0fcb8ec54f1b" Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.686400 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bf455f90d5f698866df5a8427254033e87f7f99a798a5bf07fb0fcb8ec54f1b"} err="failed to get container status \"3bf455f90d5f698866df5a8427254033e87f7f99a798a5bf07fb0fcb8ec54f1b\": rpc error: code = NotFound desc = could not find container \"3bf455f90d5f698866df5a8427254033e87f7f99a798a5bf07fb0fcb8ec54f1b\": container with ID starting with 3bf455f90d5f698866df5a8427254033e87f7f99a798a5bf07fb0fcb8ec54f1b not found: ID does not exist" Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.686426 5129 scope.go:117] "RemoveContainer" containerID="972221d93c171016b884815977e2e888126a81b1a8bd2f9863b630fcfcc2aebe" Mar 14 07:33:34 crc kubenswrapper[5129]: E0314 07:33:34.686713 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"972221d93c171016b884815977e2e888126a81b1a8bd2f9863b630fcfcc2aebe\": container with ID starting with 972221d93c171016b884815977e2e888126a81b1a8bd2f9863b630fcfcc2aebe not found: ID does not exist" containerID="972221d93c171016b884815977e2e888126a81b1a8bd2f9863b630fcfcc2aebe" Mar 14 07:33:34 crc kubenswrapper[5129]: I0314 07:33:34.686748 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"972221d93c171016b884815977e2e888126a81b1a8bd2f9863b630fcfcc2aebe"} err="failed to get container status \"972221d93c171016b884815977e2e888126a81b1a8bd2f9863b630fcfcc2aebe\": rpc error: code = NotFound desc = could not find container \"972221d93c171016b884815977e2e888126a81b1a8bd2f9863b630fcfcc2aebe\": container with ID starting with 972221d93c171016b884815977e2e888126a81b1a8bd2f9863b630fcfcc2aebe not found: 
ID does not exist" Mar 14 07:33:36 crc kubenswrapper[5129]: I0314 07:33:36.052888 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6921e59a-7b09-4511-aa9f-a16489e25d31" path="/var/lib/kubelet/pods/6921e59a-7b09-4511-aa9f-a16489e25d31/volumes" Mar 14 07:33:49 crc kubenswrapper[5129]: I0314 07:33:49.574592 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:33:49 crc kubenswrapper[5129]: I0314 07:33:49.575359 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:33:49 crc kubenswrapper[5129]: I0314 07:33:49.575433 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:33:49 crc kubenswrapper[5129]: I0314 07:33:49.576267 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c38d36f06eb4b757f36b25fb37469413b65aad11bf50477a16b03ccdf9567177"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:33:49 crc kubenswrapper[5129]: I0314 07:33:49.576380 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" 
containerID="cri-o://c38d36f06eb4b757f36b25fb37469413b65aad11bf50477a16b03ccdf9567177" gracePeriod=600 Mar 14 07:33:49 crc kubenswrapper[5129]: I0314 07:33:49.728877 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="c38d36f06eb4b757f36b25fb37469413b65aad11bf50477a16b03ccdf9567177" exitCode=0 Mar 14 07:33:49 crc kubenswrapper[5129]: I0314 07:33:49.729003 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"c38d36f06eb4b757f36b25fb37469413b65aad11bf50477a16b03ccdf9567177"} Mar 14 07:33:49 crc kubenswrapper[5129]: I0314 07:33:49.729444 5129 scope.go:117] "RemoveContainer" containerID="2b4d78cd181fdd7b96499c453c28e591272e4450948f977a6bc78fc661d7c0a3" Mar 14 07:33:50 crc kubenswrapper[5129]: I0314 07:33:50.739628 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"} Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.156124 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557894-9j6lw"] Mar 14 07:34:00 crc kubenswrapper[5129]: E0314 07:34:00.158784 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="088f4180-432b-4959-acee-c1f8df7ded5e" containerName="extract-content" Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.158966 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="088f4180-432b-4959-acee-c1f8df7ded5e" containerName="extract-content" Mar 14 07:34:00 crc kubenswrapper[5129]: E0314 07:34:00.159114 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6921e59a-7b09-4511-aa9f-a16489e25d31" containerName="registry-server" Mar 14 07:34:00 
crc kubenswrapper[5129]: I0314 07:34:00.159229 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6921e59a-7b09-4511-aa9f-a16489e25d31" containerName="registry-server"
Mar 14 07:34:00 crc kubenswrapper[5129]: E0314 07:34:00.159375 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ae20eb-9b80-490b-a3de-acce8d9345cd" containerName="extract-utilities"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.159518 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ae20eb-9b80-490b-a3de-acce8d9345cd" containerName="extract-utilities"
Mar 14 07:34:00 crc kubenswrapper[5129]: E0314 07:34:00.159687 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1e56a68-e86b-4abd-9dc7-dbcde1ece547" containerName="registry-server"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.159819 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1e56a68-e86b-4abd-9dc7-dbcde1ece547" containerName="registry-server"
Mar 14 07:34:00 crc kubenswrapper[5129]: E0314 07:34:00.159959 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ae20eb-9b80-490b-a3de-acce8d9345cd" containerName="extract-content"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.160128 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ae20eb-9b80-490b-a3de-acce8d9345cd" containerName="extract-content"
Mar 14 07:34:00 crc kubenswrapper[5129]: E0314 07:34:00.160311 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1e56a68-e86b-4abd-9dc7-dbcde1ece547" containerName="extract-utilities"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.160471 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1e56a68-e86b-4abd-9dc7-dbcde1ece547" containerName="extract-utilities"
Mar 14 07:34:00 crc kubenswrapper[5129]: E0314 07:34:00.160636 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6921e59a-7b09-4511-aa9f-a16489e25d31" containerName="extract-utilities"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.160769 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6921e59a-7b09-4511-aa9f-a16489e25d31" containerName="extract-utilities"
Mar 14 07:34:00 crc kubenswrapper[5129]: E0314 07:34:00.160863 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="088f4180-432b-4959-acee-c1f8df7ded5e" containerName="registry-server"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.160934 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="088f4180-432b-4959-acee-c1f8df7ded5e" containerName="registry-server"
Mar 14 07:34:00 crc kubenswrapper[5129]: E0314 07:34:00.161013 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="088f4180-432b-4959-acee-c1f8df7ded5e" containerName="extract-utilities"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.161112 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="088f4180-432b-4959-acee-c1f8df7ded5e" containerName="extract-utilities"
Mar 14 07:34:00 crc kubenswrapper[5129]: E0314 07:34:00.161232 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6921e59a-7b09-4511-aa9f-a16489e25d31" containerName="extract-content"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.161315 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6921e59a-7b09-4511-aa9f-a16489e25d31" containerName="extract-content"
Mar 14 07:34:00 crc kubenswrapper[5129]: E0314 07:34:00.161415 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ae20eb-9b80-490b-a3de-acce8d9345cd" containerName="registry-server"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.161490 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ae20eb-9b80-490b-a3de-acce8d9345cd" containerName="registry-server"
Mar 14 07:34:00 crc kubenswrapper[5129]: E0314 07:34:00.161571 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1e56a68-e86b-4abd-9dc7-dbcde1ece547" containerName="extract-content"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.161668 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1e56a68-e86b-4abd-9dc7-dbcde1ece547" containerName="extract-content"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.161936 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="088f4180-432b-4959-acee-c1f8df7ded5e" containerName="registry-server"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.162027 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1e56a68-e86b-4abd-9dc7-dbcde1ece547" containerName="registry-server"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.162107 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2ae20eb-9b80-490b-a3de-acce8d9345cd" containerName="registry-server"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.162199 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6921e59a-7b09-4511-aa9f-a16489e25d31" containerName="registry-server"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.163573 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557894-9j6lw"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.167784 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.167865 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557894-9j6lw"]
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.168448 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.168765 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.325255 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqnzh\" (UniqueName: \"kubernetes.io/projected/6c1755e7-2714-4bf6-8d54-af6fb12c3bac-kube-api-access-qqnzh\") pod \"auto-csr-approver-29557894-9j6lw\" (UID: \"6c1755e7-2714-4bf6-8d54-af6fb12c3bac\") " pod="openshift-infra/auto-csr-approver-29557894-9j6lw"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.426467 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqnzh\" (UniqueName: \"kubernetes.io/projected/6c1755e7-2714-4bf6-8d54-af6fb12c3bac-kube-api-access-qqnzh\") pod \"auto-csr-approver-29557894-9j6lw\" (UID: \"6c1755e7-2714-4bf6-8d54-af6fb12c3bac\") " pod="openshift-infra/auto-csr-approver-29557894-9j6lw"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.450202 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqnzh\" (UniqueName: \"kubernetes.io/projected/6c1755e7-2714-4bf6-8d54-af6fb12c3bac-kube-api-access-qqnzh\") pod \"auto-csr-approver-29557894-9j6lw\" (UID: \"6c1755e7-2714-4bf6-8d54-af6fb12c3bac\") " pod="openshift-infra/auto-csr-approver-29557894-9j6lw"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.485181 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557894-9j6lw"
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.719465 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557894-9j6lw"]
Mar 14 07:34:00 crc kubenswrapper[5129]: I0314 07:34:00.830445 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557894-9j6lw" event={"ID":"6c1755e7-2714-4bf6-8d54-af6fb12c3bac","Type":"ContainerStarted","Data":"261af8fb8a96f4001bd55529c7e5e387842ce2b33b2def927721ff5bf8eca953"}
Mar 14 07:34:02 crc kubenswrapper[5129]: I0314 07:34:02.849526 5129 generic.go:334] "Generic (PLEG): container finished" podID="6c1755e7-2714-4bf6-8d54-af6fb12c3bac" containerID="254a88eab577e75abb06cda05f53d3360d6511e53f5e35affb41b1e22c4c4a83" exitCode=0
Mar 14 07:34:02 crc kubenswrapper[5129]: I0314 07:34:02.849649 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557894-9j6lw" event={"ID":"6c1755e7-2714-4bf6-8d54-af6fb12c3bac","Type":"ContainerDied","Data":"254a88eab577e75abb06cda05f53d3360d6511e53f5e35affb41b1e22c4c4a83"}
Mar 14 07:34:04 crc kubenswrapper[5129]: I0314 07:34:04.073068 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557894-9j6lw"
Mar 14 07:34:04 crc kubenswrapper[5129]: I0314 07:34:04.179223 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqnzh\" (UniqueName: \"kubernetes.io/projected/6c1755e7-2714-4bf6-8d54-af6fb12c3bac-kube-api-access-qqnzh\") pod \"6c1755e7-2714-4bf6-8d54-af6fb12c3bac\" (UID: \"6c1755e7-2714-4bf6-8d54-af6fb12c3bac\") "
Mar 14 07:34:04 crc kubenswrapper[5129]: I0314 07:34:04.188202 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c1755e7-2714-4bf6-8d54-af6fb12c3bac-kube-api-access-qqnzh" (OuterVolumeSpecName: "kube-api-access-qqnzh") pod "6c1755e7-2714-4bf6-8d54-af6fb12c3bac" (UID: "6c1755e7-2714-4bf6-8d54-af6fb12c3bac"). InnerVolumeSpecName "kube-api-access-qqnzh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:34:04 crc kubenswrapper[5129]: I0314 07:34:04.280796 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqnzh\" (UniqueName: \"kubernetes.io/projected/6c1755e7-2714-4bf6-8d54-af6fb12c3bac-kube-api-access-qqnzh\") on node \"crc\" DevicePath \"\""
Mar 14 07:34:04 crc kubenswrapper[5129]: I0314 07:34:04.862705 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557894-9j6lw" event={"ID":"6c1755e7-2714-4bf6-8d54-af6fb12c3bac","Type":"ContainerDied","Data":"261af8fb8a96f4001bd55529c7e5e387842ce2b33b2def927721ff5bf8eca953"}
Mar 14 07:34:04 crc kubenswrapper[5129]: I0314 07:34:04.862999 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="261af8fb8a96f4001bd55529c7e5e387842ce2b33b2def927721ff5bf8eca953"
Mar 14 07:34:04 crc kubenswrapper[5129]: I0314 07:34:04.862761 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557894-9j6lw"
Mar 14 07:34:05 crc kubenswrapper[5129]: I0314 07:34:05.162794 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557888-ktwdm"]
Mar 14 07:34:05 crc kubenswrapper[5129]: I0314 07:34:05.168861 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557888-ktwdm"]
Mar 14 07:34:06 crc kubenswrapper[5129]: I0314 07:34:06.051225 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6636a98f-5cd3-4141-81c8-93128c0dce7b" path="/var/lib/kubelet/pods/6636a98f-5cd3-4141-81c8-93128c0dce7b/volumes"
Mar 14 07:34:21 crc kubenswrapper[5129]: I0314 07:34:21.546055 5129 scope.go:117] "RemoveContainer" containerID="5dde55797b5380580f1237f698461c0b7685a5c3287904143dc18e06be309f1a"
Mar 14 07:35:49 crc kubenswrapper[5129]: I0314 07:35:49.575010 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 07:35:49 crc kubenswrapper[5129]: I0314 07:35:49.576644 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 07:36:00 crc kubenswrapper[5129]: I0314 07:36:00.146025 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557896-f4vsw"]
Mar 14 07:36:00 crc kubenswrapper[5129]: E0314 07:36:00.147145 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c1755e7-2714-4bf6-8d54-af6fb12c3bac" containerName="oc"
Mar 14 07:36:00 crc kubenswrapper[5129]: I0314 07:36:00.147165 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1755e7-2714-4bf6-8d54-af6fb12c3bac" containerName="oc"
Mar 14 07:36:00 crc kubenswrapper[5129]: I0314 07:36:00.147324 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c1755e7-2714-4bf6-8d54-af6fb12c3bac" containerName="oc"
Mar 14 07:36:00 crc kubenswrapper[5129]: I0314 07:36:00.147888 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557896-f4vsw"
Mar 14 07:36:00 crc kubenswrapper[5129]: I0314 07:36:00.152554 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 07:36:00 crc kubenswrapper[5129]: I0314 07:36:00.152778 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 07:36:00 crc kubenswrapper[5129]: I0314 07:36:00.153239 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf"
Mar 14 07:36:00 crc kubenswrapper[5129]: I0314 07:36:00.168136 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557896-f4vsw"]
Mar 14 07:36:00 crc kubenswrapper[5129]: I0314 07:36:00.243272 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksmgd\" (UniqueName: \"kubernetes.io/projected/ae02d693-9a0e-4c2e-b555-772be4b508fa-kube-api-access-ksmgd\") pod \"auto-csr-approver-29557896-f4vsw\" (UID: \"ae02d693-9a0e-4c2e-b555-772be4b508fa\") " pod="openshift-infra/auto-csr-approver-29557896-f4vsw"
Mar 14 07:36:00 crc kubenswrapper[5129]: I0314 07:36:00.344677 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksmgd\" (UniqueName: \"kubernetes.io/projected/ae02d693-9a0e-4c2e-b555-772be4b508fa-kube-api-access-ksmgd\") pod \"auto-csr-approver-29557896-f4vsw\" (UID: \"ae02d693-9a0e-4c2e-b555-772be4b508fa\") " pod="openshift-infra/auto-csr-approver-29557896-f4vsw"
Mar 14 07:36:00 crc kubenswrapper[5129]: I0314 07:36:00.370120 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksmgd\" (UniqueName: \"kubernetes.io/projected/ae02d693-9a0e-4c2e-b555-772be4b508fa-kube-api-access-ksmgd\") pod \"auto-csr-approver-29557896-f4vsw\" (UID: \"ae02d693-9a0e-4c2e-b555-772be4b508fa\") " pod="openshift-infra/auto-csr-approver-29557896-f4vsw"
Mar 14 07:36:00 crc kubenswrapper[5129]: I0314 07:36:00.467548 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557896-f4vsw"
Mar 14 07:36:01 crc kubenswrapper[5129]: I0314 07:36:01.003406 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557896-f4vsw"]
Mar 14 07:36:01 crc kubenswrapper[5129]: I0314 07:36:01.771511 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557896-f4vsw" event={"ID":"ae02d693-9a0e-4c2e-b555-772be4b508fa","Type":"ContainerStarted","Data":"6d70c154bfa8ebec8e21114805e576191abb4a26284f620566f83be153187214"}
Mar 14 07:36:02 crc kubenswrapper[5129]: I0314 07:36:02.781519 5129 generic.go:334] "Generic (PLEG): container finished" podID="ae02d693-9a0e-4c2e-b555-772be4b508fa" containerID="51567081588d0db9bd7b8de6784c5d414d6dfb55e0bfcbc40e8df813c9c61b24" exitCode=0
Mar 14 07:36:02 crc kubenswrapper[5129]: I0314 07:36:02.781576 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557896-f4vsw" event={"ID":"ae02d693-9a0e-4c2e-b555-772be4b508fa","Type":"ContainerDied","Data":"51567081588d0db9bd7b8de6784c5d414d6dfb55e0bfcbc40e8df813c9c61b24"}
Mar 14 07:36:04 crc kubenswrapper[5129]: I0314 07:36:04.094079 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557896-f4vsw"
Mar 14 07:36:04 crc kubenswrapper[5129]: I0314 07:36:04.204805 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksmgd\" (UniqueName: \"kubernetes.io/projected/ae02d693-9a0e-4c2e-b555-772be4b508fa-kube-api-access-ksmgd\") pod \"ae02d693-9a0e-4c2e-b555-772be4b508fa\" (UID: \"ae02d693-9a0e-4c2e-b555-772be4b508fa\") "
Mar 14 07:36:04 crc kubenswrapper[5129]: I0314 07:36:04.211526 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae02d693-9a0e-4c2e-b555-772be4b508fa-kube-api-access-ksmgd" (OuterVolumeSpecName: "kube-api-access-ksmgd") pod "ae02d693-9a0e-4c2e-b555-772be4b508fa" (UID: "ae02d693-9a0e-4c2e-b555-772be4b508fa"). InnerVolumeSpecName "kube-api-access-ksmgd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:36:04 crc kubenswrapper[5129]: I0314 07:36:04.307073 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksmgd\" (UniqueName: \"kubernetes.io/projected/ae02d693-9a0e-4c2e-b555-772be4b508fa-kube-api-access-ksmgd\") on node \"crc\" DevicePath \"\""
Mar 14 07:36:04 crc kubenswrapper[5129]: I0314 07:36:04.801679 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557896-f4vsw" event={"ID":"ae02d693-9a0e-4c2e-b555-772be4b508fa","Type":"ContainerDied","Data":"6d70c154bfa8ebec8e21114805e576191abb4a26284f620566f83be153187214"}
Mar 14 07:36:04 crc kubenswrapper[5129]: I0314 07:36:04.801714 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557896-f4vsw"
Mar 14 07:36:04 crc kubenswrapper[5129]: I0314 07:36:04.801749 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d70c154bfa8ebec8e21114805e576191abb4a26284f620566f83be153187214"
Mar 14 07:36:05 crc kubenswrapper[5129]: I0314 07:36:05.164010 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557890-njt7z"]
Mar 14 07:36:05 crc kubenswrapper[5129]: I0314 07:36:05.169549 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557890-njt7z"]
Mar 14 07:36:06 crc kubenswrapper[5129]: I0314 07:36:06.046363 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e22ffa3b-1f27-4da4-9740-ec824287f399" path="/var/lib/kubelet/pods/e22ffa3b-1f27-4da4-9740-ec824287f399/volumes"
Mar 14 07:36:19 crc kubenswrapper[5129]: I0314 07:36:19.574919 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 07:36:19 crc kubenswrapper[5129]: I0314 07:36:19.575476 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 07:36:21 crc kubenswrapper[5129]: I0314 07:36:21.689736 5129 scope.go:117] "RemoveContainer" containerID="bde2e6fc1e221c741b06c8376fb66c51765a55b595a1732802b52d3f9f3e09f9"
Mar 14 07:36:49 crc kubenswrapper[5129]: I0314 07:36:49.574925 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 07:36:49 crc kubenswrapper[5129]: I0314 07:36:49.575770 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 07:36:49 crc kubenswrapper[5129]: I0314 07:36:49.575870 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh"
Mar 14 07:36:49 crc kubenswrapper[5129]: I0314 07:36:49.577315 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 14 07:36:49 crc kubenswrapper[5129]: I0314 07:36:49.577675 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7" gracePeriod=600
Mar 14 07:36:49 crc kubenswrapper[5129]: E0314 07:36:49.708063 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:36:50 crc kubenswrapper[5129]: I0314 07:36:50.209101 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7" exitCode=0
Mar 14 07:36:50 crc kubenswrapper[5129]: I0314 07:36:50.209158 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"}
Mar 14 07:36:50 crc kubenswrapper[5129]: I0314 07:36:50.209226 5129 scope.go:117] "RemoveContainer" containerID="c38d36f06eb4b757f36b25fb37469413b65aad11bf50477a16b03ccdf9567177"
Mar 14 07:36:50 crc kubenswrapper[5129]: I0314 07:36:50.210100 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"
Mar 14 07:36:50 crc kubenswrapper[5129]: E0314 07:36:50.210834 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:37:05 crc kubenswrapper[5129]: I0314 07:37:05.036040 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"
Mar 14 07:37:05 crc kubenswrapper[5129]: E0314 07:37:05.036834 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:37:18 crc kubenswrapper[5129]: I0314 07:37:18.043933 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"
Mar 14 07:37:18 crc kubenswrapper[5129]: E0314 07:37:18.044711 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:37:31 crc kubenswrapper[5129]: I0314 07:37:31.036322 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"
Mar 14 07:37:31 crc kubenswrapper[5129]: E0314 07:37:31.037227 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:37:43 crc kubenswrapper[5129]: I0314 07:37:43.036709 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"
Mar 14 07:37:43 crc kubenswrapper[5129]: E0314 07:37:43.037918 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:37:54 crc kubenswrapper[5129]: I0314 07:37:54.037574 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"
Mar 14 07:37:54 crc kubenswrapper[5129]: E0314 07:37:54.040546 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:38:00 crc kubenswrapper[5129]: I0314 07:38:00.142201 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557898-74j7m"]
Mar 14 07:38:00 crc kubenswrapper[5129]: E0314 07:38:00.143969 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae02d693-9a0e-4c2e-b555-772be4b508fa" containerName="oc"
Mar 14 07:38:00 crc kubenswrapper[5129]: I0314 07:38:00.143990 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae02d693-9a0e-4c2e-b555-772be4b508fa" containerName="oc"
Mar 14 07:38:00 crc kubenswrapper[5129]: I0314 07:38:00.144148 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae02d693-9a0e-4c2e-b555-772be4b508fa" containerName="oc"
Mar 14 07:38:00 crc kubenswrapper[5129]: I0314 07:38:00.144709 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557898-74j7m"
Mar 14 07:38:00 crc kubenswrapper[5129]: I0314 07:38:00.148301 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 07:38:00 crc kubenswrapper[5129]: I0314 07:38:00.148755 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557898-74j7m"]
Mar 14 07:38:00 crc kubenswrapper[5129]: I0314 07:38:00.148905 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 07:38:00 crc kubenswrapper[5129]: I0314 07:38:00.149108 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf"
Mar 14 07:38:00 crc kubenswrapper[5129]: I0314 07:38:00.271098 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkqtz\" (UniqueName: \"kubernetes.io/projected/af5cfd68-61db-42d2-8a82-f22021b73518-kube-api-access-kkqtz\") pod \"auto-csr-approver-29557898-74j7m\" (UID: \"af5cfd68-61db-42d2-8a82-f22021b73518\") " pod="openshift-infra/auto-csr-approver-29557898-74j7m"
Mar 14 07:38:00 crc kubenswrapper[5129]: I0314 07:38:00.372056 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkqtz\" (UniqueName: \"kubernetes.io/projected/af5cfd68-61db-42d2-8a82-f22021b73518-kube-api-access-kkqtz\") pod \"auto-csr-approver-29557898-74j7m\" (UID: \"af5cfd68-61db-42d2-8a82-f22021b73518\") " pod="openshift-infra/auto-csr-approver-29557898-74j7m"
Mar 14 07:38:00 crc kubenswrapper[5129]: I0314 07:38:00.390115 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkqtz\" (UniqueName: \"kubernetes.io/projected/af5cfd68-61db-42d2-8a82-f22021b73518-kube-api-access-kkqtz\") pod \"auto-csr-approver-29557898-74j7m\" (UID: \"af5cfd68-61db-42d2-8a82-f22021b73518\") " pod="openshift-infra/auto-csr-approver-29557898-74j7m"
Mar 14 07:38:00 crc kubenswrapper[5129]: I0314 07:38:00.464215 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557898-74j7m"
Mar 14 07:38:00 crc kubenswrapper[5129]: I0314 07:38:00.698470 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557898-74j7m"]
Mar 14 07:38:00 crc kubenswrapper[5129]: I0314 07:38:00.711418 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 14 07:38:00 crc kubenswrapper[5129]: I0314 07:38:00.816562 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557898-74j7m" event={"ID":"af5cfd68-61db-42d2-8a82-f22021b73518","Type":"ContainerStarted","Data":"6b6363cda9734e5acc28d3d3b19ab55222c960d80175de8822f458df502fd7b5"}
Mar 14 07:38:02 crc kubenswrapper[5129]: I0314 07:38:02.832405 5129 generic.go:334] "Generic (PLEG): container finished" podID="af5cfd68-61db-42d2-8a82-f22021b73518" containerID="25155cf366438569197a23b3ac310c69a09b8148d444c82f1f9dd463457e3e96" exitCode=0
Mar 14 07:38:02 crc kubenswrapper[5129]: I0314 07:38:02.832450 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557898-74j7m" event={"ID":"af5cfd68-61db-42d2-8a82-f22021b73518","Type":"ContainerDied","Data":"25155cf366438569197a23b3ac310c69a09b8148d444c82f1f9dd463457e3e96"}
Mar 14 07:38:04 crc kubenswrapper[5129]: I0314 07:38:04.164769 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557898-74j7m"
Mar 14 07:38:04 crc kubenswrapper[5129]: I0314 07:38:04.325715 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkqtz\" (UniqueName: \"kubernetes.io/projected/af5cfd68-61db-42d2-8a82-f22021b73518-kube-api-access-kkqtz\") pod \"af5cfd68-61db-42d2-8a82-f22021b73518\" (UID: \"af5cfd68-61db-42d2-8a82-f22021b73518\") "
Mar 14 07:38:04 crc kubenswrapper[5129]: I0314 07:38:04.331735 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af5cfd68-61db-42d2-8a82-f22021b73518-kube-api-access-kkqtz" (OuterVolumeSpecName: "kube-api-access-kkqtz") pod "af5cfd68-61db-42d2-8a82-f22021b73518" (UID: "af5cfd68-61db-42d2-8a82-f22021b73518"). InnerVolumeSpecName "kube-api-access-kkqtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:38:04 crc kubenswrapper[5129]: I0314 07:38:04.427119 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkqtz\" (UniqueName: \"kubernetes.io/projected/af5cfd68-61db-42d2-8a82-f22021b73518-kube-api-access-kkqtz\") on node \"crc\" DevicePath \"\""
Mar 14 07:38:04 crc kubenswrapper[5129]: I0314 07:38:04.850060 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557898-74j7m" event={"ID":"af5cfd68-61db-42d2-8a82-f22021b73518","Type":"ContainerDied","Data":"6b6363cda9734e5acc28d3d3b19ab55222c960d80175de8822f458df502fd7b5"}
Mar 14 07:38:04 crc kubenswrapper[5129]: I0314 07:38:04.850106 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b6363cda9734e5acc28d3d3b19ab55222c960d80175de8822f458df502fd7b5"
Mar 14 07:38:04 crc kubenswrapper[5129]: I0314 07:38:04.850104 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557898-74j7m"
Mar 14 07:38:05 crc kubenswrapper[5129]: I0314 07:38:05.035862 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"
Mar 14 07:38:05 crc kubenswrapper[5129]: E0314 07:38:05.036252 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:38:05 crc kubenswrapper[5129]: I0314 07:38:05.241908 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557892-l6dww"]
Mar 14 07:38:05 crc kubenswrapper[5129]: I0314 07:38:05.249530 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557892-l6dww"]
Mar 14 07:38:06 crc kubenswrapper[5129]: I0314 07:38:06.044865 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26c85724-56b2-4975-bed6-37ccc7d8a8ad" path="/var/lib/kubelet/pods/26c85724-56b2-4975-bed6-37ccc7d8a8ad/volumes"
Mar 14 07:38:17 crc kubenswrapper[5129]: I0314 07:38:17.037304 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"
Mar 14 07:38:17 crc kubenswrapper[5129]: E0314 07:38:17.039650 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:38:21 crc kubenswrapper[5129]: I0314 07:38:21.793146 5129 scope.go:117] "RemoveContainer" containerID="7ea8bdbda0cd0bcc1e69c75947cd5755fd87b18950a0b063992d1bfbd733c467"
Mar 14 07:38:31 crc kubenswrapper[5129]: I0314 07:38:31.037351 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"
Mar 14 07:38:31 crc kubenswrapper[5129]: E0314 07:38:31.038224 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:38:44 crc kubenswrapper[5129]: I0314 07:38:44.036678 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"
Mar 14 07:38:44 crc kubenswrapper[5129]: E0314 07:38:44.037896 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:38:56 crc kubenswrapper[5129]: I0314 07:38:56.036942 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"
Mar 14 07:38:56 crc kubenswrapper[5129]: E0314 07:38:56.037698 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:39:09 crc kubenswrapper[5129]: I0314 07:39:09.036007 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"
Mar 14 07:39:09 crc kubenswrapper[5129]: E0314 07:39:09.036681 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:39:22 crc kubenswrapper[5129]: I0314 07:39:22.036524 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"
Mar 14 07:39:22 crc kubenswrapper[5129]: E0314 07:39:22.037315 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:39:34 crc kubenswrapper[5129]: I0314 07:39:34.037165 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"
Mar 14 07:39:34 crc kubenswrapper[5129]: E0314 07:39:34.037899 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:39:48 crc kubenswrapper[5129]: I0314 07:39:48.040263 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"
Mar 14 07:39:48 crc kubenswrapper[5129]: E0314 07:39:48.041962 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:39:59 crc kubenswrapper[5129]: I0314 07:39:59.036774 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7"
Mar 14 07:39:59 crc kubenswrapper[5129]: E0314 07:39:59.037464 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 07:40:00 crc kubenswrapper[5129]: I0314 07:40:00.144062 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557900-sfz55"]
Mar 14 07:40:00 crc kubenswrapper[5129]: E0314 07:40:00.144681 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af5cfd68-61db-42d2-8a82-f22021b73518" containerName="oc"
Mar 14 07:40:00 crc 
kubenswrapper[5129]: I0314 07:40:00.144700 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="af5cfd68-61db-42d2-8a82-f22021b73518" containerName="oc" Mar 14 07:40:00 crc kubenswrapper[5129]: I0314 07:40:00.144866 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="af5cfd68-61db-42d2-8a82-f22021b73518" containerName="oc" Mar 14 07:40:00 crc kubenswrapper[5129]: I0314 07:40:00.145316 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557900-sfz55" Mar 14 07:40:00 crc kubenswrapper[5129]: I0314 07:40:00.146942 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:40:00 crc kubenswrapper[5129]: I0314 07:40:00.148094 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:40:00 crc kubenswrapper[5129]: I0314 07:40:00.148939 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:40:00 crc kubenswrapper[5129]: I0314 07:40:00.152015 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557900-sfz55"] Mar 14 07:40:00 crc kubenswrapper[5129]: I0314 07:40:00.249375 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr829\" (UniqueName: \"kubernetes.io/projected/f79804d4-f7ad-41d0-9f30-ec5814f2b75a-kube-api-access-vr829\") pod \"auto-csr-approver-29557900-sfz55\" (UID: \"f79804d4-f7ad-41d0-9f30-ec5814f2b75a\") " pod="openshift-infra/auto-csr-approver-29557900-sfz55" Mar 14 07:40:00 crc kubenswrapper[5129]: I0314 07:40:00.351109 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr829\" (UniqueName: \"kubernetes.io/projected/f79804d4-f7ad-41d0-9f30-ec5814f2b75a-kube-api-access-vr829\") pod \"auto-csr-approver-29557900-sfz55\" 
(UID: \"f79804d4-f7ad-41d0-9f30-ec5814f2b75a\") " pod="openshift-infra/auto-csr-approver-29557900-sfz55" Mar 14 07:40:00 crc kubenswrapper[5129]: I0314 07:40:00.371994 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr829\" (UniqueName: \"kubernetes.io/projected/f79804d4-f7ad-41d0-9f30-ec5814f2b75a-kube-api-access-vr829\") pod \"auto-csr-approver-29557900-sfz55\" (UID: \"f79804d4-f7ad-41d0-9f30-ec5814f2b75a\") " pod="openshift-infra/auto-csr-approver-29557900-sfz55" Mar 14 07:40:00 crc kubenswrapper[5129]: I0314 07:40:00.460148 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557900-sfz55" Mar 14 07:40:00 crc kubenswrapper[5129]: I0314 07:40:00.844317 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557900-sfz55"] Mar 14 07:40:01 crc kubenswrapper[5129]: I0314 07:40:01.681536 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557900-sfz55" event={"ID":"f79804d4-f7ad-41d0-9f30-ec5814f2b75a","Type":"ContainerStarted","Data":"41d72f97c28efa7a14f457894fe4dfb75ebfdc713336e7b6c9537e15624b3c9c"} Mar 14 07:40:02 crc kubenswrapper[5129]: I0314 07:40:02.692223 5129 generic.go:334] "Generic (PLEG): container finished" podID="f79804d4-f7ad-41d0-9f30-ec5814f2b75a" containerID="f8a212801cb360516b604349a37d7d9f7251d59775f3a01ea785cfa98c75f4a4" exitCode=0 Mar 14 07:40:02 crc kubenswrapper[5129]: I0314 07:40:02.692294 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557900-sfz55" event={"ID":"f79804d4-f7ad-41d0-9f30-ec5814f2b75a","Type":"ContainerDied","Data":"f8a212801cb360516b604349a37d7d9f7251d59775f3a01ea785cfa98c75f4a4"} Mar 14 07:40:03 crc kubenswrapper[5129]: I0314 07:40:03.931992 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557900-sfz55" Mar 14 07:40:04 crc kubenswrapper[5129]: I0314 07:40:04.097333 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr829\" (UniqueName: \"kubernetes.io/projected/f79804d4-f7ad-41d0-9f30-ec5814f2b75a-kube-api-access-vr829\") pod \"f79804d4-f7ad-41d0-9f30-ec5814f2b75a\" (UID: \"f79804d4-f7ad-41d0-9f30-ec5814f2b75a\") " Mar 14 07:40:04 crc kubenswrapper[5129]: I0314 07:40:04.103221 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f79804d4-f7ad-41d0-9f30-ec5814f2b75a-kube-api-access-vr829" (OuterVolumeSpecName: "kube-api-access-vr829") pod "f79804d4-f7ad-41d0-9f30-ec5814f2b75a" (UID: "f79804d4-f7ad-41d0-9f30-ec5814f2b75a"). InnerVolumeSpecName "kube-api-access-vr829". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:40:04 crc kubenswrapper[5129]: I0314 07:40:04.199012 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr829\" (UniqueName: \"kubernetes.io/projected/f79804d4-f7ad-41d0-9f30-ec5814f2b75a-kube-api-access-vr829\") on node \"crc\" DevicePath \"\"" Mar 14 07:40:04 crc kubenswrapper[5129]: I0314 07:40:04.709457 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557900-sfz55" event={"ID":"f79804d4-f7ad-41d0-9f30-ec5814f2b75a","Type":"ContainerDied","Data":"41d72f97c28efa7a14f457894fe4dfb75ebfdc713336e7b6c9537e15624b3c9c"} Mar 14 07:40:04 crc kubenswrapper[5129]: I0314 07:40:04.709908 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41d72f97c28efa7a14f457894fe4dfb75ebfdc713336e7b6c9537e15624b3c9c" Mar 14 07:40:04 crc kubenswrapper[5129]: I0314 07:40:04.709529 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557900-sfz55" Mar 14 07:40:05 crc kubenswrapper[5129]: I0314 07:40:05.005145 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557894-9j6lw"] Mar 14 07:40:05 crc kubenswrapper[5129]: I0314 07:40:05.012417 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557894-9j6lw"] Mar 14 07:40:06 crc kubenswrapper[5129]: I0314 07:40:06.048868 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c1755e7-2714-4bf6-8d54-af6fb12c3bac" path="/var/lib/kubelet/pods/6c1755e7-2714-4bf6-8d54-af6fb12c3bac/volumes" Mar 14 07:40:10 crc kubenswrapper[5129]: I0314 07:40:10.036817 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7" Mar 14 07:40:10 crc kubenswrapper[5129]: E0314 07:40:10.037562 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:40:21 crc kubenswrapper[5129]: I0314 07:40:21.037067 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7" Mar 14 07:40:21 crc kubenswrapper[5129]: E0314 07:40:21.038294 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:40:21 crc kubenswrapper[5129]: I0314 07:40:21.882301 5129 scope.go:117] "RemoveContainer" containerID="254a88eab577e75abb06cda05f53d3360d6511e53f5e35affb41b1e22c4c4a83" Mar 14 07:40:34 crc kubenswrapper[5129]: I0314 07:40:34.036189 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7" Mar 14 07:40:34 crc kubenswrapper[5129]: E0314 07:40:34.037038 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:40:45 crc kubenswrapper[5129]: I0314 07:40:45.036828 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7" Mar 14 07:40:45 crc kubenswrapper[5129]: E0314 07:40:45.037330 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:40:57 crc kubenswrapper[5129]: I0314 07:40:57.036301 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7" Mar 14 07:40:57 crc kubenswrapper[5129]: E0314 07:40:57.036964 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:41:09 crc kubenswrapper[5129]: I0314 07:41:09.037300 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7" Mar 14 07:41:09 crc kubenswrapper[5129]: E0314 07:41:09.038761 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:41:20 crc kubenswrapper[5129]: I0314 07:41:20.037333 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7" Mar 14 07:41:20 crc kubenswrapper[5129]: E0314 07:41:20.038102 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:41:34 crc kubenswrapper[5129]: I0314 07:41:34.037360 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7" Mar 14 07:41:34 crc kubenswrapper[5129]: E0314 07:41:34.038878 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:41:48 crc kubenswrapper[5129]: I0314 07:41:48.042530 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7" Mar 14 07:41:48 crc kubenswrapper[5129]: E0314 07:41:48.043360 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:42:00 crc kubenswrapper[5129]: I0314 07:42:00.145089 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557902-rr487"] Mar 14 07:42:00 crc kubenswrapper[5129]: E0314 07:42:00.146292 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79804d4-f7ad-41d0-9f30-ec5814f2b75a" containerName="oc" Mar 14 07:42:00 crc kubenswrapper[5129]: I0314 07:42:00.146309 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79804d4-f7ad-41d0-9f30-ec5814f2b75a" containerName="oc" Mar 14 07:42:00 crc kubenswrapper[5129]: I0314 07:42:00.146480 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f79804d4-f7ad-41d0-9f30-ec5814f2b75a" containerName="oc" Mar 14 07:42:00 crc kubenswrapper[5129]: I0314 07:42:00.147282 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557902-rr487" Mar 14 07:42:00 crc kubenswrapper[5129]: I0314 07:42:00.154549 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:42:00 crc kubenswrapper[5129]: I0314 07:42:00.154773 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:42:00 crc kubenswrapper[5129]: I0314 07:42:00.154908 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:42:00 crc kubenswrapper[5129]: I0314 07:42:00.171298 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557902-rr487"] Mar 14 07:42:00 crc kubenswrapper[5129]: I0314 07:42:00.324416 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7fkj\" (UniqueName: \"kubernetes.io/projected/ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c-kube-api-access-t7fkj\") pod \"auto-csr-approver-29557902-rr487\" (UID: \"ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c\") " pod="openshift-infra/auto-csr-approver-29557902-rr487" Mar 14 07:42:00 crc kubenswrapper[5129]: I0314 07:42:00.425390 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7fkj\" (UniqueName: \"kubernetes.io/projected/ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c-kube-api-access-t7fkj\") pod \"auto-csr-approver-29557902-rr487\" (UID: \"ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c\") " pod="openshift-infra/auto-csr-approver-29557902-rr487" Mar 14 07:42:00 crc kubenswrapper[5129]: I0314 07:42:00.449050 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7fkj\" (UniqueName: \"kubernetes.io/projected/ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c-kube-api-access-t7fkj\") pod \"auto-csr-approver-29557902-rr487\" (UID: \"ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c\") " 
pod="openshift-infra/auto-csr-approver-29557902-rr487" Mar 14 07:42:00 crc kubenswrapper[5129]: I0314 07:42:00.471919 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557902-rr487" Mar 14 07:42:00 crc kubenswrapper[5129]: I0314 07:42:00.878810 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557902-rr487"] Mar 14 07:42:01 crc kubenswrapper[5129]: I0314 07:42:01.659883 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557902-rr487" event={"ID":"ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c","Type":"ContainerStarted","Data":"42f170f4fd703ef0d6380e1b3a952646de120975533e4fffdf0c52f37fd1381b"} Mar 14 07:42:02 crc kubenswrapper[5129]: I0314 07:42:02.669575 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557902-rr487" event={"ID":"ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c","Type":"ContainerStarted","Data":"3ecf920c966401a0d0e544eb06bc1492cec602283305a6679499b05d3fd9c55f"} Mar 14 07:42:02 crc kubenswrapper[5129]: I0314 07:42:02.688763 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557902-rr487" podStartSLOduration=1.325118996 podStartE2EDuration="2.688746754s" podCreationTimestamp="2026-03-14 07:42:00 +0000 UTC" firstStartedPulling="2026-03-14 07:42:00.883174078 +0000 UTC m=+2583.635089262" lastFinishedPulling="2026-03-14 07:42:02.246801806 +0000 UTC m=+2584.998717020" observedRunningTime="2026-03-14 07:42:02.684464479 +0000 UTC m=+2585.436379683" watchObservedRunningTime="2026-03-14 07:42:02.688746754 +0000 UTC m=+2585.440661938" Mar 14 07:42:03 crc kubenswrapper[5129]: I0314 07:42:03.036957 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7" Mar 14 07:42:03 crc kubenswrapper[5129]: I0314 07:42:03.678651 5129 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"eb37c38a2b671eb228c48734c9694784fed3a17bbf09d28871edf6b039cd1362"} Mar 14 07:42:03 crc kubenswrapper[5129]: I0314 07:42:03.680144 5129 generic.go:334] "Generic (PLEG): container finished" podID="ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c" containerID="3ecf920c966401a0d0e544eb06bc1492cec602283305a6679499b05d3fd9c55f" exitCode=0 Mar 14 07:42:03 crc kubenswrapper[5129]: I0314 07:42:03.680200 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557902-rr487" event={"ID":"ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c","Type":"ContainerDied","Data":"3ecf920c966401a0d0e544eb06bc1492cec602283305a6679499b05d3fd9c55f"} Mar 14 07:42:04 crc kubenswrapper[5129]: I0314 07:42:04.998147 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557902-rr487" Mar 14 07:42:05 crc kubenswrapper[5129]: I0314 07:42:05.200855 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7fkj\" (UniqueName: \"kubernetes.io/projected/ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c-kube-api-access-t7fkj\") pod \"ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c\" (UID: \"ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c\") " Mar 14 07:42:05 crc kubenswrapper[5129]: I0314 07:42:05.210252 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c-kube-api-access-t7fkj" (OuterVolumeSpecName: "kube-api-access-t7fkj") pod "ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c" (UID: "ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c"). InnerVolumeSpecName "kube-api-access-t7fkj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:42:05 crc kubenswrapper[5129]: I0314 07:42:05.302363 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7fkj\" (UniqueName: \"kubernetes.io/projected/ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c-kube-api-access-t7fkj\") on node \"crc\" DevicePath \"\"" Mar 14 07:42:05 crc kubenswrapper[5129]: I0314 07:42:05.761220 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557896-f4vsw"] Mar 14 07:42:05 crc kubenswrapper[5129]: I0314 07:42:05.763732 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557896-f4vsw"] Mar 14 07:42:05 crc kubenswrapper[5129]: I0314 07:42:05.966051 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557902-rr487" event={"ID":"ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c","Type":"ContainerDied","Data":"42f170f4fd703ef0d6380e1b3a952646de120975533e4fffdf0c52f37fd1381b"} Mar 14 07:42:05 crc kubenswrapper[5129]: I0314 07:42:05.966108 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42f170f4fd703ef0d6380e1b3a952646de120975533e4fffdf0c52f37fd1381b" Mar 14 07:42:05 crc kubenswrapper[5129]: I0314 07:42:05.966117 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557902-rr487" Mar 14 07:42:06 crc kubenswrapper[5129]: I0314 07:42:06.046430 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae02d693-9a0e-4c2e-b555-772be4b508fa" path="/var/lib/kubelet/pods/ae02d693-9a0e-4c2e-b555-772be4b508fa/volumes" Mar 14 07:42:21 crc kubenswrapper[5129]: I0314 07:42:21.985490 5129 scope.go:117] "RemoveContainer" containerID="51567081588d0db9bd7b8de6784c5d414d6dfb55e0bfcbc40e8df813c9c61b24" Mar 14 07:43:12 crc kubenswrapper[5129]: I0314 07:43:12.410452 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zw98d"] Mar 14 07:43:12 crc kubenswrapper[5129]: E0314 07:43:12.411689 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c" containerName="oc" Mar 14 07:43:12 crc kubenswrapper[5129]: I0314 07:43:12.411716 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c" containerName="oc" Mar 14 07:43:12 crc kubenswrapper[5129]: I0314 07:43:12.411966 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c" containerName="oc" Mar 14 07:43:12 crc kubenswrapper[5129]: I0314 07:43:12.413754 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zw98d" Mar 14 07:43:12 crc kubenswrapper[5129]: I0314 07:43:12.430757 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zw98d"] Mar 14 07:43:12 crc kubenswrapper[5129]: I0314 07:43:12.532912 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kddm6\" (UniqueName: \"kubernetes.io/projected/56064446-e75f-4582-9568-33846b3f98eb-kube-api-access-kddm6\") pod \"redhat-operators-zw98d\" (UID: \"56064446-e75f-4582-9568-33846b3f98eb\") " pod="openshift-marketplace/redhat-operators-zw98d" Mar 14 07:43:12 crc kubenswrapper[5129]: I0314 07:43:12.533061 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56064446-e75f-4582-9568-33846b3f98eb-utilities\") pod \"redhat-operators-zw98d\" (UID: \"56064446-e75f-4582-9568-33846b3f98eb\") " pod="openshift-marketplace/redhat-operators-zw98d" Mar 14 07:43:12 crc kubenswrapper[5129]: I0314 07:43:12.533114 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56064446-e75f-4582-9568-33846b3f98eb-catalog-content\") pod \"redhat-operators-zw98d\" (UID: \"56064446-e75f-4582-9568-33846b3f98eb\") " pod="openshift-marketplace/redhat-operators-zw98d" Mar 14 07:43:12 crc kubenswrapper[5129]: I0314 07:43:12.634112 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56064446-e75f-4582-9568-33846b3f98eb-utilities\") pod \"redhat-operators-zw98d\" (UID: \"56064446-e75f-4582-9568-33846b3f98eb\") " pod="openshift-marketplace/redhat-operators-zw98d" Mar 14 07:43:12 crc kubenswrapper[5129]: I0314 07:43:12.634192 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56064446-e75f-4582-9568-33846b3f98eb-catalog-content\") pod \"redhat-operators-zw98d\" (UID: \"56064446-e75f-4582-9568-33846b3f98eb\") " pod="openshift-marketplace/redhat-operators-zw98d" Mar 14 07:43:12 crc kubenswrapper[5129]: I0314 07:43:12.634282 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kddm6\" (UniqueName: \"kubernetes.io/projected/56064446-e75f-4582-9568-33846b3f98eb-kube-api-access-kddm6\") pod \"redhat-operators-zw98d\" (UID: \"56064446-e75f-4582-9568-33846b3f98eb\") " pod="openshift-marketplace/redhat-operators-zw98d" Mar 14 07:43:12 crc kubenswrapper[5129]: I0314 07:43:12.634717 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56064446-e75f-4582-9568-33846b3f98eb-catalog-content\") pod \"redhat-operators-zw98d\" (UID: \"56064446-e75f-4582-9568-33846b3f98eb\") " pod="openshift-marketplace/redhat-operators-zw98d" Mar 14 07:43:12 crc kubenswrapper[5129]: I0314 07:43:12.635012 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56064446-e75f-4582-9568-33846b3f98eb-utilities\") pod \"redhat-operators-zw98d\" (UID: \"56064446-e75f-4582-9568-33846b3f98eb\") " pod="openshift-marketplace/redhat-operators-zw98d" Mar 14 07:43:12 crc kubenswrapper[5129]: I0314 07:43:12.657467 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kddm6\" (UniqueName: \"kubernetes.io/projected/56064446-e75f-4582-9568-33846b3f98eb-kube-api-access-kddm6\") pod \"redhat-operators-zw98d\" (UID: \"56064446-e75f-4582-9568-33846b3f98eb\") " pod="openshift-marketplace/redhat-operators-zw98d" Mar 14 07:43:12 crc kubenswrapper[5129]: I0314 07:43:12.741617 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zw98d" Mar 14 07:43:12 crc kubenswrapper[5129]: I0314 07:43:12.993907 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zw98d"] Mar 14 07:43:13 crc kubenswrapper[5129]: I0314 07:43:13.509506 5129 generic.go:334] "Generic (PLEG): container finished" podID="56064446-e75f-4582-9568-33846b3f98eb" containerID="6c988e570f241bf2d66cd2495b1f5ba98f2c7456a20bf3282351aefdb480159d" exitCode=0 Mar 14 07:43:13 crc kubenswrapper[5129]: I0314 07:43:13.509709 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zw98d" event={"ID":"56064446-e75f-4582-9568-33846b3f98eb","Type":"ContainerDied","Data":"6c988e570f241bf2d66cd2495b1f5ba98f2c7456a20bf3282351aefdb480159d"} Mar 14 07:43:13 crc kubenswrapper[5129]: I0314 07:43:13.509816 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zw98d" event={"ID":"56064446-e75f-4582-9568-33846b3f98eb","Type":"ContainerStarted","Data":"8bfd10b9de71129370739e3afa35d66f5907108b2ad5789f116935ac3d3dfbfe"} Mar 14 07:43:13 crc kubenswrapper[5129]: I0314 07:43:13.511471 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:43:14 crc kubenswrapper[5129]: I0314 07:43:14.517166 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zw98d" event={"ID":"56064446-e75f-4582-9568-33846b3f98eb","Type":"ContainerStarted","Data":"4349cd463fc9fc9bf802bd74417bc78e6bd0692faea4b079ee4414bd440695fb"} Mar 14 07:43:15 crc kubenswrapper[5129]: I0314 07:43:15.526566 5129 generic.go:334] "Generic (PLEG): container finished" podID="56064446-e75f-4582-9568-33846b3f98eb" containerID="4349cd463fc9fc9bf802bd74417bc78e6bd0692faea4b079ee4414bd440695fb" exitCode=0 Mar 14 07:43:15 crc kubenswrapper[5129]: I0314 07:43:15.526689 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-zw98d" event={"ID":"56064446-e75f-4582-9568-33846b3f98eb","Type":"ContainerDied","Data":"4349cd463fc9fc9bf802bd74417bc78e6bd0692faea4b079ee4414bd440695fb"} Mar 14 07:43:17 crc kubenswrapper[5129]: I0314 07:43:17.544019 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zw98d" event={"ID":"56064446-e75f-4582-9568-33846b3f98eb","Type":"ContainerStarted","Data":"59a63d33bde1662be35b0232bb41b3209f358961d1b610d8159fb874fa0b8af1"} Mar 14 07:43:17 crc kubenswrapper[5129]: I0314 07:43:17.567391 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zw98d" podStartSLOduration=2.630585549 podStartE2EDuration="5.56736419s" podCreationTimestamp="2026-03-14 07:43:12 +0000 UTC" firstStartedPulling="2026-03-14 07:43:13.511229424 +0000 UTC m=+2656.263144608" lastFinishedPulling="2026-03-14 07:43:16.448008065 +0000 UTC m=+2659.199923249" observedRunningTime="2026-03-14 07:43:17.560828402 +0000 UTC m=+2660.312743586" watchObservedRunningTime="2026-03-14 07:43:17.56736419 +0000 UTC m=+2660.319279394" Mar 14 07:43:22 crc kubenswrapper[5129]: I0314 07:43:22.742075 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zw98d" Mar 14 07:43:22 crc kubenswrapper[5129]: I0314 07:43:22.742816 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zw98d" Mar 14 07:43:23 crc kubenswrapper[5129]: I0314 07:43:23.793144 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zw98d" podUID="56064446-e75f-4582-9568-33846b3f98eb" containerName="registry-server" probeResult="failure" output=< Mar 14 07:43:23 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 07:43:23 crc kubenswrapper[5129]: > Mar 14 07:43:32 crc kubenswrapper[5129]: I0314 
07:43:32.806413 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zw98d" Mar 14 07:43:32 crc kubenswrapper[5129]: I0314 07:43:32.861164 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zw98d" Mar 14 07:43:33 crc kubenswrapper[5129]: I0314 07:43:33.046371 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zw98d"] Mar 14 07:43:34 crc kubenswrapper[5129]: I0314 07:43:34.678727 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zw98d" podUID="56064446-e75f-4582-9568-33846b3f98eb" containerName="registry-server" containerID="cri-o://59a63d33bde1662be35b0232bb41b3209f358961d1b610d8159fb874fa0b8af1" gracePeriod=2 Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.071492 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zw98d" Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.120625 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56064446-e75f-4582-9568-33846b3f98eb-catalog-content\") pod \"56064446-e75f-4582-9568-33846b3f98eb\" (UID: \"56064446-e75f-4582-9568-33846b3f98eb\") " Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.120672 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56064446-e75f-4582-9568-33846b3f98eb-utilities\") pod \"56064446-e75f-4582-9568-33846b3f98eb\" (UID: \"56064446-e75f-4582-9568-33846b3f98eb\") " Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.120773 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kddm6\" (UniqueName: 
\"kubernetes.io/projected/56064446-e75f-4582-9568-33846b3f98eb-kube-api-access-kddm6\") pod \"56064446-e75f-4582-9568-33846b3f98eb\" (UID: \"56064446-e75f-4582-9568-33846b3f98eb\") " Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.122738 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56064446-e75f-4582-9568-33846b3f98eb-utilities" (OuterVolumeSpecName: "utilities") pod "56064446-e75f-4582-9568-33846b3f98eb" (UID: "56064446-e75f-4582-9568-33846b3f98eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.131937 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56064446-e75f-4582-9568-33846b3f98eb-kube-api-access-kddm6" (OuterVolumeSpecName: "kube-api-access-kddm6") pod "56064446-e75f-4582-9568-33846b3f98eb" (UID: "56064446-e75f-4582-9568-33846b3f98eb"). InnerVolumeSpecName "kube-api-access-kddm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.221874 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56064446-e75f-4582-9568-33846b3f98eb-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.221901 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kddm6\" (UniqueName: \"kubernetes.io/projected/56064446-e75f-4582-9568-33846b3f98eb-kube-api-access-kddm6\") on node \"crc\" DevicePath \"\"" Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.250971 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56064446-e75f-4582-9568-33846b3f98eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56064446-e75f-4582-9568-33846b3f98eb" (UID: "56064446-e75f-4582-9568-33846b3f98eb"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.323427 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56064446-e75f-4582-9568-33846b3f98eb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.687897 5129 generic.go:334] "Generic (PLEG): container finished" podID="56064446-e75f-4582-9568-33846b3f98eb" containerID="59a63d33bde1662be35b0232bb41b3209f358961d1b610d8159fb874fa0b8af1" exitCode=0 Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.687948 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zw98d" event={"ID":"56064446-e75f-4582-9568-33846b3f98eb","Type":"ContainerDied","Data":"59a63d33bde1662be35b0232bb41b3209f358961d1b610d8159fb874fa0b8af1"} Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.687972 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zw98d" Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.688575 5129 scope.go:117] "RemoveContainer" containerID="59a63d33bde1662be35b0232bb41b3209f358961d1b610d8159fb874fa0b8af1" Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.688503 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zw98d" event={"ID":"56064446-e75f-4582-9568-33846b3f98eb","Type":"ContainerDied","Data":"8bfd10b9de71129370739e3afa35d66f5907108b2ad5789f116935ac3d3dfbfe"} Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.718776 5129 scope.go:117] "RemoveContainer" containerID="4349cd463fc9fc9bf802bd74417bc78e6bd0692faea4b079ee4414bd440695fb" Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.718911 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zw98d"] Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.723740 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zw98d"] Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.735324 5129 scope.go:117] "RemoveContainer" containerID="6c988e570f241bf2d66cd2495b1f5ba98f2c7456a20bf3282351aefdb480159d" Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.768395 5129 scope.go:117] "RemoveContainer" containerID="59a63d33bde1662be35b0232bb41b3209f358961d1b610d8159fb874fa0b8af1" Mar 14 07:43:35 crc kubenswrapper[5129]: E0314 07:43:35.768881 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59a63d33bde1662be35b0232bb41b3209f358961d1b610d8159fb874fa0b8af1\": container with ID starting with 59a63d33bde1662be35b0232bb41b3209f358961d1b610d8159fb874fa0b8af1 not found: ID does not exist" containerID="59a63d33bde1662be35b0232bb41b3209f358961d1b610d8159fb874fa0b8af1" Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.768934 5129 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a63d33bde1662be35b0232bb41b3209f358961d1b610d8159fb874fa0b8af1"} err="failed to get container status \"59a63d33bde1662be35b0232bb41b3209f358961d1b610d8159fb874fa0b8af1\": rpc error: code = NotFound desc = could not find container \"59a63d33bde1662be35b0232bb41b3209f358961d1b610d8159fb874fa0b8af1\": container with ID starting with 59a63d33bde1662be35b0232bb41b3209f358961d1b610d8159fb874fa0b8af1 not found: ID does not exist" Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.768963 5129 scope.go:117] "RemoveContainer" containerID="4349cd463fc9fc9bf802bd74417bc78e6bd0692faea4b079ee4414bd440695fb" Mar 14 07:43:35 crc kubenswrapper[5129]: E0314 07:43:35.769580 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4349cd463fc9fc9bf802bd74417bc78e6bd0692faea4b079ee4414bd440695fb\": container with ID starting with 4349cd463fc9fc9bf802bd74417bc78e6bd0692faea4b079ee4414bd440695fb not found: ID does not exist" containerID="4349cd463fc9fc9bf802bd74417bc78e6bd0692faea4b079ee4414bd440695fb" Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.769639 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4349cd463fc9fc9bf802bd74417bc78e6bd0692faea4b079ee4414bd440695fb"} err="failed to get container status \"4349cd463fc9fc9bf802bd74417bc78e6bd0692faea4b079ee4414bd440695fb\": rpc error: code = NotFound desc = could not find container \"4349cd463fc9fc9bf802bd74417bc78e6bd0692faea4b079ee4414bd440695fb\": container with ID starting with 4349cd463fc9fc9bf802bd74417bc78e6bd0692faea4b079ee4414bd440695fb not found: ID does not exist" Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.769662 5129 scope.go:117] "RemoveContainer" containerID="6c988e570f241bf2d66cd2495b1f5ba98f2c7456a20bf3282351aefdb480159d" Mar 14 07:43:35 crc kubenswrapper[5129]: E0314 
07:43:35.769925 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c988e570f241bf2d66cd2495b1f5ba98f2c7456a20bf3282351aefdb480159d\": container with ID starting with 6c988e570f241bf2d66cd2495b1f5ba98f2c7456a20bf3282351aefdb480159d not found: ID does not exist" containerID="6c988e570f241bf2d66cd2495b1f5ba98f2c7456a20bf3282351aefdb480159d" Mar 14 07:43:35 crc kubenswrapper[5129]: I0314 07:43:35.769955 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c988e570f241bf2d66cd2495b1f5ba98f2c7456a20bf3282351aefdb480159d"} err="failed to get container status \"6c988e570f241bf2d66cd2495b1f5ba98f2c7456a20bf3282351aefdb480159d\": rpc error: code = NotFound desc = could not find container \"6c988e570f241bf2d66cd2495b1f5ba98f2c7456a20bf3282351aefdb480159d\": container with ID starting with 6c988e570f241bf2d66cd2495b1f5ba98f2c7456a20bf3282351aefdb480159d not found: ID does not exist" Mar 14 07:43:36 crc kubenswrapper[5129]: I0314 07:43:36.046840 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56064446-e75f-4582-9568-33846b3f98eb" path="/var/lib/kubelet/pods/56064446-e75f-4582-9568-33846b3f98eb/volumes" Mar 14 07:44:00 crc kubenswrapper[5129]: I0314 07:44:00.144699 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557904-zn9vc"] Mar 14 07:44:00 crc kubenswrapper[5129]: E0314 07:44:00.145521 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56064446-e75f-4582-9568-33846b3f98eb" containerName="extract-content" Mar 14 07:44:00 crc kubenswrapper[5129]: I0314 07:44:00.145537 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="56064446-e75f-4582-9568-33846b3f98eb" containerName="extract-content" Mar 14 07:44:00 crc kubenswrapper[5129]: E0314 07:44:00.145549 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="56064446-e75f-4582-9568-33846b3f98eb" containerName="extract-utilities" Mar 14 07:44:00 crc kubenswrapper[5129]: I0314 07:44:00.145556 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="56064446-e75f-4582-9568-33846b3f98eb" containerName="extract-utilities" Mar 14 07:44:00 crc kubenswrapper[5129]: E0314 07:44:00.145567 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56064446-e75f-4582-9568-33846b3f98eb" containerName="registry-server" Mar 14 07:44:00 crc kubenswrapper[5129]: I0314 07:44:00.145573 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="56064446-e75f-4582-9568-33846b3f98eb" containerName="registry-server" Mar 14 07:44:00 crc kubenswrapper[5129]: I0314 07:44:00.145762 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="56064446-e75f-4582-9568-33846b3f98eb" containerName="registry-server" Mar 14 07:44:00 crc kubenswrapper[5129]: I0314 07:44:00.146422 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557904-zn9vc" Mar 14 07:44:00 crc kubenswrapper[5129]: I0314 07:44:00.151180 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:44:00 crc kubenswrapper[5129]: I0314 07:44:00.154229 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:44:00 crc kubenswrapper[5129]: I0314 07:44:00.154229 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:44:00 crc kubenswrapper[5129]: I0314 07:44:00.156372 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557904-zn9vc"] Mar 14 07:44:00 crc kubenswrapper[5129]: I0314 07:44:00.341178 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvrbb\" (UniqueName: 
\"kubernetes.io/projected/8bfced3b-98f3-4942-be06-41ecf1619f1c-kube-api-access-bvrbb\") pod \"auto-csr-approver-29557904-zn9vc\" (UID: \"8bfced3b-98f3-4942-be06-41ecf1619f1c\") " pod="openshift-infra/auto-csr-approver-29557904-zn9vc" Mar 14 07:44:00 crc kubenswrapper[5129]: I0314 07:44:00.443586 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvrbb\" (UniqueName: \"kubernetes.io/projected/8bfced3b-98f3-4942-be06-41ecf1619f1c-kube-api-access-bvrbb\") pod \"auto-csr-approver-29557904-zn9vc\" (UID: \"8bfced3b-98f3-4942-be06-41ecf1619f1c\") " pod="openshift-infra/auto-csr-approver-29557904-zn9vc" Mar 14 07:44:00 crc kubenswrapper[5129]: I0314 07:44:00.464293 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvrbb\" (UniqueName: \"kubernetes.io/projected/8bfced3b-98f3-4942-be06-41ecf1619f1c-kube-api-access-bvrbb\") pod \"auto-csr-approver-29557904-zn9vc\" (UID: \"8bfced3b-98f3-4942-be06-41ecf1619f1c\") " pod="openshift-infra/auto-csr-approver-29557904-zn9vc" Mar 14 07:44:00 crc kubenswrapper[5129]: I0314 07:44:00.472852 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557904-zn9vc" Mar 14 07:44:00 crc kubenswrapper[5129]: I0314 07:44:00.711888 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557904-zn9vc"] Mar 14 07:44:00 crc kubenswrapper[5129]: I0314 07:44:00.903539 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557904-zn9vc" event={"ID":"8bfced3b-98f3-4942-be06-41ecf1619f1c","Type":"ContainerStarted","Data":"6a9f0614dc9c9552c6876d903b502aa40940fefb1f0881574a333c2e8da6c26b"} Mar 14 07:44:02 crc kubenswrapper[5129]: I0314 07:44:02.925080 5129 generic.go:334] "Generic (PLEG): container finished" podID="8bfced3b-98f3-4942-be06-41ecf1619f1c" containerID="52ada0934457f21cbdc70c28a526a5c6abd8e9bd660c88051c8ce106c55b8bf8" exitCode=0 Mar 14 07:44:02 crc kubenswrapper[5129]: I0314 07:44:02.925411 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557904-zn9vc" event={"ID":"8bfced3b-98f3-4942-be06-41ecf1619f1c","Type":"ContainerDied","Data":"52ada0934457f21cbdc70c28a526a5c6abd8e9bd660c88051c8ce106c55b8bf8"} Mar 14 07:44:04 crc kubenswrapper[5129]: I0314 07:44:04.212465 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557904-zn9vc" Mar 14 07:44:04 crc kubenswrapper[5129]: I0314 07:44:04.412382 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvrbb\" (UniqueName: \"kubernetes.io/projected/8bfced3b-98f3-4942-be06-41ecf1619f1c-kube-api-access-bvrbb\") pod \"8bfced3b-98f3-4942-be06-41ecf1619f1c\" (UID: \"8bfced3b-98f3-4942-be06-41ecf1619f1c\") " Mar 14 07:44:04 crc kubenswrapper[5129]: I0314 07:44:04.420533 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bfced3b-98f3-4942-be06-41ecf1619f1c-kube-api-access-bvrbb" (OuterVolumeSpecName: "kube-api-access-bvrbb") pod "8bfced3b-98f3-4942-be06-41ecf1619f1c" (UID: "8bfced3b-98f3-4942-be06-41ecf1619f1c"). InnerVolumeSpecName "kube-api-access-bvrbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:44:04 crc kubenswrapper[5129]: I0314 07:44:04.513943 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvrbb\" (UniqueName: \"kubernetes.io/projected/8bfced3b-98f3-4942-be06-41ecf1619f1c-kube-api-access-bvrbb\") on node \"crc\" DevicePath \"\"" Mar 14 07:44:04 crc kubenswrapper[5129]: I0314 07:44:04.942277 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557904-zn9vc" event={"ID":"8bfced3b-98f3-4942-be06-41ecf1619f1c","Type":"ContainerDied","Data":"6a9f0614dc9c9552c6876d903b502aa40940fefb1f0881574a333c2e8da6c26b"} Mar 14 07:44:04 crc kubenswrapper[5129]: I0314 07:44:04.942319 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a9f0614dc9c9552c6876d903b502aa40940fefb1f0881574a333c2e8da6c26b" Mar 14 07:44:04 crc kubenswrapper[5129]: I0314 07:44:04.942331 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557904-zn9vc" Mar 14 07:44:05 crc kubenswrapper[5129]: I0314 07:44:05.290296 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557898-74j7m"] Mar 14 07:44:05 crc kubenswrapper[5129]: I0314 07:44:05.301022 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557898-74j7m"] Mar 14 07:44:06 crc kubenswrapper[5129]: I0314 07:44:06.697538 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af5cfd68-61db-42d2-8a82-f22021b73518" path="/var/lib/kubelet/pods/af5cfd68-61db-42d2-8a82-f22021b73518/volumes" Mar 14 07:44:13 crc kubenswrapper[5129]: I0314 07:44:13.760007 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fmnvg"] Mar 14 07:44:13 crc kubenswrapper[5129]: E0314 07:44:13.762056 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bfced3b-98f3-4942-be06-41ecf1619f1c" containerName="oc" Mar 14 07:44:13 crc kubenswrapper[5129]: I0314 07:44:13.762167 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bfced3b-98f3-4942-be06-41ecf1619f1c" containerName="oc" Mar 14 07:44:13 crc kubenswrapper[5129]: I0314 07:44:13.762515 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bfced3b-98f3-4942-be06-41ecf1619f1c" containerName="oc" Mar 14 07:44:13 crc kubenswrapper[5129]: I0314 07:44:13.763992 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fmnvg" Mar 14 07:44:13 crc kubenswrapper[5129]: I0314 07:44:13.776151 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fmnvg"] Mar 14 07:44:13 crc kubenswrapper[5129]: I0314 07:44:13.940109 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tzdt\" (UniqueName: \"kubernetes.io/projected/3f7b1d09-8c5d-41b6-80fa-2de4b81d6912-kube-api-access-6tzdt\") pod \"community-operators-fmnvg\" (UID: \"3f7b1d09-8c5d-41b6-80fa-2de4b81d6912\") " pod="openshift-marketplace/community-operators-fmnvg" Mar 14 07:44:13 crc kubenswrapper[5129]: I0314 07:44:13.940397 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f7b1d09-8c5d-41b6-80fa-2de4b81d6912-utilities\") pod \"community-operators-fmnvg\" (UID: \"3f7b1d09-8c5d-41b6-80fa-2de4b81d6912\") " pod="openshift-marketplace/community-operators-fmnvg" Mar 14 07:44:13 crc kubenswrapper[5129]: I0314 07:44:13.940518 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f7b1d09-8c5d-41b6-80fa-2de4b81d6912-catalog-content\") pod \"community-operators-fmnvg\" (UID: \"3f7b1d09-8c5d-41b6-80fa-2de4b81d6912\") " pod="openshift-marketplace/community-operators-fmnvg" Mar 14 07:44:14 crc kubenswrapper[5129]: I0314 07:44:14.041950 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tzdt\" (UniqueName: \"kubernetes.io/projected/3f7b1d09-8c5d-41b6-80fa-2de4b81d6912-kube-api-access-6tzdt\") pod \"community-operators-fmnvg\" (UID: \"3f7b1d09-8c5d-41b6-80fa-2de4b81d6912\") " pod="openshift-marketplace/community-operators-fmnvg" Mar 14 07:44:14 crc kubenswrapper[5129]: I0314 07:44:14.042036 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f7b1d09-8c5d-41b6-80fa-2de4b81d6912-utilities\") pod \"community-operators-fmnvg\" (UID: \"3f7b1d09-8c5d-41b6-80fa-2de4b81d6912\") " pod="openshift-marketplace/community-operators-fmnvg" Mar 14 07:44:14 crc kubenswrapper[5129]: I0314 07:44:14.042151 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f7b1d09-8c5d-41b6-80fa-2de4b81d6912-catalog-content\") pod \"community-operators-fmnvg\" (UID: \"3f7b1d09-8c5d-41b6-80fa-2de4b81d6912\") " pod="openshift-marketplace/community-operators-fmnvg" Mar 14 07:44:14 crc kubenswrapper[5129]: I0314 07:44:14.042793 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f7b1d09-8c5d-41b6-80fa-2de4b81d6912-catalog-content\") pod \"community-operators-fmnvg\" (UID: \"3f7b1d09-8c5d-41b6-80fa-2de4b81d6912\") " pod="openshift-marketplace/community-operators-fmnvg" Mar 14 07:44:14 crc kubenswrapper[5129]: I0314 07:44:14.043098 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f7b1d09-8c5d-41b6-80fa-2de4b81d6912-utilities\") pod \"community-operators-fmnvg\" (UID: \"3f7b1d09-8c5d-41b6-80fa-2de4b81d6912\") " pod="openshift-marketplace/community-operators-fmnvg" Mar 14 07:44:14 crc kubenswrapper[5129]: I0314 07:44:14.064798 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tzdt\" (UniqueName: \"kubernetes.io/projected/3f7b1d09-8c5d-41b6-80fa-2de4b81d6912-kube-api-access-6tzdt\") pod \"community-operators-fmnvg\" (UID: \"3f7b1d09-8c5d-41b6-80fa-2de4b81d6912\") " pod="openshift-marketplace/community-operators-fmnvg" Mar 14 07:44:14 crc kubenswrapper[5129]: I0314 07:44:14.085193 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fmnvg" Mar 14 07:44:14 crc kubenswrapper[5129]: I0314 07:44:14.629961 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fmnvg"] Mar 14 07:44:15 crc kubenswrapper[5129]: I0314 07:44:15.028010 5129 generic.go:334] "Generic (PLEG): container finished" podID="3f7b1d09-8c5d-41b6-80fa-2de4b81d6912" containerID="47532f3e68d8529542fb569a0a2b5a8bfbd47429ecb4a7d790dbca628393936a" exitCode=0 Mar 14 07:44:15 crc kubenswrapper[5129]: I0314 07:44:15.028077 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fmnvg" event={"ID":"3f7b1d09-8c5d-41b6-80fa-2de4b81d6912","Type":"ContainerDied","Data":"47532f3e68d8529542fb569a0a2b5a8bfbd47429ecb4a7d790dbca628393936a"} Mar 14 07:44:15 crc kubenswrapper[5129]: I0314 07:44:15.028125 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fmnvg" event={"ID":"3f7b1d09-8c5d-41b6-80fa-2de4b81d6912","Type":"ContainerStarted","Data":"b8977e5d66b56daaa034500906e1d974b39a303e56dc602c4d78576673078e45"} Mar 14 07:44:19 crc kubenswrapper[5129]: I0314 07:44:19.574864 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:44:19 crc kubenswrapper[5129]: I0314 07:44:19.575409 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:44:20 crc kubenswrapper[5129]: I0314 07:44:20.067201 5129 generic.go:334] "Generic 
(PLEG): container finished" podID="3f7b1d09-8c5d-41b6-80fa-2de4b81d6912" containerID="9328b20641e22879bedb229d6679ad57cd42f846a2c48daae92a6dfbc23afc71" exitCode=0 Mar 14 07:44:20 crc kubenswrapper[5129]: I0314 07:44:20.067287 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fmnvg" event={"ID":"3f7b1d09-8c5d-41b6-80fa-2de4b81d6912","Type":"ContainerDied","Data":"9328b20641e22879bedb229d6679ad57cd42f846a2c48daae92a6dfbc23afc71"} Mar 14 07:44:22 crc kubenswrapper[5129]: I0314 07:44:22.061442 5129 scope.go:117] "RemoveContainer" containerID="25155cf366438569197a23b3ac310c69a09b8148d444c82f1f9dd463457e3e96" Mar 14 07:44:22 crc kubenswrapper[5129]: I0314 07:44:22.086864 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fmnvg" event={"ID":"3f7b1d09-8c5d-41b6-80fa-2de4b81d6912","Type":"ContainerStarted","Data":"8aa91d51690575697f02c14b009dae5316e06101a2b5333c41a9b62c7ebd0414"} Mar 14 07:44:22 crc kubenswrapper[5129]: I0314 07:44:22.113199 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fmnvg" podStartSLOduration=2.783893167 podStartE2EDuration="9.1131778s" podCreationTimestamp="2026-03-14 07:44:13 +0000 UTC" firstStartedPulling="2026-03-14 07:44:15.031823137 +0000 UTC m=+2717.783738321" lastFinishedPulling="2026-03-14 07:44:21.36110777 +0000 UTC m=+2724.113022954" observedRunningTime="2026-03-14 07:44:22.10800589 +0000 UTC m=+2724.859921084" watchObservedRunningTime="2026-03-14 07:44:22.1131778 +0000 UTC m=+2724.865092984" Mar 14 07:44:24 crc kubenswrapper[5129]: I0314 07:44:24.086041 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fmnvg" Mar 14 07:44:24 crc kubenswrapper[5129]: I0314 07:44:24.086341 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fmnvg" Mar 
14 07:44:24 crc kubenswrapper[5129]: I0314 07:44:24.125211 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fmnvg" Mar 14 07:44:34 crc kubenswrapper[5129]: I0314 07:44:34.157126 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fmnvg" Mar 14 07:44:34 crc kubenswrapper[5129]: I0314 07:44:34.247192 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fmnvg"] Mar 14 07:44:34 crc kubenswrapper[5129]: I0314 07:44:34.286699 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mc684"] Mar 14 07:44:34 crc kubenswrapper[5129]: I0314 07:44:34.286915 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mc684" podUID="de8a905c-2cbf-426e-8272-fa1897f95c39" containerName="registry-server" containerID="cri-o://759440f4fa219c00e39d4322715d8260cf42be47cb6771beeb2dd0415460ddff" gracePeriod=2 Mar 14 07:44:35 crc kubenswrapper[5129]: I0314 07:44:35.182199 5129 generic.go:334] "Generic (PLEG): container finished" podID="de8a905c-2cbf-426e-8272-fa1897f95c39" containerID="759440f4fa219c00e39d4322715d8260cf42be47cb6771beeb2dd0415460ddff" exitCode=0 Mar 14 07:44:35 crc kubenswrapper[5129]: I0314 07:44:35.182240 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc684" event={"ID":"de8a905c-2cbf-426e-8272-fa1897f95c39","Type":"ContainerDied","Data":"759440f4fa219c00e39d4322715d8260cf42be47cb6771beeb2dd0415460ddff"} Mar 14 07:44:35 crc kubenswrapper[5129]: I0314 07:44:35.258540 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mc684" Mar 14 07:44:35 crc kubenswrapper[5129]: I0314 07:44:35.375835 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj77k\" (UniqueName: \"kubernetes.io/projected/de8a905c-2cbf-426e-8272-fa1897f95c39-kube-api-access-vj77k\") pod \"de8a905c-2cbf-426e-8272-fa1897f95c39\" (UID: \"de8a905c-2cbf-426e-8272-fa1897f95c39\") " Mar 14 07:44:35 crc kubenswrapper[5129]: I0314 07:44:35.375943 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de8a905c-2cbf-426e-8272-fa1897f95c39-catalog-content\") pod \"de8a905c-2cbf-426e-8272-fa1897f95c39\" (UID: \"de8a905c-2cbf-426e-8272-fa1897f95c39\") " Mar 14 07:44:35 crc kubenswrapper[5129]: I0314 07:44:35.376048 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de8a905c-2cbf-426e-8272-fa1897f95c39-utilities\") pod \"de8a905c-2cbf-426e-8272-fa1897f95c39\" (UID: \"de8a905c-2cbf-426e-8272-fa1897f95c39\") " Mar 14 07:44:35 crc kubenswrapper[5129]: I0314 07:44:35.376796 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de8a905c-2cbf-426e-8272-fa1897f95c39-utilities" (OuterVolumeSpecName: "utilities") pod "de8a905c-2cbf-426e-8272-fa1897f95c39" (UID: "de8a905c-2cbf-426e-8272-fa1897f95c39"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:44:35 crc kubenswrapper[5129]: I0314 07:44:35.376912 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de8a905c-2cbf-426e-8272-fa1897f95c39-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:44:35 crc kubenswrapper[5129]: I0314 07:44:35.380962 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de8a905c-2cbf-426e-8272-fa1897f95c39-kube-api-access-vj77k" (OuterVolumeSpecName: "kube-api-access-vj77k") pod "de8a905c-2cbf-426e-8272-fa1897f95c39" (UID: "de8a905c-2cbf-426e-8272-fa1897f95c39"). InnerVolumeSpecName "kube-api-access-vj77k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:44:35 crc kubenswrapper[5129]: I0314 07:44:35.477866 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj77k\" (UniqueName: \"kubernetes.io/projected/de8a905c-2cbf-426e-8272-fa1897f95c39-kube-api-access-vj77k\") on node \"crc\" DevicePath \"\"" Mar 14 07:44:35 crc kubenswrapper[5129]: I0314 07:44:35.806144 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de8a905c-2cbf-426e-8272-fa1897f95c39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de8a905c-2cbf-426e-8272-fa1897f95c39" (UID: "de8a905c-2cbf-426e-8272-fa1897f95c39"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:44:35 crc kubenswrapper[5129]: I0314 07:44:35.882570 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de8a905c-2cbf-426e-8272-fa1897f95c39-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:44:36 crc kubenswrapper[5129]: I0314 07:44:36.192539 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc684" event={"ID":"de8a905c-2cbf-426e-8272-fa1897f95c39","Type":"ContainerDied","Data":"214156216dc86dec446259ca979f291cf6ce569a8efb2644054840269e1bc6b7"} Mar 14 07:44:36 crc kubenswrapper[5129]: I0314 07:44:36.192588 5129 scope.go:117] "RemoveContainer" containerID="759440f4fa219c00e39d4322715d8260cf42be47cb6771beeb2dd0415460ddff" Mar 14 07:44:36 crc kubenswrapper[5129]: I0314 07:44:36.192647 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mc684" Mar 14 07:44:36 crc kubenswrapper[5129]: I0314 07:44:36.214854 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mc684"] Mar 14 07:44:36 crc kubenswrapper[5129]: I0314 07:44:36.219996 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mc684"] Mar 14 07:44:36 crc kubenswrapper[5129]: I0314 07:44:36.307046 5129 scope.go:117] "RemoveContainer" containerID="1fcadc9bffb4c85ce81e6004b70904db66a7cdf2eba5a3e397abf0d3f3c78540" Mar 14 07:44:36 crc kubenswrapper[5129]: I0314 07:44:36.328795 5129 scope.go:117] "RemoveContainer" containerID="65e1caa371f1b5a059a7c758d882bd8f051ad55860e8a1b494cf9cc1ee226f8c" Mar 14 07:44:38 crc kubenswrapper[5129]: I0314 07:44:38.047250 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de8a905c-2cbf-426e-8272-fa1897f95c39" path="/var/lib/kubelet/pods/de8a905c-2cbf-426e-8272-fa1897f95c39/volumes" Mar 14 07:44:49 crc 
kubenswrapper[5129]: I0314 07:44:49.574953 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:44:49 crc kubenswrapper[5129]: I0314 07:44:49.575520 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.145079 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p"] Mar 14 07:45:00 crc kubenswrapper[5129]: E0314 07:45:00.147694 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8a905c-2cbf-426e-8272-fa1897f95c39" containerName="extract-content" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.147715 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8a905c-2cbf-426e-8272-fa1897f95c39" containerName="extract-content" Mar 14 07:45:00 crc kubenswrapper[5129]: E0314 07:45:00.147736 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8a905c-2cbf-426e-8272-fa1897f95c39" containerName="registry-server" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.147743 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8a905c-2cbf-426e-8272-fa1897f95c39" containerName="registry-server" Mar 14 07:45:00 crc kubenswrapper[5129]: E0314 07:45:00.147752 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8a905c-2cbf-426e-8272-fa1897f95c39" containerName="extract-utilities" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.147759 5129 
state_mem.go:107] "Deleted CPUSet assignment" podUID="de8a905c-2cbf-426e-8272-fa1897f95c39" containerName="extract-utilities" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.147907 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="de8a905c-2cbf-426e-8272-fa1897f95c39" containerName="registry-server" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.148354 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.151483 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.154833 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.155303 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p"] Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.248778 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd5d079a-0e66-4f29-afcd-58cfbff4d26a-secret-volume\") pod \"collect-profiles-29557905-5b65p\" (UID: \"bd5d079a-0e66-4f29-afcd-58cfbff4d26a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.249190 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd5d079a-0e66-4f29-afcd-58cfbff4d26a-config-volume\") pod \"collect-profiles-29557905-5b65p\" (UID: \"bd5d079a-0e66-4f29-afcd-58cfbff4d26a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.249240 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b8z9\" (UniqueName: \"kubernetes.io/projected/bd5d079a-0e66-4f29-afcd-58cfbff4d26a-kube-api-access-4b8z9\") pod \"collect-profiles-29557905-5b65p\" (UID: \"bd5d079a-0e66-4f29-afcd-58cfbff4d26a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.350390 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd5d079a-0e66-4f29-afcd-58cfbff4d26a-secret-volume\") pod \"collect-profiles-29557905-5b65p\" (UID: \"bd5d079a-0e66-4f29-afcd-58cfbff4d26a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.350455 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd5d079a-0e66-4f29-afcd-58cfbff4d26a-config-volume\") pod \"collect-profiles-29557905-5b65p\" (UID: \"bd5d079a-0e66-4f29-afcd-58cfbff4d26a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.350481 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b8z9\" (UniqueName: \"kubernetes.io/projected/bd5d079a-0e66-4f29-afcd-58cfbff4d26a-kube-api-access-4b8z9\") pod \"collect-profiles-29557905-5b65p\" (UID: \"bd5d079a-0e66-4f29-afcd-58cfbff4d26a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.351728 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/bd5d079a-0e66-4f29-afcd-58cfbff4d26a-config-volume\") pod \"collect-profiles-29557905-5b65p\" (UID: \"bd5d079a-0e66-4f29-afcd-58cfbff4d26a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.364775 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd5d079a-0e66-4f29-afcd-58cfbff4d26a-secret-volume\") pod \"collect-profiles-29557905-5b65p\" (UID: \"bd5d079a-0e66-4f29-afcd-58cfbff4d26a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.374701 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b8z9\" (UniqueName: \"kubernetes.io/projected/bd5d079a-0e66-4f29-afcd-58cfbff4d26a-kube-api-access-4b8z9\") pod \"collect-profiles-29557905-5b65p\" (UID: \"bd5d079a-0e66-4f29-afcd-58cfbff4d26a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.468840 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p" Mar 14 07:45:00 crc kubenswrapper[5129]: I0314 07:45:00.902323 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p"] Mar 14 07:45:01 crc kubenswrapper[5129]: I0314 07:45:01.403898 5129 generic.go:334] "Generic (PLEG): container finished" podID="bd5d079a-0e66-4f29-afcd-58cfbff4d26a" containerID="ff6681691946acfec8d2b67200d13477b1b44942bea993e698af895f27efbcc5" exitCode=0 Mar 14 07:45:01 crc kubenswrapper[5129]: I0314 07:45:01.403935 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p" event={"ID":"bd5d079a-0e66-4f29-afcd-58cfbff4d26a","Type":"ContainerDied","Data":"ff6681691946acfec8d2b67200d13477b1b44942bea993e698af895f27efbcc5"} Mar 14 07:45:01 crc kubenswrapper[5129]: I0314 07:45:01.403958 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p" event={"ID":"bd5d079a-0e66-4f29-afcd-58cfbff4d26a","Type":"ContainerStarted","Data":"830bbf61668af6578f918d13b2dd89b409b03272ccceeba8bb159bf527b09171"} Mar 14 07:45:02 crc kubenswrapper[5129]: I0314 07:45:02.661633 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p" Mar 14 07:45:02 crc kubenswrapper[5129]: I0314 07:45:02.787197 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b8z9\" (UniqueName: \"kubernetes.io/projected/bd5d079a-0e66-4f29-afcd-58cfbff4d26a-kube-api-access-4b8z9\") pod \"bd5d079a-0e66-4f29-afcd-58cfbff4d26a\" (UID: \"bd5d079a-0e66-4f29-afcd-58cfbff4d26a\") " Mar 14 07:45:02 crc kubenswrapper[5129]: I0314 07:45:02.787403 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd5d079a-0e66-4f29-afcd-58cfbff4d26a-config-volume\") pod \"bd5d079a-0e66-4f29-afcd-58cfbff4d26a\" (UID: \"bd5d079a-0e66-4f29-afcd-58cfbff4d26a\") " Mar 14 07:45:02 crc kubenswrapper[5129]: I0314 07:45:02.787515 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd5d079a-0e66-4f29-afcd-58cfbff4d26a-secret-volume\") pod \"bd5d079a-0e66-4f29-afcd-58cfbff4d26a\" (UID: \"bd5d079a-0e66-4f29-afcd-58cfbff4d26a\") " Mar 14 07:45:02 crc kubenswrapper[5129]: I0314 07:45:02.788166 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd5d079a-0e66-4f29-afcd-58cfbff4d26a-config-volume" (OuterVolumeSpecName: "config-volume") pod "bd5d079a-0e66-4f29-afcd-58cfbff4d26a" (UID: "bd5d079a-0e66-4f29-afcd-58cfbff4d26a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 07:45:02 crc kubenswrapper[5129]: I0314 07:45:02.796014 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5d079a-0e66-4f29-afcd-58cfbff4d26a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bd5d079a-0e66-4f29-afcd-58cfbff4d26a" (UID: "bd5d079a-0e66-4f29-afcd-58cfbff4d26a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 07:45:02 crc kubenswrapper[5129]: I0314 07:45:02.796359 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd5d079a-0e66-4f29-afcd-58cfbff4d26a-kube-api-access-4b8z9" (OuterVolumeSpecName: "kube-api-access-4b8z9") pod "bd5d079a-0e66-4f29-afcd-58cfbff4d26a" (UID: "bd5d079a-0e66-4f29-afcd-58cfbff4d26a"). InnerVolumeSpecName "kube-api-access-4b8z9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:45:02 crc kubenswrapper[5129]: I0314 07:45:02.889428 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b8z9\" (UniqueName: \"kubernetes.io/projected/bd5d079a-0e66-4f29-afcd-58cfbff4d26a-kube-api-access-4b8z9\") on node \"crc\" DevicePath \"\"" Mar 14 07:45:02 crc kubenswrapper[5129]: I0314 07:45:02.889473 5129 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd5d079a-0e66-4f29-afcd-58cfbff4d26a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:45:02 crc kubenswrapper[5129]: I0314 07:45:02.889494 5129 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd5d079a-0e66-4f29-afcd-58cfbff4d26a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 07:45:03 crc kubenswrapper[5129]: I0314 07:45:03.419670 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p" event={"ID":"bd5d079a-0e66-4f29-afcd-58cfbff4d26a","Type":"ContainerDied","Data":"830bbf61668af6578f918d13b2dd89b409b03272ccceeba8bb159bf527b09171"} Mar 14 07:45:03 crc kubenswrapper[5129]: I0314 07:45:03.420023 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="830bbf61668af6578f918d13b2dd89b409b03272ccceeba8bb159bf527b09171" Mar 14 07:45:03 crc kubenswrapper[5129]: I0314 07:45:03.419782 5129 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p" Mar 14 07:45:03 crc kubenswrapper[5129]: I0314 07:45:03.742807 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5"] Mar 14 07:45:03 crc kubenswrapper[5129]: I0314 07:45:03.747541 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557860-zdvj5"] Mar 14 07:45:04 crc kubenswrapper[5129]: I0314 07:45:04.046125 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="021f1f3b-8ee9-424f-9c04-c56631332e92" path="/var/lib/kubelet/pods/021f1f3b-8ee9-424f-9c04-c56631332e92/volumes" Mar 14 07:45:19 crc kubenswrapper[5129]: I0314 07:45:19.574550 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:45:19 crc kubenswrapper[5129]: I0314 07:45:19.575039 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:45:19 crc kubenswrapper[5129]: I0314 07:45:19.575089 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:45:19 crc kubenswrapper[5129]: I0314 07:45:19.575636 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eb37c38a2b671eb228c48734c9694784fed3a17bbf09d28871edf6b039cd1362"} 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:45:19 crc kubenswrapper[5129]: I0314 07:45:19.575702 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://eb37c38a2b671eb228c48734c9694784fed3a17bbf09d28871edf6b039cd1362" gracePeriod=600 Mar 14 07:45:20 crc kubenswrapper[5129]: I0314 07:45:20.545886 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="eb37c38a2b671eb228c48734c9694784fed3a17bbf09d28871edf6b039cd1362" exitCode=0 Mar 14 07:45:20 crc kubenswrapper[5129]: I0314 07:45:20.545967 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"eb37c38a2b671eb228c48734c9694784fed3a17bbf09d28871edf6b039cd1362"} Mar 14 07:45:20 crc kubenswrapper[5129]: I0314 07:45:20.546219 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38"} Mar 14 07:45:20 crc kubenswrapper[5129]: I0314 07:45:20.546241 5129 scope.go:117] "RemoveContainer" containerID="88fc6b8f48625d79a7078b80124522fbc0f2546d969af10933fb35cf7b3e1bd7" Mar 14 07:45:22 crc kubenswrapper[5129]: I0314 07:45:22.146679 5129 scope.go:117] "RemoveContainer" containerID="d78d5a7174eb10dce09651d1b84244d39b7ad36c4f9c6d9f01a997b9baba6566" Mar 14 07:46:00 crc kubenswrapper[5129]: I0314 07:46:00.149393 5129 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29557906-8h2wz"] Mar 14 07:46:00 crc kubenswrapper[5129]: E0314 07:46:00.150293 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5d079a-0e66-4f29-afcd-58cfbff4d26a" containerName="collect-profiles" Mar 14 07:46:00 crc kubenswrapper[5129]: I0314 07:46:00.150310 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5d079a-0e66-4f29-afcd-58cfbff4d26a" containerName="collect-profiles" Mar 14 07:46:00 crc kubenswrapper[5129]: I0314 07:46:00.150514 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5d079a-0e66-4f29-afcd-58cfbff4d26a" containerName="collect-profiles" Mar 14 07:46:00 crc kubenswrapper[5129]: I0314 07:46:00.151104 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557906-8h2wz" Mar 14 07:46:00 crc kubenswrapper[5129]: I0314 07:46:00.155126 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:46:00 crc kubenswrapper[5129]: I0314 07:46:00.155305 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:46:00 crc kubenswrapper[5129]: I0314 07:46:00.155825 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:46:00 crc kubenswrapper[5129]: I0314 07:46:00.166142 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557906-8h2wz"] Mar 14 07:46:00 crc kubenswrapper[5129]: I0314 07:46:00.353393 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5d8g\" (UniqueName: \"kubernetes.io/projected/1c922f39-4e21-43dc-bb33-9c2b6d8fdb67-kube-api-access-p5d8g\") pod \"auto-csr-approver-29557906-8h2wz\" (UID: \"1c922f39-4e21-43dc-bb33-9c2b6d8fdb67\") " pod="openshift-infra/auto-csr-approver-29557906-8h2wz" Mar 
14 07:46:00 crc kubenswrapper[5129]: I0314 07:46:00.455023 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5d8g\" (UniqueName: \"kubernetes.io/projected/1c922f39-4e21-43dc-bb33-9c2b6d8fdb67-kube-api-access-p5d8g\") pod \"auto-csr-approver-29557906-8h2wz\" (UID: \"1c922f39-4e21-43dc-bb33-9c2b6d8fdb67\") " pod="openshift-infra/auto-csr-approver-29557906-8h2wz" Mar 14 07:46:00 crc kubenswrapper[5129]: I0314 07:46:00.480270 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5d8g\" (UniqueName: \"kubernetes.io/projected/1c922f39-4e21-43dc-bb33-9c2b6d8fdb67-kube-api-access-p5d8g\") pod \"auto-csr-approver-29557906-8h2wz\" (UID: \"1c922f39-4e21-43dc-bb33-9c2b6d8fdb67\") " pod="openshift-infra/auto-csr-approver-29557906-8h2wz" Mar 14 07:46:00 crc kubenswrapper[5129]: I0314 07:46:00.524904 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557906-8h2wz" Mar 14 07:46:00 crc kubenswrapper[5129]: I0314 07:46:00.961859 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557906-8h2wz"] Mar 14 07:46:01 crc kubenswrapper[5129]: I0314 07:46:01.220551 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557906-8h2wz" event={"ID":"1c922f39-4e21-43dc-bb33-9c2b6d8fdb67","Type":"ContainerStarted","Data":"86f8b0ab3e4c7a9ace055b4fbba61527ee8fd54a2d18e54cccd2baade4db6153"} Mar 14 07:46:03 crc kubenswrapper[5129]: I0314 07:46:03.240301 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557906-8h2wz" event={"ID":"1c922f39-4e21-43dc-bb33-9c2b6d8fdb67","Type":"ContainerStarted","Data":"55e889f187173b9f6c1f2a600ced2f2e4fe22887f8781cd17bd93cffe5dac47f"} Mar 14 07:46:03 crc kubenswrapper[5129]: I0314 07:46:03.261873 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29557906-8h2wz" podStartSLOduration=1.430101676 podStartE2EDuration="3.261848939s" podCreationTimestamp="2026-03-14 07:46:00 +0000 UTC" firstStartedPulling="2026-03-14 07:46:00.971524617 +0000 UTC m=+2823.723439821" lastFinishedPulling="2026-03-14 07:46:02.80327187 +0000 UTC m=+2825.555187084" observedRunningTime="2026-03-14 07:46:03.257704756 +0000 UTC m=+2826.009619940" watchObservedRunningTime="2026-03-14 07:46:03.261848939 +0000 UTC m=+2826.013764133" Mar 14 07:46:04 crc kubenswrapper[5129]: I0314 07:46:04.248049 5129 generic.go:334] "Generic (PLEG): container finished" podID="1c922f39-4e21-43dc-bb33-9c2b6d8fdb67" containerID="55e889f187173b9f6c1f2a600ced2f2e4fe22887f8781cd17bd93cffe5dac47f" exitCode=0 Mar 14 07:46:04 crc kubenswrapper[5129]: I0314 07:46:04.248101 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557906-8h2wz" event={"ID":"1c922f39-4e21-43dc-bb33-9c2b6d8fdb67","Type":"ContainerDied","Data":"55e889f187173b9f6c1f2a600ced2f2e4fe22887f8781cd17bd93cffe5dac47f"} Mar 14 07:46:05 crc kubenswrapper[5129]: I0314 07:46:05.587225 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557906-8h2wz" Mar 14 07:46:05 crc kubenswrapper[5129]: I0314 07:46:05.726926 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5d8g\" (UniqueName: \"kubernetes.io/projected/1c922f39-4e21-43dc-bb33-9c2b6d8fdb67-kube-api-access-p5d8g\") pod \"1c922f39-4e21-43dc-bb33-9c2b6d8fdb67\" (UID: \"1c922f39-4e21-43dc-bb33-9c2b6d8fdb67\") " Mar 14 07:46:05 crc kubenswrapper[5129]: I0314 07:46:05.732952 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c922f39-4e21-43dc-bb33-9c2b6d8fdb67-kube-api-access-p5d8g" (OuterVolumeSpecName: "kube-api-access-p5d8g") pod "1c922f39-4e21-43dc-bb33-9c2b6d8fdb67" (UID: "1c922f39-4e21-43dc-bb33-9c2b6d8fdb67"). InnerVolumeSpecName "kube-api-access-p5d8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:46:05 crc kubenswrapper[5129]: I0314 07:46:05.828451 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5d8g\" (UniqueName: \"kubernetes.io/projected/1c922f39-4e21-43dc-bb33-9c2b6d8fdb67-kube-api-access-p5d8g\") on node \"crc\" DevicePath \"\"" Mar 14 07:46:06 crc kubenswrapper[5129]: I0314 07:46:06.265185 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557906-8h2wz" Mar 14 07:46:06 crc kubenswrapper[5129]: I0314 07:46:06.285767 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557906-8h2wz" event={"ID":"1c922f39-4e21-43dc-bb33-9c2b6d8fdb67","Type":"ContainerDied","Data":"86f8b0ab3e4c7a9ace055b4fbba61527ee8fd54a2d18e54cccd2baade4db6153"} Mar 14 07:46:06 crc kubenswrapper[5129]: I0314 07:46:06.285813 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86f8b0ab3e4c7a9ace055b4fbba61527ee8fd54a2d18e54cccd2baade4db6153" Mar 14 07:46:06 crc kubenswrapper[5129]: I0314 07:46:06.325478 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557900-sfz55"] Mar 14 07:46:06 crc kubenswrapper[5129]: I0314 07:46:06.331191 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557900-sfz55"] Mar 14 07:46:08 crc kubenswrapper[5129]: I0314 07:46:08.044898 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f79804d4-f7ad-41d0-9f30-ec5814f2b75a" path="/var/lib/kubelet/pods/f79804d4-f7ad-41d0-9f30-ec5814f2b75a/volumes" Mar 14 07:46:22 crc kubenswrapper[5129]: I0314 07:46:22.284470 5129 scope.go:117] "RemoveContainer" containerID="f8a212801cb360516b604349a37d7d9f7251d59775f3a01ea785cfa98c75f4a4" Mar 14 07:46:59 crc kubenswrapper[5129]: I0314 07:46:59.616667 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-86z96"] Mar 14 07:46:59 crc kubenswrapper[5129]: E0314 07:46:59.617643 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c922f39-4e21-43dc-bb33-9c2b6d8fdb67" containerName="oc" Mar 14 07:46:59 crc kubenswrapper[5129]: I0314 07:46:59.617663 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c922f39-4e21-43dc-bb33-9c2b6d8fdb67" containerName="oc" Mar 14 07:46:59 crc kubenswrapper[5129]: I0314 07:46:59.617908 
5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c922f39-4e21-43dc-bb33-9c2b6d8fdb67" containerName="oc" Mar 14 07:46:59 crc kubenswrapper[5129]: I0314 07:46:59.619466 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-86z96" Mar 14 07:46:59 crc kubenswrapper[5129]: I0314 07:46:59.637810 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-86z96"] Mar 14 07:46:59 crc kubenswrapper[5129]: I0314 07:46:59.701520 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74fe47a-1436-4fd6-8dc4-155b61f32dc1-catalog-content\") pod \"redhat-marketplace-86z96\" (UID: \"c74fe47a-1436-4fd6-8dc4-155b61f32dc1\") " pod="openshift-marketplace/redhat-marketplace-86z96" Mar 14 07:46:59 crc kubenswrapper[5129]: I0314 07:46:59.701563 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5fp9\" (UniqueName: \"kubernetes.io/projected/c74fe47a-1436-4fd6-8dc4-155b61f32dc1-kube-api-access-v5fp9\") pod \"redhat-marketplace-86z96\" (UID: \"c74fe47a-1436-4fd6-8dc4-155b61f32dc1\") " pod="openshift-marketplace/redhat-marketplace-86z96" Mar 14 07:46:59 crc kubenswrapper[5129]: I0314 07:46:59.701586 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74fe47a-1436-4fd6-8dc4-155b61f32dc1-utilities\") pod \"redhat-marketplace-86z96\" (UID: \"c74fe47a-1436-4fd6-8dc4-155b61f32dc1\") " pod="openshift-marketplace/redhat-marketplace-86z96" Mar 14 07:46:59 crc kubenswrapper[5129]: I0314 07:46:59.802945 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74fe47a-1436-4fd6-8dc4-155b61f32dc1-catalog-content\") pod 
\"redhat-marketplace-86z96\" (UID: \"c74fe47a-1436-4fd6-8dc4-155b61f32dc1\") " pod="openshift-marketplace/redhat-marketplace-86z96" Mar 14 07:46:59 crc kubenswrapper[5129]: I0314 07:46:59.803004 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5fp9\" (UniqueName: \"kubernetes.io/projected/c74fe47a-1436-4fd6-8dc4-155b61f32dc1-kube-api-access-v5fp9\") pod \"redhat-marketplace-86z96\" (UID: \"c74fe47a-1436-4fd6-8dc4-155b61f32dc1\") " pod="openshift-marketplace/redhat-marketplace-86z96" Mar 14 07:46:59 crc kubenswrapper[5129]: I0314 07:46:59.803035 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74fe47a-1436-4fd6-8dc4-155b61f32dc1-utilities\") pod \"redhat-marketplace-86z96\" (UID: \"c74fe47a-1436-4fd6-8dc4-155b61f32dc1\") " pod="openshift-marketplace/redhat-marketplace-86z96" Mar 14 07:46:59 crc kubenswrapper[5129]: I0314 07:46:59.803485 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74fe47a-1436-4fd6-8dc4-155b61f32dc1-catalog-content\") pod \"redhat-marketplace-86z96\" (UID: \"c74fe47a-1436-4fd6-8dc4-155b61f32dc1\") " pod="openshift-marketplace/redhat-marketplace-86z96" Mar 14 07:46:59 crc kubenswrapper[5129]: I0314 07:46:59.803555 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74fe47a-1436-4fd6-8dc4-155b61f32dc1-utilities\") pod \"redhat-marketplace-86z96\" (UID: \"c74fe47a-1436-4fd6-8dc4-155b61f32dc1\") " pod="openshift-marketplace/redhat-marketplace-86z96" Mar 14 07:46:59 crc kubenswrapper[5129]: I0314 07:46:59.835100 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5fp9\" (UniqueName: \"kubernetes.io/projected/c74fe47a-1436-4fd6-8dc4-155b61f32dc1-kube-api-access-v5fp9\") pod \"redhat-marketplace-86z96\" (UID: 
\"c74fe47a-1436-4fd6-8dc4-155b61f32dc1\") " pod="openshift-marketplace/redhat-marketplace-86z96"
Mar 14 07:46:59 crc kubenswrapper[5129]: I0314 07:46:59.939143 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-86z96"
Mar 14 07:47:00 crc kubenswrapper[5129]: I0314 07:47:00.386497 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-86z96"]
Mar 14 07:47:00 crc kubenswrapper[5129]: I0314 07:47:00.757437 5129 generic.go:334] "Generic (PLEG): container finished" podID="c74fe47a-1436-4fd6-8dc4-155b61f32dc1" containerID="196c885ff2e9e9bce99c85edc73055b52bdc9ba1b4c7197a986378515b75253b" exitCode=0
Mar 14 07:47:00 crc kubenswrapper[5129]: I0314 07:47:00.757501 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86z96" event={"ID":"c74fe47a-1436-4fd6-8dc4-155b61f32dc1","Type":"ContainerDied","Data":"196c885ff2e9e9bce99c85edc73055b52bdc9ba1b4c7197a986378515b75253b"}
Mar 14 07:47:00 crc kubenswrapper[5129]: I0314 07:47:00.757588 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86z96" event={"ID":"c74fe47a-1436-4fd6-8dc4-155b61f32dc1","Type":"ContainerStarted","Data":"6ed1ffafb138cbfcf1dd1d69a8922501ccb0df17a879bbb2f4da8b0c16de6d39"}
Mar 14 07:47:02 crc kubenswrapper[5129]: I0314 07:47:02.776700 5129 generic.go:334] "Generic (PLEG): container finished" podID="c74fe47a-1436-4fd6-8dc4-155b61f32dc1" containerID="a3b6f1221f321c106628461db0780c4134e854b7a38d728e7153cd7cab97be51" exitCode=0
Mar 14 07:47:02 crc kubenswrapper[5129]: I0314 07:47:02.776765 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86z96" event={"ID":"c74fe47a-1436-4fd6-8dc4-155b61f32dc1","Type":"ContainerDied","Data":"a3b6f1221f321c106628461db0780c4134e854b7a38d728e7153cd7cab97be51"}
Mar 14 07:47:02 crc kubenswrapper[5129]: I0314 07:47:02.828851 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hwnqj"]
Mar 14 07:47:02 crc kubenswrapper[5129]: I0314 07:47:02.832285 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hwnqj"
Mar 14 07:47:02 crc kubenswrapper[5129]: I0314 07:47:02.840178 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hwnqj"]
Mar 14 07:47:02 crc kubenswrapper[5129]: I0314 07:47:02.951149 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd78237-0675-4c50-8385-a973f5aa0c6c-utilities\") pod \"certified-operators-hwnqj\" (UID: \"6fd78237-0675-4c50-8385-a973f5aa0c6c\") " pod="openshift-marketplace/certified-operators-hwnqj"
Mar 14 07:47:02 crc kubenswrapper[5129]: I0314 07:47:02.951219 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd78237-0675-4c50-8385-a973f5aa0c6c-catalog-content\") pod \"certified-operators-hwnqj\" (UID: \"6fd78237-0675-4c50-8385-a973f5aa0c6c\") " pod="openshift-marketplace/certified-operators-hwnqj"
Mar 14 07:47:02 crc kubenswrapper[5129]: I0314 07:47:02.951274 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq2g7\" (UniqueName: \"kubernetes.io/projected/6fd78237-0675-4c50-8385-a973f5aa0c6c-kube-api-access-vq2g7\") pod \"certified-operators-hwnqj\" (UID: \"6fd78237-0675-4c50-8385-a973f5aa0c6c\") " pod="openshift-marketplace/certified-operators-hwnqj"
Mar 14 07:47:03 crc kubenswrapper[5129]: I0314 07:47:03.052929 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd78237-0675-4c50-8385-a973f5aa0c6c-utilities\") pod \"certified-operators-hwnqj\" (UID: \"6fd78237-0675-4c50-8385-a973f5aa0c6c\") " pod="openshift-marketplace/certified-operators-hwnqj"
Mar 14 07:47:03 crc kubenswrapper[5129]: I0314 07:47:03.053001 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd78237-0675-4c50-8385-a973f5aa0c6c-catalog-content\") pod \"certified-operators-hwnqj\" (UID: \"6fd78237-0675-4c50-8385-a973f5aa0c6c\") " pod="openshift-marketplace/certified-operators-hwnqj"
Mar 14 07:47:03 crc kubenswrapper[5129]: I0314 07:47:03.053043 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq2g7\" (UniqueName: \"kubernetes.io/projected/6fd78237-0675-4c50-8385-a973f5aa0c6c-kube-api-access-vq2g7\") pod \"certified-operators-hwnqj\" (UID: \"6fd78237-0675-4c50-8385-a973f5aa0c6c\") " pod="openshift-marketplace/certified-operators-hwnqj"
Mar 14 07:47:03 crc kubenswrapper[5129]: I0314 07:47:03.053551 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd78237-0675-4c50-8385-a973f5aa0c6c-utilities\") pod \"certified-operators-hwnqj\" (UID: \"6fd78237-0675-4c50-8385-a973f5aa0c6c\") " pod="openshift-marketplace/certified-operators-hwnqj"
Mar 14 07:47:03 crc kubenswrapper[5129]: I0314 07:47:03.053715 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd78237-0675-4c50-8385-a973f5aa0c6c-catalog-content\") pod \"certified-operators-hwnqj\" (UID: \"6fd78237-0675-4c50-8385-a973f5aa0c6c\") " pod="openshift-marketplace/certified-operators-hwnqj"
Mar 14 07:47:03 crc kubenswrapper[5129]: I0314 07:47:03.075675 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq2g7\" (UniqueName: \"kubernetes.io/projected/6fd78237-0675-4c50-8385-a973f5aa0c6c-kube-api-access-vq2g7\") pod \"certified-operators-hwnqj\" (UID: \"6fd78237-0675-4c50-8385-a973f5aa0c6c\") " pod="openshift-marketplace/certified-operators-hwnqj"
Mar 14 07:47:03 crc kubenswrapper[5129]: I0314 07:47:03.166411 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hwnqj"
Mar 14 07:47:03 crc kubenswrapper[5129]: I0314 07:47:03.520772 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hwnqj"]
Mar 14 07:47:03 crc kubenswrapper[5129]: I0314 07:47:03.786418 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwnqj" event={"ID":"6fd78237-0675-4c50-8385-a973f5aa0c6c","Type":"ContainerStarted","Data":"aa7e7406dfdd857dc566a8e32ee11966968a39f7ae791549e99b6923c06568fc"}
Mar 14 07:47:04 crc kubenswrapper[5129]: I0314 07:47:04.797925 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86z96" event={"ID":"c74fe47a-1436-4fd6-8dc4-155b61f32dc1","Type":"ContainerStarted","Data":"231494c449d80be4758124b3a2e40eedda4dd253de51ba005cf7a1d6b9f6a07f"}
Mar 14 07:47:04 crc kubenswrapper[5129]: I0314 07:47:04.799520 5129 generic.go:334] "Generic (PLEG): container finished" podID="6fd78237-0675-4c50-8385-a973f5aa0c6c" containerID="c0f8442e338465bdda9bb6ae056361a7992bee1a9912e056f1c8c9737c257633" exitCode=0
Mar 14 07:47:04 crc kubenswrapper[5129]: I0314 07:47:04.799547 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwnqj" event={"ID":"6fd78237-0675-4c50-8385-a973f5aa0c6c","Type":"ContainerDied","Data":"c0f8442e338465bdda9bb6ae056361a7992bee1a9912e056f1c8c9737c257633"}
Mar 14 07:47:04 crc kubenswrapper[5129]: I0314 07:47:04.823164 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-86z96" podStartSLOduration=3.003168351 podStartE2EDuration="5.823137288s" podCreationTimestamp="2026-03-14 07:46:59 +0000 UTC" firstStartedPulling="2026-03-14 07:47:00.759296666 +0000 UTC m=+2883.511211870" lastFinishedPulling="2026-03-14 07:47:03.579265633 +0000 UTC m=+2886.331180807" observedRunningTime="2026-03-14 07:47:04.819300045 +0000 UTC m=+2887.571215239" watchObservedRunningTime="2026-03-14 07:47:04.823137288 +0000 UTC m=+2887.575052522"
Mar 14 07:47:05 crc kubenswrapper[5129]: I0314 07:47:05.809236 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwnqj" event={"ID":"6fd78237-0675-4c50-8385-a973f5aa0c6c","Type":"ContainerStarted","Data":"91d7d1e04908fa42226ec4018750a030a3c070ab9839b1507abb99b626ad29fb"}
Mar 14 07:47:06 crc kubenswrapper[5129]: I0314 07:47:06.817046 5129 generic.go:334] "Generic (PLEG): container finished" podID="6fd78237-0675-4c50-8385-a973f5aa0c6c" containerID="91d7d1e04908fa42226ec4018750a030a3c070ab9839b1507abb99b626ad29fb" exitCode=0
Mar 14 07:47:06 crc kubenswrapper[5129]: I0314 07:47:06.817136 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwnqj" event={"ID":"6fd78237-0675-4c50-8385-a973f5aa0c6c","Type":"ContainerDied","Data":"91d7d1e04908fa42226ec4018750a030a3c070ab9839b1507abb99b626ad29fb"}
Mar 14 07:47:07 crc kubenswrapper[5129]: I0314 07:47:07.828351 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwnqj" event={"ID":"6fd78237-0675-4c50-8385-a973f5aa0c6c","Type":"ContainerStarted","Data":"a2b8fd05319bef185e8d4ed674984cbcbf106e7d3965c04c9747baae46e9503b"}
Mar 14 07:47:07 crc kubenswrapper[5129]: I0314 07:47:07.850463 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hwnqj" podStartSLOduration=3.364510571 podStartE2EDuration="5.850442603s" podCreationTimestamp="2026-03-14 07:47:02 +0000 UTC" firstStartedPulling="2026-03-14 07:47:04.801330739 +0000 UTC m=+2887.553245923" lastFinishedPulling="2026-03-14 07:47:07.287262771 +0000 UTC m=+2890.039177955" observedRunningTime="2026-03-14 07:47:07.844374349 +0000 UTC m=+2890.596289533" watchObservedRunningTime="2026-03-14 07:47:07.850442603 +0000 UTC m=+2890.602357787"
Mar 14 07:47:09 crc kubenswrapper[5129]: I0314 07:47:09.942107 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-86z96"
Mar 14 07:47:09 crc kubenswrapper[5129]: I0314 07:47:09.942628 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-86z96"
Mar 14 07:47:09 crc kubenswrapper[5129]: I0314 07:47:09.990405 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-86z96"
Mar 14 07:47:10 crc kubenswrapper[5129]: I0314 07:47:10.908115 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-86z96"
Mar 14 07:47:11 crc kubenswrapper[5129]: I0314 07:47:11.800204 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-86z96"]
Mar 14 07:47:12 crc kubenswrapper[5129]: I0314 07:47:12.863896 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-86z96" podUID="c74fe47a-1436-4fd6-8dc4-155b61f32dc1" containerName="registry-server" containerID="cri-o://231494c449d80be4758124b3a2e40eedda4dd253de51ba005cf7a1d6b9f6a07f" gracePeriod=2
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.167188 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hwnqj"
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.167247 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hwnqj"
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.222863 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hwnqj"
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.830832 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-86z96"
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.894023 5129 generic.go:334] "Generic (PLEG): container finished" podID="c74fe47a-1436-4fd6-8dc4-155b61f32dc1" containerID="231494c449d80be4758124b3a2e40eedda4dd253de51ba005cf7a1d6b9f6a07f" exitCode=0
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.894102 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-86z96"
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.894107 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86z96" event={"ID":"c74fe47a-1436-4fd6-8dc4-155b61f32dc1","Type":"ContainerDied","Data":"231494c449d80be4758124b3a2e40eedda4dd253de51ba005cf7a1d6b9f6a07f"}
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.894466 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86z96" event={"ID":"c74fe47a-1436-4fd6-8dc4-155b61f32dc1","Type":"ContainerDied","Data":"6ed1ffafb138cbfcf1dd1d69a8922501ccb0df17a879bbb2f4da8b0c16de6d39"}
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.894486 5129 scope.go:117] "RemoveContainer" containerID="231494c449d80be4758124b3a2e40eedda4dd253de51ba005cf7a1d6b9f6a07f"
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.911567 5129 scope.go:117] "RemoveContainer" containerID="a3b6f1221f321c106628461db0780c4134e854b7a38d728e7153cd7cab97be51"
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.917399 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74fe47a-1436-4fd6-8dc4-155b61f32dc1-utilities\") pod \"c74fe47a-1436-4fd6-8dc4-155b61f32dc1\" (UID: \"c74fe47a-1436-4fd6-8dc4-155b61f32dc1\") "
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.917444 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74fe47a-1436-4fd6-8dc4-155b61f32dc1-catalog-content\") pod \"c74fe47a-1436-4fd6-8dc4-155b61f32dc1\" (UID: \"c74fe47a-1436-4fd6-8dc4-155b61f32dc1\") "
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.917579 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5fp9\" (UniqueName: \"kubernetes.io/projected/c74fe47a-1436-4fd6-8dc4-155b61f32dc1-kube-api-access-v5fp9\") pod \"c74fe47a-1436-4fd6-8dc4-155b61f32dc1\" (UID: \"c74fe47a-1436-4fd6-8dc4-155b61f32dc1\") "
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.924784 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c74fe47a-1436-4fd6-8dc4-155b61f32dc1-kube-api-access-v5fp9" (OuterVolumeSpecName: "kube-api-access-v5fp9") pod "c74fe47a-1436-4fd6-8dc4-155b61f32dc1" (UID: "c74fe47a-1436-4fd6-8dc4-155b61f32dc1"). InnerVolumeSpecName "kube-api-access-v5fp9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.925859 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c74fe47a-1436-4fd6-8dc4-155b61f32dc1-utilities" (OuterVolumeSpecName: "utilities") pod "c74fe47a-1436-4fd6-8dc4-155b61f32dc1" (UID: "c74fe47a-1436-4fd6-8dc4-155b61f32dc1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.934272 5129 scope.go:117] "RemoveContainer" containerID="196c885ff2e9e9bce99c85edc73055b52bdc9ba1b4c7197a986378515b75253b"
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.943678 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hwnqj"
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.952698 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c74fe47a-1436-4fd6-8dc4-155b61f32dc1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c74fe47a-1436-4fd6-8dc4-155b61f32dc1" (UID: "c74fe47a-1436-4fd6-8dc4-155b61f32dc1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.983448 5129 scope.go:117] "RemoveContainer" containerID="231494c449d80be4758124b3a2e40eedda4dd253de51ba005cf7a1d6b9f6a07f"
Mar 14 07:47:13 crc kubenswrapper[5129]: E0314 07:47:13.984124 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"231494c449d80be4758124b3a2e40eedda4dd253de51ba005cf7a1d6b9f6a07f\": container with ID starting with 231494c449d80be4758124b3a2e40eedda4dd253de51ba005cf7a1d6b9f6a07f not found: ID does not exist" containerID="231494c449d80be4758124b3a2e40eedda4dd253de51ba005cf7a1d6b9f6a07f"
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.984189 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"231494c449d80be4758124b3a2e40eedda4dd253de51ba005cf7a1d6b9f6a07f"} err="failed to get container status \"231494c449d80be4758124b3a2e40eedda4dd253de51ba005cf7a1d6b9f6a07f\": rpc error: code = NotFound desc = could not find container \"231494c449d80be4758124b3a2e40eedda4dd253de51ba005cf7a1d6b9f6a07f\": container with ID starting with 231494c449d80be4758124b3a2e40eedda4dd253de51ba005cf7a1d6b9f6a07f not found: ID does not exist"
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.984234 5129 scope.go:117] "RemoveContainer" containerID="a3b6f1221f321c106628461db0780c4134e854b7a38d728e7153cd7cab97be51"
Mar 14 07:47:13 crc kubenswrapper[5129]: E0314 07:47:13.984942 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3b6f1221f321c106628461db0780c4134e854b7a38d728e7153cd7cab97be51\": container with ID starting with a3b6f1221f321c106628461db0780c4134e854b7a38d728e7153cd7cab97be51 not found: ID does not exist" containerID="a3b6f1221f321c106628461db0780c4134e854b7a38d728e7153cd7cab97be51"
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.984969 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3b6f1221f321c106628461db0780c4134e854b7a38d728e7153cd7cab97be51"} err="failed to get container status \"a3b6f1221f321c106628461db0780c4134e854b7a38d728e7153cd7cab97be51\": rpc error: code = NotFound desc = could not find container \"a3b6f1221f321c106628461db0780c4134e854b7a38d728e7153cd7cab97be51\": container with ID starting with a3b6f1221f321c106628461db0780c4134e854b7a38d728e7153cd7cab97be51 not found: ID does not exist"
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.984984 5129 scope.go:117] "RemoveContainer" containerID="196c885ff2e9e9bce99c85edc73055b52bdc9ba1b4c7197a986378515b75253b"
Mar 14 07:47:13 crc kubenswrapper[5129]: E0314 07:47:13.985288 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"196c885ff2e9e9bce99c85edc73055b52bdc9ba1b4c7197a986378515b75253b\": container with ID starting with 196c885ff2e9e9bce99c85edc73055b52bdc9ba1b4c7197a986378515b75253b not found: ID does not exist" containerID="196c885ff2e9e9bce99c85edc73055b52bdc9ba1b4c7197a986378515b75253b"
Mar 14 07:47:13 crc kubenswrapper[5129]: I0314 07:47:13.985337 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"196c885ff2e9e9bce99c85edc73055b52bdc9ba1b4c7197a986378515b75253b"} err="failed to get container status \"196c885ff2e9e9bce99c85edc73055b52bdc9ba1b4c7197a986378515b75253b\": rpc error: code = NotFound desc = could not find container \"196c885ff2e9e9bce99c85edc73055b52bdc9ba1b4c7197a986378515b75253b\": container with ID starting with 196c885ff2e9e9bce99c85edc73055b52bdc9ba1b4c7197a986378515b75253b not found: ID does not exist"
Mar 14 07:47:14 crc kubenswrapper[5129]: I0314 07:47:14.019236 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74fe47a-1436-4fd6-8dc4-155b61f32dc1-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 07:47:14 crc kubenswrapper[5129]: I0314 07:47:14.019273 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74fe47a-1436-4fd6-8dc4-155b61f32dc1-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 07:47:14 crc kubenswrapper[5129]: I0314 07:47:14.019283 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5fp9\" (UniqueName: \"kubernetes.io/projected/c74fe47a-1436-4fd6-8dc4-155b61f32dc1-kube-api-access-v5fp9\") on node \"crc\" DevicePath \"\""
Mar 14 07:47:14 crc kubenswrapper[5129]: I0314 07:47:14.210802 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-86z96"]
Mar 14 07:47:14 crc kubenswrapper[5129]: I0314 07:47:14.221207 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-86z96"]
Mar 14 07:47:14 crc kubenswrapper[5129]: I0314 07:47:14.413751 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hwnqj"]
Mar 14 07:47:15 crc kubenswrapper[5129]: I0314 07:47:15.915570 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hwnqj" podUID="6fd78237-0675-4c50-8385-a973f5aa0c6c" containerName="registry-server" containerID="cri-o://a2b8fd05319bef185e8d4ed674984cbcbf106e7d3965c04c9747baae46e9503b" gracePeriod=2
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.049893 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c74fe47a-1436-4fd6-8dc4-155b61f32dc1" path="/var/lib/kubelet/pods/c74fe47a-1436-4fd6-8dc4-155b61f32dc1/volumes"
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.369124 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hwnqj"
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.457848 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq2g7\" (UniqueName: \"kubernetes.io/projected/6fd78237-0675-4c50-8385-a973f5aa0c6c-kube-api-access-vq2g7\") pod \"6fd78237-0675-4c50-8385-a973f5aa0c6c\" (UID: \"6fd78237-0675-4c50-8385-a973f5aa0c6c\") "
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.457934 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd78237-0675-4c50-8385-a973f5aa0c6c-catalog-content\") pod \"6fd78237-0675-4c50-8385-a973f5aa0c6c\" (UID: \"6fd78237-0675-4c50-8385-a973f5aa0c6c\") "
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.458118 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd78237-0675-4c50-8385-a973f5aa0c6c-utilities\") pod \"6fd78237-0675-4c50-8385-a973f5aa0c6c\" (UID: \"6fd78237-0675-4c50-8385-a973f5aa0c6c\") "
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.459120 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fd78237-0675-4c50-8385-a973f5aa0c6c-utilities" (OuterVolumeSpecName: "utilities") pod "6fd78237-0675-4c50-8385-a973f5aa0c6c" (UID: "6fd78237-0675-4c50-8385-a973f5aa0c6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.464585 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fd78237-0675-4c50-8385-a973f5aa0c6c-kube-api-access-vq2g7" (OuterVolumeSpecName: "kube-api-access-vq2g7") pod "6fd78237-0675-4c50-8385-a973f5aa0c6c" (UID: "6fd78237-0675-4c50-8385-a973f5aa0c6c"). InnerVolumeSpecName "kube-api-access-vq2g7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.510528 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fd78237-0675-4c50-8385-a973f5aa0c6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6fd78237-0675-4c50-8385-a973f5aa0c6c" (UID: "6fd78237-0675-4c50-8385-a973f5aa0c6c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.559521 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd78237-0675-4c50-8385-a973f5aa0c6c-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.559561 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq2g7\" (UniqueName: \"kubernetes.io/projected/6fd78237-0675-4c50-8385-a973f5aa0c6c-kube-api-access-vq2g7\") on node \"crc\" DevicePath \"\""
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.559574 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd78237-0675-4c50-8385-a973f5aa0c6c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.924892 5129 generic.go:334] "Generic (PLEG): container finished" podID="6fd78237-0675-4c50-8385-a973f5aa0c6c" containerID="a2b8fd05319bef185e8d4ed674984cbcbf106e7d3965c04c9747baae46e9503b" exitCode=0
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.924929 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwnqj" event={"ID":"6fd78237-0675-4c50-8385-a973f5aa0c6c","Type":"ContainerDied","Data":"a2b8fd05319bef185e8d4ed674984cbcbf106e7d3965c04c9747baae46e9503b"}
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.924954 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwnqj" event={"ID":"6fd78237-0675-4c50-8385-a973f5aa0c6c","Type":"ContainerDied","Data":"aa7e7406dfdd857dc566a8e32ee11966968a39f7ae791549e99b6923c06568fc"}
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.924971 5129 scope.go:117] "RemoveContainer" containerID="a2b8fd05319bef185e8d4ed674984cbcbf106e7d3965c04c9747baae46e9503b"
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.925076 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hwnqj"
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.951011 5129 scope.go:117] "RemoveContainer" containerID="91d7d1e04908fa42226ec4018750a030a3c070ab9839b1507abb99b626ad29fb"
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.959154 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hwnqj"]
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.967855 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hwnqj"]
Mar 14 07:47:16 crc kubenswrapper[5129]: I0314 07:47:16.987203 5129 scope.go:117] "RemoveContainer" containerID="c0f8442e338465bdda9bb6ae056361a7992bee1a9912e056f1c8c9737c257633"
Mar 14 07:47:17 crc kubenswrapper[5129]: I0314 07:47:17.006246 5129 scope.go:117] "RemoveContainer" containerID="a2b8fd05319bef185e8d4ed674984cbcbf106e7d3965c04c9747baae46e9503b"
Mar 14 07:47:17 crc kubenswrapper[5129]: E0314 07:47:17.006851 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2b8fd05319bef185e8d4ed674984cbcbf106e7d3965c04c9747baae46e9503b\": container with ID starting with a2b8fd05319bef185e8d4ed674984cbcbf106e7d3965c04c9747baae46e9503b not found: ID does not exist" containerID="a2b8fd05319bef185e8d4ed674984cbcbf106e7d3965c04c9747baae46e9503b"
Mar 14 07:47:17 crc kubenswrapper[5129]: I0314 07:47:17.006892 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b8fd05319bef185e8d4ed674984cbcbf106e7d3965c04c9747baae46e9503b"} err="failed to get container status \"a2b8fd05319bef185e8d4ed674984cbcbf106e7d3965c04c9747baae46e9503b\": rpc error: code = NotFound desc = could not find container \"a2b8fd05319bef185e8d4ed674984cbcbf106e7d3965c04c9747baae46e9503b\": container with ID starting with a2b8fd05319bef185e8d4ed674984cbcbf106e7d3965c04c9747baae46e9503b not found: ID does not exist"
Mar 14 07:47:17 crc kubenswrapper[5129]: I0314 07:47:17.006913 5129 scope.go:117] "RemoveContainer" containerID="91d7d1e04908fa42226ec4018750a030a3c070ab9839b1507abb99b626ad29fb"
Mar 14 07:47:17 crc kubenswrapper[5129]: E0314 07:47:17.007322 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d7d1e04908fa42226ec4018750a030a3c070ab9839b1507abb99b626ad29fb\": container with ID starting with 91d7d1e04908fa42226ec4018750a030a3c070ab9839b1507abb99b626ad29fb not found: ID does not exist" containerID="91d7d1e04908fa42226ec4018750a030a3c070ab9839b1507abb99b626ad29fb"
Mar 14 07:47:17 crc kubenswrapper[5129]: I0314 07:47:17.007570 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d7d1e04908fa42226ec4018750a030a3c070ab9839b1507abb99b626ad29fb"} err="failed to get container status \"91d7d1e04908fa42226ec4018750a030a3c070ab9839b1507abb99b626ad29fb\": rpc error: code = NotFound desc = could not find container \"91d7d1e04908fa42226ec4018750a030a3c070ab9839b1507abb99b626ad29fb\": container with ID starting with 91d7d1e04908fa42226ec4018750a030a3c070ab9839b1507abb99b626ad29fb not found: ID does not exist"
Mar 14 07:47:17 crc kubenswrapper[5129]: I0314 07:47:17.007613 5129 scope.go:117] "RemoveContainer" containerID="c0f8442e338465bdda9bb6ae056361a7992bee1a9912e056f1c8c9737c257633"
Mar 14 07:47:17 crc kubenswrapper[5129]: E0314 07:47:17.007935 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0f8442e338465bdda9bb6ae056361a7992bee1a9912e056f1c8c9737c257633\": container with ID starting with c0f8442e338465bdda9bb6ae056361a7992bee1a9912e056f1c8c9737c257633 not found: ID does not exist" containerID="c0f8442e338465bdda9bb6ae056361a7992bee1a9912e056f1c8c9737c257633"
Mar 14 07:47:17 crc kubenswrapper[5129]: I0314 07:47:17.007979 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0f8442e338465bdda9bb6ae056361a7992bee1a9912e056f1c8c9737c257633"} err="failed to get container status \"c0f8442e338465bdda9bb6ae056361a7992bee1a9912e056f1c8c9737c257633\": rpc error: code = NotFound desc = could not find container \"c0f8442e338465bdda9bb6ae056361a7992bee1a9912e056f1c8c9737c257633\": container with ID starting with c0f8442e338465bdda9bb6ae056361a7992bee1a9912e056f1c8c9737c257633 not found: ID does not exist"
Mar 14 07:47:18 crc kubenswrapper[5129]: I0314 07:47:18.049352 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fd78237-0675-4c50-8385-a973f5aa0c6c" path="/var/lib/kubelet/pods/6fd78237-0675-4c50-8385-a973f5aa0c6c/volumes"
Mar 14 07:47:49 crc kubenswrapper[5129]: I0314 07:47:49.573862 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 07:47:49 crc kubenswrapper[5129]: I0314 07:47:49.574419 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 07:48:00 crc kubenswrapper[5129]: I0314 07:48:00.149827 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557908-h525p"]
Mar 14 07:48:00 crc kubenswrapper[5129]: E0314 07:48:00.150809 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74fe47a-1436-4fd6-8dc4-155b61f32dc1" containerName="extract-content"
Mar 14 07:48:00 crc kubenswrapper[5129]: I0314 07:48:00.150829 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74fe47a-1436-4fd6-8dc4-155b61f32dc1" containerName="extract-content"
Mar 14 07:48:00 crc kubenswrapper[5129]: E0314 07:48:00.150848 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd78237-0675-4c50-8385-a973f5aa0c6c" containerName="extract-content"
Mar 14 07:48:00 crc kubenswrapper[5129]: I0314 07:48:00.150858 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd78237-0675-4c50-8385-a973f5aa0c6c" containerName="extract-content"
Mar 14 07:48:00 crc kubenswrapper[5129]: E0314 07:48:00.150884 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd78237-0675-4c50-8385-a973f5aa0c6c" containerName="registry-server"
Mar 14 07:48:00 crc kubenswrapper[5129]: I0314 07:48:00.150896 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd78237-0675-4c50-8385-a973f5aa0c6c" containerName="registry-server"
Mar 14 07:48:00 crc kubenswrapper[5129]: E0314 07:48:00.150910 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74fe47a-1436-4fd6-8dc4-155b61f32dc1" containerName="extract-utilities"
Mar 14 07:48:00 crc kubenswrapper[5129]: I0314 07:48:00.150920 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74fe47a-1436-4fd6-8dc4-155b61f32dc1" containerName="extract-utilities"
Mar 14 07:48:00 crc kubenswrapper[5129]: E0314 07:48:00.150935 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd78237-0675-4c50-8385-a973f5aa0c6c" containerName="extract-utilities"
Mar 14 07:48:00 crc kubenswrapper[5129]: I0314 07:48:00.150944 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd78237-0675-4c50-8385-a973f5aa0c6c" containerName="extract-utilities"
Mar 14 07:48:00 crc kubenswrapper[5129]: E0314 07:48:00.150972 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74fe47a-1436-4fd6-8dc4-155b61f32dc1" containerName="registry-server"
Mar 14 07:48:00 crc kubenswrapper[5129]: I0314 07:48:00.150982 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74fe47a-1436-4fd6-8dc4-155b61f32dc1" containerName="registry-server"
Mar 14 07:48:00 crc kubenswrapper[5129]: I0314 07:48:00.151201 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fd78237-0675-4c50-8385-a973f5aa0c6c" containerName="registry-server"
Mar 14 07:48:00 crc kubenswrapper[5129]: I0314 07:48:00.151233 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="c74fe47a-1436-4fd6-8dc4-155b61f32dc1" containerName="registry-server"
Mar 14 07:48:00 crc kubenswrapper[5129]: I0314 07:48:00.151966 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557908-h525p"
Mar 14 07:48:00 crc kubenswrapper[5129]: I0314 07:48:00.154779 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 07:48:00 crc kubenswrapper[5129]: I0314 07:48:00.154852 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 07:48:00 crc kubenswrapper[5129]: I0314 07:48:00.155453 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf"
Mar 14 07:48:00 crc kubenswrapper[5129]: I0314 07:48:00.160240 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557908-h525p"]
Mar 14 07:48:00 crc kubenswrapper[5129]: I0314 07:48:00.235830 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-752ss\" (UniqueName: \"kubernetes.io/projected/44662781-f028-4236-ba9d-6afeb110cb74-kube-api-access-752ss\") pod \"auto-csr-approver-29557908-h525p\" (UID: \"44662781-f028-4236-ba9d-6afeb110cb74\") " pod="openshift-infra/auto-csr-approver-29557908-h525p"
Mar 14 07:48:00 crc kubenswrapper[5129]: I0314 07:48:00.337662 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-752ss\" (UniqueName: \"kubernetes.io/projected/44662781-f028-4236-ba9d-6afeb110cb74-kube-api-access-752ss\") pod \"auto-csr-approver-29557908-h525p\" (UID: \"44662781-f028-4236-ba9d-6afeb110cb74\") " pod="openshift-infra/auto-csr-approver-29557908-h525p"
Mar 14 07:48:00 crc kubenswrapper[5129]: I0314 07:48:00.369428 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-752ss\" (UniqueName: \"kubernetes.io/projected/44662781-f028-4236-ba9d-6afeb110cb74-kube-api-access-752ss\") pod \"auto-csr-approver-29557908-h525p\" (UID: \"44662781-f028-4236-ba9d-6afeb110cb74\") " pod="openshift-infra/auto-csr-approver-29557908-h525p"
Mar 14 07:48:00 crc kubenswrapper[5129]: I0314 07:48:00.475520 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557908-h525p"
Mar 14 07:48:01 crc kubenswrapper[5129]: I0314 07:48:01.007778 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557908-h525p"]
Mar 14 07:48:01 crc kubenswrapper[5129]: I0314 07:48:01.260282 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557908-h525p" event={"ID":"44662781-f028-4236-ba9d-6afeb110cb74","Type":"ContainerStarted","Data":"55f7ee6e6756453033935073629be275412a578bd790489ced730cba803ed1c8"}
Mar 14 07:48:04 crc kubenswrapper[5129]: I0314 07:48:04.287081 5129 generic.go:334] "Generic (PLEG): container finished" podID="44662781-f028-4236-ba9d-6afeb110cb74" containerID="73834bbef5271873104e68c1ef2aff2aef5d0aff2d91900e2fdd708c447a7686" exitCode=0
Mar 14 07:48:04 crc kubenswrapper[5129]: I0314 07:48:04.287291 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557908-h525p" event={"ID":"44662781-f028-4236-ba9d-6afeb110cb74","Type":"ContainerDied","Data":"73834bbef5271873104e68c1ef2aff2aef5d0aff2d91900e2fdd708c447a7686"}
Mar 14 07:48:05 crc kubenswrapper[5129]: I0314 07:48:05.604817 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557908-h525p"
Mar 14 07:48:05 crc kubenswrapper[5129]: I0314 07:48:05.718075 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-752ss\" (UniqueName: \"kubernetes.io/projected/44662781-f028-4236-ba9d-6afeb110cb74-kube-api-access-752ss\") pod \"44662781-f028-4236-ba9d-6afeb110cb74\" (UID: \"44662781-f028-4236-ba9d-6afeb110cb74\") "
Mar 14 07:48:05 crc kubenswrapper[5129]: I0314 07:48:05.724705 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44662781-f028-4236-ba9d-6afeb110cb74-kube-api-access-752ss" (OuterVolumeSpecName: "kube-api-access-752ss") pod "44662781-f028-4236-ba9d-6afeb110cb74" (UID: "44662781-f028-4236-ba9d-6afeb110cb74"). InnerVolumeSpecName "kube-api-access-752ss".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:48:05 crc kubenswrapper[5129]: I0314 07:48:05.819686 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-752ss\" (UniqueName: \"kubernetes.io/projected/44662781-f028-4236-ba9d-6afeb110cb74-kube-api-access-752ss\") on node \"crc\" DevicePath \"\"" Mar 14 07:48:06 crc kubenswrapper[5129]: I0314 07:48:06.302862 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557908-h525p" event={"ID":"44662781-f028-4236-ba9d-6afeb110cb74","Type":"ContainerDied","Data":"55f7ee6e6756453033935073629be275412a578bd790489ced730cba803ed1c8"} Mar 14 07:48:06 crc kubenswrapper[5129]: I0314 07:48:06.302897 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55f7ee6e6756453033935073629be275412a578bd790489ced730cba803ed1c8" Mar 14 07:48:06 crc kubenswrapper[5129]: I0314 07:48:06.302930 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557908-h525p" Mar 14 07:48:06 crc kubenswrapper[5129]: I0314 07:48:06.673697 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557902-rr487"] Mar 14 07:48:06 crc kubenswrapper[5129]: I0314 07:48:06.681471 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557902-rr487"] Mar 14 07:48:08 crc kubenswrapper[5129]: I0314 07:48:08.045775 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c" path="/var/lib/kubelet/pods/ab5ea775-822a-4f9e-9e35-cd0d7d9ee60c/volumes" Mar 14 07:48:19 crc kubenswrapper[5129]: I0314 07:48:19.574332 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 14 07:48:19 crc kubenswrapper[5129]: I0314 07:48:19.575250 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:48:22 crc kubenswrapper[5129]: I0314 07:48:22.404205 5129 scope.go:117] "RemoveContainer" containerID="3ecf920c966401a0d0e544eb06bc1492cec602283305a6679499b05d3fd9c55f" Mar 14 07:48:49 crc kubenswrapper[5129]: I0314 07:48:49.574436 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:48:49 crc kubenswrapper[5129]: I0314 07:48:49.575158 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:48:49 crc kubenswrapper[5129]: I0314 07:48:49.575213 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:48:49 crc kubenswrapper[5129]: I0314 07:48:49.576151 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
14 07:48:49 crc kubenswrapper[5129]: I0314 07:48:49.576209 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" gracePeriod=600 Mar 14 07:48:49 crc kubenswrapper[5129]: E0314 07:48:49.789648 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:48:49 crc kubenswrapper[5129]: I0314 07:48:49.967573 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" exitCode=0 Mar 14 07:48:49 crc kubenswrapper[5129]: I0314 07:48:49.967643 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38"} Mar 14 07:48:49 crc kubenswrapper[5129]: I0314 07:48:49.967747 5129 scope.go:117] "RemoveContainer" containerID="eb37c38a2b671eb228c48734c9694784fed3a17bbf09d28871edf6b039cd1362" Mar 14 07:48:49 crc kubenswrapper[5129]: I0314 07:48:49.968342 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:48:49 crc kubenswrapper[5129]: E0314 07:48:49.968770 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:49:04 crc kubenswrapper[5129]: I0314 07:49:04.036575 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:49:04 crc kubenswrapper[5129]: E0314 07:49:04.037627 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:49:17 crc kubenswrapper[5129]: I0314 07:49:17.037038 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:49:17 crc kubenswrapper[5129]: E0314 07:49:17.038205 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:49:29 crc kubenswrapper[5129]: I0314 07:49:29.036722 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:49:29 crc kubenswrapper[5129]: E0314 07:49:29.037890 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:49:42 crc kubenswrapper[5129]: I0314 07:49:42.036932 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:49:42 crc kubenswrapper[5129]: E0314 07:49:42.039491 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:49:54 crc kubenswrapper[5129]: I0314 07:49:54.038095 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:49:54 crc kubenswrapper[5129]: E0314 07:49:54.039510 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:50:00 crc kubenswrapper[5129]: I0314 07:50:00.164860 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557910-mv4zh"] Mar 14 07:50:00 crc kubenswrapper[5129]: E0314 07:50:00.168549 5129 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="44662781-f028-4236-ba9d-6afeb110cb74" containerName="oc" Mar 14 07:50:00 crc kubenswrapper[5129]: I0314 07:50:00.168776 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="44662781-f028-4236-ba9d-6afeb110cb74" containerName="oc" Mar 14 07:50:00 crc kubenswrapper[5129]: I0314 07:50:00.169196 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="44662781-f028-4236-ba9d-6afeb110cb74" containerName="oc" Mar 14 07:50:00 crc kubenswrapper[5129]: I0314 07:50:00.170282 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557910-mv4zh" Mar 14 07:50:00 crc kubenswrapper[5129]: I0314 07:50:00.173350 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:50:00 crc kubenswrapper[5129]: I0314 07:50:00.173906 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:50:00 crc kubenswrapper[5129]: I0314 07:50:00.174688 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:50:00 crc kubenswrapper[5129]: I0314 07:50:00.181113 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557910-mv4zh"] Mar 14 07:50:00 crc kubenswrapper[5129]: I0314 07:50:00.239091 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hw5h\" (UniqueName: \"kubernetes.io/projected/36229ab4-47a6-4953-bbc0-f5335f5643e1-kube-api-access-8hw5h\") pod \"auto-csr-approver-29557910-mv4zh\" (UID: \"36229ab4-47a6-4953-bbc0-f5335f5643e1\") " pod="openshift-infra/auto-csr-approver-29557910-mv4zh" Mar 14 07:50:00 crc kubenswrapper[5129]: I0314 07:50:00.341120 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hw5h\" (UniqueName: 
\"kubernetes.io/projected/36229ab4-47a6-4953-bbc0-f5335f5643e1-kube-api-access-8hw5h\") pod \"auto-csr-approver-29557910-mv4zh\" (UID: \"36229ab4-47a6-4953-bbc0-f5335f5643e1\") " pod="openshift-infra/auto-csr-approver-29557910-mv4zh" Mar 14 07:50:00 crc kubenswrapper[5129]: I0314 07:50:00.365218 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hw5h\" (UniqueName: \"kubernetes.io/projected/36229ab4-47a6-4953-bbc0-f5335f5643e1-kube-api-access-8hw5h\") pod \"auto-csr-approver-29557910-mv4zh\" (UID: \"36229ab4-47a6-4953-bbc0-f5335f5643e1\") " pod="openshift-infra/auto-csr-approver-29557910-mv4zh" Mar 14 07:50:00 crc kubenswrapper[5129]: I0314 07:50:00.497470 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557910-mv4zh" Mar 14 07:50:01 crc kubenswrapper[5129]: I0314 07:50:01.006357 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557910-mv4zh"] Mar 14 07:50:01 crc kubenswrapper[5129]: I0314 07:50:01.016633 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:50:01 crc kubenswrapper[5129]: I0314 07:50:01.589219 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557910-mv4zh" event={"ID":"36229ab4-47a6-4953-bbc0-f5335f5643e1","Type":"ContainerStarted","Data":"5c3aa72cc6751a4a6e3f3e83787869bd1906dcb807f6b77c6c84bcc51a678c8b"} Mar 14 07:50:04 crc kubenswrapper[5129]: I0314 07:50:04.618261 5129 generic.go:334] "Generic (PLEG): container finished" podID="36229ab4-47a6-4953-bbc0-f5335f5643e1" containerID="867fe2f6ada3474fb03ab134af565e68f9e19cfdcf92ad709bf5b4a188ead05d" exitCode=0 Mar 14 07:50:04 crc kubenswrapper[5129]: I0314 07:50:04.618494 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557910-mv4zh" 
event={"ID":"36229ab4-47a6-4953-bbc0-f5335f5643e1","Type":"ContainerDied","Data":"867fe2f6ada3474fb03ab134af565e68f9e19cfdcf92ad709bf5b4a188ead05d"} Mar 14 07:50:05 crc kubenswrapper[5129]: I0314 07:50:05.928654 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557910-mv4zh" Mar 14 07:50:06 crc kubenswrapper[5129]: I0314 07:50:06.065292 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hw5h\" (UniqueName: \"kubernetes.io/projected/36229ab4-47a6-4953-bbc0-f5335f5643e1-kube-api-access-8hw5h\") pod \"36229ab4-47a6-4953-bbc0-f5335f5643e1\" (UID: \"36229ab4-47a6-4953-bbc0-f5335f5643e1\") " Mar 14 07:50:06 crc kubenswrapper[5129]: I0314 07:50:06.073840 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36229ab4-47a6-4953-bbc0-f5335f5643e1-kube-api-access-8hw5h" (OuterVolumeSpecName: "kube-api-access-8hw5h") pod "36229ab4-47a6-4953-bbc0-f5335f5643e1" (UID: "36229ab4-47a6-4953-bbc0-f5335f5643e1"). InnerVolumeSpecName "kube-api-access-8hw5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:50:06 crc kubenswrapper[5129]: I0314 07:50:06.167815 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hw5h\" (UniqueName: \"kubernetes.io/projected/36229ab4-47a6-4953-bbc0-f5335f5643e1-kube-api-access-8hw5h\") on node \"crc\" DevicePath \"\"" Mar 14 07:50:06 crc kubenswrapper[5129]: I0314 07:50:06.641258 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557910-mv4zh" event={"ID":"36229ab4-47a6-4953-bbc0-f5335f5643e1","Type":"ContainerDied","Data":"5c3aa72cc6751a4a6e3f3e83787869bd1906dcb807f6b77c6c84bcc51a678c8b"} Mar 14 07:50:06 crc kubenswrapper[5129]: I0314 07:50:06.641309 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c3aa72cc6751a4a6e3f3e83787869bd1906dcb807f6b77c6c84bcc51a678c8b" Mar 14 07:50:06 crc kubenswrapper[5129]: I0314 07:50:06.641349 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557910-mv4zh" Mar 14 07:50:07 crc kubenswrapper[5129]: I0314 07:50:07.012077 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557904-zn9vc"] Mar 14 07:50:07 crc kubenswrapper[5129]: I0314 07:50:07.017333 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557904-zn9vc"] Mar 14 07:50:08 crc kubenswrapper[5129]: I0314 07:50:08.051929 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bfced3b-98f3-4942-be06-41ecf1619f1c" path="/var/lib/kubelet/pods/8bfced3b-98f3-4942-be06-41ecf1619f1c/volumes" Mar 14 07:50:09 crc kubenswrapper[5129]: I0314 07:50:09.036408 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:50:09 crc kubenswrapper[5129]: E0314 07:50:09.037156 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:50:21 crc kubenswrapper[5129]: I0314 07:50:21.036888 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:50:21 crc kubenswrapper[5129]: E0314 07:50:21.037702 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:50:22 crc kubenswrapper[5129]: I0314 07:50:22.504852 5129 scope.go:117] "RemoveContainer" containerID="52ada0934457f21cbdc70c28a526a5c6abd8e9bd660c88051c8ce106c55b8bf8" Mar 14 07:50:35 crc kubenswrapper[5129]: I0314 07:50:35.036408 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:50:35 crc kubenswrapper[5129]: E0314 07:50:35.037109 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:50:46 crc kubenswrapper[5129]: I0314 07:50:46.036177 5129 scope.go:117] "RemoveContainer" 
containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:50:46 crc kubenswrapper[5129]: E0314 07:50:46.037026 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:50:58 crc kubenswrapper[5129]: I0314 07:50:58.042852 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:50:58 crc kubenswrapper[5129]: E0314 07:50:58.043569 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:51:10 crc kubenswrapper[5129]: I0314 07:51:10.036078 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:51:10 crc kubenswrapper[5129]: E0314 07:51:10.036858 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:51:25 crc kubenswrapper[5129]: I0314 07:51:25.036150 5129 scope.go:117] 
"RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:51:25 crc kubenswrapper[5129]: E0314 07:51:25.037056 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:51:40 crc kubenswrapper[5129]: I0314 07:51:40.036712 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:51:40 crc kubenswrapper[5129]: E0314 07:51:40.037891 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:51:55 crc kubenswrapper[5129]: I0314 07:51:55.036759 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:51:55 crc kubenswrapper[5129]: E0314 07:51:55.037588 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:52:00 crc kubenswrapper[5129]: I0314 07:52:00.152587 
5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557912-svm7s"] Mar 14 07:52:00 crc kubenswrapper[5129]: E0314 07:52:00.154138 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36229ab4-47a6-4953-bbc0-f5335f5643e1" containerName="oc" Mar 14 07:52:00 crc kubenswrapper[5129]: I0314 07:52:00.154163 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="36229ab4-47a6-4953-bbc0-f5335f5643e1" containerName="oc" Mar 14 07:52:00 crc kubenswrapper[5129]: I0314 07:52:00.154406 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="36229ab4-47a6-4953-bbc0-f5335f5643e1" containerName="oc" Mar 14 07:52:00 crc kubenswrapper[5129]: I0314 07:52:00.155436 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557912-svm7s" Mar 14 07:52:00 crc kubenswrapper[5129]: I0314 07:52:00.161058 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:52:00 crc kubenswrapper[5129]: I0314 07:52:00.161230 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:52:00 crc kubenswrapper[5129]: I0314 07:52:00.161372 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:52:00 crc kubenswrapper[5129]: I0314 07:52:00.168335 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557912-svm7s"] Mar 14 07:52:00 crc kubenswrapper[5129]: I0314 07:52:00.312995 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97lkc\" (UniqueName: \"kubernetes.io/projected/13d7d18c-2c51-4e29-98cf-8bb2d50310bb-kube-api-access-97lkc\") pod \"auto-csr-approver-29557912-svm7s\" (UID: \"13d7d18c-2c51-4e29-98cf-8bb2d50310bb\") " 
pod="openshift-infra/auto-csr-approver-29557912-svm7s" Mar 14 07:52:00 crc kubenswrapper[5129]: I0314 07:52:00.414768 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97lkc\" (UniqueName: \"kubernetes.io/projected/13d7d18c-2c51-4e29-98cf-8bb2d50310bb-kube-api-access-97lkc\") pod \"auto-csr-approver-29557912-svm7s\" (UID: \"13d7d18c-2c51-4e29-98cf-8bb2d50310bb\") " pod="openshift-infra/auto-csr-approver-29557912-svm7s" Mar 14 07:52:00 crc kubenswrapper[5129]: I0314 07:52:00.437170 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97lkc\" (UniqueName: \"kubernetes.io/projected/13d7d18c-2c51-4e29-98cf-8bb2d50310bb-kube-api-access-97lkc\") pod \"auto-csr-approver-29557912-svm7s\" (UID: \"13d7d18c-2c51-4e29-98cf-8bb2d50310bb\") " pod="openshift-infra/auto-csr-approver-29557912-svm7s" Mar 14 07:52:00 crc kubenswrapper[5129]: I0314 07:52:00.477036 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557912-svm7s" Mar 14 07:52:00 crc kubenswrapper[5129]: I0314 07:52:00.958997 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557912-svm7s"] Mar 14 07:52:01 crc kubenswrapper[5129]: I0314 07:52:01.537344 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557912-svm7s" event={"ID":"13d7d18c-2c51-4e29-98cf-8bb2d50310bb","Type":"ContainerStarted","Data":"db680132bb30bf2d29f83d24db1e7ac0e15b795183dec61ac5a9d6ffb32e1709"} Mar 14 07:52:03 crc kubenswrapper[5129]: I0314 07:52:03.559726 5129 generic.go:334] "Generic (PLEG): container finished" podID="13d7d18c-2c51-4e29-98cf-8bb2d50310bb" containerID="043c0f27e708e4c540d9ce3a08e4109ccf4d04569c1c3886a36b76f040578308" exitCode=0 Mar 14 07:52:03 crc kubenswrapper[5129]: I0314 07:52:03.559906 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29557912-svm7s" event={"ID":"13d7d18c-2c51-4e29-98cf-8bb2d50310bb","Type":"ContainerDied","Data":"043c0f27e708e4c540d9ce3a08e4109ccf4d04569c1c3886a36b76f040578308"} Mar 14 07:52:04 crc kubenswrapper[5129]: I0314 07:52:04.938562 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557912-svm7s" Mar 14 07:52:05 crc kubenswrapper[5129]: I0314 07:52:05.091413 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97lkc\" (UniqueName: \"kubernetes.io/projected/13d7d18c-2c51-4e29-98cf-8bb2d50310bb-kube-api-access-97lkc\") pod \"13d7d18c-2c51-4e29-98cf-8bb2d50310bb\" (UID: \"13d7d18c-2c51-4e29-98cf-8bb2d50310bb\") " Mar 14 07:52:05 crc kubenswrapper[5129]: I0314 07:52:05.098802 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13d7d18c-2c51-4e29-98cf-8bb2d50310bb-kube-api-access-97lkc" (OuterVolumeSpecName: "kube-api-access-97lkc") pod "13d7d18c-2c51-4e29-98cf-8bb2d50310bb" (UID: "13d7d18c-2c51-4e29-98cf-8bb2d50310bb"). InnerVolumeSpecName "kube-api-access-97lkc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:52:05 crc kubenswrapper[5129]: I0314 07:52:05.193071 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97lkc\" (UniqueName: \"kubernetes.io/projected/13d7d18c-2c51-4e29-98cf-8bb2d50310bb-kube-api-access-97lkc\") on node \"crc\" DevicePath \"\"" Mar 14 07:52:05 crc kubenswrapper[5129]: I0314 07:52:05.578308 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557912-svm7s" event={"ID":"13d7d18c-2c51-4e29-98cf-8bb2d50310bb","Type":"ContainerDied","Data":"db680132bb30bf2d29f83d24db1e7ac0e15b795183dec61ac5a9d6ffb32e1709"} Mar 14 07:52:05 crc kubenswrapper[5129]: I0314 07:52:05.578373 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db680132bb30bf2d29f83d24db1e7ac0e15b795183dec61ac5a9d6ffb32e1709" Mar 14 07:52:05 crc kubenswrapper[5129]: I0314 07:52:05.578461 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557912-svm7s" Mar 14 07:52:06 crc kubenswrapper[5129]: I0314 07:52:06.020072 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557906-8h2wz"] Mar 14 07:52:06 crc kubenswrapper[5129]: I0314 07:52:06.028308 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557906-8h2wz"] Mar 14 07:52:06 crc kubenswrapper[5129]: I0314 07:52:06.046744 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c922f39-4e21-43dc-bb33-9c2b6d8fdb67" path="/var/lib/kubelet/pods/1c922f39-4e21-43dc-bb33-9c2b6d8fdb67/volumes" Mar 14 07:52:07 crc kubenswrapper[5129]: I0314 07:52:07.036795 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:52:07 crc kubenswrapper[5129]: E0314 07:52:07.037724 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:52:18 crc kubenswrapper[5129]: I0314 07:52:18.040079 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:52:18 crc kubenswrapper[5129]: E0314 07:52:18.040890 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:52:22 crc kubenswrapper[5129]: I0314 07:52:22.643334 5129 scope.go:117] "RemoveContainer" containerID="55e889f187173b9f6c1f2a600ced2f2e4fe22887f8781cd17bd93cffe5dac47f" Mar 14 07:52:33 crc kubenswrapper[5129]: I0314 07:52:33.036384 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:52:33 crc kubenswrapper[5129]: E0314 07:52:33.037221 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:52:46 crc kubenswrapper[5129]: I0314 07:52:46.036570 5129 scope.go:117] "RemoveContainer" 
containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:52:46 crc kubenswrapper[5129]: E0314 07:52:46.037523 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:52:58 crc kubenswrapper[5129]: I0314 07:52:58.045586 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:52:58 crc kubenswrapper[5129]: E0314 07:52:58.046840 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:53:10 crc kubenswrapper[5129]: I0314 07:53:10.036549 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:53:10 crc kubenswrapper[5129]: E0314 07:53:10.037454 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:53:25 crc kubenswrapper[5129]: I0314 07:53:25.036075 5129 scope.go:117] 
"RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:53:25 crc kubenswrapper[5129]: E0314 07:53:25.036873 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:53:38 crc kubenswrapper[5129]: I0314 07:53:38.039898 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:53:38 crc kubenswrapper[5129]: E0314 07:53:38.041666 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 07:53:53 crc kubenswrapper[5129]: I0314 07:53:53.037090 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:53:53 crc kubenswrapper[5129]: I0314 07:53:53.484080 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"8906fc096299b710e84f146f378d355e3f323666ba280a3f46f8886223fc9eb7"} Mar 14 07:54:00 crc kubenswrapper[5129]: I0314 07:54:00.171329 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557914-s27tv"] Mar 14 07:54:00 crc kubenswrapper[5129]: E0314 
07:54:00.172225 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13d7d18c-2c51-4e29-98cf-8bb2d50310bb" containerName="oc" Mar 14 07:54:00 crc kubenswrapper[5129]: I0314 07:54:00.172240 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="13d7d18c-2c51-4e29-98cf-8bb2d50310bb" containerName="oc" Mar 14 07:54:00 crc kubenswrapper[5129]: I0314 07:54:00.172404 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="13d7d18c-2c51-4e29-98cf-8bb2d50310bb" containerName="oc" Mar 14 07:54:00 crc kubenswrapper[5129]: I0314 07:54:00.172853 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557914-s27tv" Mar 14 07:54:00 crc kubenswrapper[5129]: I0314 07:54:00.175993 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:54:00 crc kubenswrapper[5129]: I0314 07:54:00.176064 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:54:00 crc kubenswrapper[5129]: I0314 07:54:00.176405 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:54:00 crc kubenswrapper[5129]: I0314 07:54:00.192276 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557914-s27tv"] Mar 14 07:54:00 crc kubenswrapper[5129]: I0314 07:54:00.266843 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7v66\" (UniqueName: \"kubernetes.io/projected/0ca67976-197b-42a1-9f59-d08349f5568b-kube-api-access-d7v66\") pod \"auto-csr-approver-29557914-s27tv\" (UID: \"0ca67976-197b-42a1-9f59-d08349f5568b\") " pod="openshift-infra/auto-csr-approver-29557914-s27tv" Mar 14 07:54:00 crc kubenswrapper[5129]: I0314 07:54:00.368631 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d7v66\" (UniqueName: \"kubernetes.io/projected/0ca67976-197b-42a1-9f59-d08349f5568b-kube-api-access-d7v66\") pod \"auto-csr-approver-29557914-s27tv\" (UID: \"0ca67976-197b-42a1-9f59-d08349f5568b\") " pod="openshift-infra/auto-csr-approver-29557914-s27tv" Mar 14 07:54:00 crc kubenswrapper[5129]: I0314 07:54:00.388010 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7v66\" (UniqueName: \"kubernetes.io/projected/0ca67976-197b-42a1-9f59-d08349f5568b-kube-api-access-d7v66\") pod \"auto-csr-approver-29557914-s27tv\" (UID: \"0ca67976-197b-42a1-9f59-d08349f5568b\") " pod="openshift-infra/auto-csr-approver-29557914-s27tv" Mar 14 07:54:00 crc kubenswrapper[5129]: I0314 07:54:00.516895 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557914-s27tv" Mar 14 07:54:00 crc kubenswrapper[5129]: I0314 07:54:00.924170 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557914-s27tv"] Mar 14 07:54:01 crc kubenswrapper[5129]: I0314 07:54:01.542173 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557914-s27tv" event={"ID":"0ca67976-197b-42a1-9f59-d08349f5568b","Type":"ContainerStarted","Data":"38ccfe2f8500dfab431ecb8918fd670ab94023066f7dc2abed5adfc8921c303e"} Mar 14 07:54:03 crc kubenswrapper[5129]: I0314 07:54:03.557898 5129 generic.go:334] "Generic (PLEG): container finished" podID="0ca67976-197b-42a1-9f59-d08349f5568b" containerID="aea66245a6ebfe3e584aae19887982142057dd9a68403d75ce2867d45d88df9b" exitCode=0 Mar 14 07:54:03 crc kubenswrapper[5129]: I0314 07:54:03.558012 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557914-s27tv" event={"ID":"0ca67976-197b-42a1-9f59-d08349f5568b","Type":"ContainerDied","Data":"aea66245a6ebfe3e584aae19887982142057dd9a68403d75ce2867d45d88df9b"} Mar 14 07:54:04 crc kubenswrapper[5129]: I0314 
07:54:04.858030 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557914-s27tv" Mar 14 07:54:04 crc kubenswrapper[5129]: I0314 07:54:04.935387 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7v66\" (UniqueName: \"kubernetes.io/projected/0ca67976-197b-42a1-9f59-d08349f5568b-kube-api-access-d7v66\") pod \"0ca67976-197b-42a1-9f59-d08349f5568b\" (UID: \"0ca67976-197b-42a1-9f59-d08349f5568b\") " Mar 14 07:54:04 crc kubenswrapper[5129]: I0314 07:54:04.942162 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ca67976-197b-42a1-9f59-d08349f5568b-kube-api-access-d7v66" (OuterVolumeSpecName: "kube-api-access-d7v66") pod "0ca67976-197b-42a1-9f59-d08349f5568b" (UID: "0ca67976-197b-42a1-9f59-d08349f5568b"). InnerVolumeSpecName "kube-api-access-d7v66". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:54:05 crc kubenswrapper[5129]: I0314 07:54:05.036810 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7v66\" (UniqueName: \"kubernetes.io/projected/0ca67976-197b-42a1-9f59-d08349f5568b-kube-api-access-d7v66\") on node \"crc\" DevicePath \"\"" Mar 14 07:54:05 crc kubenswrapper[5129]: I0314 07:54:05.576475 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557914-s27tv" event={"ID":"0ca67976-197b-42a1-9f59-d08349f5568b","Type":"ContainerDied","Data":"38ccfe2f8500dfab431ecb8918fd670ab94023066f7dc2abed5adfc8921c303e"} Mar 14 07:54:05 crc kubenswrapper[5129]: I0314 07:54:05.576542 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38ccfe2f8500dfab431ecb8918fd670ab94023066f7dc2abed5adfc8921c303e" Mar 14 07:54:05 crc kubenswrapper[5129]: I0314 07:54:05.576630 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557914-s27tv" Mar 14 07:54:05 crc kubenswrapper[5129]: I0314 07:54:05.925999 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557908-h525p"] Mar 14 07:54:05 crc kubenswrapper[5129]: I0314 07:54:05.932400 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557908-h525p"] Mar 14 07:54:06 crc kubenswrapper[5129]: I0314 07:54:06.045313 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44662781-f028-4236-ba9d-6afeb110cb74" path="/var/lib/kubelet/pods/44662781-f028-4236-ba9d-6afeb110cb74/volumes" Mar 14 07:54:22 crc kubenswrapper[5129]: I0314 07:54:22.738218 5129 scope.go:117] "RemoveContainer" containerID="73834bbef5271873104e68c1ef2aff2aef5d0aff2d91900e2fdd708c447a7686" Mar 14 07:54:41 crc kubenswrapper[5129]: I0314 07:54:41.612532 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s96zg"] Mar 14 07:54:41 crc kubenswrapper[5129]: E0314 07:54:41.613363 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ca67976-197b-42a1-9f59-d08349f5568b" containerName="oc" Mar 14 07:54:41 crc kubenswrapper[5129]: I0314 07:54:41.613377 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ca67976-197b-42a1-9f59-d08349f5568b" containerName="oc" Mar 14 07:54:41 crc kubenswrapper[5129]: I0314 07:54:41.613507 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ca67976-197b-42a1-9f59-d08349f5568b" containerName="oc" Mar 14 07:54:41 crc kubenswrapper[5129]: I0314 07:54:41.614459 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s96zg" Mar 14 07:54:41 crc kubenswrapper[5129]: I0314 07:54:41.629966 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s96zg"] Mar 14 07:54:41 crc kubenswrapper[5129]: I0314 07:54:41.768257 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33040bc2-becc-4fe3-8121-719a79dea3e3-catalog-content\") pod \"redhat-operators-s96zg\" (UID: \"33040bc2-becc-4fe3-8121-719a79dea3e3\") " pod="openshift-marketplace/redhat-operators-s96zg" Mar 14 07:54:41 crc kubenswrapper[5129]: I0314 07:54:41.768628 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqhtb\" (UniqueName: \"kubernetes.io/projected/33040bc2-becc-4fe3-8121-719a79dea3e3-kube-api-access-qqhtb\") pod \"redhat-operators-s96zg\" (UID: \"33040bc2-becc-4fe3-8121-719a79dea3e3\") " pod="openshift-marketplace/redhat-operators-s96zg" Mar 14 07:54:41 crc kubenswrapper[5129]: I0314 07:54:41.769062 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33040bc2-becc-4fe3-8121-719a79dea3e3-utilities\") pod \"redhat-operators-s96zg\" (UID: \"33040bc2-becc-4fe3-8121-719a79dea3e3\") " pod="openshift-marketplace/redhat-operators-s96zg" Mar 14 07:54:41 crc kubenswrapper[5129]: I0314 07:54:41.871324 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqhtb\" (UniqueName: \"kubernetes.io/projected/33040bc2-becc-4fe3-8121-719a79dea3e3-kube-api-access-qqhtb\") pod \"redhat-operators-s96zg\" (UID: \"33040bc2-becc-4fe3-8121-719a79dea3e3\") " pod="openshift-marketplace/redhat-operators-s96zg" Mar 14 07:54:41 crc kubenswrapper[5129]: I0314 07:54:41.872148 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33040bc2-becc-4fe3-8121-719a79dea3e3-utilities\") pod \"redhat-operators-s96zg\" (UID: \"33040bc2-becc-4fe3-8121-719a79dea3e3\") " pod="openshift-marketplace/redhat-operators-s96zg" Mar 14 07:54:41 crc kubenswrapper[5129]: I0314 07:54:41.872267 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33040bc2-becc-4fe3-8121-719a79dea3e3-catalog-content\") pod \"redhat-operators-s96zg\" (UID: \"33040bc2-becc-4fe3-8121-719a79dea3e3\") " pod="openshift-marketplace/redhat-operators-s96zg" Mar 14 07:54:41 crc kubenswrapper[5129]: I0314 07:54:41.872962 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33040bc2-becc-4fe3-8121-719a79dea3e3-utilities\") pod \"redhat-operators-s96zg\" (UID: \"33040bc2-becc-4fe3-8121-719a79dea3e3\") " pod="openshift-marketplace/redhat-operators-s96zg" Mar 14 07:54:41 crc kubenswrapper[5129]: I0314 07:54:41.873320 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33040bc2-becc-4fe3-8121-719a79dea3e3-catalog-content\") pod \"redhat-operators-s96zg\" (UID: \"33040bc2-becc-4fe3-8121-719a79dea3e3\") " pod="openshift-marketplace/redhat-operators-s96zg" Mar 14 07:54:41 crc kubenswrapper[5129]: I0314 07:54:41.893726 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqhtb\" (UniqueName: \"kubernetes.io/projected/33040bc2-becc-4fe3-8121-719a79dea3e3-kube-api-access-qqhtb\") pod \"redhat-operators-s96zg\" (UID: \"33040bc2-becc-4fe3-8121-719a79dea3e3\") " pod="openshift-marketplace/redhat-operators-s96zg" Mar 14 07:54:41 crc kubenswrapper[5129]: I0314 07:54:41.934397 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s96zg" Mar 14 07:54:42 crc kubenswrapper[5129]: I0314 07:54:42.342342 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s96zg"] Mar 14 07:54:42 crc kubenswrapper[5129]: I0314 07:54:42.842618 5129 generic.go:334] "Generic (PLEG): container finished" podID="33040bc2-becc-4fe3-8121-719a79dea3e3" containerID="f9fc3a8d52b923427a388cafe0703e0b77a1cd703f98e909a771323273ae75a0" exitCode=0 Mar 14 07:54:42 crc kubenswrapper[5129]: I0314 07:54:42.842671 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s96zg" event={"ID":"33040bc2-becc-4fe3-8121-719a79dea3e3","Type":"ContainerDied","Data":"f9fc3a8d52b923427a388cafe0703e0b77a1cd703f98e909a771323273ae75a0"} Mar 14 07:54:42 crc kubenswrapper[5129]: I0314 07:54:42.842706 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s96zg" event={"ID":"33040bc2-becc-4fe3-8121-719a79dea3e3","Type":"ContainerStarted","Data":"39f0fce2063ec69bf4398bce6c447da86aab876e0606c7bca701866f66a0768a"} Mar 14 07:54:44 crc kubenswrapper[5129]: I0314 07:54:44.865364 5129 generic.go:334] "Generic (PLEG): container finished" podID="33040bc2-becc-4fe3-8121-719a79dea3e3" containerID="a013deab9ee07a8782e6d807cdd1768692dbc1eb266a19956ddc907dd4416dc8" exitCode=0 Mar 14 07:54:44 crc kubenswrapper[5129]: I0314 07:54:44.865515 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s96zg" event={"ID":"33040bc2-becc-4fe3-8121-719a79dea3e3","Type":"ContainerDied","Data":"a013deab9ee07a8782e6d807cdd1768692dbc1eb266a19956ddc907dd4416dc8"} Mar 14 07:54:45 crc kubenswrapper[5129]: I0314 07:54:45.879559 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s96zg" 
event={"ID":"33040bc2-becc-4fe3-8121-719a79dea3e3","Type":"ContainerStarted","Data":"da8c2f7eb7e73cdff18bc12cb30d14292fa3f06eb7602ca41b2945c04b1b0156"} Mar 14 07:54:45 crc kubenswrapper[5129]: I0314 07:54:45.906239 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s96zg" podStartSLOduration=2.451304746 podStartE2EDuration="4.906218263s" podCreationTimestamp="2026-03-14 07:54:41 +0000 UTC" firstStartedPulling="2026-03-14 07:54:42.844350257 +0000 UTC m=+3345.596265441" lastFinishedPulling="2026-03-14 07:54:45.299263734 +0000 UTC m=+3348.051178958" observedRunningTime="2026-03-14 07:54:45.90572262 +0000 UTC m=+3348.657637874" watchObservedRunningTime="2026-03-14 07:54:45.906218263 +0000 UTC m=+3348.658133457" Mar 14 07:54:51 crc kubenswrapper[5129]: I0314 07:54:51.935193 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s96zg" Mar 14 07:54:51 crc kubenswrapper[5129]: I0314 07:54:51.937859 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s96zg" Mar 14 07:54:53 crc kubenswrapper[5129]: I0314 07:54:53.012748 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s96zg" podUID="33040bc2-becc-4fe3-8121-719a79dea3e3" containerName="registry-server" probeResult="failure" output=< Mar 14 07:54:53 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 07:54:53 crc kubenswrapper[5129]: > Mar 14 07:55:00 crc kubenswrapper[5129]: I0314 07:55:00.193632 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vsthf"] Mar 14 07:55:00 crc kubenswrapper[5129]: I0314 07:55:00.195758 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vsthf" Mar 14 07:55:00 crc kubenswrapper[5129]: I0314 07:55:00.199554 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vsthf"] Mar 14 07:55:00 crc kubenswrapper[5129]: I0314 07:55:00.459662 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btstl\" (UniqueName: \"kubernetes.io/projected/d6e7701b-4535-420d-9844-352c8c38a84a-kube-api-access-btstl\") pod \"community-operators-vsthf\" (UID: \"d6e7701b-4535-420d-9844-352c8c38a84a\") " pod="openshift-marketplace/community-operators-vsthf" Mar 14 07:55:00 crc kubenswrapper[5129]: I0314 07:55:00.460303 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6e7701b-4535-420d-9844-352c8c38a84a-utilities\") pod \"community-operators-vsthf\" (UID: \"d6e7701b-4535-420d-9844-352c8c38a84a\") " pod="openshift-marketplace/community-operators-vsthf" Mar 14 07:55:00 crc kubenswrapper[5129]: I0314 07:55:00.461484 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6e7701b-4535-420d-9844-352c8c38a84a-catalog-content\") pod \"community-operators-vsthf\" (UID: \"d6e7701b-4535-420d-9844-352c8c38a84a\") " pod="openshift-marketplace/community-operators-vsthf" Mar 14 07:55:00 crc kubenswrapper[5129]: I0314 07:55:00.562494 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6e7701b-4535-420d-9844-352c8c38a84a-catalog-content\") pod \"community-operators-vsthf\" (UID: \"d6e7701b-4535-420d-9844-352c8c38a84a\") " pod="openshift-marketplace/community-operators-vsthf" Mar 14 07:55:00 crc kubenswrapper[5129]: I0314 07:55:00.562560 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-btstl\" (UniqueName: \"kubernetes.io/projected/d6e7701b-4535-420d-9844-352c8c38a84a-kube-api-access-btstl\") pod \"community-operators-vsthf\" (UID: \"d6e7701b-4535-420d-9844-352c8c38a84a\") " pod="openshift-marketplace/community-operators-vsthf" Mar 14 07:55:00 crc kubenswrapper[5129]: I0314 07:55:00.562586 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6e7701b-4535-420d-9844-352c8c38a84a-utilities\") pod \"community-operators-vsthf\" (UID: \"d6e7701b-4535-420d-9844-352c8c38a84a\") " pod="openshift-marketplace/community-operators-vsthf" Mar 14 07:55:00 crc kubenswrapper[5129]: I0314 07:55:00.563156 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6e7701b-4535-420d-9844-352c8c38a84a-utilities\") pod \"community-operators-vsthf\" (UID: \"d6e7701b-4535-420d-9844-352c8c38a84a\") " pod="openshift-marketplace/community-operators-vsthf" Mar 14 07:55:00 crc kubenswrapper[5129]: I0314 07:55:00.563406 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6e7701b-4535-420d-9844-352c8c38a84a-catalog-content\") pod \"community-operators-vsthf\" (UID: \"d6e7701b-4535-420d-9844-352c8c38a84a\") " pod="openshift-marketplace/community-operators-vsthf" Mar 14 07:55:00 crc kubenswrapper[5129]: I0314 07:55:00.581272 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btstl\" (UniqueName: \"kubernetes.io/projected/d6e7701b-4535-420d-9844-352c8c38a84a-kube-api-access-btstl\") pod \"community-operators-vsthf\" (UID: \"d6e7701b-4535-420d-9844-352c8c38a84a\") " pod="openshift-marketplace/community-operators-vsthf" Mar 14 07:55:00 crc kubenswrapper[5129]: I0314 07:55:00.670092 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vsthf" Mar 14 07:55:01 crc kubenswrapper[5129]: I0314 07:55:01.169874 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vsthf"] Mar 14 07:55:02 crc kubenswrapper[5129]: I0314 07:55:02.003311 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s96zg" Mar 14 07:55:02 crc kubenswrapper[5129]: I0314 07:55:02.029485 5129 generic.go:334] "Generic (PLEG): container finished" podID="d6e7701b-4535-420d-9844-352c8c38a84a" containerID="55b7f263a7b012e0d5156634745586b0d53aa439a4ec1e6ee6bfb519c09f9d54" exitCode=0 Mar 14 07:55:02 crc kubenswrapper[5129]: I0314 07:55:02.029533 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsthf" event={"ID":"d6e7701b-4535-420d-9844-352c8c38a84a","Type":"ContainerDied","Data":"55b7f263a7b012e0d5156634745586b0d53aa439a4ec1e6ee6bfb519c09f9d54"} Mar 14 07:55:02 crc kubenswrapper[5129]: I0314 07:55:02.029561 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsthf" event={"ID":"d6e7701b-4535-420d-9844-352c8c38a84a","Type":"ContainerStarted","Data":"237ad6f4c75e434b17bb5a361bd88d5139d109021f0fb1c1aa877d65c4a5e499"} Mar 14 07:55:02 crc kubenswrapper[5129]: I0314 07:55:02.031490 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 07:55:02 crc kubenswrapper[5129]: I0314 07:55:02.057635 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s96zg" Mar 14 07:55:04 crc kubenswrapper[5129]: I0314 07:55:04.066458 5129 generic.go:334] "Generic (PLEG): container finished" podID="d6e7701b-4535-420d-9844-352c8c38a84a" containerID="168df2a2a436197e5badc96d2f105a3822f45314b97f22949216e4e53aa72004" exitCode=0 Mar 14 07:55:04 crc kubenswrapper[5129]: I0314 
07:55:04.066584 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsthf" event={"ID":"d6e7701b-4535-420d-9844-352c8c38a84a","Type":"ContainerDied","Data":"168df2a2a436197e5badc96d2f105a3822f45314b97f22949216e4e53aa72004"} Mar 14 07:55:04 crc kubenswrapper[5129]: I0314 07:55:04.365030 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s96zg"] Mar 14 07:55:04 crc kubenswrapper[5129]: I0314 07:55:04.365795 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s96zg" podUID="33040bc2-becc-4fe3-8121-719a79dea3e3" containerName="registry-server" containerID="cri-o://da8c2f7eb7e73cdff18bc12cb30d14292fa3f06eb7602ca41b2945c04b1b0156" gracePeriod=2 Mar 14 07:55:05 crc kubenswrapper[5129]: I0314 07:55:05.832430 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s96zg" Mar 14 07:55:05 crc kubenswrapper[5129]: I0314 07:55:05.941385 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33040bc2-becc-4fe3-8121-719a79dea3e3-utilities\") pod \"33040bc2-becc-4fe3-8121-719a79dea3e3\" (UID: \"33040bc2-becc-4fe3-8121-719a79dea3e3\") " Mar 14 07:55:05 crc kubenswrapper[5129]: I0314 07:55:05.941722 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33040bc2-becc-4fe3-8121-719a79dea3e3-catalog-content\") pod \"33040bc2-becc-4fe3-8121-719a79dea3e3\" (UID: \"33040bc2-becc-4fe3-8121-719a79dea3e3\") " Mar 14 07:55:05 crc kubenswrapper[5129]: I0314 07:55:05.941841 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqhtb\" (UniqueName: \"kubernetes.io/projected/33040bc2-becc-4fe3-8121-719a79dea3e3-kube-api-access-qqhtb\") pod 
\"33040bc2-becc-4fe3-8121-719a79dea3e3\" (UID: \"33040bc2-becc-4fe3-8121-719a79dea3e3\") " Mar 14 07:55:05 crc kubenswrapper[5129]: I0314 07:55:05.943326 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33040bc2-becc-4fe3-8121-719a79dea3e3-utilities" (OuterVolumeSpecName: "utilities") pod "33040bc2-becc-4fe3-8121-719a79dea3e3" (UID: "33040bc2-becc-4fe3-8121-719a79dea3e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:55:05 crc kubenswrapper[5129]: I0314 07:55:05.949944 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33040bc2-becc-4fe3-8121-719a79dea3e3-kube-api-access-qqhtb" (OuterVolumeSpecName: "kube-api-access-qqhtb") pod "33040bc2-becc-4fe3-8121-719a79dea3e3" (UID: "33040bc2-becc-4fe3-8121-719a79dea3e3"). InnerVolumeSpecName "kube-api-access-qqhtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.044438 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33040bc2-becc-4fe3-8121-719a79dea3e3-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.044935 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqhtb\" (UniqueName: \"kubernetes.io/projected/33040bc2-becc-4fe3-8121-719a79dea3e3-kube-api-access-qqhtb\") on node \"crc\" DevicePath \"\"" Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.085118 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsthf" event={"ID":"d6e7701b-4535-420d-9844-352c8c38a84a","Type":"ContainerStarted","Data":"39d575c6a68b47d69c7ae017f267c1601fea1639ab7bf8164e23e3a3a3c40d33"} Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.087405 5129 generic.go:334] "Generic (PLEG): container finished" 
podID="33040bc2-becc-4fe3-8121-719a79dea3e3" containerID="da8c2f7eb7e73cdff18bc12cb30d14292fa3f06eb7602ca41b2945c04b1b0156" exitCode=0 Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.087457 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s96zg" event={"ID":"33040bc2-becc-4fe3-8121-719a79dea3e3","Type":"ContainerDied","Data":"da8c2f7eb7e73cdff18bc12cb30d14292fa3f06eb7602ca41b2945c04b1b0156"} Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.087480 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s96zg" Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.087489 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s96zg" event={"ID":"33040bc2-becc-4fe3-8121-719a79dea3e3","Type":"ContainerDied","Data":"39f0fce2063ec69bf4398bce6c447da86aab876e0606c7bca701866f66a0768a"} Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.087513 5129 scope.go:117] "RemoveContainer" containerID="da8c2f7eb7e73cdff18bc12cb30d14292fa3f06eb7602ca41b2945c04b1b0156" Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.088967 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33040bc2-becc-4fe3-8121-719a79dea3e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33040bc2-becc-4fe3-8121-719a79dea3e3" (UID: "33040bc2-becc-4fe3-8121-719a79dea3e3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.111327 5129 scope.go:117] "RemoveContainer" containerID="a013deab9ee07a8782e6d807cdd1768692dbc1eb266a19956ddc907dd4416dc8" Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.112072 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vsthf" podStartSLOduration=2.429122436 podStartE2EDuration="6.112054387s" podCreationTimestamp="2026-03-14 07:55:00 +0000 UTC" firstStartedPulling="2026-03-14 07:55:02.031266268 +0000 UTC m=+3364.783181452" lastFinishedPulling="2026-03-14 07:55:05.714198219 +0000 UTC m=+3368.466113403" observedRunningTime="2026-03-14 07:55:06.104975655 +0000 UTC m=+3368.856890839" watchObservedRunningTime="2026-03-14 07:55:06.112054387 +0000 UTC m=+3368.863969571" Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.132043 5129 scope.go:117] "RemoveContainer" containerID="f9fc3a8d52b923427a388cafe0703e0b77a1cd703f98e909a771323273ae75a0" Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.146383 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33040bc2-becc-4fe3-8121-719a79dea3e3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.173223 5129 scope.go:117] "RemoveContainer" containerID="da8c2f7eb7e73cdff18bc12cb30d14292fa3f06eb7602ca41b2945c04b1b0156" Mar 14 07:55:06 crc kubenswrapper[5129]: E0314 07:55:06.173689 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da8c2f7eb7e73cdff18bc12cb30d14292fa3f06eb7602ca41b2945c04b1b0156\": container with ID starting with da8c2f7eb7e73cdff18bc12cb30d14292fa3f06eb7602ca41b2945c04b1b0156 not found: ID does not exist" containerID="da8c2f7eb7e73cdff18bc12cb30d14292fa3f06eb7602ca41b2945c04b1b0156" Mar 14 07:55:06 crc 
kubenswrapper[5129]: I0314 07:55:06.173722 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da8c2f7eb7e73cdff18bc12cb30d14292fa3f06eb7602ca41b2945c04b1b0156"} err="failed to get container status \"da8c2f7eb7e73cdff18bc12cb30d14292fa3f06eb7602ca41b2945c04b1b0156\": rpc error: code = NotFound desc = could not find container \"da8c2f7eb7e73cdff18bc12cb30d14292fa3f06eb7602ca41b2945c04b1b0156\": container with ID starting with da8c2f7eb7e73cdff18bc12cb30d14292fa3f06eb7602ca41b2945c04b1b0156 not found: ID does not exist" Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.173740 5129 scope.go:117] "RemoveContainer" containerID="a013deab9ee07a8782e6d807cdd1768692dbc1eb266a19956ddc907dd4416dc8" Mar 14 07:55:06 crc kubenswrapper[5129]: E0314 07:55:06.174089 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a013deab9ee07a8782e6d807cdd1768692dbc1eb266a19956ddc907dd4416dc8\": container with ID starting with a013deab9ee07a8782e6d807cdd1768692dbc1eb266a19956ddc907dd4416dc8 not found: ID does not exist" containerID="a013deab9ee07a8782e6d807cdd1768692dbc1eb266a19956ddc907dd4416dc8" Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.174119 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a013deab9ee07a8782e6d807cdd1768692dbc1eb266a19956ddc907dd4416dc8"} err="failed to get container status \"a013deab9ee07a8782e6d807cdd1768692dbc1eb266a19956ddc907dd4416dc8\": rpc error: code = NotFound desc = could not find container \"a013deab9ee07a8782e6d807cdd1768692dbc1eb266a19956ddc907dd4416dc8\": container with ID starting with a013deab9ee07a8782e6d807cdd1768692dbc1eb266a19956ddc907dd4416dc8 not found: ID does not exist" Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.174134 5129 scope.go:117] "RemoveContainer" containerID="f9fc3a8d52b923427a388cafe0703e0b77a1cd703f98e909a771323273ae75a0" Mar 14 
07:55:06 crc kubenswrapper[5129]: E0314 07:55:06.174374 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9fc3a8d52b923427a388cafe0703e0b77a1cd703f98e909a771323273ae75a0\": container with ID starting with f9fc3a8d52b923427a388cafe0703e0b77a1cd703f98e909a771323273ae75a0 not found: ID does not exist" containerID="f9fc3a8d52b923427a388cafe0703e0b77a1cd703f98e909a771323273ae75a0" Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.174390 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9fc3a8d52b923427a388cafe0703e0b77a1cd703f98e909a771323273ae75a0"} err="failed to get container status \"f9fc3a8d52b923427a388cafe0703e0b77a1cd703f98e909a771323273ae75a0\": rpc error: code = NotFound desc = could not find container \"f9fc3a8d52b923427a388cafe0703e0b77a1cd703f98e909a771323273ae75a0\": container with ID starting with f9fc3a8d52b923427a388cafe0703e0b77a1cd703f98e909a771323273ae75a0 not found: ID does not exist" Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.432399 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s96zg"] Mar 14 07:55:06 crc kubenswrapper[5129]: I0314 07:55:06.438957 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s96zg"] Mar 14 07:55:08 crc kubenswrapper[5129]: I0314 07:55:08.045029 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33040bc2-becc-4fe3-8121-719a79dea3e3" path="/var/lib/kubelet/pods/33040bc2-becc-4fe3-8121-719a79dea3e3/volumes" Mar 14 07:55:10 crc kubenswrapper[5129]: I0314 07:55:10.670994 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vsthf" Mar 14 07:55:10 crc kubenswrapper[5129]: I0314 07:55:10.671815 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-vsthf" Mar 14 07:55:10 crc kubenswrapper[5129]: I0314 07:55:10.735479 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vsthf" Mar 14 07:55:11 crc kubenswrapper[5129]: I0314 07:55:11.198643 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vsthf" Mar 14 07:55:11 crc kubenswrapper[5129]: I0314 07:55:11.768236 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vsthf"] Mar 14 07:55:13 crc kubenswrapper[5129]: I0314 07:55:13.151701 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vsthf" podUID="d6e7701b-4535-420d-9844-352c8c38a84a" containerName="registry-server" containerID="cri-o://39d575c6a68b47d69c7ae017f267c1601fea1639ab7bf8164e23e3a3a3c40d33" gracePeriod=2 Mar 14 07:55:15 crc kubenswrapper[5129]: I0314 07:55:15.175956 5129 generic.go:334] "Generic (PLEG): container finished" podID="d6e7701b-4535-420d-9844-352c8c38a84a" containerID="39d575c6a68b47d69c7ae017f267c1601fea1639ab7bf8164e23e3a3a3c40d33" exitCode=0 Mar 14 07:55:15 crc kubenswrapper[5129]: I0314 07:55:15.176019 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsthf" event={"ID":"d6e7701b-4535-420d-9844-352c8c38a84a","Type":"ContainerDied","Data":"39d575c6a68b47d69c7ae017f267c1601fea1639ab7bf8164e23e3a3a3c40d33"} Mar 14 07:55:17 crc kubenswrapper[5129]: I0314 07:55:17.119906 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vsthf" Mar 14 07:55:17 crc kubenswrapper[5129]: I0314 07:55:17.195018 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsthf" event={"ID":"d6e7701b-4535-420d-9844-352c8c38a84a","Type":"ContainerDied","Data":"237ad6f4c75e434b17bb5a361bd88d5139d109021f0fb1c1aa877d65c4a5e499"} Mar 14 07:55:17 crc kubenswrapper[5129]: I0314 07:55:17.195100 5129 scope.go:117] "RemoveContainer" containerID="39d575c6a68b47d69c7ae017f267c1601fea1639ab7bf8164e23e3a3a3c40d33" Mar 14 07:55:17 crc kubenswrapper[5129]: I0314 07:55:17.195103 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vsthf" Mar 14 07:55:17 crc kubenswrapper[5129]: I0314 07:55:17.213499 5129 scope.go:117] "RemoveContainer" containerID="168df2a2a436197e5badc96d2f105a3822f45314b97f22949216e4e53aa72004" Mar 14 07:55:17 crc kubenswrapper[5129]: I0314 07:55:17.228901 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6e7701b-4535-420d-9844-352c8c38a84a-catalog-content\") pod \"d6e7701b-4535-420d-9844-352c8c38a84a\" (UID: \"d6e7701b-4535-420d-9844-352c8c38a84a\") " Mar 14 07:55:17 crc kubenswrapper[5129]: I0314 07:55:17.228963 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btstl\" (UniqueName: \"kubernetes.io/projected/d6e7701b-4535-420d-9844-352c8c38a84a-kube-api-access-btstl\") pod \"d6e7701b-4535-420d-9844-352c8c38a84a\" (UID: \"d6e7701b-4535-420d-9844-352c8c38a84a\") " Mar 14 07:55:17 crc kubenswrapper[5129]: I0314 07:55:17.229006 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6e7701b-4535-420d-9844-352c8c38a84a-utilities\") pod \"d6e7701b-4535-420d-9844-352c8c38a84a\" (UID: 
\"d6e7701b-4535-420d-9844-352c8c38a84a\") " Mar 14 07:55:17 crc kubenswrapper[5129]: I0314 07:55:17.230444 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6e7701b-4535-420d-9844-352c8c38a84a-utilities" (OuterVolumeSpecName: "utilities") pod "d6e7701b-4535-420d-9844-352c8c38a84a" (UID: "d6e7701b-4535-420d-9844-352c8c38a84a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:55:17 crc kubenswrapper[5129]: I0314 07:55:17.234466 5129 scope.go:117] "RemoveContainer" containerID="55b7f263a7b012e0d5156634745586b0d53aa439a4ec1e6ee6bfb519c09f9d54" Mar 14 07:55:17 crc kubenswrapper[5129]: I0314 07:55:17.238805 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e7701b-4535-420d-9844-352c8c38a84a-kube-api-access-btstl" (OuterVolumeSpecName: "kube-api-access-btstl") pod "d6e7701b-4535-420d-9844-352c8c38a84a" (UID: "d6e7701b-4535-420d-9844-352c8c38a84a"). InnerVolumeSpecName "kube-api-access-btstl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:55:17 crc kubenswrapper[5129]: I0314 07:55:17.330700 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btstl\" (UniqueName: \"kubernetes.io/projected/d6e7701b-4535-420d-9844-352c8c38a84a-kube-api-access-btstl\") on node \"crc\" DevicePath \"\"" Mar 14 07:55:17 crc kubenswrapper[5129]: I0314 07:55:17.330753 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6e7701b-4535-420d-9844-352c8c38a84a-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:55:19 crc kubenswrapper[5129]: I0314 07:55:19.151647 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6e7701b-4535-420d-9844-352c8c38a84a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6e7701b-4535-420d-9844-352c8c38a84a" (UID: "d6e7701b-4535-420d-9844-352c8c38a84a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:55:19 crc kubenswrapper[5129]: I0314 07:55:19.160510 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6e7701b-4535-420d-9844-352c8c38a84a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:55:19 crc kubenswrapper[5129]: I0314 07:55:19.326589 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vsthf"] Mar 14 07:55:19 crc kubenswrapper[5129]: I0314 07:55:19.334789 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vsthf"] Mar 14 07:55:20 crc kubenswrapper[5129]: I0314 07:55:20.045574 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e7701b-4535-420d-9844-352c8c38a84a" path="/var/lib/kubelet/pods/d6e7701b-4535-420d-9844-352c8c38a84a/volumes" Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.160256 5129 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557916-d4rdg"] Mar 14 07:56:00 crc kubenswrapper[5129]: E0314 07:56:00.161358 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e7701b-4535-420d-9844-352c8c38a84a" containerName="extract-utilities" Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.161384 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e7701b-4535-420d-9844-352c8c38a84a" containerName="extract-utilities" Mar 14 07:56:00 crc kubenswrapper[5129]: E0314 07:56:00.161408 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33040bc2-becc-4fe3-8121-719a79dea3e3" containerName="extract-utilities" Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.161421 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="33040bc2-becc-4fe3-8121-719a79dea3e3" containerName="extract-utilities" Mar 14 07:56:00 crc kubenswrapper[5129]: E0314 07:56:00.161451 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33040bc2-becc-4fe3-8121-719a79dea3e3" containerName="extract-content" Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.161466 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="33040bc2-becc-4fe3-8121-719a79dea3e3" containerName="extract-content" Mar 14 07:56:00 crc kubenswrapper[5129]: E0314 07:56:00.161488 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33040bc2-becc-4fe3-8121-719a79dea3e3" containerName="registry-server" Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.161500 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="33040bc2-becc-4fe3-8121-719a79dea3e3" containerName="registry-server" Mar 14 07:56:00 crc kubenswrapper[5129]: E0314 07:56:00.161521 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e7701b-4535-420d-9844-352c8c38a84a" containerName="extract-content" Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.161533 5129 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d6e7701b-4535-420d-9844-352c8c38a84a" containerName="extract-content" Mar 14 07:56:00 crc kubenswrapper[5129]: E0314 07:56:00.161570 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e7701b-4535-420d-9844-352c8c38a84a" containerName="registry-server" Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.161582 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e7701b-4535-420d-9844-352c8c38a84a" containerName="registry-server" Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.161870 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e7701b-4535-420d-9844-352c8c38a84a" containerName="registry-server" Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.161902 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="33040bc2-becc-4fe3-8121-719a79dea3e3" containerName="registry-server" Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.162629 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557916-d4rdg" Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.165780 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.165810 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.167713 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.172246 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557916-d4rdg"] Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.305117 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rbsc\" (UniqueName: \"kubernetes.io/projected/30939fec-e426-43a9-8400-0bbb6487042f-kube-api-access-4rbsc\") pod \"auto-csr-approver-29557916-d4rdg\" (UID: \"30939fec-e426-43a9-8400-0bbb6487042f\") " pod="openshift-infra/auto-csr-approver-29557916-d4rdg" Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.406241 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rbsc\" (UniqueName: \"kubernetes.io/projected/30939fec-e426-43a9-8400-0bbb6487042f-kube-api-access-4rbsc\") pod \"auto-csr-approver-29557916-d4rdg\" (UID: \"30939fec-e426-43a9-8400-0bbb6487042f\") " pod="openshift-infra/auto-csr-approver-29557916-d4rdg" Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.428993 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rbsc\" (UniqueName: \"kubernetes.io/projected/30939fec-e426-43a9-8400-0bbb6487042f-kube-api-access-4rbsc\") pod \"auto-csr-approver-29557916-d4rdg\" (UID: \"30939fec-e426-43a9-8400-0bbb6487042f\") " 
pod="openshift-infra/auto-csr-approver-29557916-d4rdg" Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.532100 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557916-d4rdg" Mar 14 07:56:00 crc kubenswrapper[5129]: I0314 07:56:00.950049 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557916-d4rdg"] Mar 14 07:56:01 crc kubenswrapper[5129]: I0314 07:56:01.526584 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557916-d4rdg" event={"ID":"30939fec-e426-43a9-8400-0bbb6487042f","Type":"ContainerStarted","Data":"01d2258d624d672af106d68c6002e126481b4e467144c079ce8fc2aba520143e"} Mar 14 07:56:05 crc kubenswrapper[5129]: I0314 07:56:05.558940 5129 generic.go:334] "Generic (PLEG): container finished" podID="30939fec-e426-43a9-8400-0bbb6487042f" containerID="c2a398bc07be7ee3a375df99861e753adcd0a5cc7ef7d643ee974acddab39455" exitCode=0 Mar 14 07:56:05 crc kubenswrapper[5129]: I0314 07:56:05.559046 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557916-d4rdg" event={"ID":"30939fec-e426-43a9-8400-0bbb6487042f","Type":"ContainerDied","Data":"c2a398bc07be7ee3a375df99861e753adcd0a5cc7ef7d643ee974acddab39455"} Mar 14 07:56:06 crc kubenswrapper[5129]: I0314 07:56:06.918407 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557916-d4rdg" Mar 14 07:56:07 crc kubenswrapper[5129]: I0314 07:56:07.011933 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rbsc\" (UniqueName: \"kubernetes.io/projected/30939fec-e426-43a9-8400-0bbb6487042f-kube-api-access-4rbsc\") pod \"30939fec-e426-43a9-8400-0bbb6487042f\" (UID: \"30939fec-e426-43a9-8400-0bbb6487042f\") " Mar 14 07:56:07 crc kubenswrapper[5129]: I0314 07:56:07.021804 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30939fec-e426-43a9-8400-0bbb6487042f-kube-api-access-4rbsc" (OuterVolumeSpecName: "kube-api-access-4rbsc") pod "30939fec-e426-43a9-8400-0bbb6487042f" (UID: "30939fec-e426-43a9-8400-0bbb6487042f"). InnerVolumeSpecName "kube-api-access-4rbsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:56:07 crc kubenswrapper[5129]: I0314 07:56:07.114031 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rbsc\" (UniqueName: \"kubernetes.io/projected/30939fec-e426-43a9-8400-0bbb6487042f-kube-api-access-4rbsc\") on node \"crc\" DevicePath \"\"" Mar 14 07:56:07 crc kubenswrapper[5129]: I0314 07:56:07.589785 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557916-d4rdg" event={"ID":"30939fec-e426-43a9-8400-0bbb6487042f","Type":"ContainerDied","Data":"01d2258d624d672af106d68c6002e126481b4e467144c079ce8fc2aba520143e"} Mar 14 07:56:07 crc kubenswrapper[5129]: I0314 07:56:07.589826 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01d2258d624d672af106d68c6002e126481b4e467144c079ce8fc2aba520143e" Mar 14 07:56:07 crc kubenswrapper[5129]: I0314 07:56:07.589827 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557916-d4rdg" Mar 14 07:56:08 crc kubenswrapper[5129]: I0314 07:56:08.017108 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557910-mv4zh"] Mar 14 07:56:08 crc kubenswrapper[5129]: I0314 07:56:08.028463 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557910-mv4zh"] Mar 14 07:56:08 crc kubenswrapper[5129]: I0314 07:56:08.044273 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36229ab4-47a6-4953-bbc0-f5335f5643e1" path="/var/lib/kubelet/pods/36229ab4-47a6-4953-bbc0-f5335f5643e1/volumes" Mar 14 07:56:19 crc kubenswrapper[5129]: I0314 07:56:19.574645 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:56:19 crc kubenswrapper[5129]: I0314 07:56:19.575320 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:56:22 crc kubenswrapper[5129]: I0314 07:56:22.871061 5129 scope.go:117] "RemoveContainer" containerID="867fe2f6ada3474fb03ab134af565e68f9e19cfdcf92ad709bf5b4a188ead05d" Mar 14 07:56:49 crc kubenswrapper[5129]: I0314 07:56:49.574779 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:56:49 crc kubenswrapper[5129]: 
I0314 07:56:49.575179 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:57:19 crc kubenswrapper[5129]: I0314 07:57:19.574571 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:57:19 crc kubenswrapper[5129]: I0314 07:57:19.575574 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:57:19 crc kubenswrapper[5129]: I0314 07:57:19.575650 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 07:57:19 crc kubenswrapper[5129]: I0314 07:57:19.576379 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8906fc096299b710e84f146f378d355e3f323666ba280a3f46f8886223fc9eb7"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 07:57:19 crc kubenswrapper[5129]: I0314 07:57:19.576445 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" 
containerName="machine-config-daemon" containerID="cri-o://8906fc096299b710e84f146f378d355e3f323666ba280a3f46f8886223fc9eb7" gracePeriod=600 Mar 14 07:57:20 crc kubenswrapper[5129]: I0314 07:57:20.160350 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="8906fc096299b710e84f146f378d355e3f323666ba280a3f46f8886223fc9eb7" exitCode=0 Mar 14 07:57:20 crc kubenswrapper[5129]: I0314 07:57:20.160462 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"8906fc096299b710e84f146f378d355e3f323666ba280a3f46f8886223fc9eb7"} Mar 14 07:57:20 crc kubenswrapper[5129]: I0314 07:57:20.160759 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1"} Mar 14 07:57:20 crc kubenswrapper[5129]: I0314 07:57:20.160782 5129 scope.go:117] "RemoveContainer" containerID="3056221c3051bf8401ab17aae2764408107453e28b72a0459f768a86e0ca4b38" Mar 14 07:58:00 crc kubenswrapper[5129]: I0314 07:58:00.166655 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557918-g6f4v"] Mar 14 07:58:00 crc kubenswrapper[5129]: E0314 07:58:00.168166 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30939fec-e426-43a9-8400-0bbb6487042f" containerName="oc" Mar 14 07:58:00 crc kubenswrapper[5129]: I0314 07:58:00.168189 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="30939fec-e426-43a9-8400-0bbb6487042f" containerName="oc" Mar 14 07:58:00 crc kubenswrapper[5129]: I0314 07:58:00.168364 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="30939fec-e426-43a9-8400-0bbb6487042f" containerName="oc" Mar 14 07:58:00 
crc kubenswrapper[5129]: I0314 07:58:00.169036 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557918-g6f4v" Mar 14 07:58:00 crc kubenswrapper[5129]: I0314 07:58:00.173295 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 07:58:00 crc kubenswrapper[5129]: I0314 07:58:00.173922 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 07:58:00 crc kubenswrapper[5129]: I0314 07:58:00.174346 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 07:58:00 crc kubenswrapper[5129]: I0314 07:58:00.198552 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557918-g6f4v"] Mar 14 07:58:00 crc kubenswrapper[5129]: I0314 07:58:00.313451 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5whcl\" (UniqueName: \"kubernetes.io/projected/0eb3640d-7197-4f66-a9cc-56febda4eb73-kube-api-access-5whcl\") pod \"auto-csr-approver-29557918-g6f4v\" (UID: \"0eb3640d-7197-4f66-a9cc-56febda4eb73\") " pod="openshift-infra/auto-csr-approver-29557918-g6f4v" Mar 14 07:58:00 crc kubenswrapper[5129]: I0314 07:58:00.415709 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5whcl\" (UniqueName: \"kubernetes.io/projected/0eb3640d-7197-4f66-a9cc-56febda4eb73-kube-api-access-5whcl\") pod \"auto-csr-approver-29557918-g6f4v\" (UID: \"0eb3640d-7197-4f66-a9cc-56febda4eb73\") " pod="openshift-infra/auto-csr-approver-29557918-g6f4v" Mar 14 07:58:00 crc kubenswrapper[5129]: I0314 07:58:00.455672 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5whcl\" (UniqueName: \"kubernetes.io/projected/0eb3640d-7197-4f66-a9cc-56febda4eb73-kube-api-access-5whcl\") 
pod \"auto-csr-approver-29557918-g6f4v\" (UID: \"0eb3640d-7197-4f66-a9cc-56febda4eb73\") " pod="openshift-infra/auto-csr-approver-29557918-g6f4v" Mar 14 07:58:00 crc kubenswrapper[5129]: I0314 07:58:00.504314 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557918-g6f4v" Mar 14 07:58:00 crc kubenswrapper[5129]: I0314 07:58:00.987956 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557918-g6f4v"] Mar 14 07:58:01 crc kubenswrapper[5129]: I0314 07:58:01.575572 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557918-g6f4v" event={"ID":"0eb3640d-7197-4f66-a9cc-56febda4eb73","Type":"ContainerStarted","Data":"135a9fa9599426b328176cfcd3284c474fe6951ef058cddfc33c5944d55dc1fb"} Mar 14 07:58:04 crc kubenswrapper[5129]: I0314 07:58:04.617456 5129 generic.go:334] "Generic (PLEG): container finished" podID="0eb3640d-7197-4f66-a9cc-56febda4eb73" containerID="13c47c3b19304504043d732b1aade54c969881d9900c9f75d325ddb98b87230a" exitCode=0 Mar 14 07:58:04 crc kubenswrapper[5129]: I0314 07:58:04.617592 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557918-g6f4v" event={"ID":"0eb3640d-7197-4f66-a9cc-56febda4eb73","Type":"ContainerDied","Data":"13c47c3b19304504043d732b1aade54c969881d9900c9f75d325ddb98b87230a"} Mar 14 07:58:05 crc kubenswrapper[5129]: I0314 07:58:05.974872 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557918-g6f4v" Mar 14 07:58:06 crc kubenswrapper[5129]: I0314 07:58:06.125669 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5whcl\" (UniqueName: \"kubernetes.io/projected/0eb3640d-7197-4f66-a9cc-56febda4eb73-kube-api-access-5whcl\") pod \"0eb3640d-7197-4f66-a9cc-56febda4eb73\" (UID: \"0eb3640d-7197-4f66-a9cc-56febda4eb73\") " Mar 14 07:58:06 crc kubenswrapper[5129]: I0314 07:58:06.141852 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eb3640d-7197-4f66-a9cc-56febda4eb73-kube-api-access-5whcl" (OuterVolumeSpecName: "kube-api-access-5whcl") pod "0eb3640d-7197-4f66-a9cc-56febda4eb73" (UID: "0eb3640d-7197-4f66-a9cc-56febda4eb73"). InnerVolumeSpecName "kube-api-access-5whcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:58:06 crc kubenswrapper[5129]: I0314 07:58:06.227050 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5whcl\" (UniqueName: \"kubernetes.io/projected/0eb3640d-7197-4f66-a9cc-56febda4eb73-kube-api-access-5whcl\") on node \"crc\" DevicePath \"\"" Mar 14 07:58:06 crc kubenswrapper[5129]: I0314 07:58:06.641052 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557918-g6f4v" event={"ID":"0eb3640d-7197-4f66-a9cc-56febda4eb73","Type":"ContainerDied","Data":"135a9fa9599426b328176cfcd3284c474fe6951ef058cddfc33c5944d55dc1fb"} Mar 14 07:58:06 crc kubenswrapper[5129]: I0314 07:58:06.641133 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="135a9fa9599426b328176cfcd3284c474fe6951ef058cddfc33c5944d55dc1fb" Mar 14 07:58:06 crc kubenswrapper[5129]: I0314 07:58:06.641162 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557918-g6f4v" Mar 14 07:58:07 crc kubenswrapper[5129]: I0314 07:58:07.082837 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557912-svm7s"] Mar 14 07:58:07 crc kubenswrapper[5129]: I0314 07:58:07.094116 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557912-svm7s"] Mar 14 07:58:08 crc kubenswrapper[5129]: I0314 07:58:08.050393 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13d7d18c-2c51-4e29-98cf-8bb2d50310bb" path="/var/lib/kubelet/pods/13d7d18c-2c51-4e29-98cf-8bb2d50310bb/volumes" Mar 14 07:58:12 crc kubenswrapper[5129]: I0314 07:58:12.696370 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s2nfx"] Mar 14 07:58:12 crc kubenswrapper[5129]: E0314 07:58:12.697050 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eb3640d-7197-4f66-a9cc-56febda4eb73" containerName="oc" Mar 14 07:58:12 crc kubenswrapper[5129]: I0314 07:58:12.697067 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eb3640d-7197-4f66-a9cc-56febda4eb73" containerName="oc" Mar 14 07:58:12 crc kubenswrapper[5129]: I0314 07:58:12.697276 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eb3640d-7197-4f66-a9cc-56febda4eb73" containerName="oc" Mar 14 07:58:12 crc kubenswrapper[5129]: I0314 07:58:12.698432 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s2nfx" Mar 14 07:58:12 crc kubenswrapper[5129]: I0314 07:58:12.723217 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s2nfx"] Mar 14 07:58:12 crc kubenswrapper[5129]: I0314 07:58:12.726409 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2436eb-da97-42ed-85b5-864759e488dc-catalog-content\") pod \"certified-operators-s2nfx\" (UID: \"6b2436eb-da97-42ed-85b5-864759e488dc\") " pod="openshift-marketplace/certified-operators-s2nfx" Mar 14 07:58:12 crc kubenswrapper[5129]: I0314 07:58:12.726465 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2436eb-da97-42ed-85b5-864759e488dc-utilities\") pod \"certified-operators-s2nfx\" (UID: \"6b2436eb-da97-42ed-85b5-864759e488dc\") " pod="openshift-marketplace/certified-operators-s2nfx" Mar 14 07:58:12 crc kubenswrapper[5129]: I0314 07:58:12.726499 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzgrm\" (UniqueName: \"kubernetes.io/projected/6b2436eb-da97-42ed-85b5-864759e488dc-kube-api-access-rzgrm\") pod \"certified-operators-s2nfx\" (UID: \"6b2436eb-da97-42ed-85b5-864759e488dc\") " pod="openshift-marketplace/certified-operators-s2nfx" Mar 14 07:58:12 crc kubenswrapper[5129]: I0314 07:58:12.827841 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzgrm\" (UniqueName: \"kubernetes.io/projected/6b2436eb-da97-42ed-85b5-864759e488dc-kube-api-access-rzgrm\") pod \"certified-operators-s2nfx\" (UID: \"6b2436eb-da97-42ed-85b5-864759e488dc\") " pod="openshift-marketplace/certified-operators-s2nfx" Mar 14 07:58:12 crc kubenswrapper[5129]: I0314 07:58:12.827972 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2436eb-da97-42ed-85b5-864759e488dc-catalog-content\") pod \"certified-operators-s2nfx\" (UID: \"6b2436eb-da97-42ed-85b5-864759e488dc\") " pod="openshift-marketplace/certified-operators-s2nfx" Mar 14 07:58:12 crc kubenswrapper[5129]: I0314 07:58:12.827997 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2436eb-da97-42ed-85b5-864759e488dc-utilities\") pod \"certified-operators-s2nfx\" (UID: \"6b2436eb-da97-42ed-85b5-864759e488dc\") " pod="openshift-marketplace/certified-operators-s2nfx" Mar 14 07:58:12 crc kubenswrapper[5129]: I0314 07:58:12.828480 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2436eb-da97-42ed-85b5-864759e488dc-utilities\") pod \"certified-operators-s2nfx\" (UID: \"6b2436eb-da97-42ed-85b5-864759e488dc\") " pod="openshift-marketplace/certified-operators-s2nfx" Mar 14 07:58:12 crc kubenswrapper[5129]: I0314 07:58:12.828791 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2436eb-da97-42ed-85b5-864759e488dc-catalog-content\") pod \"certified-operators-s2nfx\" (UID: \"6b2436eb-da97-42ed-85b5-864759e488dc\") " pod="openshift-marketplace/certified-operators-s2nfx" Mar 14 07:58:12 crc kubenswrapper[5129]: I0314 07:58:12.852922 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzgrm\" (UniqueName: \"kubernetes.io/projected/6b2436eb-da97-42ed-85b5-864759e488dc-kube-api-access-rzgrm\") pod \"certified-operators-s2nfx\" (UID: \"6b2436eb-da97-42ed-85b5-864759e488dc\") " pod="openshift-marketplace/certified-operators-s2nfx" Mar 14 07:58:13 crc kubenswrapper[5129]: I0314 07:58:13.025396 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s2nfx" Mar 14 07:58:13 crc kubenswrapper[5129]: I0314 07:58:13.272779 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s2nfx"] Mar 14 07:58:13 crc kubenswrapper[5129]: W0314 07:58:13.289079 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b2436eb_da97_42ed_85b5_864759e488dc.slice/crio-f21b46edef64287cdd7a93dd19ecc0b8df70568277108fea8212336776d00ad1 WatchSource:0}: Error finding container f21b46edef64287cdd7a93dd19ecc0b8df70568277108fea8212336776d00ad1: Status 404 returned error can't find the container with id f21b46edef64287cdd7a93dd19ecc0b8df70568277108fea8212336776d00ad1 Mar 14 07:58:13 crc kubenswrapper[5129]: I0314 07:58:13.703973 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2nfx" event={"ID":"6b2436eb-da97-42ed-85b5-864759e488dc","Type":"ContainerStarted","Data":"f21b46edef64287cdd7a93dd19ecc0b8df70568277108fea8212336776d00ad1"} Mar 14 07:58:14 crc kubenswrapper[5129]: I0314 07:58:14.716012 5129 generic.go:334] "Generic (PLEG): container finished" podID="6b2436eb-da97-42ed-85b5-864759e488dc" containerID="c641412fa83d1751faa8eb81ad47ecd17d4a246bc109398557bf5717cb0f2154" exitCode=0 Mar 14 07:58:14 crc kubenswrapper[5129]: I0314 07:58:14.716151 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2nfx" event={"ID":"6b2436eb-da97-42ed-85b5-864759e488dc","Type":"ContainerDied","Data":"c641412fa83d1751faa8eb81ad47ecd17d4a246bc109398557bf5717cb0f2154"} Mar 14 07:58:16 crc kubenswrapper[5129]: I0314 07:58:16.754908 5129 generic.go:334] "Generic (PLEG): container finished" podID="6b2436eb-da97-42ed-85b5-864759e488dc" containerID="4533fe8ef033ae49adf7b64372b0b076cbb7a57dd70c45fdd9e0edaf22d90deb" exitCode=0 Mar 14 07:58:16 crc kubenswrapper[5129]: I0314 
07:58:16.755115 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2nfx" event={"ID":"6b2436eb-da97-42ed-85b5-864759e488dc","Type":"ContainerDied","Data":"4533fe8ef033ae49adf7b64372b0b076cbb7a57dd70c45fdd9e0edaf22d90deb"} Mar 14 07:58:17 crc kubenswrapper[5129]: I0314 07:58:17.767195 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2nfx" event={"ID":"6b2436eb-da97-42ed-85b5-864759e488dc","Type":"ContainerStarted","Data":"dde88e9b1d3866e14c0f16348b9d80f361e21094b15e52e15fafc7e7e5341dae"} Mar 14 07:58:17 crc kubenswrapper[5129]: I0314 07:58:17.792787 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s2nfx" podStartSLOduration=3.279164035 podStartE2EDuration="5.792763181s" podCreationTimestamp="2026-03-14 07:58:12 +0000 UTC" firstStartedPulling="2026-03-14 07:58:14.719900736 +0000 UTC m=+3557.471815960" lastFinishedPulling="2026-03-14 07:58:17.233499922 +0000 UTC m=+3559.985415106" observedRunningTime="2026-03-14 07:58:17.791391793 +0000 UTC m=+3560.543307007" watchObservedRunningTime="2026-03-14 07:58:17.792763181 +0000 UTC m=+3560.544678365" Mar 14 07:58:22 crc kubenswrapper[5129]: I0314 07:58:22.981577 5129 scope.go:117] "RemoveContainer" containerID="043c0f27e708e4c540d9ce3a08e4109ccf4d04569c1c3886a36b76f040578308" Mar 14 07:58:23 crc kubenswrapper[5129]: I0314 07:58:23.027822 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s2nfx" Mar 14 07:58:23 crc kubenswrapper[5129]: I0314 07:58:23.031385 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s2nfx" Mar 14 07:58:23 crc kubenswrapper[5129]: I0314 07:58:23.093874 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s2nfx" Mar 14 07:58:23 
crc kubenswrapper[5129]: I0314 07:58:23.881116 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s2nfx" Mar 14 07:58:23 crc kubenswrapper[5129]: I0314 07:58:23.939063 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s2nfx"] Mar 14 07:58:25 crc kubenswrapper[5129]: I0314 07:58:25.843417 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s2nfx" podUID="6b2436eb-da97-42ed-85b5-864759e488dc" containerName="registry-server" containerID="cri-o://dde88e9b1d3866e14c0f16348b9d80f361e21094b15e52e15fafc7e7e5341dae" gracePeriod=2 Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.832415 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2nfx" Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.852826 5129 generic.go:334] "Generic (PLEG): container finished" podID="6b2436eb-da97-42ed-85b5-864759e488dc" containerID="dde88e9b1d3866e14c0f16348b9d80f361e21094b15e52e15fafc7e7e5341dae" exitCode=0 Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.852883 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2nfx" event={"ID":"6b2436eb-da97-42ed-85b5-864759e488dc","Type":"ContainerDied","Data":"dde88e9b1d3866e14c0f16348b9d80f361e21094b15e52e15fafc7e7e5341dae"} Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.852927 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2nfx" event={"ID":"6b2436eb-da97-42ed-85b5-864759e488dc","Type":"ContainerDied","Data":"f21b46edef64287cdd7a93dd19ecc0b8df70568277108fea8212336776d00ad1"} Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.852920 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s2nfx" Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.853003 5129 scope.go:117] "RemoveContainer" containerID="dde88e9b1d3866e14c0f16348b9d80f361e21094b15e52e15fafc7e7e5341dae" Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.910308 5129 scope.go:117] "RemoveContainer" containerID="4533fe8ef033ae49adf7b64372b0b076cbb7a57dd70c45fdd9e0edaf22d90deb" Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.930788 5129 scope.go:117] "RemoveContainer" containerID="c641412fa83d1751faa8eb81ad47ecd17d4a246bc109398557bf5717cb0f2154" Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.966253 5129 scope.go:117] "RemoveContainer" containerID="dde88e9b1d3866e14c0f16348b9d80f361e21094b15e52e15fafc7e7e5341dae" Mar 14 07:58:26 crc kubenswrapper[5129]: E0314 07:58:26.966868 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde88e9b1d3866e14c0f16348b9d80f361e21094b15e52e15fafc7e7e5341dae\": container with ID starting with dde88e9b1d3866e14c0f16348b9d80f361e21094b15e52e15fafc7e7e5341dae not found: ID does not exist" containerID="dde88e9b1d3866e14c0f16348b9d80f361e21094b15e52e15fafc7e7e5341dae" Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.966957 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde88e9b1d3866e14c0f16348b9d80f361e21094b15e52e15fafc7e7e5341dae"} err="failed to get container status \"dde88e9b1d3866e14c0f16348b9d80f361e21094b15e52e15fafc7e7e5341dae\": rpc error: code = NotFound desc = could not find container \"dde88e9b1d3866e14c0f16348b9d80f361e21094b15e52e15fafc7e7e5341dae\": container with ID starting with dde88e9b1d3866e14c0f16348b9d80f361e21094b15e52e15fafc7e7e5341dae not found: ID does not exist" Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.967017 5129 scope.go:117] "RemoveContainer" 
containerID="4533fe8ef033ae49adf7b64372b0b076cbb7a57dd70c45fdd9e0edaf22d90deb" Mar 14 07:58:26 crc kubenswrapper[5129]: E0314 07:58:26.967912 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4533fe8ef033ae49adf7b64372b0b076cbb7a57dd70c45fdd9e0edaf22d90deb\": container with ID starting with 4533fe8ef033ae49adf7b64372b0b076cbb7a57dd70c45fdd9e0edaf22d90deb not found: ID does not exist" containerID="4533fe8ef033ae49adf7b64372b0b076cbb7a57dd70c45fdd9e0edaf22d90deb" Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.967956 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4533fe8ef033ae49adf7b64372b0b076cbb7a57dd70c45fdd9e0edaf22d90deb"} err="failed to get container status \"4533fe8ef033ae49adf7b64372b0b076cbb7a57dd70c45fdd9e0edaf22d90deb\": rpc error: code = NotFound desc = could not find container \"4533fe8ef033ae49adf7b64372b0b076cbb7a57dd70c45fdd9e0edaf22d90deb\": container with ID starting with 4533fe8ef033ae49adf7b64372b0b076cbb7a57dd70c45fdd9e0edaf22d90deb not found: ID does not exist" Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.967980 5129 scope.go:117] "RemoveContainer" containerID="c641412fa83d1751faa8eb81ad47ecd17d4a246bc109398557bf5717cb0f2154" Mar 14 07:58:26 crc kubenswrapper[5129]: E0314 07:58:26.968362 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c641412fa83d1751faa8eb81ad47ecd17d4a246bc109398557bf5717cb0f2154\": container with ID starting with c641412fa83d1751faa8eb81ad47ecd17d4a246bc109398557bf5717cb0f2154 not found: ID does not exist" containerID="c641412fa83d1751faa8eb81ad47ecd17d4a246bc109398557bf5717cb0f2154" Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.968413 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c641412fa83d1751faa8eb81ad47ecd17d4a246bc109398557bf5717cb0f2154"} err="failed to get container status \"c641412fa83d1751faa8eb81ad47ecd17d4a246bc109398557bf5717cb0f2154\": rpc error: code = NotFound desc = could not find container \"c641412fa83d1751faa8eb81ad47ecd17d4a246bc109398557bf5717cb0f2154\": container with ID starting with c641412fa83d1751faa8eb81ad47ecd17d4a246bc109398557bf5717cb0f2154 not found: ID does not exist" Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.979201 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2436eb-da97-42ed-85b5-864759e488dc-utilities\") pod \"6b2436eb-da97-42ed-85b5-864759e488dc\" (UID: \"6b2436eb-da97-42ed-85b5-864759e488dc\") " Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.979291 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2436eb-da97-42ed-85b5-864759e488dc-catalog-content\") pod \"6b2436eb-da97-42ed-85b5-864759e488dc\" (UID: \"6b2436eb-da97-42ed-85b5-864759e488dc\") " Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.979582 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzgrm\" (UniqueName: \"kubernetes.io/projected/6b2436eb-da97-42ed-85b5-864759e488dc-kube-api-access-rzgrm\") pod \"6b2436eb-da97-42ed-85b5-864759e488dc\" (UID: \"6b2436eb-da97-42ed-85b5-864759e488dc\") " Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.982786 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b2436eb-da97-42ed-85b5-864759e488dc-utilities" (OuterVolumeSpecName: "utilities") pod "6b2436eb-da97-42ed-85b5-864759e488dc" (UID: "6b2436eb-da97-42ed-85b5-864759e488dc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:58:26 crc kubenswrapper[5129]: I0314 07:58:26.986768 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b2436eb-da97-42ed-85b5-864759e488dc-kube-api-access-rzgrm" (OuterVolumeSpecName: "kube-api-access-rzgrm") pod "6b2436eb-da97-42ed-85b5-864759e488dc" (UID: "6b2436eb-da97-42ed-85b5-864759e488dc"). InnerVolumeSpecName "kube-api-access-rzgrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:58:27 crc kubenswrapper[5129]: I0314 07:58:27.083306 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzgrm\" (UniqueName: \"kubernetes.io/projected/6b2436eb-da97-42ed-85b5-864759e488dc-kube-api-access-rzgrm\") on node \"crc\" DevicePath \"\"" Mar 14 07:58:27 crc kubenswrapper[5129]: I0314 07:58:27.083382 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2436eb-da97-42ed-85b5-864759e488dc-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:58:27 crc kubenswrapper[5129]: I0314 07:58:27.905088 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b2436eb-da97-42ed-85b5-864759e488dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b2436eb-da97-42ed-85b5-864759e488dc" (UID: "6b2436eb-da97-42ed-85b5-864759e488dc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:58:27 crc kubenswrapper[5129]: I0314 07:58:27.999130 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2436eb-da97-42ed-85b5-864759e488dc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:58:28 crc kubenswrapper[5129]: I0314 07:58:28.115739 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s2nfx"] Mar 14 07:58:28 crc kubenswrapper[5129]: I0314 07:58:28.125485 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s2nfx"] Mar 14 07:58:30 crc kubenswrapper[5129]: I0314 07:58:30.052915 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b2436eb-da97-42ed-85b5-864759e488dc" path="/var/lib/kubelet/pods/6b2436eb-da97-42ed-85b5-864759e488dc/volumes" Mar 14 07:58:49 crc kubenswrapper[5129]: I0314 07:58:49.208255 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rlxdr"] Mar 14 07:58:49 crc kubenswrapper[5129]: E0314 07:58:49.210279 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2436eb-da97-42ed-85b5-864759e488dc" containerName="extract-content" Mar 14 07:58:49 crc kubenswrapper[5129]: I0314 07:58:49.210311 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2436eb-da97-42ed-85b5-864759e488dc" containerName="extract-content" Mar 14 07:58:49 crc kubenswrapper[5129]: E0314 07:58:49.210333 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2436eb-da97-42ed-85b5-864759e488dc" containerName="registry-server" Mar 14 07:58:49 crc kubenswrapper[5129]: I0314 07:58:49.210355 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2436eb-da97-42ed-85b5-864759e488dc" containerName="registry-server" Mar 14 07:58:49 crc kubenswrapper[5129]: E0314 07:58:49.210436 5129 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6b2436eb-da97-42ed-85b5-864759e488dc" containerName="extract-utilities" Mar 14 07:58:49 crc kubenswrapper[5129]: I0314 07:58:49.210455 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2436eb-da97-42ed-85b5-864759e488dc" containerName="extract-utilities" Mar 14 07:58:49 crc kubenswrapper[5129]: I0314 07:58:49.212718 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b2436eb-da97-42ed-85b5-864759e488dc" containerName="registry-server" Mar 14 07:58:49 crc kubenswrapper[5129]: I0314 07:58:49.217939 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rlxdr" Mar 14 07:58:49 crc kubenswrapper[5129]: I0314 07:58:49.254246 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rlxdr"] Mar 14 07:58:49 crc kubenswrapper[5129]: I0314 07:58:49.356343 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c85863fe-75eb-477c-a2f3-799385049e59-catalog-content\") pod \"redhat-marketplace-rlxdr\" (UID: \"c85863fe-75eb-477c-a2f3-799385049e59\") " pod="openshift-marketplace/redhat-marketplace-rlxdr" Mar 14 07:58:49 crc kubenswrapper[5129]: I0314 07:58:49.356486 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2ndq\" (UniqueName: \"kubernetes.io/projected/c85863fe-75eb-477c-a2f3-799385049e59-kube-api-access-s2ndq\") pod \"redhat-marketplace-rlxdr\" (UID: \"c85863fe-75eb-477c-a2f3-799385049e59\") " pod="openshift-marketplace/redhat-marketplace-rlxdr" Mar 14 07:58:49 crc kubenswrapper[5129]: I0314 07:58:49.356675 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c85863fe-75eb-477c-a2f3-799385049e59-utilities\") pod \"redhat-marketplace-rlxdr\" 
(UID: \"c85863fe-75eb-477c-a2f3-799385049e59\") " pod="openshift-marketplace/redhat-marketplace-rlxdr" Mar 14 07:58:49 crc kubenswrapper[5129]: I0314 07:58:49.458353 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c85863fe-75eb-477c-a2f3-799385049e59-catalog-content\") pod \"redhat-marketplace-rlxdr\" (UID: \"c85863fe-75eb-477c-a2f3-799385049e59\") " pod="openshift-marketplace/redhat-marketplace-rlxdr" Mar 14 07:58:49 crc kubenswrapper[5129]: I0314 07:58:49.458449 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2ndq\" (UniqueName: \"kubernetes.io/projected/c85863fe-75eb-477c-a2f3-799385049e59-kube-api-access-s2ndq\") pod \"redhat-marketplace-rlxdr\" (UID: \"c85863fe-75eb-477c-a2f3-799385049e59\") " pod="openshift-marketplace/redhat-marketplace-rlxdr" Mar 14 07:58:49 crc kubenswrapper[5129]: I0314 07:58:49.458507 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c85863fe-75eb-477c-a2f3-799385049e59-utilities\") pod \"redhat-marketplace-rlxdr\" (UID: \"c85863fe-75eb-477c-a2f3-799385049e59\") " pod="openshift-marketplace/redhat-marketplace-rlxdr" Mar 14 07:58:49 crc kubenswrapper[5129]: I0314 07:58:49.458909 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c85863fe-75eb-477c-a2f3-799385049e59-catalog-content\") pod \"redhat-marketplace-rlxdr\" (UID: \"c85863fe-75eb-477c-a2f3-799385049e59\") " pod="openshift-marketplace/redhat-marketplace-rlxdr" Mar 14 07:58:49 crc kubenswrapper[5129]: I0314 07:58:49.459765 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c85863fe-75eb-477c-a2f3-799385049e59-utilities\") pod \"redhat-marketplace-rlxdr\" (UID: \"c85863fe-75eb-477c-a2f3-799385049e59\") " 
pod="openshift-marketplace/redhat-marketplace-rlxdr" Mar 14 07:58:49 crc kubenswrapper[5129]: I0314 07:58:49.492007 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2ndq\" (UniqueName: \"kubernetes.io/projected/c85863fe-75eb-477c-a2f3-799385049e59-kube-api-access-s2ndq\") pod \"redhat-marketplace-rlxdr\" (UID: \"c85863fe-75eb-477c-a2f3-799385049e59\") " pod="openshift-marketplace/redhat-marketplace-rlxdr" Mar 14 07:58:49 crc kubenswrapper[5129]: I0314 07:58:49.596663 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rlxdr" Mar 14 07:58:50 crc kubenswrapper[5129]: I0314 07:58:50.205869 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rlxdr"] Mar 14 07:58:51 crc kubenswrapper[5129]: I0314 07:58:51.071182 5129 generic.go:334] "Generic (PLEG): container finished" podID="c85863fe-75eb-477c-a2f3-799385049e59" containerID="1d3b3f013563cb334b16a0c3811174ccd391c3576a90cb9f63580221e87b67f5" exitCode=0 Mar 14 07:58:51 crc kubenswrapper[5129]: I0314 07:58:51.071276 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rlxdr" event={"ID":"c85863fe-75eb-477c-a2f3-799385049e59","Type":"ContainerDied","Data":"1d3b3f013563cb334b16a0c3811174ccd391c3576a90cb9f63580221e87b67f5"} Mar 14 07:58:51 crc kubenswrapper[5129]: I0314 07:58:51.071512 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rlxdr" event={"ID":"c85863fe-75eb-477c-a2f3-799385049e59","Type":"ContainerStarted","Data":"f4d9e4fedc28da7e5ce7f29a2f9a966dcd8868db1df383578f7d8dbb358549f7"} Mar 14 07:58:52 crc kubenswrapper[5129]: I0314 07:58:52.081107 5129 generic.go:334] "Generic (PLEG): container finished" podID="c85863fe-75eb-477c-a2f3-799385049e59" containerID="9ab43093655db3e411849769819ab8472365510b35918362b8ed3a19c6010b53" exitCode=0 Mar 14 07:58:52 crc 
kubenswrapper[5129]: I0314 07:58:52.081169 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rlxdr" event={"ID":"c85863fe-75eb-477c-a2f3-799385049e59","Type":"ContainerDied","Data":"9ab43093655db3e411849769819ab8472365510b35918362b8ed3a19c6010b53"} Mar 14 07:58:53 crc kubenswrapper[5129]: I0314 07:58:53.093825 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rlxdr" event={"ID":"c85863fe-75eb-477c-a2f3-799385049e59","Type":"ContainerStarted","Data":"966afc89b3442e73dc782edb2ff17ca8fe97af2e7ba4f956775e628391f2988c"} Mar 14 07:58:53 crc kubenswrapper[5129]: I0314 07:58:53.124225 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rlxdr" podStartSLOduration=2.4128134709999998 podStartE2EDuration="4.124200301s" podCreationTimestamp="2026-03-14 07:58:49 +0000 UTC" firstStartedPulling="2026-03-14 07:58:51.073257149 +0000 UTC m=+3593.825172333" lastFinishedPulling="2026-03-14 07:58:52.784643969 +0000 UTC m=+3595.536559163" observedRunningTime="2026-03-14 07:58:53.122830053 +0000 UTC m=+3595.874745277" watchObservedRunningTime="2026-03-14 07:58:53.124200301 +0000 UTC m=+3595.876115495" Mar 14 07:58:59 crc kubenswrapper[5129]: I0314 07:58:59.597061 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rlxdr" Mar 14 07:58:59 crc kubenswrapper[5129]: I0314 07:58:59.597714 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rlxdr" Mar 14 07:58:59 crc kubenswrapper[5129]: I0314 07:58:59.641848 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rlxdr" Mar 14 07:59:00 crc kubenswrapper[5129]: I0314 07:59:00.192247 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-rlxdr" Mar 14 07:59:00 crc kubenswrapper[5129]: I0314 07:59:00.233910 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rlxdr"] Mar 14 07:59:02 crc kubenswrapper[5129]: I0314 07:59:02.159618 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rlxdr" podUID="c85863fe-75eb-477c-a2f3-799385049e59" containerName="registry-server" containerID="cri-o://966afc89b3442e73dc782edb2ff17ca8fe97af2e7ba4f956775e628391f2988c" gracePeriod=2 Mar 14 07:59:02 crc kubenswrapper[5129]: I0314 07:59:02.565385 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rlxdr" Mar 14 07:59:02 crc kubenswrapper[5129]: I0314 07:59:02.680376 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2ndq\" (UniqueName: \"kubernetes.io/projected/c85863fe-75eb-477c-a2f3-799385049e59-kube-api-access-s2ndq\") pod \"c85863fe-75eb-477c-a2f3-799385049e59\" (UID: \"c85863fe-75eb-477c-a2f3-799385049e59\") " Mar 14 07:59:02 crc kubenswrapper[5129]: I0314 07:59:02.680440 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c85863fe-75eb-477c-a2f3-799385049e59-catalog-content\") pod \"c85863fe-75eb-477c-a2f3-799385049e59\" (UID: \"c85863fe-75eb-477c-a2f3-799385049e59\") " Mar 14 07:59:02 crc kubenswrapper[5129]: I0314 07:59:02.680521 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c85863fe-75eb-477c-a2f3-799385049e59-utilities\") pod \"c85863fe-75eb-477c-a2f3-799385049e59\" (UID: \"c85863fe-75eb-477c-a2f3-799385049e59\") " Mar 14 07:59:02 crc kubenswrapper[5129]: I0314 07:59:02.681570 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/c85863fe-75eb-477c-a2f3-799385049e59-utilities" (OuterVolumeSpecName: "utilities") pod "c85863fe-75eb-477c-a2f3-799385049e59" (UID: "c85863fe-75eb-477c-a2f3-799385049e59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:59:02 crc kubenswrapper[5129]: I0314 07:59:02.695481 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c85863fe-75eb-477c-a2f3-799385049e59-kube-api-access-s2ndq" (OuterVolumeSpecName: "kube-api-access-s2ndq") pod "c85863fe-75eb-477c-a2f3-799385049e59" (UID: "c85863fe-75eb-477c-a2f3-799385049e59"). InnerVolumeSpecName "kube-api-access-s2ndq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 07:59:02 crc kubenswrapper[5129]: I0314 07:59:02.782533 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2ndq\" (UniqueName: \"kubernetes.io/projected/c85863fe-75eb-477c-a2f3-799385049e59-kube-api-access-s2ndq\") on node \"crc\" DevicePath \"\"" Mar 14 07:59:02 crc kubenswrapper[5129]: I0314 07:59:02.782570 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c85863fe-75eb-477c-a2f3-799385049e59-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 07:59:02 crc kubenswrapper[5129]: I0314 07:59:02.940073 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c85863fe-75eb-477c-a2f3-799385049e59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c85863fe-75eb-477c-a2f3-799385049e59" (UID: "c85863fe-75eb-477c-a2f3-799385049e59"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 07:59:02 crc kubenswrapper[5129]: I0314 07:59:02.985982 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c85863fe-75eb-477c-a2f3-799385049e59-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 07:59:03 crc kubenswrapper[5129]: I0314 07:59:03.167835 5129 generic.go:334] "Generic (PLEG): container finished" podID="c85863fe-75eb-477c-a2f3-799385049e59" containerID="966afc89b3442e73dc782edb2ff17ca8fe97af2e7ba4f956775e628391f2988c" exitCode=0 Mar 14 07:59:03 crc kubenswrapper[5129]: I0314 07:59:03.167877 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rlxdr" event={"ID":"c85863fe-75eb-477c-a2f3-799385049e59","Type":"ContainerDied","Data":"966afc89b3442e73dc782edb2ff17ca8fe97af2e7ba4f956775e628391f2988c"} Mar 14 07:59:03 crc kubenswrapper[5129]: I0314 07:59:03.167908 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rlxdr" event={"ID":"c85863fe-75eb-477c-a2f3-799385049e59","Type":"ContainerDied","Data":"f4d9e4fedc28da7e5ce7f29a2f9a966dcd8868db1df383578f7d8dbb358549f7"} Mar 14 07:59:03 crc kubenswrapper[5129]: I0314 07:59:03.167927 5129 scope.go:117] "RemoveContainer" containerID="966afc89b3442e73dc782edb2ff17ca8fe97af2e7ba4f956775e628391f2988c" Mar 14 07:59:03 crc kubenswrapper[5129]: I0314 07:59:03.168780 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rlxdr" Mar 14 07:59:03 crc kubenswrapper[5129]: I0314 07:59:03.198783 5129 scope.go:117] "RemoveContainer" containerID="9ab43093655db3e411849769819ab8472365510b35918362b8ed3a19c6010b53" Mar 14 07:59:03 crc kubenswrapper[5129]: I0314 07:59:03.204464 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rlxdr"] Mar 14 07:59:03 crc kubenswrapper[5129]: I0314 07:59:03.210886 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rlxdr"] Mar 14 07:59:03 crc kubenswrapper[5129]: I0314 07:59:03.224067 5129 scope.go:117] "RemoveContainer" containerID="1d3b3f013563cb334b16a0c3811174ccd391c3576a90cb9f63580221e87b67f5" Mar 14 07:59:03 crc kubenswrapper[5129]: I0314 07:59:03.251917 5129 scope.go:117] "RemoveContainer" containerID="966afc89b3442e73dc782edb2ff17ca8fe97af2e7ba4f956775e628391f2988c" Mar 14 07:59:03 crc kubenswrapper[5129]: E0314 07:59:03.252490 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"966afc89b3442e73dc782edb2ff17ca8fe97af2e7ba4f956775e628391f2988c\": container with ID starting with 966afc89b3442e73dc782edb2ff17ca8fe97af2e7ba4f956775e628391f2988c not found: ID does not exist" containerID="966afc89b3442e73dc782edb2ff17ca8fe97af2e7ba4f956775e628391f2988c" Mar 14 07:59:03 crc kubenswrapper[5129]: I0314 07:59:03.252586 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"966afc89b3442e73dc782edb2ff17ca8fe97af2e7ba4f956775e628391f2988c"} err="failed to get container status \"966afc89b3442e73dc782edb2ff17ca8fe97af2e7ba4f956775e628391f2988c\": rpc error: code = NotFound desc = could not find container \"966afc89b3442e73dc782edb2ff17ca8fe97af2e7ba4f956775e628391f2988c\": container with ID starting with 966afc89b3442e73dc782edb2ff17ca8fe97af2e7ba4f956775e628391f2988c not found: 
ID does not exist" Mar 14 07:59:03 crc kubenswrapper[5129]: I0314 07:59:03.252731 5129 scope.go:117] "RemoveContainer" containerID="9ab43093655db3e411849769819ab8472365510b35918362b8ed3a19c6010b53" Mar 14 07:59:03 crc kubenswrapper[5129]: E0314 07:59:03.253283 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ab43093655db3e411849769819ab8472365510b35918362b8ed3a19c6010b53\": container with ID starting with 9ab43093655db3e411849769819ab8472365510b35918362b8ed3a19c6010b53 not found: ID does not exist" containerID="9ab43093655db3e411849769819ab8472365510b35918362b8ed3a19c6010b53" Mar 14 07:59:03 crc kubenswrapper[5129]: I0314 07:59:03.253340 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab43093655db3e411849769819ab8472365510b35918362b8ed3a19c6010b53"} err="failed to get container status \"9ab43093655db3e411849769819ab8472365510b35918362b8ed3a19c6010b53\": rpc error: code = NotFound desc = could not find container \"9ab43093655db3e411849769819ab8472365510b35918362b8ed3a19c6010b53\": container with ID starting with 9ab43093655db3e411849769819ab8472365510b35918362b8ed3a19c6010b53 not found: ID does not exist" Mar 14 07:59:03 crc kubenswrapper[5129]: I0314 07:59:03.253365 5129 scope.go:117] "RemoveContainer" containerID="1d3b3f013563cb334b16a0c3811174ccd391c3576a90cb9f63580221e87b67f5" Mar 14 07:59:03 crc kubenswrapper[5129]: E0314 07:59:03.253923 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d3b3f013563cb334b16a0c3811174ccd391c3576a90cb9f63580221e87b67f5\": container with ID starting with 1d3b3f013563cb334b16a0c3811174ccd391c3576a90cb9f63580221e87b67f5 not found: ID does not exist" containerID="1d3b3f013563cb334b16a0c3811174ccd391c3576a90cb9f63580221e87b67f5" Mar 14 07:59:03 crc kubenswrapper[5129]: I0314 07:59:03.253967 5129 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d3b3f013563cb334b16a0c3811174ccd391c3576a90cb9f63580221e87b67f5"} err="failed to get container status \"1d3b3f013563cb334b16a0c3811174ccd391c3576a90cb9f63580221e87b67f5\": rpc error: code = NotFound desc = could not find container \"1d3b3f013563cb334b16a0c3811174ccd391c3576a90cb9f63580221e87b67f5\": container with ID starting with 1d3b3f013563cb334b16a0c3811174ccd391c3576a90cb9f63580221e87b67f5 not found: ID does not exist" Mar 14 07:59:04 crc kubenswrapper[5129]: I0314 07:59:04.048486 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c85863fe-75eb-477c-a2f3-799385049e59" path="/var/lib/kubelet/pods/c85863fe-75eb-477c-a2f3-799385049e59/volumes" Mar 14 07:59:19 crc kubenswrapper[5129]: I0314 07:59:19.574403 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:59:19 crc kubenswrapper[5129]: I0314 07:59:19.575023 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 07:59:49 crc kubenswrapper[5129]: I0314 07:59:49.574512 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 07:59:49 crc kubenswrapper[5129]: I0314 07:59:49.575237 5129 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.140960 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557920-hhckp"] Mar 14 08:00:00 crc kubenswrapper[5129]: E0314 08:00:00.141977 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c85863fe-75eb-477c-a2f3-799385049e59" containerName="registry-server" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.141993 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85863fe-75eb-477c-a2f3-799385049e59" containerName="registry-server" Mar 14 08:00:00 crc kubenswrapper[5129]: E0314 08:00:00.142012 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c85863fe-75eb-477c-a2f3-799385049e59" containerName="extract-content" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.142018 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85863fe-75eb-477c-a2f3-799385049e59" containerName="extract-content" Mar 14 08:00:00 crc kubenswrapper[5129]: E0314 08:00:00.142034 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c85863fe-75eb-477c-a2f3-799385049e59" containerName="extract-utilities" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.142042 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85863fe-75eb-477c-a2f3-799385049e59" containerName="extract-utilities" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.142213 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="c85863fe-75eb-477c-a2f3-799385049e59" containerName="registry-server" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.142903 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557920-hhckp" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.147947 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.147970 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.147995 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.148584 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg"] Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.149724 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.151889 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.151970 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.156703 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557920-hhckp"] Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.162692 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg"] Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.226403 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/6f53960c-a7ff-458b-a85c-bb252e8da404-config-volume\") pod \"collect-profiles-29557920-rrqbg\" (UID: \"6f53960c-a7ff-458b-a85c-bb252e8da404\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.226459 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmw8h\" (UniqueName: \"kubernetes.io/projected/6f53960c-a7ff-458b-a85c-bb252e8da404-kube-api-access-qmw8h\") pod \"collect-profiles-29557920-rrqbg\" (UID: \"6f53960c-a7ff-458b-a85c-bb252e8da404\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.226507 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f53960c-a7ff-458b-a85c-bb252e8da404-secret-volume\") pod \"collect-profiles-29557920-rrqbg\" (UID: \"6f53960c-a7ff-458b-a85c-bb252e8da404\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.226525 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh9fn\" (UniqueName: \"kubernetes.io/projected/e804cb3a-bd5e-4870-9e6d-d7406633ca41-kube-api-access-nh9fn\") pod \"auto-csr-approver-29557920-hhckp\" (UID: \"e804cb3a-bd5e-4870-9e6d-d7406633ca41\") " pod="openshift-infra/auto-csr-approver-29557920-hhckp" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.327740 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f53960c-a7ff-458b-a85c-bb252e8da404-config-volume\") pod \"collect-profiles-29557920-rrqbg\" (UID: \"6f53960c-a7ff-458b-a85c-bb252e8da404\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg" Mar 14 
08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.327799 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmw8h\" (UniqueName: \"kubernetes.io/projected/6f53960c-a7ff-458b-a85c-bb252e8da404-kube-api-access-qmw8h\") pod \"collect-profiles-29557920-rrqbg\" (UID: \"6f53960c-a7ff-458b-a85c-bb252e8da404\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.327848 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f53960c-a7ff-458b-a85c-bb252e8da404-secret-volume\") pod \"collect-profiles-29557920-rrqbg\" (UID: \"6f53960c-a7ff-458b-a85c-bb252e8da404\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.327868 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh9fn\" (UniqueName: \"kubernetes.io/projected/e804cb3a-bd5e-4870-9e6d-d7406633ca41-kube-api-access-nh9fn\") pod \"auto-csr-approver-29557920-hhckp\" (UID: \"e804cb3a-bd5e-4870-9e6d-d7406633ca41\") " pod="openshift-infra/auto-csr-approver-29557920-hhckp" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.328678 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f53960c-a7ff-458b-a85c-bb252e8da404-config-volume\") pod \"collect-profiles-29557920-rrqbg\" (UID: \"6f53960c-a7ff-458b-a85c-bb252e8da404\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.333342 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f53960c-a7ff-458b-a85c-bb252e8da404-secret-volume\") pod \"collect-profiles-29557920-rrqbg\" (UID: 
\"6f53960c-a7ff-458b-a85c-bb252e8da404\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.345052 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh9fn\" (UniqueName: \"kubernetes.io/projected/e804cb3a-bd5e-4870-9e6d-d7406633ca41-kube-api-access-nh9fn\") pod \"auto-csr-approver-29557920-hhckp\" (UID: \"e804cb3a-bd5e-4870-9e6d-d7406633ca41\") " pod="openshift-infra/auto-csr-approver-29557920-hhckp" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.352164 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmw8h\" (UniqueName: \"kubernetes.io/projected/6f53960c-a7ff-458b-a85c-bb252e8da404-kube-api-access-qmw8h\") pod \"collect-profiles-29557920-rrqbg\" (UID: \"6f53960c-a7ff-458b-a85c-bb252e8da404\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.460999 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557920-hhckp" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.470272 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg" Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.911667 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg"] Mar 14 08:00:00 crc kubenswrapper[5129]: I0314 08:00:00.960090 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557920-hhckp"] Mar 14 08:00:00 crc kubenswrapper[5129]: W0314 08:00:00.963754 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode804cb3a_bd5e_4870_9e6d_d7406633ca41.slice/crio-df7d6d7afe436412a231dec95e1a1c490619574ee31dda8ab571c21390a2e54d WatchSource:0}: Error finding container df7d6d7afe436412a231dec95e1a1c490619574ee31dda8ab571c21390a2e54d: Status 404 returned error can't find the container with id df7d6d7afe436412a231dec95e1a1c490619574ee31dda8ab571c21390a2e54d Mar 14 08:00:01 crc kubenswrapper[5129]: I0314 08:00:01.656626 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557920-hhckp" event={"ID":"e804cb3a-bd5e-4870-9e6d-d7406633ca41","Type":"ContainerStarted","Data":"df7d6d7afe436412a231dec95e1a1c490619574ee31dda8ab571c21390a2e54d"} Mar 14 08:00:01 crc kubenswrapper[5129]: I0314 08:00:01.659064 5129 generic.go:334] "Generic (PLEG): container finished" podID="6f53960c-a7ff-458b-a85c-bb252e8da404" containerID="327635a2f8d665b5ace7520c44a2bab8a326b59cc630f7dec69f389d499e6420" exitCode=0 Mar 14 08:00:01 crc kubenswrapper[5129]: I0314 08:00:01.659092 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg" event={"ID":"6f53960c-a7ff-458b-a85c-bb252e8da404","Type":"ContainerDied","Data":"327635a2f8d665b5ace7520c44a2bab8a326b59cc630f7dec69f389d499e6420"} Mar 14 08:00:01 crc kubenswrapper[5129]: I0314 08:00:01.659107 5129 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg" event={"ID":"6f53960c-a7ff-458b-a85c-bb252e8da404","Type":"ContainerStarted","Data":"755be01637a952cccda06b7fce76163a7ade859217c8b982c4d1f9e8b19e5232"} Mar 14 08:00:02 crc kubenswrapper[5129]: I0314 08:00:02.922142 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg" Mar 14 08:00:03 crc kubenswrapper[5129]: I0314 08:00:03.072291 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f53960c-a7ff-458b-a85c-bb252e8da404-config-volume\") pod \"6f53960c-a7ff-458b-a85c-bb252e8da404\" (UID: \"6f53960c-a7ff-458b-a85c-bb252e8da404\") " Mar 14 08:00:03 crc kubenswrapper[5129]: I0314 08:00:03.072349 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmw8h\" (UniqueName: \"kubernetes.io/projected/6f53960c-a7ff-458b-a85c-bb252e8da404-kube-api-access-qmw8h\") pod \"6f53960c-a7ff-458b-a85c-bb252e8da404\" (UID: \"6f53960c-a7ff-458b-a85c-bb252e8da404\") " Mar 14 08:00:03 crc kubenswrapper[5129]: I0314 08:00:03.072444 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f53960c-a7ff-458b-a85c-bb252e8da404-secret-volume\") pod \"6f53960c-a7ff-458b-a85c-bb252e8da404\" (UID: \"6f53960c-a7ff-458b-a85c-bb252e8da404\") " Mar 14 08:00:03 crc kubenswrapper[5129]: I0314 08:00:03.073166 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f53960c-a7ff-458b-a85c-bb252e8da404-config-volume" (OuterVolumeSpecName: "config-volume") pod "6f53960c-a7ff-458b-a85c-bb252e8da404" (UID: "6f53960c-a7ff-458b-a85c-bb252e8da404"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:00:03 crc kubenswrapper[5129]: I0314 08:00:03.078854 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f53960c-a7ff-458b-a85c-bb252e8da404-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6f53960c-a7ff-458b-a85c-bb252e8da404" (UID: "6f53960c-a7ff-458b-a85c-bb252e8da404"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:00:03 crc kubenswrapper[5129]: I0314 08:00:03.078913 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f53960c-a7ff-458b-a85c-bb252e8da404-kube-api-access-qmw8h" (OuterVolumeSpecName: "kube-api-access-qmw8h") pod "6f53960c-a7ff-458b-a85c-bb252e8da404" (UID: "6f53960c-a7ff-458b-a85c-bb252e8da404"). InnerVolumeSpecName "kube-api-access-qmw8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:00:03 crc kubenswrapper[5129]: I0314 08:00:03.174521 5129 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f53960c-a7ff-458b-a85c-bb252e8da404-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:00:03 crc kubenswrapper[5129]: I0314 08:00:03.174561 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmw8h\" (UniqueName: \"kubernetes.io/projected/6f53960c-a7ff-458b-a85c-bb252e8da404-kube-api-access-qmw8h\") on node \"crc\" DevicePath \"\"" Mar 14 08:00:03 crc kubenswrapper[5129]: I0314 08:00:03.174577 5129 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f53960c-a7ff-458b-a85c-bb252e8da404-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:00:03 crc kubenswrapper[5129]: I0314 08:00:03.672569 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg" 
event={"ID":"6f53960c-a7ff-458b-a85c-bb252e8da404","Type":"ContainerDied","Data":"755be01637a952cccda06b7fce76163a7ade859217c8b982c4d1f9e8b19e5232"} Mar 14 08:00:03 crc kubenswrapper[5129]: I0314 08:00:03.672638 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="755be01637a952cccda06b7fce76163a7ade859217c8b982c4d1f9e8b19e5232" Mar 14 08:00:03 crc kubenswrapper[5129]: I0314 08:00:03.672611 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg" Mar 14 08:00:03 crc kubenswrapper[5129]: I0314 08:00:03.996124 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht"] Mar 14 08:00:04 crc kubenswrapper[5129]: I0314 08:00:04.001962 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557875-6nwht"] Mar 14 08:00:04 crc kubenswrapper[5129]: I0314 08:00:04.067046 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ec477a7-244e-4c14-a6b8-f7d09cb2777d" path="/var/lib/kubelet/pods/4ec477a7-244e-4c14-a6b8-f7d09cb2777d/volumes" Mar 14 08:00:05 crc kubenswrapper[5129]: I0314 08:00:05.692334 5129 generic.go:334] "Generic (PLEG): container finished" podID="e804cb3a-bd5e-4870-9e6d-d7406633ca41" containerID="d1eec9e39d314b8f723aeb67b7243cd33fd9825d84766e407e6d227662190334" exitCode=0 Mar 14 08:00:05 crc kubenswrapper[5129]: I0314 08:00:05.692476 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557920-hhckp" event={"ID":"e804cb3a-bd5e-4870-9e6d-d7406633ca41","Type":"ContainerDied","Data":"d1eec9e39d314b8f723aeb67b7243cd33fd9825d84766e407e6d227662190334"} Mar 14 08:00:06 crc kubenswrapper[5129]: I0314 08:00:06.979035 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557920-hhckp" Mar 14 08:00:07 crc kubenswrapper[5129]: I0314 08:00:07.126863 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh9fn\" (UniqueName: \"kubernetes.io/projected/e804cb3a-bd5e-4870-9e6d-d7406633ca41-kube-api-access-nh9fn\") pod \"e804cb3a-bd5e-4870-9e6d-d7406633ca41\" (UID: \"e804cb3a-bd5e-4870-9e6d-d7406633ca41\") " Mar 14 08:00:07 crc kubenswrapper[5129]: I0314 08:00:07.133638 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e804cb3a-bd5e-4870-9e6d-d7406633ca41-kube-api-access-nh9fn" (OuterVolumeSpecName: "kube-api-access-nh9fn") pod "e804cb3a-bd5e-4870-9e6d-d7406633ca41" (UID: "e804cb3a-bd5e-4870-9e6d-d7406633ca41"). InnerVolumeSpecName "kube-api-access-nh9fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:00:07 crc kubenswrapper[5129]: I0314 08:00:07.228448 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh9fn\" (UniqueName: \"kubernetes.io/projected/e804cb3a-bd5e-4870-9e6d-d7406633ca41-kube-api-access-nh9fn\") on node \"crc\" DevicePath \"\"" Mar 14 08:00:07 crc kubenswrapper[5129]: I0314 08:00:07.714522 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557920-hhckp" event={"ID":"e804cb3a-bd5e-4870-9e6d-d7406633ca41","Type":"ContainerDied","Data":"df7d6d7afe436412a231dec95e1a1c490619574ee31dda8ab571c21390a2e54d"} Mar 14 08:00:07 crc kubenswrapper[5129]: I0314 08:00:07.714562 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df7d6d7afe436412a231dec95e1a1c490619574ee31dda8ab571c21390a2e54d" Mar 14 08:00:07 crc kubenswrapper[5129]: I0314 08:00:07.714636 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557920-hhckp" Mar 14 08:00:08 crc kubenswrapper[5129]: I0314 08:00:08.053202 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557914-s27tv"] Mar 14 08:00:08 crc kubenswrapper[5129]: I0314 08:00:08.057051 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557914-s27tv"] Mar 14 08:00:10 crc kubenswrapper[5129]: I0314 08:00:10.053521 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ca67976-197b-42a1-9f59-d08349f5568b" path="/var/lib/kubelet/pods/0ca67976-197b-42a1-9f59-d08349f5568b/volumes" Mar 14 08:00:19 crc kubenswrapper[5129]: I0314 08:00:19.574386 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:00:19 crc kubenswrapper[5129]: I0314 08:00:19.575018 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:00:19 crc kubenswrapper[5129]: I0314 08:00:19.575067 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 08:00:19 crc kubenswrapper[5129]: I0314 08:00:19.575779 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:00:19 crc kubenswrapper[5129]: I0314 08:00:19.575855 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" gracePeriod=600 Mar 14 08:00:19 crc kubenswrapper[5129]: E0314 08:00:19.720996 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:00:19 crc kubenswrapper[5129]: I0314 08:00:19.811621 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" exitCode=0 Mar 14 08:00:19 crc kubenswrapper[5129]: I0314 08:00:19.811670 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1"} Mar 14 08:00:19 crc kubenswrapper[5129]: I0314 08:00:19.811705 5129 scope.go:117] "RemoveContainer" containerID="8906fc096299b710e84f146f378d355e3f323666ba280a3f46f8886223fc9eb7" Mar 14 08:00:19 crc kubenswrapper[5129]: I0314 08:00:19.812375 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:00:19 crc kubenswrapper[5129]: E0314 08:00:19.812974 5129 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:00:23 crc kubenswrapper[5129]: I0314 08:00:23.130079 5129 scope.go:117] "RemoveContainer" containerID="aea66245a6ebfe3e584aae19887982142057dd9a68403d75ce2867d45d88df9b" Mar 14 08:00:23 crc kubenswrapper[5129]: I0314 08:00:23.193840 5129 scope.go:117] "RemoveContainer" containerID="26ca9429c6b3fc835f9f8dfd6e7651b540f2cf04b5223bc72de223a735b3200b" Mar 14 08:00:35 crc kubenswrapper[5129]: I0314 08:00:35.036792 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:00:35 crc kubenswrapper[5129]: E0314 08:00:35.037744 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:00:50 crc kubenswrapper[5129]: I0314 08:00:50.036164 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:00:50 crc kubenswrapper[5129]: E0314 08:00:50.037014 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:01:01 crc kubenswrapper[5129]: I0314 08:01:01.037142 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:01:01 crc kubenswrapper[5129]: E0314 08:01:01.037906 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:01:15 crc kubenswrapper[5129]: I0314 08:01:15.036335 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:01:15 crc kubenswrapper[5129]: E0314 08:01:15.037340 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:01:27 crc kubenswrapper[5129]: I0314 08:01:27.036428 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:01:27 crc kubenswrapper[5129]: E0314 08:01:27.037245 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:01:40 crc kubenswrapper[5129]: I0314 08:01:40.036959 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:01:40 crc kubenswrapper[5129]: E0314 08:01:40.038154 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:01:53 crc kubenswrapper[5129]: I0314 08:01:53.037531 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:01:53 crc kubenswrapper[5129]: E0314 08:01:53.038894 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:02:00 crc kubenswrapper[5129]: I0314 08:02:00.149423 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557922-c2w6m"] Mar 14 08:02:00 crc kubenswrapper[5129]: E0314 08:02:00.150267 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f53960c-a7ff-458b-a85c-bb252e8da404" containerName="collect-profiles" Mar 14 08:02:00 crc 
kubenswrapper[5129]: I0314 08:02:00.150279 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f53960c-a7ff-458b-a85c-bb252e8da404" containerName="collect-profiles" Mar 14 08:02:00 crc kubenswrapper[5129]: E0314 08:02:00.150304 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e804cb3a-bd5e-4870-9e6d-d7406633ca41" containerName="oc" Mar 14 08:02:00 crc kubenswrapper[5129]: I0314 08:02:00.150310 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e804cb3a-bd5e-4870-9e6d-d7406633ca41" containerName="oc" Mar 14 08:02:00 crc kubenswrapper[5129]: I0314 08:02:00.150451 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e804cb3a-bd5e-4870-9e6d-d7406633ca41" containerName="oc" Mar 14 08:02:00 crc kubenswrapper[5129]: I0314 08:02:00.150464 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f53960c-a7ff-458b-a85c-bb252e8da404" containerName="collect-profiles" Mar 14 08:02:00 crc kubenswrapper[5129]: I0314 08:02:00.150942 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557922-c2w6m" Mar 14 08:02:00 crc kubenswrapper[5129]: I0314 08:02:00.153542 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:02:00 crc kubenswrapper[5129]: I0314 08:02:00.156936 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:02:00 crc kubenswrapper[5129]: I0314 08:02:00.156988 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:02:00 crc kubenswrapper[5129]: I0314 08:02:00.161201 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557922-c2w6m"] Mar 14 08:02:00 crc kubenswrapper[5129]: I0314 08:02:00.307289 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llzj4\" (UniqueName: \"kubernetes.io/projected/f2eac466-4494-4848-8384-b8989b3f4a13-kube-api-access-llzj4\") pod \"auto-csr-approver-29557922-c2w6m\" (UID: \"f2eac466-4494-4848-8384-b8989b3f4a13\") " pod="openshift-infra/auto-csr-approver-29557922-c2w6m" Mar 14 08:02:00 crc kubenswrapper[5129]: I0314 08:02:00.409909 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llzj4\" (UniqueName: \"kubernetes.io/projected/f2eac466-4494-4848-8384-b8989b3f4a13-kube-api-access-llzj4\") pod \"auto-csr-approver-29557922-c2w6m\" (UID: \"f2eac466-4494-4848-8384-b8989b3f4a13\") " pod="openshift-infra/auto-csr-approver-29557922-c2w6m" Mar 14 08:02:00 crc kubenswrapper[5129]: I0314 08:02:00.440671 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llzj4\" (UniqueName: \"kubernetes.io/projected/f2eac466-4494-4848-8384-b8989b3f4a13-kube-api-access-llzj4\") pod \"auto-csr-approver-29557922-c2w6m\" (UID: \"f2eac466-4494-4848-8384-b8989b3f4a13\") " 
pod="openshift-infra/auto-csr-approver-29557922-c2w6m" Mar 14 08:02:00 crc kubenswrapper[5129]: I0314 08:02:00.479956 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557922-c2w6m" Mar 14 08:02:00 crc kubenswrapper[5129]: I0314 08:02:00.947882 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557922-c2w6m"] Mar 14 08:02:00 crc kubenswrapper[5129]: I0314 08:02:00.957402 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:02:01 crc kubenswrapper[5129]: I0314 08:02:01.622189 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557922-c2w6m" event={"ID":"f2eac466-4494-4848-8384-b8989b3f4a13","Type":"ContainerStarted","Data":"20996ac4f930d99dedb9cdf6ca2439867db4e00bf4ec3007675e591aabdce486"} Mar 14 08:02:02 crc kubenswrapper[5129]: I0314 08:02:02.634362 5129 generic.go:334] "Generic (PLEG): container finished" podID="f2eac466-4494-4848-8384-b8989b3f4a13" containerID="bd1fe37b0d34301f95a99d8dd4ecbb75227e9b7b87bbaa203affa5f2203bc426" exitCode=0 Mar 14 08:02:02 crc kubenswrapper[5129]: I0314 08:02:02.634462 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557922-c2w6m" event={"ID":"f2eac466-4494-4848-8384-b8989b3f4a13","Type":"ContainerDied","Data":"bd1fe37b0d34301f95a99d8dd4ecbb75227e9b7b87bbaa203affa5f2203bc426"} Mar 14 08:02:03 crc kubenswrapper[5129]: I0314 08:02:03.935476 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557922-c2w6m" Mar 14 08:02:04 crc kubenswrapper[5129]: I0314 08:02:04.070330 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llzj4\" (UniqueName: \"kubernetes.io/projected/f2eac466-4494-4848-8384-b8989b3f4a13-kube-api-access-llzj4\") pod \"f2eac466-4494-4848-8384-b8989b3f4a13\" (UID: \"f2eac466-4494-4848-8384-b8989b3f4a13\") " Mar 14 08:02:04 crc kubenswrapper[5129]: I0314 08:02:04.081854 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2eac466-4494-4848-8384-b8989b3f4a13-kube-api-access-llzj4" (OuterVolumeSpecName: "kube-api-access-llzj4") pod "f2eac466-4494-4848-8384-b8989b3f4a13" (UID: "f2eac466-4494-4848-8384-b8989b3f4a13"). InnerVolumeSpecName "kube-api-access-llzj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:02:04 crc kubenswrapper[5129]: I0314 08:02:04.172533 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llzj4\" (UniqueName: \"kubernetes.io/projected/f2eac466-4494-4848-8384-b8989b3f4a13-kube-api-access-llzj4\") on node \"crc\" DevicePath \"\"" Mar 14 08:02:04 crc kubenswrapper[5129]: I0314 08:02:04.652586 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557922-c2w6m" event={"ID":"f2eac466-4494-4848-8384-b8989b3f4a13","Type":"ContainerDied","Data":"20996ac4f930d99dedb9cdf6ca2439867db4e00bf4ec3007675e591aabdce486"} Mar 14 08:02:04 crc kubenswrapper[5129]: I0314 08:02:04.652659 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20996ac4f930d99dedb9cdf6ca2439867db4e00bf4ec3007675e591aabdce486" Mar 14 08:02:04 crc kubenswrapper[5129]: I0314 08:02:04.652696 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557922-c2w6m" Mar 14 08:02:05 crc kubenswrapper[5129]: I0314 08:02:05.002540 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557916-d4rdg"] Mar 14 08:02:05 crc kubenswrapper[5129]: I0314 08:02:05.007056 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557916-d4rdg"] Mar 14 08:02:06 crc kubenswrapper[5129]: I0314 08:02:06.045883 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30939fec-e426-43a9-8400-0bbb6487042f" path="/var/lib/kubelet/pods/30939fec-e426-43a9-8400-0bbb6487042f/volumes" Mar 14 08:02:07 crc kubenswrapper[5129]: I0314 08:02:07.037114 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:02:07 crc kubenswrapper[5129]: E0314 08:02:07.038010 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:02:22 crc kubenswrapper[5129]: I0314 08:02:22.038472 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:02:22 crc kubenswrapper[5129]: E0314 08:02:22.039776 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:02:23 crc kubenswrapper[5129]: I0314 08:02:23.272722 5129 scope.go:117] "RemoveContainer" containerID="c2a398bc07be7ee3a375df99861e753adcd0a5cc7ef7d643ee974acddab39455" Mar 14 08:02:33 crc kubenswrapper[5129]: I0314 08:02:33.037332 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:02:33 crc kubenswrapper[5129]: E0314 08:02:33.038443 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:02:44 crc kubenswrapper[5129]: I0314 08:02:44.036790 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:02:44 crc kubenswrapper[5129]: E0314 08:02:44.037920 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:02:59 crc kubenswrapper[5129]: I0314 08:02:59.037217 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:02:59 crc kubenswrapper[5129]: E0314 08:02:59.038468 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:03:10 crc kubenswrapper[5129]: I0314 08:03:10.037777 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:03:10 crc kubenswrapper[5129]: E0314 08:03:10.038556 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:03:22 crc kubenswrapper[5129]: I0314 08:03:22.670110 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:03:22 crc kubenswrapper[5129]: E0314 08:03:22.671291 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:03:37 crc kubenswrapper[5129]: I0314 08:03:37.036334 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:03:37 crc kubenswrapper[5129]: E0314 08:03:37.037344 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:03:52 crc kubenswrapper[5129]: I0314 08:03:52.036203 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:03:52 crc kubenswrapper[5129]: E0314 08:03:52.037332 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:04:00 crc kubenswrapper[5129]: I0314 08:04:00.145231 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557924-w6lzp"] Mar 14 08:04:00 crc kubenswrapper[5129]: E0314 08:04:00.146198 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2eac466-4494-4848-8384-b8989b3f4a13" containerName="oc" Mar 14 08:04:00 crc kubenswrapper[5129]: I0314 08:04:00.146213 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2eac466-4494-4848-8384-b8989b3f4a13" containerName="oc" Mar 14 08:04:00 crc kubenswrapper[5129]: I0314 08:04:00.146385 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2eac466-4494-4848-8384-b8989b3f4a13" containerName="oc" Mar 14 08:04:00 crc kubenswrapper[5129]: I0314 08:04:00.146982 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557924-w6lzp" Mar 14 08:04:00 crc kubenswrapper[5129]: I0314 08:04:00.152162 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:04:00 crc kubenswrapper[5129]: I0314 08:04:00.152584 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:04:00 crc kubenswrapper[5129]: I0314 08:04:00.153195 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:04:00 crc kubenswrapper[5129]: I0314 08:04:00.157070 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557924-w6lzp"] Mar 14 08:04:00 crc kubenswrapper[5129]: I0314 08:04:00.242387 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgk9f\" (UniqueName: \"kubernetes.io/projected/239da77b-d926-4c8c-8e47-f93a6ac44969-kube-api-access-bgk9f\") pod \"auto-csr-approver-29557924-w6lzp\" (UID: \"239da77b-d926-4c8c-8e47-f93a6ac44969\") " pod="openshift-infra/auto-csr-approver-29557924-w6lzp" Mar 14 08:04:00 crc kubenswrapper[5129]: I0314 08:04:00.344272 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgk9f\" (UniqueName: \"kubernetes.io/projected/239da77b-d926-4c8c-8e47-f93a6ac44969-kube-api-access-bgk9f\") pod \"auto-csr-approver-29557924-w6lzp\" (UID: \"239da77b-d926-4c8c-8e47-f93a6ac44969\") " pod="openshift-infra/auto-csr-approver-29557924-w6lzp" Mar 14 08:04:00 crc kubenswrapper[5129]: I0314 08:04:00.362391 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgk9f\" (UniqueName: \"kubernetes.io/projected/239da77b-d926-4c8c-8e47-f93a6ac44969-kube-api-access-bgk9f\") pod \"auto-csr-approver-29557924-w6lzp\" (UID: \"239da77b-d926-4c8c-8e47-f93a6ac44969\") " 
pod="openshift-infra/auto-csr-approver-29557924-w6lzp" Mar 14 08:04:00 crc kubenswrapper[5129]: I0314 08:04:00.536408 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557924-w6lzp" Mar 14 08:04:00 crc kubenswrapper[5129]: I0314 08:04:00.974184 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557924-w6lzp"] Mar 14 08:04:01 crc kubenswrapper[5129]: I0314 08:04:01.727757 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557924-w6lzp" event={"ID":"239da77b-d926-4c8c-8e47-f93a6ac44969","Type":"ContainerStarted","Data":"b133acf56dea472a1651746fadfcf8a75668852fd25f2453b507f25b883c937f"} Mar 14 08:04:03 crc kubenswrapper[5129]: I0314 08:04:03.750590 5129 generic.go:334] "Generic (PLEG): container finished" podID="239da77b-d926-4c8c-8e47-f93a6ac44969" containerID="da8ff0e09c6422857665044073566c0535dfc61c3c4e2fad85dcce654b631d8b" exitCode=0 Mar 14 08:04:03 crc kubenswrapper[5129]: I0314 08:04:03.750701 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557924-w6lzp" event={"ID":"239da77b-d926-4c8c-8e47-f93a6ac44969","Type":"ContainerDied","Data":"da8ff0e09c6422857665044073566c0535dfc61c3c4e2fad85dcce654b631d8b"} Mar 14 08:04:05 crc kubenswrapper[5129]: I0314 08:04:05.036929 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:04:05 crc kubenswrapper[5129]: E0314 08:04:05.037694 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" 
Mar 14 08:04:05 crc kubenswrapper[5129]: I0314 08:04:05.318803 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557924-w6lzp" Mar 14 08:04:05 crc kubenswrapper[5129]: I0314 08:04:05.426502 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgk9f\" (UniqueName: \"kubernetes.io/projected/239da77b-d926-4c8c-8e47-f93a6ac44969-kube-api-access-bgk9f\") pod \"239da77b-d926-4c8c-8e47-f93a6ac44969\" (UID: \"239da77b-d926-4c8c-8e47-f93a6ac44969\") " Mar 14 08:04:05 crc kubenswrapper[5129]: I0314 08:04:05.433004 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/239da77b-d926-4c8c-8e47-f93a6ac44969-kube-api-access-bgk9f" (OuterVolumeSpecName: "kube-api-access-bgk9f") pod "239da77b-d926-4c8c-8e47-f93a6ac44969" (UID: "239da77b-d926-4c8c-8e47-f93a6ac44969"). InnerVolumeSpecName "kube-api-access-bgk9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:04:05 crc kubenswrapper[5129]: I0314 08:04:05.528742 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgk9f\" (UniqueName: \"kubernetes.io/projected/239da77b-d926-4c8c-8e47-f93a6ac44969-kube-api-access-bgk9f\") on node \"crc\" DevicePath \"\"" Mar 14 08:04:05 crc kubenswrapper[5129]: I0314 08:04:05.776121 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557924-w6lzp" event={"ID":"239da77b-d926-4c8c-8e47-f93a6ac44969","Type":"ContainerDied","Data":"b133acf56dea472a1651746fadfcf8a75668852fd25f2453b507f25b883c937f"} Mar 14 08:04:05 crc kubenswrapper[5129]: I0314 08:04:05.776188 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b133acf56dea472a1651746fadfcf8a75668852fd25f2453b507f25b883c937f" Mar 14 08:04:05 crc kubenswrapper[5129]: I0314 08:04:05.776211 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557924-w6lzp" Mar 14 08:04:06 crc kubenswrapper[5129]: I0314 08:04:06.388703 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557918-g6f4v"] Mar 14 08:04:06 crc kubenswrapper[5129]: I0314 08:04:06.397074 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557918-g6f4v"] Mar 14 08:04:08 crc kubenswrapper[5129]: I0314 08:04:08.045789 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eb3640d-7197-4f66-a9cc-56febda4eb73" path="/var/lib/kubelet/pods/0eb3640d-7197-4f66-a9cc-56febda4eb73/volumes" Mar 14 08:04:17 crc kubenswrapper[5129]: I0314 08:04:17.036250 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:04:17 crc kubenswrapper[5129]: E0314 08:04:17.037088 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:04:23 crc kubenswrapper[5129]: I0314 08:04:23.392442 5129 scope.go:117] "RemoveContainer" containerID="13c47c3b19304504043d732b1aade54c969881d9900c9f75d325ddb98b87230a" Mar 14 08:04:32 crc kubenswrapper[5129]: I0314 08:04:32.038041 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:04:32 crc kubenswrapper[5129]: E0314 08:04:32.039019 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:04:45 crc kubenswrapper[5129]: I0314 08:04:45.036459 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:04:45 crc kubenswrapper[5129]: E0314 08:04:45.037105 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:04:57 crc kubenswrapper[5129]: I0314 08:04:57.038114 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:04:57 crc kubenswrapper[5129]: E0314 08:04:57.040412 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:05:09 crc kubenswrapper[5129]: I0314 08:05:09.036257 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:05:09 crc kubenswrapper[5129]: E0314 08:05:09.037048 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:05:18 crc kubenswrapper[5129]: I0314 08:05:18.922947 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nxfpv"] Mar 14 08:05:18 crc kubenswrapper[5129]: E0314 08:05:18.923785 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="239da77b-d926-4c8c-8e47-f93a6ac44969" containerName="oc" Mar 14 08:05:18 crc kubenswrapper[5129]: I0314 08:05:18.923797 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="239da77b-d926-4c8c-8e47-f93a6ac44969" containerName="oc" Mar 14 08:05:18 crc kubenswrapper[5129]: I0314 08:05:18.923933 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="239da77b-d926-4c8c-8e47-f93a6ac44969" containerName="oc" Mar 14 08:05:18 crc kubenswrapper[5129]: I0314 08:05:18.924872 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nxfpv" Mar 14 08:05:18 crc kubenswrapper[5129]: I0314 08:05:18.946146 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nxfpv"] Mar 14 08:05:18 crc kubenswrapper[5129]: I0314 08:05:18.959249 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mjl5\" (UniqueName: \"kubernetes.io/projected/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5-kube-api-access-9mjl5\") pod \"redhat-operators-nxfpv\" (UID: \"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5\") " pod="openshift-marketplace/redhat-operators-nxfpv" Mar 14 08:05:18 crc kubenswrapper[5129]: I0314 08:05:18.959314 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5-utilities\") pod \"redhat-operators-nxfpv\" (UID: \"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5\") " pod="openshift-marketplace/redhat-operators-nxfpv" Mar 14 08:05:18 crc kubenswrapper[5129]: I0314 08:05:18.959347 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5-catalog-content\") pod \"redhat-operators-nxfpv\" (UID: \"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5\") " pod="openshift-marketplace/redhat-operators-nxfpv" Mar 14 08:05:19 crc kubenswrapper[5129]: I0314 08:05:19.060354 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mjl5\" (UniqueName: \"kubernetes.io/projected/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5-kube-api-access-9mjl5\") pod \"redhat-operators-nxfpv\" (UID: \"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5\") " pod="openshift-marketplace/redhat-operators-nxfpv" Mar 14 08:05:19 crc kubenswrapper[5129]: I0314 08:05:19.060405 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5-utilities\") pod \"redhat-operators-nxfpv\" (UID: \"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5\") " pod="openshift-marketplace/redhat-operators-nxfpv" Mar 14 08:05:19 crc kubenswrapper[5129]: I0314 08:05:19.060429 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5-catalog-content\") pod \"redhat-operators-nxfpv\" (UID: \"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5\") " pod="openshift-marketplace/redhat-operators-nxfpv" Mar 14 08:05:19 crc kubenswrapper[5129]: I0314 08:05:19.060946 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5-catalog-content\") pod \"redhat-operators-nxfpv\" (UID: \"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5\") " pod="openshift-marketplace/redhat-operators-nxfpv" Mar 14 08:05:19 crc kubenswrapper[5129]: I0314 08:05:19.061715 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5-utilities\") pod \"redhat-operators-nxfpv\" (UID: \"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5\") " pod="openshift-marketplace/redhat-operators-nxfpv" Mar 14 08:05:19 crc kubenswrapper[5129]: I0314 08:05:19.094869 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mjl5\" (UniqueName: \"kubernetes.io/projected/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5-kube-api-access-9mjl5\") pod \"redhat-operators-nxfpv\" (UID: \"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5\") " pod="openshift-marketplace/redhat-operators-nxfpv" Mar 14 08:05:19 crc kubenswrapper[5129]: I0314 08:05:19.253468 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nxfpv" Mar 14 08:05:19 crc kubenswrapper[5129]: I0314 08:05:19.720633 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nxfpv"] Mar 14 08:05:21 crc kubenswrapper[5129]: I0314 08:05:21.172882 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nxfpv" event={"ID":"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5","Type":"ContainerStarted","Data":"61173d0ef60c0e6b0a40166a7e05e887bb403c565d444eab76efd0985938a1e9"} Mar 14 08:05:22 crc kubenswrapper[5129]: I0314 08:05:22.184953 5129 generic.go:334] "Generic (PLEG): container finished" podID="c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5" containerID="467b917c267bc2b0f16647910d14b3d76e87f74c87a3f85f13062120981d4344" exitCode=0 Mar 14 08:05:22 crc kubenswrapper[5129]: I0314 08:05:22.185011 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nxfpv" event={"ID":"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5","Type":"ContainerDied","Data":"467b917c267bc2b0f16647910d14b3d76e87f74c87a3f85f13062120981d4344"} Mar 14 08:05:24 crc kubenswrapper[5129]: I0314 08:05:24.036540 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:05:24 crc kubenswrapper[5129]: I0314 08:05:24.202274 5129 generic.go:334] "Generic (PLEG): container finished" podID="c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5" containerID="43f6393252a661aee67c1afbad42f64f0b1f6cf1cdf514f727c5f798ba878156" exitCode=0 Mar 14 08:05:24 crc kubenswrapper[5129]: I0314 08:05:24.202365 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nxfpv" event={"ID":"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5","Type":"ContainerDied","Data":"43f6393252a661aee67c1afbad42f64f0b1f6cf1cdf514f727c5f798ba878156"} Mar 14 08:05:25 crc kubenswrapper[5129]: I0314 08:05:25.214494 5129 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-nxfpv" event={"ID":"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5","Type":"ContainerStarted","Data":"64a355acc02904d2f0fd112a000573ee65eb47456eb024830d0c791c21af07da"} Mar 14 08:05:25 crc kubenswrapper[5129]: I0314 08:05:25.218993 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"bf794827ef327c71df144de582e947a9b3492752b6b76e8b1ee32a2702a6123e"} Mar 14 08:05:25 crc kubenswrapper[5129]: I0314 08:05:25.248012 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nxfpv" podStartSLOduration=4.807065548 podStartE2EDuration="7.247987786s" podCreationTimestamp="2026-03-14 08:05:18 +0000 UTC" firstStartedPulling="2026-03-14 08:05:22.18873202 +0000 UTC m=+3984.940647204" lastFinishedPulling="2026-03-14 08:05:24.629654218 +0000 UTC m=+3987.381569442" observedRunningTime="2026-03-14 08:05:25.247103511 +0000 UTC m=+3987.999018695" watchObservedRunningTime="2026-03-14 08:05:25.247987786 +0000 UTC m=+3987.999902970" Mar 14 08:05:29 crc kubenswrapper[5129]: I0314 08:05:29.253920 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nxfpv" Mar 14 08:05:29 crc kubenswrapper[5129]: I0314 08:05:29.254286 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nxfpv" Mar 14 08:05:30 crc kubenswrapper[5129]: I0314 08:05:30.303166 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nxfpv" podUID="c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5" containerName="registry-server" probeResult="failure" output=< Mar 14 08:05:30 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 08:05:30 crc kubenswrapper[5129]: > Mar 14 08:05:39 crc 
kubenswrapper[5129]: I0314 08:05:39.296954 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nxfpv" Mar 14 08:05:39 crc kubenswrapper[5129]: I0314 08:05:39.339939 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nxfpv" Mar 14 08:05:39 crc kubenswrapper[5129]: I0314 08:05:39.531246 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nxfpv"] Mar 14 08:05:40 crc kubenswrapper[5129]: I0314 08:05:40.329494 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nxfpv" podUID="c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5" containerName="registry-server" containerID="cri-o://64a355acc02904d2f0fd112a000573ee65eb47456eb024830d0c791c21af07da" gracePeriod=2 Mar 14 08:05:40 crc kubenswrapper[5129]: I0314 08:05:40.782046 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nxfpv" Mar 14 08:05:40 crc kubenswrapper[5129]: I0314 08:05:40.886251 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5-catalog-content\") pod \"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5\" (UID: \"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5\") " Mar 14 08:05:40 crc kubenswrapper[5129]: I0314 08:05:40.886378 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5-utilities\") pod \"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5\" (UID: \"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5\") " Mar 14 08:05:40 crc kubenswrapper[5129]: I0314 08:05:40.886492 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mjl5\" (UniqueName: \"kubernetes.io/projected/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5-kube-api-access-9mjl5\") pod \"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5\" (UID: \"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5\") " Mar 14 08:05:40 crc kubenswrapper[5129]: I0314 08:05:40.887406 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5-utilities" (OuterVolumeSpecName: "utilities") pod "c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5" (UID: "c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:05:40 crc kubenswrapper[5129]: I0314 08:05:40.906799 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5-kube-api-access-9mjl5" (OuterVolumeSpecName: "kube-api-access-9mjl5") pod "c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5" (UID: "c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5"). InnerVolumeSpecName "kube-api-access-9mjl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:05:40 crc kubenswrapper[5129]: I0314 08:05:40.987875 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mjl5\" (UniqueName: \"kubernetes.io/projected/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5-kube-api-access-9mjl5\") on node \"crc\" DevicePath \"\"" Mar 14 08:05:40 crc kubenswrapper[5129]: I0314 08:05:40.987912 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:05:41 crc kubenswrapper[5129]: I0314 08:05:41.025774 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5" (UID: "c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:05:41 crc kubenswrapper[5129]: I0314 08:05:41.089387 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:05:41 crc kubenswrapper[5129]: I0314 08:05:41.345222 5129 generic.go:334] "Generic (PLEG): container finished" podID="c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5" containerID="64a355acc02904d2f0fd112a000573ee65eb47456eb024830d0c791c21af07da" exitCode=0 Mar 14 08:05:41 crc kubenswrapper[5129]: I0314 08:05:41.345322 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nxfpv" event={"ID":"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5","Type":"ContainerDied","Data":"64a355acc02904d2f0fd112a000573ee65eb47456eb024830d0c791c21af07da"} Mar 14 08:05:41 crc kubenswrapper[5129]: I0314 08:05:41.345826 5129 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-nxfpv" event={"ID":"c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5","Type":"ContainerDied","Data":"61173d0ef60c0e6b0a40166a7e05e887bb403c565d444eab76efd0985938a1e9"} Mar 14 08:05:41 crc kubenswrapper[5129]: I0314 08:05:41.345391 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nxfpv" Mar 14 08:05:41 crc kubenswrapper[5129]: I0314 08:05:41.345865 5129 scope.go:117] "RemoveContainer" containerID="64a355acc02904d2f0fd112a000573ee65eb47456eb024830d0c791c21af07da" Mar 14 08:05:41 crc kubenswrapper[5129]: I0314 08:05:41.378968 5129 scope.go:117] "RemoveContainer" containerID="43f6393252a661aee67c1afbad42f64f0b1f6cf1cdf514f727c5f798ba878156" Mar 14 08:05:41 crc kubenswrapper[5129]: I0314 08:05:41.403846 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nxfpv"] Mar 14 08:05:41 crc kubenswrapper[5129]: I0314 08:05:41.408893 5129 scope.go:117] "RemoveContainer" containerID="467b917c267bc2b0f16647910d14b3d76e87f74c87a3f85f13062120981d4344" Mar 14 08:05:41 crc kubenswrapper[5129]: I0314 08:05:41.412677 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nxfpv"] Mar 14 08:05:41 crc kubenswrapper[5129]: I0314 08:05:41.425522 5129 scope.go:117] "RemoveContainer" containerID="64a355acc02904d2f0fd112a000573ee65eb47456eb024830d0c791c21af07da" Mar 14 08:05:41 crc kubenswrapper[5129]: E0314 08:05:41.426146 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64a355acc02904d2f0fd112a000573ee65eb47456eb024830d0c791c21af07da\": container with ID starting with 64a355acc02904d2f0fd112a000573ee65eb47456eb024830d0c791c21af07da not found: ID does not exist" containerID="64a355acc02904d2f0fd112a000573ee65eb47456eb024830d0c791c21af07da" Mar 14 08:05:41 crc kubenswrapper[5129]: I0314 08:05:41.426195 5129 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64a355acc02904d2f0fd112a000573ee65eb47456eb024830d0c791c21af07da"} err="failed to get container status \"64a355acc02904d2f0fd112a000573ee65eb47456eb024830d0c791c21af07da\": rpc error: code = NotFound desc = could not find container \"64a355acc02904d2f0fd112a000573ee65eb47456eb024830d0c791c21af07da\": container with ID starting with 64a355acc02904d2f0fd112a000573ee65eb47456eb024830d0c791c21af07da not found: ID does not exist" Mar 14 08:05:41 crc kubenswrapper[5129]: I0314 08:05:41.426218 5129 scope.go:117] "RemoveContainer" containerID="43f6393252a661aee67c1afbad42f64f0b1f6cf1cdf514f727c5f798ba878156" Mar 14 08:05:41 crc kubenswrapper[5129]: E0314 08:05:41.426996 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43f6393252a661aee67c1afbad42f64f0b1f6cf1cdf514f727c5f798ba878156\": container with ID starting with 43f6393252a661aee67c1afbad42f64f0b1f6cf1cdf514f727c5f798ba878156 not found: ID does not exist" containerID="43f6393252a661aee67c1afbad42f64f0b1f6cf1cdf514f727c5f798ba878156" Mar 14 08:05:41 crc kubenswrapper[5129]: I0314 08:05:41.427017 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43f6393252a661aee67c1afbad42f64f0b1f6cf1cdf514f727c5f798ba878156"} err="failed to get container status \"43f6393252a661aee67c1afbad42f64f0b1f6cf1cdf514f727c5f798ba878156\": rpc error: code = NotFound desc = could not find container \"43f6393252a661aee67c1afbad42f64f0b1f6cf1cdf514f727c5f798ba878156\": container with ID starting with 43f6393252a661aee67c1afbad42f64f0b1f6cf1cdf514f727c5f798ba878156 not found: ID does not exist" Mar 14 08:05:41 crc kubenswrapper[5129]: I0314 08:05:41.427029 5129 scope.go:117] "RemoveContainer" containerID="467b917c267bc2b0f16647910d14b3d76e87f74c87a3f85f13062120981d4344" Mar 14 08:05:41 crc kubenswrapper[5129]: E0314 
08:05:41.427230 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"467b917c267bc2b0f16647910d14b3d76e87f74c87a3f85f13062120981d4344\": container with ID starting with 467b917c267bc2b0f16647910d14b3d76e87f74c87a3f85f13062120981d4344 not found: ID does not exist" containerID="467b917c267bc2b0f16647910d14b3d76e87f74c87a3f85f13062120981d4344" Mar 14 08:05:41 crc kubenswrapper[5129]: I0314 08:05:41.427254 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"467b917c267bc2b0f16647910d14b3d76e87f74c87a3f85f13062120981d4344"} err="failed to get container status \"467b917c267bc2b0f16647910d14b3d76e87f74c87a3f85f13062120981d4344\": rpc error: code = NotFound desc = could not find container \"467b917c267bc2b0f16647910d14b3d76e87f74c87a3f85f13062120981d4344\": container with ID starting with 467b917c267bc2b0f16647910d14b3d76e87f74c87a3f85f13062120981d4344 not found: ID does not exist" Mar 14 08:05:42 crc kubenswrapper[5129]: I0314 08:05:42.055031 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5" path="/var/lib/kubelet/pods/c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5/volumes" Mar 14 08:06:00 crc kubenswrapper[5129]: I0314 08:06:00.135271 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557926-wrl9s"] Mar 14 08:06:00 crc kubenswrapper[5129]: E0314 08:06:00.136143 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5" containerName="registry-server" Mar 14 08:06:00 crc kubenswrapper[5129]: I0314 08:06:00.136156 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5" containerName="registry-server" Mar 14 08:06:00 crc kubenswrapper[5129]: E0314 08:06:00.136171 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5" containerName="extract-utilities" Mar 14 08:06:00 crc kubenswrapper[5129]: I0314 08:06:00.136179 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5" containerName="extract-utilities" Mar 14 08:06:00 crc kubenswrapper[5129]: E0314 08:06:00.136200 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5" containerName="extract-content" Mar 14 08:06:00 crc kubenswrapper[5129]: I0314 08:06:00.136212 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5" containerName="extract-content" Mar 14 08:06:00 crc kubenswrapper[5129]: I0314 08:06:00.136351 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3fd87a2-0cd1-42e6-aec7-a20ec94c5ce5" containerName="registry-server" Mar 14 08:06:00 crc kubenswrapper[5129]: I0314 08:06:00.136791 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557926-wrl9s" Mar 14 08:06:00 crc kubenswrapper[5129]: I0314 08:06:00.139165 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:06:00 crc kubenswrapper[5129]: I0314 08:06:00.139343 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:06:00 crc kubenswrapper[5129]: I0314 08:06:00.142262 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:06:00 crc kubenswrapper[5129]: I0314 08:06:00.143055 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557926-wrl9s"] Mar 14 08:06:00 crc kubenswrapper[5129]: I0314 08:06:00.160992 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w79b2\" (UniqueName: 
\"kubernetes.io/projected/a72e95e9-1005-4f78-98ec-3541836fe5b9-kube-api-access-w79b2\") pod \"auto-csr-approver-29557926-wrl9s\" (UID: \"a72e95e9-1005-4f78-98ec-3541836fe5b9\") " pod="openshift-infra/auto-csr-approver-29557926-wrl9s" Mar 14 08:06:00 crc kubenswrapper[5129]: I0314 08:06:00.262843 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w79b2\" (UniqueName: \"kubernetes.io/projected/a72e95e9-1005-4f78-98ec-3541836fe5b9-kube-api-access-w79b2\") pod \"auto-csr-approver-29557926-wrl9s\" (UID: \"a72e95e9-1005-4f78-98ec-3541836fe5b9\") " pod="openshift-infra/auto-csr-approver-29557926-wrl9s" Mar 14 08:06:00 crc kubenswrapper[5129]: I0314 08:06:00.417198 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w79b2\" (UniqueName: \"kubernetes.io/projected/a72e95e9-1005-4f78-98ec-3541836fe5b9-kube-api-access-w79b2\") pod \"auto-csr-approver-29557926-wrl9s\" (UID: \"a72e95e9-1005-4f78-98ec-3541836fe5b9\") " pod="openshift-infra/auto-csr-approver-29557926-wrl9s" Mar 14 08:06:00 crc kubenswrapper[5129]: I0314 08:06:00.501377 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557926-wrl9s" Mar 14 08:06:00 crc kubenswrapper[5129]: I0314 08:06:00.902174 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557926-wrl9s"] Mar 14 08:06:01 crc kubenswrapper[5129]: I0314 08:06:01.502702 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557926-wrl9s" event={"ID":"a72e95e9-1005-4f78-98ec-3541836fe5b9","Type":"ContainerStarted","Data":"006824cd57fae9cf6018ec90fe67fb784d51db24f33bf4625a4e6a215a3c434d"} Mar 14 08:06:02 crc kubenswrapper[5129]: I0314 08:06:02.515308 5129 generic.go:334] "Generic (PLEG): container finished" podID="a72e95e9-1005-4f78-98ec-3541836fe5b9" containerID="af30279bc8f2e4b677b644747a4f3d58637a056eaa26617c7c0f4c2b188727c8" exitCode=0 Mar 14 08:06:02 crc kubenswrapper[5129]: I0314 08:06:02.515505 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557926-wrl9s" event={"ID":"a72e95e9-1005-4f78-98ec-3541836fe5b9","Type":"ContainerDied","Data":"af30279bc8f2e4b677b644747a4f3d58637a056eaa26617c7c0f4c2b188727c8"} Mar 14 08:06:03 crc kubenswrapper[5129]: I0314 08:06:03.887623 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557926-wrl9s" Mar 14 08:06:04 crc kubenswrapper[5129]: I0314 08:06:04.012115 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w79b2\" (UniqueName: \"kubernetes.io/projected/a72e95e9-1005-4f78-98ec-3541836fe5b9-kube-api-access-w79b2\") pod \"a72e95e9-1005-4f78-98ec-3541836fe5b9\" (UID: \"a72e95e9-1005-4f78-98ec-3541836fe5b9\") " Mar 14 08:06:04 crc kubenswrapper[5129]: I0314 08:06:04.017836 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a72e95e9-1005-4f78-98ec-3541836fe5b9-kube-api-access-w79b2" (OuterVolumeSpecName: "kube-api-access-w79b2") pod "a72e95e9-1005-4f78-98ec-3541836fe5b9" (UID: "a72e95e9-1005-4f78-98ec-3541836fe5b9"). InnerVolumeSpecName "kube-api-access-w79b2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:06:04 crc kubenswrapper[5129]: I0314 08:06:04.114226 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w79b2\" (UniqueName: \"kubernetes.io/projected/a72e95e9-1005-4f78-98ec-3541836fe5b9-kube-api-access-w79b2\") on node \"crc\" DevicePath \"\"" Mar 14 08:06:04 crc kubenswrapper[5129]: I0314 08:06:04.533403 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557926-wrl9s" event={"ID":"a72e95e9-1005-4f78-98ec-3541836fe5b9","Type":"ContainerDied","Data":"006824cd57fae9cf6018ec90fe67fb784d51db24f33bf4625a4e6a215a3c434d"} Mar 14 08:06:04 crc kubenswrapper[5129]: I0314 08:06:04.533706 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="006824cd57fae9cf6018ec90fe67fb784d51db24f33bf4625a4e6a215a3c434d" Mar 14 08:06:04 crc kubenswrapper[5129]: I0314 08:06:04.533480 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557926-wrl9s" Mar 14 08:06:04 crc kubenswrapper[5129]: I0314 08:06:04.977003 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557920-hhckp"] Mar 14 08:06:04 crc kubenswrapper[5129]: I0314 08:06:04.994497 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557920-hhckp"] Mar 14 08:06:06 crc kubenswrapper[5129]: I0314 08:06:06.050652 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e804cb3a-bd5e-4870-9e6d-d7406633ca41" path="/var/lib/kubelet/pods/e804cb3a-bd5e-4870-9e6d-d7406633ca41/volumes" Mar 14 08:06:14 crc kubenswrapper[5129]: I0314 08:06:14.740051 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6c6v4"] Mar 14 08:06:14 crc kubenswrapper[5129]: E0314 08:06:14.741017 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72e95e9-1005-4f78-98ec-3541836fe5b9" containerName="oc" Mar 14 08:06:14 crc kubenswrapper[5129]: I0314 08:06:14.741034 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72e95e9-1005-4f78-98ec-3541836fe5b9" containerName="oc" Mar 14 08:06:14 crc kubenswrapper[5129]: I0314 08:06:14.741209 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="a72e95e9-1005-4f78-98ec-3541836fe5b9" containerName="oc" Mar 14 08:06:14 crc kubenswrapper[5129]: I0314 08:06:14.742499 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6c6v4" Mar 14 08:06:14 crc kubenswrapper[5129]: I0314 08:06:14.762269 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6c6v4"] Mar 14 08:06:14 crc kubenswrapper[5129]: I0314 08:06:14.788900 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/194193eb-833b-4e2c-af0d-f582fe23aeb8-utilities\") pod \"community-operators-6c6v4\" (UID: \"194193eb-833b-4e2c-af0d-f582fe23aeb8\") " pod="openshift-marketplace/community-operators-6c6v4" Mar 14 08:06:14 crc kubenswrapper[5129]: I0314 08:06:14.788944 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/194193eb-833b-4e2c-af0d-f582fe23aeb8-catalog-content\") pod \"community-operators-6c6v4\" (UID: \"194193eb-833b-4e2c-af0d-f582fe23aeb8\") " pod="openshift-marketplace/community-operators-6c6v4" Mar 14 08:06:14 crc kubenswrapper[5129]: I0314 08:06:14.788973 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwhwr\" (UniqueName: \"kubernetes.io/projected/194193eb-833b-4e2c-af0d-f582fe23aeb8-kube-api-access-kwhwr\") pod \"community-operators-6c6v4\" (UID: \"194193eb-833b-4e2c-af0d-f582fe23aeb8\") " pod="openshift-marketplace/community-operators-6c6v4" Mar 14 08:06:14 crc kubenswrapper[5129]: I0314 08:06:14.891192 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/194193eb-833b-4e2c-af0d-f582fe23aeb8-utilities\") pod \"community-operators-6c6v4\" (UID: \"194193eb-833b-4e2c-af0d-f582fe23aeb8\") " pod="openshift-marketplace/community-operators-6c6v4" Mar 14 08:06:14 crc kubenswrapper[5129]: I0314 08:06:14.891275 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/194193eb-833b-4e2c-af0d-f582fe23aeb8-catalog-content\") pod \"community-operators-6c6v4\" (UID: \"194193eb-833b-4e2c-af0d-f582fe23aeb8\") " pod="openshift-marketplace/community-operators-6c6v4" Mar 14 08:06:14 crc kubenswrapper[5129]: I0314 08:06:14.891319 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwhwr\" (UniqueName: \"kubernetes.io/projected/194193eb-833b-4e2c-af0d-f582fe23aeb8-kube-api-access-kwhwr\") pod \"community-operators-6c6v4\" (UID: \"194193eb-833b-4e2c-af0d-f582fe23aeb8\") " pod="openshift-marketplace/community-operators-6c6v4" Mar 14 08:06:14 crc kubenswrapper[5129]: I0314 08:06:14.891843 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/194193eb-833b-4e2c-af0d-f582fe23aeb8-utilities\") pod \"community-operators-6c6v4\" (UID: \"194193eb-833b-4e2c-af0d-f582fe23aeb8\") " pod="openshift-marketplace/community-operators-6c6v4" Mar 14 08:06:14 crc kubenswrapper[5129]: I0314 08:06:14.892236 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/194193eb-833b-4e2c-af0d-f582fe23aeb8-catalog-content\") pod \"community-operators-6c6v4\" (UID: \"194193eb-833b-4e2c-af0d-f582fe23aeb8\") " pod="openshift-marketplace/community-operators-6c6v4" Mar 14 08:06:14 crc kubenswrapper[5129]: I0314 08:06:14.918635 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwhwr\" (UniqueName: \"kubernetes.io/projected/194193eb-833b-4e2c-af0d-f582fe23aeb8-kube-api-access-kwhwr\") pod \"community-operators-6c6v4\" (UID: \"194193eb-833b-4e2c-af0d-f582fe23aeb8\") " pod="openshift-marketplace/community-operators-6c6v4" Mar 14 08:06:15 crc kubenswrapper[5129]: I0314 08:06:15.060644 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6c6v4" Mar 14 08:06:15 crc kubenswrapper[5129]: I0314 08:06:15.548718 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6c6v4"] Mar 14 08:06:15 crc kubenswrapper[5129]: I0314 08:06:15.615844 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6c6v4" event={"ID":"194193eb-833b-4e2c-af0d-f582fe23aeb8","Type":"ContainerStarted","Data":"a55b2c152637718371c059dba24f5f7657873c814770ddcdb1b6be814acd4511"} Mar 14 08:06:16 crc kubenswrapper[5129]: I0314 08:06:16.623133 5129 generic.go:334] "Generic (PLEG): container finished" podID="194193eb-833b-4e2c-af0d-f582fe23aeb8" containerID="744183d48dd749a3679822156b5070266bbe7f0c44beee9f94609f6b61e71cbc" exitCode=0 Mar 14 08:06:16 crc kubenswrapper[5129]: I0314 08:06:16.623290 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6c6v4" event={"ID":"194193eb-833b-4e2c-af0d-f582fe23aeb8","Type":"ContainerDied","Data":"744183d48dd749a3679822156b5070266bbe7f0c44beee9f94609f6b61e71cbc"} Mar 14 08:06:17 crc kubenswrapper[5129]: I0314 08:06:17.633817 5129 generic.go:334] "Generic (PLEG): container finished" podID="194193eb-833b-4e2c-af0d-f582fe23aeb8" containerID="06d09d76afa53551379a85801cfe104c594ecea3e5a8919fd0dfa6b45c6f9566" exitCode=0 Mar 14 08:06:17 crc kubenswrapper[5129]: I0314 08:06:17.633928 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6c6v4" event={"ID":"194193eb-833b-4e2c-af0d-f582fe23aeb8","Type":"ContainerDied","Data":"06d09d76afa53551379a85801cfe104c594ecea3e5a8919fd0dfa6b45c6f9566"} Mar 14 08:06:18 crc kubenswrapper[5129]: I0314 08:06:18.643935 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6c6v4" 
event={"ID":"194193eb-833b-4e2c-af0d-f582fe23aeb8","Type":"ContainerStarted","Data":"7b5561d7b01d0927bc4e56d7e1c6df32d3455750ad300cdbfa4115d9e38ed897"} Mar 14 08:06:18 crc kubenswrapper[5129]: I0314 08:06:18.664937 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6c6v4" podStartSLOduration=3.282352817 podStartE2EDuration="4.664918668s" podCreationTimestamp="2026-03-14 08:06:14 +0000 UTC" firstStartedPulling="2026-03-14 08:06:16.624662997 +0000 UTC m=+4039.376578181" lastFinishedPulling="2026-03-14 08:06:18.007228848 +0000 UTC m=+4040.759144032" observedRunningTime="2026-03-14 08:06:18.658909225 +0000 UTC m=+4041.410824449" watchObservedRunningTime="2026-03-14 08:06:18.664918668 +0000 UTC m=+4041.416833862" Mar 14 08:06:23 crc kubenswrapper[5129]: I0314 08:06:23.492062 5129 scope.go:117] "RemoveContainer" containerID="d1eec9e39d314b8f723aeb67b7243cd33fd9825d84766e407e6d227662190334" Mar 14 08:06:25 crc kubenswrapper[5129]: I0314 08:06:25.061673 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6c6v4" Mar 14 08:06:25 crc kubenswrapper[5129]: I0314 08:06:25.062091 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6c6v4" Mar 14 08:06:25 crc kubenswrapper[5129]: I0314 08:06:25.112371 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6c6v4" Mar 14 08:06:25 crc kubenswrapper[5129]: I0314 08:06:25.756451 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6c6v4" Mar 14 08:06:25 crc kubenswrapper[5129]: I0314 08:06:25.802289 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6c6v4"] Mar 14 08:06:27 crc kubenswrapper[5129]: I0314 08:06:27.721765 5129 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/community-operators-6c6v4" podUID="194193eb-833b-4e2c-af0d-f582fe23aeb8" containerName="registry-server" containerID="cri-o://7b5561d7b01d0927bc4e56d7e1c6df32d3455750ad300cdbfa4115d9e38ed897" gracePeriod=2 Mar 14 08:06:28 crc kubenswrapper[5129]: I0314 08:06:28.731122 5129 generic.go:334] "Generic (PLEG): container finished" podID="194193eb-833b-4e2c-af0d-f582fe23aeb8" containerID="7b5561d7b01d0927bc4e56d7e1c6df32d3455750ad300cdbfa4115d9e38ed897" exitCode=0 Mar 14 08:06:28 crc kubenswrapper[5129]: I0314 08:06:28.731180 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6c6v4" event={"ID":"194193eb-833b-4e2c-af0d-f582fe23aeb8","Type":"ContainerDied","Data":"7b5561d7b01d0927bc4e56d7e1c6df32d3455750ad300cdbfa4115d9e38ed897"} Mar 14 08:06:29 crc kubenswrapper[5129]: I0314 08:06:29.298278 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6c6v4" Mar 14 08:06:29 crc kubenswrapper[5129]: I0314 08:06:29.406438 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/194193eb-833b-4e2c-af0d-f582fe23aeb8-utilities\") pod \"194193eb-833b-4e2c-af0d-f582fe23aeb8\" (UID: \"194193eb-833b-4e2c-af0d-f582fe23aeb8\") " Mar 14 08:06:29 crc kubenswrapper[5129]: I0314 08:06:29.406675 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwhwr\" (UniqueName: \"kubernetes.io/projected/194193eb-833b-4e2c-af0d-f582fe23aeb8-kube-api-access-kwhwr\") pod \"194193eb-833b-4e2c-af0d-f582fe23aeb8\" (UID: \"194193eb-833b-4e2c-af0d-f582fe23aeb8\") " Mar 14 08:06:29 crc kubenswrapper[5129]: I0314 08:06:29.406700 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/194193eb-833b-4e2c-af0d-f582fe23aeb8-catalog-content\") pod \"194193eb-833b-4e2c-af0d-f582fe23aeb8\" (UID: \"194193eb-833b-4e2c-af0d-f582fe23aeb8\") " Mar 14 08:06:29 crc kubenswrapper[5129]: I0314 08:06:29.407791 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/194193eb-833b-4e2c-af0d-f582fe23aeb8-utilities" (OuterVolumeSpecName: "utilities") pod "194193eb-833b-4e2c-af0d-f582fe23aeb8" (UID: "194193eb-833b-4e2c-af0d-f582fe23aeb8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:06:29 crc kubenswrapper[5129]: I0314 08:06:29.412828 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/194193eb-833b-4e2c-af0d-f582fe23aeb8-kube-api-access-kwhwr" (OuterVolumeSpecName: "kube-api-access-kwhwr") pod "194193eb-833b-4e2c-af0d-f582fe23aeb8" (UID: "194193eb-833b-4e2c-af0d-f582fe23aeb8"). InnerVolumeSpecName "kube-api-access-kwhwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:06:29 crc kubenswrapper[5129]: I0314 08:06:29.458439 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/194193eb-833b-4e2c-af0d-f582fe23aeb8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "194193eb-833b-4e2c-af0d-f582fe23aeb8" (UID: "194193eb-833b-4e2c-af0d-f582fe23aeb8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:06:29 crc kubenswrapper[5129]: I0314 08:06:29.507441 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/194193eb-833b-4e2c-af0d-f582fe23aeb8-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:06:29 crc kubenswrapper[5129]: I0314 08:06:29.507479 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwhwr\" (UniqueName: \"kubernetes.io/projected/194193eb-833b-4e2c-af0d-f582fe23aeb8-kube-api-access-kwhwr\") on node \"crc\" DevicePath \"\"" Mar 14 08:06:29 crc kubenswrapper[5129]: I0314 08:06:29.507492 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/194193eb-833b-4e2c-af0d-f582fe23aeb8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:06:29 crc kubenswrapper[5129]: I0314 08:06:29.740344 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6c6v4" event={"ID":"194193eb-833b-4e2c-af0d-f582fe23aeb8","Type":"ContainerDied","Data":"a55b2c152637718371c059dba24f5f7657873c814770ddcdb1b6be814acd4511"} Mar 14 08:06:29 crc kubenswrapper[5129]: I0314 08:06:29.740398 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6c6v4" Mar 14 08:06:29 crc kubenswrapper[5129]: I0314 08:06:29.740420 5129 scope.go:117] "RemoveContainer" containerID="7b5561d7b01d0927bc4e56d7e1c6df32d3455750ad300cdbfa4115d9e38ed897" Mar 14 08:06:29 crc kubenswrapper[5129]: I0314 08:06:29.758420 5129 scope.go:117] "RemoveContainer" containerID="06d09d76afa53551379a85801cfe104c594ecea3e5a8919fd0dfa6b45c6f9566" Mar 14 08:06:29 crc kubenswrapper[5129]: I0314 08:06:29.789079 5129 scope.go:117] "RemoveContainer" containerID="744183d48dd749a3679822156b5070266bbe7f0c44beee9f94609f6b61e71cbc" Mar 14 08:06:29 crc kubenswrapper[5129]: I0314 08:06:29.797851 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6c6v4"] Mar 14 08:06:29 crc kubenswrapper[5129]: I0314 08:06:29.804273 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6c6v4"] Mar 14 08:06:30 crc kubenswrapper[5129]: I0314 08:06:30.045327 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="194193eb-833b-4e2c-af0d-f582fe23aeb8" path="/var/lib/kubelet/pods/194193eb-833b-4e2c-af0d-f582fe23aeb8/volumes" Mar 14 08:07:49 crc kubenswrapper[5129]: I0314 08:07:49.574581 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:07:49 crc kubenswrapper[5129]: I0314 08:07:49.575136 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:08:00 crc kubenswrapper[5129]: 
I0314 08:08:00.143855 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557928-zk67m"] Mar 14 08:08:00 crc kubenswrapper[5129]: E0314 08:08:00.144794 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="194193eb-833b-4e2c-af0d-f582fe23aeb8" containerName="registry-server" Mar 14 08:08:00 crc kubenswrapper[5129]: I0314 08:08:00.144812 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="194193eb-833b-4e2c-af0d-f582fe23aeb8" containerName="registry-server" Mar 14 08:08:00 crc kubenswrapper[5129]: E0314 08:08:00.144839 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="194193eb-833b-4e2c-af0d-f582fe23aeb8" containerName="extract-content" Mar 14 08:08:00 crc kubenswrapper[5129]: I0314 08:08:00.144850 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="194193eb-833b-4e2c-af0d-f582fe23aeb8" containerName="extract-content" Mar 14 08:08:00 crc kubenswrapper[5129]: E0314 08:08:00.144862 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="194193eb-833b-4e2c-af0d-f582fe23aeb8" containerName="extract-utilities" Mar 14 08:08:00 crc kubenswrapper[5129]: I0314 08:08:00.144873 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="194193eb-833b-4e2c-af0d-f582fe23aeb8" containerName="extract-utilities" Mar 14 08:08:00 crc kubenswrapper[5129]: I0314 08:08:00.145079 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="194193eb-833b-4e2c-af0d-f582fe23aeb8" containerName="registry-server" Mar 14 08:08:00 crc kubenswrapper[5129]: I0314 08:08:00.145681 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557928-zk67m" Mar 14 08:08:00 crc kubenswrapper[5129]: I0314 08:08:00.147571 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:08:00 crc kubenswrapper[5129]: I0314 08:08:00.148016 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:08:00 crc kubenswrapper[5129]: I0314 08:08:00.149577 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:08:00 crc kubenswrapper[5129]: I0314 08:08:00.169493 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557928-zk67m"] Mar 14 08:08:00 crc kubenswrapper[5129]: I0314 08:08:00.200512 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgklj\" (UniqueName: \"kubernetes.io/projected/53af3bd0-780d-4e46-8843-c9dd4c4e0bb1-kube-api-access-cgklj\") pod \"auto-csr-approver-29557928-zk67m\" (UID: \"53af3bd0-780d-4e46-8843-c9dd4c4e0bb1\") " pod="openshift-infra/auto-csr-approver-29557928-zk67m" Mar 14 08:08:00 crc kubenswrapper[5129]: I0314 08:08:00.302687 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgklj\" (UniqueName: \"kubernetes.io/projected/53af3bd0-780d-4e46-8843-c9dd4c4e0bb1-kube-api-access-cgklj\") pod \"auto-csr-approver-29557928-zk67m\" (UID: \"53af3bd0-780d-4e46-8843-c9dd4c4e0bb1\") " pod="openshift-infra/auto-csr-approver-29557928-zk67m" Mar 14 08:08:00 crc kubenswrapper[5129]: I0314 08:08:00.324919 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgklj\" (UniqueName: \"kubernetes.io/projected/53af3bd0-780d-4e46-8843-c9dd4c4e0bb1-kube-api-access-cgklj\") pod \"auto-csr-approver-29557928-zk67m\" (UID: \"53af3bd0-780d-4e46-8843-c9dd4c4e0bb1\") " 
pod="openshift-infra/auto-csr-approver-29557928-zk67m" Mar 14 08:08:00 crc kubenswrapper[5129]: I0314 08:08:00.461284 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557928-zk67m" Mar 14 08:08:00 crc kubenswrapper[5129]: I0314 08:08:00.857954 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557928-zk67m"] Mar 14 08:08:00 crc kubenswrapper[5129]: I0314 08:08:00.869952 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:08:01 crc kubenswrapper[5129]: I0314 08:08:01.463851 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557928-zk67m" event={"ID":"53af3bd0-780d-4e46-8843-c9dd4c4e0bb1","Type":"ContainerStarted","Data":"06968a5a38f270a60a84eccf52c119b34c7b2fd7cf9d1b8ae4644b37f2569cd0"} Mar 14 08:08:02 crc kubenswrapper[5129]: I0314 08:08:02.471184 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557928-zk67m" event={"ID":"53af3bd0-780d-4e46-8843-c9dd4c4e0bb1","Type":"ContainerStarted","Data":"0848a8d2863f6eb9d4d98980b9703272773d0161533b6a4e3accd00e29ccf377"} Mar 14 08:08:02 crc kubenswrapper[5129]: I0314 08:08:02.488142 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557928-zk67m" podStartSLOduration=1.307590972 podStartE2EDuration="2.48812505s" podCreationTimestamp="2026-03-14 08:08:00 +0000 UTC" firstStartedPulling="2026-03-14 08:08:00.86951951 +0000 UTC m=+4143.621434704" lastFinishedPulling="2026-03-14 08:08:02.050053598 +0000 UTC m=+4144.801968782" observedRunningTime="2026-03-14 08:08:02.483520166 +0000 UTC m=+4145.235435360" watchObservedRunningTime="2026-03-14 08:08:02.48812505 +0000 UTC m=+4145.240040234" Mar 14 08:08:03 crc kubenswrapper[5129]: I0314 08:08:03.480419 5129 generic.go:334] "Generic (PLEG): container finished" 
podID="53af3bd0-780d-4e46-8843-c9dd4c4e0bb1" containerID="0848a8d2863f6eb9d4d98980b9703272773d0161533b6a4e3accd00e29ccf377" exitCode=0 Mar 14 08:08:03 crc kubenswrapper[5129]: I0314 08:08:03.480462 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557928-zk67m" event={"ID":"53af3bd0-780d-4e46-8843-c9dd4c4e0bb1","Type":"ContainerDied","Data":"0848a8d2863f6eb9d4d98980b9703272773d0161533b6a4e3accd00e29ccf377"} Mar 14 08:08:04 crc kubenswrapper[5129]: I0314 08:08:04.741787 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557928-zk67m" Mar 14 08:08:04 crc kubenswrapper[5129]: I0314 08:08:04.774524 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgklj\" (UniqueName: \"kubernetes.io/projected/53af3bd0-780d-4e46-8843-c9dd4c4e0bb1-kube-api-access-cgklj\") pod \"53af3bd0-780d-4e46-8843-c9dd4c4e0bb1\" (UID: \"53af3bd0-780d-4e46-8843-c9dd4c4e0bb1\") " Mar 14 08:08:04 crc kubenswrapper[5129]: I0314 08:08:04.781068 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53af3bd0-780d-4e46-8843-c9dd4c4e0bb1-kube-api-access-cgklj" (OuterVolumeSpecName: "kube-api-access-cgklj") pod "53af3bd0-780d-4e46-8843-c9dd4c4e0bb1" (UID: "53af3bd0-780d-4e46-8843-c9dd4c4e0bb1"). InnerVolumeSpecName "kube-api-access-cgklj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:08:04 crc kubenswrapper[5129]: I0314 08:08:04.876250 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgklj\" (UniqueName: \"kubernetes.io/projected/53af3bd0-780d-4e46-8843-c9dd4c4e0bb1-kube-api-access-cgklj\") on node \"crc\" DevicePath \"\"" Mar 14 08:08:05 crc kubenswrapper[5129]: I0314 08:08:05.501082 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557928-zk67m" event={"ID":"53af3bd0-780d-4e46-8843-c9dd4c4e0bb1","Type":"ContainerDied","Data":"06968a5a38f270a60a84eccf52c119b34c7b2fd7cf9d1b8ae4644b37f2569cd0"} Mar 14 08:08:05 crc kubenswrapper[5129]: I0314 08:08:05.501140 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06968a5a38f270a60a84eccf52c119b34c7b2fd7cf9d1b8ae4644b37f2569cd0" Mar 14 08:08:05 crc kubenswrapper[5129]: I0314 08:08:05.501517 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557928-zk67m" Mar 14 08:08:05 crc kubenswrapper[5129]: I0314 08:08:05.569837 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557922-c2w6m"] Mar 14 08:08:05 crc kubenswrapper[5129]: I0314 08:08:05.575959 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557922-c2w6m"] Mar 14 08:08:06 crc kubenswrapper[5129]: I0314 08:08:06.044421 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2eac466-4494-4848-8384-b8989b3f4a13" path="/var/lib/kubelet/pods/f2eac466-4494-4848-8384-b8989b3f4a13/volumes" Mar 14 08:08:19 crc kubenswrapper[5129]: I0314 08:08:19.574894 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 14 08:08:19 crc kubenswrapper[5129]: I0314 08:08:19.576045 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:08:23 crc kubenswrapper[5129]: I0314 08:08:23.604549 5129 scope.go:117] "RemoveContainer" containerID="bd1fe37b0d34301f95a99d8dd4ecbb75227e9b7b87bbaa203affa5f2203bc426" Mar 14 08:08:49 crc kubenswrapper[5129]: I0314 08:08:49.574565 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:08:49 crc kubenswrapper[5129]: I0314 08:08:49.575124 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:08:49 crc kubenswrapper[5129]: I0314 08:08:49.575171 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 08:08:49 crc kubenswrapper[5129]: I0314 08:08:49.575847 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf794827ef327c71df144de582e947a9b3492752b6b76e8b1ee32a2702a6123e"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
14 08:08:49 crc kubenswrapper[5129]: I0314 08:08:49.575904 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://bf794827ef327c71df144de582e947a9b3492752b6b76e8b1ee32a2702a6123e" gracePeriod=600 Mar 14 08:08:49 crc kubenswrapper[5129]: I0314 08:08:49.842433 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="bf794827ef327c71df144de582e947a9b3492752b6b76e8b1ee32a2702a6123e" exitCode=0 Mar 14 08:08:49 crc kubenswrapper[5129]: I0314 08:08:49.842489 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"bf794827ef327c71df144de582e947a9b3492752b6b76e8b1ee32a2702a6123e"} Mar 14 08:08:49 crc kubenswrapper[5129]: I0314 08:08:49.842881 5129 scope.go:117] "RemoveContainer" containerID="05467b1c6d52d7fee988ca4eb23eb35f797b5220f48e6d169c201d4ccc8c6bb1" Mar 14 08:08:50 crc kubenswrapper[5129]: I0314 08:08:50.853550 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63"} Mar 14 08:09:58 crc kubenswrapper[5129]: I0314 08:09:58.743697 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ts4rq"] Mar 14 08:09:58 crc kubenswrapper[5129]: E0314 08:09:58.744460 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53af3bd0-780d-4e46-8843-c9dd4c4e0bb1" containerName="oc" Mar 14 08:09:58 crc kubenswrapper[5129]: I0314 08:09:58.744472 5129 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="53af3bd0-780d-4e46-8843-c9dd4c4e0bb1" containerName="oc" Mar 14 08:09:58 crc kubenswrapper[5129]: I0314 08:09:58.744635 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="53af3bd0-780d-4e46-8843-c9dd4c4e0bb1" containerName="oc" Mar 14 08:09:58 crc kubenswrapper[5129]: I0314 08:09:58.745507 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ts4rq" Mar 14 08:09:58 crc kubenswrapper[5129]: I0314 08:09:58.764546 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ts4rq"] Mar 14 08:09:58 crc kubenswrapper[5129]: I0314 08:09:58.876317 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsv6n\" (UniqueName: \"kubernetes.io/projected/d79a9d09-a223-4614-8c5e-8d18f737aa1f-kube-api-access-tsv6n\") pod \"redhat-marketplace-ts4rq\" (UID: \"d79a9d09-a223-4614-8c5e-8d18f737aa1f\") " pod="openshift-marketplace/redhat-marketplace-ts4rq" Mar 14 08:09:58 crc kubenswrapper[5129]: I0314 08:09:58.876653 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d79a9d09-a223-4614-8c5e-8d18f737aa1f-catalog-content\") pod \"redhat-marketplace-ts4rq\" (UID: \"d79a9d09-a223-4614-8c5e-8d18f737aa1f\") " pod="openshift-marketplace/redhat-marketplace-ts4rq" Mar 14 08:09:58 crc kubenswrapper[5129]: I0314 08:09:58.876762 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d79a9d09-a223-4614-8c5e-8d18f737aa1f-utilities\") pod \"redhat-marketplace-ts4rq\" (UID: \"d79a9d09-a223-4614-8c5e-8d18f737aa1f\") " pod="openshift-marketplace/redhat-marketplace-ts4rq" Mar 14 08:09:58 crc kubenswrapper[5129]: I0314 08:09:58.979768 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d79a9d09-a223-4614-8c5e-8d18f737aa1f-catalog-content\") pod \"redhat-marketplace-ts4rq\" (UID: \"d79a9d09-a223-4614-8c5e-8d18f737aa1f\") " pod="openshift-marketplace/redhat-marketplace-ts4rq" Mar 14 08:09:58 crc kubenswrapper[5129]: I0314 08:09:58.979831 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d79a9d09-a223-4614-8c5e-8d18f737aa1f-catalog-content\") pod \"redhat-marketplace-ts4rq\" (UID: \"d79a9d09-a223-4614-8c5e-8d18f737aa1f\") " pod="openshift-marketplace/redhat-marketplace-ts4rq" Mar 14 08:09:58 crc kubenswrapper[5129]: I0314 08:09:58.979894 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d79a9d09-a223-4614-8c5e-8d18f737aa1f-utilities\") pod \"redhat-marketplace-ts4rq\" (UID: \"d79a9d09-a223-4614-8c5e-8d18f737aa1f\") " pod="openshift-marketplace/redhat-marketplace-ts4rq" Mar 14 08:09:58 crc kubenswrapper[5129]: I0314 08:09:58.980388 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d79a9d09-a223-4614-8c5e-8d18f737aa1f-utilities\") pod \"redhat-marketplace-ts4rq\" (UID: \"d79a9d09-a223-4614-8c5e-8d18f737aa1f\") " pod="openshift-marketplace/redhat-marketplace-ts4rq" Mar 14 08:09:58 crc kubenswrapper[5129]: I0314 08:09:58.980493 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsv6n\" (UniqueName: \"kubernetes.io/projected/d79a9d09-a223-4614-8c5e-8d18f737aa1f-kube-api-access-tsv6n\") pod \"redhat-marketplace-ts4rq\" (UID: \"d79a9d09-a223-4614-8c5e-8d18f737aa1f\") " pod="openshift-marketplace/redhat-marketplace-ts4rq" Mar 14 08:09:59 crc kubenswrapper[5129]: I0314 08:09:59.007184 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsv6n\" (UniqueName: 
\"kubernetes.io/projected/d79a9d09-a223-4614-8c5e-8d18f737aa1f-kube-api-access-tsv6n\") pod \"redhat-marketplace-ts4rq\" (UID: \"d79a9d09-a223-4614-8c5e-8d18f737aa1f\") " pod="openshift-marketplace/redhat-marketplace-ts4rq" Mar 14 08:09:59 crc kubenswrapper[5129]: I0314 08:09:59.063658 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ts4rq" Mar 14 08:09:59 crc kubenswrapper[5129]: I0314 08:09:59.640488 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ts4rq"] Mar 14 08:10:00 crc kubenswrapper[5129]: I0314 08:10:00.139805 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557930-dfq6p"] Mar 14 08:10:00 crc kubenswrapper[5129]: I0314 08:10:00.140902 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557930-dfq6p" Mar 14 08:10:00 crc kubenswrapper[5129]: I0314 08:10:00.145149 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:10:00 crc kubenswrapper[5129]: I0314 08:10:00.145471 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:10:00 crc kubenswrapper[5129]: I0314 08:10:00.145479 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:10:00 crc kubenswrapper[5129]: I0314 08:10:00.146288 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557930-dfq6p"] Mar 14 08:10:00 crc kubenswrapper[5129]: I0314 08:10:00.299763 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjwdb\" (UniqueName: \"kubernetes.io/projected/7998feb4-2ba1-44c3-a5dd-abed98f4a674-kube-api-access-wjwdb\") pod \"auto-csr-approver-29557930-dfq6p\" (UID: 
\"7998feb4-2ba1-44c3-a5dd-abed98f4a674\") " pod="openshift-infra/auto-csr-approver-29557930-dfq6p" Mar 14 08:10:00 crc kubenswrapper[5129]: I0314 08:10:00.384928 5129 generic.go:334] "Generic (PLEG): container finished" podID="d79a9d09-a223-4614-8c5e-8d18f737aa1f" containerID="282732e89821cf615048d77c772d1af082e19cb3f5457702f37cb71eaf6cb337" exitCode=0 Mar 14 08:10:00 crc kubenswrapper[5129]: I0314 08:10:00.384979 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ts4rq" event={"ID":"d79a9d09-a223-4614-8c5e-8d18f737aa1f","Type":"ContainerDied","Data":"282732e89821cf615048d77c772d1af082e19cb3f5457702f37cb71eaf6cb337"} Mar 14 08:10:00 crc kubenswrapper[5129]: I0314 08:10:00.385028 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ts4rq" event={"ID":"d79a9d09-a223-4614-8c5e-8d18f737aa1f","Type":"ContainerStarted","Data":"1f3446f85a5c17b24802ddff802847e5cb25422c080cb47711d694888430ec79"} Mar 14 08:10:00 crc kubenswrapper[5129]: I0314 08:10:00.401041 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjwdb\" (UniqueName: \"kubernetes.io/projected/7998feb4-2ba1-44c3-a5dd-abed98f4a674-kube-api-access-wjwdb\") pod \"auto-csr-approver-29557930-dfq6p\" (UID: \"7998feb4-2ba1-44c3-a5dd-abed98f4a674\") " pod="openshift-infra/auto-csr-approver-29557930-dfq6p" Mar 14 08:10:00 crc kubenswrapper[5129]: I0314 08:10:00.425794 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjwdb\" (UniqueName: \"kubernetes.io/projected/7998feb4-2ba1-44c3-a5dd-abed98f4a674-kube-api-access-wjwdb\") pod \"auto-csr-approver-29557930-dfq6p\" (UID: \"7998feb4-2ba1-44c3-a5dd-abed98f4a674\") " pod="openshift-infra/auto-csr-approver-29557930-dfq6p" Mar 14 08:10:00 crc kubenswrapper[5129]: I0314 08:10:00.463199 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557930-dfq6p" Mar 14 08:10:00 crc kubenswrapper[5129]: I0314 08:10:00.877298 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557930-dfq6p"] Mar 14 08:10:01 crc kubenswrapper[5129]: I0314 08:10:01.396292 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557930-dfq6p" event={"ID":"7998feb4-2ba1-44c3-a5dd-abed98f4a674","Type":"ContainerStarted","Data":"be6d3d0517c0084c6573552232079e94f303ecb406ed1351b80e0c2646a0a155"} Mar 14 08:10:02 crc kubenswrapper[5129]: I0314 08:10:02.405765 5129 generic.go:334] "Generic (PLEG): container finished" podID="d79a9d09-a223-4614-8c5e-8d18f737aa1f" containerID="dcb0495f9e57a0c2e491364a30c1ee466942d993f4ef65a3676ced7015b09440" exitCode=0 Mar 14 08:10:02 crc kubenswrapper[5129]: I0314 08:10:02.406186 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ts4rq" event={"ID":"d79a9d09-a223-4614-8c5e-8d18f737aa1f","Type":"ContainerDied","Data":"dcb0495f9e57a0c2e491364a30c1ee466942d993f4ef65a3676ced7015b09440"} Mar 14 08:10:03 crc kubenswrapper[5129]: I0314 08:10:03.422264 5129 generic.go:334] "Generic (PLEG): container finished" podID="7998feb4-2ba1-44c3-a5dd-abed98f4a674" containerID="99bbfa14a218ede89403fdb13f0f64bf8ac37b5aaf53bef23f76214d51ca4901" exitCode=0 Mar 14 08:10:03 crc kubenswrapper[5129]: I0314 08:10:03.422388 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557930-dfq6p" event={"ID":"7998feb4-2ba1-44c3-a5dd-abed98f4a674","Type":"ContainerDied","Data":"99bbfa14a218ede89403fdb13f0f64bf8ac37b5aaf53bef23f76214d51ca4901"} Mar 14 08:10:04 crc kubenswrapper[5129]: I0314 08:10:04.437200 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ts4rq" 
event={"ID":"d79a9d09-a223-4614-8c5e-8d18f737aa1f","Type":"ContainerStarted","Data":"6dfa54522bdceb1e44798ed265b27444f0d46889fbc938615606dc8c411d663b"} Mar 14 08:10:04 crc kubenswrapper[5129]: I0314 08:10:04.726120 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557930-dfq6p" Mar 14 08:10:04 crc kubenswrapper[5129]: I0314 08:10:04.740649 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ts4rq" podStartSLOduration=3.805491567 podStartE2EDuration="6.740631887s" podCreationTimestamp="2026-03-14 08:09:58 +0000 UTC" firstStartedPulling="2026-03-14 08:10:00.388425993 +0000 UTC m=+4263.140341177" lastFinishedPulling="2026-03-14 08:10:03.323566313 +0000 UTC m=+4266.075481497" observedRunningTime="2026-03-14 08:10:04.460965428 +0000 UTC m=+4267.212880612" watchObservedRunningTime="2026-03-14 08:10:04.740631887 +0000 UTC m=+4267.492547061" Mar 14 08:10:04 crc kubenswrapper[5129]: I0314 08:10:04.869208 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjwdb\" (UniqueName: \"kubernetes.io/projected/7998feb4-2ba1-44c3-a5dd-abed98f4a674-kube-api-access-wjwdb\") pod \"7998feb4-2ba1-44c3-a5dd-abed98f4a674\" (UID: \"7998feb4-2ba1-44c3-a5dd-abed98f4a674\") " Mar 14 08:10:04 crc kubenswrapper[5129]: I0314 08:10:04.882310 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7998feb4-2ba1-44c3-a5dd-abed98f4a674-kube-api-access-wjwdb" (OuterVolumeSpecName: "kube-api-access-wjwdb") pod "7998feb4-2ba1-44c3-a5dd-abed98f4a674" (UID: "7998feb4-2ba1-44c3-a5dd-abed98f4a674"). InnerVolumeSpecName "kube-api-access-wjwdb". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:10:04 crc kubenswrapper[5129]: I0314 08:10:04.970460 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjwdb\" (UniqueName: \"kubernetes.io/projected/7998feb4-2ba1-44c3-a5dd-abed98f4a674-kube-api-access-wjwdb\") on node \"crc\" DevicePath \"\""
Mar 14 08:10:05 crc kubenswrapper[5129]: I0314 08:10:05.444125 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557930-dfq6p" event={"ID":"7998feb4-2ba1-44c3-a5dd-abed98f4a674","Type":"ContainerDied","Data":"be6d3d0517c0084c6573552232079e94f303ecb406ed1351b80e0c2646a0a155"}
Mar 14 08:10:05 crc kubenswrapper[5129]: I0314 08:10:05.444179 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be6d3d0517c0084c6573552232079e94f303ecb406ed1351b80e0c2646a0a155"
Mar 14 08:10:05 crc kubenswrapper[5129]: I0314 08:10:05.444139 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557930-dfq6p"
Mar 14 08:10:05 crc kubenswrapper[5129]: I0314 08:10:05.796136 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557924-w6lzp"]
Mar 14 08:10:05 crc kubenswrapper[5129]: I0314 08:10:05.801262 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557924-w6lzp"]
Mar 14 08:10:06 crc kubenswrapper[5129]: I0314 08:10:06.047864 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="239da77b-d926-4c8c-8e47-f93a6ac44969" path="/var/lib/kubelet/pods/239da77b-d926-4c8c-8e47-f93a6ac44969/volumes"
Mar 14 08:10:09 crc kubenswrapper[5129]: I0314 08:10:09.064338 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ts4rq"
Mar 14 08:10:09 crc kubenswrapper[5129]: I0314 08:10:09.064966 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ts4rq"
Mar 14 08:10:09 crc kubenswrapper[5129]: I0314 08:10:09.111347 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ts4rq"
Mar 14 08:10:09 crc kubenswrapper[5129]: I0314 08:10:09.517310 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ts4rq"
Mar 14 08:10:09 crc kubenswrapper[5129]: I0314 08:10:09.573597 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ts4rq"]
Mar 14 08:10:11 crc kubenswrapper[5129]: I0314 08:10:11.485741 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ts4rq" podUID="d79a9d09-a223-4614-8c5e-8d18f737aa1f" containerName="registry-server" containerID="cri-o://6dfa54522bdceb1e44798ed265b27444f0d46889fbc938615606dc8c411d663b" gracePeriod=2
Mar 14 08:10:11 crc kubenswrapper[5129]: I0314 08:10:11.900895 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ts4rq"
Mar 14 08:10:11 crc kubenswrapper[5129]: I0314 08:10:11.970774 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d79a9d09-a223-4614-8c5e-8d18f737aa1f-utilities\") pod \"d79a9d09-a223-4614-8c5e-8d18f737aa1f\" (UID: \"d79a9d09-a223-4614-8c5e-8d18f737aa1f\") "
Mar 14 08:10:11 crc kubenswrapper[5129]: I0314 08:10:11.970910 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d79a9d09-a223-4614-8c5e-8d18f737aa1f-catalog-content\") pod \"d79a9d09-a223-4614-8c5e-8d18f737aa1f\" (UID: \"d79a9d09-a223-4614-8c5e-8d18f737aa1f\") "
Mar 14 08:10:11 crc kubenswrapper[5129]: I0314 08:10:11.970958 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsv6n\" (UniqueName: \"kubernetes.io/projected/d79a9d09-a223-4614-8c5e-8d18f737aa1f-kube-api-access-tsv6n\") pod \"d79a9d09-a223-4614-8c5e-8d18f737aa1f\" (UID: \"d79a9d09-a223-4614-8c5e-8d18f737aa1f\") "
Mar 14 08:10:11 crc kubenswrapper[5129]: I0314 08:10:11.972076 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d79a9d09-a223-4614-8c5e-8d18f737aa1f-utilities" (OuterVolumeSpecName: "utilities") pod "d79a9d09-a223-4614-8c5e-8d18f737aa1f" (UID: "d79a9d09-a223-4614-8c5e-8d18f737aa1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:10:11 crc kubenswrapper[5129]: I0314 08:10:11.976475 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d79a9d09-a223-4614-8c5e-8d18f737aa1f-kube-api-access-tsv6n" (OuterVolumeSpecName: "kube-api-access-tsv6n") pod "d79a9d09-a223-4614-8c5e-8d18f737aa1f" (UID: "d79a9d09-a223-4614-8c5e-8d18f737aa1f"). InnerVolumeSpecName "kube-api-access-tsv6n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.024051 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d79a9d09-a223-4614-8c5e-8d18f737aa1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d79a9d09-a223-4614-8c5e-8d18f737aa1f" (UID: "d79a9d09-a223-4614-8c5e-8d18f737aa1f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.073083 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d79a9d09-a223-4614-8c5e-8d18f737aa1f-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.073113 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d79a9d09-a223-4614-8c5e-8d18f737aa1f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.073123 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsv6n\" (UniqueName: \"kubernetes.io/projected/d79a9d09-a223-4614-8c5e-8d18f737aa1f-kube-api-access-tsv6n\") on node \"crc\" DevicePath \"\""
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.494728 5129 generic.go:334] "Generic (PLEG): container finished" podID="d79a9d09-a223-4614-8c5e-8d18f737aa1f" containerID="6dfa54522bdceb1e44798ed265b27444f0d46889fbc938615606dc8c411d663b" exitCode=0
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.494809 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ts4rq"
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.494829 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ts4rq" event={"ID":"d79a9d09-a223-4614-8c5e-8d18f737aa1f","Type":"ContainerDied","Data":"6dfa54522bdceb1e44798ed265b27444f0d46889fbc938615606dc8c411d663b"}
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.495870 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ts4rq" event={"ID":"d79a9d09-a223-4614-8c5e-8d18f737aa1f","Type":"ContainerDied","Data":"1f3446f85a5c17b24802ddff802847e5cb25422c080cb47711d694888430ec79"}
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.495894 5129 scope.go:117] "RemoveContainer" containerID="6dfa54522bdceb1e44798ed265b27444f0d46889fbc938615606dc8c411d663b"
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.520826 5129 scope.go:117] "RemoveContainer" containerID="dcb0495f9e57a0c2e491364a30c1ee466942d993f4ef65a3676ced7015b09440"
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.523388 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ts4rq"]
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.532171 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ts4rq"]
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.542354 5129 scope.go:117] "RemoveContainer" containerID="282732e89821cf615048d77c772d1af082e19cb3f5457702f37cb71eaf6cb337"
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.575931 5129 scope.go:117] "RemoveContainer" containerID="6dfa54522bdceb1e44798ed265b27444f0d46889fbc938615606dc8c411d663b"
Mar 14 08:10:12 crc kubenswrapper[5129]: E0314 08:10:12.576439 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dfa54522bdceb1e44798ed265b27444f0d46889fbc938615606dc8c411d663b\": container with ID starting with 6dfa54522bdceb1e44798ed265b27444f0d46889fbc938615606dc8c411d663b not found: ID does not exist" containerID="6dfa54522bdceb1e44798ed265b27444f0d46889fbc938615606dc8c411d663b"
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.576497 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dfa54522bdceb1e44798ed265b27444f0d46889fbc938615606dc8c411d663b"} err="failed to get container status \"6dfa54522bdceb1e44798ed265b27444f0d46889fbc938615606dc8c411d663b\": rpc error: code = NotFound desc = could not find container \"6dfa54522bdceb1e44798ed265b27444f0d46889fbc938615606dc8c411d663b\": container with ID starting with 6dfa54522bdceb1e44798ed265b27444f0d46889fbc938615606dc8c411d663b not found: ID does not exist"
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.576530 5129 scope.go:117] "RemoveContainer" containerID="dcb0495f9e57a0c2e491364a30c1ee466942d993f4ef65a3676ced7015b09440"
Mar 14 08:10:12 crc kubenswrapper[5129]: E0314 08:10:12.577240 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcb0495f9e57a0c2e491364a30c1ee466942d993f4ef65a3676ced7015b09440\": container with ID starting with dcb0495f9e57a0c2e491364a30c1ee466942d993f4ef65a3676ced7015b09440 not found: ID does not exist" containerID="dcb0495f9e57a0c2e491364a30c1ee466942d993f4ef65a3676ced7015b09440"
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.577283 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcb0495f9e57a0c2e491364a30c1ee466942d993f4ef65a3676ced7015b09440"} err="failed to get container status \"dcb0495f9e57a0c2e491364a30c1ee466942d993f4ef65a3676ced7015b09440\": rpc error: code = NotFound desc = could not find container \"dcb0495f9e57a0c2e491364a30c1ee466942d993f4ef65a3676ced7015b09440\": container with ID starting with dcb0495f9e57a0c2e491364a30c1ee466942d993f4ef65a3676ced7015b09440 not found: ID does not exist"
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.577316 5129 scope.go:117] "RemoveContainer" containerID="282732e89821cf615048d77c772d1af082e19cb3f5457702f37cb71eaf6cb337"
Mar 14 08:10:12 crc kubenswrapper[5129]: E0314 08:10:12.577630 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"282732e89821cf615048d77c772d1af082e19cb3f5457702f37cb71eaf6cb337\": container with ID starting with 282732e89821cf615048d77c772d1af082e19cb3f5457702f37cb71eaf6cb337 not found: ID does not exist" containerID="282732e89821cf615048d77c772d1af082e19cb3f5457702f37cb71eaf6cb337"
Mar 14 08:10:12 crc kubenswrapper[5129]: I0314 08:10:12.577656 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"282732e89821cf615048d77c772d1af082e19cb3f5457702f37cb71eaf6cb337"} err="failed to get container status \"282732e89821cf615048d77c772d1af082e19cb3f5457702f37cb71eaf6cb337\": rpc error: code = NotFound desc = could not find container \"282732e89821cf615048d77c772d1af082e19cb3f5457702f37cb71eaf6cb337\": container with ID starting with 282732e89821cf615048d77c772d1af082e19cb3f5457702f37cb71eaf6cb337 not found: ID does not exist"
Mar 14 08:10:14 crc kubenswrapper[5129]: I0314 08:10:14.047569 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d79a9d09-a223-4614-8c5e-8d18f737aa1f" path="/var/lib/kubelet/pods/d79a9d09-a223-4614-8c5e-8d18f737aa1f/volumes"
Mar 14 08:10:23 crc kubenswrapper[5129]: I0314 08:10:23.722291 5129 scope.go:117] "RemoveContainer" containerID="da8ff0e09c6422857665044073566c0535dfc61c3c4e2fad85dcce654b631d8b"
Mar 14 08:10:49 crc kubenswrapper[5129]: I0314 08:10:49.574235 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 08:10:49 crc kubenswrapper[5129]: I0314 08:10:49.574832 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 08:11:19 crc kubenswrapper[5129]: I0314 08:11:19.574418 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 08:11:19 crc kubenswrapper[5129]: I0314 08:11:19.575020 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 08:11:49 crc kubenswrapper[5129]: I0314 08:11:49.574764 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 08:11:49 crc kubenswrapper[5129]: I0314 08:11:49.575487 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 08:11:49 crc kubenswrapper[5129]: I0314 08:11:49.575559 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh"
Mar 14 08:11:49 crc kubenswrapper[5129]: I0314 08:11:49.576496 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 14 08:11:49 crc kubenswrapper[5129]: I0314 08:11:49.576663 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63" gracePeriod=600
Mar 14 08:11:49 crc kubenswrapper[5129]: E0314 08:11:49.704315 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:11:50 crc kubenswrapper[5129]: I0314 08:11:50.190128 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63" exitCode=0
Mar 14 08:11:50 crc kubenswrapper[5129]: I0314 08:11:50.190201 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63"}
Mar 14 08:11:50 crc kubenswrapper[5129]: I0314 08:11:50.190744 5129 scope.go:117] "RemoveContainer" containerID="bf794827ef327c71df144de582e947a9b3492752b6b76e8b1ee32a2702a6123e"
Mar 14 08:11:50 crc kubenswrapper[5129]: I0314 08:11:50.191306 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63"
Mar 14 08:11:50 crc kubenswrapper[5129]: E0314 08:11:50.191537 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:12:00 crc kubenswrapper[5129]: I0314 08:12:00.150413 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557932-h652f"]
Mar 14 08:12:00 crc kubenswrapper[5129]: E0314 08:12:00.151392 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7998feb4-2ba1-44c3-a5dd-abed98f4a674" containerName="oc"
Mar 14 08:12:00 crc kubenswrapper[5129]: I0314 08:12:00.151411 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7998feb4-2ba1-44c3-a5dd-abed98f4a674" containerName="oc"
Mar 14 08:12:00 crc kubenswrapper[5129]: E0314 08:12:00.151436 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d79a9d09-a223-4614-8c5e-8d18f737aa1f" containerName="extract-utilities"
Mar 14 08:12:00 crc kubenswrapper[5129]: I0314 08:12:00.151445 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d79a9d09-a223-4614-8c5e-8d18f737aa1f" containerName="extract-utilities"
Mar 14 08:12:00 crc kubenswrapper[5129]: E0314 08:12:00.151478 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d79a9d09-a223-4614-8c5e-8d18f737aa1f" containerName="extract-content"
Mar 14 08:12:00 crc kubenswrapper[5129]: I0314 08:12:00.151488 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d79a9d09-a223-4614-8c5e-8d18f737aa1f" containerName="extract-content"
Mar 14 08:12:00 crc kubenswrapper[5129]: E0314 08:12:00.151498 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d79a9d09-a223-4614-8c5e-8d18f737aa1f" containerName="registry-server"
Mar 14 08:12:00 crc kubenswrapper[5129]: I0314 08:12:00.151505 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d79a9d09-a223-4614-8c5e-8d18f737aa1f" containerName="registry-server"
Mar 14 08:12:00 crc kubenswrapper[5129]: I0314 08:12:00.151683 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="7998feb4-2ba1-44c3-a5dd-abed98f4a674" containerName="oc"
Mar 14 08:12:00 crc kubenswrapper[5129]: I0314 08:12:00.151700 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d79a9d09-a223-4614-8c5e-8d18f737aa1f" containerName="registry-server"
Mar 14 08:12:00 crc kubenswrapper[5129]: I0314 08:12:00.189758 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557932-h652f"]
Mar 14 08:12:00 crc kubenswrapper[5129]: I0314 08:12:00.189892 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557932-h652f"
Mar 14 08:12:00 crc kubenswrapper[5129]: I0314 08:12:00.191817 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf"
Mar 14 08:12:00 crc kubenswrapper[5129]: I0314 08:12:00.191858 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 08:12:00 crc kubenswrapper[5129]: I0314 08:12:00.192582 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 08:12:00 crc kubenswrapper[5129]: I0314 08:12:00.194184 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6mbv\" (UniqueName: \"kubernetes.io/projected/8613d333-dcde-40a0-84a3-fa23e80a7a74-kube-api-access-d6mbv\") pod \"auto-csr-approver-29557932-h652f\" (UID: \"8613d333-dcde-40a0-84a3-fa23e80a7a74\") " pod="openshift-infra/auto-csr-approver-29557932-h652f"
Mar 14 08:12:00 crc kubenswrapper[5129]: I0314 08:12:00.295679 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6mbv\" (UniqueName: \"kubernetes.io/projected/8613d333-dcde-40a0-84a3-fa23e80a7a74-kube-api-access-d6mbv\") pod \"auto-csr-approver-29557932-h652f\" (UID: \"8613d333-dcde-40a0-84a3-fa23e80a7a74\") " pod="openshift-infra/auto-csr-approver-29557932-h652f"
Mar 14 08:12:00 crc kubenswrapper[5129]: I0314 08:12:00.317162 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6mbv\" (UniqueName: \"kubernetes.io/projected/8613d333-dcde-40a0-84a3-fa23e80a7a74-kube-api-access-d6mbv\") pod \"auto-csr-approver-29557932-h652f\" (UID: \"8613d333-dcde-40a0-84a3-fa23e80a7a74\") " pod="openshift-infra/auto-csr-approver-29557932-h652f"
Mar 14 08:12:00 crc kubenswrapper[5129]: I0314 08:12:00.506552 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557932-h652f"
Mar 14 08:12:00 crc kubenswrapper[5129]: I0314 08:12:00.899680 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557932-h652f"]
Mar 14 08:12:01 crc kubenswrapper[5129]: I0314 08:12:01.270888 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557932-h652f" event={"ID":"8613d333-dcde-40a0-84a3-fa23e80a7a74","Type":"ContainerStarted","Data":"759b251d827aff8d5c6e75dd4e78330ca4f0a8b61c3dd0bd148986c8894ff7e5"}
Mar 14 08:12:02 crc kubenswrapper[5129]: I0314 08:12:02.280108 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557932-h652f" event={"ID":"8613d333-dcde-40a0-84a3-fa23e80a7a74","Type":"ContainerStarted","Data":"72285e1be024d3c03c6e1119bc316325679e223688d6d9e562da2cee74e816de"}
Mar 14 08:12:02 crc kubenswrapper[5129]: I0314 08:12:02.298871 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557932-h652f" podStartSLOduration=1.45191965 podStartE2EDuration="2.298851103s" podCreationTimestamp="2026-03-14 08:12:00 +0000 UTC" firstStartedPulling="2026-03-14 08:12:01.023374286 +0000 UTC m=+4383.775289470" lastFinishedPulling="2026-03-14 08:12:01.870305739 +0000 UTC m=+4384.622220923" observedRunningTime="2026-03-14 08:12:02.294530945 +0000 UTC m=+4385.046446129" watchObservedRunningTime="2026-03-14 08:12:02.298851103 +0000 UTC m=+4385.050766297"
Mar 14 08:12:03 crc kubenswrapper[5129]: I0314 08:12:03.036908 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63"
Mar 14 08:12:03 crc kubenswrapper[5129]: E0314 08:12:03.037280 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:12:03 crc kubenswrapper[5129]: I0314 08:12:03.291695 5129 generic.go:334] "Generic (PLEG): container finished" podID="8613d333-dcde-40a0-84a3-fa23e80a7a74" containerID="72285e1be024d3c03c6e1119bc316325679e223688d6d9e562da2cee74e816de" exitCode=0
Mar 14 08:12:03 crc kubenswrapper[5129]: I0314 08:12:03.291751 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557932-h652f" event={"ID":"8613d333-dcde-40a0-84a3-fa23e80a7a74","Type":"ContainerDied","Data":"72285e1be024d3c03c6e1119bc316325679e223688d6d9e562da2cee74e816de"}
Mar 14 08:12:04 crc kubenswrapper[5129]: I0314 08:12:04.538278 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557932-h652f"
Mar 14 08:12:04 crc kubenswrapper[5129]: I0314 08:12:04.656054 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6mbv\" (UniqueName: \"kubernetes.io/projected/8613d333-dcde-40a0-84a3-fa23e80a7a74-kube-api-access-d6mbv\") pod \"8613d333-dcde-40a0-84a3-fa23e80a7a74\" (UID: \"8613d333-dcde-40a0-84a3-fa23e80a7a74\") "
Mar 14 08:12:04 crc kubenswrapper[5129]: I0314 08:12:04.662705 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8613d333-dcde-40a0-84a3-fa23e80a7a74-kube-api-access-d6mbv" (OuterVolumeSpecName: "kube-api-access-d6mbv") pod "8613d333-dcde-40a0-84a3-fa23e80a7a74" (UID: "8613d333-dcde-40a0-84a3-fa23e80a7a74"). InnerVolumeSpecName "kube-api-access-d6mbv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:12:04 crc kubenswrapper[5129]: I0314 08:12:04.757909 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6mbv\" (UniqueName: \"kubernetes.io/projected/8613d333-dcde-40a0-84a3-fa23e80a7a74-kube-api-access-d6mbv\") on node \"crc\" DevicePath \"\""
Mar 14 08:12:05 crc kubenswrapper[5129]: I0314 08:12:05.306767 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557932-h652f" event={"ID":"8613d333-dcde-40a0-84a3-fa23e80a7a74","Type":"ContainerDied","Data":"759b251d827aff8d5c6e75dd4e78330ca4f0a8b61c3dd0bd148986c8894ff7e5"}
Mar 14 08:12:05 crc kubenswrapper[5129]: I0314 08:12:05.306823 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="759b251d827aff8d5c6e75dd4e78330ca4f0a8b61c3dd0bd148986c8894ff7e5"
Mar 14 08:12:05 crc kubenswrapper[5129]: I0314 08:12:05.306856 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557932-h652f"
Mar 14 08:12:05 crc kubenswrapper[5129]: I0314 08:12:05.360064 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557926-wrl9s"]
Mar 14 08:12:05 crc kubenswrapper[5129]: I0314 08:12:05.364856 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557926-wrl9s"]
Mar 14 08:12:06 crc kubenswrapper[5129]: I0314 08:12:06.046917 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a72e95e9-1005-4f78-98ec-3541836fe5b9" path="/var/lib/kubelet/pods/a72e95e9-1005-4f78-98ec-3541836fe5b9/volumes"
Mar 14 08:12:14 crc kubenswrapper[5129]: I0314 08:12:14.036901 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63"
Mar 14 08:12:14 crc kubenswrapper[5129]: E0314 08:12:14.037830 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:12:23 crc kubenswrapper[5129]: I0314 08:12:23.802585 5129 scope.go:117] "RemoveContainer" containerID="af30279bc8f2e4b677b644747a4f3d58637a056eaa26617c7c0f4c2b188727c8"
Mar 14 08:12:26 crc kubenswrapper[5129]: I0314 08:12:26.036388 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63"
Mar 14 08:12:26 crc kubenswrapper[5129]: E0314 08:12:26.037219 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:12:38 crc kubenswrapper[5129]: I0314 08:12:38.043638 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63"
Mar 14 08:12:38 crc kubenswrapper[5129]: E0314 08:12:38.044683 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:12:50 crc kubenswrapper[5129]: I0314 08:12:50.036336 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63"
Mar 14 08:12:50 crc kubenswrapper[5129]: E0314 08:12:50.037078 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:13:01 crc kubenswrapper[5129]: I0314 08:13:01.037062 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63"
Mar 14 08:13:01 crc kubenswrapper[5129]: E0314 08:13:01.038031 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:13:14 crc kubenswrapper[5129]: I0314 08:13:14.035758 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63"
Mar 14 08:13:14 crc kubenswrapper[5129]: E0314 08:13:14.036409 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:13:27 crc kubenswrapper[5129]: I0314 08:13:27.036313 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63"
Mar 14 08:13:27 crc kubenswrapper[5129]: E0314 08:13:27.037167 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:13:41 crc kubenswrapper[5129]: I0314 08:13:41.036963 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63"
Mar 14 08:13:41 crc kubenswrapper[5129]: E0314 08:13:41.037997 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:13:54 crc kubenswrapper[5129]: I0314 08:13:54.037380 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63"
Mar 14 08:13:54 crc kubenswrapper[5129]: E0314 08:13:54.038829 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:14:00 crc kubenswrapper[5129]: I0314 08:14:00.159797 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557934-fxmft"]
Mar 14 08:14:00 crc kubenswrapper[5129]: E0314 08:14:00.161145 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8613d333-dcde-40a0-84a3-fa23e80a7a74" containerName="oc"
Mar 14 08:14:00 crc kubenswrapper[5129]: I0314 08:14:00.161175 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8613d333-dcde-40a0-84a3-fa23e80a7a74" containerName="oc"
Mar 14 08:14:00 crc kubenswrapper[5129]: I0314 08:14:00.161457 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8613d333-dcde-40a0-84a3-fa23e80a7a74" containerName="oc"
Mar 14 08:14:00 crc kubenswrapper[5129]: I0314 08:14:00.162396 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557934-fxmft"
Mar 14 08:14:00 crc kubenswrapper[5129]: I0314 08:14:00.167369 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf"
Mar 14 08:14:00 crc kubenswrapper[5129]: I0314 08:14:00.168902 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 08:14:00 crc kubenswrapper[5129]: I0314 08:14:00.169191 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 08:14:00 crc kubenswrapper[5129]: I0314 08:14:00.174169 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557934-fxmft"]
Mar 14 08:14:00 crc kubenswrapper[5129]: I0314 08:14:00.333479 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2klfd\" (UniqueName: \"kubernetes.io/projected/c1711beb-5567-448a-96f3-c77f77edf0d0-kube-api-access-2klfd\") pod \"auto-csr-approver-29557934-fxmft\" (UID: \"c1711beb-5567-448a-96f3-c77f77edf0d0\") " pod="openshift-infra/auto-csr-approver-29557934-fxmft"
Mar 14 08:14:00 crc kubenswrapper[5129]: I0314 08:14:00.435429 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2klfd\" (UniqueName: \"kubernetes.io/projected/c1711beb-5567-448a-96f3-c77f77edf0d0-kube-api-access-2klfd\") pod \"auto-csr-approver-29557934-fxmft\" (UID: \"c1711beb-5567-448a-96f3-c77f77edf0d0\") " pod="openshift-infra/auto-csr-approver-29557934-fxmft"
Mar 14 08:14:00 crc kubenswrapper[5129]: I0314 08:14:00.455585 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2klfd\" (UniqueName: \"kubernetes.io/projected/c1711beb-5567-448a-96f3-c77f77edf0d0-kube-api-access-2klfd\") pod \"auto-csr-approver-29557934-fxmft\" (UID: \"c1711beb-5567-448a-96f3-c77f77edf0d0\") " pod="openshift-infra/auto-csr-approver-29557934-fxmft"
Mar 14 08:14:00 crc kubenswrapper[5129]: I0314 08:14:00.541581 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557934-fxmft"
Mar 14 08:14:01 crc kubenswrapper[5129]: I0314 08:14:01.030295 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557934-fxmft"]
Mar 14 08:14:01 crc kubenswrapper[5129]: I0314 08:14:01.037342 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 14 08:14:01 crc kubenswrapper[5129]: I0314 08:14:01.304249 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557934-fxmft" event={"ID":"c1711beb-5567-448a-96f3-c77f77edf0d0","Type":"ContainerStarted","Data":"a22ce8cf4a2f606abb84fd06a7c9ac6740a83dac04f9dfec4c0fdc6b5ea95883"}
Mar 14 08:14:03 crc kubenswrapper[5129]: I0314 08:14:03.342011 5129 generic.go:334] "Generic (PLEG): container finished" podID="c1711beb-5567-448a-96f3-c77f77edf0d0" containerID="9b02accea30ff179a18cbd7b922f70b0eebd80e381d316e8540319d9e11f6147" exitCode=0
Mar 14 08:14:03 crc kubenswrapper[5129]: I0314 08:14:03.343023 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557934-fxmft" event={"ID":"c1711beb-5567-448a-96f3-c77f77edf0d0","Type":"ContainerDied","Data":"9b02accea30ff179a18cbd7b922f70b0eebd80e381d316e8540319d9e11f6147"}
Mar 14 08:14:04 crc kubenswrapper[5129]: I0314 08:14:04.657500 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557934-fxmft"
Mar 14 08:14:04 crc kubenswrapper[5129]: I0314 08:14:04.814779 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2klfd\" (UniqueName: \"kubernetes.io/projected/c1711beb-5567-448a-96f3-c77f77edf0d0-kube-api-access-2klfd\") pod \"c1711beb-5567-448a-96f3-c77f77edf0d0\" (UID: \"c1711beb-5567-448a-96f3-c77f77edf0d0\") "
Mar 14 08:14:04 crc kubenswrapper[5129]: I0314 08:14:04.823532 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1711beb-5567-448a-96f3-c77f77edf0d0-kube-api-access-2klfd" (OuterVolumeSpecName: "kube-api-access-2klfd") pod "c1711beb-5567-448a-96f3-c77f77edf0d0" (UID: "c1711beb-5567-448a-96f3-c77f77edf0d0"). InnerVolumeSpecName "kube-api-access-2klfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:14:04 crc kubenswrapper[5129]: I0314 08:14:04.918164 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2klfd\" (UniqueName: \"kubernetes.io/projected/c1711beb-5567-448a-96f3-c77f77edf0d0-kube-api-access-2klfd\") on node \"crc\" DevicePath \"\"" Mar 14 08:14:05 crc kubenswrapper[5129]: I0314 08:14:05.367567 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557934-fxmft" event={"ID":"c1711beb-5567-448a-96f3-c77f77edf0d0","Type":"ContainerDied","Data":"a22ce8cf4a2f606abb84fd06a7c9ac6740a83dac04f9dfec4c0fdc6b5ea95883"} Mar 14 08:14:05 crc kubenswrapper[5129]: I0314 08:14:05.367675 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a22ce8cf4a2f606abb84fd06a7c9ac6740a83dac04f9dfec4c0fdc6b5ea95883" Mar 14 08:14:05 crc kubenswrapper[5129]: I0314 08:14:05.367743 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557934-fxmft" Mar 14 08:14:05 crc kubenswrapper[5129]: I0314 08:14:05.744726 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557928-zk67m"] Mar 14 08:14:05 crc kubenswrapper[5129]: I0314 08:14:05.756736 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557928-zk67m"] Mar 14 08:14:06 crc kubenswrapper[5129]: I0314 08:14:06.054388 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53af3bd0-780d-4e46-8843-c9dd4c4e0bb1" path="/var/lib/kubelet/pods/53af3bd0-780d-4e46-8843-c9dd4c4e0bb1/volumes" Mar 14 08:14:07 crc kubenswrapper[5129]: I0314 08:14:07.036691 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63" Mar 14 08:14:07 crc kubenswrapper[5129]: E0314 08:14:07.037196 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:14:18 crc kubenswrapper[5129]: I0314 08:14:18.037475 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63" Mar 14 08:14:18 crc kubenswrapper[5129]: E0314 08:14:18.038810 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:14:23 crc kubenswrapper[5129]: I0314 08:14:23.888263 5129 scope.go:117] "RemoveContainer" containerID="0848a8d2863f6eb9d4d98980b9703272773d0161533b6a4e3accd00e29ccf377" Mar 14 08:14:30 crc kubenswrapper[5129]: I0314 08:14:30.037437 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63" Mar 14 08:14:30 crc kubenswrapper[5129]: E0314 08:14:30.038140 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:14:41 crc kubenswrapper[5129]: I0314 08:14:41.036955 5129 scope.go:117] "RemoveContainer" 
containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63" Mar 14 08:14:41 crc kubenswrapper[5129]: E0314 08:14:41.037664 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:14:46 crc kubenswrapper[5129]: I0314 08:14:46.730153 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7qkhr"] Mar 14 08:14:46 crc kubenswrapper[5129]: E0314 08:14:46.731551 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1711beb-5567-448a-96f3-c77f77edf0d0" containerName="oc" Mar 14 08:14:46 crc kubenswrapper[5129]: I0314 08:14:46.731587 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1711beb-5567-448a-96f3-c77f77edf0d0" containerName="oc" Mar 14 08:14:46 crc kubenswrapper[5129]: I0314 08:14:46.732010 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1711beb-5567-448a-96f3-c77f77edf0d0" containerName="oc" Mar 14 08:14:46 crc kubenswrapper[5129]: I0314 08:14:46.734497 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7qkhr" Mar 14 08:14:46 crc kubenswrapper[5129]: I0314 08:14:46.736501 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7qkhr"] Mar 14 08:14:46 crc kubenswrapper[5129]: I0314 08:14:46.766565 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt6d9\" (UniqueName: \"kubernetes.io/projected/637e03fa-e13f-4582-b79c-0ba685307be8-kube-api-access-zt6d9\") pod \"certified-operators-7qkhr\" (UID: \"637e03fa-e13f-4582-b79c-0ba685307be8\") " pod="openshift-marketplace/certified-operators-7qkhr" Mar 14 08:14:46 crc kubenswrapper[5129]: I0314 08:14:46.766686 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637e03fa-e13f-4582-b79c-0ba685307be8-catalog-content\") pod \"certified-operators-7qkhr\" (UID: \"637e03fa-e13f-4582-b79c-0ba685307be8\") " pod="openshift-marketplace/certified-operators-7qkhr" Mar 14 08:14:46 crc kubenswrapper[5129]: I0314 08:14:46.766759 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637e03fa-e13f-4582-b79c-0ba685307be8-utilities\") pod \"certified-operators-7qkhr\" (UID: \"637e03fa-e13f-4582-b79c-0ba685307be8\") " pod="openshift-marketplace/certified-operators-7qkhr" Mar 14 08:14:46 crc kubenswrapper[5129]: I0314 08:14:46.867957 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637e03fa-e13f-4582-b79c-0ba685307be8-utilities\") pod \"certified-operators-7qkhr\" (UID: \"637e03fa-e13f-4582-b79c-0ba685307be8\") " pod="openshift-marketplace/certified-operators-7qkhr" Mar 14 08:14:46 crc kubenswrapper[5129]: I0314 08:14:46.868391 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zt6d9\" (UniqueName: \"kubernetes.io/projected/637e03fa-e13f-4582-b79c-0ba685307be8-kube-api-access-zt6d9\") pod \"certified-operators-7qkhr\" (UID: \"637e03fa-e13f-4582-b79c-0ba685307be8\") " pod="openshift-marketplace/certified-operators-7qkhr" Mar 14 08:14:46 crc kubenswrapper[5129]: I0314 08:14:46.868514 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637e03fa-e13f-4582-b79c-0ba685307be8-utilities\") pod \"certified-operators-7qkhr\" (UID: \"637e03fa-e13f-4582-b79c-0ba685307be8\") " pod="openshift-marketplace/certified-operators-7qkhr" Mar 14 08:14:46 crc kubenswrapper[5129]: I0314 08:14:46.868526 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637e03fa-e13f-4582-b79c-0ba685307be8-catalog-content\") pod \"certified-operators-7qkhr\" (UID: \"637e03fa-e13f-4582-b79c-0ba685307be8\") " pod="openshift-marketplace/certified-operators-7qkhr" Mar 14 08:14:46 crc kubenswrapper[5129]: I0314 08:14:46.869117 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637e03fa-e13f-4582-b79c-0ba685307be8-catalog-content\") pod \"certified-operators-7qkhr\" (UID: \"637e03fa-e13f-4582-b79c-0ba685307be8\") " pod="openshift-marketplace/certified-operators-7qkhr" Mar 14 08:14:46 crc kubenswrapper[5129]: I0314 08:14:46.886709 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt6d9\" (UniqueName: \"kubernetes.io/projected/637e03fa-e13f-4582-b79c-0ba685307be8-kube-api-access-zt6d9\") pod \"certified-operators-7qkhr\" (UID: \"637e03fa-e13f-4582-b79c-0ba685307be8\") " pod="openshift-marketplace/certified-operators-7qkhr" Mar 14 08:14:47 crc kubenswrapper[5129]: I0314 08:14:47.087893 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7qkhr" Mar 14 08:14:47 crc kubenswrapper[5129]: I0314 08:14:47.547333 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7qkhr"] Mar 14 08:14:47 crc kubenswrapper[5129]: I0314 08:14:47.783931 5129 generic.go:334] "Generic (PLEG): container finished" podID="637e03fa-e13f-4582-b79c-0ba685307be8" containerID="cdb3c4e6eea4200a34fcb3632e3986c0cc01b401b8efcbc54da285a11afb7beb" exitCode=0 Mar 14 08:14:47 crc kubenswrapper[5129]: I0314 08:14:47.783983 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qkhr" event={"ID":"637e03fa-e13f-4582-b79c-0ba685307be8","Type":"ContainerDied","Data":"cdb3c4e6eea4200a34fcb3632e3986c0cc01b401b8efcbc54da285a11afb7beb"} Mar 14 08:14:47 crc kubenswrapper[5129]: I0314 08:14:47.784019 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qkhr" event={"ID":"637e03fa-e13f-4582-b79c-0ba685307be8","Type":"ContainerStarted","Data":"0272563dbec5a49b76131348fc03b18d2a5047025ce7ef2e31c5c35190b1491d"} Mar 14 08:14:48 crc kubenswrapper[5129]: I0314 08:14:48.797902 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qkhr" event={"ID":"637e03fa-e13f-4582-b79c-0ba685307be8","Type":"ContainerStarted","Data":"972e32e74d459f69cf8e3ea2f1ded308d5d8fb3c6950563c9fb35db0b9e724c9"} Mar 14 08:14:49 crc kubenswrapper[5129]: I0314 08:14:49.805105 5129 generic.go:334] "Generic (PLEG): container finished" podID="637e03fa-e13f-4582-b79c-0ba685307be8" containerID="972e32e74d459f69cf8e3ea2f1ded308d5d8fb3c6950563c9fb35db0b9e724c9" exitCode=0 Mar 14 08:14:49 crc kubenswrapper[5129]: I0314 08:14:49.805161 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qkhr" 
event={"ID":"637e03fa-e13f-4582-b79c-0ba685307be8","Type":"ContainerDied","Data":"972e32e74d459f69cf8e3ea2f1ded308d5d8fb3c6950563c9fb35db0b9e724c9"} Mar 14 08:14:50 crc kubenswrapper[5129]: I0314 08:14:50.813315 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qkhr" event={"ID":"637e03fa-e13f-4582-b79c-0ba685307be8","Type":"ContainerStarted","Data":"6722fe2a34c69d568a703dc236b739e517995fb2b15ee919b8607d4b4daad792"} Mar 14 08:14:56 crc kubenswrapper[5129]: I0314 08:14:56.036523 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63" Mar 14 08:14:56 crc kubenswrapper[5129]: E0314 08:14:56.037341 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:14:57 crc kubenswrapper[5129]: I0314 08:14:57.088788 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7qkhr" Mar 14 08:14:57 crc kubenswrapper[5129]: I0314 08:14:57.088884 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7qkhr" Mar 14 08:14:57 crc kubenswrapper[5129]: I0314 08:14:57.165852 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7qkhr" Mar 14 08:14:57 crc kubenswrapper[5129]: I0314 08:14:57.186657 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7qkhr" podStartSLOduration=8.759247363 podStartE2EDuration="11.186638928s" 
podCreationTimestamp="2026-03-14 08:14:46 +0000 UTC" firstStartedPulling="2026-03-14 08:14:47.787109181 +0000 UTC m=+4550.539024375" lastFinishedPulling="2026-03-14 08:14:50.214500756 +0000 UTC m=+4552.966415940" observedRunningTime="2026-03-14 08:14:50.83624051 +0000 UTC m=+4553.588155694" watchObservedRunningTime="2026-03-14 08:14:57.186638928 +0000 UTC m=+4559.938554112" Mar 14 08:14:57 crc kubenswrapper[5129]: I0314 08:14:57.894909 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7qkhr" Mar 14 08:14:57 crc kubenswrapper[5129]: I0314 08:14:57.935373 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7qkhr"] Mar 14 08:14:59 crc kubenswrapper[5129]: I0314 08:14:59.871333 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7qkhr" podUID="637e03fa-e13f-4582-b79c-0ba685307be8" containerName="registry-server" containerID="cri-o://6722fe2a34c69d568a703dc236b739e517995fb2b15ee919b8607d4b4daad792" gracePeriod=2 Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.141860 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5"] Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.143050 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.145105 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.146293 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.153161 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5"] Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.299461 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42a9b5e2-d646-4c22-930e-a5ac08cf3e56-config-volume\") pod \"collect-profiles-29557935-tctc5\" (UID: \"42a9b5e2-d646-4c22-930e-a5ac08cf3e56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.299540 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdwvn\" (UniqueName: \"kubernetes.io/projected/42a9b5e2-d646-4c22-930e-a5ac08cf3e56-kube-api-access-xdwvn\") pod \"collect-profiles-29557935-tctc5\" (UID: \"42a9b5e2-d646-4c22-930e-a5ac08cf3e56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.299585 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42a9b5e2-d646-4c22-930e-a5ac08cf3e56-secret-volume\") pod \"collect-profiles-29557935-tctc5\" (UID: \"42a9b5e2-d646-4c22-930e-a5ac08cf3e56\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.401652 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42a9b5e2-d646-4c22-930e-a5ac08cf3e56-secret-volume\") pod \"collect-profiles-29557935-tctc5\" (UID: \"42a9b5e2-d646-4c22-930e-a5ac08cf3e56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.401926 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42a9b5e2-d646-4c22-930e-a5ac08cf3e56-config-volume\") pod \"collect-profiles-29557935-tctc5\" (UID: \"42a9b5e2-d646-4c22-930e-a5ac08cf3e56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.402025 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdwvn\" (UniqueName: \"kubernetes.io/projected/42a9b5e2-d646-4c22-930e-a5ac08cf3e56-kube-api-access-xdwvn\") pod \"collect-profiles-29557935-tctc5\" (UID: \"42a9b5e2-d646-4c22-930e-a5ac08cf3e56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.404845 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42a9b5e2-d646-4c22-930e-a5ac08cf3e56-config-volume\") pod \"collect-profiles-29557935-tctc5\" (UID: \"42a9b5e2-d646-4c22-930e-a5ac08cf3e56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.418697 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/42a9b5e2-d646-4c22-930e-a5ac08cf3e56-secret-volume\") pod \"collect-profiles-29557935-tctc5\" (UID: \"42a9b5e2-d646-4c22-930e-a5ac08cf3e56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.422236 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdwvn\" (UniqueName: \"kubernetes.io/projected/42a9b5e2-d646-4c22-930e-a5ac08cf3e56-kube-api-access-xdwvn\") pod \"collect-profiles-29557935-tctc5\" (UID: \"42a9b5e2-d646-4c22-930e-a5ac08cf3e56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.461029 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.777075 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7qkhr" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.863907 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5"] Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.886182 5129 generic.go:334] "Generic (PLEG): container finished" podID="637e03fa-e13f-4582-b79c-0ba685307be8" containerID="6722fe2a34c69d568a703dc236b739e517995fb2b15ee919b8607d4b4daad792" exitCode=0 Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.886226 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qkhr" event={"ID":"637e03fa-e13f-4582-b79c-0ba685307be8","Type":"ContainerDied","Data":"6722fe2a34c69d568a703dc236b739e517995fb2b15ee919b8607d4b4daad792"} Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.886253 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qkhr" event={"ID":"637e03fa-e13f-4582-b79c-0ba685307be8","Type":"ContainerDied","Data":"0272563dbec5a49b76131348fc03b18d2a5047025ce7ef2e31c5c35190b1491d"} Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.886270 5129 scope.go:117] "RemoveContainer" containerID="6722fe2a34c69d568a703dc236b739e517995fb2b15ee919b8607d4b4daad792" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.886391 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7qkhr" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.905862 5129 scope.go:117] "RemoveContainer" containerID="972e32e74d459f69cf8e3ea2f1ded308d5d8fb3c6950563c9fb35db0b9e724c9" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.908197 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637e03fa-e13f-4582-b79c-0ba685307be8-catalog-content\") pod \"637e03fa-e13f-4582-b79c-0ba685307be8\" (UID: \"637e03fa-e13f-4582-b79c-0ba685307be8\") " Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.908752 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637e03fa-e13f-4582-b79c-0ba685307be8-utilities\") pod \"637e03fa-e13f-4582-b79c-0ba685307be8\" (UID: \"637e03fa-e13f-4582-b79c-0ba685307be8\") " Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.908800 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt6d9\" (UniqueName: \"kubernetes.io/projected/637e03fa-e13f-4582-b79c-0ba685307be8-kube-api-access-zt6d9\") pod \"637e03fa-e13f-4582-b79c-0ba685307be8\" (UID: \"637e03fa-e13f-4582-b79c-0ba685307be8\") " Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.910155 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/637e03fa-e13f-4582-b79c-0ba685307be8-utilities" (OuterVolumeSpecName: "utilities") pod "637e03fa-e13f-4582-b79c-0ba685307be8" (UID: "637e03fa-e13f-4582-b79c-0ba685307be8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.913199 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/637e03fa-e13f-4582-b79c-0ba685307be8-kube-api-access-zt6d9" (OuterVolumeSpecName: "kube-api-access-zt6d9") pod "637e03fa-e13f-4582-b79c-0ba685307be8" (UID: "637e03fa-e13f-4582-b79c-0ba685307be8"). InnerVolumeSpecName "kube-api-access-zt6d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.926714 5129 scope.go:117] "RemoveContainer" containerID="cdb3c4e6eea4200a34fcb3632e3986c0cc01b401b8efcbc54da285a11afb7beb" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.942090 5129 scope.go:117] "RemoveContainer" containerID="6722fe2a34c69d568a703dc236b739e517995fb2b15ee919b8607d4b4daad792" Mar 14 08:15:00 crc kubenswrapper[5129]: E0314 08:15:00.942510 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6722fe2a34c69d568a703dc236b739e517995fb2b15ee919b8607d4b4daad792\": container with ID starting with 6722fe2a34c69d568a703dc236b739e517995fb2b15ee919b8607d4b4daad792 not found: ID does not exist" containerID="6722fe2a34c69d568a703dc236b739e517995fb2b15ee919b8607d4b4daad792" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.942542 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6722fe2a34c69d568a703dc236b739e517995fb2b15ee919b8607d4b4daad792"} err="failed to get container status \"6722fe2a34c69d568a703dc236b739e517995fb2b15ee919b8607d4b4daad792\": rpc error: code = NotFound desc = could not find container \"6722fe2a34c69d568a703dc236b739e517995fb2b15ee919b8607d4b4daad792\": container with ID starting with 6722fe2a34c69d568a703dc236b739e517995fb2b15ee919b8607d4b4daad792 not found: ID does not exist" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.942563 
5129 scope.go:117] "RemoveContainer" containerID="972e32e74d459f69cf8e3ea2f1ded308d5d8fb3c6950563c9fb35db0b9e724c9" Mar 14 08:15:00 crc kubenswrapper[5129]: E0314 08:15:00.942962 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"972e32e74d459f69cf8e3ea2f1ded308d5d8fb3c6950563c9fb35db0b9e724c9\": container with ID starting with 972e32e74d459f69cf8e3ea2f1ded308d5d8fb3c6950563c9fb35db0b9e724c9 not found: ID does not exist" containerID="972e32e74d459f69cf8e3ea2f1ded308d5d8fb3c6950563c9fb35db0b9e724c9" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.942988 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"972e32e74d459f69cf8e3ea2f1ded308d5d8fb3c6950563c9fb35db0b9e724c9"} err="failed to get container status \"972e32e74d459f69cf8e3ea2f1ded308d5d8fb3c6950563c9fb35db0b9e724c9\": rpc error: code = NotFound desc = could not find container \"972e32e74d459f69cf8e3ea2f1ded308d5d8fb3c6950563c9fb35db0b9e724c9\": container with ID starting with 972e32e74d459f69cf8e3ea2f1ded308d5d8fb3c6950563c9fb35db0b9e724c9 not found: ID does not exist" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.943003 5129 scope.go:117] "RemoveContainer" containerID="cdb3c4e6eea4200a34fcb3632e3986c0cc01b401b8efcbc54da285a11afb7beb" Mar 14 08:15:00 crc kubenswrapper[5129]: E0314 08:15:00.943253 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdb3c4e6eea4200a34fcb3632e3986c0cc01b401b8efcbc54da285a11afb7beb\": container with ID starting with cdb3c4e6eea4200a34fcb3632e3986c0cc01b401b8efcbc54da285a11afb7beb not found: ID does not exist" containerID="cdb3c4e6eea4200a34fcb3632e3986c0cc01b401b8efcbc54da285a11afb7beb" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.943282 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cdb3c4e6eea4200a34fcb3632e3986c0cc01b401b8efcbc54da285a11afb7beb"} err="failed to get container status \"cdb3c4e6eea4200a34fcb3632e3986c0cc01b401b8efcbc54da285a11afb7beb\": rpc error: code = NotFound desc = could not find container \"cdb3c4e6eea4200a34fcb3632e3986c0cc01b401b8efcbc54da285a11afb7beb\": container with ID starting with cdb3c4e6eea4200a34fcb3632e3986c0cc01b401b8efcbc54da285a11afb7beb not found: ID does not exist" Mar 14 08:15:00 crc kubenswrapper[5129]: I0314 08:15:00.961788 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/637e03fa-e13f-4582-b79c-0ba685307be8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "637e03fa-e13f-4582-b79c-0ba685307be8" (UID: "637e03fa-e13f-4582-b79c-0ba685307be8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:15:01 crc kubenswrapper[5129]: I0314 08:15:01.009857 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637e03fa-e13f-4582-b79c-0ba685307be8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:15:01 crc kubenswrapper[5129]: I0314 08:15:01.010010 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637e03fa-e13f-4582-b79c-0ba685307be8-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:15:01 crc kubenswrapper[5129]: I0314 08:15:01.010099 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt6d9\" (UniqueName: \"kubernetes.io/projected/637e03fa-e13f-4582-b79c-0ba685307be8-kube-api-access-zt6d9\") on node \"crc\" DevicePath \"\"" Mar 14 08:15:01 crc kubenswrapper[5129]: I0314 08:15:01.236868 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7qkhr"] Mar 14 08:15:01 crc kubenswrapper[5129]: I0314 08:15:01.244098 5129 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/certified-operators-7qkhr"] Mar 14 08:15:01 crc kubenswrapper[5129]: I0314 08:15:01.893947 5129 generic.go:334] "Generic (PLEG): container finished" podID="42a9b5e2-d646-4c22-930e-a5ac08cf3e56" containerID="51836c45836211810ac84637e54672b86910a37bb85e1ab59e243670db7c6c65" exitCode=0 Mar 14 08:15:01 crc kubenswrapper[5129]: I0314 08:15:01.894003 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5" event={"ID":"42a9b5e2-d646-4c22-930e-a5ac08cf3e56","Type":"ContainerDied","Data":"51836c45836211810ac84637e54672b86910a37bb85e1ab59e243670db7c6c65"} Mar 14 08:15:01 crc kubenswrapper[5129]: I0314 08:15:01.894030 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5" event={"ID":"42a9b5e2-d646-4c22-930e-a5ac08cf3e56","Type":"ContainerStarted","Data":"2b877f1ad13d43feba95b9210418985475fc0597d27dd7fcea928518bf6fde07"} Mar 14 08:15:02 crc kubenswrapper[5129]: I0314 08:15:02.044139 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="637e03fa-e13f-4582-b79c-0ba685307be8" path="/var/lib/kubelet/pods/637e03fa-e13f-4582-b79c-0ba685307be8/volumes" Mar 14 08:15:03 crc kubenswrapper[5129]: I0314 08:15:03.171959 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5" Mar 14 08:15:03 crc kubenswrapper[5129]: I0314 08:15:03.345734 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdwvn\" (UniqueName: \"kubernetes.io/projected/42a9b5e2-d646-4c22-930e-a5ac08cf3e56-kube-api-access-xdwvn\") pod \"42a9b5e2-d646-4c22-930e-a5ac08cf3e56\" (UID: \"42a9b5e2-d646-4c22-930e-a5ac08cf3e56\") " Mar 14 08:15:03 crc kubenswrapper[5129]: I0314 08:15:03.345817 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42a9b5e2-d646-4c22-930e-a5ac08cf3e56-config-volume\") pod \"42a9b5e2-d646-4c22-930e-a5ac08cf3e56\" (UID: \"42a9b5e2-d646-4c22-930e-a5ac08cf3e56\") " Mar 14 08:15:03 crc kubenswrapper[5129]: I0314 08:15:03.345846 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42a9b5e2-d646-4c22-930e-a5ac08cf3e56-secret-volume\") pod \"42a9b5e2-d646-4c22-930e-a5ac08cf3e56\" (UID: \"42a9b5e2-d646-4c22-930e-a5ac08cf3e56\") " Mar 14 08:15:03 crc kubenswrapper[5129]: I0314 08:15:03.346576 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42a9b5e2-d646-4c22-930e-a5ac08cf3e56-config-volume" (OuterVolumeSpecName: "config-volume") pod "42a9b5e2-d646-4c22-930e-a5ac08cf3e56" (UID: "42a9b5e2-d646-4c22-930e-a5ac08cf3e56"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:15:03 crc kubenswrapper[5129]: I0314 08:15:03.353790 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42a9b5e2-d646-4c22-930e-a5ac08cf3e56-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "42a9b5e2-d646-4c22-930e-a5ac08cf3e56" (UID: "42a9b5e2-d646-4c22-930e-a5ac08cf3e56"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:15:03 crc kubenswrapper[5129]: I0314 08:15:03.358927 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42a9b5e2-d646-4c22-930e-a5ac08cf3e56-kube-api-access-xdwvn" (OuterVolumeSpecName: "kube-api-access-xdwvn") pod "42a9b5e2-d646-4c22-930e-a5ac08cf3e56" (UID: "42a9b5e2-d646-4c22-930e-a5ac08cf3e56"). InnerVolumeSpecName "kube-api-access-xdwvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:15:03 crc kubenswrapper[5129]: I0314 08:15:03.447313 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdwvn\" (UniqueName: \"kubernetes.io/projected/42a9b5e2-d646-4c22-930e-a5ac08cf3e56-kube-api-access-xdwvn\") on node \"crc\" DevicePath \"\"" Mar 14 08:15:03 crc kubenswrapper[5129]: I0314 08:15:03.447361 5129 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42a9b5e2-d646-4c22-930e-a5ac08cf3e56-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:15:03 crc kubenswrapper[5129]: I0314 08:15:03.447373 5129 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42a9b5e2-d646-4c22-930e-a5ac08cf3e56-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:15:03 crc kubenswrapper[5129]: I0314 08:15:03.910383 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5" event={"ID":"42a9b5e2-d646-4c22-930e-a5ac08cf3e56","Type":"ContainerDied","Data":"2b877f1ad13d43feba95b9210418985475fc0597d27dd7fcea928518bf6fde07"} Mar 14 08:15:03 crc kubenswrapper[5129]: I0314 08:15:03.910762 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b877f1ad13d43feba95b9210418985475fc0597d27dd7fcea928518bf6fde07" Mar 14 08:15:03 crc kubenswrapper[5129]: I0314 08:15:03.910456 5129 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5" Mar 14 08:15:04 crc kubenswrapper[5129]: I0314 08:15:04.255423 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd"] Mar 14 08:15:04 crc kubenswrapper[5129]: I0314 08:15:04.264080 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557890-glgsd"] Mar 14 08:15:06 crc kubenswrapper[5129]: I0314 08:15:06.046375 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4e86f46-497a-42a5-b15f-fbb484545a18" path="/var/lib/kubelet/pods/f4e86f46-497a-42a5-b15f-fbb484545a18/volumes" Mar 14 08:15:10 crc kubenswrapper[5129]: I0314 08:15:10.036584 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63" Mar 14 08:15:10 crc kubenswrapper[5129]: E0314 08:15:10.037325 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.342416 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8sxz5"] Mar 14 08:15:18 crc kubenswrapper[5129]: E0314 08:15:18.343464 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637e03fa-e13f-4582-b79c-0ba685307be8" containerName="extract-utilities" Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.343478 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="637e03fa-e13f-4582-b79c-0ba685307be8" containerName="extract-utilities" Mar 14 
08:15:18 crc kubenswrapper[5129]: E0314 08:15:18.343497 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637e03fa-e13f-4582-b79c-0ba685307be8" containerName="registry-server" Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.343502 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="637e03fa-e13f-4582-b79c-0ba685307be8" containerName="registry-server" Mar 14 08:15:18 crc kubenswrapper[5129]: E0314 08:15:18.343512 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637e03fa-e13f-4582-b79c-0ba685307be8" containerName="extract-content" Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.343518 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="637e03fa-e13f-4582-b79c-0ba685307be8" containerName="extract-content" Mar 14 08:15:18 crc kubenswrapper[5129]: E0314 08:15:18.343538 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42a9b5e2-d646-4c22-930e-a5ac08cf3e56" containerName="collect-profiles" Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.343544 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="42a9b5e2-d646-4c22-930e-a5ac08cf3e56" containerName="collect-profiles" Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.343693 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="42a9b5e2-d646-4c22-930e-a5ac08cf3e56" containerName="collect-profiles" Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.343704 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="637e03fa-e13f-4582-b79c-0ba685307be8" containerName="registry-server" Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.344703 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8sxz5" Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.356546 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8sxz5"] Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.381797 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnwhp\" (UniqueName: \"kubernetes.io/projected/34c32c77-342e-4e8d-b7af-062078f80a2a-kube-api-access-bnwhp\") pod \"redhat-operators-8sxz5\" (UID: \"34c32c77-342e-4e8d-b7af-062078f80a2a\") " pod="openshift-marketplace/redhat-operators-8sxz5" Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.382175 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c32c77-342e-4e8d-b7af-062078f80a2a-utilities\") pod \"redhat-operators-8sxz5\" (UID: \"34c32c77-342e-4e8d-b7af-062078f80a2a\") " pod="openshift-marketplace/redhat-operators-8sxz5" Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.382430 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c32c77-342e-4e8d-b7af-062078f80a2a-catalog-content\") pod \"redhat-operators-8sxz5\" (UID: \"34c32c77-342e-4e8d-b7af-062078f80a2a\") " pod="openshift-marketplace/redhat-operators-8sxz5" Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.483333 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c32c77-342e-4e8d-b7af-062078f80a2a-catalog-content\") pod \"redhat-operators-8sxz5\" (UID: \"34c32c77-342e-4e8d-b7af-062078f80a2a\") " pod="openshift-marketplace/redhat-operators-8sxz5" Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.483849 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c32c77-342e-4e8d-b7af-062078f80a2a-catalog-content\") pod \"redhat-operators-8sxz5\" (UID: \"34c32c77-342e-4e8d-b7af-062078f80a2a\") " pod="openshift-marketplace/redhat-operators-8sxz5" Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.484069 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnwhp\" (UniqueName: \"kubernetes.io/projected/34c32c77-342e-4e8d-b7af-062078f80a2a-kube-api-access-bnwhp\") pod \"redhat-operators-8sxz5\" (UID: \"34c32c77-342e-4e8d-b7af-062078f80a2a\") " pod="openshift-marketplace/redhat-operators-8sxz5" Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.484266 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c32c77-342e-4e8d-b7af-062078f80a2a-utilities\") pod \"redhat-operators-8sxz5\" (UID: \"34c32c77-342e-4e8d-b7af-062078f80a2a\") " pod="openshift-marketplace/redhat-operators-8sxz5" Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.484562 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c32c77-342e-4e8d-b7af-062078f80a2a-utilities\") pod \"redhat-operators-8sxz5\" (UID: \"34c32c77-342e-4e8d-b7af-062078f80a2a\") " pod="openshift-marketplace/redhat-operators-8sxz5" Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.511975 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnwhp\" (UniqueName: \"kubernetes.io/projected/34c32c77-342e-4e8d-b7af-062078f80a2a-kube-api-access-bnwhp\") pod \"redhat-operators-8sxz5\" (UID: \"34c32c77-342e-4e8d-b7af-062078f80a2a\") " pod="openshift-marketplace/redhat-operators-8sxz5" Mar 14 08:15:18 crc kubenswrapper[5129]: I0314 08:15:18.662125 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8sxz5" Mar 14 08:15:19 crc kubenswrapper[5129]: I0314 08:15:19.176653 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8sxz5"] Mar 14 08:15:20 crc kubenswrapper[5129]: I0314 08:15:20.026093 5129 generic.go:334] "Generic (PLEG): container finished" podID="34c32c77-342e-4e8d-b7af-062078f80a2a" containerID="06c718588f276a7a2a1707d611997e2acbf89ef159ef6e59d7bbbfdae0a6cf67" exitCode=0 Mar 14 08:15:20 crc kubenswrapper[5129]: I0314 08:15:20.026135 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sxz5" event={"ID":"34c32c77-342e-4e8d-b7af-062078f80a2a","Type":"ContainerDied","Data":"06c718588f276a7a2a1707d611997e2acbf89ef159ef6e59d7bbbfdae0a6cf67"} Mar 14 08:15:20 crc kubenswrapper[5129]: I0314 08:15:20.026397 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sxz5" event={"ID":"34c32c77-342e-4e8d-b7af-062078f80a2a","Type":"ContainerStarted","Data":"03366fa5af6c84f4e87a7914d6c8b55c503d637b29db09af1e0a14391b41c2e1"} Mar 14 08:15:22 crc kubenswrapper[5129]: I0314 08:15:22.036019 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63" Mar 14 08:15:22 crc kubenswrapper[5129]: E0314 08:15:22.036589 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:15:22 crc kubenswrapper[5129]: I0314 08:15:22.046836 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sxz5" 
event={"ID":"34c32c77-342e-4e8d-b7af-062078f80a2a","Type":"ContainerStarted","Data":"90c3c73014e4fd2094ab80615002d76deb7fcdf453d6c05f63293678ddcf9772"} Mar 14 08:15:23 crc kubenswrapper[5129]: I0314 08:15:23.055357 5129 generic.go:334] "Generic (PLEG): container finished" podID="34c32c77-342e-4e8d-b7af-062078f80a2a" containerID="90c3c73014e4fd2094ab80615002d76deb7fcdf453d6c05f63293678ddcf9772" exitCode=0 Mar 14 08:15:23 crc kubenswrapper[5129]: I0314 08:15:23.055694 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sxz5" event={"ID":"34c32c77-342e-4e8d-b7af-062078f80a2a","Type":"ContainerDied","Data":"90c3c73014e4fd2094ab80615002d76deb7fcdf453d6c05f63293678ddcf9772"} Mar 14 08:15:24 crc kubenswrapper[5129]: I0314 08:15:24.064894 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sxz5" event={"ID":"34c32c77-342e-4e8d-b7af-062078f80a2a","Type":"ContainerStarted","Data":"d502cac4e35bc5a004c45c0a28f955590ce29e95ab97c88a0f3f7a24392ce41a"} Mar 14 08:15:24 crc kubenswrapper[5129]: I0314 08:15:24.088230 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8sxz5" podStartSLOduration=2.511527682 podStartE2EDuration="6.088210585s" podCreationTimestamp="2026-03-14 08:15:18 +0000 UTC" firstStartedPulling="2026-03-14 08:15:20.027536841 +0000 UTC m=+4582.779452025" lastFinishedPulling="2026-03-14 08:15:23.604219744 +0000 UTC m=+4586.356134928" observedRunningTime="2026-03-14 08:15:24.082490919 +0000 UTC m=+4586.834406123" watchObservedRunningTime="2026-03-14 08:15:24.088210585 +0000 UTC m=+4586.840125769" Mar 14 08:15:24 crc kubenswrapper[5129]: I0314 08:15:24.483176 5129 scope.go:117] "RemoveContainer" containerID="5ffbb904f7121a9609eb49eed71b135d7cfec0a840e955f26cde1e2667cd8115" Mar 14 08:15:28 crc kubenswrapper[5129]: I0314 08:15:28.663046 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-8sxz5" Mar 14 08:15:28 crc kubenswrapper[5129]: I0314 08:15:28.663865 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8sxz5" Mar 14 08:15:29 crc kubenswrapper[5129]: I0314 08:15:29.739826 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8sxz5" podUID="34c32c77-342e-4e8d-b7af-062078f80a2a" containerName="registry-server" probeResult="failure" output=< Mar 14 08:15:29 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 08:15:29 crc kubenswrapper[5129]: > Mar 14 08:15:37 crc kubenswrapper[5129]: I0314 08:15:37.036828 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63" Mar 14 08:15:37 crc kubenswrapper[5129]: E0314 08:15:37.038058 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:15:38 crc kubenswrapper[5129]: I0314 08:15:38.716286 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8sxz5" Mar 14 08:15:38 crc kubenswrapper[5129]: I0314 08:15:38.768470 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8sxz5" Mar 14 08:15:38 crc kubenswrapper[5129]: I0314 08:15:38.953630 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8sxz5"] Mar 14 08:15:40 crc kubenswrapper[5129]: I0314 08:15:40.181131 5129 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-operators-8sxz5" podUID="34c32c77-342e-4e8d-b7af-062078f80a2a" containerName="registry-server" containerID="cri-o://d502cac4e35bc5a004c45c0a28f955590ce29e95ab97c88a0f3f7a24392ce41a" gracePeriod=2 Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.053537 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8sxz5" Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.124517 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c32c77-342e-4e8d-b7af-062078f80a2a-utilities\") pod \"34c32c77-342e-4e8d-b7af-062078f80a2a\" (UID: \"34c32c77-342e-4e8d-b7af-062078f80a2a\") " Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.124593 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c32c77-342e-4e8d-b7af-062078f80a2a-catalog-content\") pod \"34c32c77-342e-4e8d-b7af-062078f80a2a\" (UID: \"34c32c77-342e-4e8d-b7af-062078f80a2a\") " Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.124655 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnwhp\" (UniqueName: \"kubernetes.io/projected/34c32c77-342e-4e8d-b7af-062078f80a2a-kube-api-access-bnwhp\") pod \"34c32c77-342e-4e8d-b7af-062078f80a2a\" (UID: \"34c32c77-342e-4e8d-b7af-062078f80a2a\") " Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.125865 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34c32c77-342e-4e8d-b7af-062078f80a2a-utilities" (OuterVolumeSpecName: "utilities") pod "34c32c77-342e-4e8d-b7af-062078f80a2a" (UID: "34c32c77-342e-4e8d-b7af-062078f80a2a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.143536 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34c32c77-342e-4e8d-b7af-062078f80a2a-kube-api-access-bnwhp" (OuterVolumeSpecName: "kube-api-access-bnwhp") pod "34c32c77-342e-4e8d-b7af-062078f80a2a" (UID: "34c32c77-342e-4e8d-b7af-062078f80a2a"). InnerVolumeSpecName "kube-api-access-bnwhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.193375 5129 generic.go:334] "Generic (PLEG): container finished" podID="34c32c77-342e-4e8d-b7af-062078f80a2a" containerID="d502cac4e35bc5a004c45c0a28f955590ce29e95ab97c88a0f3f7a24392ce41a" exitCode=0 Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.193427 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sxz5" event={"ID":"34c32c77-342e-4e8d-b7af-062078f80a2a","Type":"ContainerDied","Data":"d502cac4e35bc5a004c45c0a28f955590ce29e95ab97c88a0f3f7a24392ce41a"} Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.193459 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sxz5" event={"ID":"34c32c77-342e-4e8d-b7af-062078f80a2a","Type":"ContainerDied","Data":"03366fa5af6c84f4e87a7914d6c8b55c503d637b29db09af1e0a14391b41c2e1"} Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.193480 5129 scope.go:117] "RemoveContainer" containerID="d502cac4e35bc5a004c45c0a28f955590ce29e95ab97c88a0f3f7a24392ce41a" Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.193478 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8sxz5" Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.220730 5129 scope.go:117] "RemoveContainer" containerID="90c3c73014e4fd2094ab80615002d76deb7fcdf453d6c05f63293678ddcf9772" Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.226189 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c32c77-342e-4e8d-b7af-062078f80a2a-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.226246 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnwhp\" (UniqueName: \"kubernetes.io/projected/34c32c77-342e-4e8d-b7af-062078f80a2a-kube-api-access-bnwhp\") on node \"crc\" DevicePath \"\"" Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.253897 5129 scope.go:117] "RemoveContainer" containerID="06c718588f276a7a2a1707d611997e2acbf89ef159ef6e59d7bbbfdae0a6cf67" Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.272452 5129 scope.go:117] "RemoveContainer" containerID="d502cac4e35bc5a004c45c0a28f955590ce29e95ab97c88a0f3f7a24392ce41a" Mar 14 08:15:41 crc kubenswrapper[5129]: E0314 08:15:41.273080 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d502cac4e35bc5a004c45c0a28f955590ce29e95ab97c88a0f3f7a24392ce41a\": container with ID starting with d502cac4e35bc5a004c45c0a28f955590ce29e95ab97c88a0f3f7a24392ce41a not found: ID does not exist" containerID="d502cac4e35bc5a004c45c0a28f955590ce29e95ab97c88a0f3f7a24392ce41a" Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.273116 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d502cac4e35bc5a004c45c0a28f955590ce29e95ab97c88a0f3f7a24392ce41a"} err="failed to get container status \"d502cac4e35bc5a004c45c0a28f955590ce29e95ab97c88a0f3f7a24392ce41a\": rpc error: code = NotFound desc = could not 
find container \"d502cac4e35bc5a004c45c0a28f955590ce29e95ab97c88a0f3f7a24392ce41a\": container with ID starting with d502cac4e35bc5a004c45c0a28f955590ce29e95ab97c88a0f3f7a24392ce41a not found: ID does not exist" Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.273139 5129 scope.go:117] "RemoveContainer" containerID="90c3c73014e4fd2094ab80615002d76deb7fcdf453d6c05f63293678ddcf9772" Mar 14 08:15:41 crc kubenswrapper[5129]: E0314 08:15:41.273483 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90c3c73014e4fd2094ab80615002d76deb7fcdf453d6c05f63293678ddcf9772\": container with ID starting with 90c3c73014e4fd2094ab80615002d76deb7fcdf453d6c05f63293678ddcf9772 not found: ID does not exist" containerID="90c3c73014e4fd2094ab80615002d76deb7fcdf453d6c05f63293678ddcf9772" Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.273513 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90c3c73014e4fd2094ab80615002d76deb7fcdf453d6c05f63293678ddcf9772"} err="failed to get container status \"90c3c73014e4fd2094ab80615002d76deb7fcdf453d6c05f63293678ddcf9772\": rpc error: code = NotFound desc = could not find container \"90c3c73014e4fd2094ab80615002d76deb7fcdf453d6c05f63293678ddcf9772\": container with ID starting with 90c3c73014e4fd2094ab80615002d76deb7fcdf453d6c05f63293678ddcf9772 not found: ID does not exist" Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.273531 5129 scope.go:117] "RemoveContainer" containerID="06c718588f276a7a2a1707d611997e2acbf89ef159ef6e59d7bbbfdae0a6cf67" Mar 14 08:15:41 crc kubenswrapper[5129]: E0314 08:15:41.273866 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06c718588f276a7a2a1707d611997e2acbf89ef159ef6e59d7bbbfdae0a6cf67\": container with ID starting with 06c718588f276a7a2a1707d611997e2acbf89ef159ef6e59d7bbbfdae0a6cf67 not found: ID 
does not exist" containerID="06c718588f276a7a2a1707d611997e2acbf89ef159ef6e59d7bbbfdae0a6cf67" Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.273891 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06c718588f276a7a2a1707d611997e2acbf89ef159ef6e59d7bbbfdae0a6cf67"} err="failed to get container status \"06c718588f276a7a2a1707d611997e2acbf89ef159ef6e59d7bbbfdae0a6cf67\": rpc error: code = NotFound desc = could not find container \"06c718588f276a7a2a1707d611997e2acbf89ef159ef6e59d7bbbfdae0a6cf67\": container with ID starting with 06c718588f276a7a2a1707d611997e2acbf89ef159ef6e59d7bbbfdae0a6cf67 not found: ID does not exist" Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.293945 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34c32c77-342e-4e8d-b7af-062078f80a2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34c32c77-342e-4e8d-b7af-062078f80a2a" (UID: "34c32c77-342e-4e8d-b7af-062078f80a2a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.328021 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c32c77-342e-4e8d-b7af-062078f80a2a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.537058 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8sxz5"] Mar 14 08:15:41 crc kubenswrapper[5129]: I0314 08:15:41.542839 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8sxz5"] Mar 14 08:15:42 crc kubenswrapper[5129]: I0314 08:15:42.047660 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34c32c77-342e-4e8d-b7af-062078f80a2a" path="/var/lib/kubelet/pods/34c32c77-342e-4e8d-b7af-062078f80a2a/volumes" Mar 14 08:15:50 crc kubenswrapper[5129]: I0314 08:15:50.036371 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63" Mar 14 08:15:50 crc kubenswrapper[5129]: E0314 08:15:50.036954 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:16:00 crc kubenswrapper[5129]: I0314 08:16:00.167516 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557936-8v6gs"] Mar 14 08:16:00 crc kubenswrapper[5129]: E0314 08:16:00.168792 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34c32c77-342e-4e8d-b7af-062078f80a2a" containerName="registry-server" Mar 14 08:16:00 crc 
kubenswrapper[5129]: I0314 08:16:00.168815 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c32c77-342e-4e8d-b7af-062078f80a2a" containerName="registry-server" Mar 14 08:16:00 crc kubenswrapper[5129]: E0314 08:16:00.168865 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34c32c77-342e-4e8d-b7af-062078f80a2a" containerName="extract-content" Mar 14 08:16:00 crc kubenswrapper[5129]: I0314 08:16:00.168883 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c32c77-342e-4e8d-b7af-062078f80a2a" containerName="extract-content" Mar 14 08:16:00 crc kubenswrapper[5129]: E0314 08:16:00.168905 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34c32c77-342e-4e8d-b7af-062078f80a2a" containerName="extract-utilities" Mar 14 08:16:00 crc kubenswrapper[5129]: I0314 08:16:00.168919 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c32c77-342e-4e8d-b7af-062078f80a2a" containerName="extract-utilities" Mar 14 08:16:00 crc kubenswrapper[5129]: I0314 08:16:00.169190 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="34c32c77-342e-4e8d-b7af-062078f80a2a" containerName="registry-server" Mar 14 08:16:00 crc kubenswrapper[5129]: I0314 08:16:00.170112 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557936-8v6gs" Mar 14 08:16:00 crc kubenswrapper[5129]: I0314 08:16:00.173391 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:16:00 crc kubenswrapper[5129]: I0314 08:16:00.173570 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:16:00 crc kubenswrapper[5129]: I0314 08:16:00.178800 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557936-8v6gs"] Mar 14 08:16:00 crc kubenswrapper[5129]: I0314 08:16:00.179721 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:16:00 crc kubenswrapper[5129]: I0314 08:16:00.226925 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghnph\" (UniqueName: \"kubernetes.io/projected/d02c25a8-0c07-4449-9d35-b499b9fca8ca-kube-api-access-ghnph\") pod \"auto-csr-approver-29557936-8v6gs\" (UID: \"d02c25a8-0c07-4449-9d35-b499b9fca8ca\") " pod="openshift-infra/auto-csr-approver-29557936-8v6gs" Mar 14 08:16:00 crc kubenswrapper[5129]: I0314 08:16:00.328854 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghnph\" (UniqueName: \"kubernetes.io/projected/d02c25a8-0c07-4449-9d35-b499b9fca8ca-kube-api-access-ghnph\") pod \"auto-csr-approver-29557936-8v6gs\" (UID: \"d02c25a8-0c07-4449-9d35-b499b9fca8ca\") " pod="openshift-infra/auto-csr-approver-29557936-8v6gs" Mar 14 08:16:00 crc kubenswrapper[5129]: I0314 08:16:00.352258 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghnph\" (UniqueName: \"kubernetes.io/projected/d02c25a8-0c07-4449-9d35-b499b9fca8ca-kube-api-access-ghnph\") pod \"auto-csr-approver-29557936-8v6gs\" (UID: \"d02c25a8-0c07-4449-9d35-b499b9fca8ca\") " 
pod="openshift-infra/auto-csr-approver-29557936-8v6gs" Mar 14 08:16:00 crc kubenswrapper[5129]: I0314 08:16:00.527047 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557936-8v6gs" Mar 14 08:16:00 crc kubenswrapper[5129]: I0314 08:16:00.954345 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557936-8v6gs"] Mar 14 08:16:01 crc kubenswrapper[5129]: I0314 08:16:01.367597 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557936-8v6gs" event={"ID":"d02c25a8-0c07-4449-9d35-b499b9fca8ca","Type":"ContainerStarted","Data":"6904143e0cb73d25dd8d4184194d95c65d4290e3e5185d1899802ec128ff6fdb"} Mar 14 08:16:02 crc kubenswrapper[5129]: I0314 08:16:02.379395 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557936-8v6gs" event={"ID":"d02c25a8-0c07-4449-9d35-b499b9fca8ca","Type":"ContainerStarted","Data":"52944355abdfa3c1628465b943798c965a6ad6a84f8cff51b02b8cc1e5fb2428"} Mar 14 08:16:03 crc kubenswrapper[5129]: I0314 08:16:03.036670 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63" Mar 14 08:16:03 crc kubenswrapper[5129]: E0314 08:16:03.037488 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:16:03 crc kubenswrapper[5129]: I0314 08:16:03.397396 5129 generic.go:334] "Generic (PLEG): container finished" podID="d02c25a8-0c07-4449-9d35-b499b9fca8ca" containerID="52944355abdfa3c1628465b943798c965a6ad6a84f8cff51b02b8cc1e5fb2428" 
exitCode=0 Mar 14 08:16:03 crc kubenswrapper[5129]: I0314 08:16:03.397437 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557936-8v6gs" event={"ID":"d02c25a8-0c07-4449-9d35-b499b9fca8ca","Type":"ContainerDied","Data":"52944355abdfa3c1628465b943798c965a6ad6a84f8cff51b02b8cc1e5fb2428"} Mar 14 08:16:04 crc kubenswrapper[5129]: I0314 08:16:04.711212 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557936-8v6gs" Mar 14 08:16:04 crc kubenswrapper[5129]: I0314 08:16:04.818824 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghnph\" (UniqueName: \"kubernetes.io/projected/d02c25a8-0c07-4449-9d35-b499b9fca8ca-kube-api-access-ghnph\") pod \"d02c25a8-0c07-4449-9d35-b499b9fca8ca\" (UID: \"d02c25a8-0c07-4449-9d35-b499b9fca8ca\") " Mar 14 08:16:04 crc kubenswrapper[5129]: I0314 08:16:04.834896 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02c25a8-0c07-4449-9d35-b499b9fca8ca-kube-api-access-ghnph" (OuterVolumeSpecName: "kube-api-access-ghnph") pod "d02c25a8-0c07-4449-9d35-b499b9fca8ca" (UID: "d02c25a8-0c07-4449-9d35-b499b9fca8ca"). InnerVolumeSpecName "kube-api-access-ghnph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:16:04 crc kubenswrapper[5129]: I0314 08:16:04.920780 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghnph\" (UniqueName: \"kubernetes.io/projected/d02c25a8-0c07-4449-9d35-b499b9fca8ca-kube-api-access-ghnph\") on node \"crc\" DevicePath \"\"" Mar 14 08:16:05 crc kubenswrapper[5129]: I0314 08:16:05.414561 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557936-8v6gs" event={"ID":"d02c25a8-0c07-4449-9d35-b499b9fca8ca","Type":"ContainerDied","Data":"6904143e0cb73d25dd8d4184194d95c65d4290e3e5185d1899802ec128ff6fdb"} Mar 14 08:16:05 crc kubenswrapper[5129]: I0314 08:16:05.414656 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6904143e0cb73d25dd8d4184194d95c65d4290e3e5185d1899802ec128ff6fdb" Mar 14 08:16:05 crc kubenswrapper[5129]: I0314 08:16:05.414779 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557936-8v6gs" Mar 14 08:16:05 crc kubenswrapper[5129]: I0314 08:16:05.772556 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557930-dfq6p"] Mar 14 08:16:05 crc kubenswrapper[5129]: I0314 08:16:05.776930 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557930-dfq6p"] Mar 14 08:16:06 crc kubenswrapper[5129]: I0314 08:16:06.044863 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7998feb4-2ba1-44c3-a5dd-abed98f4a674" path="/var/lib/kubelet/pods/7998feb4-2ba1-44c3-a5dd-abed98f4a674/volumes" Mar 14 08:16:16 crc kubenswrapper[5129]: I0314 08:16:16.036856 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63" Mar 14 08:16:16 crc kubenswrapper[5129]: E0314 08:16:16.037879 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:16:24 crc kubenswrapper[5129]: I0314 08:16:24.544934 5129 scope.go:117] "RemoveContainer" containerID="99bbfa14a218ede89403fdb13f0f64bf8ac37b5aaf53bef23f76214d51ca4901" Mar 14 08:16:29 crc kubenswrapper[5129]: I0314 08:16:29.036371 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63" Mar 14 08:16:29 crc kubenswrapper[5129]: E0314 08:16:29.037233 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:16:32 crc kubenswrapper[5129]: I0314 08:16:32.827716 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7tt7v"] Mar 14 08:16:32 crc kubenswrapper[5129]: E0314 08:16:32.828356 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02c25a8-0c07-4449-9d35-b499b9fca8ca" containerName="oc" Mar 14 08:16:32 crc kubenswrapper[5129]: I0314 08:16:32.828372 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02c25a8-0c07-4449-9d35-b499b9fca8ca" containerName="oc" Mar 14 08:16:32 crc kubenswrapper[5129]: I0314 08:16:32.828561 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d02c25a8-0c07-4449-9d35-b499b9fca8ca" containerName="oc" Mar 14 08:16:32 crc kubenswrapper[5129]: I0314 
08:16:32.829857 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7tt7v" Mar 14 08:16:32 crc kubenswrapper[5129]: I0314 08:16:32.847462 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7tt7v"] Mar 14 08:16:32 crc kubenswrapper[5129]: I0314 08:16:32.944368 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11814b4d-9519-43d3-b7b3-bab72b0b493d-catalog-content\") pod \"community-operators-7tt7v\" (UID: \"11814b4d-9519-43d3-b7b3-bab72b0b493d\") " pod="openshift-marketplace/community-operators-7tt7v" Mar 14 08:16:32 crc kubenswrapper[5129]: I0314 08:16:32.945026 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml874\" (UniqueName: \"kubernetes.io/projected/11814b4d-9519-43d3-b7b3-bab72b0b493d-kube-api-access-ml874\") pod \"community-operators-7tt7v\" (UID: \"11814b4d-9519-43d3-b7b3-bab72b0b493d\") " pod="openshift-marketplace/community-operators-7tt7v" Mar 14 08:16:32 crc kubenswrapper[5129]: I0314 08:16:32.945209 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11814b4d-9519-43d3-b7b3-bab72b0b493d-utilities\") pod \"community-operators-7tt7v\" (UID: \"11814b4d-9519-43d3-b7b3-bab72b0b493d\") " pod="openshift-marketplace/community-operators-7tt7v" Mar 14 08:16:33 crc kubenswrapper[5129]: I0314 08:16:33.046654 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11814b4d-9519-43d3-b7b3-bab72b0b493d-utilities\") pod \"community-operators-7tt7v\" (UID: \"11814b4d-9519-43d3-b7b3-bab72b0b493d\") " pod="openshift-marketplace/community-operators-7tt7v" Mar 14 08:16:33 crc kubenswrapper[5129]: I0314 
08:16:33.046760 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11814b4d-9519-43d3-b7b3-bab72b0b493d-catalog-content\") pod \"community-operators-7tt7v\" (UID: \"11814b4d-9519-43d3-b7b3-bab72b0b493d\") " pod="openshift-marketplace/community-operators-7tt7v" Mar 14 08:16:33 crc kubenswrapper[5129]: I0314 08:16:33.046823 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml874\" (UniqueName: \"kubernetes.io/projected/11814b4d-9519-43d3-b7b3-bab72b0b493d-kube-api-access-ml874\") pod \"community-operators-7tt7v\" (UID: \"11814b4d-9519-43d3-b7b3-bab72b0b493d\") " pod="openshift-marketplace/community-operators-7tt7v" Mar 14 08:16:33 crc kubenswrapper[5129]: I0314 08:16:33.047651 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11814b4d-9519-43d3-b7b3-bab72b0b493d-utilities\") pod \"community-operators-7tt7v\" (UID: \"11814b4d-9519-43d3-b7b3-bab72b0b493d\") " pod="openshift-marketplace/community-operators-7tt7v" Mar 14 08:16:33 crc kubenswrapper[5129]: I0314 08:16:33.048039 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11814b4d-9519-43d3-b7b3-bab72b0b493d-catalog-content\") pod \"community-operators-7tt7v\" (UID: \"11814b4d-9519-43d3-b7b3-bab72b0b493d\") " pod="openshift-marketplace/community-operators-7tt7v" Mar 14 08:16:33 crc kubenswrapper[5129]: I0314 08:16:33.080502 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml874\" (UniqueName: \"kubernetes.io/projected/11814b4d-9519-43d3-b7b3-bab72b0b493d-kube-api-access-ml874\") pod \"community-operators-7tt7v\" (UID: \"11814b4d-9519-43d3-b7b3-bab72b0b493d\") " pod="openshift-marketplace/community-operators-7tt7v" Mar 14 08:16:33 crc kubenswrapper[5129]: I0314 08:16:33.148126 5129 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7tt7v" Mar 14 08:16:33 crc kubenswrapper[5129]: I0314 08:16:33.712329 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7tt7v"] Mar 14 08:16:34 crc kubenswrapper[5129]: I0314 08:16:34.698466 5129 generic.go:334] "Generic (PLEG): container finished" podID="11814b4d-9519-43d3-b7b3-bab72b0b493d" containerID="2cc2d4896e97f29a82128af8cc104c719645c71017b8b43536ae0dc7771867ea" exitCode=0 Mar 14 08:16:34 crc kubenswrapper[5129]: I0314 08:16:34.698640 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tt7v" event={"ID":"11814b4d-9519-43d3-b7b3-bab72b0b493d","Type":"ContainerDied","Data":"2cc2d4896e97f29a82128af8cc104c719645c71017b8b43536ae0dc7771867ea"} Mar 14 08:16:34 crc kubenswrapper[5129]: I0314 08:16:34.698912 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tt7v" event={"ID":"11814b4d-9519-43d3-b7b3-bab72b0b493d","Type":"ContainerStarted","Data":"b82e4f94144dd58e29d90fea86ad82a32463788a0360e52d7526c551d36a587c"} Mar 14 08:16:36 crc kubenswrapper[5129]: I0314 08:16:36.721560 5129 generic.go:334] "Generic (PLEG): container finished" podID="11814b4d-9519-43d3-b7b3-bab72b0b493d" containerID="55c5f987997a0145424cdc5d1cdfab17d8475fbf99298c4e5e135b45e3980dd1" exitCode=0 Mar 14 08:16:36 crc kubenswrapper[5129]: I0314 08:16:36.721849 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tt7v" event={"ID":"11814b4d-9519-43d3-b7b3-bab72b0b493d","Type":"ContainerDied","Data":"55c5f987997a0145424cdc5d1cdfab17d8475fbf99298c4e5e135b45e3980dd1"} Mar 14 08:16:38 crc kubenswrapper[5129]: I0314 08:16:38.749515 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tt7v" 
event={"ID":"11814b4d-9519-43d3-b7b3-bab72b0b493d","Type":"ContainerStarted","Data":"66e1fd0c549bd92b235cfcbf1cf85d6ddaf0cc38bf0e2247114d1b14480cbc2a"} Mar 14 08:16:38 crc kubenswrapper[5129]: I0314 08:16:38.789003 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7tt7v" podStartSLOduration=3.996351554 podStartE2EDuration="6.788966503s" podCreationTimestamp="2026-03-14 08:16:32 +0000 UTC" firstStartedPulling="2026-03-14 08:16:34.700431103 +0000 UTC m=+4657.452346277" lastFinishedPulling="2026-03-14 08:16:37.493046042 +0000 UTC m=+4660.244961226" observedRunningTime="2026-03-14 08:16:38.774425448 +0000 UTC m=+4661.526340652" watchObservedRunningTime="2026-03-14 08:16:38.788966503 +0000 UTC m=+4661.540881727" Mar 14 08:16:41 crc kubenswrapper[5129]: I0314 08:16:41.036412 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63" Mar 14 08:16:41 crc kubenswrapper[5129]: E0314 08:16:41.036986 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:16:43 crc kubenswrapper[5129]: I0314 08:16:43.149417 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7tt7v" Mar 14 08:16:43 crc kubenswrapper[5129]: I0314 08:16:43.149554 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7tt7v" Mar 14 08:16:43 crc kubenswrapper[5129]: I0314 08:16:43.231673 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-7tt7v" Mar 14 08:16:43 crc kubenswrapper[5129]: I0314 08:16:43.865254 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7tt7v" Mar 14 08:16:47 crc kubenswrapper[5129]: I0314 08:16:47.026113 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7tt7v"] Mar 14 08:16:47 crc kubenswrapper[5129]: I0314 08:16:47.026681 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7tt7v" podUID="11814b4d-9519-43d3-b7b3-bab72b0b493d" containerName="registry-server" containerID="cri-o://66e1fd0c549bd92b235cfcbf1cf85d6ddaf0cc38bf0e2247114d1b14480cbc2a" gracePeriod=2 Mar 14 08:16:47 crc kubenswrapper[5129]: I0314 08:16:47.840014 5129 generic.go:334] "Generic (PLEG): container finished" podID="11814b4d-9519-43d3-b7b3-bab72b0b493d" containerID="66e1fd0c549bd92b235cfcbf1cf85d6ddaf0cc38bf0e2247114d1b14480cbc2a" exitCode=0 Mar 14 08:16:47 crc kubenswrapper[5129]: I0314 08:16:47.840107 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tt7v" event={"ID":"11814b4d-9519-43d3-b7b3-bab72b0b493d","Type":"ContainerDied","Data":"66e1fd0c549bd92b235cfcbf1cf85d6ddaf0cc38bf0e2247114d1b14480cbc2a"} Mar 14 08:16:47 crc kubenswrapper[5129]: I0314 08:16:47.977872 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7tt7v" Mar 14 08:16:48 crc kubenswrapper[5129]: I0314 08:16:48.057212 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11814b4d-9519-43d3-b7b3-bab72b0b493d-utilities\") pod \"11814b4d-9519-43d3-b7b3-bab72b0b493d\" (UID: \"11814b4d-9519-43d3-b7b3-bab72b0b493d\") " Mar 14 08:16:48 crc kubenswrapper[5129]: I0314 08:16:48.057305 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml874\" (UniqueName: \"kubernetes.io/projected/11814b4d-9519-43d3-b7b3-bab72b0b493d-kube-api-access-ml874\") pod \"11814b4d-9519-43d3-b7b3-bab72b0b493d\" (UID: \"11814b4d-9519-43d3-b7b3-bab72b0b493d\") " Mar 14 08:16:48 crc kubenswrapper[5129]: I0314 08:16:48.057535 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11814b4d-9519-43d3-b7b3-bab72b0b493d-catalog-content\") pod \"11814b4d-9519-43d3-b7b3-bab72b0b493d\" (UID: \"11814b4d-9519-43d3-b7b3-bab72b0b493d\") " Mar 14 08:16:48 crc kubenswrapper[5129]: I0314 08:16:48.058663 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11814b4d-9519-43d3-b7b3-bab72b0b493d-utilities" (OuterVolumeSpecName: "utilities") pod "11814b4d-9519-43d3-b7b3-bab72b0b493d" (UID: "11814b4d-9519-43d3-b7b3-bab72b0b493d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:16:48 crc kubenswrapper[5129]: I0314 08:16:48.067677 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11814b4d-9519-43d3-b7b3-bab72b0b493d-kube-api-access-ml874" (OuterVolumeSpecName: "kube-api-access-ml874") pod "11814b4d-9519-43d3-b7b3-bab72b0b493d" (UID: "11814b4d-9519-43d3-b7b3-bab72b0b493d"). InnerVolumeSpecName "kube-api-access-ml874". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:16:48 crc kubenswrapper[5129]: I0314 08:16:48.114969 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11814b4d-9519-43d3-b7b3-bab72b0b493d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11814b4d-9519-43d3-b7b3-bab72b0b493d" (UID: "11814b4d-9519-43d3-b7b3-bab72b0b493d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:16:48 crc kubenswrapper[5129]: I0314 08:16:48.159929 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11814b4d-9519-43d3-b7b3-bab72b0b493d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:16:48 crc kubenswrapper[5129]: I0314 08:16:48.159958 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11814b4d-9519-43d3-b7b3-bab72b0b493d-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:16:48 crc kubenswrapper[5129]: I0314 08:16:48.159991 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml874\" (UniqueName: \"kubernetes.io/projected/11814b4d-9519-43d3-b7b3-bab72b0b493d-kube-api-access-ml874\") on node \"crc\" DevicePath \"\"" Mar 14 08:16:48 crc kubenswrapper[5129]: I0314 08:16:48.854228 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tt7v" event={"ID":"11814b4d-9519-43d3-b7b3-bab72b0b493d","Type":"ContainerDied","Data":"b82e4f94144dd58e29d90fea86ad82a32463788a0360e52d7526c551d36a587c"} Mar 14 08:16:48 crc kubenswrapper[5129]: I0314 08:16:48.854299 5129 scope.go:117] "RemoveContainer" containerID="66e1fd0c549bd92b235cfcbf1cf85d6ddaf0cc38bf0e2247114d1b14480cbc2a" Mar 14 08:16:48 crc kubenswrapper[5129]: I0314 08:16:48.854451 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7tt7v" Mar 14 08:16:48 crc kubenswrapper[5129]: I0314 08:16:48.888164 5129 scope.go:117] "RemoveContainer" containerID="55c5f987997a0145424cdc5d1cdfab17d8475fbf99298c4e5e135b45e3980dd1" Mar 14 08:16:48 crc kubenswrapper[5129]: I0314 08:16:48.892135 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7tt7v"] Mar 14 08:16:48 crc kubenswrapper[5129]: I0314 08:16:48.899619 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7tt7v"] Mar 14 08:16:48 crc kubenswrapper[5129]: I0314 08:16:48.923130 5129 scope.go:117] "RemoveContainer" containerID="2cc2d4896e97f29a82128af8cc104c719645c71017b8b43536ae0dc7771867ea" Mar 14 08:16:50 crc kubenswrapper[5129]: I0314 08:16:50.052778 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11814b4d-9519-43d3-b7b3-bab72b0b493d" path="/var/lib/kubelet/pods/11814b4d-9519-43d3-b7b3-bab72b0b493d/volumes" Mar 14 08:16:52 crc kubenswrapper[5129]: I0314 08:16:52.037200 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63" Mar 14 08:16:52 crc kubenswrapper[5129]: I0314 08:16:52.892483 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"fb6315427df7c54c22a5bb85b7ca28bb56f0255d6951cd739f06219485b1406e"} Mar 14 08:18:00 crc kubenswrapper[5129]: I0314 08:18:00.140398 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557938-x5jbv"] Mar 14 08:18:00 crc kubenswrapper[5129]: E0314 08:18:00.141542 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11814b4d-9519-43d3-b7b3-bab72b0b493d" containerName="extract-content" Mar 14 08:18:00 crc kubenswrapper[5129]: I0314 08:18:00.141563 5129 
state_mem.go:107] "Deleted CPUSet assignment" podUID="11814b4d-9519-43d3-b7b3-bab72b0b493d" containerName="extract-content" Mar 14 08:18:00 crc kubenswrapper[5129]: E0314 08:18:00.141680 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11814b4d-9519-43d3-b7b3-bab72b0b493d" containerName="extract-utilities" Mar 14 08:18:00 crc kubenswrapper[5129]: I0314 08:18:00.141694 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="11814b4d-9519-43d3-b7b3-bab72b0b493d" containerName="extract-utilities" Mar 14 08:18:00 crc kubenswrapper[5129]: E0314 08:18:00.141735 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11814b4d-9519-43d3-b7b3-bab72b0b493d" containerName="registry-server" Mar 14 08:18:00 crc kubenswrapper[5129]: I0314 08:18:00.141748 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="11814b4d-9519-43d3-b7b3-bab72b0b493d" containerName="registry-server" Mar 14 08:18:00 crc kubenswrapper[5129]: I0314 08:18:00.141997 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="11814b4d-9519-43d3-b7b3-bab72b0b493d" containerName="registry-server" Mar 14 08:18:00 crc kubenswrapper[5129]: I0314 08:18:00.142887 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557938-x5jbv" Mar 14 08:18:00 crc kubenswrapper[5129]: I0314 08:18:00.144842 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:18:00 crc kubenswrapper[5129]: I0314 08:18:00.145798 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:18:00 crc kubenswrapper[5129]: I0314 08:18:00.146212 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:18:00 crc kubenswrapper[5129]: I0314 08:18:00.146326 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557938-x5jbv"] Mar 14 08:18:00 crc kubenswrapper[5129]: I0314 08:18:00.210546 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkj7z\" (UniqueName: \"kubernetes.io/projected/8314a0aa-d526-459c-b758-8b9944238261-kube-api-access-jkj7z\") pod \"auto-csr-approver-29557938-x5jbv\" (UID: \"8314a0aa-d526-459c-b758-8b9944238261\") " pod="openshift-infra/auto-csr-approver-29557938-x5jbv" Mar 14 08:18:00 crc kubenswrapper[5129]: I0314 08:18:00.311817 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkj7z\" (UniqueName: \"kubernetes.io/projected/8314a0aa-d526-459c-b758-8b9944238261-kube-api-access-jkj7z\") pod \"auto-csr-approver-29557938-x5jbv\" (UID: \"8314a0aa-d526-459c-b758-8b9944238261\") " pod="openshift-infra/auto-csr-approver-29557938-x5jbv" Mar 14 08:18:00 crc kubenswrapper[5129]: I0314 08:18:00.415840 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkj7z\" (UniqueName: \"kubernetes.io/projected/8314a0aa-d526-459c-b758-8b9944238261-kube-api-access-jkj7z\") pod \"auto-csr-approver-29557938-x5jbv\" (UID: \"8314a0aa-d526-459c-b758-8b9944238261\") " 
pod="openshift-infra/auto-csr-approver-29557938-x5jbv" Mar 14 08:18:00 crc kubenswrapper[5129]: I0314 08:18:00.468338 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557938-x5jbv" Mar 14 08:18:00 crc kubenswrapper[5129]: I0314 08:18:00.683632 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557938-x5jbv"] Mar 14 08:18:01 crc kubenswrapper[5129]: I0314 08:18:01.455756 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557938-x5jbv" event={"ID":"8314a0aa-d526-459c-b758-8b9944238261","Type":"ContainerStarted","Data":"c5e815495c271166ad601ea6e4a739f6650a5bf055a48efe4f1ef53c034eaf91"} Mar 14 08:18:02 crc kubenswrapper[5129]: I0314 08:18:02.468270 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557938-x5jbv" event={"ID":"8314a0aa-d526-459c-b758-8b9944238261","Type":"ContainerStarted","Data":"0fa574c35f72e60e5479b76c8c8589e6081cdea2c99092ef5c208b73cb372057"} Mar 14 08:18:03 crc kubenswrapper[5129]: I0314 08:18:03.480417 5129 generic.go:334] "Generic (PLEG): container finished" podID="8314a0aa-d526-459c-b758-8b9944238261" containerID="0fa574c35f72e60e5479b76c8c8589e6081cdea2c99092ef5c208b73cb372057" exitCode=0 Mar 14 08:18:03 crc kubenswrapper[5129]: I0314 08:18:03.480484 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557938-x5jbv" event={"ID":"8314a0aa-d526-459c-b758-8b9944238261","Type":"ContainerDied","Data":"0fa574c35f72e60e5479b76c8c8589e6081cdea2c99092ef5c208b73cb372057"} Mar 14 08:18:04 crc kubenswrapper[5129]: I0314 08:18:04.820139 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557938-x5jbv" Mar 14 08:18:04 crc kubenswrapper[5129]: I0314 08:18:04.982468 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkj7z\" (UniqueName: \"kubernetes.io/projected/8314a0aa-d526-459c-b758-8b9944238261-kube-api-access-jkj7z\") pod \"8314a0aa-d526-459c-b758-8b9944238261\" (UID: \"8314a0aa-d526-459c-b758-8b9944238261\") " Mar 14 08:18:04 crc kubenswrapper[5129]: I0314 08:18:04.993394 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8314a0aa-d526-459c-b758-8b9944238261-kube-api-access-jkj7z" (OuterVolumeSpecName: "kube-api-access-jkj7z") pod "8314a0aa-d526-459c-b758-8b9944238261" (UID: "8314a0aa-d526-459c-b758-8b9944238261"). InnerVolumeSpecName "kube-api-access-jkj7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:18:05 crc kubenswrapper[5129]: I0314 08:18:05.085068 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkj7z\" (UniqueName: \"kubernetes.io/projected/8314a0aa-d526-459c-b758-8b9944238261-kube-api-access-jkj7z\") on node \"crc\" DevicePath \"\"" Mar 14 08:18:05 crc kubenswrapper[5129]: I0314 08:18:05.500664 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557938-x5jbv" event={"ID":"8314a0aa-d526-459c-b758-8b9944238261","Type":"ContainerDied","Data":"c5e815495c271166ad601ea6e4a739f6650a5bf055a48efe4f1ef53c034eaf91"} Mar 14 08:18:05 crc kubenswrapper[5129]: I0314 08:18:05.501068 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5e815495c271166ad601ea6e4a739f6650a5bf055a48efe4f1ef53c034eaf91" Mar 14 08:18:05 crc kubenswrapper[5129]: I0314 08:18:05.500714 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557938-x5jbv" Mar 14 08:18:05 crc kubenswrapper[5129]: I0314 08:18:05.556050 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557932-h652f"] Mar 14 08:18:05 crc kubenswrapper[5129]: I0314 08:18:05.561299 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557932-h652f"] Mar 14 08:18:06 crc kubenswrapper[5129]: I0314 08:18:06.055668 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8613d333-dcde-40a0-84a3-fa23e80a7a74" path="/var/lib/kubelet/pods/8613d333-dcde-40a0-84a3-fa23e80a7a74/volumes" Mar 14 08:18:24 crc kubenswrapper[5129]: I0314 08:18:24.669442 5129 scope.go:117] "RemoveContainer" containerID="72285e1be024d3c03c6e1119bc316325679e223688d6d9e562da2cee74e816de" Mar 14 08:19:19 crc kubenswrapper[5129]: I0314 08:19:19.575128 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:19:19 crc kubenswrapper[5129]: I0314 08:19:19.576139 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:19:49 crc kubenswrapper[5129]: I0314 08:19:49.574232 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:19:49 crc kubenswrapper[5129]: 
I0314 08:19:49.575832 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:20:00 crc kubenswrapper[5129]: I0314 08:20:00.169742 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557940-x5s2v"] Mar 14 08:20:00 crc kubenswrapper[5129]: E0314 08:20:00.171427 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8314a0aa-d526-459c-b758-8b9944238261" containerName="oc" Mar 14 08:20:00 crc kubenswrapper[5129]: I0314 08:20:00.171464 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8314a0aa-d526-459c-b758-8b9944238261" containerName="oc" Mar 14 08:20:00 crc kubenswrapper[5129]: I0314 08:20:00.171861 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8314a0aa-d526-459c-b758-8b9944238261" containerName="oc" Mar 14 08:20:00 crc kubenswrapper[5129]: I0314 08:20:00.172596 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557940-x5s2v" Mar 14 08:20:00 crc kubenswrapper[5129]: I0314 08:20:00.177841 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:20:00 crc kubenswrapper[5129]: I0314 08:20:00.178569 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:20:00 crc kubenswrapper[5129]: I0314 08:20:00.178579 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:20:00 crc kubenswrapper[5129]: I0314 08:20:00.178840 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557940-x5s2v"] Mar 14 08:20:00 crc kubenswrapper[5129]: I0314 08:20:00.247562 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft92d\" (UniqueName: \"kubernetes.io/projected/b8897c56-ef56-453f-9651-3020a32c8fcc-kube-api-access-ft92d\") pod \"auto-csr-approver-29557940-x5s2v\" (UID: \"b8897c56-ef56-453f-9651-3020a32c8fcc\") " pod="openshift-infra/auto-csr-approver-29557940-x5s2v" Mar 14 08:20:00 crc kubenswrapper[5129]: I0314 08:20:00.348659 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft92d\" (UniqueName: \"kubernetes.io/projected/b8897c56-ef56-453f-9651-3020a32c8fcc-kube-api-access-ft92d\") pod \"auto-csr-approver-29557940-x5s2v\" (UID: \"b8897c56-ef56-453f-9651-3020a32c8fcc\") " pod="openshift-infra/auto-csr-approver-29557940-x5s2v" Mar 14 08:20:00 crc kubenswrapper[5129]: I0314 08:20:00.371749 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft92d\" (UniqueName: \"kubernetes.io/projected/b8897c56-ef56-453f-9651-3020a32c8fcc-kube-api-access-ft92d\") pod \"auto-csr-approver-29557940-x5s2v\" (UID: \"b8897c56-ef56-453f-9651-3020a32c8fcc\") " 
pod="openshift-infra/auto-csr-approver-29557940-x5s2v" Mar 14 08:20:00 crc kubenswrapper[5129]: I0314 08:20:00.500341 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557940-x5s2v" Mar 14 08:20:00 crc kubenswrapper[5129]: I0314 08:20:00.737324 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557940-x5s2v"] Mar 14 08:20:00 crc kubenswrapper[5129]: I0314 08:20:00.744338 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:20:01 crc kubenswrapper[5129]: I0314 08:20:01.658822 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557940-x5s2v" event={"ID":"b8897c56-ef56-453f-9651-3020a32c8fcc","Type":"ContainerStarted","Data":"deea6a44a80f66f245f35facde2d8987259b5faea47fc3e5c269b971402be5cd"} Mar 14 08:20:02 crc kubenswrapper[5129]: I0314 08:20:02.674756 5129 generic.go:334] "Generic (PLEG): container finished" podID="b8897c56-ef56-453f-9651-3020a32c8fcc" containerID="9b948b31e890df90b996f49a902bd61b5088ff4a7e290343ed1b2644cc2bb666" exitCode=0 Mar 14 08:20:02 crc kubenswrapper[5129]: I0314 08:20:02.675134 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557940-x5s2v" event={"ID":"b8897c56-ef56-453f-9651-3020a32c8fcc","Type":"ContainerDied","Data":"9b948b31e890df90b996f49a902bd61b5088ff4a7e290343ed1b2644cc2bb666"} Mar 14 08:20:03 crc kubenswrapper[5129]: I0314 08:20:03.970994 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557940-x5s2v" Mar 14 08:20:04 crc kubenswrapper[5129]: I0314 08:20:04.031164 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft92d\" (UniqueName: \"kubernetes.io/projected/b8897c56-ef56-453f-9651-3020a32c8fcc-kube-api-access-ft92d\") pod \"b8897c56-ef56-453f-9651-3020a32c8fcc\" (UID: \"b8897c56-ef56-453f-9651-3020a32c8fcc\") " Mar 14 08:20:04 crc kubenswrapper[5129]: I0314 08:20:04.041188 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8897c56-ef56-453f-9651-3020a32c8fcc-kube-api-access-ft92d" (OuterVolumeSpecName: "kube-api-access-ft92d") pod "b8897c56-ef56-453f-9651-3020a32c8fcc" (UID: "b8897c56-ef56-453f-9651-3020a32c8fcc"). InnerVolumeSpecName "kube-api-access-ft92d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:20:04 crc kubenswrapper[5129]: I0314 08:20:04.133614 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft92d\" (UniqueName: \"kubernetes.io/projected/b8897c56-ef56-453f-9651-3020a32c8fcc-kube-api-access-ft92d\") on node \"crc\" DevicePath \"\"" Mar 14 08:20:04 crc kubenswrapper[5129]: I0314 08:20:04.689467 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557940-x5s2v" event={"ID":"b8897c56-ef56-453f-9651-3020a32c8fcc","Type":"ContainerDied","Data":"deea6a44a80f66f245f35facde2d8987259b5faea47fc3e5c269b971402be5cd"} Mar 14 08:20:04 crc kubenswrapper[5129]: I0314 08:20:04.689512 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deea6a44a80f66f245f35facde2d8987259b5faea47fc3e5c269b971402be5cd" Mar 14 08:20:04 crc kubenswrapper[5129]: I0314 08:20:04.689577 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557940-x5s2v" Mar 14 08:20:05 crc kubenswrapper[5129]: I0314 08:20:05.065177 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557934-fxmft"] Mar 14 08:20:05 crc kubenswrapper[5129]: I0314 08:20:05.074809 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557934-fxmft"] Mar 14 08:20:06 crc kubenswrapper[5129]: I0314 08:20:06.046820 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1711beb-5567-448a-96f3-c77f77edf0d0" path="/var/lib/kubelet/pods/c1711beb-5567-448a-96f3-c77f77edf0d0/volumes" Mar 14 08:20:19 crc kubenswrapper[5129]: I0314 08:20:19.574279 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:20:19 crc kubenswrapper[5129]: I0314 08:20:19.574867 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:20:19 crc kubenswrapper[5129]: I0314 08:20:19.574912 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 08:20:19 crc kubenswrapper[5129]: I0314 08:20:19.575466 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb6315427df7c54c22a5bb85b7ca28bb56f0255d6951cd739f06219485b1406e"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:20:19 crc kubenswrapper[5129]: I0314 08:20:19.575506 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://fb6315427df7c54c22a5bb85b7ca28bb56f0255d6951cd739f06219485b1406e" gracePeriod=600 Mar 14 08:20:19 crc kubenswrapper[5129]: I0314 08:20:19.822395 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="fb6315427df7c54c22a5bb85b7ca28bb56f0255d6951cd739f06219485b1406e" exitCode=0 Mar 14 08:20:19 crc kubenswrapper[5129]: I0314 08:20:19.822551 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"fb6315427df7c54c22a5bb85b7ca28bb56f0255d6951cd739f06219485b1406e"} Mar 14 08:20:19 crc kubenswrapper[5129]: I0314 08:20:19.823110 5129 scope.go:117] "RemoveContainer" containerID="d8633f253fae5f5b3e1c301397e4ade8ee103548837288b6271a2fd9f11a9c63" Mar 14 08:20:20 crc kubenswrapper[5129]: I0314 08:20:20.831517 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd"} Mar 14 08:20:24 crc kubenswrapper[5129]: I0314 08:20:24.779872 5129 scope.go:117] "RemoveContainer" containerID="9b02accea30ff179a18cbd7b922f70b0eebd80e381d316e8540319d9e11f6147" Mar 14 08:20:39 crc kubenswrapper[5129]: I0314 08:20:39.056571 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-btndk"] Mar 14 08:20:39 crc kubenswrapper[5129]: E0314 
08:20:39.059231 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8897c56-ef56-453f-9651-3020a32c8fcc" containerName="oc" Mar 14 08:20:39 crc kubenswrapper[5129]: I0314 08:20:39.059260 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8897c56-ef56-453f-9651-3020a32c8fcc" containerName="oc" Mar 14 08:20:39 crc kubenswrapper[5129]: I0314 08:20:39.059425 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8897c56-ef56-453f-9651-3020a32c8fcc" containerName="oc" Mar 14 08:20:39 crc kubenswrapper[5129]: I0314 08:20:39.060542 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btndk" Mar 14 08:20:39 crc kubenswrapper[5129]: I0314 08:20:39.072055 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-btndk"] Mar 14 08:20:39 crc kubenswrapper[5129]: I0314 08:20:39.091206 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e81106c8-7b48-4168-b344-a003aa8af576-catalog-content\") pod \"redhat-marketplace-btndk\" (UID: \"e81106c8-7b48-4168-b344-a003aa8af576\") " pod="openshift-marketplace/redhat-marketplace-btndk" Mar 14 08:20:39 crc kubenswrapper[5129]: I0314 08:20:39.091297 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh6tq\" (UniqueName: \"kubernetes.io/projected/e81106c8-7b48-4168-b344-a003aa8af576-kube-api-access-mh6tq\") pod \"redhat-marketplace-btndk\" (UID: \"e81106c8-7b48-4168-b344-a003aa8af576\") " pod="openshift-marketplace/redhat-marketplace-btndk" Mar 14 08:20:39 crc kubenswrapper[5129]: I0314 08:20:39.091339 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e81106c8-7b48-4168-b344-a003aa8af576-utilities\") pod 
\"redhat-marketplace-btndk\" (UID: \"e81106c8-7b48-4168-b344-a003aa8af576\") " pod="openshift-marketplace/redhat-marketplace-btndk" Mar 14 08:20:39 crc kubenswrapper[5129]: I0314 08:20:39.192572 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh6tq\" (UniqueName: \"kubernetes.io/projected/e81106c8-7b48-4168-b344-a003aa8af576-kube-api-access-mh6tq\") pod \"redhat-marketplace-btndk\" (UID: \"e81106c8-7b48-4168-b344-a003aa8af576\") " pod="openshift-marketplace/redhat-marketplace-btndk" Mar 14 08:20:39 crc kubenswrapper[5129]: I0314 08:20:39.192667 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e81106c8-7b48-4168-b344-a003aa8af576-utilities\") pod \"redhat-marketplace-btndk\" (UID: \"e81106c8-7b48-4168-b344-a003aa8af576\") " pod="openshift-marketplace/redhat-marketplace-btndk" Mar 14 08:20:39 crc kubenswrapper[5129]: I0314 08:20:39.192722 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e81106c8-7b48-4168-b344-a003aa8af576-catalog-content\") pod \"redhat-marketplace-btndk\" (UID: \"e81106c8-7b48-4168-b344-a003aa8af576\") " pod="openshift-marketplace/redhat-marketplace-btndk" Mar 14 08:20:39 crc kubenswrapper[5129]: I0314 08:20:39.193254 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e81106c8-7b48-4168-b344-a003aa8af576-catalog-content\") pod \"redhat-marketplace-btndk\" (UID: \"e81106c8-7b48-4168-b344-a003aa8af576\") " pod="openshift-marketplace/redhat-marketplace-btndk" Mar 14 08:20:39 crc kubenswrapper[5129]: I0314 08:20:39.193298 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e81106c8-7b48-4168-b344-a003aa8af576-utilities\") pod \"redhat-marketplace-btndk\" (UID: 
\"e81106c8-7b48-4168-b344-a003aa8af576\") " pod="openshift-marketplace/redhat-marketplace-btndk" Mar 14 08:20:39 crc kubenswrapper[5129]: I0314 08:20:39.222122 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh6tq\" (UniqueName: \"kubernetes.io/projected/e81106c8-7b48-4168-b344-a003aa8af576-kube-api-access-mh6tq\") pod \"redhat-marketplace-btndk\" (UID: \"e81106c8-7b48-4168-b344-a003aa8af576\") " pod="openshift-marketplace/redhat-marketplace-btndk" Mar 14 08:20:39 crc kubenswrapper[5129]: I0314 08:20:39.379331 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btndk" Mar 14 08:20:39 crc kubenswrapper[5129]: I0314 08:20:39.583338 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-btndk"] Mar 14 08:20:39 crc kubenswrapper[5129]: W0314 08:20:39.819115 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode81106c8_7b48_4168_b344_a003aa8af576.slice/crio-d4a98c3a56a82d95d375a10068e1c5136eb2ae557da90651ea652a156252b4bf WatchSource:0}: Error finding container d4a98c3a56a82d95d375a10068e1c5136eb2ae557da90651ea652a156252b4bf: Status 404 returned error can't find the container with id d4a98c3a56a82d95d375a10068e1c5136eb2ae557da90651ea652a156252b4bf Mar 14 08:20:40 crc kubenswrapper[5129]: I0314 08:20:40.000697 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btndk" event={"ID":"e81106c8-7b48-4168-b344-a003aa8af576","Type":"ContainerStarted","Data":"d4a98c3a56a82d95d375a10068e1c5136eb2ae557da90651ea652a156252b4bf"} Mar 14 08:20:41 crc kubenswrapper[5129]: I0314 08:20:41.011138 5129 generic.go:334] "Generic (PLEG): container finished" podID="e81106c8-7b48-4168-b344-a003aa8af576" containerID="fcdafcb67773bf9a62f8f9b7f2e22ca1f211ccaa0e2fc9bc2d55237a161a2020" exitCode=0 Mar 14 08:20:41 crc 
kubenswrapper[5129]: I0314 08:20:41.011272 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btndk" event={"ID":"e81106c8-7b48-4168-b344-a003aa8af576","Type":"ContainerDied","Data":"fcdafcb67773bf9a62f8f9b7f2e22ca1f211ccaa0e2fc9bc2d55237a161a2020"} Mar 14 08:20:43 crc kubenswrapper[5129]: I0314 08:20:43.031133 5129 generic.go:334] "Generic (PLEG): container finished" podID="e81106c8-7b48-4168-b344-a003aa8af576" containerID="9ded3b1612ddef9159eb98461d80d5fa930998e4c223187592e4a33a66d0a001" exitCode=0 Mar 14 08:20:43 crc kubenswrapper[5129]: I0314 08:20:43.031228 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btndk" event={"ID":"e81106c8-7b48-4168-b344-a003aa8af576","Type":"ContainerDied","Data":"9ded3b1612ddef9159eb98461d80d5fa930998e4c223187592e4a33a66d0a001"} Mar 14 08:20:44 crc kubenswrapper[5129]: I0314 08:20:44.047227 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btndk" event={"ID":"e81106c8-7b48-4168-b344-a003aa8af576","Type":"ContainerStarted","Data":"6e579e94d840ea7b9f862f3a83c3a58fa3c679bce34a4f1184daa814f8e6c1e0"} Mar 14 08:20:44 crc kubenswrapper[5129]: I0314 08:20:44.056461 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-btndk" podStartSLOduration=2.48822482 podStartE2EDuration="5.056444222s" podCreationTimestamp="2026-03-14 08:20:39 +0000 UTC" firstStartedPulling="2026-03-14 08:20:41.013091371 +0000 UTC m=+4903.765006565" lastFinishedPulling="2026-03-14 08:20:43.581310783 +0000 UTC m=+4906.333225967" observedRunningTime="2026-03-14 08:20:44.055326932 +0000 UTC m=+4906.807242136" watchObservedRunningTime="2026-03-14 08:20:44.056444222 +0000 UTC m=+4906.808359406" Mar 14 08:20:49 crc kubenswrapper[5129]: I0314 08:20:49.380095 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-btndk" Mar 14 08:20:49 crc kubenswrapper[5129]: I0314 08:20:49.380804 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-btndk" Mar 14 08:20:49 crc kubenswrapper[5129]: I0314 08:20:49.434128 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-btndk" Mar 14 08:20:50 crc kubenswrapper[5129]: I0314 08:20:50.129375 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-btndk" Mar 14 08:20:50 crc kubenswrapper[5129]: I0314 08:20:50.203130 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-btndk"] Mar 14 08:20:52 crc kubenswrapper[5129]: I0314 08:20:52.111414 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-btndk" podUID="e81106c8-7b48-4168-b344-a003aa8af576" containerName="registry-server" containerID="cri-o://6e579e94d840ea7b9f862f3a83c3a58fa3c679bce34a4f1184daa814f8e6c1e0" gracePeriod=2 Mar 14 08:20:53 crc kubenswrapper[5129]: I0314 08:20:53.119166 5129 generic.go:334] "Generic (PLEG): container finished" podID="e81106c8-7b48-4168-b344-a003aa8af576" containerID="6e579e94d840ea7b9f862f3a83c3a58fa3c679bce34a4f1184daa814f8e6c1e0" exitCode=0 Mar 14 08:20:53 crc kubenswrapper[5129]: I0314 08:20:53.119202 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btndk" event={"ID":"e81106c8-7b48-4168-b344-a003aa8af576","Type":"ContainerDied","Data":"6e579e94d840ea7b9f862f3a83c3a58fa3c679bce34a4f1184daa814f8e6c1e0"} Mar 14 08:20:53 crc kubenswrapper[5129]: I0314 08:20:53.783984 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btndk" Mar 14 08:20:53 crc kubenswrapper[5129]: I0314 08:20:53.802934 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh6tq\" (UniqueName: \"kubernetes.io/projected/e81106c8-7b48-4168-b344-a003aa8af576-kube-api-access-mh6tq\") pod \"e81106c8-7b48-4168-b344-a003aa8af576\" (UID: \"e81106c8-7b48-4168-b344-a003aa8af576\") " Mar 14 08:20:53 crc kubenswrapper[5129]: I0314 08:20:53.803005 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e81106c8-7b48-4168-b344-a003aa8af576-utilities\") pod \"e81106c8-7b48-4168-b344-a003aa8af576\" (UID: \"e81106c8-7b48-4168-b344-a003aa8af576\") " Mar 14 08:20:53 crc kubenswrapper[5129]: I0314 08:20:53.803088 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e81106c8-7b48-4168-b344-a003aa8af576-catalog-content\") pod \"e81106c8-7b48-4168-b344-a003aa8af576\" (UID: \"e81106c8-7b48-4168-b344-a003aa8af576\") " Mar 14 08:20:53 crc kubenswrapper[5129]: I0314 08:20:53.805754 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e81106c8-7b48-4168-b344-a003aa8af576-utilities" (OuterVolumeSpecName: "utilities") pod "e81106c8-7b48-4168-b344-a003aa8af576" (UID: "e81106c8-7b48-4168-b344-a003aa8af576"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:20:53 crc kubenswrapper[5129]: I0314 08:20:53.813407 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81106c8-7b48-4168-b344-a003aa8af576-kube-api-access-mh6tq" (OuterVolumeSpecName: "kube-api-access-mh6tq") pod "e81106c8-7b48-4168-b344-a003aa8af576" (UID: "e81106c8-7b48-4168-b344-a003aa8af576"). InnerVolumeSpecName "kube-api-access-mh6tq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:20:53 crc kubenswrapper[5129]: I0314 08:20:53.899779 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e81106c8-7b48-4168-b344-a003aa8af576-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e81106c8-7b48-4168-b344-a003aa8af576" (UID: "e81106c8-7b48-4168-b344-a003aa8af576"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:20:53 crc kubenswrapper[5129]: I0314 08:20:53.904800 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh6tq\" (UniqueName: \"kubernetes.io/projected/e81106c8-7b48-4168-b344-a003aa8af576-kube-api-access-mh6tq\") on node \"crc\" DevicePath \"\"" Mar 14 08:20:53 crc kubenswrapper[5129]: I0314 08:20:53.904835 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e81106c8-7b48-4168-b344-a003aa8af576-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:20:53 crc kubenswrapper[5129]: I0314 08:20:53.904846 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e81106c8-7b48-4168-b344-a003aa8af576-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:20:54 crc kubenswrapper[5129]: I0314 08:20:54.142330 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btndk" event={"ID":"e81106c8-7b48-4168-b344-a003aa8af576","Type":"ContainerDied","Data":"d4a98c3a56a82d95d375a10068e1c5136eb2ae557da90651ea652a156252b4bf"} Mar 14 08:20:54 crc kubenswrapper[5129]: I0314 08:20:54.142435 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btndk" Mar 14 08:20:54 crc kubenswrapper[5129]: I0314 08:20:54.142456 5129 scope.go:117] "RemoveContainer" containerID="6e579e94d840ea7b9f862f3a83c3a58fa3c679bce34a4f1184daa814f8e6c1e0" Mar 14 08:20:54 crc kubenswrapper[5129]: I0314 08:20:54.184313 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-btndk"] Mar 14 08:20:54 crc kubenswrapper[5129]: I0314 08:20:54.185646 5129 scope.go:117] "RemoveContainer" containerID="9ded3b1612ddef9159eb98461d80d5fa930998e4c223187592e4a33a66d0a001" Mar 14 08:20:54 crc kubenswrapper[5129]: I0314 08:20:54.194171 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-btndk"] Mar 14 08:20:54 crc kubenswrapper[5129]: I0314 08:20:54.211051 5129 scope.go:117] "RemoveContainer" containerID="fcdafcb67773bf9a62f8f9b7f2e22ca1f211ccaa0e2fc9bc2d55237a161a2020" Mar 14 08:20:56 crc kubenswrapper[5129]: I0314 08:20:56.062001 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81106c8-7b48-4168-b344-a003aa8af576" path="/var/lib/kubelet/pods/e81106c8-7b48-4168-b344-a003aa8af576/volumes" Mar 14 08:22:00 crc kubenswrapper[5129]: I0314 08:22:00.165212 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557942-d6h5f"] Mar 14 08:22:00 crc kubenswrapper[5129]: E0314 08:22:00.167761 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81106c8-7b48-4168-b344-a003aa8af576" containerName="extract-utilities" Mar 14 08:22:00 crc kubenswrapper[5129]: I0314 08:22:00.167796 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81106c8-7b48-4168-b344-a003aa8af576" containerName="extract-utilities" Mar 14 08:22:00 crc kubenswrapper[5129]: E0314 08:22:00.167835 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81106c8-7b48-4168-b344-a003aa8af576" containerName="extract-content" Mar 14 
08:22:00 crc kubenswrapper[5129]: I0314 08:22:00.167848 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81106c8-7b48-4168-b344-a003aa8af576" containerName="extract-content" Mar 14 08:22:00 crc kubenswrapper[5129]: E0314 08:22:00.167898 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81106c8-7b48-4168-b344-a003aa8af576" containerName="registry-server" Mar 14 08:22:00 crc kubenswrapper[5129]: I0314 08:22:00.167914 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81106c8-7b48-4168-b344-a003aa8af576" containerName="registry-server" Mar 14 08:22:00 crc kubenswrapper[5129]: I0314 08:22:00.168165 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81106c8-7b48-4168-b344-a003aa8af576" containerName="registry-server" Mar 14 08:22:00 crc kubenswrapper[5129]: I0314 08:22:00.169360 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557942-d6h5f" Mar 14 08:22:00 crc kubenswrapper[5129]: I0314 08:22:00.172008 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:22:00 crc kubenswrapper[5129]: I0314 08:22:00.172442 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:22:00 crc kubenswrapper[5129]: I0314 08:22:00.173043 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:22:00 crc kubenswrapper[5129]: I0314 08:22:00.181855 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557942-d6h5f"] Mar 14 08:22:00 crc kubenswrapper[5129]: I0314 08:22:00.297538 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp2kq\" (UniqueName: \"kubernetes.io/projected/e9be32f2-6e41-4329-857f-b81cd9dc21b6-kube-api-access-gp2kq\") pod 
\"auto-csr-approver-29557942-d6h5f\" (UID: \"e9be32f2-6e41-4329-857f-b81cd9dc21b6\") " pod="openshift-infra/auto-csr-approver-29557942-d6h5f" Mar 14 08:22:00 crc kubenswrapper[5129]: I0314 08:22:00.399802 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp2kq\" (UniqueName: \"kubernetes.io/projected/e9be32f2-6e41-4329-857f-b81cd9dc21b6-kube-api-access-gp2kq\") pod \"auto-csr-approver-29557942-d6h5f\" (UID: \"e9be32f2-6e41-4329-857f-b81cd9dc21b6\") " pod="openshift-infra/auto-csr-approver-29557942-d6h5f" Mar 14 08:22:00 crc kubenswrapper[5129]: I0314 08:22:00.430885 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp2kq\" (UniqueName: \"kubernetes.io/projected/e9be32f2-6e41-4329-857f-b81cd9dc21b6-kube-api-access-gp2kq\") pod \"auto-csr-approver-29557942-d6h5f\" (UID: \"e9be32f2-6e41-4329-857f-b81cd9dc21b6\") " pod="openshift-infra/auto-csr-approver-29557942-d6h5f" Mar 14 08:22:00 crc kubenswrapper[5129]: I0314 08:22:00.491542 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557942-d6h5f" Mar 14 08:22:00 crc kubenswrapper[5129]: I0314 08:22:00.803804 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557942-d6h5f"] Mar 14 08:22:01 crc kubenswrapper[5129]: I0314 08:22:01.403012 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557942-d6h5f" event={"ID":"e9be32f2-6e41-4329-857f-b81cd9dc21b6","Type":"ContainerStarted","Data":"892e89f8f7427c2952be8ece6f4914d3731fef7be4a69e785481fdaf28f99aac"} Mar 14 08:22:04 crc kubenswrapper[5129]: I0314 08:22:04.445101 5129 generic.go:334] "Generic (PLEG): container finished" podID="e9be32f2-6e41-4329-857f-b81cd9dc21b6" containerID="0ed3e9e127d72ce74cd155cbdc6a251638f472f64e6c74e27a731b032287e502" exitCode=0 Mar 14 08:22:04 crc kubenswrapper[5129]: I0314 08:22:04.445152 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557942-d6h5f" event={"ID":"e9be32f2-6e41-4329-857f-b81cd9dc21b6","Type":"ContainerDied","Data":"0ed3e9e127d72ce74cd155cbdc6a251638f472f64e6c74e27a731b032287e502"} Mar 14 08:22:05 crc kubenswrapper[5129]: I0314 08:22:05.791871 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557942-d6h5f" Mar 14 08:22:05 crc kubenswrapper[5129]: I0314 08:22:05.905849 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp2kq\" (UniqueName: \"kubernetes.io/projected/e9be32f2-6e41-4329-857f-b81cd9dc21b6-kube-api-access-gp2kq\") pod \"e9be32f2-6e41-4329-857f-b81cd9dc21b6\" (UID: \"e9be32f2-6e41-4329-857f-b81cd9dc21b6\") " Mar 14 08:22:05 crc kubenswrapper[5129]: I0314 08:22:05.916561 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9be32f2-6e41-4329-857f-b81cd9dc21b6-kube-api-access-gp2kq" (OuterVolumeSpecName: "kube-api-access-gp2kq") pod "e9be32f2-6e41-4329-857f-b81cd9dc21b6" (UID: "e9be32f2-6e41-4329-857f-b81cd9dc21b6"). InnerVolumeSpecName "kube-api-access-gp2kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:22:06 crc kubenswrapper[5129]: I0314 08:22:06.008705 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp2kq\" (UniqueName: \"kubernetes.io/projected/e9be32f2-6e41-4329-857f-b81cd9dc21b6-kube-api-access-gp2kq\") on node \"crc\" DevicePath \"\"" Mar 14 08:22:06 crc kubenswrapper[5129]: I0314 08:22:06.465088 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557942-d6h5f" event={"ID":"e9be32f2-6e41-4329-857f-b81cd9dc21b6","Type":"ContainerDied","Data":"892e89f8f7427c2952be8ece6f4914d3731fef7be4a69e785481fdaf28f99aac"} Mar 14 08:22:06 crc kubenswrapper[5129]: I0314 08:22:06.465457 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="892e89f8f7427c2952be8ece6f4914d3731fef7be4a69e785481fdaf28f99aac" Mar 14 08:22:06 crc kubenswrapper[5129]: I0314 08:22:06.465153 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557942-d6h5f" Mar 14 08:22:06 crc kubenswrapper[5129]: I0314 08:22:06.870865 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557936-8v6gs"] Mar 14 08:22:06 crc kubenswrapper[5129]: I0314 08:22:06.876048 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557936-8v6gs"] Mar 14 08:22:08 crc kubenswrapper[5129]: I0314 08:22:08.051780 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d02c25a8-0c07-4449-9d35-b499b9fca8ca" path="/var/lib/kubelet/pods/d02c25a8-0c07-4449-9d35-b499b9fca8ca/volumes" Mar 14 08:22:19 crc kubenswrapper[5129]: I0314 08:22:19.574897 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:22:19 crc kubenswrapper[5129]: I0314 08:22:19.575921 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:22:24 crc kubenswrapper[5129]: I0314 08:22:24.897809 5129 scope.go:117] "RemoveContainer" containerID="52944355abdfa3c1628465b943798c965a6ad6a84f8cff51b02b8cc1e5fb2428" Mar 14 08:22:49 crc kubenswrapper[5129]: I0314 08:22:49.574405 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:22:49 crc kubenswrapper[5129]: 
I0314 08:22:49.575919 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:23:19 crc kubenswrapper[5129]: I0314 08:23:19.574903 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:23:19 crc kubenswrapper[5129]: I0314 08:23:19.575520 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:23:19 crc kubenswrapper[5129]: I0314 08:23:19.575585 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 08:23:19 crc kubenswrapper[5129]: I0314 08:23:19.576453 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:23:19 crc kubenswrapper[5129]: I0314 08:23:19.576548 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" 
containerName="machine-config-daemon" containerID="cri-o://cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd" gracePeriod=600 Mar 14 08:23:19 crc kubenswrapper[5129]: E0314 08:23:19.708752 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:23:20 crc kubenswrapper[5129]: I0314 08:23:20.107430 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd" exitCode=0 Mar 14 08:23:20 crc kubenswrapper[5129]: I0314 08:23:20.107484 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd"} Mar 14 08:23:20 crc kubenswrapper[5129]: I0314 08:23:20.107522 5129 scope.go:117] "RemoveContainer" containerID="fb6315427df7c54c22a5bb85b7ca28bb56f0255d6951cd739f06219485b1406e" Mar 14 08:23:20 crc kubenswrapper[5129]: I0314 08:23:20.109098 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd" Mar 14 08:23:20 crc kubenswrapper[5129]: E0314 08:23:20.109735 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:23:33 crc kubenswrapper[5129]: I0314 08:23:33.036582 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd" Mar 14 08:23:33 crc kubenswrapper[5129]: E0314 08:23:33.037627 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:23:46 crc kubenswrapper[5129]: I0314 08:23:46.036904 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd" Mar 14 08:23:46 crc kubenswrapper[5129]: E0314 08:23:46.037795 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:23:59 crc kubenswrapper[5129]: I0314 08:23:59.036355 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd" Mar 14 08:23:59 crc kubenswrapper[5129]: E0314 08:23:59.037155 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:24:00 crc kubenswrapper[5129]: I0314 08:24:00.161714 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557944-s7bv2"] Mar 14 08:24:00 crc kubenswrapper[5129]: E0314 08:24:00.163083 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9be32f2-6e41-4329-857f-b81cd9dc21b6" containerName="oc" Mar 14 08:24:00 crc kubenswrapper[5129]: I0314 08:24:00.163213 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9be32f2-6e41-4329-857f-b81cd9dc21b6" containerName="oc" Mar 14 08:24:00 crc kubenswrapper[5129]: I0314 08:24:00.163519 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9be32f2-6e41-4329-857f-b81cd9dc21b6" containerName="oc" Mar 14 08:24:00 crc kubenswrapper[5129]: I0314 08:24:00.164353 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557944-s7bv2" Mar 14 08:24:00 crc kubenswrapper[5129]: I0314 08:24:00.168100 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:24:00 crc kubenswrapper[5129]: I0314 08:24:00.168491 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:24:00 crc kubenswrapper[5129]: I0314 08:24:00.171973 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:24:00 crc kubenswrapper[5129]: I0314 08:24:00.173669 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557944-s7bv2"] Mar 14 08:24:00 crc kubenswrapper[5129]: I0314 08:24:00.282776 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkm82\" (UniqueName: \"kubernetes.io/projected/e9187a75-8afa-425b-8159-2012e38ae798-kube-api-access-kkm82\") pod \"auto-csr-approver-29557944-s7bv2\" (UID: \"e9187a75-8afa-425b-8159-2012e38ae798\") " pod="openshift-infra/auto-csr-approver-29557944-s7bv2" Mar 14 08:24:00 crc kubenswrapper[5129]: I0314 08:24:00.385537 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkm82\" (UniqueName: \"kubernetes.io/projected/e9187a75-8afa-425b-8159-2012e38ae798-kube-api-access-kkm82\") pod \"auto-csr-approver-29557944-s7bv2\" (UID: \"e9187a75-8afa-425b-8159-2012e38ae798\") " pod="openshift-infra/auto-csr-approver-29557944-s7bv2" Mar 14 08:24:00 crc kubenswrapper[5129]: I0314 08:24:00.414927 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkm82\" (UniqueName: \"kubernetes.io/projected/e9187a75-8afa-425b-8159-2012e38ae798-kube-api-access-kkm82\") pod \"auto-csr-approver-29557944-s7bv2\" (UID: \"e9187a75-8afa-425b-8159-2012e38ae798\") " 
pod="openshift-infra/auto-csr-approver-29557944-s7bv2" Mar 14 08:24:00 crc kubenswrapper[5129]: I0314 08:24:00.498595 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557944-s7bv2" Mar 14 08:24:00 crc kubenswrapper[5129]: I0314 08:24:00.791294 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557944-s7bv2"] Mar 14 08:24:01 crc kubenswrapper[5129]: I0314 08:24:01.522180 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557944-s7bv2" event={"ID":"e9187a75-8afa-425b-8159-2012e38ae798","Type":"ContainerStarted","Data":"03d03f99e1159b1446b3b08d048e28d7c7942ec677ed8a570eb7e23705283b21"} Mar 14 08:24:02 crc kubenswrapper[5129]: I0314 08:24:02.537010 5129 generic.go:334] "Generic (PLEG): container finished" podID="e9187a75-8afa-425b-8159-2012e38ae798" containerID="5ebcff27e48fabdf868b3802e7af686dd0862cb00b842fc5e734d8acf84f8490" exitCode=0 Mar 14 08:24:02 crc kubenswrapper[5129]: I0314 08:24:02.537156 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557944-s7bv2" event={"ID":"e9187a75-8afa-425b-8159-2012e38ae798","Type":"ContainerDied","Data":"5ebcff27e48fabdf868b3802e7af686dd0862cb00b842fc5e734d8acf84f8490"} Mar 14 08:24:03 crc kubenswrapper[5129]: I0314 08:24:03.866981 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557944-s7bv2" Mar 14 08:24:03 crc kubenswrapper[5129]: I0314 08:24:03.941043 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkm82\" (UniqueName: \"kubernetes.io/projected/e9187a75-8afa-425b-8159-2012e38ae798-kube-api-access-kkm82\") pod \"e9187a75-8afa-425b-8159-2012e38ae798\" (UID: \"e9187a75-8afa-425b-8159-2012e38ae798\") " Mar 14 08:24:03 crc kubenswrapper[5129]: I0314 08:24:03.951221 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9187a75-8afa-425b-8159-2012e38ae798-kube-api-access-kkm82" (OuterVolumeSpecName: "kube-api-access-kkm82") pod "e9187a75-8afa-425b-8159-2012e38ae798" (UID: "e9187a75-8afa-425b-8159-2012e38ae798"). InnerVolumeSpecName "kube-api-access-kkm82". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:24:04 crc kubenswrapper[5129]: I0314 08:24:04.042314 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkm82\" (UniqueName: \"kubernetes.io/projected/e9187a75-8afa-425b-8159-2012e38ae798-kube-api-access-kkm82\") on node \"crc\" DevicePath \"\"" Mar 14 08:24:04 crc kubenswrapper[5129]: I0314 08:24:04.559100 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557944-s7bv2" event={"ID":"e9187a75-8afa-425b-8159-2012e38ae798","Type":"ContainerDied","Data":"03d03f99e1159b1446b3b08d048e28d7c7942ec677ed8a570eb7e23705283b21"} Mar 14 08:24:04 crc kubenswrapper[5129]: I0314 08:24:04.559409 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03d03f99e1159b1446b3b08d048e28d7c7942ec677ed8a570eb7e23705283b21" Mar 14 08:24:04 crc kubenswrapper[5129]: I0314 08:24:04.559169 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557944-s7bv2" Mar 14 08:24:04 crc kubenswrapper[5129]: I0314 08:24:04.960525 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557938-x5jbv"] Mar 14 08:24:04 crc kubenswrapper[5129]: I0314 08:24:04.966848 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557938-x5jbv"] Mar 14 08:24:06 crc kubenswrapper[5129]: I0314 08:24:06.045547 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8314a0aa-d526-459c-b758-8b9944238261" path="/var/lib/kubelet/pods/8314a0aa-d526-459c-b758-8b9944238261/volumes" Mar 14 08:24:14 crc kubenswrapper[5129]: I0314 08:24:14.037351 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd" Mar 14 08:24:14 crc kubenswrapper[5129]: E0314 08:24:14.038160 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:24:25 crc kubenswrapper[5129]: I0314 08:24:25.000800 5129 scope.go:117] "RemoveContainer" containerID="0fa574c35f72e60e5479b76c8c8589e6081cdea2c99092ef5c208b73cb372057" Mar 14 08:24:29 crc kubenswrapper[5129]: I0314 08:24:29.036407 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd" Mar 14 08:24:29 crc kubenswrapper[5129]: E0314 08:24:29.037040 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:24:43 crc kubenswrapper[5129]: I0314 08:24:43.037091 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd" Mar 14 08:24:43 crc kubenswrapper[5129]: E0314 08:24:43.037970 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:24:54 crc kubenswrapper[5129]: I0314 08:24:54.037282 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd" Mar 14 08:24:54 crc kubenswrapper[5129]: E0314 08:24:54.038575 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:25:07 crc kubenswrapper[5129]: I0314 08:25:07.036449 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd" Mar 14 08:25:07 crc kubenswrapper[5129]: E0314 08:25:07.037340 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:25:21 crc kubenswrapper[5129]: I0314 08:25:21.038256 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd" Mar 14 08:25:21 crc kubenswrapper[5129]: E0314 08:25:21.039831 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:25:35 crc kubenswrapper[5129]: I0314 08:25:35.036761 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd" Mar 14 08:25:35 crc kubenswrapper[5129]: E0314 08:25:35.037801 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:25:46 crc kubenswrapper[5129]: I0314 08:25:46.036263 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd" Mar 14 08:25:46 crc kubenswrapper[5129]: E0314 08:25:46.037043 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:25:59 crc kubenswrapper[5129]: I0314 08:25:59.036999 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd" Mar 14 08:25:59 crc kubenswrapper[5129]: E0314 08:25:59.038302 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:26:00 crc kubenswrapper[5129]: I0314 08:26:00.175631 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557946-6lfbj"] Mar 14 08:26:00 crc kubenswrapper[5129]: E0314 08:26:00.177433 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9187a75-8afa-425b-8159-2012e38ae798" containerName="oc" Mar 14 08:26:00 crc kubenswrapper[5129]: I0314 08:26:00.177630 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9187a75-8afa-425b-8159-2012e38ae798" containerName="oc" Mar 14 08:26:00 crc kubenswrapper[5129]: I0314 08:26:00.178066 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9187a75-8afa-425b-8159-2012e38ae798" containerName="oc" Mar 14 08:26:00 crc kubenswrapper[5129]: I0314 08:26:00.179111 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557946-6lfbj" Mar 14 08:26:00 crc kubenswrapper[5129]: I0314 08:26:00.182719 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:26:00 crc kubenswrapper[5129]: I0314 08:26:00.183586 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:26:00 crc kubenswrapper[5129]: I0314 08:26:00.186198 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:26:00 crc kubenswrapper[5129]: I0314 08:26:00.187174 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557946-6lfbj"] Mar 14 08:26:00 crc kubenswrapper[5129]: I0314 08:26:00.251274 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nzbv\" (UniqueName: \"kubernetes.io/projected/d53e29f5-e63a-42bc-8327-a51fb1ccbebe-kube-api-access-2nzbv\") pod \"auto-csr-approver-29557946-6lfbj\" (UID: \"d53e29f5-e63a-42bc-8327-a51fb1ccbebe\") " pod="openshift-infra/auto-csr-approver-29557946-6lfbj" Mar 14 08:26:00 crc kubenswrapper[5129]: I0314 08:26:00.352860 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nzbv\" (UniqueName: \"kubernetes.io/projected/d53e29f5-e63a-42bc-8327-a51fb1ccbebe-kube-api-access-2nzbv\") pod \"auto-csr-approver-29557946-6lfbj\" (UID: \"d53e29f5-e63a-42bc-8327-a51fb1ccbebe\") " pod="openshift-infra/auto-csr-approver-29557946-6lfbj" Mar 14 08:26:00 crc kubenswrapper[5129]: I0314 08:26:00.378907 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nzbv\" (UniqueName: \"kubernetes.io/projected/d53e29f5-e63a-42bc-8327-a51fb1ccbebe-kube-api-access-2nzbv\") pod \"auto-csr-approver-29557946-6lfbj\" (UID: \"d53e29f5-e63a-42bc-8327-a51fb1ccbebe\") " 
pod="openshift-infra/auto-csr-approver-29557946-6lfbj" Mar 14 08:26:00 crc kubenswrapper[5129]: I0314 08:26:00.502690 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557946-6lfbj" Mar 14 08:26:00 crc kubenswrapper[5129]: I0314 08:26:00.972391 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557946-6lfbj"] Mar 14 08:26:00 crc kubenswrapper[5129]: W0314 08:26:00.982435 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd53e29f5_e63a_42bc_8327_a51fb1ccbebe.slice/crio-01b136fba4489466e1c5610995ca03f9ad34d35b7a5c35419e7fb0032fd4779a WatchSource:0}: Error finding container 01b136fba4489466e1c5610995ca03f9ad34d35b7a5c35419e7fb0032fd4779a: Status 404 returned error can't find the container with id 01b136fba4489466e1c5610995ca03f9ad34d35b7a5c35419e7fb0032fd4779a Mar 14 08:26:00 crc kubenswrapper[5129]: I0314 08:26:00.987074 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:26:01 crc kubenswrapper[5129]: I0314 08:26:01.574706 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557946-6lfbj" event={"ID":"d53e29f5-e63a-42bc-8327-a51fb1ccbebe","Type":"ContainerStarted","Data":"01b136fba4489466e1c5610995ca03f9ad34d35b7a5c35419e7fb0032fd4779a"} Mar 14 08:26:02 crc kubenswrapper[5129]: I0314 08:26:02.588011 5129 generic.go:334] "Generic (PLEG): container finished" podID="d53e29f5-e63a-42bc-8327-a51fb1ccbebe" containerID="96641c701f38d5d8e49a8c415603601a23c2d7f23f6f9d4e976239d372a9334a" exitCode=0 Mar 14 08:26:02 crc kubenswrapper[5129]: I0314 08:26:02.588169 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557946-6lfbj" 
event={"ID":"d53e29f5-e63a-42bc-8327-a51fb1ccbebe","Type":"ContainerDied","Data":"96641c701f38d5d8e49a8c415603601a23c2d7f23f6f9d4e976239d372a9334a"} Mar 14 08:26:03 crc kubenswrapper[5129]: I0314 08:26:03.949228 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557946-6lfbj" Mar 14 08:26:04 crc kubenswrapper[5129]: I0314 08:26:04.120997 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nzbv\" (UniqueName: \"kubernetes.io/projected/d53e29f5-e63a-42bc-8327-a51fb1ccbebe-kube-api-access-2nzbv\") pod \"d53e29f5-e63a-42bc-8327-a51fb1ccbebe\" (UID: \"d53e29f5-e63a-42bc-8327-a51fb1ccbebe\") " Mar 14 08:26:04 crc kubenswrapper[5129]: I0314 08:26:04.131343 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d53e29f5-e63a-42bc-8327-a51fb1ccbebe-kube-api-access-2nzbv" (OuterVolumeSpecName: "kube-api-access-2nzbv") pod "d53e29f5-e63a-42bc-8327-a51fb1ccbebe" (UID: "d53e29f5-e63a-42bc-8327-a51fb1ccbebe"). InnerVolumeSpecName "kube-api-access-2nzbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:26:04 crc kubenswrapper[5129]: I0314 08:26:04.223494 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nzbv\" (UniqueName: \"kubernetes.io/projected/d53e29f5-e63a-42bc-8327-a51fb1ccbebe-kube-api-access-2nzbv\") on node \"crc\" DevicePath \"\"" Mar 14 08:26:04 crc kubenswrapper[5129]: I0314 08:26:04.606104 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557946-6lfbj" event={"ID":"d53e29f5-e63a-42bc-8327-a51fb1ccbebe","Type":"ContainerDied","Data":"01b136fba4489466e1c5610995ca03f9ad34d35b7a5c35419e7fb0032fd4779a"} Mar 14 08:26:04 crc kubenswrapper[5129]: I0314 08:26:04.606197 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01b136fba4489466e1c5610995ca03f9ad34d35b7a5c35419e7fb0032fd4779a" Mar 14 08:26:04 crc kubenswrapper[5129]: I0314 08:26:04.606221 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557946-6lfbj" Mar 14 08:26:05 crc kubenswrapper[5129]: I0314 08:26:05.050991 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557940-x5s2v"] Mar 14 08:26:05 crc kubenswrapper[5129]: I0314 08:26:05.055965 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557940-x5s2v"] Mar 14 08:26:06 crc kubenswrapper[5129]: I0314 08:26:06.051662 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8897c56-ef56-453f-9651-3020a32c8fcc" path="/var/lib/kubelet/pods/b8897c56-ef56-453f-9651-3020a32c8fcc/volumes" Mar 14 08:26:14 crc kubenswrapper[5129]: I0314 08:26:14.036907 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd" Mar 14 08:26:14 crc kubenswrapper[5129]: E0314 08:26:14.037876 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:26:25 crc kubenswrapper[5129]: I0314 08:26:25.037320 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd" Mar 14 08:26:25 crc kubenswrapper[5129]: E0314 08:26:25.038352 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:26:25 crc kubenswrapper[5129]: I0314 08:26:25.090931 5129 scope.go:117] "RemoveContainer" containerID="9b948b31e890df90b996f49a902bd61b5088ff4a7e290343ed1b2644cc2bb666" Mar 14 08:26:33 crc kubenswrapper[5129]: I0314 08:26:33.675679 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sh4sp"] Mar 14 08:26:33 crc kubenswrapper[5129]: E0314 08:26:33.680770 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d53e29f5-e63a-42bc-8327-a51fb1ccbebe" containerName="oc" Mar 14 08:26:33 crc kubenswrapper[5129]: I0314 08:26:33.680982 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d53e29f5-e63a-42bc-8327-a51fb1ccbebe" containerName="oc" Mar 14 08:26:33 crc kubenswrapper[5129]: I0314 08:26:33.685532 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d53e29f5-e63a-42bc-8327-a51fb1ccbebe" containerName="oc" Mar 14 08:26:33 crc kubenswrapper[5129]: I0314 
08:26:33.687239 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sh4sp" Mar 14 08:26:33 crc kubenswrapper[5129]: I0314 08:26:33.694214 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sh4sp"] Mar 14 08:26:33 crc kubenswrapper[5129]: I0314 08:26:33.836367 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/445e68a7-1eae-460b-aebf-3ead76975785-catalog-content\") pod \"community-operators-sh4sp\" (UID: \"445e68a7-1eae-460b-aebf-3ead76975785\") " pod="openshift-marketplace/community-operators-sh4sp" Mar 14 08:26:33 crc kubenswrapper[5129]: I0314 08:26:33.836434 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/445e68a7-1eae-460b-aebf-3ead76975785-utilities\") pod \"community-operators-sh4sp\" (UID: \"445e68a7-1eae-460b-aebf-3ead76975785\") " pod="openshift-marketplace/community-operators-sh4sp" Mar 14 08:26:33 crc kubenswrapper[5129]: I0314 08:26:33.836489 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmf8q\" (UniqueName: \"kubernetes.io/projected/445e68a7-1eae-460b-aebf-3ead76975785-kube-api-access-wmf8q\") pod \"community-operators-sh4sp\" (UID: \"445e68a7-1eae-460b-aebf-3ead76975785\") " pod="openshift-marketplace/community-operators-sh4sp" Mar 14 08:26:33 crc kubenswrapper[5129]: I0314 08:26:33.938037 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/445e68a7-1eae-460b-aebf-3ead76975785-catalog-content\") pod \"community-operators-sh4sp\" (UID: \"445e68a7-1eae-460b-aebf-3ead76975785\") " pod="openshift-marketplace/community-operators-sh4sp" Mar 14 08:26:33 crc 
kubenswrapper[5129]: I0314 08:26:33.938133 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/445e68a7-1eae-460b-aebf-3ead76975785-utilities\") pod \"community-operators-sh4sp\" (UID: \"445e68a7-1eae-460b-aebf-3ead76975785\") " pod="openshift-marketplace/community-operators-sh4sp"
Mar 14 08:26:33 crc kubenswrapper[5129]: I0314 08:26:33.938206 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmf8q\" (UniqueName: \"kubernetes.io/projected/445e68a7-1eae-460b-aebf-3ead76975785-kube-api-access-wmf8q\") pod \"community-operators-sh4sp\" (UID: \"445e68a7-1eae-460b-aebf-3ead76975785\") " pod="openshift-marketplace/community-operators-sh4sp"
Mar 14 08:26:33 crc kubenswrapper[5129]: I0314 08:26:33.938694 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/445e68a7-1eae-460b-aebf-3ead76975785-catalog-content\") pod \"community-operators-sh4sp\" (UID: \"445e68a7-1eae-460b-aebf-3ead76975785\") " pod="openshift-marketplace/community-operators-sh4sp"
Mar 14 08:26:33 crc kubenswrapper[5129]: I0314 08:26:33.938808 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/445e68a7-1eae-460b-aebf-3ead76975785-utilities\") pod \"community-operators-sh4sp\" (UID: \"445e68a7-1eae-460b-aebf-3ead76975785\") " pod="openshift-marketplace/community-operators-sh4sp"
Mar 14 08:26:33 crc kubenswrapper[5129]: I0314 08:26:33.962661 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmf8q\" (UniqueName: \"kubernetes.io/projected/445e68a7-1eae-460b-aebf-3ead76975785-kube-api-access-wmf8q\") pod \"community-operators-sh4sp\" (UID: \"445e68a7-1eae-460b-aebf-3ead76975785\") " pod="openshift-marketplace/community-operators-sh4sp"
Mar 14 08:26:34 crc kubenswrapper[5129]: I0314 08:26:34.018656 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sh4sp"
Mar 14 08:26:34 crc kubenswrapper[5129]: I0314 08:26:34.557979 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sh4sp"]
Mar 14 08:26:34 crc kubenswrapper[5129]: I0314 08:26:34.885135 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sh4sp" event={"ID":"445e68a7-1eae-460b-aebf-3ead76975785","Type":"ContainerStarted","Data":"c3f28093486f3ccab07961ec62a15b12f21b258ad17e2f786287df879036b795"}
Mar 14 08:26:34 crc kubenswrapper[5129]: I0314 08:26:34.885190 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sh4sp" event={"ID":"445e68a7-1eae-460b-aebf-3ead76975785","Type":"ContainerStarted","Data":"00a694bb9750cdfdfd0f9555c66608e21a8bd093f1555ac0b3d7cf2c3d61ba42"}
Mar 14 08:26:35 crc kubenswrapper[5129]: I0314 08:26:35.899178 5129 generic.go:334] "Generic (PLEG): container finished" podID="445e68a7-1eae-460b-aebf-3ead76975785" containerID="c3f28093486f3ccab07961ec62a15b12f21b258ad17e2f786287df879036b795" exitCode=0
Mar 14 08:26:35 crc kubenswrapper[5129]: I0314 08:26:35.899293 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sh4sp" event={"ID":"445e68a7-1eae-460b-aebf-3ead76975785","Type":"ContainerDied","Data":"c3f28093486f3ccab07961ec62a15b12f21b258ad17e2f786287df879036b795"}
Mar 14 08:26:36 crc kubenswrapper[5129]: I0314 08:26:36.914207 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sh4sp" event={"ID":"445e68a7-1eae-460b-aebf-3ead76975785","Type":"ContainerStarted","Data":"780a10c8a07c8b6557dedd4cd04e933b19a592bb2634768f36499ab0d7c8507c"}
Mar 14 08:26:37 crc kubenswrapper[5129]: I0314 08:26:37.925677 5129 generic.go:334] "Generic (PLEG): container finished" podID="445e68a7-1eae-460b-aebf-3ead76975785" containerID="780a10c8a07c8b6557dedd4cd04e933b19a592bb2634768f36499ab0d7c8507c" exitCode=0
Mar 14 08:26:37 crc kubenswrapper[5129]: I0314 08:26:37.925767 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sh4sp" event={"ID":"445e68a7-1eae-460b-aebf-3ead76975785","Type":"ContainerDied","Data":"780a10c8a07c8b6557dedd4cd04e933b19a592bb2634768f36499ab0d7c8507c"}
Mar 14 08:26:38 crc kubenswrapper[5129]: I0314 08:26:38.937577 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sh4sp" event={"ID":"445e68a7-1eae-460b-aebf-3ead76975785","Type":"ContainerStarted","Data":"b7ad5d47287125773136db46d7321672aaf060bef0e1c03af77fadda3e6ffe44"}
Mar 14 08:26:38 crc kubenswrapper[5129]: I0314 08:26:38.961060 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sh4sp" podStartSLOduration=3.425391698 podStartE2EDuration="5.961042982s" podCreationTimestamp="2026-03-14 08:26:33 +0000 UTC" firstStartedPulling="2026-03-14 08:26:35.903247027 +0000 UTC m=+5258.655162241" lastFinishedPulling="2026-03-14 08:26:38.438898341 +0000 UTC m=+5261.190813525" observedRunningTime="2026-03-14 08:26:38.954296349 +0000 UTC m=+5261.706211533" watchObservedRunningTime="2026-03-14 08:26:38.961042982 +0000 UTC m=+5261.712958166"
Mar 14 08:26:40 crc kubenswrapper[5129]: I0314 08:26:40.037224 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd"
Mar 14 08:26:40 crc kubenswrapper[5129]: E0314 08:26:40.037789 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:26:44 crc kubenswrapper[5129]: I0314 08:26:44.019866 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sh4sp"
Mar 14 08:26:44 crc kubenswrapper[5129]: I0314 08:26:44.020560 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sh4sp"
Mar 14 08:26:44 crc kubenswrapper[5129]: I0314 08:26:44.086010 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sh4sp"
Mar 14 08:26:45 crc kubenswrapper[5129]: I0314 08:26:45.069447 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sh4sp"
Mar 14 08:26:45 crc kubenswrapper[5129]: I0314 08:26:45.155228 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sh4sp"]
Mar 14 08:26:47 crc kubenswrapper[5129]: I0314 08:26:47.010123 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sh4sp" podUID="445e68a7-1eae-460b-aebf-3ead76975785" containerName="registry-server" containerID="cri-o://b7ad5d47287125773136db46d7321672aaf060bef0e1c03af77fadda3e6ffe44" gracePeriod=2
Mar 14 08:26:47 crc kubenswrapper[5129]: I0314 08:26:47.529329 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sh4sp"
Mar 14 08:26:47 crc kubenswrapper[5129]: I0314 08:26:47.686965 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/445e68a7-1eae-460b-aebf-3ead76975785-catalog-content\") pod \"445e68a7-1eae-460b-aebf-3ead76975785\" (UID: \"445e68a7-1eae-460b-aebf-3ead76975785\") "
Mar 14 08:26:47 crc kubenswrapper[5129]: I0314 08:26:47.687037 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/445e68a7-1eae-460b-aebf-3ead76975785-utilities\") pod \"445e68a7-1eae-460b-aebf-3ead76975785\" (UID: \"445e68a7-1eae-460b-aebf-3ead76975785\") "
Mar 14 08:26:47 crc kubenswrapper[5129]: I0314 08:26:47.687124 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmf8q\" (UniqueName: \"kubernetes.io/projected/445e68a7-1eae-460b-aebf-3ead76975785-kube-api-access-wmf8q\") pod \"445e68a7-1eae-460b-aebf-3ead76975785\" (UID: \"445e68a7-1eae-460b-aebf-3ead76975785\") "
Mar 14 08:26:47 crc kubenswrapper[5129]: I0314 08:26:47.689794 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/445e68a7-1eae-460b-aebf-3ead76975785-utilities" (OuterVolumeSpecName: "utilities") pod "445e68a7-1eae-460b-aebf-3ead76975785" (UID: "445e68a7-1eae-460b-aebf-3ead76975785"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:26:47 crc kubenswrapper[5129]: I0314 08:26:47.697641 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/445e68a7-1eae-460b-aebf-3ead76975785-kube-api-access-wmf8q" (OuterVolumeSpecName: "kube-api-access-wmf8q") pod "445e68a7-1eae-460b-aebf-3ead76975785" (UID: "445e68a7-1eae-460b-aebf-3ead76975785"). InnerVolumeSpecName "kube-api-access-wmf8q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:26:47 crc kubenswrapper[5129]: I0314 08:26:47.789090 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/445e68a7-1eae-460b-aebf-3ead76975785-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 08:26:47 crc kubenswrapper[5129]: I0314 08:26:47.789161 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmf8q\" (UniqueName: \"kubernetes.io/projected/445e68a7-1eae-460b-aebf-3ead76975785-kube-api-access-wmf8q\") on node \"crc\" DevicePath \"\""
Mar 14 08:26:48 crc kubenswrapper[5129]: I0314 08:26:48.026058 5129 generic.go:334] "Generic (PLEG): container finished" podID="445e68a7-1eae-460b-aebf-3ead76975785" containerID="b7ad5d47287125773136db46d7321672aaf060bef0e1c03af77fadda3e6ffe44" exitCode=0
Mar 14 08:26:48 crc kubenswrapper[5129]: I0314 08:26:48.026117 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sh4sp" event={"ID":"445e68a7-1eae-460b-aebf-3ead76975785","Type":"ContainerDied","Data":"b7ad5d47287125773136db46d7321672aaf060bef0e1c03af77fadda3e6ffe44"}
Mar 14 08:26:48 crc kubenswrapper[5129]: I0314 08:26:48.026157 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sh4sp" event={"ID":"445e68a7-1eae-460b-aebf-3ead76975785","Type":"ContainerDied","Data":"00a694bb9750cdfdfd0f9555c66608e21a8bd093f1555ac0b3d7cf2c3d61ba42"}
Mar 14 08:26:48 crc kubenswrapper[5129]: I0314 08:26:48.026188 5129 scope.go:117] "RemoveContainer" containerID="b7ad5d47287125773136db46d7321672aaf060bef0e1c03af77fadda3e6ffe44"
Mar 14 08:26:48 crc kubenswrapper[5129]: I0314 08:26:48.026371 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sh4sp"
Mar 14 08:26:48 crc kubenswrapper[5129]: I0314 08:26:48.065531 5129 scope.go:117] "RemoveContainer" containerID="780a10c8a07c8b6557dedd4cd04e933b19a592bb2634768f36499ab0d7c8507c"
Mar 14 08:26:48 crc kubenswrapper[5129]: I0314 08:26:48.102078 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/445e68a7-1eae-460b-aebf-3ead76975785-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "445e68a7-1eae-460b-aebf-3ead76975785" (UID: "445e68a7-1eae-460b-aebf-3ead76975785"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:26:48 crc kubenswrapper[5129]: I0314 08:26:48.104056 5129 scope.go:117] "RemoveContainer" containerID="c3f28093486f3ccab07961ec62a15b12f21b258ad17e2f786287df879036b795"
Mar 14 08:26:48 crc kubenswrapper[5129]: I0314 08:26:48.131810 5129 scope.go:117] "RemoveContainer" containerID="b7ad5d47287125773136db46d7321672aaf060bef0e1c03af77fadda3e6ffe44"
Mar 14 08:26:48 crc kubenswrapper[5129]: E0314 08:26:48.132510 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7ad5d47287125773136db46d7321672aaf060bef0e1c03af77fadda3e6ffe44\": container with ID starting with b7ad5d47287125773136db46d7321672aaf060bef0e1c03af77fadda3e6ffe44 not found: ID does not exist" containerID="b7ad5d47287125773136db46d7321672aaf060bef0e1c03af77fadda3e6ffe44"
Mar 14 08:26:48 crc kubenswrapper[5129]: I0314 08:26:48.132576 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7ad5d47287125773136db46d7321672aaf060bef0e1c03af77fadda3e6ffe44"} err="failed to get container status \"b7ad5d47287125773136db46d7321672aaf060bef0e1c03af77fadda3e6ffe44\": rpc error: code = NotFound desc = could not find container \"b7ad5d47287125773136db46d7321672aaf060bef0e1c03af77fadda3e6ffe44\": container with ID starting with b7ad5d47287125773136db46d7321672aaf060bef0e1c03af77fadda3e6ffe44 not found: ID does not exist"
Mar 14 08:26:48 crc kubenswrapper[5129]: I0314 08:26:48.132662 5129 scope.go:117] "RemoveContainer" containerID="780a10c8a07c8b6557dedd4cd04e933b19a592bb2634768f36499ab0d7c8507c"
Mar 14 08:26:48 crc kubenswrapper[5129]: E0314 08:26:48.133250 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"780a10c8a07c8b6557dedd4cd04e933b19a592bb2634768f36499ab0d7c8507c\": container with ID starting with 780a10c8a07c8b6557dedd4cd04e933b19a592bb2634768f36499ab0d7c8507c not found: ID does not exist" containerID="780a10c8a07c8b6557dedd4cd04e933b19a592bb2634768f36499ab0d7c8507c"
Mar 14 08:26:48 crc kubenswrapper[5129]: I0314 08:26:48.133327 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"780a10c8a07c8b6557dedd4cd04e933b19a592bb2634768f36499ab0d7c8507c"} err="failed to get container status \"780a10c8a07c8b6557dedd4cd04e933b19a592bb2634768f36499ab0d7c8507c\": rpc error: code = NotFound desc = could not find container \"780a10c8a07c8b6557dedd4cd04e933b19a592bb2634768f36499ab0d7c8507c\": container with ID starting with 780a10c8a07c8b6557dedd4cd04e933b19a592bb2634768f36499ab0d7c8507c not found: ID does not exist"
Mar 14 08:26:48 crc kubenswrapper[5129]: I0314 08:26:48.133368 5129 scope.go:117] "RemoveContainer" containerID="c3f28093486f3ccab07961ec62a15b12f21b258ad17e2f786287df879036b795"
Mar 14 08:26:48 crc kubenswrapper[5129]: E0314 08:26:48.133909 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3f28093486f3ccab07961ec62a15b12f21b258ad17e2f786287df879036b795\": container with ID starting with c3f28093486f3ccab07961ec62a15b12f21b258ad17e2f786287df879036b795 not found: ID does not exist" containerID="c3f28093486f3ccab07961ec62a15b12f21b258ad17e2f786287df879036b795"
Mar 14 08:26:48 crc kubenswrapper[5129]: I0314 08:26:48.133955 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f28093486f3ccab07961ec62a15b12f21b258ad17e2f786287df879036b795"} err="failed to get container status \"c3f28093486f3ccab07961ec62a15b12f21b258ad17e2f786287df879036b795\": rpc error: code = NotFound desc = could not find container \"c3f28093486f3ccab07961ec62a15b12f21b258ad17e2f786287df879036b795\": container with ID starting with c3f28093486f3ccab07961ec62a15b12f21b258ad17e2f786287df879036b795 not found: ID does not exist"
Mar 14 08:26:48 crc kubenswrapper[5129]: I0314 08:26:48.197313 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/445e68a7-1eae-460b-aebf-3ead76975785-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 08:26:48 crc kubenswrapper[5129]: I0314 08:26:48.384710 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sh4sp"]
Mar 14 08:26:48 crc kubenswrapper[5129]: I0314 08:26:48.396452 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sh4sp"]
Mar 14 08:26:50 crc kubenswrapper[5129]: I0314 08:26:50.045581 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="445e68a7-1eae-460b-aebf-3ead76975785" path="/var/lib/kubelet/pods/445e68a7-1eae-460b-aebf-3ead76975785/volumes"
Mar 14 08:26:55 crc kubenswrapper[5129]: I0314 08:26:55.036514 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd"
Mar 14 08:26:55 crc kubenswrapper[5129]: E0314 08:26:55.037311 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:27:07 crc kubenswrapper[5129]: I0314 08:27:07.036477 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd"
Mar 14 08:27:07 crc kubenswrapper[5129]: E0314 08:27:07.037367 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:27:20 crc kubenswrapper[5129]: I0314 08:27:20.035833 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd"
Mar 14 08:27:20 crc kubenswrapper[5129]: E0314 08:27:20.036510 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:27:33 crc kubenswrapper[5129]: I0314 08:27:33.037361 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd"
Mar 14 08:27:33 crc kubenswrapper[5129]: E0314 08:27:33.039374 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:27:45 crc kubenswrapper[5129]: I0314 08:27:45.037282 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd"
Mar 14 08:27:45 crc kubenswrapper[5129]: E0314 08:27:45.038541 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:27:57 crc kubenswrapper[5129]: I0314 08:27:57.037816 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd"
Mar 14 08:27:57 crc kubenswrapper[5129]: E0314 08:27:57.039318 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:28:00 crc kubenswrapper[5129]: I0314 08:28:00.169735 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557948-kcb48"]
Mar 14 08:28:00 crc kubenswrapper[5129]: E0314 08:28:00.170808 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="445e68a7-1eae-460b-aebf-3ead76975785" containerName="registry-server"
Mar 14 08:28:00 crc kubenswrapper[5129]: I0314 08:28:00.170827 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="445e68a7-1eae-460b-aebf-3ead76975785" containerName="registry-server"
Mar 14 08:28:00 crc kubenswrapper[5129]: E0314 08:28:00.170841 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="445e68a7-1eae-460b-aebf-3ead76975785" containerName="extract-utilities"
Mar 14 08:28:00 crc kubenswrapper[5129]: I0314 08:28:00.170848 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="445e68a7-1eae-460b-aebf-3ead76975785" containerName="extract-utilities"
Mar 14 08:28:00 crc kubenswrapper[5129]: E0314 08:28:00.170873 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="445e68a7-1eae-460b-aebf-3ead76975785" containerName="extract-content"
Mar 14 08:28:00 crc kubenswrapper[5129]: I0314 08:28:00.170881 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="445e68a7-1eae-460b-aebf-3ead76975785" containerName="extract-content"
Mar 14 08:28:00 crc kubenswrapper[5129]: I0314 08:28:00.171072 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="445e68a7-1eae-460b-aebf-3ead76975785" containerName="registry-server"
Mar 14 08:28:00 crc kubenswrapper[5129]: I0314 08:28:00.171917 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557948-kcb48"
Mar 14 08:28:00 crc kubenswrapper[5129]: I0314 08:28:00.175627 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 08:28:00 crc kubenswrapper[5129]: I0314 08:28:00.180028 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf"
Mar 14 08:28:00 crc kubenswrapper[5129]: I0314 08:28:00.190134 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 08:28:00 crc kubenswrapper[5129]: I0314 08:28:00.196236 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557948-kcb48"]
Mar 14 08:28:00 crc kubenswrapper[5129]: I0314 08:28:00.239231 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7lhd\" (UniqueName: \"kubernetes.io/projected/96b0dbbb-30c1-4812-880b-5f1384846f82-kube-api-access-w7lhd\") pod \"auto-csr-approver-29557948-kcb48\" (UID: \"96b0dbbb-30c1-4812-880b-5f1384846f82\") " pod="openshift-infra/auto-csr-approver-29557948-kcb48"
Mar 14 08:28:00 crc kubenswrapper[5129]: I0314 08:28:00.341999 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7lhd\" (UniqueName: \"kubernetes.io/projected/96b0dbbb-30c1-4812-880b-5f1384846f82-kube-api-access-w7lhd\") pod \"auto-csr-approver-29557948-kcb48\" (UID: \"96b0dbbb-30c1-4812-880b-5f1384846f82\") " pod="openshift-infra/auto-csr-approver-29557948-kcb48"
Mar 14 08:28:00 crc kubenswrapper[5129]: I0314 08:28:00.381391 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7lhd\" (UniqueName: \"kubernetes.io/projected/96b0dbbb-30c1-4812-880b-5f1384846f82-kube-api-access-w7lhd\") pod \"auto-csr-approver-29557948-kcb48\" (UID: \"96b0dbbb-30c1-4812-880b-5f1384846f82\") " pod="openshift-infra/auto-csr-approver-29557948-kcb48"
Mar 14 08:28:00 crc kubenswrapper[5129]: I0314 08:28:00.504534 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557948-kcb48"
Mar 14 08:28:01 crc kubenswrapper[5129]: I0314 08:28:01.030741 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557948-kcb48"]
Mar 14 08:28:01 crc kubenswrapper[5129]: I0314 08:28:01.738983 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557948-kcb48" event={"ID":"96b0dbbb-30c1-4812-880b-5f1384846f82","Type":"ContainerStarted","Data":"ee22d7ca00656dc8e2f26c3f8d7d01ddeda66631696b785337a59945846638ff"}
Mar 14 08:28:02 crc kubenswrapper[5129]: I0314 08:28:02.753466 5129 generic.go:334] "Generic (PLEG): container finished" podID="96b0dbbb-30c1-4812-880b-5f1384846f82" containerID="2c51730ec0bf9f36ddca04f79c357bb6a930ffe33c0a94e3f7dbd10dd399c28b" exitCode=0
Mar 14 08:28:02 crc kubenswrapper[5129]: I0314 08:28:02.753509 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557948-kcb48" event={"ID":"96b0dbbb-30c1-4812-880b-5f1384846f82","Type":"ContainerDied","Data":"2c51730ec0bf9f36ddca04f79c357bb6a930ffe33c0a94e3f7dbd10dd399c28b"}
Mar 14 08:28:04 crc kubenswrapper[5129]: I0314 08:28:04.101895 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557948-kcb48"
Mar 14 08:28:04 crc kubenswrapper[5129]: I0314 08:28:04.206680 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7lhd\" (UniqueName: \"kubernetes.io/projected/96b0dbbb-30c1-4812-880b-5f1384846f82-kube-api-access-w7lhd\") pod \"96b0dbbb-30c1-4812-880b-5f1384846f82\" (UID: \"96b0dbbb-30c1-4812-880b-5f1384846f82\") "
Mar 14 08:28:04 crc kubenswrapper[5129]: I0314 08:28:04.213351 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b0dbbb-30c1-4812-880b-5f1384846f82-kube-api-access-w7lhd" (OuterVolumeSpecName: "kube-api-access-w7lhd") pod "96b0dbbb-30c1-4812-880b-5f1384846f82" (UID: "96b0dbbb-30c1-4812-880b-5f1384846f82"). InnerVolumeSpecName "kube-api-access-w7lhd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:28:04 crc kubenswrapper[5129]: I0314 08:28:04.309403 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7lhd\" (UniqueName: \"kubernetes.io/projected/96b0dbbb-30c1-4812-880b-5f1384846f82-kube-api-access-w7lhd\") on node \"crc\" DevicePath \"\""
Mar 14 08:28:04 crc kubenswrapper[5129]: I0314 08:28:04.784470 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557948-kcb48" event={"ID":"96b0dbbb-30c1-4812-880b-5f1384846f82","Type":"ContainerDied","Data":"ee22d7ca00656dc8e2f26c3f8d7d01ddeda66631696b785337a59945846638ff"}
Mar 14 08:28:04 crc kubenswrapper[5129]: I0314 08:28:04.784567 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee22d7ca00656dc8e2f26c3f8d7d01ddeda66631696b785337a59945846638ff"
Mar 14 08:28:04 crc kubenswrapper[5129]: I0314 08:28:04.784716 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557948-kcb48"
Mar 14 08:28:05 crc kubenswrapper[5129]: I0314 08:28:05.197838 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557942-d6h5f"]
Mar 14 08:28:05 crc kubenswrapper[5129]: I0314 08:28:05.205498 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557942-d6h5f"]
Mar 14 08:28:06 crc kubenswrapper[5129]: I0314 08:28:06.054027 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9be32f2-6e41-4329-857f-b81cd9dc21b6" path="/var/lib/kubelet/pods/e9be32f2-6e41-4329-857f-b81cd9dc21b6/volumes"
Mar 14 08:28:11 crc kubenswrapper[5129]: I0314 08:28:11.036893 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd"
Mar 14 08:28:11 crc kubenswrapper[5129]: E0314 08:28:11.037950 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:28:23 crc kubenswrapper[5129]: I0314 08:28:23.036859 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd"
Mar 14 08:28:23 crc kubenswrapper[5129]: I0314 08:28:23.963862 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"c83e0d17e7239a4e3d626b98c707f403777782f6beee2546f9008683acbabe1b"}
Mar 14 08:28:25 crc kubenswrapper[5129]: I0314 08:28:25.208317 5129 scope.go:117] "RemoveContainer" containerID="0ed3e9e127d72ce74cd155cbdc6a251638f472f64e6c74e27a731b032287e502"
Mar 14 08:28:36 crc kubenswrapper[5129]: I0314 08:28:36.884696 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qdqx5"]
Mar 14 08:28:36 crc kubenswrapper[5129]: E0314 08:28:36.885736 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b0dbbb-30c1-4812-880b-5f1384846f82" containerName="oc"
Mar 14 08:28:36 crc kubenswrapper[5129]: I0314 08:28:36.885757 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b0dbbb-30c1-4812-880b-5f1384846f82" containerName="oc"
Mar 14 08:28:36 crc kubenswrapper[5129]: I0314 08:28:36.886012 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="96b0dbbb-30c1-4812-880b-5f1384846f82" containerName="oc"
Mar 14 08:28:36 crc kubenswrapper[5129]: I0314 08:28:36.887627 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qdqx5"
Mar 14 08:28:36 crc kubenswrapper[5129]: I0314 08:28:36.893175 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qdqx5"]
Mar 14 08:28:37 crc kubenswrapper[5129]: I0314 08:28:37.059724 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjc2k\" (UniqueName: \"kubernetes.io/projected/233996d7-58ec-4eab-bfb3-88edfb2272b3-kube-api-access-gjc2k\") pod \"certified-operators-qdqx5\" (UID: \"233996d7-58ec-4eab-bfb3-88edfb2272b3\") " pod="openshift-marketplace/certified-operators-qdqx5"
Mar 14 08:28:37 crc kubenswrapper[5129]: I0314 08:28:37.059818 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/233996d7-58ec-4eab-bfb3-88edfb2272b3-utilities\") pod \"certified-operators-qdqx5\" (UID: \"233996d7-58ec-4eab-bfb3-88edfb2272b3\") " pod="openshift-marketplace/certified-operators-qdqx5"
Mar 14 08:28:37 crc kubenswrapper[5129]: I0314 08:28:37.059843 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/233996d7-58ec-4eab-bfb3-88edfb2272b3-catalog-content\") pod \"certified-operators-qdqx5\" (UID: \"233996d7-58ec-4eab-bfb3-88edfb2272b3\") " pod="openshift-marketplace/certified-operators-qdqx5"
Mar 14 08:28:37 crc kubenswrapper[5129]: I0314 08:28:37.162117 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjc2k\" (UniqueName: \"kubernetes.io/projected/233996d7-58ec-4eab-bfb3-88edfb2272b3-kube-api-access-gjc2k\") pod \"certified-operators-qdqx5\" (UID: \"233996d7-58ec-4eab-bfb3-88edfb2272b3\") " pod="openshift-marketplace/certified-operators-qdqx5"
Mar 14 08:28:37 crc kubenswrapper[5129]: I0314 08:28:37.162274 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/233996d7-58ec-4eab-bfb3-88edfb2272b3-utilities\") pod \"certified-operators-qdqx5\" (UID: \"233996d7-58ec-4eab-bfb3-88edfb2272b3\") " pod="openshift-marketplace/certified-operators-qdqx5"
Mar 14 08:28:37 crc kubenswrapper[5129]: I0314 08:28:37.162329 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/233996d7-58ec-4eab-bfb3-88edfb2272b3-catalog-content\") pod \"certified-operators-qdqx5\" (UID: \"233996d7-58ec-4eab-bfb3-88edfb2272b3\") " pod="openshift-marketplace/certified-operators-qdqx5"
Mar 14 08:28:37 crc kubenswrapper[5129]: I0314 08:28:37.162880 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/233996d7-58ec-4eab-bfb3-88edfb2272b3-utilities\") pod \"certified-operators-qdqx5\" (UID: \"233996d7-58ec-4eab-bfb3-88edfb2272b3\") " pod="openshift-marketplace/certified-operators-qdqx5"
Mar 14 08:28:37 crc kubenswrapper[5129]: I0314 08:28:37.163058 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/233996d7-58ec-4eab-bfb3-88edfb2272b3-catalog-content\") pod \"certified-operators-qdqx5\" (UID: \"233996d7-58ec-4eab-bfb3-88edfb2272b3\") " pod="openshift-marketplace/certified-operators-qdqx5"
Mar 14 08:28:37 crc kubenswrapper[5129]: I0314 08:28:37.181587 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjc2k\" (UniqueName: \"kubernetes.io/projected/233996d7-58ec-4eab-bfb3-88edfb2272b3-kube-api-access-gjc2k\") pod \"certified-operators-qdqx5\" (UID: \"233996d7-58ec-4eab-bfb3-88edfb2272b3\") " pod="openshift-marketplace/certified-operators-qdqx5"
Mar 14 08:28:37 crc kubenswrapper[5129]: I0314 08:28:37.221300 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qdqx5"
Mar 14 08:28:37 crc kubenswrapper[5129]: I0314 08:28:37.685958 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qdqx5"]
Mar 14 08:28:37 crc kubenswrapper[5129]: W0314 08:28:37.690356 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod233996d7_58ec_4eab_bfb3_88edfb2272b3.slice/crio-9c9ed58c9c0e2b1eb18912968e539cdba34b72604fab144227cf58d6b5b484f0 WatchSource:0}: Error finding container 9c9ed58c9c0e2b1eb18912968e539cdba34b72604fab144227cf58d6b5b484f0: Status 404 returned error can't find the container with id 9c9ed58c9c0e2b1eb18912968e539cdba34b72604fab144227cf58d6b5b484f0
Mar 14 08:28:38 crc kubenswrapper[5129]: I0314 08:28:38.107710 5129 generic.go:334] "Generic (PLEG): container finished" podID="233996d7-58ec-4eab-bfb3-88edfb2272b3" containerID="3451f28a6c6b09783af2838c2c76029a60d059ebd85735079e1d2c75613df226" exitCode=0
Mar 14 08:28:38 crc kubenswrapper[5129]: I0314 08:28:38.107773 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdqx5" event={"ID":"233996d7-58ec-4eab-bfb3-88edfb2272b3","Type":"ContainerDied","Data":"3451f28a6c6b09783af2838c2c76029a60d059ebd85735079e1d2c75613df226"}
Mar 14 08:28:38 crc kubenswrapper[5129]: I0314 08:28:38.108088 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdqx5" event={"ID":"233996d7-58ec-4eab-bfb3-88edfb2272b3","Type":"ContainerStarted","Data":"9c9ed58c9c0e2b1eb18912968e539cdba34b72604fab144227cf58d6b5b484f0"}
Mar 14 08:28:39 crc kubenswrapper[5129]: I0314 08:28:39.288572 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j6nfh"]
Mar 14 08:28:39 crc kubenswrapper[5129]: I0314 08:28:39.293453 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6nfh"
Mar 14 08:28:39 crc kubenswrapper[5129]: I0314 08:28:39.307356 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j6nfh"]
Mar 14 08:28:39 crc kubenswrapper[5129]: I0314 08:28:39.393665 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da040da-b094-47dc-9617-57daf1f2fc48-utilities\") pod \"redhat-operators-j6nfh\" (UID: \"4da040da-b094-47dc-9617-57daf1f2fc48\") " pod="openshift-marketplace/redhat-operators-j6nfh"
Mar 14 08:28:39 crc kubenswrapper[5129]: I0314 08:28:39.393840 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da040da-b094-47dc-9617-57daf1f2fc48-catalog-content\") pod \"redhat-operators-j6nfh\" (UID: \"4da040da-b094-47dc-9617-57daf1f2fc48\") " pod="openshift-marketplace/redhat-operators-j6nfh"
Mar 14 08:28:39 crc kubenswrapper[5129]: I0314 08:28:39.393971 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf5kh\" (UniqueName: \"kubernetes.io/projected/4da040da-b094-47dc-9617-57daf1f2fc48-kube-api-access-vf5kh\") pod \"redhat-operators-j6nfh\" (UID: \"4da040da-b094-47dc-9617-57daf1f2fc48\") " pod="openshift-marketplace/redhat-operators-j6nfh"
Mar 14 08:28:39 crc kubenswrapper[5129]: I0314 08:28:39.495416 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da040da-b094-47dc-9617-57daf1f2fc48-utilities\") pod \"redhat-operators-j6nfh\" (UID: \"4da040da-b094-47dc-9617-57daf1f2fc48\") " pod="openshift-marketplace/redhat-operators-j6nfh"
Mar 14 08:28:39 crc kubenswrapper[5129]: I0314 08:28:39.495855 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da040da-b094-47dc-9617-57daf1f2fc48-utilities\") pod \"redhat-operators-j6nfh\" (UID: \"4da040da-b094-47dc-9617-57daf1f2fc48\") " pod="openshift-marketplace/redhat-operators-j6nfh"
Mar 14 08:28:39 crc kubenswrapper[5129]: I0314 08:28:39.495888 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da040da-b094-47dc-9617-57daf1f2fc48-catalog-content\") pod \"redhat-operators-j6nfh\" (UID: \"4da040da-b094-47dc-9617-57daf1f2fc48\") " pod="openshift-marketplace/redhat-operators-j6nfh"
Mar 14 08:28:39 crc kubenswrapper[5129]: I0314 08:28:39.495998 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf5kh\" (UniqueName: \"kubernetes.io/projected/4da040da-b094-47dc-9617-57daf1f2fc48-kube-api-access-vf5kh\") pod \"redhat-operators-j6nfh\" (UID: \"4da040da-b094-47dc-9617-57daf1f2fc48\") " pod="openshift-marketplace/redhat-operators-j6nfh"
Mar 14 08:28:39 crc kubenswrapper[5129]: 
I0314 08:28:39.496740 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da040da-b094-47dc-9617-57daf1f2fc48-catalog-content\") pod \"redhat-operators-j6nfh\" (UID: \"4da040da-b094-47dc-9617-57daf1f2fc48\") " pod="openshift-marketplace/redhat-operators-j6nfh" Mar 14 08:28:39 crc kubenswrapper[5129]: I0314 08:28:39.521718 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf5kh\" (UniqueName: \"kubernetes.io/projected/4da040da-b094-47dc-9617-57daf1f2fc48-kube-api-access-vf5kh\") pod \"redhat-operators-j6nfh\" (UID: \"4da040da-b094-47dc-9617-57daf1f2fc48\") " pod="openshift-marketplace/redhat-operators-j6nfh" Mar 14 08:28:39 crc kubenswrapper[5129]: I0314 08:28:39.630636 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6nfh" Mar 14 08:28:40 crc kubenswrapper[5129]: I0314 08:28:40.084099 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j6nfh"] Mar 14 08:28:40 crc kubenswrapper[5129]: W0314 08:28:40.088130 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4da040da_b094_47dc_9617_57daf1f2fc48.slice/crio-2f41257253c564a7b2f6ec27ff4e690fbbc9c63d72ea9f373fb13b0360f863be WatchSource:0}: Error finding container 2f41257253c564a7b2f6ec27ff4e690fbbc9c63d72ea9f373fb13b0360f863be: Status 404 returned error can't find the container with id 2f41257253c564a7b2f6ec27ff4e690fbbc9c63d72ea9f373fb13b0360f863be Mar 14 08:28:40 crc kubenswrapper[5129]: I0314 08:28:40.127278 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6nfh" event={"ID":"4da040da-b094-47dc-9617-57daf1f2fc48","Type":"ContainerStarted","Data":"2f41257253c564a7b2f6ec27ff4e690fbbc9c63d72ea9f373fb13b0360f863be"} Mar 14 08:28:40 crc kubenswrapper[5129]: 
I0314 08:28:40.130833 5129 generic.go:334] "Generic (PLEG): container finished" podID="233996d7-58ec-4eab-bfb3-88edfb2272b3" containerID="b4ab5ebaa0f19c76031efa60b7799ac34adb732b55753741e4209c495fe56a9a" exitCode=0 Mar 14 08:28:40 crc kubenswrapper[5129]: I0314 08:28:40.130867 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdqx5" event={"ID":"233996d7-58ec-4eab-bfb3-88edfb2272b3","Type":"ContainerDied","Data":"b4ab5ebaa0f19c76031efa60b7799ac34adb732b55753741e4209c495fe56a9a"} Mar 14 08:28:41 crc kubenswrapper[5129]: I0314 08:28:41.144846 5129 generic.go:334] "Generic (PLEG): container finished" podID="4da040da-b094-47dc-9617-57daf1f2fc48" containerID="cb15c64049f82971d2a03a5f6e7e289278e70149f3becfe10d902dbaa8abd93c" exitCode=0 Mar 14 08:28:41 crc kubenswrapper[5129]: I0314 08:28:41.144994 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6nfh" event={"ID":"4da040da-b094-47dc-9617-57daf1f2fc48","Type":"ContainerDied","Data":"cb15c64049f82971d2a03a5f6e7e289278e70149f3becfe10d902dbaa8abd93c"} Mar 14 08:28:41 crc kubenswrapper[5129]: I0314 08:28:41.152782 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdqx5" event={"ID":"233996d7-58ec-4eab-bfb3-88edfb2272b3","Type":"ContainerStarted","Data":"af078187d21dd2a7f8b1ade355f634f8816b531621b0bd8d11df55f1c7ac1283"} Mar 14 08:28:41 crc kubenswrapper[5129]: I0314 08:28:41.195392 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qdqx5" podStartSLOduration=2.700937058 podStartE2EDuration="5.195365803s" podCreationTimestamp="2026-03-14 08:28:36 +0000 UTC" firstStartedPulling="2026-03-14 08:28:38.109674271 +0000 UTC m=+5380.861589465" lastFinishedPulling="2026-03-14 08:28:40.604103036 +0000 UTC m=+5383.356018210" observedRunningTime="2026-03-14 08:28:41.19264633 +0000 UTC m=+5383.944561534" 
watchObservedRunningTime="2026-03-14 08:28:41.195365803 +0000 UTC m=+5383.947280987" Mar 14 08:28:42 crc kubenswrapper[5129]: I0314 08:28:42.161108 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6nfh" event={"ID":"4da040da-b094-47dc-9617-57daf1f2fc48","Type":"ContainerStarted","Data":"4a4a00ccf6ce4a2f37eb5de6dd397c08971f3367ac793644a3bee2293273c357"} Mar 14 08:28:43 crc kubenswrapper[5129]: I0314 08:28:43.173995 5129 generic.go:334] "Generic (PLEG): container finished" podID="4da040da-b094-47dc-9617-57daf1f2fc48" containerID="4a4a00ccf6ce4a2f37eb5de6dd397c08971f3367ac793644a3bee2293273c357" exitCode=0 Mar 14 08:28:43 crc kubenswrapper[5129]: I0314 08:28:43.174066 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6nfh" event={"ID":"4da040da-b094-47dc-9617-57daf1f2fc48","Type":"ContainerDied","Data":"4a4a00ccf6ce4a2f37eb5de6dd397c08971f3367ac793644a3bee2293273c357"} Mar 14 08:28:44 crc kubenswrapper[5129]: I0314 08:28:44.187494 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6nfh" event={"ID":"4da040da-b094-47dc-9617-57daf1f2fc48","Type":"ContainerStarted","Data":"3e2d9cbae866310b0c2ba385600b5e36ac2c134d2def990dcfd2fa2bf13d4a4a"} Mar 14 08:28:44 crc kubenswrapper[5129]: I0314 08:28:44.211723 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j6nfh" podStartSLOduration=2.759239557 podStartE2EDuration="5.211691343s" podCreationTimestamp="2026-03-14 08:28:39 +0000 UTC" firstStartedPulling="2026-03-14 08:28:41.149097826 +0000 UTC m=+5383.901013020" lastFinishedPulling="2026-03-14 08:28:43.601549612 +0000 UTC m=+5386.353464806" observedRunningTime="2026-03-14 08:28:44.209841352 +0000 UTC m=+5386.961756606" watchObservedRunningTime="2026-03-14 08:28:44.211691343 +0000 UTC m=+5386.963606567" Mar 14 08:28:47 crc kubenswrapper[5129]: I0314 08:28:47.221657 
5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qdqx5" Mar 14 08:28:47 crc kubenswrapper[5129]: I0314 08:28:47.222063 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qdqx5" Mar 14 08:28:47 crc kubenswrapper[5129]: I0314 08:28:47.286551 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qdqx5" Mar 14 08:28:48 crc kubenswrapper[5129]: I0314 08:28:48.275511 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qdqx5" Mar 14 08:28:48 crc kubenswrapper[5129]: I0314 08:28:48.481043 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qdqx5"] Mar 14 08:28:49 crc kubenswrapper[5129]: I0314 08:28:49.631844 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j6nfh" Mar 14 08:28:49 crc kubenswrapper[5129]: I0314 08:28:49.632514 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j6nfh" Mar 14 08:28:50 crc kubenswrapper[5129]: I0314 08:28:50.238021 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qdqx5" podUID="233996d7-58ec-4eab-bfb3-88edfb2272b3" containerName="registry-server" containerID="cri-o://af078187d21dd2a7f8b1ade355f634f8816b531621b0bd8d11df55f1c7ac1283" gracePeriod=2 Mar 14 08:28:50 crc kubenswrapper[5129]: I0314 08:28:50.618939 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qdqx5" Mar 14 08:28:50 crc kubenswrapper[5129]: I0314 08:28:50.688881 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j6nfh" podUID="4da040da-b094-47dc-9617-57daf1f2fc48" containerName="registry-server" probeResult="failure" output=< Mar 14 08:28:50 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 08:28:50 crc kubenswrapper[5129]: > Mar 14 08:28:50 crc kubenswrapper[5129]: I0314 08:28:50.776937 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjc2k\" (UniqueName: \"kubernetes.io/projected/233996d7-58ec-4eab-bfb3-88edfb2272b3-kube-api-access-gjc2k\") pod \"233996d7-58ec-4eab-bfb3-88edfb2272b3\" (UID: \"233996d7-58ec-4eab-bfb3-88edfb2272b3\") " Mar 14 08:28:50 crc kubenswrapper[5129]: I0314 08:28:50.777087 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/233996d7-58ec-4eab-bfb3-88edfb2272b3-catalog-content\") pod \"233996d7-58ec-4eab-bfb3-88edfb2272b3\" (UID: \"233996d7-58ec-4eab-bfb3-88edfb2272b3\") " Mar 14 08:28:50 crc kubenswrapper[5129]: I0314 08:28:50.777139 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/233996d7-58ec-4eab-bfb3-88edfb2272b3-utilities\") pod \"233996d7-58ec-4eab-bfb3-88edfb2272b3\" (UID: \"233996d7-58ec-4eab-bfb3-88edfb2272b3\") " Mar 14 08:28:50 crc kubenswrapper[5129]: I0314 08:28:50.778251 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/233996d7-58ec-4eab-bfb3-88edfb2272b3-utilities" (OuterVolumeSpecName: "utilities") pod "233996d7-58ec-4eab-bfb3-88edfb2272b3" (UID: "233996d7-58ec-4eab-bfb3-88edfb2272b3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:28:50 crc kubenswrapper[5129]: I0314 08:28:50.788846 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/233996d7-58ec-4eab-bfb3-88edfb2272b3-kube-api-access-gjc2k" (OuterVolumeSpecName: "kube-api-access-gjc2k") pod "233996d7-58ec-4eab-bfb3-88edfb2272b3" (UID: "233996d7-58ec-4eab-bfb3-88edfb2272b3"). InnerVolumeSpecName "kube-api-access-gjc2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:28:50 crc kubenswrapper[5129]: I0314 08:28:50.879182 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjc2k\" (UniqueName: \"kubernetes.io/projected/233996d7-58ec-4eab-bfb3-88edfb2272b3-kube-api-access-gjc2k\") on node \"crc\" DevicePath \"\"" Mar 14 08:28:50 crc kubenswrapper[5129]: I0314 08:28:50.879214 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/233996d7-58ec-4eab-bfb3-88edfb2272b3-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:28:51 crc kubenswrapper[5129]: I0314 08:28:51.254448 5129 generic.go:334] "Generic (PLEG): container finished" podID="233996d7-58ec-4eab-bfb3-88edfb2272b3" containerID="af078187d21dd2a7f8b1ade355f634f8816b531621b0bd8d11df55f1c7ac1283" exitCode=0 Mar 14 08:28:51 crc kubenswrapper[5129]: I0314 08:28:51.254496 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdqx5" event={"ID":"233996d7-58ec-4eab-bfb3-88edfb2272b3","Type":"ContainerDied","Data":"af078187d21dd2a7f8b1ade355f634f8816b531621b0bd8d11df55f1c7ac1283"} Mar 14 08:28:51 crc kubenswrapper[5129]: I0314 08:28:51.254523 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdqx5" event={"ID":"233996d7-58ec-4eab-bfb3-88edfb2272b3","Type":"ContainerDied","Data":"9c9ed58c9c0e2b1eb18912968e539cdba34b72604fab144227cf58d6b5b484f0"} Mar 14 08:28:51 crc kubenswrapper[5129]: 
I0314 08:28:51.254542 5129 scope.go:117] "RemoveContainer" containerID="af078187d21dd2a7f8b1ade355f634f8816b531621b0bd8d11df55f1c7ac1283" Mar 14 08:28:51 crc kubenswrapper[5129]: I0314 08:28:51.254721 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qdqx5" Mar 14 08:28:51 crc kubenswrapper[5129]: I0314 08:28:51.282428 5129 scope.go:117] "RemoveContainer" containerID="b4ab5ebaa0f19c76031efa60b7799ac34adb732b55753741e4209c495fe56a9a" Mar 14 08:28:51 crc kubenswrapper[5129]: I0314 08:28:51.307206 5129 scope.go:117] "RemoveContainer" containerID="3451f28a6c6b09783af2838c2c76029a60d059ebd85735079e1d2c75613df226" Mar 14 08:28:51 crc kubenswrapper[5129]: I0314 08:28:51.317935 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/233996d7-58ec-4eab-bfb3-88edfb2272b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "233996d7-58ec-4eab-bfb3-88edfb2272b3" (UID: "233996d7-58ec-4eab-bfb3-88edfb2272b3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:28:51 crc kubenswrapper[5129]: I0314 08:28:51.348496 5129 scope.go:117] "RemoveContainer" containerID="af078187d21dd2a7f8b1ade355f634f8816b531621b0bd8d11df55f1c7ac1283" Mar 14 08:28:51 crc kubenswrapper[5129]: E0314 08:28:51.350062 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af078187d21dd2a7f8b1ade355f634f8816b531621b0bd8d11df55f1c7ac1283\": container with ID starting with af078187d21dd2a7f8b1ade355f634f8816b531621b0bd8d11df55f1c7ac1283 not found: ID does not exist" containerID="af078187d21dd2a7f8b1ade355f634f8816b531621b0bd8d11df55f1c7ac1283" Mar 14 08:28:51 crc kubenswrapper[5129]: I0314 08:28:51.350098 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af078187d21dd2a7f8b1ade355f634f8816b531621b0bd8d11df55f1c7ac1283"} err="failed to get container status \"af078187d21dd2a7f8b1ade355f634f8816b531621b0bd8d11df55f1c7ac1283\": rpc error: code = NotFound desc = could not find container \"af078187d21dd2a7f8b1ade355f634f8816b531621b0bd8d11df55f1c7ac1283\": container with ID starting with af078187d21dd2a7f8b1ade355f634f8816b531621b0bd8d11df55f1c7ac1283 not found: ID does not exist" Mar 14 08:28:51 crc kubenswrapper[5129]: I0314 08:28:51.350117 5129 scope.go:117] "RemoveContainer" containerID="b4ab5ebaa0f19c76031efa60b7799ac34adb732b55753741e4209c495fe56a9a" Mar 14 08:28:51 crc kubenswrapper[5129]: E0314 08:28:51.350528 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4ab5ebaa0f19c76031efa60b7799ac34adb732b55753741e4209c495fe56a9a\": container with ID starting with b4ab5ebaa0f19c76031efa60b7799ac34adb732b55753741e4209c495fe56a9a not found: ID does not exist" containerID="b4ab5ebaa0f19c76031efa60b7799ac34adb732b55753741e4209c495fe56a9a" Mar 14 08:28:51 crc kubenswrapper[5129]: I0314 08:28:51.350546 
5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4ab5ebaa0f19c76031efa60b7799ac34adb732b55753741e4209c495fe56a9a"} err="failed to get container status \"b4ab5ebaa0f19c76031efa60b7799ac34adb732b55753741e4209c495fe56a9a\": rpc error: code = NotFound desc = could not find container \"b4ab5ebaa0f19c76031efa60b7799ac34adb732b55753741e4209c495fe56a9a\": container with ID starting with b4ab5ebaa0f19c76031efa60b7799ac34adb732b55753741e4209c495fe56a9a not found: ID does not exist" Mar 14 08:28:51 crc kubenswrapper[5129]: I0314 08:28:51.350560 5129 scope.go:117] "RemoveContainer" containerID="3451f28a6c6b09783af2838c2c76029a60d059ebd85735079e1d2c75613df226" Mar 14 08:28:51 crc kubenswrapper[5129]: E0314 08:28:51.350964 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3451f28a6c6b09783af2838c2c76029a60d059ebd85735079e1d2c75613df226\": container with ID starting with 3451f28a6c6b09783af2838c2c76029a60d059ebd85735079e1d2c75613df226 not found: ID does not exist" containerID="3451f28a6c6b09783af2838c2c76029a60d059ebd85735079e1d2c75613df226" Mar 14 08:28:51 crc kubenswrapper[5129]: I0314 08:28:51.350986 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3451f28a6c6b09783af2838c2c76029a60d059ebd85735079e1d2c75613df226"} err="failed to get container status \"3451f28a6c6b09783af2838c2c76029a60d059ebd85735079e1d2c75613df226\": rpc error: code = NotFound desc = could not find container \"3451f28a6c6b09783af2838c2c76029a60d059ebd85735079e1d2c75613df226\": container with ID starting with 3451f28a6c6b09783af2838c2c76029a60d059ebd85735079e1d2c75613df226 not found: ID does not exist" Mar 14 08:28:51 crc kubenswrapper[5129]: I0314 08:28:51.387535 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/233996d7-58ec-4eab-bfb3-88edfb2272b3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:28:51 crc kubenswrapper[5129]: I0314 08:28:51.604335 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qdqx5"] Mar 14 08:28:51 crc kubenswrapper[5129]: I0314 08:28:51.613649 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qdqx5"] Mar 14 08:28:52 crc kubenswrapper[5129]: I0314 08:28:52.047986 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="233996d7-58ec-4eab-bfb3-88edfb2272b3" path="/var/lib/kubelet/pods/233996d7-58ec-4eab-bfb3-88edfb2272b3/volumes" Mar 14 08:28:59 crc kubenswrapper[5129]: I0314 08:28:59.684156 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j6nfh" Mar 14 08:28:59 crc kubenswrapper[5129]: I0314 08:28:59.746554 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j6nfh" Mar 14 08:28:59 crc kubenswrapper[5129]: I0314 08:28:59.925293 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j6nfh"] Mar 14 08:29:01 crc kubenswrapper[5129]: I0314 08:29:01.361768 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j6nfh" podUID="4da040da-b094-47dc-9617-57daf1f2fc48" containerName="registry-server" containerID="cri-o://3e2d9cbae866310b0c2ba385600b5e36ac2c134d2def990dcfd2fa2bf13d4a4a" gracePeriod=2 Mar 14 08:29:01 crc kubenswrapper[5129]: I0314 08:29:01.914994 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j6nfh" Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.027260 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da040da-b094-47dc-9617-57daf1f2fc48-utilities\") pod \"4da040da-b094-47dc-9617-57daf1f2fc48\" (UID: \"4da040da-b094-47dc-9617-57daf1f2fc48\") " Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.027450 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf5kh\" (UniqueName: \"kubernetes.io/projected/4da040da-b094-47dc-9617-57daf1f2fc48-kube-api-access-vf5kh\") pod \"4da040da-b094-47dc-9617-57daf1f2fc48\" (UID: \"4da040da-b094-47dc-9617-57daf1f2fc48\") " Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.027503 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da040da-b094-47dc-9617-57daf1f2fc48-catalog-content\") pod \"4da040da-b094-47dc-9617-57daf1f2fc48\" (UID: \"4da040da-b094-47dc-9617-57daf1f2fc48\") " Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.028599 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4da040da-b094-47dc-9617-57daf1f2fc48-utilities" (OuterVolumeSpecName: "utilities") pod "4da040da-b094-47dc-9617-57daf1f2fc48" (UID: "4da040da-b094-47dc-9617-57daf1f2fc48"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.034681 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4da040da-b094-47dc-9617-57daf1f2fc48-kube-api-access-vf5kh" (OuterVolumeSpecName: "kube-api-access-vf5kh") pod "4da040da-b094-47dc-9617-57daf1f2fc48" (UID: "4da040da-b094-47dc-9617-57daf1f2fc48"). InnerVolumeSpecName "kube-api-access-vf5kh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.129971 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da040da-b094-47dc-9617-57daf1f2fc48-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.130012 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf5kh\" (UniqueName: \"kubernetes.io/projected/4da040da-b094-47dc-9617-57daf1f2fc48-kube-api-access-vf5kh\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.174545 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4da040da-b094-47dc-9617-57daf1f2fc48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4da040da-b094-47dc-9617-57daf1f2fc48" (UID: "4da040da-b094-47dc-9617-57daf1f2fc48"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.232305 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da040da-b094-47dc-9617-57daf1f2fc48-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.371134 5129 generic.go:334] "Generic (PLEG): container finished" podID="4da040da-b094-47dc-9617-57daf1f2fc48" containerID="3e2d9cbae866310b0c2ba385600b5e36ac2c134d2def990dcfd2fa2bf13d4a4a" exitCode=0 Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.371175 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6nfh" event={"ID":"4da040da-b094-47dc-9617-57daf1f2fc48","Type":"ContainerDied","Data":"3e2d9cbae866310b0c2ba385600b5e36ac2c134d2def990dcfd2fa2bf13d4a4a"} Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.371204 5129 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6nfh" Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.371240 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6nfh" event={"ID":"4da040da-b094-47dc-9617-57daf1f2fc48","Type":"ContainerDied","Data":"2f41257253c564a7b2f6ec27ff4e690fbbc9c63d72ea9f373fb13b0360f863be"} Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.371273 5129 scope.go:117] "RemoveContainer" containerID="3e2d9cbae866310b0c2ba385600b5e36ac2c134d2def990dcfd2fa2bf13d4a4a" Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.393007 5129 scope.go:117] "RemoveContainer" containerID="4a4a00ccf6ce4a2f37eb5de6dd397c08971f3367ac793644a3bee2293273c357" Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.411252 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j6nfh"] Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.416437 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j6nfh"] Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.418405 5129 scope.go:117] "RemoveContainer" containerID="cb15c64049f82971d2a03a5f6e7e289278e70149f3becfe10d902dbaa8abd93c" Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.442789 5129 scope.go:117] "RemoveContainer" containerID="3e2d9cbae866310b0c2ba385600b5e36ac2c134d2def990dcfd2fa2bf13d4a4a" Mar 14 08:29:02 crc kubenswrapper[5129]: E0314 08:29:02.443243 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e2d9cbae866310b0c2ba385600b5e36ac2c134d2def990dcfd2fa2bf13d4a4a\": container with ID starting with 3e2d9cbae866310b0c2ba385600b5e36ac2c134d2def990dcfd2fa2bf13d4a4a not found: ID does not exist" containerID="3e2d9cbae866310b0c2ba385600b5e36ac2c134d2def990dcfd2fa2bf13d4a4a" Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.443309 5129 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e2d9cbae866310b0c2ba385600b5e36ac2c134d2def990dcfd2fa2bf13d4a4a"} err="failed to get container status \"3e2d9cbae866310b0c2ba385600b5e36ac2c134d2def990dcfd2fa2bf13d4a4a\": rpc error: code = NotFound desc = could not find container \"3e2d9cbae866310b0c2ba385600b5e36ac2c134d2def990dcfd2fa2bf13d4a4a\": container with ID starting with 3e2d9cbae866310b0c2ba385600b5e36ac2c134d2def990dcfd2fa2bf13d4a4a not found: ID does not exist" Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.443350 5129 scope.go:117] "RemoveContainer" containerID="4a4a00ccf6ce4a2f37eb5de6dd397c08971f3367ac793644a3bee2293273c357" Mar 14 08:29:02 crc kubenswrapper[5129]: E0314 08:29:02.443892 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a4a00ccf6ce4a2f37eb5de6dd397c08971f3367ac793644a3bee2293273c357\": container with ID starting with 4a4a00ccf6ce4a2f37eb5de6dd397c08971f3367ac793644a3bee2293273c357 not found: ID does not exist" containerID="4a4a00ccf6ce4a2f37eb5de6dd397c08971f3367ac793644a3bee2293273c357" Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.443921 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a4a00ccf6ce4a2f37eb5de6dd397c08971f3367ac793644a3bee2293273c357"} err="failed to get container status \"4a4a00ccf6ce4a2f37eb5de6dd397c08971f3367ac793644a3bee2293273c357\": rpc error: code = NotFound desc = could not find container \"4a4a00ccf6ce4a2f37eb5de6dd397c08971f3367ac793644a3bee2293273c357\": container with ID starting with 4a4a00ccf6ce4a2f37eb5de6dd397c08971f3367ac793644a3bee2293273c357 not found: ID does not exist" Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.443943 5129 scope.go:117] "RemoveContainer" containerID="cb15c64049f82971d2a03a5f6e7e289278e70149f3becfe10d902dbaa8abd93c" Mar 14 08:29:02 crc kubenswrapper[5129]: E0314 
08:29:02.444436 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb15c64049f82971d2a03a5f6e7e289278e70149f3becfe10d902dbaa8abd93c\": container with ID starting with cb15c64049f82971d2a03a5f6e7e289278e70149f3becfe10d902dbaa8abd93c not found: ID does not exist" containerID="cb15c64049f82971d2a03a5f6e7e289278e70149f3becfe10d902dbaa8abd93c" Mar 14 08:29:02 crc kubenswrapper[5129]: I0314 08:29:02.444944 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb15c64049f82971d2a03a5f6e7e289278e70149f3becfe10d902dbaa8abd93c"} err="failed to get container status \"cb15c64049f82971d2a03a5f6e7e289278e70149f3becfe10d902dbaa8abd93c\": rpc error: code = NotFound desc = could not find container \"cb15c64049f82971d2a03a5f6e7e289278e70149f3becfe10d902dbaa8abd93c\": container with ID starting with cb15c64049f82971d2a03a5f6e7e289278e70149f3becfe10d902dbaa8abd93c not found: ID does not exist" Mar 14 08:29:04 crc kubenswrapper[5129]: I0314 08:29:04.044268 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4da040da-b094-47dc-9617-57daf1f2fc48" path="/var/lib/kubelet/pods/4da040da-b094-47dc-9617-57daf1f2fc48/volumes" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.148192 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557950-9f7mv"] Mar 14 08:30:00 crc kubenswrapper[5129]: E0314 08:30:00.149103 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4da040da-b094-47dc-9617-57daf1f2fc48" containerName="registry-server" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.149120 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da040da-b094-47dc-9617-57daf1f2fc48" containerName="registry-server" Mar 14 08:30:00 crc kubenswrapper[5129]: E0314 08:30:00.149139 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="233996d7-58ec-4eab-bfb3-88edfb2272b3" containerName="extract-content" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.149147 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="233996d7-58ec-4eab-bfb3-88edfb2272b3" containerName="extract-content" Mar 14 08:30:00 crc kubenswrapper[5129]: E0314 08:30:00.149163 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4da040da-b094-47dc-9617-57daf1f2fc48" containerName="extract-utilities" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.149171 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da040da-b094-47dc-9617-57daf1f2fc48" containerName="extract-utilities" Mar 14 08:30:00 crc kubenswrapper[5129]: E0314 08:30:00.149184 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233996d7-58ec-4eab-bfb3-88edfb2272b3" containerName="registry-server" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.149193 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="233996d7-58ec-4eab-bfb3-88edfb2272b3" containerName="registry-server" Mar 14 08:30:00 crc kubenswrapper[5129]: E0314 08:30:00.149205 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233996d7-58ec-4eab-bfb3-88edfb2272b3" containerName="extract-utilities" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.149212 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="233996d7-58ec-4eab-bfb3-88edfb2272b3" containerName="extract-utilities" Mar 14 08:30:00 crc kubenswrapper[5129]: E0314 08:30:00.149229 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4da040da-b094-47dc-9617-57daf1f2fc48" containerName="extract-content" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.149237 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da040da-b094-47dc-9617-57daf1f2fc48" containerName="extract-content" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.149388 5129 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4da040da-b094-47dc-9617-57daf1f2fc48" containerName="registry-server" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.149414 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="233996d7-58ec-4eab-bfb3-88edfb2272b3" containerName="registry-server" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.149979 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557950-9f7mv" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.156126 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.156132 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.156886 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.170168 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw"] Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.171556 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.175647 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.176435 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.177306 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557950-9f7mv"] Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.186034 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw"] Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.310192 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92d908e0-7ff9-4857-9702-3e913e9750e0-config-volume\") pod \"collect-profiles-29557950-8f2sw\" (UID: \"92d908e0-7ff9-4857-9702-3e913e9750e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.310231 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqqf6\" (UniqueName: \"kubernetes.io/projected/fcb8e06c-eafa-4452-8cd4-211d499a27e2-kube-api-access-lqqf6\") pod \"auto-csr-approver-29557950-9f7mv\" (UID: \"fcb8e06c-eafa-4452-8cd4-211d499a27e2\") " pod="openshift-infra/auto-csr-approver-29557950-9f7mv" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.310251 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrhnm\" (UniqueName: 
\"kubernetes.io/projected/92d908e0-7ff9-4857-9702-3e913e9750e0-kube-api-access-lrhnm\") pod \"collect-profiles-29557950-8f2sw\" (UID: \"92d908e0-7ff9-4857-9702-3e913e9750e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.310285 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92d908e0-7ff9-4857-9702-3e913e9750e0-secret-volume\") pod \"collect-profiles-29557950-8f2sw\" (UID: \"92d908e0-7ff9-4857-9702-3e913e9750e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.412288 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92d908e0-7ff9-4857-9702-3e913e9750e0-secret-volume\") pod \"collect-profiles-29557950-8f2sw\" (UID: \"92d908e0-7ff9-4857-9702-3e913e9750e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.412535 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92d908e0-7ff9-4857-9702-3e913e9750e0-config-volume\") pod \"collect-profiles-29557950-8f2sw\" (UID: \"92d908e0-7ff9-4857-9702-3e913e9750e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.412584 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqqf6\" (UniqueName: \"kubernetes.io/projected/fcb8e06c-eafa-4452-8cd4-211d499a27e2-kube-api-access-lqqf6\") pod \"auto-csr-approver-29557950-9f7mv\" (UID: \"fcb8e06c-eafa-4452-8cd4-211d499a27e2\") " pod="openshift-infra/auto-csr-approver-29557950-9f7mv" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 
08:30:00.412714 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrhnm\" (UniqueName: \"kubernetes.io/projected/92d908e0-7ff9-4857-9702-3e913e9750e0-kube-api-access-lrhnm\") pod \"collect-profiles-29557950-8f2sw\" (UID: \"92d908e0-7ff9-4857-9702-3e913e9750e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.414346 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92d908e0-7ff9-4857-9702-3e913e9750e0-config-volume\") pod \"collect-profiles-29557950-8f2sw\" (UID: \"92d908e0-7ff9-4857-9702-3e913e9750e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.433399 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92d908e0-7ff9-4857-9702-3e913e9750e0-secret-volume\") pod \"collect-profiles-29557950-8f2sw\" (UID: \"92d908e0-7ff9-4857-9702-3e913e9750e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.454423 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqqf6\" (UniqueName: \"kubernetes.io/projected/fcb8e06c-eafa-4452-8cd4-211d499a27e2-kube-api-access-lqqf6\") pod \"auto-csr-approver-29557950-9f7mv\" (UID: \"fcb8e06c-eafa-4452-8cd4-211d499a27e2\") " pod="openshift-infra/auto-csr-approver-29557950-9f7mv" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.459100 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrhnm\" (UniqueName: \"kubernetes.io/projected/92d908e0-7ff9-4857-9702-3e913e9750e0-kube-api-access-lrhnm\") pod \"collect-profiles-29557950-8f2sw\" (UID: \"92d908e0-7ff9-4857-9702-3e913e9750e0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.468670 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557950-9f7mv" Mar 14 08:30:00 crc kubenswrapper[5129]: I0314 08:30:00.495573 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw" Mar 14 08:30:01 crc kubenswrapper[5129]: I0314 08:30:01.025213 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw"] Mar 14 08:30:01 crc kubenswrapper[5129]: I0314 08:30:01.069312 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557950-9f7mv"] Mar 14 08:30:01 crc kubenswrapper[5129]: W0314 08:30:01.076031 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcb8e06c_eafa_4452_8cd4_211d499a27e2.slice/crio-16c188cae5a0da4b5f8e182a3c7da6b848cb8c4a65f90ad5ead8b76344553ec7 WatchSource:0}: Error finding container 16c188cae5a0da4b5f8e182a3c7da6b848cb8c4a65f90ad5ead8b76344553ec7: Status 404 returned error can't find the container with id 16c188cae5a0da4b5f8e182a3c7da6b848cb8c4a65f90ad5ead8b76344553ec7 Mar 14 08:30:01 crc kubenswrapper[5129]: I0314 08:30:01.954870 5129 generic.go:334] "Generic (PLEG): container finished" podID="92d908e0-7ff9-4857-9702-3e913e9750e0" containerID="a81c4c4b7be60df0c078b65c98bea1bd921c3091962c05a08a8532a20f292a02" exitCode=0 Mar 14 08:30:01 crc kubenswrapper[5129]: I0314 08:30:01.955762 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw" event={"ID":"92d908e0-7ff9-4857-9702-3e913e9750e0","Type":"ContainerDied","Data":"a81c4c4b7be60df0c078b65c98bea1bd921c3091962c05a08a8532a20f292a02"} Mar 14 
08:30:01 crc kubenswrapper[5129]: I0314 08:30:01.955819 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw" event={"ID":"92d908e0-7ff9-4857-9702-3e913e9750e0","Type":"ContainerStarted","Data":"e0d0dc5e4eb32c034e3f68dc7ff60b9c542fa03cd5a2ad4641c6df6c09560e94"} Mar 14 08:30:01 crc kubenswrapper[5129]: I0314 08:30:01.958653 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557950-9f7mv" event={"ID":"fcb8e06c-eafa-4452-8cd4-211d499a27e2","Type":"ContainerStarted","Data":"16c188cae5a0da4b5f8e182a3c7da6b848cb8c4a65f90ad5ead8b76344553ec7"} Mar 14 08:30:02 crc kubenswrapper[5129]: I0314 08:30:02.967248 5129 generic.go:334] "Generic (PLEG): container finished" podID="fcb8e06c-eafa-4452-8cd4-211d499a27e2" containerID="f9d4c19f36e46caa75856f7defc22e419ea64fff26fd8871104e27c309466539" exitCode=0 Mar 14 08:30:02 crc kubenswrapper[5129]: I0314 08:30:02.967304 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557950-9f7mv" event={"ID":"fcb8e06c-eafa-4452-8cd4-211d499a27e2","Type":"ContainerDied","Data":"f9d4c19f36e46caa75856f7defc22e419ea64fff26fd8871104e27c309466539"} Mar 14 08:30:03 crc kubenswrapper[5129]: I0314 08:30:03.238474 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw" Mar 14 08:30:03 crc kubenswrapper[5129]: I0314 08:30:03.396552 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrhnm\" (UniqueName: \"kubernetes.io/projected/92d908e0-7ff9-4857-9702-3e913e9750e0-kube-api-access-lrhnm\") pod \"92d908e0-7ff9-4857-9702-3e913e9750e0\" (UID: \"92d908e0-7ff9-4857-9702-3e913e9750e0\") " Mar 14 08:30:03 crc kubenswrapper[5129]: I0314 08:30:03.396700 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92d908e0-7ff9-4857-9702-3e913e9750e0-secret-volume\") pod \"92d908e0-7ff9-4857-9702-3e913e9750e0\" (UID: \"92d908e0-7ff9-4857-9702-3e913e9750e0\") " Mar 14 08:30:03 crc kubenswrapper[5129]: I0314 08:30:03.396760 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92d908e0-7ff9-4857-9702-3e913e9750e0-config-volume\") pod \"92d908e0-7ff9-4857-9702-3e913e9750e0\" (UID: \"92d908e0-7ff9-4857-9702-3e913e9750e0\") " Mar 14 08:30:03 crc kubenswrapper[5129]: I0314 08:30:03.397432 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92d908e0-7ff9-4857-9702-3e913e9750e0-config-volume" (OuterVolumeSpecName: "config-volume") pod "92d908e0-7ff9-4857-9702-3e913e9750e0" (UID: "92d908e0-7ff9-4857-9702-3e913e9750e0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:30:03 crc kubenswrapper[5129]: I0314 08:30:03.403995 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92d908e0-7ff9-4857-9702-3e913e9750e0-kube-api-access-lrhnm" (OuterVolumeSpecName: "kube-api-access-lrhnm") pod "92d908e0-7ff9-4857-9702-3e913e9750e0" (UID: "92d908e0-7ff9-4857-9702-3e913e9750e0"). 
InnerVolumeSpecName "kube-api-access-lrhnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:30:03 crc kubenswrapper[5129]: I0314 08:30:03.404191 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92d908e0-7ff9-4857-9702-3e913e9750e0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "92d908e0-7ff9-4857-9702-3e913e9750e0" (UID: "92d908e0-7ff9-4857-9702-3e913e9750e0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:30:03 crc kubenswrapper[5129]: I0314 08:30:03.498926 5129 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92d908e0-7ff9-4857-9702-3e913e9750e0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:30:03 crc kubenswrapper[5129]: I0314 08:30:03.498971 5129 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92d908e0-7ff9-4857-9702-3e913e9750e0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:30:03 crc kubenswrapper[5129]: I0314 08:30:03.498986 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrhnm\" (UniqueName: \"kubernetes.io/projected/92d908e0-7ff9-4857-9702-3e913e9750e0-kube-api-access-lrhnm\") on node \"crc\" DevicePath \"\"" Mar 14 08:30:03 crc kubenswrapper[5129]: I0314 08:30:03.975322 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw" Mar 14 08:30:03 crc kubenswrapper[5129]: I0314 08:30:03.975330 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw" event={"ID":"92d908e0-7ff9-4857-9702-3e913e9750e0","Type":"ContainerDied","Data":"e0d0dc5e4eb32c034e3f68dc7ff60b9c542fa03cd5a2ad4641c6df6c09560e94"} Mar 14 08:30:03 crc kubenswrapper[5129]: I0314 08:30:03.975382 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0d0dc5e4eb32c034e3f68dc7ff60b9c542fa03cd5a2ad4641c6df6c09560e94" Mar 14 08:30:04 crc kubenswrapper[5129]: I0314 08:30:04.215615 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557950-9f7mv" Mar 14 08:30:04 crc kubenswrapper[5129]: I0314 08:30:04.310743 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p"] Mar 14 08:30:04 crc kubenswrapper[5129]: I0314 08:30:04.317363 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557905-5b65p"] Mar 14 08:30:04 crc kubenswrapper[5129]: I0314 08:30:04.411879 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqqf6\" (UniqueName: \"kubernetes.io/projected/fcb8e06c-eafa-4452-8cd4-211d499a27e2-kube-api-access-lqqf6\") pod \"fcb8e06c-eafa-4452-8cd4-211d499a27e2\" (UID: \"fcb8e06c-eafa-4452-8cd4-211d499a27e2\") " Mar 14 08:30:04 crc kubenswrapper[5129]: I0314 08:30:04.416800 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcb8e06c-eafa-4452-8cd4-211d499a27e2-kube-api-access-lqqf6" (OuterVolumeSpecName: "kube-api-access-lqqf6") pod "fcb8e06c-eafa-4452-8cd4-211d499a27e2" (UID: "fcb8e06c-eafa-4452-8cd4-211d499a27e2"). 
InnerVolumeSpecName "kube-api-access-lqqf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:30:04 crc kubenswrapper[5129]: I0314 08:30:04.513832 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqqf6\" (UniqueName: \"kubernetes.io/projected/fcb8e06c-eafa-4452-8cd4-211d499a27e2-kube-api-access-lqqf6\") on node \"crc\" DevicePath \"\"" Mar 14 08:30:04 crc kubenswrapper[5129]: I0314 08:30:04.988979 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557950-9f7mv" event={"ID":"fcb8e06c-eafa-4452-8cd4-211d499a27e2","Type":"ContainerDied","Data":"16c188cae5a0da4b5f8e182a3c7da6b848cb8c4a65f90ad5ead8b76344553ec7"} Mar 14 08:30:04 crc kubenswrapper[5129]: I0314 08:30:04.989724 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16c188cae5a0da4b5f8e182a3c7da6b848cb8c4a65f90ad5ead8b76344553ec7" Mar 14 08:30:04 crc kubenswrapper[5129]: I0314 08:30:04.989112 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557950-9f7mv" Mar 14 08:30:05 crc kubenswrapper[5129]: I0314 08:30:05.274001 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557944-s7bv2"] Mar 14 08:30:05 crc kubenswrapper[5129]: I0314 08:30:05.278782 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557944-s7bv2"] Mar 14 08:30:06 crc kubenswrapper[5129]: I0314 08:30:06.052080 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd5d079a-0e66-4f29-afcd-58cfbff4d26a" path="/var/lib/kubelet/pods/bd5d079a-0e66-4f29-afcd-58cfbff4d26a/volumes" Mar 14 08:30:06 crc kubenswrapper[5129]: I0314 08:30:06.053209 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9187a75-8afa-425b-8159-2012e38ae798" path="/var/lib/kubelet/pods/e9187a75-8afa-425b-8159-2012e38ae798/volumes" Mar 14 08:30:25 crc kubenswrapper[5129]: I0314 08:30:25.371704 5129 scope.go:117] "RemoveContainer" containerID="ff6681691946acfec8d2b67200d13477b1b44942bea993e698af895f27efbcc5" Mar 14 08:30:25 crc kubenswrapper[5129]: I0314 08:30:25.409996 5129 scope.go:117] "RemoveContainer" containerID="5ebcff27e48fabdf868b3802e7af686dd0862cb00b842fc5e734d8acf84f8490" Mar 14 08:30:43 crc kubenswrapper[5129]: I0314 08:30:43.659638 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vk59j"] Mar 14 08:30:43 crc kubenswrapper[5129]: E0314 08:30:43.660694 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb8e06c-eafa-4452-8cd4-211d499a27e2" containerName="oc" Mar 14 08:30:43 crc kubenswrapper[5129]: I0314 08:30:43.660716 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb8e06c-eafa-4452-8cd4-211d499a27e2" containerName="oc" Mar 14 08:30:43 crc kubenswrapper[5129]: E0314 08:30:43.660739 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d908e0-7ff9-4857-9702-3e913e9750e0" 
containerName="collect-profiles" Mar 14 08:30:43 crc kubenswrapper[5129]: I0314 08:30:43.660755 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d908e0-7ff9-4857-9702-3e913e9750e0" containerName="collect-profiles" Mar 14 08:30:43 crc kubenswrapper[5129]: I0314 08:30:43.661030 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb8e06c-eafa-4452-8cd4-211d499a27e2" containerName="oc" Mar 14 08:30:43 crc kubenswrapper[5129]: I0314 08:30:43.661064 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="92d908e0-7ff9-4857-9702-3e913e9750e0" containerName="collect-profiles" Mar 14 08:30:43 crc kubenswrapper[5129]: I0314 08:30:43.663063 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vk59j" Mar 14 08:30:43 crc kubenswrapper[5129]: I0314 08:30:43.679472 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vk59j"] Mar 14 08:30:43 crc kubenswrapper[5129]: I0314 08:30:43.826589 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a-utilities\") pod \"redhat-marketplace-vk59j\" (UID: \"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a\") " pod="openshift-marketplace/redhat-marketplace-vk59j" Mar 14 08:30:43 crc kubenswrapper[5129]: I0314 08:30:43.826838 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-988fl\" (UniqueName: \"kubernetes.io/projected/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a-kube-api-access-988fl\") pod \"redhat-marketplace-vk59j\" (UID: \"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a\") " pod="openshift-marketplace/redhat-marketplace-vk59j" Mar 14 08:30:43 crc kubenswrapper[5129]: I0314 08:30:43.826934 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a-catalog-content\") pod \"redhat-marketplace-vk59j\" (UID: \"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a\") " pod="openshift-marketplace/redhat-marketplace-vk59j" Mar 14 08:30:43 crc kubenswrapper[5129]: I0314 08:30:43.928315 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a-utilities\") pod \"redhat-marketplace-vk59j\" (UID: \"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a\") " pod="openshift-marketplace/redhat-marketplace-vk59j" Mar 14 08:30:43 crc kubenswrapper[5129]: I0314 08:30:43.928380 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-988fl\" (UniqueName: \"kubernetes.io/projected/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a-kube-api-access-988fl\") pod \"redhat-marketplace-vk59j\" (UID: \"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a\") " pod="openshift-marketplace/redhat-marketplace-vk59j" Mar 14 08:30:43 crc kubenswrapper[5129]: I0314 08:30:43.928429 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a-catalog-content\") pod \"redhat-marketplace-vk59j\" (UID: \"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a\") " pod="openshift-marketplace/redhat-marketplace-vk59j" Mar 14 08:30:43 crc kubenswrapper[5129]: I0314 08:30:43.929326 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a-utilities\") pod \"redhat-marketplace-vk59j\" (UID: \"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a\") " pod="openshift-marketplace/redhat-marketplace-vk59j" Mar 14 08:30:43 crc kubenswrapper[5129]: I0314 08:30:43.929393 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a-catalog-content\") pod \"redhat-marketplace-vk59j\" (UID: \"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a\") " pod="openshift-marketplace/redhat-marketplace-vk59j" Mar 14 08:30:43 crc kubenswrapper[5129]: I0314 08:30:43.958791 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-988fl\" (UniqueName: \"kubernetes.io/projected/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a-kube-api-access-988fl\") pod \"redhat-marketplace-vk59j\" (UID: \"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a\") " pod="openshift-marketplace/redhat-marketplace-vk59j" Mar 14 08:30:44 crc kubenswrapper[5129]: I0314 08:30:44.001463 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vk59j" Mar 14 08:30:44 crc kubenswrapper[5129]: I0314 08:30:44.308437 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vk59j"] Mar 14 08:30:44 crc kubenswrapper[5129]: I0314 08:30:44.345082 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vk59j" event={"ID":"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a","Type":"ContainerStarted","Data":"df121a1c708ebf6e0c78f4ffc7200abed966043f2903c114fa192ab5b4ccc791"} Mar 14 08:30:45 crc kubenswrapper[5129]: I0314 08:30:45.360147 5129 generic.go:334] "Generic (PLEG): container finished" podID="2dab0cf9-8e7a-4674-ba04-45c34a2bb71a" containerID="a49ae0ea7d90967c3db05244b6d9c0571d022eb4a9d8eb54e58c687c17cd5e83" exitCode=0 Mar 14 08:30:45 crc kubenswrapper[5129]: I0314 08:30:45.360226 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vk59j" event={"ID":"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a","Type":"ContainerDied","Data":"a49ae0ea7d90967c3db05244b6d9c0571d022eb4a9d8eb54e58c687c17cd5e83"} Mar 14 08:30:46 crc kubenswrapper[5129]: I0314 08:30:46.370290 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-vk59j" event={"ID":"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a","Type":"ContainerStarted","Data":"a700800793dc77811c7b6551644782e68268bc83f9becefd59198487d3aa77b4"} Mar 14 08:30:47 crc kubenswrapper[5129]: I0314 08:30:47.382298 5129 generic.go:334] "Generic (PLEG): container finished" podID="2dab0cf9-8e7a-4674-ba04-45c34a2bb71a" containerID="a700800793dc77811c7b6551644782e68268bc83f9becefd59198487d3aa77b4" exitCode=0 Mar 14 08:30:47 crc kubenswrapper[5129]: I0314 08:30:47.382367 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vk59j" event={"ID":"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a","Type":"ContainerDied","Data":"a700800793dc77811c7b6551644782e68268bc83f9becefd59198487d3aa77b4"} Mar 14 08:30:48 crc kubenswrapper[5129]: I0314 08:30:48.390489 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vk59j" event={"ID":"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a","Type":"ContainerStarted","Data":"8c2b0de49b470411a2095f383951a70697cae1454b3c680408575b25dfa49521"} Mar 14 08:30:48 crc kubenswrapper[5129]: I0314 08:30:48.414497 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vk59j" podStartSLOduration=2.978261535 podStartE2EDuration="5.414478478s" podCreationTimestamp="2026-03-14 08:30:43 +0000 UTC" firstStartedPulling="2026-03-14 08:30:45.369341837 +0000 UTC m=+5508.121257061" lastFinishedPulling="2026-03-14 08:30:47.80555882 +0000 UTC m=+5510.557474004" observedRunningTime="2026-03-14 08:30:48.413186602 +0000 UTC m=+5511.165101796" watchObservedRunningTime="2026-03-14 08:30:48.414478478 +0000 UTC m=+5511.166393662" Mar 14 08:30:49 crc kubenswrapper[5129]: I0314 08:30:49.574912 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:30:49 crc kubenswrapper[5129]: I0314 08:30:49.575521 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:30:54 crc kubenswrapper[5129]: I0314 08:30:54.002204 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vk59j" Mar 14 08:30:54 crc kubenswrapper[5129]: I0314 08:30:54.002823 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vk59j" Mar 14 08:30:54 crc kubenswrapper[5129]: I0314 08:30:54.044619 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vk59j" Mar 14 08:30:54 crc kubenswrapper[5129]: I0314 08:30:54.530055 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vk59j" Mar 14 08:30:57 crc kubenswrapper[5129]: I0314 08:30:57.654196 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vk59j"] Mar 14 08:30:57 crc kubenswrapper[5129]: I0314 08:30:57.655479 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vk59j" podUID="2dab0cf9-8e7a-4674-ba04-45c34a2bb71a" containerName="registry-server" containerID="cri-o://8c2b0de49b470411a2095f383951a70697cae1454b3c680408575b25dfa49521" gracePeriod=2 Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.147255 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vk59j" Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.265800 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a-utilities\") pod \"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a\" (UID: \"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a\") " Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.265862 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-988fl\" (UniqueName: \"kubernetes.io/projected/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a-kube-api-access-988fl\") pod \"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a\" (UID: \"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a\") " Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.265893 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a-catalog-content\") pod \"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a\" (UID: \"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a\") " Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.267329 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a-utilities" (OuterVolumeSpecName: "utilities") pod "2dab0cf9-8e7a-4674-ba04-45c34a2bb71a" (UID: "2dab0cf9-8e7a-4674-ba04-45c34a2bb71a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.271543 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a-kube-api-access-988fl" (OuterVolumeSpecName: "kube-api-access-988fl") pod "2dab0cf9-8e7a-4674-ba04-45c34a2bb71a" (UID: "2dab0cf9-8e7a-4674-ba04-45c34a2bb71a"). InnerVolumeSpecName "kube-api-access-988fl". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.295912 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2dab0cf9-8e7a-4674-ba04-45c34a2bb71a" (UID: "2dab0cf9-8e7a-4674-ba04-45c34a2bb71a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.366774 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.366824 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-988fl\" (UniqueName: \"kubernetes.io/projected/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a-kube-api-access-988fl\") on node \"crc\" DevicePath \"\""
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.366838 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.508767 5129 generic.go:334] "Generic (PLEG): container finished" podID="2dab0cf9-8e7a-4674-ba04-45c34a2bb71a" containerID="8c2b0de49b470411a2095f383951a70697cae1454b3c680408575b25dfa49521" exitCode=0
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.508843 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vk59j" event={"ID":"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a","Type":"ContainerDied","Data":"8c2b0de49b470411a2095f383951a70697cae1454b3c680408575b25dfa49521"}
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.508872 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vk59j"
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.508899 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vk59j" event={"ID":"2dab0cf9-8e7a-4674-ba04-45c34a2bb71a","Type":"ContainerDied","Data":"df121a1c708ebf6e0c78f4ffc7200abed966043f2903c114fa192ab5b4ccc791"}
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.508926 5129 scope.go:117] "RemoveContainer" containerID="8c2b0de49b470411a2095f383951a70697cae1454b3c680408575b25dfa49521"
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.530809 5129 scope.go:117] "RemoveContainer" containerID="a700800793dc77811c7b6551644782e68268bc83f9becefd59198487d3aa77b4"
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.547567 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vk59j"]
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.553405 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vk59j"]
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.556999 5129 scope.go:117] "RemoveContainer" containerID="a49ae0ea7d90967c3db05244b6d9c0571d022eb4a9d8eb54e58c687c17cd5e83"
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.592267 5129 scope.go:117] "RemoveContainer" containerID="8c2b0de49b470411a2095f383951a70697cae1454b3c680408575b25dfa49521"
Mar 14 08:30:58 crc kubenswrapper[5129]: E0314 08:30:58.592755 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c2b0de49b470411a2095f383951a70697cae1454b3c680408575b25dfa49521\": container with ID starting with 8c2b0de49b470411a2095f383951a70697cae1454b3c680408575b25dfa49521 not found: ID does not exist" containerID="8c2b0de49b470411a2095f383951a70697cae1454b3c680408575b25dfa49521"
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.592786 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c2b0de49b470411a2095f383951a70697cae1454b3c680408575b25dfa49521"} err="failed to get container status \"8c2b0de49b470411a2095f383951a70697cae1454b3c680408575b25dfa49521\": rpc error: code = NotFound desc = could not find container \"8c2b0de49b470411a2095f383951a70697cae1454b3c680408575b25dfa49521\": container with ID starting with 8c2b0de49b470411a2095f383951a70697cae1454b3c680408575b25dfa49521 not found: ID does not exist"
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.592803 5129 scope.go:117] "RemoveContainer" containerID="a700800793dc77811c7b6551644782e68268bc83f9becefd59198487d3aa77b4"
Mar 14 08:30:58 crc kubenswrapper[5129]: E0314 08:30:58.593189 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a700800793dc77811c7b6551644782e68268bc83f9becefd59198487d3aa77b4\": container with ID starting with a700800793dc77811c7b6551644782e68268bc83f9becefd59198487d3aa77b4 not found: ID does not exist" containerID="a700800793dc77811c7b6551644782e68268bc83f9becefd59198487d3aa77b4"
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.593216 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a700800793dc77811c7b6551644782e68268bc83f9becefd59198487d3aa77b4"} err="failed to get container status \"a700800793dc77811c7b6551644782e68268bc83f9becefd59198487d3aa77b4\": rpc error: code = NotFound desc = could not find container \"a700800793dc77811c7b6551644782e68268bc83f9becefd59198487d3aa77b4\": container with ID starting with a700800793dc77811c7b6551644782e68268bc83f9becefd59198487d3aa77b4 not found: ID does not exist"
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.593233 5129 scope.go:117] "RemoveContainer" containerID="a49ae0ea7d90967c3db05244b6d9c0571d022eb4a9d8eb54e58c687c17cd5e83"
Mar 14 08:30:58 crc kubenswrapper[5129]: E0314 08:30:58.593474 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a49ae0ea7d90967c3db05244b6d9c0571d022eb4a9d8eb54e58c687c17cd5e83\": container with ID starting with a49ae0ea7d90967c3db05244b6d9c0571d022eb4a9d8eb54e58c687c17cd5e83 not found: ID does not exist" containerID="a49ae0ea7d90967c3db05244b6d9c0571d022eb4a9d8eb54e58c687c17cd5e83"
Mar 14 08:30:58 crc kubenswrapper[5129]: I0314 08:30:58.593492 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a49ae0ea7d90967c3db05244b6d9c0571d022eb4a9d8eb54e58c687c17cd5e83"} err="failed to get container status \"a49ae0ea7d90967c3db05244b6d9c0571d022eb4a9d8eb54e58c687c17cd5e83\": rpc error: code = NotFound desc = could not find container \"a49ae0ea7d90967c3db05244b6d9c0571d022eb4a9d8eb54e58c687c17cd5e83\": container with ID starting with a49ae0ea7d90967c3db05244b6d9c0571d022eb4a9d8eb54e58c687c17cd5e83 not found: ID does not exist"
Mar 14 08:31:00 crc kubenswrapper[5129]: I0314 08:31:00.045348 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dab0cf9-8e7a-4674-ba04-45c34a2bb71a" path="/var/lib/kubelet/pods/2dab0cf9-8e7a-4674-ba04-45c34a2bb71a/volumes"
Mar 14 08:31:19 crc kubenswrapper[5129]: I0314 08:31:19.575390 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 08:31:19 crc kubenswrapper[5129]: I0314 08:31:19.575972 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 08:31:49 crc kubenswrapper[5129]: I0314 08:31:49.574314 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 08:31:49 crc kubenswrapper[5129]: I0314 08:31:49.575376 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 08:31:49 crc kubenswrapper[5129]: I0314 08:31:49.575531 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh"
Mar 14 08:31:49 crc kubenswrapper[5129]: I0314 08:31:49.576511 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c83e0d17e7239a4e3d626b98c707f403777782f6beee2546f9008683acbabe1b"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 14 08:31:49 crc kubenswrapper[5129]: I0314 08:31:49.576642 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://c83e0d17e7239a4e3d626b98c707f403777782f6beee2546f9008683acbabe1b" gracePeriod=600
Mar 14 08:31:49 crc kubenswrapper[5129]: I0314 08:31:49.923730 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="c83e0d17e7239a4e3d626b98c707f403777782f6beee2546f9008683acbabe1b" exitCode=0
Mar 14 08:31:49 crc kubenswrapper[5129]: I0314 08:31:49.923796 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"c83e0d17e7239a4e3d626b98c707f403777782f6beee2546f9008683acbabe1b"}
Mar 14 08:31:49 crc kubenswrapper[5129]: I0314 08:31:49.924145 5129 scope.go:117] "RemoveContainer" containerID="cfc693ce5fb3741cdb0d2de9a49a14474262c94e4ac101480abd3d1107c838dd"
Mar 14 08:31:50 crc kubenswrapper[5129]: I0314 08:31:50.940230 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0"}
Mar 14 08:32:00 crc kubenswrapper[5129]: I0314 08:32:00.163196 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557952-78f2s"]
Mar 14 08:32:00 crc kubenswrapper[5129]: E0314 08:32:00.164375 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dab0cf9-8e7a-4674-ba04-45c34a2bb71a" containerName="extract-utilities"
Mar 14 08:32:00 crc kubenswrapper[5129]: I0314 08:32:00.164402 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dab0cf9-8e7a-4674-ba04-45c34a2bb71a" containerName="extract-utilities"
Mar 14 08:32:00 crc kubenswrapper[5129]: E0314 08:32:00.164429 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dab0cf9-8e7a-4674-ba04-45c34a2bb71a" containerName="registry-server"
Mar 14 08:32:00 crc kubenswrapper[5129]: I0314 08:32:00.164444 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dab0cf9-8e7a-4674-ba04-45c34a2bb71a" containerName="registry-server"
Mar 14 08:32:00 crc kubenswrapper[5129]: E0314 08:32:00.164467 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dab0cf9-8e7a-4674-ba04-45c34a2bb71a" containerName="extract-content"
Mar 14 08:32:00 crc kubenswrapper[5129]: I0314 08:32:00.164479 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dab0cf9-8e7a-4674-ba04-45c34a2bb71a" containerName="extract-content"
Mar 14 08:32:00 crc kubenswrapper[5129]: I0314 08:32:00.164826 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dab0cf9-8e7a-4674-ba04-45c34a2bb71a" containerName="registry-server"
Mar 14 08:32:00 crc kubenswrapper[5129]: I0314 08:32:00.165610 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557952-78f2s"
Mar 14 08:32:00 crc kubenswrapper[5129]: I0314 08:32:00.169837 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 08:32:00 crc kubenswrapper[5129]: I0314 08:32:00.169914 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557952-78f2s"]
Mar 14 08:32:00 crc kubenswrapper[5129]: I0314 08:32:00.170521 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 08:32:00 crc kubenswrapper[5129]: I0314 08:32:00.170564 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf"
Mar 14 08:32:00 crc kubenswrapper[5129]: I0314 08:32:00.265907 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rptr\" (UniqueName: \"kubernetes.io/projected/ffc02e4c-98f7-41a2-b0d4-83f074e5b614-kube-api-access-7rptr\") pod \"auto-csr-approver-29557952-78f2s\" (UID: \"ffc02e4c-98f7-41a2-b0d4-83f074e5b614\") " pod="openshift-infra/auto-csr-approver-29557952-78f2s"
Mar 14 08:32:00 crc kubenswrapper[5129]: I0314 08:32:00.368021 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rptr\" (UniqueName: \"kubernetes.io/projected/ffc02e4c-98f7-41a2-b0d4-83f074e5b614-kube-api-access-7rptr\") pod \"auto-csr-approver-29557952-78f2s\" (UID: \"ffc02e4c-98f7-41a2-b0d4-83f074e5b614\") " pod="openshift-infra/auto-csr-approver-29557952-78f2s"
Mar 14 08:32:00 crc kubenswrapper[5129]: I0314 08:32:00.391784 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rptr\" (UniqueName: \"kubernetes.io/projected/ffc02e4c-98f7-41a2-b0d4-83f074e5b614-kube-api-access-7rptr\") pod \"auto-csr-approver-29557952-78f2s\" (UID: \"ffc02e4c-98f7-41a2-b0d4-83f074e5b614\") " pod="openshift-infra/auto-csr-approver-29557952-78f2s"
Mar 14 08:32:00 crc kubenswrapper[5129]: I0314 08:32:00.502519 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557952-78f2s"
Mar 14 08:32:00 crc kubenswrapper[5129]: I0314 08:32:00.995396 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557952-78f2s"]
Mar 14 08:32:01 crc kubenswrapper[5129]: I0314 08:32:01.009721 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 14 08:32:01 crc kubenswrapper[5129]: I0314 08:32:01.050082 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557952-78f2s" event={"ID":"ffc02e4c-98f7-41a2-b0d4-83f074e5b614","Type":"ContainerStarted","Data":"531490a9573e97405015126176e239aa89613163b17bd845695719281bbdf1a7"}
Mar 14 08:32:03 crc kubenswrapper[5129]: I0314 08:32:03.067655 5129 generic.go:334] "Generic (PLEG): container finished" podID="ffc02e4c-98f7-41a2-b0d4-83f074e5b614" containerID="55ed641a8a9c8fe51b58fb59e25c3e0411486306d2ce4c959df7906ba082c1c4" exitCode=0
Mar 14 08:32:03 crc kubenswrapper[5129]: I0314 08:32:03.067758 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557952-78f2s" event={"ID":"ffc02e4c-98f7-41a2-b0d4-83f074e5b614","Type":"ContainerDied","Data":"55ed641a8a9c8fe51b58fb59e25c3e0411486306d2ce4c959df7906ba082c1c4"}
Mar 14 08:32:04 crc kubenswrapper[5129]: I0314 08:32:04.342714 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557952-78f2s"
Mar 14 08:32:04 crc kubenswrapper[5129]: I0314 08:32:04.455516 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rptr\" (UniqueName: \"kubernetes.io/projected/ffc02e4c-98f7-41a2-b0d4-83f074e5b614-kube-api-access-7rptr\") pod \"ffc02e4c-98f7-41a2-b0d4-83f074e5b614\" (UID: \"ffc02e4c-98f7-41a2-b0d4-83f074e5b614\") "
Mar 14 08:32:04 crc kubenswrapper[5129]: I0314 08:32:04.464134 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffc02e4c-98f7-41a2-b0d4-83f074e5b614-kube-api-access-7rptr" (OuterVolumeSpecName: "kube-api-access-7rptr") pod "ffc02e4c-98f7-41a2-b0d4-83f074e5b614" (UID: "ffc02e4c-98f7-41a2-b0d4-83f074e5b614"). InnerVolumeSpecName "kube-api-access-7rptr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:32:04 crc kubenswrapper[5129]: I0314 08:32:04.558198 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rptr\" (UniqueName: \"kubernetes.io/projected/ffc02e4c-98f7-41a2-b0d4-83f074e5b614-kube-api-access-7rptr\") on node \"crc\" DevicePath \"\""
Mar 14 08:32:05 crc kubenswrapper[5129]: I0314 08:32:05.086291 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557952-78f2s" event={"ID":"ffc02e4c-98f7-41a2-b0d4-83f074e5b614","Type":"ContainerDied","Data":"531490a9573e97405015126176e239aa89613163b17bd845695719281bbdf1a7"}
Mar 14 08:32:05 crc kubenswrapper[5129]: I0314 08:32:05.086329 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="531490a9573e97405015126176e239aa89613163b17bd845695719281bbdf1a7"
Mar 14 08:32:05 crc kubenswrapper[5129]: I0314 08:32:05.086332 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557952-78f2s"
Mar 14 08:32:05 crc kubenswrapper[5129]: I0314 08:32:05.425981 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557946-6lfbj"]
Mar 14 08:32:05 crc kubenswrapper[5129]: I0314 08:32:05.434486 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557946-6lfbj"]
Mar 14 08:32:06 crc kubenswrapper[5129]: I0314 08:32:06.045912 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d53e29f5-e63a-42bc-8327-a51fb1ccbebe" path="/var/lib/kubelet/pods/d53e29f5-e63a-42bc-8327-a51fb1ccbebe/volumes"
Mar 14 08:32:25 crc kubenswrapper[5129]: I0314 08:32:25.565881 5129 scope.go:117] "RemoveContainer" containerID="96641c701f38d5d8e49a8c415603601a23c2d7f23f6f9d4e976239d372a9334a"
Mar 14 08:33:49 crc kubenswrapper[5129]: I0314 08:33:49.574860 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 08:33:49 crc kubenswrapper[5129]: I0314 08:33:49.575691 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 08:34:00 crc kubenswrapper[5129]: I0314 08:34:00.161237 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557954-qzqld"]
Mar 14 08:34:00 crc kubenswrapper[5129]: E0314 08:34:00.162459 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc02e4c-98f7-41a2-b0d4-83f074e5b614" containerName="oc"
Mar 14 08:34:00 crc kubenswrapper[5129]: I0314 08:34:00.162484 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc02e4c-98f7-41a2-b0d4-83f074e5b614" containerName="oc"
Mar 14 08:34:00 crc kubenswrapper[5129]: I0314 08:34:00.162755 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffc02e4c-98f7-41a2-b0d4-83f074e5b614" containerName="oc"
Mar 14 08:34:00 crc kubenswrapper[5129]: I0314 08:34:00.163485 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557954-qzqld"
Mar 14 08:34:00 crc kubenswrapper[5129]: I0314 08:34:00.166367 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf"
Mar 14 08:34:00 crc kubenswrapper[5129]: I0314 08:34:00.167110 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 08:34:00 crc kubenswrapper[5129]: I0314 08:34:00.170324 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 08:34:00 crc kubenswrapper[5129]: I0314 08:34:00.179925 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557954-qzqld"]
Mar 14 08:34:00 crc kubenswrapper[5129]: I0314 08:34:00.212440 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw695\" (UniqueName: \"kubernetes.io/projected/57add038-a424-4de2-ba7d-6f480f85bb15-kube-api-access-bw695\") pod \"auto-csr-approver-29557954-qzqld\" (UID: \"57add038-a424-4de2-ba7d-6f480f85bb15\") " pod="openshift-infra/auto-csr-approver-29557954-qzqld"
Mar 14 08:34:00 crc kubenswrapper[5129]: I0314 08:34:00.314173 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw695\" (UniqueName: \"kubernetes.io/projected/57add038-a424-4de2-ba7d-6f480f85bb15-kube-api-access-bw695\") pod \"auto-csr-approver-29557954-qzqld\" (UID: \"57add038-a424-4de2-ba7d-6f480f85bb15\") " pod="openshift-infra/auto-csr-approver-29557954-qzqld"
Mar 14 08:34:00 crc kubenswrapper[5129]: I0314 08:34:00.355975 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw695\" (UniqueName: \"kubernetes.io/projected/57add038-a424-4de2-ba7d-6f480f85bb15-kube-api-access-bw695\") pod \"auto-csr-approver-29557954-qzqld\" (UID: \"57add038-a424-4de2-ba7d-6f480f85bb15\") " pod="openshift-infra/auto-csr-approver-29557954-qzqld"
Mar 14 08:34:00 crc kubenswrapper[5129]: I0314 08:34:00.494014 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557954-qzqld"
Mar 14 08:34:00 crc kubenswrapper[5129]: I0314 08:34:00.814592 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557954-qzqld"]
Mar 14 08:34:01 crc kubenswrapper[5129]: I0314 08:34:01.200573 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557954-qzqld" event={"ID":"57add038-a424-4de2-ba7d-6f480f85bb15","Type":"ContainerStarted","Data":"cc8aa8c7b2154932164b2e90c91f4f5e98924e857b84d7158b9bcbafc69ad899"}
Mar 14 08:34:02 crc kubenswrapper[5129]: I0314 08:34:02.210614 5129 generic.go:334] "Generic (PLEG): container finished" podID="57add038-a424-4de2-ba7d-6f480f85bb15" containerID="958955d3221ec788be2a773fa7635ba3103526d14730aa74ccdbf5e236f76389" exitCode=0
Mar 14 08:34:02 crc kubenswrapper[5129]: I0314 08:34:02.210675 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557954-qzqld" event={"ID":"57add038-a424-4de2-ba7d-6f480f85bb15","Type":"ContainerDied","Data":"958955d3221ec788be2a773fa7635ba3103526d14730aa74ccdbf5e236f76389"}
Mar 14 08:34:03 crc kubenswrapper[5129]: I0314 08:34:03.581993 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557954-qzqld"
Mar 14 08:34:03 crc kubenswrapper[5129]: I0314 08:34:03.767787 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw695\" (UniqueName: \"kubernetes.io/projected/57add038-a424-4de2-ba7d-6f480f85bb15-kube-api-access-bw695\") pod \"57add038-a424-4de2-ba7d-6f480f85bb15\" (UID: \"57add038-a424-4de2-ba7d-6f480f85bb15\") "
Mar 14 08:34:03 crc kubenswrapper[5129]: I0314 08:34:03.774898 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57add038-a424-4de2-ba7d-6f480f85bb15-kube-api-access-bw695" (OuterVolumeSpecName: "kube-api-access-bw695") pod "57add038-a424-4de2-ba7d-6f480f85bb15" (UID: "57add038-a424-4de2-ba7d-6f480f85bb15"). InnerVolumeSpecName "kube-api-access-bw695". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:34:03 crc kubenswrapper[5129]: I0314 08:34:03.870160 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw695\" (UniqueName: \"kubernetes.io/projected/57add038-a424-4de2-ba7d-6f480f85bb15-kube-api-access-bw695\") on node \"crc\" DevicePath \"\""
Mar 14 08:34:04 crc kubenswrapper[5129]: I0314 08:34:04.226318 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557954-qzqld" event={"ID":"57add038-a424-4de2-ba7d-6f480f85bb15","Type":"ContainerDied","Data":"cc8aa8c7b2154932164b2e90c91f4f5e98924e857b84d7158b9bcbafc69ad899"}
Mar 14 08:34:04 crc kubenswrapper[5129]: I0314 08:34:04.226370 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc8aa8c7b2154932164b2e90c91f4f5e98924e857b84d7158b9bcbafc69ad899"
Mar 14 08:34:04 crc kubenswrapper[5129]: I0314 08:34:04.226396 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557954-qzqld"
Mar 14 08:34:04 crc kubenswrapper[5129]: I0314 08:34:04.666371 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557948-kcb48"]
Mar 14 08:34:04 crc kubenswrapper[5129]: I0314 08:34:04.672259 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557948-kcb48"]
Mar 14 08:34:06 crc kubenswrapper[5129]: I0314 08:34:06.044588 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b0dbbb-30c1-4812-880b-5f1384846f82" path="/var/lib/kubelet/pods/96b0dbbb-30c1-4812-880b-5f1384846f82/volumes"
Mar 14 08:34:19 crc kubenswrapper[5129]: I0314 08:34:19.574694 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 08:34:19 crc kubenswrapper[5129]: I0314 08:34:19.575443 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 08:34:25 crc kubenswrapper[5129]: I0314 08:34:25.674494 5129 scope.go:117] "RemoveContainer" containerID="2c51730ec0bf9f36ddca04f79c357bb6a930ffe33c0a94e3f7dbd10dd399c28b"
Mar 14 08:34:49 crc kubenswrapper[5129]: I0314 08:34:49.575678 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 08:34:49 crc kubenswrapper[5129]: I0314 08:34:49.576634 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 08:34:49 crc kubenswrapper[5129]: I0314 08:34:49.576722 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh"
Mar 14 08:34:49 crc kubenswrapper[5129]: I0314 08:34:49.577631 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 14 08:34:49 crc kubenswrapper[5129]: I0314 08:34:49.577705 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0" gracePeriod=600
Mar 14 08:34:49 crc kubenswrapper[5129]: E0314 08:34:49.711665 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:34:50 crc kubenswrapper[5129]: I0314 08:34:50.636226 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0" exitCode=0
Mar 14 08:34:50 crc kubenswrapper[5129]: I0314 08:34:50.636384 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0"}
Mar 14 08:34:50 crc kubenswrapper[5129]: I0314 08:34:50.636836 5129 scope.go:117] "RemoveContainer" containerID="c83e0d17e7239a4e3d626b98c707f403777782f6beee2546f9008683acbabe1b"
Mar 14 08:34:50 crc kubenswrapper[5129]: I0314 08:34:50.637565 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0"
Mar 14 08:34:50 crc kubenswrapper[5129]: E0314 08:34:50.638051 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:35:02 crc kubenswrapper[5129]: I0314 08:35:02.036427 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0"
Mar 14 08:35:02 crc kubenswrapper[5129]: E0314 08:35:02.037246 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:35:14 crc kubenswrapper[5129]: I0314 08:35:14.036754 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0"
Mar 14 08:35:14 crc kubenswrapper[5129]: E0314 08:35:14.037943 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:35:29 crc kubenswrapper[5129]: I0314 08:35:29.037152 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0"
Mar 14 08:35:29 crc kubenswrapper[5129]: E0314 08:35:29.038492 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:35:40 crc kubenswrapper[5129]: I0314 08:35:40.036927 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0"
Mar 14 08:35:40 crc kubenswrapper[5129]: E0314 08:35:40.038102 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:35:52 crc kubenswrapper[5129]: I0314 08:35:52.036875 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0"
Mar 14 08:35:52 crc kubenswrapper[5129]: E0314 08:35:52.038070 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:36:00 crc kubenswrapper[5129]: I0314 08:36:00.163888 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557956-zbnbh"]
Mar 14 08:36:00 crc kubenswrapper[5129]: E0314 08:36:00.166340 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57add038-a424-4de2-ba7d-6f480f85bb15" containerName="oc"
Mar 14 08:36:00 crc kubenswrapper[5129]: I0314 08:36:00.166377 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="57add038-a424-4de2-ba7d-6f480f85bb15" containerName="oc"
Mar 14 08:36:00 crc kubenswrapper[5129]: I0314 08:36:00.166570 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="57add038-a424-4de2-ba7d-6f480f85bb15" containerName="oc"
Mar 14 08:36:00 crc kubenswrapper[5129]: I0314 08:36:00.167325 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557956-zbnbh"
Mar 14 08:36:00 crc kubenswrapper[5129]: I0314 08:36:00.170741 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf"
Mar 14 08:36:00 crc kubenswrapper[5129]: I0314 08:36:00.171197 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 08:36:00 crc kubenswrapper[5129]: I0314 08:36:00.171718 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 08:36:00 crc kubenswrapper[5129]: I0314 08:36:00.193135 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557956-zbnbh"]
Mar 14 08:36:00 crc kubenswrapper[5129]: I0314 08:36:00.267821 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6jtc\" (UniqueName: \"kubernetes.io/projected/9c2f7000-40ff-42ae-90cb-c9af757b3d4f-kube-api-access-s6jtc\") pod \"auto-csr-approver-29557956-zbnbh\" (UID: \"9c2f7000-40ff-42ae-90cb-c9af757b3d4f\") " pod="openshift-infra/auto-csr-approver-29557956-zbnbh"
Mar 14 08:36:00 crc kubenswrapper[5129]: I0314 08:36:00.370113 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6jtc\" (UniqueName: \"kubernetes.io/projected/9c2f7000-40ff-42ae-90cb-c9af757b3d4f-kube-api-access-s6jtc\") pod \"auto-csr-approver-29557956-zbnbh\" (UID: \"9c2f7000-40ff-42ae-90cb-c9af757b3d4f\") " pod="openshift-infra/auto-csr-approver-29557956-zbnbh"
Mar 14 08:36:00 crc kubenswrapper[5129]: I0314 08:36:00.396250 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6jtc\" (UniqueName: \"kubernetes.io/projected/9c2f7000-40ff-42ae-90cb-c9af757b3d4f-kube-api-access-s6jtc\") pod \"auto-csr-approver-29557956-zbnbh\" (UID: \"9c2f7000-40ff-42ae-90cb-c9af757b3d4f\") " pod="openshift-infra/auto-csr-approver-29557956-zbnbh"
Mar 14 08:36:00 crc kubenswrapper[5129]: I0314 08:36:00.502509 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557956-zbnbh"
Mar 14 08:36:01 crc kubenswrapper[5129]: I0314 08:36:01.028050 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557956-zbnbh"]
Mar 14 08:36:01 crc kubenswrapper[5129]: W0314 08:36:01.040503 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c2f7000_40ff_42ae_90cb_c9af757b3d4f.slice/crio-3d49ce322ce90a5be42d1bd0a2d3a5632e7bc30ff4a15f18120a1c62f6a274fc WatchSource:0}: Error finding container 3d49ce322ce90a5be42d1bd0a2d3a5632e7bc30ff4a15f18120a1c62f6a274fc: Status 404 returned error can't find the container with id 3d49ce322ce90a5be42d1bd0a2d3a5632e7bc30ff4a15f18120a1c62f6a274fc
Mar 14 08:36:01 crc kubenswrapper[5129]: I0314 08:36:01.260218 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557956-zbnbh" event={"ID":"9c2f7000-40ff-42ae-90cb-c9af757b3d4f","Type":"ContainerStarted","Data":"3d49ce322ce90a5be42d1bd0a2d3a5632e7bc30ff4a15f18120a1c62f6a274fc"}
Mar 14 08:36:03 crc kubenswrapper[5129]: I0314 08:36:03.281191 5129 generic.go:334] "Generic (PLEG): container finished" podID="9c2f7000-40ff-42ae-90cb-c9af757b3d4f" containerID="31213297cc39fc3d90f5620eec879eb5663851329922d530791c835db4761ef9" exitCode=0
Mar 14 08:36:03 crc kubenswrapper[5129]: I0314 08:36:03.281312 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557956-zbnbh" event={"ID":"9c2f7000-40ff-42ae-90cb-c9af757b3d4f","Type":"ContainerDied","Data":"31213297cc39fc3d90f5620eec879eb5663851329922d530791c835db4761ef9"}
Mar 14 08:36:04 crc kubenswrapper[5129]: I0314 08:36:04.037300 5129 scope.go:117] "RemoveContainer" 
containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0" Mar 14 08:36:04 crc kubenswrapper[5129]: E0314 08:36:04.037675 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:36:04 crc kubenswrapper[5129]: I0314 08:36:04.669619 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557956-zbnbh" Mar 14 08:36:04 crc kubenswrapper[5129]: I0314 08:36:04.754053 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6jtc\" (UniqueName: \"kubernetes.io/projected/9c2f7000-40ff-42ae-90cb-c9af757b3d4f-kube-api-access-s6jtc\") pod \"9c2f7000-40ff-42ae-90cb-c9af757b3d4f\" (UID: \"9c2f7000-40ff-42ae-90cb-c9af757b3d4f\") " Mar 14 08:36:04 crc kubenswrapper[5129]: I0314 08:36:04.765985 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c2f7000-40ff-42ae-90cb-c9af757b3d4f-kube-api-access-s6jtc" (OuterVolumeSpecName: "kube-api-access-s6jtc") pod "9c2f7000-40ff-42ae-90cb-c9af757b3d4f" (UID: "9c2f7000-40ff-42ae-90cb-c9af757b3d4f"). InnerVolumeSpecName "kube-api-access-s6jtc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:36:04 crc kubenswrapper[5129]: I0314 08:36:04.856263 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6jtc\" (UniqueName: \"kubernetes.io/projected/9c2f7000-40ff-42ae-90cb-c9af757b3d4f-kube-api-access-s6jtc\") on node \"crc\" DevicePath \"\"" Mar 14 08:36:05 crc kubenswrapper[5129]: I0314 08:36:05.307052 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557956-zbnbh" event={"ID":"9c2f7000-40ff-42ae-90cb-c9af757b3d4f","Type":"ContainerDied","Data":"3d49ce322ce90a5be42d1bd0a2d3a5632e7bc30ff4a15f18120a1c62f6a274fc"} Mar 14 08:36:05 crc kubenswrapper[5129]: I0314 08:36:05.307132 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d49ce322ce90a5be42d1bd0a2d3a5632e7bc30ff4a15f18120a1c62f6a274fc" Mar 14 08:36:05 crc kubenswrapper[5129]: I0314 08:36:05.307171 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557956-zbnbh" Mar 14 08:36:05 crc kubenswrapper[5129]: I0314 08:36:05.755050 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557950-9f7mv"] Mar 14 08:36:05 crc kubenswrapper[5129]: I0314 08:36:05.762089 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557950-9f7mv"] Mar 14 08:36:06 crc kubenswrapper[5129]: I0314 08:36:06.050470 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcb8e06c-eafa-4452-8cd4-211d499a27e2" path="/var/lib/kubelet/pods/fcb8e06c-eafa-4452-8cd4-211d499a27e2/volumes" Mar 14 08:36:15 crc kubenswrapper[5129]: I0314 08:36:15.036183 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0" Mar 14 08:36:15 crc kubenswrapper[5129]: E0314 08:36:15.037504 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:36:25 crc kubenswrapper[5129]: I0314 08:36:25.808245 5129 scope.go:117] "RemoveContainer" containerID="f9d4c19f36e46caa75856f7defc22e419ea64fff26fd8871104e27c309466539" Mar 14 08:36:28 crc kubenswrapper[5129]: I0314 08:36:28.040825 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0" Mar 14 08:36:28 crc kubenswrapper[5129]: E0314 08:36:28.041891 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:36:40 crc kubenswrapper[5129]: I0314 08:36:40.038173 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0" Mar 14 08:36:40 crc kubenswrapper[5129]: E0314 08:36:40.039281 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:36:53 crc kubenswrapper[5129]: I0314 08:36:53.036319 5129 scope.go:117] "RemoveContainer" 
containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0" Mar 14 08:36:53 crc kubenswrapper[5129]: E0314 08:36:53.037390 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:37:05 crc kubenswrapper[5129]: I0314 08:37:05.036992 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0" Mar 14 08:37:05 crc kubenswrapper[5129]: E0314 08:37:05.038380 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:37:18 crc kubenswrapper[5129]: I0314 08:37:18.041872 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0" Mar 14 08:37:18 crc kubenswrapper[5129]: E0314 08:37:18.043173 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:37:32 crc kubenswrapper[5129]: I0314 08:37:32.036663 5129 scope.go:117] 
"RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0" Mar 14 08:37:32 crc kubenswrapper[5129]: E0314 08:37:32.037772 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:37:46 crc kubenswrapper[5129]: I0314 08:37:46.037862 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0" Mar 14 08:37:46 crc kubenswrapper[5129]: E0314 08:37:46.039806 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:37:59 crc kubenswrapper[5129]: I0314 08:37:59.037572 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0" Mar 14 08:37:59 crc kubenswrapper[5129]: E0314 08:37:59.041367 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.179424 
5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557958-6nrrl"] Mar 14 08:38:00 crc kubenswrapper[5129]: E0314 08:38:00.183593 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c2f7000-40ff-42ae-90cb-c9af757b3d4f" containerName="oc" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.183932 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c2f7000-40ff-42ae-90cb-c9af757b3d4f" containerName="oc" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.185024 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c2f7000-40ff-42ae-90cb-c9af757b3d4f" containerName="oc" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.186804 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557958-6nrrl" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.190591 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.191681 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.192489 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.193511 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557958-6nrrl"] Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.299075 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l2st\" (UniqueName: \"kubernetes.io/projected/b28690fd-bf49-4e65-b34f-f45051a34f2c-kube-api-access-5l2st\") pod \"auto-csr-approver-29557958-6nrrl\" (UID: \"b28690fd-bf49-4e65-b34f-f45051a34f2c\") " 
pod="openshift-infra/auto-csr-approver-29557958-6nrrl" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.395128 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-99bwb"] Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.397487 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-99bwb" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.401248 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l2st\" (UniqueName: \"kubernetes.io/projected/b28690fd-bf49-4e65-b34f-f45051a34f2c-kube-api-access-5l2st\") pod \"auto-csr-approver-29557958-6nrrl\" (UID: \"b28690fd-bf49-4e65-b34f-f45051a34f2c\") " pod="openshift-infra/auto-csr-approver-29557958-6nrrl" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.409354 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-99bwb"] Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.438466 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l2st\" (UniqueName: \"kubernetes.io/projected/b28690fd-bf49-4e65-b34f-f45051a34f2c-kube-api-access-5l2st\") pod \"auto-csr-approver-29557958-6nrrl\" (UID: \"b28690fd-bf49-4e65-b34f-f45051a34f2c\") " pod="openshift-infra/auto-csr-approver-29557958-6nrrl" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.503274 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13-utilities\") pod \"community-operators-99bwb\" (UID: \"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13\") " pod="openshift-marketplace/community-operators-99bwb" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.503334 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13-catalog-content\") pod \"community-operators-99bwb\" (UID: \"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13\") " pod="openshift-marketplace/community-operators-99bwb" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.503358 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd2nr\" (UniqueName: \"kubernetes.io/projected/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13-kube-api-access-bd2nr\") pod \"community-operators-99bwb\" (UID: \"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13\") " pod="openshift-marketplace/community-operators-99bwb" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.513588 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557958-6nrrl" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.605329 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13-utilities\") pod \"community-operators-99bwb\" (UID: \"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13\") " pod="openshift-marketplace/community-operators-99bwb" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.605741 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13-catalog-content\") pod \"community-operators-99bwb\" (UID: \"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13\") " pod="openshift-marketplace/community-operators-99bwb" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.605770 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd2nr\" (UniqueName: \"kubernetes.io/projected/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13-kube-api-access-bd2nr\") pod \"community-operators-99bwb\" (UID: 
\"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13\") " pod="openshift-marketplace/community-operators-99bwb" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.607181 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13-utilities\") pod \"community-operators-99bwb\" (UID: \"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13\") " pod="openshift-marketplace/community-operators-99bwb" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.607324 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13-catalog-content\") pod \"community-operators-99bwb\" (UID: \"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13\") " pod="openshift-marketplace/community-operators-99bwb" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.628579 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd2nr\" (UniqueName: \"kubernetes.io/projected/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13-kube-api-access-bd2nr\") pod \"community-operators-99bwb\" (UID: \"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13\") " pod="openshift-marketplace/community-operators-99bwb" Mar 14 08:38:00 crc kubenswrapper[5129]: I0314 08:38:00.723576 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-99bwb" Mar 14 08:38:01 crc kubenswrapper[5129]: I0314 08:38:01.016016 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-99bwb"] Mar 14 08:38:01 crc kubenswrapper[5129]: I0314 08:38:01.112360 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557958-6nrrl"] Mar 14 08:38:01 crc kubenswrapper[5129]: I0314 08:38:01.113440 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:38:01 crc kubenswrapper[5129]: I0314 08:38:01.548012 5129 generic.go:334] "Generic (PLEG): container finished" podID="4baf2cb7-14ff-4583-a1a3-c5f5214a5b13" containerID="f665f696cd7adf360b1fa6b5c486feed7dcf195535bd1a038d092a40caabc52d" exitCode=0 Mar 14 08:38:01 crc kubenswrapper[5129]: I0314 08:38:01.548152 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99bwb" event={"ID":"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13","Type":"ContainerDied","Data":"f665f696cd7adf360b1fa6b5c486feed7dcf195535bd1a038d092a40caabc52d"} Mar 14 08:38:01 crc kubenswrapper[5129]: I0314 08:38:01.548203 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99bwb" event={"ID":"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13","Type":"ContainerStarted","Data":"663c250c09b978a2435e4554ae3e8abcc030d0a2ae76c02ae614ddc5f5b12144"} Mar 14 08:38:01 crc kubenswrapper[5129]: I0314 08:38:01.550994 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557958-6nrrl" event={"ID":"b28690fd-bf49-4e65-b34f-f45051a34f2c","Type":"ContainerStarted","Data":"48542f48fa2374aeabe9ab09eaeff55349dc72c4a92260779a0e7890054c6c07"} Mar 14 08:38:02 crc kubenswrapper[5129]: I0314 08:38:02.564237 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557958-6nrrl" 
event={"ID":"b28690fd-bf49-4e65-b34f-f45051a34f2c","Type":"ContainerStarted","Data":"dafa89101ae2bd589c432dd0531910c43f02e300b1ea7b27dcca88f636dec1f0"} Mar 14 08:38:02 crc kubenswrapper[5129]: I0314 08:38:02.574517 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99bwb" event={"ID":"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13","Type":"ContainerStarted","Data":"28b72526bdbbedc052670ffdb1e04c6d13559ab3610916d3398385914aabe9a9"} Mar 14 08:38:02 crc kubenswrapper[5129]: I0314 08:38:02.583515 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557958-6nrrl" podStartSLOduration=1.6669739 podStartE2EDuration="2.58348745s" podCreationTimestamp="2026-03-14 08:38:00 +0000 UTC" firstStartedPulling="2026-03-14 08:38:01.113119589 +0000 UTC m=+5943.865034773" lastFinishedPulling="2026-03-14 08:38:02.029633119 +0000 UTC m=+5944.781548323" observedRunningTime="2026-03-14 08:38:02.582069672 +0000 UTC m=+5945.333984856" watchObservedRunningTime="2026-03-14 08:38:02.58348745 +0000 UTC m=+5945.335402684" Mar 14 08:38:03 crc kubenswrapper[5129]: I0314 08:38:03.583960 5129 generic.go:334] "Generic (PLEG): container finished" podID="4baf2cb7-14ff-4583-a1a3-c5f5214a5b13" containerID="28b72526bdbbedc052670ffdb1e04c6d13559ab3610916d3398385914aabe9a9" exitCode=0 Mar 14 08:38:03 crc kubenswrapper[5129]: I0314 08:38:03.584121 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99bwb" event={"ID":"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13","Type":"ContainerDied","Data":"28b72526bdbbedc052670ffdb1e04c6d13559ab3610916d3398385914aabe9a9"} Mar 14 08:38:03 crc kubenswrapper[5129]: I0314 08:38:03.585634 5129 generic.go:334] "Generic (PLEG): container finished" podID="b28690fd-bf49-4e65-b34f-f45051a34f2c" containerID="dafa89101ae2bd589c432dd0531910c43f02e300b1ea7b27dcca88f636dec1f0" exitCode=0 Mar 14 08:38:03 crc kubenswrapper[5129]: I0314 
08:38:03.585679 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557958-6nrrl" event={"ID":"b28690fd-bf49-4e65-b34f-f45051a34f2c","Type":"ContainerDied","Data":"dafa89101ae2bd589c432dd0531910c43f02e300b1ea7b27dcca88f636dec1f0"} Mar 14 08:38:04 crc kubenswrapper[5129]: I0314 08:38:04.596312 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99bwb" event={"ID":"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13","Type":"ContainerStarted","Data":"600ce143bdbba5f2b2082a9eb685b27234dae0cbba613b61fbcda541575dccad"} Mar 14 08:38:04 crc kubenswrapper[5129]: I0314 08:38:04.625957 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-99bwb" podStartSLOduration=2.08048434 podStartE2EDuration="4.625937311s" podCreationTimestamp="2026-03-14 08:38:00 +0000 UTC" firstStartedPulling="2026-03-14 08:38:01.551411651 +0000 UTC m=+5944.303326875" lastFinishedPulling="2026-03-14 08:38:04.096864652 +0000 UTC m=+5946.848779846" observedRunningTime="2026-03-14 08:38:04.621253283 +0000 UTC m=+5947.373168477" watchObservedRunningTime="2026-03-14 08:38:04.625937311 +0000 UTC m=+5947.377852495" Mar 14 08:38:04 crc kubenswrapper[5129]: I0314 08:38:04.901819 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557958-6nrrl" Mar 14 08:38:05 crc kubenswrapper[5129]: I0314 08:38:05.087810 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l2st\" (UniqueName: \"kubernetes.io/projected/b28690fd-bf49-4e65-b34f-f45051a34f2c-kube-api-access-5l2st\") pod \"b28690fd-bf49-4e65-b34f-f45051a34f2c\" (UID: \"b28690fd-bf49-4e65-b34f-f45051a34f2c\") " Mar 14 08:38:05 crc kubenswrapper[5129]: I0314 08:38:05.098875 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b28690fd-bf49-4e65-b34f-f45051a34f2c-kube-api-access-5l2st" (OuterVolumeSpecName: "kube-api-access-5l2st") pod "b28690fd-bf49-4e65-b34f-f45051a34f2c" (UID: "b28690fd-bf49-4e65-b34f-f45051a34f2c"). InnerVolumeSpecName "kube-api-access-5l2st". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:38:05 crc kubenswrapper[5129]: I0314 08:38:05.190173 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l2st\" (UniqueName: \"kubernetes.io/projected/b28690fd-bf49-4e65-b34f-f45051a34f2c-kube-api-access-5l2st\") on node \"crc\" DevicePath \"\"" Mar 14 08:38:05 crc kubenswrapper[5129]: I0314 08:38:05.606916 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557958-6nrrl" event={"ID":"b28690fd-bf49-4e65-b34f-f45051a34f2c","Type":"ContainerDied","Data":"48542f48fa2374aeabe9ab09eaeff55349dc72c4a92260779a0e7890054c6c07"} Mar 14 08:38:05 crc kubenswrapper[5129]: I0314 08:38:05.607006 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48542f48fa2374aeabe9ab09eaeff55349dc72c4a92260779a0e7890054c6c07" Mar 14 08:38:05 crc kubenswrapper[5129]: I0314 08:38:05.606940 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557958-6nrrl" Mar 14 08:38:05 crc kubenswrapper[5129]: I0314 08:38:05.701848 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557952-78f2s"] Mar 14 08:38:05 crc kubenswrapper[5129]: I0314 08:38:05.718487 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557952-78f2s"] Mar 14 08:38:06 crc kubenswrapper[5129]: I0314 08:38:06.048127 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffc02e4c-98f7-41a2-b0d4-83f074e5b614" path="/var/lib/kubelet/pods/ffc02e4c-98f7-41a2-b0d4-83f074e5b614/volumes" Mar 14 08:38:10 crc kubenswrapper[5129]: I0314 08:38:10.724245 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-99bwb" Mar 14 08:38:10 crc kubenswrapper[5129]: I0314 08:38:10.726044 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-99bwb" Mar 14 08:38:10 crc kubenswrapper[5129]: I0314 08:38:10.812705 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-99bwb" Mar 14 08:38:11 crc kubenswrapper[5129]: I0314 08:38:11.744628 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-99bwb" Mar 14 08:38:11 crc kubenswrapper[5129]: I0314 08:38:11.816564 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-99bwb"] Mar 14 08:38:13 crc kubenswrapper[5129]: I0314 08:38:13.037521 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0" Mar 14 08:38:13 crc kubenswrapper[5129]: E0314 08:38:13.038284 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:38:13 crc kubenswrapper[5129]: I0314 08:38:13.686813 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-99bwb" podUID="4baf2cb7-14ff-4583-a1a3-c5f5214a5b13" containerName="registry-server" containerID="cri-o://600ce143bdbba5f2b2082a9eb685b27234dae0cbba613b61fbcda541575dccad" gracePeriod=2
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.164717 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-99bwb"
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.183560 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd2nr\" (UniqueName: \"kubernetes.io/projected/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13-kube-api-access-bd2nr\") pod \"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13\" (UID: \"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13\") "
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.183822 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13-utilities\") pod \"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13\" (UID: \"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13\") "
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.184011 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13-catalog-content\") pod \"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13\" (UID: \"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13\") "
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.185679 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13-utilities" (OuterVolumeSpecName: "utilities") pod "4baf2cb7-14ff-4583-a1a3-c5f5214a5b13" (UID: "4baf2cb7-14ff-4583-a1a3-c5f5214a5b13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.195897 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13-kube-api-access-bd2nr" (OuterVolumeSpecName: "kube-api-access-bd2nr") pod "4baf2cb7-14ff-4583-a1a3-c5f5214a5b13" (UID: "4baf2cb7-14ff-4583-a1a3-c5f5214a5b13"). InnerVolumeSpecName "kube-api-access-bd2nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.260856 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4baf2cb7-14ff-4583-a1a3-c5f5214a5b13" (UID: "4baf2cb7-14ff-4583-a1a3-c5f5214a5b13"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.286082 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.286125 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd2nr\" (UniqueName: \"kubernetes.io/projected/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13-kube-api-access-bd2nr\") on node \"crc\" DevicePath \"\""
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.286142 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.702436 5129 generic.go:334] "Generic (PLEG): container finished" podID="4baf2cb7-14ff-4583-a1a3-c5f5214a5b13" containerID="600ce143bdbba5f2b2082a9eb685b27234dae0cbba613b61fbcda541575dccad" exitCode=0
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.702522 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99bwb" event={"ID":"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13","Type":"ContainerDied","Data":"600ce143bdbba5f2b2082a9eb685b27234dae0cbba613b61fbcda541575dccad"}
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.702544 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-99bwb"
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.702581 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99bwb" event={"ID":"4baf2cb7-14ff-4583-a1a3-c5f5214a5b13","Type":"ContainerDied","Data":"663c250c09b978a2435e4554ae3e8abcc030d0a2ae76c02ae614ddc5f5b12144"}
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.702651 5129 scope.go:117] "RemoveContainer" containerID="600ce143bdbba5f2b2082a9eb685b27234dae0cbba613b61fbcda541575dccad"
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.750359 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-99bwb"]
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.763499 5129 scope.go:117] "RemoveContainer" containerID="28b72526bdbbedc052670ffdb1e04c6d13559ab3610916d3398385914aabe9a9"
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.794381 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-99bwb"]
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.809025 5129 scope.go:117] "RemoveContainer" containerID="f665f696cd7adf360b1fa6b5c486feed7dcf195535bd1a038d092a40caabc52d"
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.835526 5129 scope.go:117] "RemoveContainer" containerID="600ce143bdbba5f2b2082a9eb685b27234dae0cbba613b61fbcda541575dccad"
Mar 14 08:38:14 crc kubenswrapper[5129]: E0314 08:38:14.836464 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"600ce143bdbba5f2b2082a9eb685b27234dae0cbba613b61fbcda541575dccad\": container with ID starting with 600ce143bdbba5f2b2082a9eb685b27234dae0cbba613b61fbcda541575dccad not found: ID does not exist" containerID="600ce143bdbba5f2b2082a9eb685b27234dae0cbba613b61fbcda541575dccad"
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.836560 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"600ce143bdbba5f2b2082a9eb685b27234dae0cbba613b61fbcda541575dccad"} err="failed to get container status \"600ce143bdbba5f2b2082a9eb685b27234dae0cbba613b61fbcda541575dccad\": rpc error: code = NotFound desc = could not find container \"600ce143bdbba5f2b2082a9eb685b27234dae0cbba613b61fbcda541575dccad\": container with ID starting with 600ce143bdbba5f2b2082a9eb685b27234dae0cbba613b61fbcda541575dccad not found: ID does not exist"
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.836585 5129 scope.go:117] "RemoveContainer" containerID="28b72526bdbbedc052670ffdb1e04c6d13559ab3610916d3398385914aabe9a9"
Mar 14 08:38:14 crc kubenswrapper[5129]: E0314 08:38:14.837071 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28b72526bdbbedc052670ffdb1e04c6d13559ab3610916d3398385914aabe9a9\": container with ID starting with 28b72526bdbbedc052670ffdb1e04c6d13559ab3610916d3398385914aabe9a9 not found: ID does not exist" containerID="28b72526bdbbedc052670ffdb1e04c6d13559ab3610916d3398385914aabe9a9"
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.837105 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28b72526bdbbedc052670ffdb1e04c6d13559ab3610916d3398385914aabe9a9"} err="failed to get container status \"28b72526bdbbedc052670ffdb1e04c6d13559ab3610916d3398385914aabe9a9\": rpc error: code = NotFound desc = could not find container \"28b72526bdbbedc052670ffdb1e04c6d13559ab3610916d3398385914aabe9a9\": container with ID starting with 28b72526bdbbedc052670ffdb1e04c6d13559ab3610916d3398385914aabe9a9 not found: ID does not exist"
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.837124 5129 scope.go:117] "RemoveContainer" containerID="f665f696cd7adf360b1fa6b5c486feed7dcf195535bd1a038d092a40caabc52d"
Mar 14 08:38:14 crc kubenswrapper[5129]: E0314 08:38:14.837490 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f665f696cd7adf360b1fa6b5c486feed7dcf195535bd1a038d092a40caabc52d\": container with ID starting with f665f696cd7adf360b1fa6b5c486feed7dcf195535bd1a038d092a40caabc52d not found: ID does not exist" containerID="f665f696cd7adf360b1fa6b5c486feed7dcf195535bd1a038d092a40caabc52d"
Mar 14 08:38:14 crc kubenswrapper[5129]: I0314 08:38:14.837520 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f665f696cd7adf360b1fa6b5c486feed7dcf195535bd1a038d092a40caabc52d"} err="failed to get container status \"f665f696cd7adf360b1fa6b5c486feed7dcf195535bd1a038d092a40caabc52d\": rpc error: code = NotFound desc = could not find container \"f665f696cd7adf360b1fa6b5c486feed7dcf195535bd1a038d092a40caabc52d\": container with ID starting with f665f696cd7adf360b1fa6b5c486feed7dcf195535bd1a038d092a40caabc52d not found: ID does not exist"
Mar 14 08:38:16 crc kubenswrapper[5129]: I0314 08:38:16.046507 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4baf2cb7-14ff-4583-a1a3-c5f5214a5b13" path="/var/lib/kubelet/pods/4baf2cb7-14ff-4583-a1a3-c5f5214a5b13/volumes"
Mar 14 08:38:25 crc kubenswrapper[5129]: I0314 08:38:25.943725 5129 scope.go:117] "RemoveContainer" containerID="55ed641a8a9c8fe51b58fb59e25c3e0411486306d2ce4c959df7906ba082c1c4"
Mar 14 08:38:27 crc kubenswrapper[5129]: I0314 08:38:27.037481 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0"
Mar 14 08:38:27 crc kubenswrapper[5129]: E0314 08:38:27.038859 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:38:42 crc kubenswrapper[5129]: I0314 08:38:42.037285 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0"
Mar 14 08:38:42 crc kubenswrapper[5129]: E0314 08:38:42.039739 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:38:54 crc kubenswrapper[5129]: I0314 08:38:54.036426 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0"
Mar 14 08:38:54 crc kubenswrapper[5129]: E0314 08:38:54.037532 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.036284 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0"
Mar 14 08:39:05 crc kubenswrapper[5129]: E0314 08:39:05.037392 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.296155 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bt8xr"]
Mar 14 08:39:05 crc kubenswrapper[5129]: E0314 08:39:05.296486 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4baf2cb7-14ff-4583-a1a3-c5f5214a5b13" containerName="extract-content"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.296499 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4baf2cb7-14ff-4583-a1a3-c5f5214a5b13" containerName="extract-content"
Mar 14 08:39:05 crc kubenswrapper[5129]: E0314 08:39:05.296516 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4baf2cb7-14ff-4583-a1a3-c5f5214a5b13" containerName="extract-utilities"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.296523 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4baf2cb7-14ff-4583-a1a3-c5f5214a5b13" containerName="extract-utilities"
Mar 14 08:39:05 crc kubenswrapper[5129]: E0314 08:39:05.296541 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4baf2cb7-14ff-4583-a1a3-c5f5214a5b13" containerName="registry-server"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.296548 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4baf2cb7-14ff-4583-a1a3-c5f5214a5b13" containerName="registry-server"
Mar 14 08:39:05 crc kubenswrapper[5129]: E0314 08:39:05.296561 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28690fd-bf49-4e65-b34f-f45051a34f2c" containerName="oc"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.296567 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28690fd-bf49-4e65-b34f-f45051a34f2c" containerName="oc"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.296755 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b28690fd-bf49-4e65-b34f-f45051a34f2c" containerName="oc"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.296769 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="4baf2cb7-14ff-4583-a1a3-c5f5214a5b13" containerName="registry-server"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.297825 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bt8xr"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.331007 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bt8xr"]
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.387839 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdq8r\" (UniqueName: \"kubernetes.io/projected/23449468-b1fb-4705-92a8-7029d50d33db-kube-api-access-wdq8r\") pod \"redhat-operators-bt8xr\" (UID: \"23449468-b1fb-4705-92a8-7029d50d33db\") " pod="openshift-marketplace/redhat-operators-bt8xr"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.387891 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23449468-b1fb-4705-92a8-7029d50d33db-catalog-content\") pod \"redhat-operators-bt8xr\" (UID: \"23449468-b1fb-4705-92a8-7029d50d33db\") " pod="openshift-marketplace/redhat-operators-bt8xr"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.388029 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23449468-b1fb-4705-92a8-7029d50d33db-utilities\") pod \"redhat-operators-bt8xr\" (UID: \"23449468-b1fb-4705-92a8-7029d50d33db\") " pod="openshift-marketplace/redhat-operators-bt8xr"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.489725 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23449468-b1fb-4705-92a8-7029d50d33db-utilities\") pod \"redhat-operators-bt8xr\" (UID: \"23449468-b1fb-4705-92a8-7029d50d33db\") " pod="openshift-marketplace/redhat-operators-bt8xr"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.489842 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdq8r\" (UniqueName: \"kubernetes.io/projected/23449468-b1fb-4705-92a8-7029d50d33db-kube-api-access-wdq8r\") pod \"redhat-operators-bt8xr\" (UID: \"23449468-b1fb-4705-92a8-7029d50d33db\") " pod="openshift-marketplace/redhat-operators-bt8xr"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.489875 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23449468-b1fb-4705-92a8-7029d50d33db-catalog-content\") pod \"redhat-operators-bt8xr\" (UID: \"23449468-b1fb-4705-92a8-7029d50d33db\") " pod="openshift-marketplace/redhat-operators-bt8xr"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.490578 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23449468-b1fb-4705-92a8-7029d50d33db-utilities\") pod \"redhat-operators-bt8xr\" (UID: \"23449468-b1fb-4705-92a8-7029d50d33db\") " pod="openshift-marketplace/redhat-operators-bt8xr"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.490798 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23449468-b1fb-4705-92a8-7029d50d33db-catalog-content\") pod \"redhat-operators-bt8xr\" (UID: \"23449468-b1fb-4705-92a8-7029d50d33db\") " pod="openshift-marketplace/redhat-operators-bt8xr"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.516079 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdq8r\" (UniqueName: \"kubernetes.io/projected/23449468-b1fb-4705-92a8-7029d50d33db-kube-api-access-wdq8r\") pod \"redhat-operators-bt8xr\" (UID: \"23449468-b1fb-4705-92a8-7029d50d33db\") " pod="openshift-marketplace/redhat-operators-bt8xr"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.632409 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bt8xr"
Mar 14 08:39:05 crc kubenswrapper[5129]: I0314 08:39:05.939857 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bt8xr"]
Mar 14 08:39:06 crc kubenswrapper[5129]: I0314 08:39:06.268211 5129 generic.go:334] "Generic (PLEG): container finished" podID="23449468-b1fb-4705-92a8-7029d50d33db" containerID="b299758165b0b1835ef2bea01185ccaa810517d313422b9e444618dabe895d82" exitCode=0
Mar 14 08:39:06 crc kubenswrapper[5129]: I0314 08:39:06.268332 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bt8xr" event={"ID":"23449468-b1fb-4705-92a8-7029d50d33db","Type":"ContainerDied","Data":"b299758165b0b1835ef2bea01185ccaa810517d313422b9e444618dabe895d82"}
Mar 14 08:39:06 crc kubenswrapper[5129]: I0314 08:39:06.268638 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bt8xr" event={"ID":"23449468-b1fb-4705-92a8-7029d50d33db","Type":"ContainerStarted","Data":"876cf2dd4ae4c52ee50b84918c4d329c95a1e738908239a0215caf7ca1e9c232"}
Mar 14 08:39:07 crc kubenswrapper[5129]: I0314 08:39:07.279896 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bt8xr" event={"ID":"23449468-b1fb-4705-92a8-7029d50d33db","Type":"ContainerStarted","Data":"dd23770dd7e8bf2f9511fa7f222634ffa137afceebbbe723edab005c61d170fd"}
Mar 14 08:39:08 crc kubenswrapper[5129]: I0314 08:39:08.292913 5129 generic.go:334] "Generic (PLEG): container finished" podID="23449468-b1fb-4705-92a8-7029d50d33db" containerID="dd23770dd7e8bf2f9511fa7f222634ffa137afceebbbe723edab005c61d170fd" exitCode=0
Mar 14 08:39:08 crc kubenswrapper[5129]: I0314 08:39:08.292982 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bt8xr" event={"ID":"23449468-b1fb-4705-92a8-7029d50d33db","Type":"ContainerDied","Data":"dd23770dd7e8bf2f9511fa7f222634ffa137afceebbbe723edab005c61d170fd"}
Mar 14 08:39:09 crc kubenswrapper[5129]: I0314 08:39:09.305271 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bt8xr" event={"ID":"23449468-b1fb-4705-92a8-7029d50d33db","Type":"ContainerStarted","Data":"b579c16b45f2ca80e2b1ef5159bed3131bc41d4696b637a2524204f41ec95656"}
Mar 14 08:39:09 crc kubenswrapper[5129]: I0314 08:39:09.344021 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bt8xr" podStartSLOduration=1.86502978 podStartE2EDuration="4.343994235s" podCreationTimestamp="2026-03-14 08:39:05 +0000 UTC" firstStartedPulling="2026-03-14 08:39:06.269891507 +0000 UTC m=+6009.021806681" lastFinishedPulling="2026-03-14 08:39:08.748855952 +0000 UTC m=+6011.500771136" observedRunningTime="2026-03-14 08:39:09.340187131 +0000 UTC m=+6012.092102325" watchObservedRunningTime="2026-03-14 08:39:09.343994235 +0000 UTC m=+6012.095909419"
Mar 14 08:39:15 crc kubenswrapper[5129]: I0314 08:39:15.633071 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bt8xr"
Mar 14 08:39:15 crc kubenswrapper[5129]: I0314 08:39:15.634394 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bt8xr"
Mar 14 08:39:16 crc kubenswrapper[5129]: I0314 08:39:16.710710 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bt8xr" podUID="23449468-b1fb-4705-92a8-7029d50d33db" containerName="registry-server" probeResult="failure" output=<
Mar 14 08:39:16 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s
Mar 14 08:39:16 crc kubenswrapper[5129]: >
Mar 14 08:39:17 crc kubenswrapper[5129]: I0314 08:39:17.640365 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-wvhn6"]
Mar 14 08:39:17 crc kubenswrapper[5129]: I0314 08:39:17.653954 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-wvhn6"]
Mar 14 08:39:17 crc kubenswrapper[5129]: I0314 08:39:17.801375 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-7jbch"]
Mar 14 08:39:17 crc kubenswrapper[5129]: I0314 08:39:17.802995 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7jbch"
Mar 14 08:39:17 crc kubenswrapper[5129]: I0314 08:39:17.806189 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Mar 14 08:39:17 crc kubenswrapper[5129]: I0314 08:39:17.806305 5129 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-dl4r7"
Mar 14 08:39:17 crc kubenswrapper[5129]: I0314 08:39:17.806203 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Mar 14 08:39:17 crc kubenswrapper[5129]: I0314 08:39:17.806587 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Mar 14 08:39:17 crc kubenswrapper[5129]: I0314 08:39:17.826522 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-7jbch"]
Mar 14 08:39:17 crc kubenswrapper[5129]: I0314 08:39:17.905042 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/88c004ab-1cb2-4811-86ef-967541b73885-node-mnt\") pod \"crc-storage-crc-7jbch\" (UID: \"88c004ab-1cb2-4811-86ef-967541b73885\") " pod="crc-storage/crc-storage-crc-7jbch"
Mar 14 08:39:17 crc kubenswrapper[5129]: I0314 08:39:17.905118 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/88c004ab-1cb2-4811-86ef-967541b73885-crc-storage\") pod \"crc-storage-crc-7jbch\" (UID: \"88c004ab-1cb2-4811-86ef-967541b73885\") " pod="crc-storage/crc-storage-crc-7jbch"
Mar 14 08:39:17 crc kubenswrapper[5129]: I0314 08:39:17.905172 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mtkg\" (UniqueName: \"kubernetes.io/projected/88c004ab-1cb2-4811-86ef-967541b73885-kube-api-access-2mtkg\") pod \"crc-storage-crc-7jbch\" (UID: \"88c004ab-1cb2-4811-86ef-967541b73885\") " pod="crc-storage/crc-storage-crc-7jbch"
Mar 14 08:39:18 crc kubenswrapper[5129]: I0314 08:39:18.007757 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/88c004ab-1cb2-4811-86ef-967541b73885-node-mnt\") pod \"crc-storage-crc-7jbch\" (UID: \"88c004ab-1cb2-4811-86ef-967541b73885\") " pod="crc-storage/crc-storage-crc-7jbch"
Mar 14 08:39:18 crc kubenswrapper[5129]: I0314 08:39:18.007839 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/88c004ab-1cb2-4811-86ef-967541b73885-crc-storage\") pod \"crc-storage-crc-7jbch\" (UID: \"88c004ab-1cb2-4811-86ef-967541b73885\") " pod="crc-storage/crc-storage-crc-7jbch"
Mar 14 08:39:18 crc kubenswrapper[5129]: I0314 08:39:18.007888 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mtkg\" (UniqueName: \"kubernetes.io/projected/88c004ab-1cb2-4811-86ef-967541b73885-kube-api-access-2mtkg\") pod \"crc-storage-crc-7jbch\" (UID: \"88c004ab-1cb2-4811-86ef-967541b73885\") " pod="crc-storage/crc-storage-crc-7jbch"
Mar 14 08:39:18 crc kubenswrapper[5129]: I0314 08:39:18.008248 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/88c004ab-1cb2-4811-86ef-967541b73885-node-mnt\") pod \"crc-storage-crc-7jbch\" (UID: \"88c004ab-1cb2-4811-86ef-967541b73885\") " pod="crc-storage/crc-storage-crc-7jbch"
Mar 14 08:39:18 crc kubenswrapper[5129]: I0314 08:39:18.008972 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/88c004ab-1cb2-4811-86ef-967541b73885-crc-storage\") pod \"crc-storage-crc-7jbch\" (UID: \"88c004ab-1cb2-4811-86ef-967541b73885\") " pod="crc-storage/crc-storage-crc-7jbch"
Mar 14 08:39:18 crc kubenswrapper[5129]: I0314 08:39:18.034598 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mtkg\" (UniqueName: \"kubernetes.io/projected/88c004ab-1cb2-4811-86ef-967541b73885-kube-api-access-2mtkg\") pod \"crc-storage-crc-7jbch\" (UID: \"88c004ab-1cb2-4811-86ef-967541b73885\") " pod="crc-storage/crc-storage-crc-7jbch"
Mar 14 08:39:18 crc kubenswrapper[5129]: I0314 08:39:18.049401 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26c62665-6770-4c8e-a229-aa9f971a7db1" path="/var/lib/kubelet/pods/26c62665-6770-4c8e-a229-aa9f971a7db1/volumes"
Mar 14 08:39:18 crc kubenswrapper[5129]: I0314 08:39:18.128336 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7jbch"
Mar 14 08:39:18 crc kubenswrapper[5129]: I0314 08:39:18.449766 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-7jbch"]
Mar 14 08:39:19 crc kubenswrapper[5129]: I0314 08:39:19.415075 5129 generic.go:334] "Generic (PLEG): container finished" podID="88c004ab-1cb2-4811-86ef-967541b73885" containerID="648ac160b0e5ef41c7d315fd4639e4deaf049211c1b21000a6726f06f5ede149" exitCode=0
Mar 14 08:39:19 crc kubenswrapper[5129]: I0314 08:39:19.415313 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7jbch" event={"ID":"88c004ab-1cb2-4811-86ef-967541b73885","Type":"ContainerDied","Data":"648ac160b0e5ef41c7d315fd4639e4deaf049211c1b21000a6726f06f5ede149"}
Mar 14 08:39:19 crc kubenswrapper[5129]: I0314 08:39:19.415698 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7jbch" event={"ID":"88c004ab-1cb2-4811-86ef-967541b73885","Type":"ContainerStarted","Data":"29f45175a2600c460f895db89ca3cc4b09f637e5101e42d0f91929d2f6377968"}
Mar 14 08:39:20 crc kubenswrapper[5129]: I0314 08:39:20.037839 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0"
Mar 14 08:39:20 crc kubenswrapper[5129]: E0314 08:39:20.038299 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 08:39:21 crc kubenswrapper[5129]: I0314 08:39:21.220067 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7jbch"
Mar 14 08:39:21 crc kubenswrapper[5129]: I0314 08:39:21.261863 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/88c004ab-1cb2-4811-86ef-967541b73885-node-mnt\") pod \"88c004ab-1cb2-4811-86ef-967541b73885\" (UID: \"88c004ab-1cb2-4811-86ef-967541b73885\") "
Mar 14 08:39:21 crc kubenswrapper[5129]: I0314 08:39:21.262006 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c004ab-1cb2-4811-86ef-967541b73885-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "88c004ab-1cb2-4811-86ef-967541b73885" (UID: "88c004ab-1cb2-4811-86ef-967541b73885"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 14 08:39:21 crc kubenswrapper[5129]: I0314 08:39:21.262046 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mtkg\" (UniqueName: \"kubernetes.io/projected/88c004ab-1cb2-4811-86ef-967541b73885-kube-api-access-2mtkg\") pod \"88c004ab-1cb2-4811-86ef-967541b73885\" (UID: \"88c004ab-1cb2-4811-86ef-967541b73885\") "
Mar 14 08:39:21 crc kubenswrapper[5129]: I0314 08:39:21.262144 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/88c004ab-1cb2-4811-86ef-967541b73885-crc-storage\") pod \"88c004ab-1cb2-4811-86ef-967541b73885\" (UID: \"88c004ab-1cb2-4811-86ef-967541b73885\") "
Mar 14 08:39:21 crc kubenswrapper[5129]: I0314 08:39:21.262330 5129 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/88c004ab-1cb2-4811-86ef-967541b73885-node-mnt\") on node \"crc\" DevicePath \"\""
Mar 14 08:39:21 crc kubenswrapper[5129]: I0314 08:39:21.270251 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88c004ab-1cb2-4811-86ef-967541b73885-kube-api-access-2mtkg" (OuterVolumeSpecName: "kube-api-access-2mtkg") pod "88c004ab-1cb2-4811-86ef-967541b73885" (UID: "88c004ab-1cb2-4811-86ef-967541b73885"). InnerVolumeSpecName "kube-api-access-2mtkg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:39:21 crc kubenswrapper[5129]: I0314 08:39:21.295995 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88c004ab-1cb2-4811-86ef-967541b73885-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "88c004ab-1cb2-4811-86ef-967541b73885" (UID: "88c004ab-1cb2-4811-86ef-967541b73885"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 08:39:21 crc kubenswrapper[5129]: I0314 08:39:21.365107 5129 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/88c004ab-1cb2-4811-86ef-967541b73885-crc-storage\") on node \"crc\" DevicePath \"\""
Mar 14 08:39:21 crc kubenswrapper[5129]: I0314 08:39:21.365155 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mtkg\" (UniqueName: \"kubernetes.io/projected/88c004ab-1cb2-4811-86ef-967541b73885-kube-api-access-2mtkg\") on node \"crc\" DevicePath \"\""
Mar 14 08:39:21 crc kubenswrapper[5129]: I0314 08:39:21.437993 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7jbch" event={"ID":"88c004ab-1cb2-4811-86ef-967541b73885","Type":"ContainerDied","Data":"29f45175a2600c460f895db89ca3cc4b09f637e5101e42d0f91929d2f6377968"}
Mar 14 08:39:21 crc kubenswrapper[5129]: I0314 08:39:21.438125 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29f45175a2600c460f895db89ca3cc4b09f637e5101e42d0f91929d2f6377968"
Mar 14 08:39:21 crc kubenswrapper[5129]: I0314 08:39:21.438046 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7jbch"
Mar 14 08:39:23 crc kubenswrapper[5129]: I0314 08:39:23.767566 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-7jbch"]
Mar 14 08:39:23 crc kubenswrapper[5129]: I0314 08:39:23.779512 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-7jbch"]
Mar 14 08:39:23 crc kubenswrapper[5129]: I0314 08:39:23.947694 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-44r8c"]
Mar 14 08:39:23 crc kubenswrapper[5129]: E0314 08:39:23.948674 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c004ab-1cb2-4811-86ef-967541b73885" containerName="storage"
Mar 14 08:39:23 crc kubenswrapper[5129]: I0314 08:39:23.948829 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c004ab-1cb2-4811-86ef-967541b73885" containerName="storage"
Mar 14 08:39:23 crc kubenswrapper[5129]: I0314 08:39:23.949292 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c004ab-1cb2-4811-86ef-967541b73885" containerName="storage"
Mar 14 08:39:23 crc kubenswrapper[5129]: I0314 08:39:23.950289 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-44r8c"
Mar 14 08:39:23 crc kubenswrapper[5129]: I0314 08:39:23.953108 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Mar 14 08:39:23 crc kubenswrapper[5129]: I0314 08:39:23.953306 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Mar 14 08:39:23 crc kubenswrapper[5129]: I0314 08:39:23.953560 5129 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-dl4r7"
Mar 14 08:39:23 crc kubenswrapper[5129]: I0314 08:39:23.953595 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Mar 14 08:39:23 crc kubenswrapper[5129]: I0314 08:39:23.957520 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-44r8c"]
Mar 14 08:39:24 crc kubenswrapper[5129]: I0314 08:39:24.015507 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d494d423-2c6f-439b-8224-127d0fd1a9fa-node-mnt\") pod \"crc-storage-crc-44r8c\" (UID: \"d494d423-2c6f-439b-8224-127d0fd1a9fa\") " pod="crc-storage/crc-storage-crc-44r8c"
Mar 14 08:39:24 crc kubenswrapper[5129]: I0314 08:39:24.015749 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d494d423-2c6f-439b-8224-127d0fd1a9fa-crc-storage\") pod \"crc-storage-crc-44r8c\" (UID: \"d494d423-2c6f-439b-8224-127d0fd1a9fa\") " pod="crc-storage/crc-storage-crc-44r8c"
Mar 14 08:39:24 crc kubenswrapper[5129]: I0314 08:39:24.015823 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfdl2\" (UniqueName: \"kubernetes.io/projected/d494d423-2c6f-439b-8224-127d0fd1a9fa-kube-api-access-gfdl2\") pod \"crc-storage-crc-44r8c\" (UID: \"d494d423-2c6f-439b-8224-127d0fd1a9fa\") " pod="crc-storage/crc-storage-crc-44r8c"
Mar 14 08:39:24 crc kubenswrapper[5129]: I0314 08:39:24.046956 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88c004ab-1cb2-4811-86ef-967541b73885" path="/var/lib/kubelet/pods/88c004ab-1cb2-4811-86ef-967541b73885/volumes"
Mar 14 08:39:24 crc kubenswrapper[5129]: I0314 08:39:24.116496 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d494d423-2c6f-439b-8224-127d0fd1a9fa-crc-storage\") pod \"crc-storage-crc-44r8c\" (UID: \"d494d423-2c6f-439b-8224-127d0fd1a9fa\") " pod="crc-storage/crc-storage-crc-44r8c"
Mar 14 08:39:24 crc kubenswrapper[5129]: I0314 08:39:24.116596 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfdl2\" (UniqueName: \"kubernetes.io/projected/d494d423-2c6f-439b-8224-127d0fd1a9fa-kube-api-access-gfdl2\") pod \"crc-storage-crc-44r8c\" (UID: \"d494d423-2c6f-439b-8224-127d0fd1a9fa\") " pod="crc-storage/crc-storage-crc-44r8c"
Mar 14 08:39:24 crc kubenswrapper[5129]: I0314 08:39:24.116750 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d494d423-2c6f-439b-8224-127d0fd1a9fa-node-mnt\") pod \"crc-storage-crc-44r8c\" (UID: \"d494d423-2c6f-439b-8224-127d0fd1a9fa\") " pod="crc-storage/crc-storage-crc-44r8c"
Mar 14 08:39:24 crc kubenswrapper[5129]: I0314 08:39:24.117852 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d494d423-2c6f-439b-8224-127d0fd1a9fa-node-mnt\") pod \"crc-storage-crc-44r8c\" (UID: \"d494d423-2c6f-439b-8224-127d0fd1a9fa\") " pod="crc-storage/crc-storage-crc-44r8c"
Mar 14 08:39:24 crc kubenswrapper[5129]: I0314 08:39:24.118913 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d494d423-2c6f-439b-8224-127d0fd1a9fa-crc-storage\") pod \"crc-storage-crc-44r8c\" (UID: \"d494d423-2c6f-439b-8224-127d0fd1a9fa\") " pod="crc-storage/crc-storage-crc-44r8c"
Mar 14 08:39:24 crc kubenswrapper[5129]: I0314 08:39:24.142792 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfdl2\" (UniqueName: \"kubernetes.io/projected/d494d423-2c6f-439b-8224-127d0fd1a9fa-kube-api-access-gfdl2\") pod \"crc-storage-crc-44r8c\" (UID: \"d494d423-2c6f-439b-8224-127d0fd1a9fa\") " pod="crc-storage/crc-storage-crc-44r8c"
Mar 14 08:39:24 crc kubenswrapper[5129]: I0314 08:39:24.274386 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-44r8c"
Mar 14 08:39:24 crc kubenswrapper[5129]: I0314 08:39:24.779308 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-44r8c"]
Mar 14 08:39:25 crc kubenswrapper[5129]: I0314 08:39:25.481655 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-44r8c" event={"ID":"d494d423-2c6f-439b-8224-127d0fd1a9fa","Type":"ContainerStarted","Data":"98b3feffe9702b410fb005151f685c6104b3efd9f076af2aa9b079d1d820112a"}
Mar 14 08:39:25 crc kubenswrapper[5129]: I0314 08:39:25.712386 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bt8xr"
Mar 14 08:39:25 crc kubenswrapper[5129]: I0314 08:39:25.783943 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bt8xr"
Mar 14 08:39:25 crc kubenswrapper[5129]: I0314 08:39:25.969578 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bt8xr"]
Mar 14 08:39:26 crc kubenswrapper[5129]: I0314 08:39:26.092256 5129 scope.go:117] "RemoveContainer" containerID="d1c2e8eb6ec0d2e29ae70e4296da04982a751b1adfd4cb0459bb26e93499a540"
Mar 14 08:39:26 crc kubenswrapper[5129]: 
I0314 08:39:26.499069 5129 generic.go:334] "Generic (PLEG): container finished" podID="d494d423-2c6f-439b-8224-127d0fd1a9fa" containerID="accd70c7e5a7d835a1173052a05b9108c7ded18ec1d557d9a85f66596a277e46" exitCode=0 Mar 14 08:39:26 crc kubenswrapper[5129]: I0314 08:39:26.499463 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-44r8c" event={"ID":"d494d423-2c6f-439b-8224-127d0fd1a9fa","Type":"ContainerDied","Data":"accd70c7e5a7d835a1173052a05b9108c7ded18ec1d557d9a85f66596a277e46"} Mar 14 08:39:27 crc kubenswrapper[5129]: I0314 08:39:27.514763 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bt8xr" podUID="23449468-b1fb-4705-92a8-7029d50d33db" containerName="registry-server" containerID="cri-o://b579c16b45f2ca80e2b1ef5159bed3131bc41d4696b637a2524204f41ec95656" gracePeriod=2 Mar 14 08:39:27 crc kubenswrapper[5129]: I0314 08:39:27.862558 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-44r8c" Mar 14 08:39:27 crc kubenswrapper[5129]: I0314 08:39:27.892324 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d494d423-2c6f-439b-8224-127d0fd1a9fa-crc-storage\") pod \"d494d423-2c6f-439b-8224-127d0fd1a9fa\" (UID: \"d494d423-2c6f-439b-8224-127d0fd1a9fa\") " Mar 14 08:39:27 crc kubenswrapper[5129]: I0314 08:39:27.892388 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfdl2\" (UniqueName: \"kubernetes.io/projected/d494d423-2c6f-439b-8224-127d0fd1a9fa-kube-api-access-gfdl2\") pod \"d494d423-2c6f-439b-8224-127d0fd1a9fa\" (UID: \"d494d423-2c6f-439b-8224-127d0fd1a9fa\") " Mar 14 08:39:27 crc kubenswrapper[5129]: I0314 08:39:27.892520 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d494d423-2c6f-439b-8224-127d0fd1a9fa-node-mnt\") pod \"d494d423-2c6f-439b-8224-127d0fd1a9fa\" (UID: \"d494d423-2c6f-439b-8224-127d0fd1a9fa\") " Mar 14 08:39:27 crc kubenswrapper[5129]: I0314 08:39:27.892892 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d494d423-2c6f-439b-8224-127d0fd1a9fa-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "d494d423-2c6f-439b-8224-127d0fd1a9fa" (UID: "d494d423-2c6f-439b-8224-127d0fd1a9fa"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:39:27 crc kubenswrapper[5129]: I0314 08:39:27.906992 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d494d423-2c6f-439b-8224-127d0fd1a9fa-kube-api-access-gfdl2" (OuterVolumeSpecName: "kube-api-access-gfdl2") pod "d494d423-2c6f-439b-8224-127d0fd1a9fa" (UID: "d494d423-2c6f-439b-8224-127d0fd1a9fa"). InnerVolumeSpecName "kube-api-access-gfdl2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:39:27 crc kubenswrapper[5129]: I0314 08:39:27.914236 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d494d423-2c6f-439b-8224-127d0fd1a9fa-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "d494d423-2c6f-439b-8224-127d0fd1a9fa" (UID: "d494d423-2c6f-439b-8224-127d0fd1a9fa"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:39:27 crc kubenswrapper[5129]: I0314 08:39:27.951679 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bt8xr" Mar 14 08:39:27 crc kubenswrapper[5129]: I0314 08:39:27.993864 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23449468-b1fb-4705-92a8-7029d50d33db-utilities\") pod \"23449468-b1fb-4705-92a8-7029d50d33db\" (UID: \"23449468-b1fb-4705-92a8-7029d50d33db\") " Mar 14 08:39:27 crc kubenswrapper[5129]: I0314 08:39:27.993998 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23449468-b1fb-4705-92a8-7029d50d33db-catalog-content\") pod \"23449468-b1fb-4705-92a8-7029d50d33db\" (UID: \"23449468-b1fb-4705-92a8-7029d50d33db\") " Mar 14 08:39:27 crc kubenswrapper[5129]: I0314 08:39:27.994207 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdq8r\" (UniqueName: \"kubernetes.io/projected/23449468-b1fb-4705-92a8-7029d50d33db-kube-api-access-wdq8r\") pod \"23449468-b1fb-4705-92a8-7029d50d33db\" (UID: \"23449468-b1fb-4705-92a8-7029d50d33db\") " Mar 14 08:39:27 crc kubenswrapper[5129]: I0314 08:39:27.994633 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfdl2\" (UniqueName: 
\"kubernetes.io/projected/d494d423-2c6f-439b-8224-127d0fd1a9fa-kube-api-access-gfdl2\") on node \"crc\" DevicePath \"\"" Mar 14 08:39:27 crc kubenswrapper[5129]: I0314 08:39:27.994654 5129 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d494d423-2c6f-439b-8224-127d0fd1a9fa-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 14 08:39:27 crc kubenswrapper[5129]: I0314 08:39:27.994669 5129 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d494d423-2c6f-439b-8224-127d0fd1a9fa-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 14 08:39:27 crc kubenswrapper[5129]: I0314 08:39:27.997408 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23449468-b1fb-4705-92a8-7029d50d33db-kube-api-access-wdq8r" (OuterVolumeSpecName: "kube-api-access-wdq8r") pod "23449468-b1fb-4705-92a8-7029d50d33db" (UID: "23449468-b1fb-4705-92a8-7029d50d33db"). InnerVolumeSpecName "kube-api-access-wdq8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:39:27 crc kubenswrapper[5129]: I0314 08:39:27.997573 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23449468-b1fb-4705-92a8-7029d50d33db-utilities" (OuterVolumeSpecName: "utilities") pod "23449468-b1fb-4705-92a8-7029d50d33db" (UID: "23449468-b1fb-4705-92a8-7029d50d33db"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.096222 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdq8r\" (UniqueName: \"kubernetes.io/projected/23449468-b1fb-4705-92a8-7029d50d33db-kube-api-access-wdq8r\") on node \"crc\" DevicePath \"\"" Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.096257 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23449468-b1fb-4705-92a8-7029d50d33db-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.136383 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23449468-b1fb-4705-92a8-7029d50d33db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23449468-b1fb-4705-92a8-7029d50d33db" (UID: "23449468-b1fb-4705-92a8-7029d50d33db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.196684 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23449468-b1fb-4705-92a8-7029d50d33db-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.521687 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-44r8c" event={"ID":"d494d423-2c6f-439b-8224-127d0fd1a9fa","Type":"ContainerDied","Data":"98b3feffe9702b410fb005151f685c6104b3efd9f076af2aa9b079d1d820112a"} Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.521745 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-44r8c" Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.521758 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98b3feffe9702b410fb005151f685c6104b3efd9f076af2aa9b079d1d820112a" Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.526387 5129 generic.go:334] "Generic (PLEG): container finished" podID="23449468-b1fb-4705-92a8-7029d50d33db" containerID="b579c16b45f2ca80e2b1ef5159bed3131bc41d4696b637a2524204f41ec95656" exitCode=0 Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.526481 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bt8xr" event={"ID":"23449468-b1fb-4705-92a8-7029d50d33db","Type":"ContainerDied","Data":"b579c16b45f2ca80e2b1ef5159bed3131bc41d4696b637a2524204f41ec95656"} Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.526537 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bt8xr" Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.526661 5129 scope.go:117] "RemoveContainer" containerID="b579c16b45f2ca80e2b1ef5159bed3131bc41d4696b637a2524204f41ec95656" Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.526573 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bt8xr" event={"ID":"23449468-b1fb-4705-92a8-7029d50d33db","Type":"ContainerDied","Data":"876cf2dd4ae4c52ee50b84918c4d329c95a1e738908239a0215caf7ca1e9c232"} Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.559683 5129 scope.go:117] "RemoveContainer" containerID="dd23770dd7e8bf2f9511fa7f222634ffa137afceebbbe723edab005c61d170fd" Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.582170 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bt8xr"] Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.589815 5129 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bt8xr"] Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.596461 5129 scope.go:117] "RemoveContainer" containerID="b299758165b0b1835ef2bea01185ccaa810517d313422b9e444618dabe895d82" Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.626260 5129 scope.go:117] "RemoveContainer" containerID="b579c16b45f2ca80e2b1ef5159bed3131bc41d4696b637a2524204f41ec95656" Mar 14 08:39:28 crc kubenswrapper[5129]: E0314 08:39:28.626820 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b579c16b45f2ca80e2b1ef5159bed3131bc41d4696b637a2524204f41ec95656\": container with ID starting with b579c16b45f2ca80e2b1ef5159bed3131bc41d4696b637a2524204f41ec95656 not found: ID does not exist" containerID="b579c16b45f2ca80e2b1ef5159bed3131bc41d4696b637a2524204f41ec95656" Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.626894 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b579c16b45f2ca80e2b1ef5159bed3131bc41d4696b637a2524204f41ec95656"} err="failed to get container status \"b579c16b45f2ca80e2b1ef5159bed3131bc41d4696b637a2524204f41ec95656\": rpc error: code = NotFound desc = could not find container \"b579c16b45f2ca80e2b1ef5159bed3131bc41d4696b637a2524204f41ec95656\": container with ID starting with b579c16b45f2ca80e2b1ef5159bed3131bc41d4696b637a2524204f41ec95656 not found: ID does not exist" Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.626933 5129 scope.go:117] "RemoveContainer" containerID="dd23770dd7e8bf2f9511fa7f222634ffa137afceebbbe723edab005c61d170fd" Mar 14 08:39:28 crc kubenswrapper[5129]: E0314 08:39:28.627428 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd23770dd7e8bf2f9511fa7f222634ffa137afceebbbe723edab005c61d170fd\": container with ID starting with 
dd23770dd7e8bf2f9511fa7f222634ffa137afceebbbe723edab005c61d170fd not found: ID does not exist" containerID="dd23770dd7e8bf2f9511fa7f222634ffa137afceebbbe723edab005c61d170fd" Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.627466 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd23770dd7e8bf2f9511fa7f222634ffa137afceebbbe723edab005c61d170fd"} err="failed to get container status \"dd23770dd7e8bf2f9511fa7f222634ffa137afceebbbe723edab005c61d170fd\": rpc error: code = NotFound desc = could not find container \"dd23770dd7e8bf2f9511fa7f222634ffa137afceebbbe723edab005c61d170fd\": container with ID starting with dd23770dd7e8bf2f9511fa7f222634ffa137afceebbbe723edab005c61d170fd not found: ID does not exist" Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.627495 5129 scope.go:117] "RemoveContainer" containerID="b299758165b0b1835ef2bea01185ccaa810517d313422b9e444618dabe895d82" Mar 14 08:39:28 crc kubenswrapper[5129]: E0314 08:39:28.628355 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b299758165b0b1835ef2bea01185ccaa810517d313422b9e444618dabe895d82\": container with ID starting with b299758165b0b1835ef2bea01185ccaa810517d313422b9e444618dabe895d82 not found: ID does not exist" containerID="b299758165b0b1835ef2bea01185ccaa810517d313422b9e444618dabe895d82" Mar 14 08:39:28 crc kubenswrapper[5129]: I0314 08:39:28.628389 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b299758165b0b1835ef2bea01185ccaa810517d313422b9e444618dabe895d82"} err="failed to get container status \"b299758165b0b1835ef2bea01185ccaa810517d313422b9e444618dabe895d82\": rpc error: code = NotFound desc = could not find container \"b299758165b0b1835ef2bea01185ccaa810517d313422b9e444618dabe895d82\": container with ID starting with b299758165b0b1835ef2bea01185ccaa810517d313422b9e444618dabe895d82 not found: ID does not 
exist" Mar 14 08:39:30 crc kubenswrapper[5129]: I0314 08:39:30.052214 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23449468-b1fb-4705-92a8-7029d50d33db" path="/var/lib/kubelet/pods/23449468-b1fb-4705-92a8-7029d50d33db/volumes" Mar 14 08:39:35 crc kubenswrapper[5129]: I0314 08:39:35.037449 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0" Mar 14 08:39:35 crc kubenswrapper[5129]: E0314 08:39:35.039429 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:39:48 crc kubenswrapper[5129]: I0314 08:39:48.046461 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0" Mar 14 08:39:48 crc kubenswrapper[5129]: E0314 08:39:48.047908 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:40:00 crc kubenswrapper[5129]: I0314 08:40:00.185673 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557960-kkw5p"] Mar 14 08:40:00 crc kubenswrapper[5129]: E0314 08:40:00.187032 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23449468-b1fb-4705-92a8-7029d50d33db" containerName="extract-utilities" Mar 14 08:40:00 crc 
kubenswrapper[5129]: I0314 08:40:00.187050 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="23449468-b1fb-4705-92a8-7029d50d33db" containerName="extract-utilities" Mar 14 08:40:00 crc kubenswrapper[5129]: E0314 08:40:00.187075 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23449468-b1fb-4705-92a8-7029d50d33db" containerName="extract-content" Mar 14 08:40:00 crc kubenswrapper[5129]: I0314 08:40:00.187083 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="23449468-b1fb-4705-92a8-7029d50d33db" containerName="extract-content" Mar 14 08:40:00 crc kubenswrapper[5129]: E0314 08:40:00.187105 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d494d423-2c6f-439b-8224-127d0fd1a9fa" containerName="storage" Mar 14 08:40:00 crc kubenswrapper[5129]: I0314 08:40:00.187114 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d494d423-2c6f-439b-8224-127d0fd1a9fa" containerName="storage" Mar 14 08:40:00 crc kubenswrapper[5129]: E0314 08:40:00.187131 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23449468-b1fb-4705-92a8-7029d50d33db" containerName="registry-server" Mar 14 08:40:00 crc kubenswrapper[5129]: I0314 08:40:00.187138 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="23449468-b1fb-4705-92a8-7029d50d33db" containerName="registry-server" Mar 14 08:40:00 crc kubenswrapper[5129]: I0314 08:40:00.187298 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d494d423-2c6f-439b-8224-127d0fd1a9fa" containerName="storage" Mar 14 08:40:00 crc kubenswrapper[5129]: I0314 08:40:00.187318 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="23449468-b1fb-4705-92a8-7029d50d33db" containerName="registry-server" Mar 14 08:40:00 crc kubenswrapper[5129]: I0314 08:40:00.188118 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557960-kkw5p" Mar 14 08:40:00 crc kubenswrapper[5129]: I0314 08:40:00.190464 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:40:00 crc kubenswrapper[5129]: I0314 08:40:00.195194 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:40:00 crc kubenswrapper[5129]: I0314 08:40:00.195201 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:40:00 crc kubenswrapper[5129]: I0314 08:40:00.205224 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557960-kkw5p"] Mar 14 08:40:00 crc kubenswrapper[5129]: I0314 08:40:00.313765 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsgk8\" (UniqueName: \"kubernetes.io/projected/c4408d59-d046-40f4-a52d-8c87a07200f8-kube-api-access-hsgk8\") pod \"auto-csr-approver-29557960-kkw5p\" (UID: \"c4408d59-d046-40f4-a52d-8c87a07200f8\") " pod="openshift-infra/auto-csr-approver-29557960-kkw5p" Mar 14 08:40:00 crc kubenswrapper[5129]: I0314 08:40:00.416685 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsgk8\" (UniqueName: \"kubernetes.io/projected/c4408d59-d046-40f4-a52d-8c87a07200f8-kube-api-access-hsgk8\") pod \"auto-csr-approver-29557960-kkw5p\" (UID: \"c4408d59-d046-40f4-a52d-8c87a07200f8\") " pod="openshift-infra/auto-csr-approver-29557960-kkw5p" Mar 14 08:40:00 crc kubenswrapper[5129]: I0314 08:40:00.450713 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsgk8\" (UniqueName: \"kubernetes.io/projected/c4408d59-d046-40f4-a52d-8c87a07200f8-kube-api-access-hsgk8\") pod \"auto-csr-approver-29557960-kkw5p\" (UID: \"c4408d59-d046-40f4-a52d-8c87a07200f8\") " 
pod="openshift-infra/auto-csr-approver-29557960-kkw5p" Mar 14 08:40:00 crc kubenswrapper[5129]: I0314 08:40:00.508107 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557960-kkw5p" Mar 14 08:40:00 crc kubenswrapper[5129]: I0314 08:40:00.828759 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557960-kkw5p"] Mar 14 08:40:00 crc kubenswrapper[5129]: I0314 08:40:00.867119 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557960-kkw5p" event={"ID":"c4408d59-d046-40f4-a52d-8c87a07200f8","Type":"ContainerStarted","Data":"4973be70fe0d5d1912e311ac606b44ac5961986942eed181fc5334e077bc307a"} Mar 14 08:40:02 crc kubenswrapper[5129]: I0314 08:40:02.037702 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0" Mar 14 08:40:02 crc kubenswrapper[5129]: I0314 08:40:02.889252 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"b7dd76c976b1c799fda9092d5895c529ac9cbd574cb01ccbcc80fb6c3c94c49a"} Mar 14 08:40:02 crc kubenswrapper[5129]: I0314 08:40:02.892272 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557960-kkw5p" event={"ID":"c4408d59-d046-40f4-a52d-8c87a07200f8","Type":"ContainerStarted","Data":"48c2044f325a7aed9472dbaca322b356ea39d6fa5711d2bb573922ff271520e9"} Mar 14 08:40:03 crc kubenswrapper[5129]: I0314 08:40:03.902196 5129 generic.go:334] "Generic (PLEG): container finished" podID="c4408d59-d046-40f4-a52d-8c87a07200f8" containerID="48c2044f325a7aed9472dbaca322b356ea39d6fa5711d2bb573922ff271520e9" exitCode=0 Mar 14 08:40:03 crc kubenswrapper[5129]: I0314 08:40:03.902335 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29557960-kkw5p" event={"ID":"c4408d59-d046-40f4-a52d-8c87a07200f8","Type":"ContainerDied","Data":"48c2044f325a7aed9472dbaca322b356ea39d6fa5711d2bb573922ff271520e9"} Mar 14 08:40:05 crc kubenswrapper[5129]: I0314 08:40:05.232822 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557960-kkw5p" Mar 14 08:40:05 crc kubenswrapper[5129]: I0314 08:40:05.412771 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsgk8\" (UniqueName: \"kubernetes.io/projected/c4408d59-d046-40f4-a52d-8c87a07200f8-kube-api-access-hsgk8\") pod \"c4408d59-d046-40f4-a52d-8c87a07200f8\" (UID: \"c4408d59-d046-40f4-a52d-8c87a07200f8\") " Mar 14 08:40:05 crc kubenswrapper[5129]: I0314 08:40:05.429930 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4408d59-d046-40f4-a52d-8c87a07200f8-kube-api-access-hsgk8" (OuterVolumeSpecName: "kube-api-access-hsgk8") pod "c4408d59-d046-40f4-a52d-8c87a07200f8" (UID: "c4408d59-d046-40f4-a52d-8c87a07200f8"). InnerVolumeSpecName "kube-api-access-hsgk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:40:05 crc kubenswrapper[5129]: I0314 08:40:05.516249 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsgk8\" (UniqueName: \"kubernetes.io/projected/c4408d59-d046-40f4-a52d-8c87a07200f8-kube-api-access-hsgk8\") on node \"crc\" DevicePath \"\"" Mar 14 08:40:05 crc kubenswrapper[5129]: I0314 08:40:05.923376 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557960-kkw5p" event={"ID":"c4408d59-d046-40f4-a52d-8c87a07200f8","Type":"ContainerDied","Data":"4973be70fe0d5d1912e311ac606b44ac5961986942eed181fc5334e077bc307a"} Mar 14 08:40:05 crc kubenswrapper[5129]: I0314 08:40:05.923459 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557960-kkw5p" Mar 14 08:40:05 crc kubenswrapper[5129]: I0314 08:40:05.923467 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4973be70fe0d5d1912e311ac606b44ac5961986942eed181fc5334e077bc307a" Mar 14 08:40:06 crc kubenswrapper[5129]: I0314 08:40:06.317207 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557954-qzqld"] Mar 14 08:40:06 crc kubenswrapper[5129]: I0314 08:40:06.324558 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557954-qzqld"] Mar 14 08:40:08 crc kubenswrapper[5129]: I0314 08:40:08.052985 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57add038-a424-4de2-ba7d-6f480f85bb15" path="/var/lib/kubelet/pods/57add038-a424-4de2-ba7d-6f480f85bb15/volumes" Mar 14 08:40:26 crc kubenswrapper[5129]: I0314 08:40:26.185533 5129 scope.go:117] "RemoveContainer" containerID="958955d3221ec788be2a773fa7635ba3103526d14730aa74ccdbf5e236f76389" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.047004 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bc74d6fb5-9hd54"] Mar 14 08:41:35 crc kubenswrapper[5129]: E0314 08:41:35.049388 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4408d59-d046-40f4-a52d-8c87a07200f8" containerName="oc" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.049408 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4408d59-d046-40f4-a52d-8c87a07200f8" containerName="oc" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.049564 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4408d59-d046-40f4-a52d-8c87a07200f8" containerName="oc" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.050259 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc74d6fb5-9hd54" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.054240 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.054987 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-lmrvw" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.056664 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.056879 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.072276 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc74d6fb5-9hd54"] Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.095760 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9b8f4dc9f-29vbm"] Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.101524 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.106901 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.143768 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b8f4dc9f-29vbm"] Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.182482 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-826hc\" (UniqueName: \"kubernetes.io/projected/f6414662-676e-4c0f-b509-3ce8d1f61ab4-kube-api-access-826hc\") pod \"dnsmasq-dns-bc74d6fb5-9hd54\" (UID: \"f6414662-676e-4c0f-b509-3ce8d1f61ab4\") " pod="openstack/dnsmasq-dns-bc74d6fb5-9hd54" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.182578 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6414662-676e-4c0f-b509-3ce8d1f61ab4-config\") pod \"dnsmasq-dns-bc74d6fb5-9hd54\" (UID: \"f6414662-676e-4c0f-b509-3ce8d1f61ab4\") " pod="openstack/dnsmasq-dns-bc74d6fb5-9hd54" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.284394 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9wpm\" (UniqueName: \"kubernetes.io/projected/4d4cf759-66a7-4f0f-862a-68522a71bba6-kube-api-access-w9wpm\") pod \"dnsmasq-dns-9b8f4dc9f-29vbm\" (UID: \"4d4cf759-66a7-4f0f-862a-68522a71bba6\") " pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.284903 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-826hc\" (UniqueName: \"kubernetes.io/projected/f6414662-676e-4c0f-b509-3ce8d1f61ab4-kube-api-access-826hc\") pod \"dnsmasq-dns-bc74d6fb5-9hd54\" (UID: \"f6414662-676e-4c0f-b509-3ce8d1f61ab4\") " 
pod="openstack/dnsmasq-dns-bc74d6fb5-9hd54" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.284956 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6414662-676e-4c0f-b509-3ce8d1f61ab4-config\") pod \"dnsmasq-dns-bc74d6fb5-9hd54\" (UID: \"f6414662-676e-4c0f-b509-3ce8d1f61ab4\") " pod="openstack/dnsmasq-dns-bc74d6fb5-9hd54" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.284982 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d4cf759-66a7-4f0f-862a-68522a71bba6-dns-svc\") pod \"dnsmasq-dns-9b8f4dc9f-29vbm\" (UID: \"4d4cf759-66a7-4f0f-862a-68522a71bba6\") " pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.285040 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d4cf759-66a7-4f0f-862a-68522a71bba6-config\") pod \"dnsmasq-dns-9b8f4dc9f-29vbm\" (UID: \"4d4cf759-66a7-4f0f-862a-68522a71bba6\") " pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.286237 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6414662-676e-4c0f-b509-3ce8d1f61ab4-config\") pod \"dnsmasq-dns-bc74d6fb5-9hd54\" (UID: \"f6414662-676e-4c0f-b509-3ce8d1f61ab4\") " pod="openstack/dnsmasq-dns-bc74d6fb5-9hd54" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.328855 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-826hc\" (UniqueName: \"kubernetes.io/projected/f6414662-676e-4c0f-b509-3ce8d1f61ab4-kube-api-access-826hc\") pod \"dnsmasq-dns-bc74d6fb5-9hd54\" (UID: \"f6414662-676e-4c0f-b509-3ce8d1f61ab4\") " pod="openstack/dnsmasq-dns-bc74d6fb5-9hd54" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 
08:41:35.376383 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc74d6fb5-9hd54" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.388347 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9wpm\" (UniqueName: \"kubernetes.io/projected/4d4cf759-66a7-4f0f-862a-68522a71bba6-kube-api-access-w9wpm\") pod \"dnsmasq-dns-9b8f4dc9f-29vbm\" (UID: \"4d4cf759-66a7-4f0f-862a-68522a71bba6\") " pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.388420 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d4cf759-66a7-4f0f-862a-68522a71bba6-dns-svc\") pod \"dnsmasq-dns-9b8f4dc9f-29vbm\" (UID: \"4d4cf759-66a7-4f0f-862a-68522a71bba6\") " pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.388468 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d4cf759-66a7-4f0f-862a-68522a71bba6-config\") pod \"dnsmasq-dns-9b8f4dc9f-29vbm\" (UID: \"4d4cf759-66a7-4f0f-862a-68522a71bba6\") " pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.389495 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d4cf759-66a7-4f0f-862a-68522a71bba6-dns-svc\") pod \"dnsmasq-dns-9b8f4dc9f-29vbm\" (UID: \"4d4cf759-66a7-4f0f-862a-68522a71bba6\") " pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.391395 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d4cf759-66a7-4f0f-862a-68522a71bba6-config\") pod \"dnsmasq-dns-9b8f4dc9f-29vbm\" (UID: \"4d4cf759-66a7-4f0f-862a-68522a71bba6\") " 
pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.412270 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9wpm\" (UniqueName: \"kubernetes.io/projected/4d4cf759-66a7-4f0f-862a-68522a71bba6-kube-api-access-w9wpm\") pod \"dnsmasq-dns-9b8f4dc9f-29vbm\" (UID: \"4d4cf759-66a7-4f0f-862a-68522a71bba6\") " pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.440547 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.495378 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc74d6fb5-9hd54"] Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.525426 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-776d6cccf5-nr64p"] Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.527091 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-776d6cccf5-nr64p" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.549286 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-776d6cccf5-nr64p"] Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.694082 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04fd60cf-8519-49d4-bff2-367a1a0a6667-dns-svc\") pod \"dnsmasq-dns-776d6cccf5-nr64p\" (UID: \"04fd60cf-8519-49d4-bff2-367a1a0a6667\") " pod="openstack/dnsmasq-dns-776d6cccf5-nr64p" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.694567 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04fd60cf-8519-49d4-bff2-367a1a0a6667-config\") pod \"dnsmasq-dns-776d6cccf5-nr64p\" (UID: \"04fd60cf-8519-49d4-bff2-367a1a0a6667\") " pod="openstack/dnsmasq-dns-776d6cccf5-nr64p" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.694626 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg2k9\" (UniqueName: \"kubernetes.io/projected/04fd60cf-8519-49d4-bff2-367a1a0a6667-kube-api-access-hg2k9\") pod \"dnsmasq-dns-776d6cccf5-nr64p\" (UID: \"04fd60cf-8519-49d4-bff2-367a1a0a6667\") " pod="openstack/dnsmasq-dns-776d6cccf5-nr64p" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.796009 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04fd60cf-8519-49d4-bff2-367a1a0a6667-dns-svc\") pod \"dnsmasq-dns-776d6cccf5-nr64p\" (UID: \"04fd60cf-8519-49d4-bff2-367a1a0a6667\") " pod="openstack/dnsmasq-dns-776d6cccf5-nr64p" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.796067 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/04fd60cf-8519-49d4-bff2-367a1a0a6667-config\") pod \"dnsmasq-dns-776d6cccf5-nr64p\" (UID: \"04fd60cf-8519-49d4-bff2-367a1a0a6667\") " pod="openstack/dnsmasq-dns-776d6cccf5-nr64p" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.796110 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg2k9\" (UniqueName: \"kubernetes.io/projected/04fd60cf-8519-49d4-bff2-367a1a0a6667-kube-api-access-hg2k9\") pod \"dnsmasq-dns-776d6cccf5-nr64p\" (UID: \"04fd60cf-8519-49d4-bff2-367a1a0a6667\") " pod="openstack/dnsmasq-dns-776d6cccf5-nr64p" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.797716 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04fd60cf-8519-49d4-bff2-367a1a0a6667-dns-svc\") pod \"dnsmasq-dns-776d6cccf5-nr64p\" (UID: \"04fd60cf-8519-49d4-bff2-367a1a0a6667\") " pod="openstack/dnsmasq-dns-776d6cccf5-nr64p" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.798213 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04fd60cf-8519-49d4-bff2-367a1a0a6667-config\") pod \"dnsmasq-dns-776d6cccf5-nr64p\" (UID: \"04fd60cf-8519-49d4-bff2-367a1a0a6667\") " pod="openstack/dnsmasq-dns-776d6cccf5-nr64p" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.817723 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg2k9\" (UniqueName: \"kubernetes.io/projected/04fd60cf-8519-49d4-bff2-367a1a0a6667-kube-api-access-hg2k9\") pod \"dnsmasq-dns-776d6cccf5-nr64p\" (UID: \"04fd60cf-8519-49d4-bff2-367a1a0a6667\") " pod="openstack/dnsmasq-dns-776d6cccf5-nr64p" Mar 14 08:41:35 crc kubenswrapper[5129]: I0314 08:41:35.851615 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-776d6cccf5-nr64p" Mar 14 08:41:36 crc kubenswrapper[5129]: W0314 08:41:36.062362 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d4cf759_66a7_4f0f_862a_68522a71bba6.slice/crio-76deffe658a42c4d1dbdb8a5ecbfc314724bfc42e882cc20b638de9da4d57788 WatchSource:0}: Error finding container 76deffe658a42c4d1dbdb8a5ecbfc314724bfc42e882cc20b638de9da4d57788: Status 404 returned error can't find the container with id 76deffe658a42c4d1dbdb8a5ecbfc314724bfc42e882cc20b638de9da4d57788 Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.064206 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b8f4dc9f-29vbm"] Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.069267 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc74d6fb5-9hd54"] Mar 14 08:41:36 crc kubenswrapper[5129]: W0314 08:41:36.070953 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6414662_676e_4c0f_b509_3ce8d1f61ab4.slice/crio-32652059e7ccc49ee8f833bfd3dac7ccd64407d26f52b21d210d784f0cee6915 WatchSource:0}: Error finding container 32652059e7ccc49ee8f833bfd3dac7ccd64407d26f52b21d210d784f0cee6915: Status 404 returned error can't find the container with id 32652059e7ccc49ee8f833bfd3dac7ccd64407d26f52b21d210d784f0cee6915 Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.368721 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-776d6cccf5-nr64p"] Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.395273 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65bcbb86c9-jtnfv"] Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.396792 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.411881 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65bcbb86c9-jtnfv"] Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.446830 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-776d6cccf5-nr64p"] Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.508788 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djpjv\" (UniqueName: \"kubernetes.io/projected/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f-kube-api-access-djpjv\") pod \"dnsmasq-dns-65bcbb86c9-jtnfv\" (UID: \"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f\") " pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.508864 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f-config\") pod \"dnsmasq-dns-65bcbb86c9-jtnfv\" (UID: \"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f\") " pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.509056 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f-dns-svc\") pod \"dnsmasq-dns-65bcbb86c9-jtnfv\" (UID: \"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f\") " pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.611138 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f-config\") pod \"dnsmasq-dns-65bcbb86c9-jtnfv\" (UID: \"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f\") " pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" Mar 14 08:41:36 crc 
kubenswrapper[5129]: I0314 08:41:36.611248 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f-dns-svc\") pod \"dnsmasq-dns-65bcbb86c9-jtnfv\" (UID: \"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f\") " pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.611331 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djpjv\" (UniqueName: \"kubernetes.io/projected/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f-kube-api-access-djpjv\") pod \"dnsmasq-dns-65bcbb86c9-jtnfv\" (UID: \"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f\") " pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.612765 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f-config\") pod \"dnsmasq-dns-65bcbb86c9-jtnfv\" (UID: \"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f\") " pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.613161 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f-dns-svc\") pod \"dnsmasq-dns-65bcbb86c9-jtnfv\" (UID: \"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f\") " pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.641992 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djpjv\" (UniqueName: \"kubernetes.io/projected/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f-kube-api-access-djpjv\") pod \"dnsmasq-dns-65bcbb86c9-jtnfv\" (UID: \"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f\") " pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.706230 5129 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.707883 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.714098 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7gmct" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.714905 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.714974 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.715910 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.719983 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.720223 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.720443 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.730226 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.756754 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.803417 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc74d6fb5-9hd54" event={"ID":"f6414662-676e-4c0f-b509-3ce8d1f61ab4","Type":"ContainerStarted","Data":"32652059e7ccc49ee8f833bfd3dac7ccd64407d26f52b21d210d784f0cee6915"} Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.805525 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" event={"ID":"4d4cf759-66a7-4f0f-862a-68522a71bba6","Type":"ContainerStarted","Data":"76deffe658a42c4d1dbdb8a5ecbfc314724bfc42e882cc20b638de9da4d57788"} Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.808543 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-776d6cccf5-nr64p" event={"ID":"04fd60cf-8519-49d4-bff2-367a1a0a6667","Type":"ContainerStarted","Data":"6e6b8fbf4ae6061a566aa2bc16d8a1e57b666e72052472e5fa13e7849a49f472"} Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.820594 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/99e335f0-247c-4012-8b7a-67147c2cb39f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.820924 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/99e335f0-247c-4012-8b7a-67147c2cb39f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.820959 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.820979 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.821009 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/99e335f0-247c-4012-8b7a-67147c2cb39f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.821033 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.821056 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.821082 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99e335f0-247c-4012-8b7a-67147c2cb39f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.821100 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/99e335f0-247c-4012-8b7a-67147c2cb39f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.821142 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.821166 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rqmk\" (UniqueName: \"kubernetes.io/projected/99e335f0-247c-4012-8b7a-67147c2cb39f-kube-api-access-4rqmk\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.922413 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.922481 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.922513 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99e335f0-247c-4012-8b7a-67147c2cb39f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.922545 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/99e335f0-247c-4012-8b7a-67147c2cb39f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.922590 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.922694 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rqmk\" (UniqueName: \"kubernetes.io/projected/99e335f0-247c-4012-8b7a-67147c2cb39f-kube-api-access-4rqmk\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.922735 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/99e335f0-247c-4012-8b7a-67147c2cb39f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.922782 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/99e335f0-247c-4012-8b7a-67147c2cb39f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.922825 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.922851 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.922881 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/99e335f0-247c-4012-8b7a-67147c2cb39f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.924268 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.925174 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99e335f0-247c-4012-8b7a-67147c2cb39f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.925741 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.926810 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/99e335f0-247c-4012-8b7a-67147c2cb39f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.927124 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/99e335f0-247c-4012-8b7a-67147c2cb39f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.929308 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.929531 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.929548 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/99e335f0-247c-4012-8b7a-67147c2cb39f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.930168 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.930215 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/06af7259acb5e8dba4b6de6868139e8986e4b013821acf605ac634f3b027303c/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.930707 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/99e335f0-247c-4012-8b7a-67147c2cb39f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.946676 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rqmk\" (UniqueName: \"kubernetes.io/projected/99e335f0-247c-4012-8b7a-67147c2cb39f-kube-api-access-4rqmk\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 14 08:41:36 crc kubenswrapper[5129]: I0314 08:41:36.985036 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\") pod \"rabbitmq-cell1-server-0\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.079562 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.254274 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65bcbb86c9-jtnfv"]
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.558563 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.565906 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.571392 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.571493 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.571681 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.571752 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.571754 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.571867 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.571907 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-l6pdc"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.572153 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.643375 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.643439 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.643472 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.643768 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.643871 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.643928 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.644073 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-config-data\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.644098 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vzg5\" (UniqueName: \"kubernetes.io/projected/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-kube-api-access-9vzg5\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.644139 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.644301 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.644337 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.684907 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.745707 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-config-data\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.745774 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vzg5\" (UniqueName: \"kubernetes.io/projected/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-kube-api-access-9vzg5\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.745806 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.745848 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.745877 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.745928 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.745949 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.745973 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.746014 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.746048 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.746077 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.747101 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-config-data\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.748378 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.748632 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.748847 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.749079 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.752986 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.753627 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.753660 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8772a841facf27c028a2777758a67103ea1c530fdaed26926545981d1b0917a9/globalmount\"" pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.754436 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.763934 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.764887 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.767647 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vzg5\" (UniqueName: \"kubernetes.io/projected/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-kube-api-access-9vzg5\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.790653 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\") pod \"rabbitmq-server-0\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.821108 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" event={"ID":"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f","Type":"ContainerStarted","Data":"b0cee4e42a7ea2827e45a6a6fbc17233aa5318cb8d677f73f9d46629d512683d"}
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.823134 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"99e335f0-247c-4012-8b7a-67147c2cb39f","Type":"ContainerStarted","Data":"05175ec5edd6f3b646f5bf3f4d4fc5066e05be578c504b49ff660248c423b671"}
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.882074 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.883948 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.888055 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.888197 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.888841 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.888167 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-84slz"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.895019 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.896526 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.897316 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.951283 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmpv8\" (UniqueName: \"kubernetes.io/projected/069305d5-891e-4158-ba2e-9fc30afeadcb-kube-api-access-kmpv8\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.951794 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/069305d5-891e-4158-ba2e-9fc30afeadcb-kolla-config\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.951837 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/069305d5-891e-4158-ba2e-9fc30afeadcb-config-data-generated\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.951987 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/069305d5-891e-4158-ba2e-9fc30afeadcb-config-data-default\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.952156 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/069305d5-891e-4158-ba2e-9fc30afeadcb-operator-scripts\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.952194 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ebf088bf-5922-4e55-9adc-7fb1808ab7f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ebf088bf-5922-4e55-9adc-7fb1808ab7f3\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.952220 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/069305d5-891e-4158-ba2e-9fc30afeadcb-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:37 crc kubenswrapper[5129]: I0314 08:41:37.952282 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069305d5-891e-4158-ba2e-9fc30afeadcb-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.058274 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/069305d5-891e-4158-ba2e-9fc30afeadcb-operator-scripts\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.058398 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/069305d5-891e-4158-ba2e-9fc30afeadcb-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.058766 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/069305d5-891e-4158-ba2e-9fc30afeadcb-operator-scripts\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.059228 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ebf088bf-5922-4e55-9adc-7fb1808ab7f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ebf088bf-5922-4e55-9adc-7fb1808ab7f3\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.059354 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069305d5-891e-4158-ba2e-9fc30afeadcb-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.059538 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmpv8\" (UniqueName: \"kubernetes.io/projected/069305d5-891e-4158-ba2e-9fc30afeadcb-kube-api-access-kmpv8\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.060410 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/069305d5-891e-4158-ba2e-9fc30afeadcb-kolla-config\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.060533 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/069305d5-891e-4158-ba2e-9fc30afeadcb-config-data-generated\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.060709 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/069305d5-891e-4158-ba2e-9fc30afeadcb-config-data-default\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.061299 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/069305d5-891e-4158-ba2e-9fc30afeadcb-config-data-generated\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.062148 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/069305d5-891e-4158-ba2e-9fc30afeadcb-config-data-default\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.063062 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/069305d5-891e-4158-ba2e-9fc30afeadcb-kolla-config\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.063255 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069305d5-891e-4158-ba2e-9fc30afeadcb-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.064494 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.064570 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ebf088bf-5922-4e55-9adc-7fb1808ab7f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ebf088bf-5922-4e55-9adc-7fb1808ab7f3\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/476da7a1081a70cb5cae4460ca1b850c9d5da34a6622a6fe6c1076d9e1f0a23a/globalmount\"" pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.076001 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/069305d5-891e-4158-ba2e-9fc30afeadcb-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.095835 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmpv8\" (UniqueName: \"kubernetes.io/projected/069305d5-891e-4158-ba2e-9fc30afeadcb-kube-api-access-kmpv8\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.138072 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ebf088bf-5922-4e55-9adc-7fb1808ab7f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ebf088bf-5922-4e55-9adc-7fb1808ab7f3\") pod \"openstack-galera-0\" (UID: \"069305d5-891e-4158-ba2e-9fc30afeadcb\") " pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.229636 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.525804 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.790063 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.841743 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"069305d5-891e-4158-ba2e-9fc30afeadcb","Type":"ContainerStarted","Data":"94b80903716d04738b7f3dad0f4ec775867f602d89a95fe74da9ae04066bc9c7"}
Mar 14 08:41:38 crc kubenswrapper[5129]: I0314 08:41:38.843103 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10","Type":"ContainerStarted","Data":"522b442f6200f468b2b54ea73223aac767631d8fd155d979d37c3385ddfdf982"}
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.225582 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.229808 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.236187 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.236639 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.236755 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.236835 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-ks28w"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.257476 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.306258 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8731d474-6329-4a56-be08-fe3d12bb33cd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.306402 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8731d474-6329-4a56-be08-fe3d12bb33cd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.306459 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-670cc701-5b3a-433d-b557-f64a378054c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-670cc701-5b3a-433d-b557-f64a378054c6\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.306515 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8731d474-6329-4a56-be08-fe3d12bb33cd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.306548 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8731d474-6329-4a56-be08-fe3d12bb33cd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.306570 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bb87\" (UniqueName: \"kubernetes.io/projected/8731d474-6329-4a56-be08-fe3d12bb33cd-kube-api-access-8bb87\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.306589 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8731d474-6329-4a56-be08-fe3d12bb33cd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.306636 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8731d474-6329-4a56-be08-fe3d12bb33cd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.409082 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8731d474-6329-4a56-be08-fe3d12bb33cd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.409200 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8731d474-6329-4a56-be08-fe3d12bb33cd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.409989 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8731d474-6329-4a56-be08-fe3d12bb33cd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.410041 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8731d474-6329-4a56-be08-fe3d12bb33cd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.410098 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-670cc701-5b3a-433d-b557-f64a378054c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-670cc701-5b3a-433d-b557-f64a378054c6\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.410120 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8731d474-6329-4a56-be08-fe3d12bb33cd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.410170 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8731d474-6329-4a56-be08-fe3d12bb33cd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.410225 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8731d474-6329-4a56-be08-fe3d12bb33cd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.410266 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bb87\" (UniqueName: \"kubernetes.io/projected/8731d474-6329-4a56-be08-fe3d12bb33cd-kube-api-access-8bb87\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.410287 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8731d474-6329-4a56-be08-fe3d12bb33cd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.415143 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8731d474-6329-4a56-be08-fe3d12bb33cd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.417218 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.417274 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-670cc701-5b3a-433d-b557-f64a378054c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-670cc701-5b3a-433d-b557-f64a378054c6\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9e2168981f8ae5ca7d1d9854505666a81af2dd79b72ad154f7c2c1de2e962f0c/globalmount\"" pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.418810 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8731d474-6329-4a56-be08-fe3d12bb33cd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0"
Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.424291 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/8731d474-6329-4a56-be08-fe3d12bb33cd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.429150 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8731d474-6329-4a56-be08-fe3d12bb33cd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.440915 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bb87\" (UniqueName: \"kubernetes.io/projected/8731d474-6329-4a56-be08-fe3d12bb33cd-kube-api-access-8bb87\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.499578 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-670cc701-5b3a-433d-b557-f64a378054c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-670cc701-5b3a-433d-b557-f64a378054c6\") pod \"openstack-cell1-galera-0\" (UID: \"8731d474-6329-4a56-be08-fe3d12bb33cd\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.569269 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.624203 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.625204 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.627924 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.628444 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-75jm7" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.628660 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.647593 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.716253 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429\") " pod="openstack/memcached-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.716314 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnj78\" (UniqueName: \"kubernetes.io/projected/2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429-kube-api-access-tnj78\") pod \"memcached-0\" (UID: \"2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429\") " pod="openstack/memcached-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.716357 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429\") " pod="openstack/memcached-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.716384 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429-kolla-config\") pod \"memcached-0\" (UID: \"2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429\") " pod="openstack/memcached-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.716423 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429-config-data\") pod \"memcached-0\" (UID: \"2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429\") " pod="openstack/memcached-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.819572 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429-config-data\") pod \"memcached-0\" (UID: \"2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429\") " pod="openstack/memcached-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.820166 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429\") " pod="openstack/memcached-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.820203 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnj78\" (UniqueName: \"kubernetes.io/projected/2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429-kube-api-access-tnj78\") pod \"memcached-0\" (UID: \"2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429\") " pod="openstack/memcached-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.820255 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429-memcached-tls-certs\") pod 
\"memcached-0\" (UID: \"2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429\") " pod="openstack/memcached-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.820288 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429-kolla-config\") pod \"memcached-0\" (UID: \"2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429\") " pod="openstack/memcached-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.821560 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429-kolla-config\") pod \"memcached-0\" (UID: \"2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429\") " pod="openstack/memcached-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.822215 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429-config-data\") pod \"memcached-0\" (UID: \"2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429\") " pod="openstack/memcached-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.834636 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429\") " pod="openstack/memcached-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.846433 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429\") " pod="openstack/memcached-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.870105 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnj78\" 
(UniqueName: \"kubernetes.io/projected/2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429-kube-api-access-tnj78\") pod \"memcached-0\" (UID: \"2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429\") " pod="openstack/memcached-0" Mar 14 08:41:39 crc kubenswrapper[5129]: I0314 08:41:39.957981 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 14 08:41:40 crc kubenswrapper[5129]: I0314 08:41:40.125096 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 08:41:40 crc kubenswrapper[5129]: I0314 08:41:40.895155 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8731d474-6329-4a56-be08-fe3d12bb33cd","Type":"ContainerStarted","Data":"37c359325cae0f35b4266a982e3d31b70f9a383e9909f8df824f8ff590c08b86"} Mar 14 08:41:41 crc kubenswrapper[5129]: I0314 08:41:41.672141 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 14 08:41:41 crc kubenswrapper[5129]: I0314 08:41:41.905551 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429","Type":"ContainerStarted","Data":"50fdbb72c7d6793712f58c3e8dffc4507b1fd031c65377e36fe7b3de019543af"} Mar 14 08:41:55 crc kubenswrapper[5129]: E0314 08:41:55.887413 5129 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:7002c9136b77c6990bfebf085d6871b3" Mar 14 08:41:55 crc kubenswrapper[5129]: E0314 08:41:55.888412 5129 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:7002c9136b77c6990bfebf085d6871b3" Mar 14 08:41:55 crc kubenswrapper[5129]: E0314 08:41:55.888765 5129 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:memcached,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:7002c9136b77c6990bfebf085d6871b3,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n56ch95h5b7h74h644hfdh548hb4h7bh679h9fhdfh549h7dh5bdhd6h5f5h5cdhd5h68h64fh575hb4hcch59bhch67dh595h655h697h57dh558q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tnj78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadO
nly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 08:41:55 crc kubenswrapper[5129]: E0314 08:41:55.890030 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429" Mar 14 08:41:56 crc kubenswrapper[5129]: E0314 08:41:56.052293 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:7002c9136b77c6990bfebf085d6871b3\\\"\"" pod="openstack/memcached-0" 
podUID="2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429" Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.078406 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" event={"ID":"4d4cf759-66a7-4f0f-862a-68522a71bba6","Type":"ContainerStarted","Data":"46aceb00dcea7201ccb69d49331148ea33899d3634da21f6f09db104b438c6e7"} Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.081071 5129 generic.go:334] "Generic (PLEG): container finished" podID="37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f" containerID="d063a38b2fb0ef937645e407cbeb65b98be3c049476a8493337f140daf77db7f" exitCode=0 Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.081196 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" event={"ID":"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f","Type":"ContainerDied","Data":"d063a38b2fb0ef937645e407cbeb65b98be3c049476a8493337f140daf77db7f"} Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.090523 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"069305d5-891e-4158-ba2e-9fc30afeadcb","Type":"ContainerStarted","Data":"357a133ce43f85bd3288aa661da139a24a7e08d226012f070434c894ff4ccf48"} Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.096543 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8731d474-6329-4a56-be08-fe3d12bb33cd","Type":"ContainerStarted","Data":"d3e31c1672baf3a973a3927e2830638d9f272c9cf3c851c5f21b093fab1b910d"} Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.100329 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-776d6cccf5-nr64p" event={"ID":"04fd60cf-8519-49d4-bff2-367a1a0a6667","Type":"ContainerStarted","Data":"f506de21c31416f318ab417aa250d16f01d98aec8359d3ed720f56f284c2d646"} Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.105102 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-bc74d6fb5-9hd54" event={"ID":"f6414662-676e-4c0f-b509-3ce8d1f61ab4","Type":"ContainerStarted","Data":"51c82b33e74461e2f5a94b90218d9de10faff7a77fe4d8fc1de8ebd8a4433420"} Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.685086 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-776d6cccf5-nr64p" Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.689278 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc74d6fb5-9hd54" Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.841419 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6414662-676e-4c0f-b509-3ce8d1f61ab4-config\") pod \"f6414662-676e-4c0f-b509-3ce8d1f61ab4\" (UID: \"f6414662-676e-4c0f-b509-3ce8d1f61ab4\") " Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.841872 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-826hc\" (UniqueName: \"kubernetes.io/projected/f6414662-676e-4c0f-b509-3ce8d1f61ab4-kube-api-access-826hc\") pod \"f6414662-676e-4c0f-b509-3ce8d1f61ab4\" (UID: \"f6414662-676e-4c0f-b509-3ce8d1f61ab4\") " Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.841919 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04fd60cf-8519-49d4-bff2-367a1a0a6667-dns-svc\") pod \"04fd60cf-8519-49d4-bff2-367a1a0a6667\" (UID: \"04fd60cf-8519-49d4-bff2-367a1a0a6667\") " Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.842040 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04fd60cf-8519-49d4-bff2-367a1a0a6667-config\") pod \"04fd60cf-8519-49d4-bff2-367a1a0a6667\" (UID: \"04fd60cf-8519-49d4-bff2-367a1a0a6667\") " Mar 14 08:41:59 crc kubenswrapper[5129]: 
I0314 08:41:59.842128 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg2k9\" (UniqueName: \"kubernetes.io/projected/04fd60cf-8519-49d4-bff2-367a1a0a6667-kube-api-access-hg2k9\") pod \"04fd60cf-8519-49d4-bff2-367a1a0a6667\" (UID: \"04fd60cf-8519-49d4-bff2-367a1a0a6667\") " Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.846697 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6414662-676e-4c0f-b509-3ce8d1f61ab4-kube-api-access-826hc" (OuterVolumeSpecName: "kube-api-access-826hc") pod "f6414662-676e-4c0f-b509-3ce8d1f61ab4" (UID: "f6414662-676e-4c0f-b509-3ce8d1f61ab4"). InnerVolumeSpecName "kube-api-access-826hc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.864965 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04fd60cf-8519-49d4-bff2-367a1a0a6667-kube-api-access-hg2k9" (OuterVolumeSpecName: "kube-api-access-hg2k9") pod "04fd60cf-8519-49d4-bff2-367a1a0a6667" (UID: "04fd60cf-8519-49d4-bff2-367a1a0a6667"). InnerVolumeSpecName "kube-api-access-hg2k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.871841 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6414662-676e-4c0f-b509-3ce8d1f61ab4-config" (OuterVolumeSpecName: "config") pod "f6414662-676e-4c0f-b509-3ce8d1f61ab4" (UID: "f6414662-676e-4c0f-b509-3ce8d1f61ab4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.872158 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04fd60cf-8519-49d4-bff2-367a1a0a6667-config" (OuterVolumeSpecName: "config") pod "04fd60cf-8519-49d4-bff2-367a1a0a6667" (UID: "04fd60cf-8519-49d4-bff2-367a1a0a6667"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.874022 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04fd60cf-8519-49d4-bff2-367a1a0a6667-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04fd60cf-8519-49d4-bff2-367a1a0a6667" (UID: "04fd60cf-8519-49d4-bff2-367a1a0a6667"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.945248 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04fd60cf-8519-49d4-bff2-367a1a0a6667-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.945304 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg2k9\" (UniqueName: \"kubernetes.io/projected/04fd60cf-8519-49d4-bff2-367a1a0a6667-kube-api-access-hg2k9\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.945318 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6414662-676e-4c0f-b509-3ce8d1f61ab4-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:59 crc kubenswrapper[5129]: I0314 08:41:59.945332 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-826hc\" (UniqueName: \"kubernetes.io/projected/f6414662-676e-4c0f-b509-3ce8d1f61ab4-kube-api-access-826hc\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:59 crc 
kubenswrapper[5129]: I0314 08:41:59.945345 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04fd60cf-8519-49d4-bff2-367a1a0a6667-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.127765 5129 generic.go:334] "Generic (PLEG): container finished" podID="f6414662-676e-4c0f-b509-3ce8d1f61ab4" containerID="51c82b33e74461e2f5a94b90218d9de10faff7a77fe4d8fc1de8ebd8a4433420" exitCode=0 Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.127867 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc74d6fb5-9hd54" event={"ID":"f6414662-676e-4c0f-b509-3ce8d1f61ab4","Type":"ContainerDied","Data":"51c82b33e74461e2f5a94b90218d9de10faff7a77fe4d8fc1de8ebd8a4433420"} Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.127953 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc74d6fb5-9hd54" event={"ID":"f6414662-676e-4c0f-b509-3ce8d1f61ab4","Type":"ContainerDied","Data":"32652059e7ccc49ee8f833bfd3dac7ccd64407d26f52b21d210d784f0cee6915"} Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.127984 5129 scope.go:117] "RemoveContainer" containerID="51c82b33e74461e2f5a94b90218d9de10faff7a77fe4d8fc1de8ebd8a4433420" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.127894 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc74d6fb5-9hd54" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.130290 5129 generic.go:334] "Generic (PLEG): container finished" podID="4d4cf759-66a7-4f0f-862a-68522a71bba6" containerID="46aceb00dcea7201ccb69d49331148ea33899d3634da21f6f09db104b438c6e7" exitCode=0 Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.130364 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" event={"ID":"4d4cf759-66a7-4f0f-862a-68522a71bba6","Type":"ContainerDied","Data":"46aceb00dcea7201ccb69d49331148ea33899d3634da21f6f09db104b438c6e7"} Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.134679 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10","Type":"ContainerStarted","Data":"c06a639f387eab53dbd8c8f59e1311567f11fe4c81bb6c9f02776934dcbced53"} Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.145155 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" event={"ID":"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f","Type":"ContainerStarted","Data":"a5d54b4c99a30a48580c96923ec4ebf7858ceaa2f625d36a2ee6e3dc56400d8d"} Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.146419 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.148046 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"99e335f0-247c-4012-8b7a-67147c2cb39f","Type":"ContainerStarted","Data":"5eac550ec9714ec97390113e2ca91079273673e5e24fb1a6e1ddb29d7a41e5bd"} Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.151405 5129 generic.go:334] "Generic (PLEG): container finished" podID="04fd60cf-8519-49d4-bff2-367a1a0a6667" containerID="f506de21c31416f318ab417aa250d16f01d98aec8359d3ed720f56f284c2d646" exitCode=0 
Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.151488 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-776d6cccf5-nr64p" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.151545 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-776d6cccf5-nr64p" event={"ID":"04fd60cf-8519-49d4-bff2-367a1a0a6667","Type":"ContainerDied","Data":"f506de21c31416f318ab417aa250d16f01d98aec8359d3ed720f56f284c2d646"} Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.151646 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-776d6cccf5-nr64p" event={"ID":"04fd60cf-8519-49d4-bff2-367a1a0a6667","Type":"ContainerDied","Data":"6e6b8fbf4ae6061a566aa2bc16d8a1e57b666e72052472e5fa13e7849a49f472"} Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.171763 5129 scope.go:117] "RemoveContainer" containerID="51c82b33e74461e2f5a94b90218d9de10faff7a77fe4d8fc1de8ebd8a4433420" Mar 14 08:42:00 crc kubenswrapper[5129]: E0314 08:42:00.172750 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c82b33e74461e2f5a94b90218d9de10faff7a77fe4d8fc1de8ebd8a4433420\": container with ID starting with 51c82b33e74461e2f5a94b90218d9de10faff7a77fe4d8fc1de8ebd8a4433420 not found: ID does not exist" containerID="51c82b33e74461e2f5a94b90218d9de10faff7a77fe4d8fc1de8ebd8a4433420" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.172809 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c82b33e74461e2f5a94b90218d9de10faff7a77fe4d8fc1de8ebd8a4433420"} err="failed to get container status \"51c82b33e74461e2f5a94b90218d9de10faff7a77fe4d8fc1de8ebd8a4433420\": rpc error: code = NotFound desc = could not find container \"51c82b33e74461e2f5a94b90218d9de10faff7a77fe4d8fc1de8ebd8a4433420\": container with ID starting with 
51c82b33e74461e2f5a94b90218d9de10faff7a77fe4d8fc1de8ebd8a4433420 not found: ID does not exist" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.172842 5129 scope.go:117] "RemoveContainer" containerID="f506de21c31416f318ab417aa250d16f01d98aec8359d3ed720f56f284c2d646" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.263663 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc74d6fb5-9hd54"] Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.269998 5129 scope.go:117] "RemoveContainer" containerID="f506de21c31416f318ab417aa250d16f01d98aec8359d3ed720f56f284c2d646" Mar 14 08:42:00 crc kubenswrapper[5129]: E0314 08:42:00.278865 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f506de21c31416f318ab417aa250d16f01d98aec8359d3ed720f56f284c2d646\": container with ID starting with f506de21c31416f318ab417aa250d16f01d98aec8359d3ed720f56f284c2d646 not found: ID does not exist" containerID="f506de21c31416f318ab417aa250d16f01d98aec8359d3ed720f56f284c2d646" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.278935 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f506de21c31416f318ab417aa250d16f01d98aec8359d3ed720f56f284c2d646"} err="failed to get container status \"f506de21c31416f318ab417aa250d16f01d98aec8359d3ed720f56f284c2d646\": rpc error: code = NotFound desc = could not find container \"f506de21c31416f318ab417aa250d16f01d98aec8359d3ed720f56f284c2d646\": container with ID starting with f506de21c31416f318ab417aa250d16f01d98aec8359d3ed720f56f284c2d646 not found: ID does not exist" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.284868 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bc74d6fb5-9hd54"] Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.294040 5129 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29557962-5zh9s"] Mar 14 08:42:00 crc kubenswrapper[5129]: E0314 08:42:00.294462 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6414662-676e-4c0f-b509-3ce8d1f61ab4" containerName="init" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.294476 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6414662-676e-4c0f-b509-3ce8d1f61ab4" containerName="init" Mar 14 08:42:00 crc kubenswrapper[5129]: E0314 08:42:00.294490 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04fd60cf-8519-49d4-bff2-367a1a0a6667" containerName="init" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.294496 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fd60cf-8519-49d4-bff2-367a1a0a6667" containerName="init" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.294686 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6414662-676e-4c0f-b509-3ce8d1f61ab4" containerName="init" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.294702 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="04fd60cf-8519-49d4-bff2-367a1a0a6667" containerName="init" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.296354 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557962-5zh9s" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.300264 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.300958 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.301157 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.311237 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557962-5zh9s"] Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.367981 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-776d6cccf5-nr64p"] Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.374110 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-776d6cccf5-nr64p"] Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.378487 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" podStartSLOduration=3.275599811 podStartE2EDuration="24.378462967s" podCreationTimestamp="2026-03-14 08:41:36 +0000 UTC" firstStartedPulling="2026-03-14 08:41:37.297287354 +0000 UTC m=+6160.049202538" lastFinishedPulling="2026-03-14 08:41:58.40015051 +0000 UTC m=+6181.152065694" observedRunningTime="2026-03-14 08:42:00.330075289 +0000 UTC m=+6183.081990473" watchObservedRunningTime="2026-03-14 08:42:00.378462967 +0000 UTC m=+6183.130378151" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.382277 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts8dd\" (UniqueName: 
\"kubernetes.io/projected/432b6523-ff6c-48e5-808f-8804f72613b8-kube-api-access-ts8dd\") pod \"auto-csr-approver-29557962-5zh9s\" (UID: \"432b6523-ff6c-48e5-808f-8804f72613b8\") " pod="openshift-infra/auto-csr-approver-29557962-5zh9s" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.483722 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts8dd\" (UniqueName: \"kubernetes.io/projected/432b6523-ff6c-48e5-808f-8804f72613b8-kube-api-access-ts8dd\") pod \"auto-csr-approver-29557962-5zh9s\" (UID: \"432b6523-ff6c-48e5-808f-8804f72613b8\") " pod="openshift-infra/auto-csr-approver-29557962-5zh9s" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.502561 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts8dd\" (UniqueName: \"kubernetes.io/projected/432b6523-ff6c-48e5-808f-8804f72613b8-kube-api-access-ts8dd\") pod \"auto-csr-approver-29557962-5zh9s\" (UID: \"432b6523-ff6c-48e5-808f-8804f72613b8\") " pod="openshift-infra/auto-csr-approver-29557962-5zh9s" Mar 14 08:42:00 crc kubenswrapper[5129]: I0314 08:42:00.638281 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557962-5zh9s" Mar 14 08:42:01 crc kubenswrapper[5129]: I0314 08:42:01.164096 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" event={"ID":"4d4cf759-66a7-4f0f-862a-68522a71bba6","Type":"ContainerStarted","Data":"a06a86b6c2bf9a87fb67c83a6e8a76bb3bc86241b37ff0b9545d5d1c2486e570"} Mar 14 08:42:01 crc kubenswrapper[5129]: I0314 08:42:01.165679 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" Mar 14 08:42:01 crc kubenswrapper[5129]: I0314 08:42:01.167529 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557962-5zh9s"] Mar 14 08:42:01 crc kubenswrapper[5129]: I0314 08:42:01.198662 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" podStartSLOduration=3.868904715 podStartE2EDuration="26.198636462s" podCreationTimestamp="2026-03-14 08:41:35 +0000 UTC" firstStartedPulling="2026-03-14 08:41:36.070730321 +0000 UTC m=+6158.822645505" lastFinishedPulling="2026-03-14 08:41:58.400462058 +0000 UTC m=+6181.152377252" observedRunningTime="2026-03-14 08:42:01.190242045 +0000 UTC m=+6183.942157239" watchObservedRunningTime="2026-03-14 08:42:01.198636462 +0000 UTC m=+6183.950551646" Mar 14 08:42:02 crc kubenswrapper[5129]: I0314 08:42:02.049779 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04fd60cf-8519-49d4-bff2-367a1a0a6667" path="/var/lib/kubelet/pods/04fd60cf-8519-49d4-bff2-367a1a0a6667/volumes" Mar 14 08:42:02 crc kubenswrapper[5129]: I0314 08:42:02.050857 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6414662-676e-4c0f-b509-3ce8d1f61ab4" path="/var/lib/kubelet/pods/f6414662-676e-4c0f-b509-3ce8d1f61ab4/volumes" Mar 14 08:42:02 crc kubenswrapper[5129]: I0314 08:42:02.180957 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29557962-5zh9s" event={"ID":"432b6523-ff6c-48e5-808f-8804f72613b8","Type":"ContainerStarted","Data":"1f465d294668e30f22fe64657d9035351d66caa05e0ca304548ee5d68d918555"} Mar 14 08:42:03 crc kubenswrapper[5129]: I0314 08:42:03.191465 5129 generic.go:334] "Generic (PLEG): container finished" podID="8731d474-6329-4a56-be08-fe3d12bb33cd" containerID="d3e31c1672baf3a973a3927e2830638d9f272c9cf3c851c5f21b093fab1b910d" exitCode=0 Mar 14 08:42:03 crc kubenswrapper[5129]: I0314 08:42:03.191550 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8731d474-6329-4a56-be08-fe3d12bb33cd","Type":"ContainerDied","Data":"d3e31c1672baf3a973a3927e2830638d9f272c9cf3c851c5f21b093fab1b910d"} Mar 14 08:42:03 crc kubenswrapper[5129]: I0314 08:42:03.196870 5129 generic.go:334] "Generic (PLEG): container finished" podID="069305d5-891e-4158-ba2e-9fc30afeadcb" containerID="357a133ce43f85bd3288aa661da139a24a7e08d226012f070434c894ff4ccf48" exitCode=0 Mar 14 08:42:03 crc kubenswrapper[5129]: I0314 08:42:03.196970 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"069305d5-891e-4158-ba2e-9fc30afeadcb","Type":"ContainerDied","Data":"357a133ce43f85bd3288aa661da139a24a7e08d226012f070434c894ff4ccf48"} Mar 14 08:42:03 crc kubenswrapper[5129]: I0314 08:42:03.199130 5129 generic.go:334] "Generic (PLEG): container finished" podID="432b6523-ff6c-48e5-808f-8804f72613b8" containerID="ae85c91f1f75bb4a2d7a9f3181fef3861049cc1dce0e6fa31dd44e32301a6f59" exitCode=0 Mar 14 08:42:03 crc kubenswrapper[5129]: I0314 08:42:03.199172 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557962-5zh9s" event={"ID":"432b6523-ff6c-48e5-808f-8804f72613b8","Type":"ContainerDied","Data":"ae85c91f1f75bb4a2d7a9f3181fef3861049cc1dce0e6fa31dd44e32301a6f59"} Mar 14 08:42:04 crc kubenswrapper[5129]: I0314 08:42:04.214163 5129 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"069305d5-891e-4158-ba2e-9fc30afeadcb","Type":"ContainerStarted","Data":"49e9258341537630e7c381811baa936298b6424bb39709428c035f9034e7e735"} Mar 14 08:42:04 crc kubenswrapper[5129]: I0314 08:42:04.217879 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8731d474-6329-4a56-be08-fe3d12bb33cd","Type":"ContainerStarted","Data":"20e5acfb329f8878860a36d4f51d6133a01201fc457a475385a7568f634aebcf"} Mar 14 08:42:04 crc kubenswrapper[5129]: I0314 08:42:04.251319 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.651048948 podStartE2EDuration="28.251294877s" podCreationTimestamp="2026-03-14 08:41:36 +0000 UTC" firstStartedPulling="2026-03-14 08:41:38.803909518 +0000 UTC m=+6161.555824702" lastFinishedPulling="2026-03-14 08:41:58.404155437 +0000 UTC m=+6181.156070631" observedRunningTime="2026-03-14 08:42:04.242240701 +0000 UTC m=+6186.994155975" watchObservedRunningTime="2026-03-14 08:42:04.251294877 +0000 UTC m=+6187.003210071" Mar 14 08:42:04 crc kubenswrapper[5129]: I0314 08:42:04.279865 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.057340201 podStartE2EDuration="26.279838809s" podCreationTimestamp="2026-03-14 08:41:38 +0000 UTC" firstStartedPulling="2026-03-14 08:41:40.209355457 +0000 UTC m=+6162.961270642" lastFinishedPulling="2026-03-14 08:41:58.431854066 +0000 UTC m=+6181.183769250" observedRunningTime="2026-03-14 08:42:04.271887903 +0000 UTC m=+6187.023803107" watchObservedRunningTime="2026-03-14 08:42:04.279838809 +0000 UTC m=+6187.031754003" Mar 14 08:42:04 crc kubenswrapper[5129]: I0314 08:42:04.583954 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557962-5zh9s" Mar 14 08:42:04 crc kubenswrapper[5129]: I0314 08:42:04.771166 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts8dd\" (UniqueName: \"kubernetes.io/projected/432b6523-ff6c-48e5-808f-8804f72613b8-kube-api-access-ts8dd\") pod \"432b6523-ff6c-48e5-808f-8804f72613b8\" (UID: \"432b6523-ff6c-48e5-808f-8804f72613b8\") " Mar 14 08:42:04 crc kubenswrapper[5129]: I0314 08:42:04.778901 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/432b6523-ff6c-48e5-808f-8804f72613b8-kube-api-access-ts8dd" (OuterVolumeSpecName: "kube-api-access-ts8dd") pod "432b6523-ff6c-48e5-808f-8804f72613b8" (UID: "432b6523-ff6c-48e5-808f-8804f72613b8"). InnerVolumeSpecName "kube-api-access-ts8dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:42:04 crc kubenswrapper[5129]: I0314 08:42:04.873377 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts8dd\" (UniqueName: \"kubernetes.io/projected/432b6523-ff6c-48e5-808f-8804f72613b8-kube-api-access-ts8dd\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:05 crc kubenswrapper[5129]: I0314 08:42:05.232150 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557962-5zh9s" event={"ID":"432b6523-ff6c-48e5-808f-8804f72613b8","Type":"ContainerDied","Data":"1f465d294668e30f22fe64657d9035351d66caa05e0ca304548ee5d68d918555"} Mar 14 08:42:05 crc kubenswrapper[5129]: I0314 08:42:05.232209 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f465d294668e30f22fe64657d9035351d66caa05e0ca304548ee5d68d918555" Mar 14 08:42:05 crc kubenswrapper[5129]: I0314 08:42:05.232267 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557962-5zh9s" Mar 14 08:42:05 crc kubenswrapper[5129]: I0314 08:42:05.443768 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" Mar 14 08:42:05 crc kubenswrapper[5129]: I0314 08:42:05.669938 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557956-zbnbh"] Mar 14 08:42:05 crc kubenswrapper[5129]: I0314 08:42:05.677812 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557956-zbnbh"] Mar 14 08:42:06 crc kubenswrapper[5129]: I0314 08:42:06.051703 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c2f7000-40ff-42ae-90cb-c9af757b3d4f" path="/var/lib/kubelet/pods/9c2f7000-40ff-42ae-90cb-c9af757b3d4f/volumes" Mar 14 08:42:06 crc kubenswrapper[5129]: I0314 08:42:06.761914 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" Mar 14 08:42:06 crc kubenswrapper[5129]: I0314 08:42:06.814276 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b8f4dc9f-29vbm"] Mar 14 08:42:06 crc kubenswrapper[5129]: I0314 08:42:06.814871 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" podUID="4d4cf759-66a7-4f0f-862a-68522a71bba6" containerName="dnsmasq-dns" containerID="cri-o://a06a86b6c2bf9a87fb67c83a6e8a76bb3bc86241b37ff0b9545d5d1c2486e570" gracePeriod=10 Mar 14 08:42:07 crc kubenswrapper[5129]: I0314 08:42:07.283410 5129 generic.go:334] "Generic (PLEG): container finished" podID="4d4cf759-66a7-4f0f-862a-68522a71bba6" containerID="a06a86b6c2bf9a87fb67c83a6e8a76bb3bc86241b37ff0b9545d5d1c2486e570" exitCode=0 Mar 14 08:42:07 crc kubenswrapper[5129]: I0314 08:42:07.283968 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" 
event={"ID":"4d4cf759-66a7-4f0f-862a-68522a71bba6","Type":"ContainerDied","Data":"a06a86b6c2bf9a87fb67c83a6e8a76bb3bc86241b37ff0b9545d5d1c2486e570"} Mar 14 08:42:07 crc kubenswrapper[5129]: I0314 08:42:07.284010 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" event={"ID":"4d4cf759-66a7-4f0f-862a-68522a71bba6","Type":"ContainerDied","Data":"76deffe658a42c4d1dbdb8a5ecbfc314724bfc42e882cc20b638de9da4d57788"} Mar 14 08:42:07 crc kubenswrapper[5129]: I0314 08:42:07.284025 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76deffe658a42c4d1dbdb8a5ecbfc314724bfc42e882cc20b638de9da4d57788" Mar 14 08:42:07 crc kubenswrapper[5129]: I0314 08:42:07.308749 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" Mar 14 08:42:07 crc kubenswrapper[5129]: I0314 08:42:07.418909 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9wpm\" (UniqueName: \"kubernetes.io/projected/4d4cf759-66a7-4f0f-862a-68522a71bba6-kube-api-access-w9wpm\") pod \"4d4cf759-66a7-4f0f-862a-68522a71bba6\" (UID: \"4d4cf759-66a7-4f0f-862a-68522a71bba6\") " Mar 14 08:42:07 crc kubenswrapper[5129]: I0314 08:42:07.418976 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d4cf759-66a7-4f0f-862a-68522a71bba6-config\") pod \"4d4cf759-66a7-4f0f-862a-68522a71bba6\" (UID: \"4d4cf759-66a7-4f0f-862a-68522a71bba6\") " Mar 14 08:42:07 crc kubenswrapper[5129]: I0314 08:42:07.419071 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d4cf759-66a7-4f0f-862a-68522a71bba6-dns-svc\") pod \"4d4cf759-66a7-4f0f-862a-68522a71bba6\" (UID: \"4d4cf759-66a7-4f0f-862a-68522a71bba6\") " Mar 14 08:42:07 crc kubenswrapper[5129]: I0314 08:42:07.425716 5129 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d4cf759-66a7-4f0f-862a-68522a71bba6-kube-api-access-w9wpm" (OuterVolumeSpecName: "kube-api-access-w9wpm") pod "4d4cf759-66a7-4f0f-862a-68522a71bba6" (UID: "4d4cf759-66a7-4f0f-862a-68522a71bba6"). InnerVolumeSpecName "kube-api-access-w9wpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:42:07 crc kubenswrapper[5129]: I0314 08:42:07.462555 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d4cf759-66a7-4f0f-862a-68522a71bba6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4d4cf759-66a7-4f0f-862a-68522a71bba6" (UID: "4d4cf759-66a7-4f0f-862a-68522a71bba6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:42:07 crc kubenswrapper[5129]: I0314 08:42:07.476244 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d4cf759-66a7-4f0f-862a-68522a71bba6-config" (OuterVolumeSpecName: "config") pod "4d4cf759-66a7-4f0f-862a-68522a71bba6" (UID: "4d4cf759-66a7-4f0f-862a-68522a71bba6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:42:07 crc kubenswrapper[5129]: I0314 08:42:07.521448 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9wpm\" (UniqueName: \"kubernetes.io/projected/4d4cf759-66a7-4f0f-862a-68522a71bba6-kube-api-access-w9wpm\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:07 crc kubenswrapper[5129]: I0314 08:42:07.521490 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d4cf759-66a7-4f0f-862a-68522a71bba6-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:07 crc kubenswrapper[5129]: I0314 08:42:07.521502 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d4cf759-66a7-4f0f-862a-68522a71bba6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:08 crc kubenswrapper[5129]: I0314 08:42:08.230540 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 14 08:42:08 crc kubenswrapper[5129]: I0314 08:42:08.231218 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 14 08:42:08 crc kubenswrapper[5129]: I0314 08:42:08.292378 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b8f4dc9f-29vbm" Mar 14 08:42:08 crc kubenswrapper[5129]: I0314 08:42:08.316803 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b8f4dc9f-29vbm"] Mar 14 08:42:08 crc kubenswrapper[5129]: I0314 08:42:08.324779 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9b8f4dc9f-29vbm"] Mar 14 08:42:08 crc kubenswrapper[5129]: I0314 08:42:08.363877 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 14 08:42:08 crc kubenswrapper[5129]: I0314 08:42:08.452421 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 14 08:42:09 crc kubenswrapper[5129]: I0314 08:42:09.570141 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 14 08:42:09 crc kubenswrapper[5129]: I0314 08:42:09.570713 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 14 08:42:10 crc kubenswrapper[5129]: I0314 08:42:10.052910 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d4cf759-66a7-4f0f-862a-68522a71bba6" path="/var/lib/kubelet/pods/4d4cf759-66a7-4f0f-862a-68522a71bba6/volumes" Mar 14 08:42:10 crc kubenswrapper[5129]: I0314 08:42:10.319329 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429","Type":"ContainerStarted","Data":"ab8107ee9a40ea82ed8fe1eaab62d3e69094066d673f06aaaed1689b8ebac768"} Mar 14 08:42:10 crc kubenswrapper[5129]: I0314 08:42:10.319664 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 14 08:42:10 crc kubenswrapper[5129]: I0314 08:42:10.366760 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.830441033 
podStartE2EDuration="31.366724639s" podCreationTimestamp="2026-03-14 08:41:39 +0000 UTC" firstStartedPulling="2026-03-14 08:41:41.694728087 +0000 UTC m=+6164.446643271" lastFinishedPulling="2026-03-14 08:42:09.231011693 +0000 UTC m=+6191.982926877" observedRunningTime="2026-03-14 08:42:10.357094149 +0000 UTC m=+6193.109009363" watchObservedRunningTime="2026-03-14 08:42:10.366724639 +0000 UTC m=+6193.118639853" Mar 14 08:42:10 crc kubenswrapper[5129]: I0314 08:42:10.418144 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 14 08:42:10 crc kubenswrapper[5129]: I0314 08:42:10.530413 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 14 08:42:14 crc kubenswrapper[5129]: I0314 08:42:14.960442 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 14 08:42:16 crc kubenswrapper[5129]: I0314 08:42:16.859941 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-xz5rl"] Mar 14 08:42:16 crc kubenswrapper[5129]: E0314 08:42:16.862423 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d4cf759-66a7-4f0f-862a-68522a71bba6" containerName="dnsmasq-dns" Mar 14 08:42:16 crc kubenswrapper[5129]: I0314 08:42:16.862529 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4cf759-66a7-4f0f-862a-68522a71bba6" containerName="dnsmasq-dns" Mar 14 08:42:16 crc kubenswrapper[5129]: E0314 08:42:16.862759 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d4cf759-66a7-4f0f-862a-68522a71bba6" containerName="init" Mar 14 08:42:16 crc kubenswrapper[5129]: I0314 08:42:16.862856 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4cf759-66a7-4f0f-862a-68522a71bba6" containerName="init" Mar 14 08:42:16 crc kubenswrapper[5129]: E0314 08:42:16.862944 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="432b6523-ff6c-48e5-808f-8804f72613b8" containerName="oc" Mar 14 08:42:16 crc kubenswrapper[5129]: I0314 08:42:16.863024 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="432b6523-ff6c-48e5-808f-8804f72613b8" containerName="oc" Mar 14 08:42:16 crc kubenswrapper[5129]: I0314 08:42:16.863323 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="432b6523-ff6c-48e5-808f-8804f72613b8" containerName="oc" Mar 14 08:42:16 crc kubenswrapper[5129]: I0314 08:42:16.863443 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d4cf759-66a7-4f0f-862a-68522a71bba6" containerName="dnsmasq-dns" Mar 14 08:42:16 crc kubenswrapper[5129]: I0314 08:42:16.864326 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xz5rl" Mar 14 08:42:16 crc kubenswrapper[5129]: I0314 08:42:16.868528 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 14 08:42:16 crc kubenswrapper[5129]: I0314 08:42:16.889351 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xz5rl"] Mar 14 08:42:17 crc kubenswrapper[5129]: I0314 08:42:17.003213 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h9xq\" (UniqueName: \"kubernetes.io/projected/99f1a8a5-0499-40c5-814b-bdac63abf1e3-kube-api-access-4h9xq\") pod \"root-account-create-update-xz5rl\" (UID: \"99f1a8a5-0499-40c5-814b-bdac63abf1e3\") " pod="openstack/root-account-create-update-xz5rl" Mar 14 08:42:17 crc kubenswrapper[5129]: I0314 08:42:17.003410 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99f1a8a5-0499-40c5-814b-bdac63abf1e3-operator-scripts\") pod \"root-account-create-update-xz5rl\" (UID: \"99f1a8a5-0499-40c5-814b-bdac63abf1e3\") " 
pod="openstack/root-account-create-update-xz5rl" Mar 14 08:42:17 crc kubenswrapper[5129]: I0314 08:42:17.104893 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99f1a8a5-0499-40c5-814b-bdac63abf1e3-operator-scripts\") pod \"root-account-create-update-xz5rl\" (UID: \"99f1a8a5-0499-40c5-814b-bdac63abf1e3\") " pod="openstack/root-account-create-update-xz5rl" Mar 14 08:42:17 crc kubenswrapper[5129]: I0314 08:42:17.105001 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h9xq\" (UniqueName: \"kubernetes.io/projected/99f1a8a5-0499-40c5-814b-bdac63abf1e3-kube-api-access-4h9xq\") pod \"root-account-create-update-xz5rl\" (UID: \"99f1a8a5-0499-40c5-814b-bdac63abf1e3\") " pod="openstack/root-account-create-update-xz5rl" Mar 14 08:42:17 crc kubenswrapper[5129]: I0314 08:42:17.107144 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99f1a8a5-0499-40c5-814b-bdac63abf1e3-operator-scripts\") pod \"root-account-create-update-xz5rl\" (UID: \"99f1a8a5-0499-40c5-814b-bdac63abf1e3\") " pod="openstack/root-account-create-update-xz5rl" Mar 14 08:42:17 crc kubenswrapper[5129]: I0314 08:42:17.134569 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h9xq\" (UniqueName: \"kubernetes.io/projected/99f1a8a5-0499-40c5-814b-bdac63abf1e3-kube-api-access-4h9xq\") pod \"root-account-create-update-xz5rl\" (UID: \"99f1a8a5-0499-40c5-814b-bdac63abf1e3\") " pod="openstack/root-account-create-update-xz5rl" Mar 14 08:42:17 crc kubenswrapper[5129]: I0314 08:42:17.206284 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xz5rl" Mar 14 08:42:17 crc kubenswrapper[5129]: I0314 08:42:17.708222 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xz5rl"] Mar 14 08:42:18 crc kubenswrapper[5129]: I0314 08:42:18.399864 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xz5rl" event={"ID":"99f1a8a5-0499-40c5-814b-bdac63abf1e3","Type":"ContainerStarted","Data":"21044b1ce87b2ce3d7134d571d89f31bb93a6c0adc8f3de6f9bdb67aa9a230fd"} Mar 14 08:42:19 crc kubenswrapper[5129]: I0314 08:42:19.411223 5129 generic.go:334] "Generic (PLEG): container finished" podID="99f1a8a5-0499-40c5-814b-bdac63abf1e3" containerID="6f8d25f35db04ee5621c2b38142e20e8755a197450ff9f18bcbf0bc1548892a9" exitCode=0 Mar 14 08:42:19 crc kubenswrapper[5129]: I0314 08:42:19.411296 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xz5rl" event={"ID":"99f1a8a5-0499-40c5-814b-bdac63abf1e3","Type":"ContainerDied","Data":"6f8d25f35db04ee5621c2b38142e20e8755a197450ff9f18bcbf0bc1548892a9"} Mar 14 08:42:19 crc kubenswrapper[5129]: I0314 08:42:19.575220 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:42:19 crc kubenswrapper[5129]: I0314 08:42:19.575832 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:42:20 crc kubenswrapper[5129]: I0314 08:42:20.734824 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xz5rl" Mar 14 08:42:20 crc kubenswrapper[5129]: I0314 08:42:20.874758 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99f1a8a5-0499-40c5-814b-bdac63abf1e3-operator-scripts\") pod \"99f1a8a5-0499-40c5-814b-bdac63abf1e3\" (UID: \"99f1a8a5-0499-40c5-814b-bdac63abf1e3\") " Mar 14 08:42:20 crc kubenswrapper[5129]: I0314 08:42:20.875924 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h9xq\" (UniqueName: \"kubernetes.io/projected/99f1a8a5-0499-40c5-814b-bdac63abf1e3-kube-api-access-4h9xq\") pod \"99f1a8a5-0499-40c5-814b-bdac63abf1e3\" (UID: \"99f1a8a5-0499-40c5-814b-bdac63abf1e3\") " Mar 14 08:42:20 crc kubenswrapper[5129]: I0314 08:42:20.875994 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99f1a8a5-0499-40c5-814b-bdac63abf1e3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "99f1a8a5-0499-40c5-814b-bdac63abf1e3" (UID: "99f1a8a5-0499-40c5-814b-bdac63abf1e3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:42:20 crc kubenswrapper[5129]: I0314 08:42:20.876275 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99f1a8a5-0499-40c5-814b-bdac63abf1e3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:21 crc kubenswrapper[5129]: I0314 08:42:21.267359 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99f1a8a5-0499-40c5-814b-bdac63abf1e3-kube-api-access-4h9xq" (OuterVolumeSpecName: "kube-api-access-4h9xq") pod "99f1a8a5-0499-40c5-814b-bdac63abf1e3" (UID: "99f1a8a5-0499-40c5-814b-bdac63abf1e3"). InnerVolumeSpecName "kube-api-access-4h9xq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:42:21 crc kubenswrapper[5129]: I0314 08:42:21.283724 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h9xq\" (UniqueName: \"kubernetes.io/projected/99f1a8a5-0499-40c5-814b-bdac63abf1e3-kube-api-access-4h9xq\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:21 crc kubenswrapper[5129]: I0314 08:42:21.441624 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xz5rl" event={"ID":"99f1a8a5-0499-40c5-814b-bdac63abf1e3","Type":"ContainerDied","Data":"21044b1ce87b2ce3d7134d571d89f31bb93a6c0adc8f3de6f9bdb67aa9a230fd"} Mar 14 08:42:21 crc kubenswrapper[5129]: I0314 08:42:21.441677 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21044b1ce87b2ce3d7134d571d89f31bb93a6c0adc8f3de6f9bdb67aa9a230fd" Mar 14 08:42:21 crc kubenswrapper[5129]: I0314 08:42:21.441822 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xz5rl" Mar 14 08:42:23 crc kubenswrapper[5129]: I0314 08:42:23.179631 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xz5rl"] Mar 14 08:42:23 crc kubenswrapper[5129]: I0314 08:42:23.191817 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-xz5rl"] Mar 14 08:42:24 crc kubenswrapper[5129]: I0314 08:42:24.052703 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99f1a8a5-0499-40c5-814b-bdac63abf1e3" path="/var/lib/kubelet/pods/99f1a8a5-0499-40c5-814b-bdac63abf1e3/volumes" Mar 14 08:42:26 crc kubenswrapper[5129]: I0314 08:42:26.328220 5129 scope.go:117] "RemoveContainer" containerID="31213297cc39fc3d90f5620eec879eb5663851329922d530791c835db4761ef9" Mar 14 08:42:28 crc kubenswrapper[5129]: I0314 08:42:28.208901 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-5vnvv"] Mar 
14 08:42:28 crc kubenswrapper[5129]: E0314 08:42:28.211124 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f1a8a5-0499-40c5-814b-bdac63abf1e3" containerName="mariadb-account-create-update" Mar 14 08:42:28 crc kubenswrapper[5129]: I0314 08:42:28.211229 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f1a8a5-0499-40c5-814b-bdac63abf1e3" containerName="mariadb-account-create-update" Mar 14 08:42:28 crc kubenswrapper[5129]: I0314 08:42:28.212075 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="99f1a8a5-0499-40c5-814b-bdac63abf1e3" containerName="mariadb-account-create-update" Mar 14 08:42:28 crc kubenswrapper[5129]: I0314 08:42:28.218391 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5vnvv"] Mar 14 08:42:28 crc kubenswrapper[5129]: I0314 08:42:28.218555 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5vnvv" Mar 14 08:42:28 crc kubenswrapper[5129]: I0314 08:42:28.225220 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 14 08:42:28 crc kubenswrapper[5129]: I0314 08:42:28.308135 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlsdw\" (UniqueName: \"kubernetes.io/projected/bbaaa8fb-9971-4abf-aaca-46592954619d-kube-api-access-xlsdw\") pod \"root-account-create-update-5vnvv\" (UID: \"bbaaa8fb-9971-4abf-aaca-46592954619d\") " pod="openstack/root-account-create-update-5vnvv" Mar 14 08:42:28 crc kubenswrapper[5129]: I0314 08:42:28.308281 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbaaa8fb-9971-4abf-aaca-46592954619d-operator-scripts\") pod \"root-account-create-update-5vnvv\" (UID: \"bbaaa8fb-9971-4abf-aaca-46592954619d\") " 
pod="openstack/root-account-create-update-5vnvv" Mar 14 08:42:28 crc kubenswrapper[5129]: I0314 08:42:28.410232 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlsdw\" (UniqueName: \"kubernetes.io/projected/bbaaa8fb-9971-4abf-aaca-46592954619d-kube-api-access-xlsdw\") pod \"root-account-create-update-5vnvv\" (UID: \"bbaaa8fb-9971-4abf-aaca-46592954619d\") " pod="openstack/root-account-create-update-5vnvv" Mar 14 08:42:28 crc kubenswrapper[5129]: I0314 08:42:28.410358 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbaaa8fb-9971-4abf-aaca-46592954619d-operator-scripts\") pod \"root-account-create-update-5vnvv\" (UID: \"bbaaa8fb-9971-4abf-aaca-46592954619d\") " pod="openstack/root-account-create-update-5vnvv" Mar 14 08:42:28 crc kubenswrapper[5129]: I0314 08:42:28.411180 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbaaa8fb-9971-4abf-aaca-46592954619d-operator-scripts\") pod \"root-account-create-update-5vnvv\" (UID: \"bbaaa8fb-9971-4abf-aaca-46592954619d\") " pod="openstack/root-account-create-update-5vnvv" Mar 14 08:42:28 crc kubenswrapper[5129]: I0314 08:42:28.445976 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlsdw\" (UniqueName: \"kubernetes.io/projected/bbaaa8fb-9971-4abf-aaca-46592954619d-kube-api-access-xlsdw\") pod \"root-account-create-update-5vnvv\" (UID: \"bbaaa8fb-9971-4abf-aaca-46592954619d\") " pod="openstack/root-account-create-update-5vnvv" Mar 14 08:42:28 crc kubenswrapper[5129]: I0314 08:42:28.565047 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5vnvv" Mar 14 08:42:28 crc kubenswrapper[5129]: I0314 08:42:28.816455 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5vnvv"] Mar 14 08:42:28 crc kubenswrapper[5129]: W0314 08:42:28.823176 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbaaa8fb_9971_4abf_aaca_46592954619d.slice/crio-0a40b46a514656204a5d447717b44d5b6ddbb93bc449d7f43bc626b9cf641b29 WatchSource:0}: Error finding container 0a40b46a514656204a5d447717b44d5b6ddbb93bc449d7f43bc626b9cf641b29: Status 404 returned error can't find the container with id 0a40b46a514656204a5d447717b44d5b6ddbb93bc449d7f43bc626b9cf641b29 Mar 14 08:42:29 crc kubenswrapper[5129]: I0314 08:42:29.248392 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lqgq6"] Mar 14 08:42:29 crc kubenswrapper[5129]: I0314 08:42:29.252864 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lqgq6" Mar 14 08:42:29 crc kubenswrapper[5129]: I0314 08:42:29.268515 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lqgq6"] Mar 14 08:42:29 crc kubenswrapper[5129]: I0314 08:42:29.326193 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d651c2-c580-4557-b1a3-cb5c318bb466-utilities\") pod \"certified-operators-lqgq6\" (UID: \"30d651c2-c580-4557-b1a3-cb5c318bb466\") " pod="openshift-marketplace/certified-operators-lqgq6" Mar 14 08:42:29 crc kubenswrapper[5129]: I0314 08:42:29.326269 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txj4p\" (UniqueName: \"kubernetes.io/projected/30d651c2-c580-4557-b1a3-cb5c318bb466-kube-api-access-txj4p\") pod \"certified-operators-lqgq6\" (UID: \"30d651c2-c580-4557-b1a3-cb5c318bb466\") " pod="openshift-marketplace/certified-operators-lqgq6" Mar 14 08:42:29 crc kubenswrapper[5129]: I0314 08:42:29.326320 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d651c2-c580-4557-b1a3-cb5c318bb466-catalog-content\") pod \"certified-operators-lqgq6\" (UID: \"30d651c2-c580-4557-b1a3-cb5c318bb466\") " pod="openshift-marketplace/certified-operators-lqgq6" Mar 14 08:42:29 crc kubenswrapper[5129]: I0314 08:42:29.428196 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d651c2-c580-4557-b1a3-cb5c318bb466-utilities\") pod \"certified-operators-lqgq6\" (UID: \"30d651c2-c580-4557-b1a3-cb5c318bb466\") " pod="openshift-marketplace/certified-operators-lqgq6" Mar 14 08:42:29 crc kubenswrapper[5129]: I0314 08:42:29.428291 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-txj4p\" (UniqueName: \"kubernetes.io/projected/30d651c2-c580-4557-b1a3-cb5c318bb466-kube-api-access-txj4p\") pod \"certified-operators-lqgq6\" (UID: \"30d651c2-c580-4557-b1a3-cb5c318bb466\") " pod="openshift-marketplace/certified-operators-lqgq6" Mar 14 08:42:29 crc kubenswrapper[5129]: I0314 08:42:29.428362 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d651c2-c580-4557-b1a3-cb5c318bb466-catalog-content\") pod \"certified-operators-lqgq6\" (UID: \"30d651c2-c580-4557-b1a3-cb5c318bb466\") " pod="openshift-marketplace/certified-operators-lqgq6" Mar 14 08:42:29 crc kubenswrapper[5129]: I0314 08:42:29.428947 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d651c2-c580-4557-b1a3-cb5c318bb466-utilities\") pod \"certified-operators-lqgq6\" (UID: \"30d651c2-c580-4557-b1a3-cb5c318bb466\") " pod="openshift-marketplace/certified-operators-lqgq6" Mar 14 08:42:29 crc kubenswrapper[5129]: I0314 08:42:29.429079 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d651c2-c580-4557-b1a3-cb5c318bb466-catalog-content\") pod \"certified-operators-lqgq6\" (UID: \"30d651c2-c580-4557-b1a3-cb5c318bb466\") " pod="openshift-marketplace/certified-operators-lqgq6" Mar 14 08:42:29 crc kubenswrapper[5129]: I0314 08:42:29.450901 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txj4p\" (UniqueName: \"kubernetes.io/projected/30d651c2-c580-4557-b1a3-cb5c318bb466-kube-api-access-txj4p\") pod \"certified-operators-lqgq6\" (UID: \"30d651c2-c580-4557-b1a3-cb5c318bb466\") " pod="openshift-marketplace/certified-operators-lqgq6" Mar 14 08:42:29 crc kubenswrapper[5129]: I0314 08:42:29.520034 5129 generic.go:334] "Generic (PLEG): container finished" 
podID="bbaaa8fb-9971-4abf-aaca-46592954619d" containerID="8322e2efc2a30956f62de53e5aa643432d764624734a4387ed77f572e4226c06" exitCode=0 Mar 14 08:42:29 crc kubenswrapper[5129]: I0314 08:42:29.520097 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5vnvv" event={"ID":"bbaaa8fb-9971-4abf-aaca-46592954619d","Type":"ContainerDied","Data":"8322e2efc2a30956f62de53e5aa643432d764624734a4387ed77f572e4226c06"} Mar 14 08:42:29 crc kubenswrapper[5129]: I0314 08:42:29.520136 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5vnvv" event={"ID":"bbaaa8fb-9971-4abf-aaca-46592954619d","Type":"ContainerStarted","Data":"0a40b46a514656204a5d447717b44d5b6ddbb93bc449d7f43bc626b9cf641b29"} Mar 14 08:42:29 crc kubenswrapper[5129]: I0314 08:42:29.652653 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lqgq6" Mar 14 08:42:29 crc kubenswrapper[5129]: I0314 08:42:29.911835 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lqgq6"] Mar 14 08:42:30 crc kubenswrapper[5129]: I0314 08:42:30.535267 5129 generic.go:334] "Generic (PLEG): container finished" podID="30d651c2-c580-4557-b1a3-cb5c318bb466" containerID="059054b8d2533dcc983d6a8793cbb3db1c3822c39fa7ec4ddfdd540c50020e27" exitCode=0 Mar 14 08:42:30 crc kubenswrapper[5129]: I0314 08:42:30.535347 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqgq6" event={"ID":"30d651c2-c580-4557-b1a3-cb5c318bb466","Type":"ContainerDied","Data":"059054b8d2533dcc983d6a8793cbb3db1c3822c39fa7ec4ddfdd540c50020e27"} Mar 14 08:42:30 crc kubenswrapper[5129]: I0314 08:42:30.535863 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqgq6" 
event={"ID":"30d651c2-c580-4557-b1a3-cb5c318bb466","Type":"ContainerStarted","Data":"84d146e3261cd2a3335e8cbc1354cee2f0e0e2ca03c144bbd8f26d55648e244a"} Mar 14 08:42:30 crc kubenswrapper[5129]: I0314 08:42:30.876954 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5vnvv" Mar 14 08:42:30 crc kubenswrapper[5129]: I0314 08:42:30.954480 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlsdw\" (UniqueName: \"kubernetes.io/projected/bbaaa8fb-9971-4abf-aaca-46592954619d-kube-api-access-xlsdw\") pod \"bbaaa8fb-9971-4abf-aaca-46592954619d\" (UID: \"bbaaa8fb-9971-4abf-aaca-46592954619d\") " Mar 14 08:42:30 crc kubenswrapper[5129]: I0314 08:42:30.955024 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbaaa8fb-9971-4abf-aaca-46592954619d-operator-scripts\") pod \"bbaaa8fb-9971-4abf-aaca-46592954619d\" (UID: \"bbaaa8fb-9971-4abf-aaca-46592954619d\") " Mar 14 08:42:30 crc kubenswrapper[5129]: I0314 08:42:30.955703 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbaaa8fb-9971-4abf-aaca-46592954619d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bbaaa8fb-9971-4abf-aaca-46592954619d" (UID: "bbaaa8fb-9971-4abf-aaca-46592954619d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:42:30 crc kubenswrapper[5129]: I0314 08:42:30.965673 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbaaa8fb-9971-4abf-aaca-46592954619d-kube-api-access-xlsdw" (OuterVolumeSpecName: "kube-api-access-xlsdw") pod "bbaaa8fb-9971-4abf-aaca-46592954619d" (UID: "bbaaa8fb-9971-4abf-aaca-46592954619d"). InnerVolumeSpecName "kube-api-access-xlsdw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.056411 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlsdw\" (UniqueName: \"kubernetes.io/projected/bbaaa8fb-9971-4abf-aaca-46592954619d-kube-api-access-xlsdw\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.056451 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbaaa8fb-9971-4abf-aaca-46592954619d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.544059 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqgq6" event={"ID":"30d651c2-c580-4557-b1a3-cb5c318bb466","Type":"ContainerStarted","Data":"f638d6c1cad31a96f8be16da9319331fb1fab70ffcbc14b9886b100da9cf06ff"} Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.546114 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5vnvv" event={"ID":"bbaaa8fb-9971-4abf-aaca-46592954619d","Type":"ContainerDied","Data":"0a40b46a514656204a5d447717b44d5b6ddbb93bc449d7f43bc626b9cf641b29"} Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.546138 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a40b46a514656204a5d447717b44d5b6ddbb93bc449d7f43bc626b9cf641b29" Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.546166 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5vnvv" Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.635756 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xkw5w"] Mar 14 08:42:31 crc kubenswrapper[5129]: E0314 08:42:31.636062 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbaaa8fb-9971-4abf-aaca-46592954619d" containerName="mariadb-account-create-update" Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.636075 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbaaa8fb-9971-4abf-aaca-46592954619d" containerName="mariadb-account-create-update" Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.636236 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbaaa8fb-9971-4abf-aaca-46592954619d" containerName="mariadb-account-create-update" Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.637277 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xkw5w" Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.647842 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkw5w"] Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.771943 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05ec3b47-7615-4aca-b4a7-53ead70d59e3-utilities\") pod \"redhat-marketplace-xkw5w\" (UID: \"05ec3b47-7615-4aca-b4a7-53ead70d59e3\") " pod="openshift-marketplace/redhat-marketplace-xkw5w" Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.772044 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmkkn\" (UniqueName: \"kubernetes.io/projected/05ec3b47-7615-4aca-b4a7-53ead70d59e3-kube-api-access-vmkkn\") pod \"redhat-marketplace-xkw5w\" (UID: 
\"05ec3b47-7615-4aca-b4a7-53ead70d59e3\") " pod="openshift-marketplace/redhat-marketplace-xkw5w" Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.772093 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05ec3b47-7615-4aca-b4a7-53ead70d59e3-catalog-content\") pod \"redhat-marketplace-xkw5w\" (UID: \"05ec3b47-7615-4aca-b4a7-53ead70d59e3\") " pod="openshift-marketplace/redhat-marketplace-xkw5w" Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.873490 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05ec3b47-7615-4aca-b4a7-53ead70d59e3-utilities\") pod \"redhat-marketplace-xkw5w\" (UID: \"05ec3b47-7615-4aca-b4a7-53ead70d59e3\") " pod="openshift-marketplace/redhat-marketplace-xkw5w" Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.873988 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmkkn\" (UniqueName: \"kubernetes.io/projected/05ec3b47-7615-4aca-b4a7-53ead70d59e3-kube-api-access-vmkkn\") pod \"redhat-marketplace-xkw5w\" (UID: \"05ec3b47-7615-4aca-b4a7-53ead70d59e3\") " pod="openshift-marketplace/redhat-marketplace-xkw5w" Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.874037 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05ec3b47-7615-4aca-b4a7-53ead70d59e3-catalog-content\") pod \"redhat-marketplace-xkw5w\" (UID: \"05ec3b47-7615-4aca-b4a7-53ead70d59e3\") " pod="openshift-marketplace/redhat-marketplace-xkw5w" Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.874139 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05ec3b47-7615-4aca-b4a7-53ead70d59e3-utilities\") pod \"redhat-marketplace-xkw5w\" (UID: 
\"05ec3b47-7615-4aca-b4a7-53ead70d59e3\") " pod="openshift-marketplace/redhat-marketplace-xkw5w" Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.874834 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05ec3b47-7615-4aca-b4a7-53ead70d59e3-catalog-content\") pod \"redhat-marketplace-xkw5w\" (UID: \"05ec3b47-7615-4aca-b4a7-53ead70d59e3\") " pod="openshift-marketplace/redhat-marketplace-xkw5w" Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.892879 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmkkn\" (UniqueName: \"kubernetes.io/projected/05ec3b47-7615-4aca-b4a7-53ead70d59e3-kube-api-access-vmkkn\") pod \"redhat-marketplace-xkw5w\" (UID: \"05ec3b47-7615-4aca-b4a7-53ead70d59e3\") " pod="openshift-marketplace/redhat-marketplace-xkw5w" Mar 14 08:42:31 crc kubenswrapper[5129]: I0314 08:42:31.953921 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xkw5w" Mar 14 08:42:32 crc kubenswrapper[5129]: I0314 08:42:32.457126 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkw5w"] Mar 14 08:42:32 crc kubenswrapper[5129]: I0314 08:42:32.556835 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkw5w" event={"ID":"05ec3b47-7615-4aca-b4a7-53ead70d59e3","Type":"ContainerStarted","Data":"603a7a94e309d19042f91f2239f46faa676078d38955083afa1ea01f6e34c5d7"} Mar 14 08:42:32 crc kubenswrapper[5129]: I0314 08:42:32.559569 5129 generic.go:334] "Generic (PLEG): container finished" podID="99e335f0-247c-4012-8b7a-67147c2cb39f" containerID="5eac550ec9714ec97390113e2ca91079273673e5e24fb1a6e1ddb29d7a41e5bd" exitCode=0 Mar 14 08:42:32 crc kubenswrapper[5129]: I0314 08:42:32.559742 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"99e335f0-247c-4012-8b7a-67147c2cb39f","Type":"ContainerDied","Data":"5eac550ec9714ec97390113e2ca91079273673e5e24fb1a6e1ddb29d7a41e5bd"} Mar 14 08:42:32 crc kubenswrapper[5129]: I0314 08:42:32.566121 5129 generic.go:334] "Generic (PLEG): container finished" podID="dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" containerID="c06a639f387eab53dbd8c8f59e1311567f11fe4c81bb6c9f02776934dcbced53" exitCode=0 Mar 14 08:42:32 crc kubenswrapper[5129]: I0314 08:42:32.566262 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10","Type":"ContainerDied","Data":"c06a639f387eab53dbd8c8f59e1311567f11fe4c81bb6c9f02776934dcbced53"} Mar 14 08:42:32 crc kubenswrapper[5129]: I0314 08:42:32.580641 5129 generic.go:334] "Generic (PLEG): container finished" podID="30d651c2-c580-4557-b1a3-cb5c318bb466" containerID="f638d6c1cad31a96f8be16da9319331fb1fab70ffcbc14b9886b100da9cf06ff" exitCode=0 Mar 14 08:42:32 crc kubenswrapper[5129]: I0314 08:42:32.581024 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqgq6" event={"ID":"30d651c2-c580-4557-b1a3-cb5c318bb466","Type":"ContainerDied","Data":"f638d6c1cad31a96f8be16da9319331fb1fab70ffcbc14b9886b100da9cf06ff"} Mar 14 08:42:33 crc kubenswrapper[5129]: I0314 08:42:33.589299 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10","Type":"ContainerStarted","Data":"5d1c25deabebc9c9f205bf11195174ba1fa0628265f5aef6e786545b3b27a8f3"} Mar 14 08:42:33 crc kubenswrapper[5129]: I0314 08:42:33.590110 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 14 08:42:33 crc kubenswrapper[5129]: I0314 08:42:33.591416 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqgq6" 
event={"ID":"30d651c2-c580-4557-b1a3-cb5c318bb466","Type":"ContainerStarted","Data":"179db27764b8bb54ba8350f1a68d46e140531214065aa4d885f5caa1a82af950"} Mar 14 08:42:33 crc kubenswrapper[5129]: I0314 08:42:33.592835 5129 generic.go:334] "Generic (PLEG): container finished" podID="05ec3b47-7615-4aca-b4a7-53ead70d59e3" containerID="5adaef0b6a7ea2972171074a0b787c69a655a550370f12c06adc6263bd051f38" exitCode=0 Mar 14 08:42:33 crc kubenswrapper[5129]: I0314 08:42:33.592880 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkw5w" event={"ID":"05ec3b47-7615-4aca-b4a7-53ead70d59e3","Type":"ContainerDied","Data":"5adaef0b6a7ea2972171074a0b787c69a655a550370f12c06adc6263bd051f38"} Mar 14 08:42:33 crc kubenswrapper[5129]: I0314 08:42:33.595242 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"99e335f0-247c-4012-8b7a-67147c2cb39f","Type":"ContainerStarted","Data":"1ee2fc0d44e9a2df72f1c3df7aeaac668d2927da9da58e994e7271ec2a3d275b"} Mar 14 08:42:33 crc kubenswrapper[5129]: I0314 08:42:33.596221 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:42:33 crc kubenswrapper[5129]: I0314 08:42:33.630038 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.799760609 podStartE2EDuration="57.630016937s" podCreationTimestamp="2026-03-14 08:41:36 +0000 UTC" firstStartedPulling="2026-03-14 08:41:38.575063551 +0000 UTC m=+6161.326978735" lastFinishedPulling="2026-03-14 08:41:58.405319869 +0000 UTC m=+6181.157235063" observedRunningTime="2026-03-14 08:42:33.622616397 +0000 UTC m=+6216.374531591" watchObservedRunningTime="2026-03-14 08:42:33.630016937 +0000 UTC m=+6216.381932121" Mar 14 08:42:33 crc kubenswrapper[5129]: I0314 08:42:33.657151 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-lqgq6" podStartSLOduration=2.207881232 podStartE2EDuration="4.657127261s" podCreationTimestamp="2026-03-14 08:42:29 +0000 UTC" firstStartedPulling="2026-03-14 08:42:30.539657804 +0000 UTC m=+6213.291572998" lastFinishedPulling="2026-03-14 08:42:32.988903843 +0000 UTC m=+6215.740819027" observedRunningTime="2026-03-14 08:42:33.652127725 +0000 UTC m=+6216.404042919" watchObservedRunningTime="2026-03-14 08:42:33.657127261 +0000 UTC m=+6216.409042455" Mar 14 08:42:33 crc kubenswrapper[5129]: I0314 08:42:33.705072 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.04453375 podStartE2EDuration="58.705041126s" podCreationTimestamp="2026-03-14 08:41:35 +0000 UTC" firstStartedPulling="2026-03-14 08:41:37.714859503 +0000 UTC m=+6160.466774687" lastFinishedPulling="2026-03-14 08:41:58.375366839 +0000 UTC m=+6181.127282063" observedRunningTime="2026-03-14 08:42:33.701177531 +0000 UTC m=+6216.453092715" watchObservedRunningTime="2026-03-14 08:42:33.705041126 +0000 UTC m=+6216.456956310" Mar 14 08:42:34 crc kubenswrapper[5129]: I0314 08:42:34.606636 5129 generic.go:334] "Generic (PLEG): container finished" podID="05ec3b47-7615-4aca-b4a7-53ead70d59e3" containerID="448bcda5bc7482ab5872c4263888627f7fb17b6f513bf1720b03ae425f8da680" exitCode=0 Mar 14 08:42:34 crc kubenswrapper[5129]: I0314 08:42:34.606744 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkw5w" event={"ID":"05ec3b47-7615-4aca-b4a7-53ead70d59e3","Type":"ContainerDied","Data":"448bcda5bc7482ab5872c4263888627f7fb17b6f513bf1720b03ae425f8da680"} Mar 14 08:42:35 crc kubenswrapper[5129]: I0314 08:42:35.619652 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkw5w" 
event={"ID":"05ec3b47-7615-4aca-b4a7-53ead70d59e3","Type":"ContainerStarted","Data":"72f9b6c13332c3ed4e1bf06b294e058d6c164972e6864984e19f4b1b2a98502d"} Mar 14 08:42:35 crc kubenswrapper[5129]: I0314 08:42:35.658514 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xkw5w" podStartSLOduration=3.228892899 podStartE2EDuration="4.658481231s" podCreationTimestamp="2026-03-14 08:42:31 +0000 UTC" firstStartedPulling="2026-03-14 08:42:33.59388222 +0000 UTC m=+6216.345797404" lastFinishedPulling="2026-03-14 08:42:35.023470552 +0000 UTC m=+6217.775385736" observedRunningTime="2026-03-14 08:42:35.645953862 +0000 UTC m=+6218.397869056" watchObservedRunningTime="2026-03-14 08:42:35.658481231 +0000 UTC m=+6218.410396435" Mar 14 08:42:39 crc kubenswrapper[5129]: I0314 08:42:39.653705 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lqgq6" Mar 14 08:42:39 crc kubenswrapper[5129]: I0314 08:42:39.654248 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lqgq6" Mar 14 08:42:39 crc kubenswrapper[5129]: I0314 08:42:39.719894 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lqgq6" Mar 14 08:42:40 crc kubenswrapper[5129]: I0314 08:42:40.729905 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lqgq6" Mar 14 08:42:40 crc kubenswrapper[5129]: I0314 08:42:40.851246 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lqgq6"] Mar 14 08:42:41 crc kubenswrapper[5129]: I0314 08:42:41.954315 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xkw5w" Mar 14 08:42:41 crc kubenswrapper[5129]: I0314 08:42:41.954369 5129 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xkw5w" Mar 14 08:42:42 crc kubenswrapper[5129]: I0314 08:42:42.028563 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xkw5w" Mar 14 08:42:42 crc kubenswrapper[5129]: I0314 08:42:42.701012 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lqgq6" podUID="30d651c2-c580-4557-b1a3-cb5c318bb466" containerName="registry-server" containerID="cri-o://179db27764b8bb54ba8350f1a68d46e140531214065aa4d885f5caa1a82af950" gracePeriod=2 Mar 14 08:42:42 crc kubenswrapper[5129]: I0314 08:42:42.779971 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xkw5w" Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.163073 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lqgq6" Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.228239 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkw5w"] Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.278137 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d651c2-c580-4557-b1a3-cb5c318bb466-catalog-content\") pod \"30d651c2-c580-4557-b1a3-cb5c318bb466\" (UID: \"30d651c2-c580-4557-b1a3-cb5c318bb466\") " Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.278216 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txj4p\" (UniqueName: \"kubernetes.io/projected/30d651c2-c580-4557-b1a3-cb5c318bb466-kube-api-access-txj4p\") pod \"30d651c2-c580-4557-b1a3-cb5c318bb466\" (UID: \"30d651c2-c580-4557-b1a3-cb5c318bb466\") " Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 
08:42:43.278414 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d651c2-c580-4557-b1a3-cb5c318bb466-utilities\") pod \"30d651c2-c580-4557-b1a3-cb5c318bb466\" (UID: \"30d651c2-c580-4557-b1a3-cb5c318bb466\") " Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.279449 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30d651c2-c580-4557-b1a3-cb5c318bb466-utilities" (OuterVolumeSpecName: "utilities") pod "30d651c2-c580-4557-b1a3-cb5c318bb466" (UID: "30d651c2-c580-4557-b1a3-cb5c318bb466"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.289616 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30d651c2-c580-4557-b1a3-cb5c318bb466-kube-api-access-txj4p" (OuterVolumeSpecName: "kube-api-access-txj4p") pod "30d651c2-c580-4557-b1a3-cb5c318bb466" (UID: "30d651c2-c580-4557-b1a3-cb5c318bb466"). InnerVolumeSpecName "kube-api-access-txj4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.351501 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30d651c2-c580-4557-b1a3-cb5c318bb466-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30d651c2-c580-4557-b1a3-cb5c318bb466" (UID: "30d651c2-c580-4557-b1a3-cb5c318bb466"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.380102 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d651c2-c580-4557-b1a3-cb5c318bb466-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.380150 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d651c2-c580-4557-b1a3-cb5c318bb466-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.380174 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txj4p\" (UniqueName: \"kubernetes.io/projected/30d651c2-c580-4557-b1a3-cb5c318bb466-kube-api-access-txj4p\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.714370 5129 generic.go:334] "Generic (PLEG): container finished" podID="30d651c2-c580-4557-b1a3-cb5c318bb466" containerID="179db27764b8bb54ba8350f1a68d46e140531214065aa4d885f5caa1a82af950" exitCode=0 Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.714467 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lqgq6" Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.714483 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqgq6" event={"ID":"30d651c2-c580-4557-b1a3-cb5c318bb466","Type":"ContainerDied","Data":"179db27764b8bb54ba8350f1a68d46e140531214065aa4d885f5caa1a82af950"} Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.714635 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqgq6" event={"ID":"30d651c2-c580-4557-b1a3-cb5c318bb466","Type":"ContainerDied","Data":"84d146e3261cd2a3335e8cbc1354cee2f0e0e2ca03c144bbd8f26d55648e244a"} Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.714657 5129 scope.go:117] "RemoveContainer" containerID="179db27764b8bb54ba8350f1a68d46e140531214065aa4d885f5caa1a82af950" Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.742673 5129 scope.go:117] "RemoveContainer" containerID="f638d6c1cad31a96f8be16da9319331fb1fab70ffcbc14b9886b100da9cf06ff" Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.763975 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lqgq6"] Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.772980 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lqgq6"] Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.784632 5129 scope.go:117] "RemoveContainer" containerID="059054b8d2533dcc983d6a8793cbb3db1c3822c39fa7ec4ddfdd540c50020e27" Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.825204 5129 scope.go:117] "RemoveContainer" containerID="179db27764b8bb54ba8350f1a68d46e140531214065aa4d885f5caa1a82af950" Mar 14 08:42:43 crc kubenswrapper[5129]: E0314 08:42:43.826042 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"179db27764b8bb54ba8350f1a68d46e140531214065aa4d885f5caa1a82af950\": container with ID starting with 179db27764b8bb54ba8350f1a68d46e140531214065aa4d885f5caa1a82af950 not found: ID does not exist" containerID="179db27764b8bb54ba8350f1a68d46e140531214065aa4d885f5caa1a82af950" Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.826122 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"179db27764b8bb54ba8350f1a68d46e140531214065aa4d885f5caa1a82af950"} err="failed to get container status \"179db27764b8bb54ba8350f1a68d46e140531214065aa4d885f5caa1a82af950\": rpc error: code = NotFound desc = could not find container \"179db27764b8bb54ba8350f1a68d46e140531214065aa4d885f5caa1a82af950\": container with ID starting with 179db27764b8bb54ba8350f1a68d46e140531214065aa4d885f5caa1a82af950 not found: ID does not exist" Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.826164 5129 scope.go:117] "RemoveContainer" containerID="f638d6c1cad31a96f8be16da9319331fb1fab70ffcbc14b9886b100da9cf06ff" Mar 14 08:42:43 crc kubenswrapper[5129]: E0314 08:42:43.826993 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f638d6c1cad31a96f8be16da9319331fb1fab70ffcbc14b9886b100da9cf06ff\": container with ID starting with f638d6c1cad31a96f8be16da9319331fb1fab70ffcbc14b9886b100da9cf06ff not found: ID does not exist" containerID="f638d6c1cad31a96f8be16da9319331fb1fab70ffcbc14b9886b100da9cf06ff" Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.827040 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f638d6c1cad31a96f8be16da9319331fb1fab70ffcbc14b9886b100da9cf06ff"} err="failed to get container status \"f638d6c1cad31a96f8be16da9319331fb1fab70ffcbc14b9886b100da9cf06ff\": rpc error: code = NotFound desc = could not find container \"f638d6c1cad31a96f8be16da9319331fb1fab70ffcbc14b9886b100da9cf06ff\": container with ID 
starting with f638d6c1cad31a96f8be16da9319331fb1fab70ffcbc14b9886b100da9cf06ff not found: ID does not exist" Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.827070 5129 scope.go:117] "RemoveContainer" containerID="059054b8d2533dcc983d6a8793cbb3db1c3822c39fa7ec4ddfdd540c50020e27" Mar 14 08:42:43 crc kubenswrapper[5129]: E0314 08:42:43.827421 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"059054b8d2533dcc983d6a8793cbb3db1c3822c39fa7ec4ddfdd540c50020e27\": container with ID starting with 059054b8d2533dcc983d6a8793cbb3db1c3822c39fa7ec4ddfdd540c50020e27 not found: ID does not exist" containerID="059054b8d2533dcc983d6a8793cbb3db1c3822c39fa7ec4ddfdd540c50020e27" Mar 14 08:42:43 crc kubenswrapper[5129]: I0314 08:42:43.827460 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"059054b8d2533dcc983d6a8793cbb3db1c3822c39fa7ec4ddfdd540c50020e27"} err="failed to get container status \"059054b8d2533dcc983d6a8793cbb3db1c3822c39fa7ec4ddfdd540c50020e27\": rpc error: code = NotFound desc = could not find container \"059054b8d2533dcc983d6a8793cbb3db1c3822c39fa7ec4ddfdd540c50020e27\": container with ID starting with 059054b8d2533dcc983d6a8793cbb3db1c3822c39fa7ec4ddfdd540c50020e27 not found: ID does not exist" Mar 14 08:42:44 crc kubenswrapper[5129]: I0314 08:42:44.054504 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30d651c2-c580-4557-b1a3-cb5c318bb466" path="/var/lib/kubelet/pods/30d651c2-c580-4557-b1a3-cb5c318bb466/volumes" Mar 14 08:42:44 crc kubenswrapper[5129]: I0314 08:42:44.727061 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xkw5w" podUID="05ec3b47-7615-4aca-b4a7-53ead70d59e3" containerName="registry-server" containerID="cri-o://72f9b6c13332c3ed4e1bf06b294e058d6c164972e6864984e19f4b1b2a98502d" gracePeriod=2 Mar 14 08:42:45 crc 
kubenswrapper[5129]: I0314 08:42:45.196414 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xkw5w" Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.321904 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05ec3b47-7615-4aca-b4a7-53ead70d59e3-catalog-content\") pod \"05ec3b47-7615-4aca-b4a7-53ead70d59e3\" (UID: \"05ec3b47-7615-4aca-b4a7-53ead70d59e3\") " Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.322096 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05ec3b47-7615-4aca-b4a7-53ead70d59e3-utilities\") pod \"05ec3b47-7615-4aca-b4a7-53ead70d59e3\" (UID: \"05ec3b47-7615-4aca-b4a7-53ead70d59e3\") " Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.322352 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmkkn\" (UniqueName: \"kubernetes.io/projected/05ec3b47-7615-4aca-b4a7-53ead70d59e3-kube-api-access-vmkkn\") pod \"05ec3b47-7615-4aca-b4a7-53ead70d59e3\" (UID: \"05ec3b47-7615-4aca-b4a7-53ead70d59e3\") " Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.323635 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05ec3b47-7615-4aca-b4a7-53ead70d59e3-utilities" (OuterVolumeSpecName: "utilities") pod "05ec3b47-7615-4aca-b4a7-53ead70d59e3" (UID: "05ec3b47-7615-4aca-b4a7-53ead70d59e3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.333955 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05ec3b47-7615-4aca-b4a7-53ead70d59e3-kube-api-access-vmkkn" (OuterVolumeSpecName: "kube-api-access-vmkkn") pod "05ec3b47-7615-4aca-b4a7-53ead70d59e3" (UID: "05ec3b47-7615-4aca-b4a7-53ead70d59e3"). InnerVolumeSpecName "kube-api-access-vmkkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.365113 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05ec3b47-7615-4aca-b4a7-53ead70d59e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05ec3b47-7615-4aca-b4a7-53ead70d59e3" (UID: "05ec3b47-7615-4aca-b4a7-53ead70d59e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.424344 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmkkn\" (UniqueName: \"kubernetes.io/projected/05ec3b47-7615-4aca-b4a7-53ead70d59e3-kube-api-access-vmkkn\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.424375 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05ec3b47-7615-4aca-b4a7-53ead70d59e3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.424385 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05ec3b47-7615-4aca-b4a7-53ead70d59e3-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.761685 5129 generic.go:334] "Generic (PLEG): container finished" podID="05ec3b47-7615-4aca-b4a7-53ead70d59e3" 
containerID="72f9b6c13332c3ed4e1bf06b294e058d6c164972e6864984e19f4b1b2a98502d" exitCode=0 Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.761758 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkw5w" event={"ID":"05ec3b47-7615-4aca-b4a7-53ead70d59e3","Type":"ContainerDied","Data":"72f9b6c13332c3ed4e1bf06b294e058d6c164972e6864984e19f4b1b2a98502d"} Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.761836 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xkw5w" Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.761884 5129 scope.go:117] "RemoveContainer" containerID="72f9b6c13332c3ed4e1bf06b294e058d6c164972e6864984e19f4b1b2a98502d" Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.761860 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkw5w" event={"ID":"05ec3b47-7615-4aca-b4a7-53ead70d59e3","Type":"ContainerDied","Data":"603a7a94e309d19042f91f2239f46faa676078d38955083afa1ea01f6e34c5d7"} Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.805489 5129 scope.go:117] "RemoveContainer" containerID="448bcda5bc7482ab5872c4263888627f7fb17b6f513bf1720b03ae425f8da680" Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.810346 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkw5w"] Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.818788 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkw5w"] Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.828400 5129 scope.go:117] "RemoveContainer" containerID="5adaef0b6a7ea2972171074a0b787c69a655a550370f12c06adc6263bd051f38" Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.870910 5129 scope.go:117] "RemoveContainer" containerID="72f9b6c13332c3ed4e1bf06b294e058d6c164972e6864984e19f4b1b2a98502d" Mar 14 
08:42:45 crc kubenswrapper[5129]: E0314 08:42:45.872000 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72f9b6c13332c3ed4e1bf06b294e058d6c164972e6864984e19f4b1b2a98502d\": container with ID starting with 72f9b6c13332c3ed4e1bf06b294e058d6c164972e6864984e19f4b1b2a98502d not found: ID does not exist" containerID="72f9b6c13332c3ed4e1bf06b294e058d6c164972e6864984e19f4b1b2a98502d" Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.872047 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f9b6c13332c3ed4e1bf06b294e058d6c164972e6864984e19f4b1b2a98502d"} err="failed to get container status \"72f9b6c13332c3ed4e1bf06b294e058d6c164972e6864984e19f4b1b2a98502d\": rpc error: code = NotFound desc = could not find container \"72f9b6c13332c3ed4e1bf06b294e058d6c164972e6864984e19f4b1b2a98502d\": container with ID starting with 72f9b6c13332c3ed4e1bf06b294e058d6c164972e6864984e19f4b1b2a98502d not found: ID does not exist" Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.872098 5129 scope.go:117] "RemoveContainer" containerID="448bcda5bc7482ab5872c4263888627f7fb17b6f513bf1720b03ae425f8da680" Mar 14 08:42:45 crc kubenswrapper[5129]: E0314 08:42:45.872560 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"448bcda5bc7482ab5872c4263888627f7fb17b6f513bf1720b03ae425f8da680\": container with ID starting with 448bcda5bc7482ab5872c4263888627f7fb17b6f513bf1720b03ae425f8da680 not found: ID does not exist" containerID="448bcda5bc7482ab5872c4263888627f7fb17b6f513bf1720b03ae425f8da680" Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.872634 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"448bcda5bc7482ab5872c4263888627f7fb17b6f513bf1720b03ae425f8da680"} err="failed to get container status 
\"448bcda5bc7482ab5872c4263888627f7fb17b6f513bf1720b03ae425f8da680\": rpc error: code = NotFound desc = could not find container \"448bcda5bc7482ab5872c4263888627f7fb17b6f513bf1720b03ae425f8da680\": container with ID starting with 448bcda5bc7482ab5872c4263888627f7fb17b6f513bf1720b03ae425f8da680 not found: ID does not exist" Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.872669 5129 scope.go:117] "RemoveContainer" containerID="5adaef0b6a7ea2972171074a0b787c69a655a550370f12c06adc6263bd051f38" Mar 14 08:42:45 crc kubenswrapper[5129]: E0314 08:42:45.873094 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5adaef0b6a7ea2972171074a0b787c69a655a550370f12c06adc6263bd051f38\": container with ID starting with 5adaef0b6a7ea2972171074a0b787c69a655a550370f12c06adc6263bd051f38 not found: ID does not exist" containerID="5adaef0b6a7ea2972171074a0b787c69a655a550370f12c06adc6263bd051f38" Mar 14 08:42:45 crc kubenswrapper[5129]: I0314 08:42:45.873154 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5adaef0b6a7ea2972171074a0b787c69a655a550370f12c06adc6263bd051f38"} err="failed to get container status \"5adaef0b6a7ea2972171074a0b787c69a655a550370f12c06adc6263bd051f38\": rpc error: code = NotFound desc = could not find container \"5adaef0b6a7ea2972171074a0b787c69a655a550370f12c06adc6263bd051f38\": container with ID starting with 5adaef0b6a7ea2972171074a0b787c69a655a550370f12c06adc6263bd051f38 not found: ID does not exist" Mar 14 08:42:46 crc kubenswrapper[5129]: I0314 08:42:46.048751 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05ec3b47-7615-4aca-b4a7-53ead70d59e3" path="/var/lib/kubelet/pods/05ec3b47-7615-4aca-b4a7-53ead70d59e3/volumes" Mar 14 08:42:47 crc kubenswrapper[5129]: I0314 08:42:47.084087 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 14 
08:42:47 crc kubenswrapper[5129]: I0314 08:42:47.898878 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 14 08:42:49 crc kubenswrapper[5129]: I0314 08:42:49.574559 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:42:49 crc kubenswrapper[5129]: I0314 08:42:49.575026 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:42:52 crc kubenswrapper[5129]: I0314 08:42:52.881625 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-747f589bdf-44rx6"] Mar 14 08:42:52 crc kubenswrapper[5129]: E0314 08:42:52.882959 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05ec3b47-7615-4aca-b4a7-53ead70d59e3" containerName="registry-server" Mar 14 08:42:52 crc kubenswrapper[5129]: I0314 08:42:52.882980 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="05ec3b47-7615-4aca-b4a7-53ead70d59e3" containerName="registry-server" Mar 14 08:42:52 crc kubenswrapper[5129]: E0314 08:42:52.882991 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05ec3b47-7615-4aca-b4a7-53ead70d59e3" containerName="extract-utilities" Mar 14 08:42:52 crc kubenswrapper[5129]: I0314 08:42:52.883002 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="05ec3b47-7615-4aca-b4a7-53ead70d59e3" containerName="extract-utilities" Mar 14 08:42:52 crc kubenswrapper[5129]: E0314 08:42:52.883019 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="30d651c2-c580-4557-b1a3-cb5c318bb466" containerName="extract-content" Mar 14 08:42:52 crc kubenswrapper[5129]: I0314 08:42:52.883027 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d651c2-c580-4557-b1a3-cb5c318bb466" containerName="extract-content" Mar 14 08:42:52 crc kubenswrapper[5129]: E0314 08:42:52.883040 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d651c2-c580-4557-b1a3-cb5c318bb466" containerName="extract-utilities" Mar 14 08:42:52 crc kubenswrapper[5129]: I0314 08:42:52.883048 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d651c2-c580-4557-b1a3-cb5c318bb466" containerName="extract-utilities" Mar 14 08:42:52 crc kubenswrapper[5129]: E0314 08:42:52.883093 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d651c2-c580-4557-b1a3-cb5c318bb466" containerName="registry-server" Mar 14 08:42:52 crc kubenswrapper[5129]: I0314 08:42:52.883100 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d651c2-c580-4557-b1a3-cb5c318bb466" containerName="registry-server" Mar 14 08:42:52 crc kubenswrapper[5129]: E0314 08:42:52.883110 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05ec3b47-7615-4aca-b4a7-53ead70d59e3" containerName="extract-content" Mar 14 08:42:52 crc kubenswrapper[5129]: I0314 08:42:52.883118 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="05ec3b47-7615-4aca-b4a7-53ead70d59e3" containerName="extract-content" Mar 14 08:42:52 crc kubenswrapper[5129]: I0314 08:42:52.883318 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="30d651c2-c580-4557-b1a3-cb5c318bb466" containerName="registry-server" Mar 14 08:42:52 crc kubenswrapper[5129]: I0314 08:42:52.883329 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="05ec3b47-7615-4aca-b4a7-53ead70d59e3" containerName="registry-server" Mar 14 08:42:52 crc kubenswrapper[5129]: I0314 08:42:52.884514 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-747f589bdf-44rx6" Mar 14 08:42:52 crc kubenswrapper[5129]: I0314 08:42:52.928821 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-747f589bdf-44rx6"] Mar 14 08:42:52 crc kubenswrapper[5129]: I0314 08:42:52.967562 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc-dns-svc\") pod \"dnsmasq-dns-747f589bdf-44rx6\" (UID: \"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc\") " pod="openstack/dnsmasq-dns-747f589bdf-44rx6" Mar 14 08:42:52 crc kubenswrapper[5129]: I0314 08:42:52.968046 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsx7j\" (UniqueName: \"kubernetes.io/projected/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc-kube-api-access-wsx7j\") pod \"dnsmasq-dns-747f589bdf-44rx6\" (UID: \"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc\") " pod="openstack/dnsmasq-dns-747f589bdf-44rx6" Mar 14 08:42:52 crc kubenswrapper[5129]: I0314 08:42:52.968100 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc-config\") pod \"dnsmasq-dns-747f589bdf-44rx6\" (UID: \"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc\") " pod="openstack/dnsmasq-dns-747f589bdf-44rx6" Mar 14 08:42:53 crc kubenswrapper[5129]: I0314 08:42:53.070015 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc-dns-svc\") pod \"dnsmasq-dns-747f589bdf-44rx6\" (UID: \"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc\") " pod="openstack/dnsmasq-dns-747f589bdf-44rx6" Mar 14 08:42:53 crc kubenswrapper[5129]: I0314 08:42:53.070111 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsx7j\" (UniqueName: 
\"kubernetes.io/projected/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc-kube-api-access-wsx7j\") pod \"dnsmasq-dns-747f589bdf-44rx6\" (UID: \"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc\") " pod="openstack/dnsmasq-dns-747f589bdf-44rx6" Mar 14 08:42:53 crc kubenswrapper[5129]: I0314 08:42:53.070137 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc-config\") pod \"dnsmasq-dns-747f589bdf-44rx6\" (UID: \"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc\") " pod="openstack/dnsmasq-dns-747f589bdf-44rx6" Mar 14 08:42:53 crc kubenswrapper[5129]: I0314 08:42:53.071212 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc-dns-svc\") pod \"dnsmasq-dns-747f589bdf-44rx6\" (UID: \"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc\") " pod="openstack/dnsmasq-dns-747f589bdf-44rx6" Mar 14 08:42:53 crc kubenswrapper[5129]: I0314 08:42:53.071209 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc-config\") pod \"dnsmasq-dns-747f589bdf-44rx6\" (UID: \"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc\") " pod="openstack/dnsmasq-dns-747f589bdf-44rx6" Mar 14 08:42:53 crc kubenswrapper[5129]: I0314 08:42:53.096717 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsx7j\" (UniqueName: \"kubernetes.io/projected/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc-kube-api-access-wsx7j\") pod \"dnsmasq-dns-747f589bdf-44rx6\" (UID: \"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc\") " pod="openstack/dnsmasq-dns-747f589bdf-44rx6" Mar 14 08:42:53 crc kubenswrapper[5129]: I0314 08:42:53.212800 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-747f589bdf-44rx6" Mar 14 08:42:54 crc kubenswrapper[5129]: I0314 08:42:53.496652 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-747f589bdf-44rx6"] Mar 14 08:42:54 crc kubenswrapper[5129]: I0314 08:42:53.848232 5129 generic.go:334] "Generic (PLEG): container finished" podID="b2fc83fc-253f-436b-b34c-f8e31aaa9ebc" containerID="4777da02711b53675af955a8a09b74acefa8843e11c53e62343f8e797638e915" exitCode=0 Mar 14 08:42:54 crc kubenswrapper[5129]: I0314 08:42:53.848450 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-747f589bdf-44rx6" event={"ID":"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc","Type":"ContainerDied","Data":"4777da02711b53675af955a8a09b74acefa8843e11c53e62343f8e797638e915"} Mar 14 08:42:54 crc kubenswrapper[5129]: I0314 08:42:53.848644 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-747f589bdf-44rx6" event={"ID":"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc","Type":"ContainerStarted","Data":"c65bf0a6e5c20c398cb7ac2060b57f69583270ae5f22757d8190591fd083bcc2"} Mar 14 08:42:54 crc kubenswrapper[5129]: I0314 08:42:53.882953 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 08:42:54 crc kubenswrapper[5129]: I0314 08:42:54.808798 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 08:42:54 crc kubenswrapper[5129]: I0314 08:42:54.862869 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-747f589bdf-44rx6" event={"ID":"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc","Type":"ContainerStarted","Data":"330b60cd4436a3db2514f18a24a29aaaa77016a838850e8250931a8b47b3c13f"} Mar 14 08:42:54 crc kubenswrapper[5129]: I0314 08:42:54.863394 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-747f589bdf-44rx6" Mar 14 08:42:54 crc kubenswrapper[5129]: I0314 08:42:54.899941 5129 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-747f589bdf-44rx6" podStartSLOduration=2.89991434 podStartE2EDuration="2.89991434s" podCreationTimestamp="2026-03-14 08:42:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:42:54.894188645 +0000 UTC m=+6237.646103839" watchObservedRunningTime="2026-03-14 08:42:54.89991434 +0000 UTC m=+6237.651829524" Mar 14 08:42:58 crc kubenswrapper[5129]: I0314 08:42:58.396991 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" containerName="rabbitmq" containerID="cri-o://5d1c25deabebc9c9f205bf11195174ba1fa0628265f5aef6e786545b3b27a8f3" gracePeriod=604796 Mar 14 08:42:59 crc kubenswrapper[5129]: I0314 08:42:59.879026 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="99e335f0-247c-4012-8b7a-67147c2cb39f" containerName="rabbitmq" containerID="cri-o://1ee2fc0d44e9a2df72f1c3df7aeaac668d2927da9da58e994e7271ec2a3d275b" gracePeriod=604795 Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.215537 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-747f589bdf-44rx6" Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.284851 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65bcbb86c9-jtnfv"] Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.285198 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" podUID="37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f" containerName="dnsmasq-dns" containerID="cri-o://a5d54b4c99a30a48580c96923ec4ebf7858ceaa2f625d36a2ee6e3dc56400d8d" gracePeriod=10 Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.783717 5129 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.883447 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f-config\") pod \"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f\" (UID: \"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f\") " Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.883767 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f-dns-svc\") pod \"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f\" (UID: \"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f\") " Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.883847 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djpjv\" (UniqueName: \"kubernetes.io/projected/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f-kube-api-access-djpjv\") pod \"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f\" (UID: \"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f\") " Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.890942 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f-kube-api-access-djpjv" (OuterVolumeSpecName: "kube-api-access-djpjv") pod "37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f" (UID: "37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f"). InnerVolumeSpecName "kube-api-access-djpjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.947810 5129 generic.go:334] "Generic (PLEG): container finished" podID="37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f" containerID="a5d54b4c99a30a48580c96923ec4ebf7858ceaa2f625d36a2ee6e3dc56400d8d" exitCode=0 Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.947882 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" event={"ID":"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f","Type":"ContainerDied","Data":"a5d54b4c99a30a48580c96923ec4ebf7858ceaa2f625d36a2ee6e3dc56400d8d"} Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.947931 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" event={"ID":"37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f","Type":"ContainerDied","Data":"b0cee4e42a7ea2827e45a6a6fbc17233aa5318cb8d677f73f9d46629d512683d"} Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.947967 5129 scope.go:117] "RemoveContainer" containerID="a5d54b4c99a30a48580c96923ec4ebf7858ceaa2f625d36a2ee6e3dc56400d8d" Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.948313 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65bcbb86c9-jtnfv" Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.949696 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f-config" (OuterVolumeSpecName: "config") pod "37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f" (UID: "37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.956668 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f" (UID: "37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.975090 5129 scope.go:117] "RemoveContainer" containerID="d063a38b2fb0ef937645e407cbeb65b98be3c049476a8493337f140daf77db7f" Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.987209 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.987241 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:03 crc kubenswrapper[5129]: I0314 08:43:03.987255 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djpjv\" (UniqueName: \"kubernetes.io/projected/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f-kube-api-access-djpjv\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:04 crc kubenswrapper[5129]: I0314 08:43:04.005562 5129 scope.go:117] "RemoveContainer" containerID="a5d54b4c99a30a48580c96923ec4ebf7858ceaa2f625d36a2ee6e3dc56400d8d" Mar 14 08:43:04 crc kubenswrapper[5129]: E0314 08:43:04.006184 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5d54b4c99a30a48580c96923ec4ebf7858ceaa2f625d36a2ee6e3dc56400d8d\": container with ID starting with a5d54b4c99a30a48580c96923ec4ebf7858ceaa2f625d36a2ee6e3dc56400d8d not found: ID does 
not exist" containerID="a5d54b4c99a30a48580c96923ec4ebf7858ceaa2f625d36a2ee6e3dc56400d8d" Mar 14 08:43:04 crc kubenswrapper[5129]: I0314 08:43:04.006222 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5d54b4c99a30a48580c96923ec4ebf7858ceaa2f625d36a2ee6e3dc56400d8d"} err="failed to get container status \"a5d54b4c99a30a48580c96923ec4ebf7858ceaa2f625d36a2ee6e3dc56400d8d\": rpc error: code = NotFound desc = could not find container \"a5d54b4c99a30a48580c96923ec4ebf7858ceaa2f625d36a2ee6e3dc56400d8d\": container with ID starting with a5d54b4c99a30a48580c96923ec4ebf7858ceaa2f625d36a2ee6e3dc56400d8d not found: ID does not exist" Mar 14 08:43:04 crc kubenswrapper[5129]: I0314 08:43:04.006248 5129 scope.go:117] "RemoveContainer" containerID="d063a38b2fb0ef937645e407cbeb65b98be3c049476a8493337f140daf77db7f" Mar 14 08:43:04 crc kubenswrapper[5129]: E0314 08:43:04.006576 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d063a38b2fb0ef937645e407cbeb65b98be3c049476a8493337f140daf77db7f\": container with ID starting with d063a38b2fb0ef937645e407cbeb65b98be3c049476a8493337f140daf77db7f not found: ID does not exist" containerID="d063a38b2fb0ef937645e407cbeb65b98be3c049476a8493337f140daf77db7f" Mar 14 08:43:04 crc kubenswrapper[5129]: I0314 08:43:04.006698 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d063a38b2fb0ef937645e407cbeb65b98be3c049476a8493337f140daf77db7f"} err="failed to get container status \"d063a38b2fb0ef937645e407cbeb65b98be3c049476a8493337f140daf77db7f\": rpc error: code = NotFound desc = could not find container \"d063a38b2fb0ef937645e407cbeb65b98be3c049476a8493337f140daf77db7f\": container with ID starting with d063a38b2fb0ef937645e407cbeb65b98be3c049476a8493337f140daf77db7f not found: ID does not exist" Mar 14 08:43:04 crc kubenswrapper[5129]: I0314 08:43:04.286372 5129 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65bcbb86c9-jtnfv"] Mar 14 08:43:04 crc kubenswrapper[5129]: I0314 08:43:04.300640 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65bcbb86c9-jtnfv"] Mar 14 08:43:04 crc kubenswrapper[5129]: I0314 08:43:04.960640 5129 generic.go:334] "Generic (PLEG): container finished" podID="dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" containerID="5d1c25deabebc9c9f205bf11195174ba1fa0628265f5aef6e786545b3b27a8f3" exitCode=0 Mar 14 08:43:04 crc kubenswrapper[5129]: I0314 08:43:04.960786 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10","Type":"ContainerDied","Data":"5d1c25deabebc9c9f205bf11195174ba1fa0628265f5aef6e786545b3b27a8f3"} Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.033428 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.213795 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vzg5\" (UniqueName: \"kubernetes.io/projected/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-kube-api-access-9vzg5\") pod \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.214162 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-plugins\") pod \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.214254 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-plugins-conf\") pod 
\"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.214393 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-erlang-cookie-secret\") pod \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.214470 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-confd\") pod \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.214550 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-pod-info\") pod \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.214663 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-server-conf\") pod \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.214789 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-erlang-cookie\") pod \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.214873 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-config-data\") pod \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.214967 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-tls\") pod \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.215198 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\") pod \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\" (UID: \"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10\") " Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.216267 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" (UID: "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.216971 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" (UID: "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.217649 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" (UID: "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.223926 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" (UID: "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.223124 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-kube-api-access-9vzg5" (OuterVolumeSpecName: "kube-api-access-9vzg5") pod "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" (UID: "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10"). InnerVolumeSpecName "kube-api-access-9vzg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.225301 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" (UID: "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.229243 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-pod-info" (OuterVolumeSpecName: "pod-info") pod "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" (UID: "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.237535 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114" (OuterVolumeSpecName: "persistence") pod "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" (UID: "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10"). InnerVolumeSpecName "pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.239946 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-config-data" (OuterVolumeSpecName: "config-data") pod "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" (UID: "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.274325 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-server-conf" (OuterVolumeSpecName: "server-conf") pod "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" (UID: "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.300214 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" (UID: "dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.316953 5129 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-server-conf\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.317014 5129 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.317029 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.317041 5129 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.317092 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\") on node \"crc\" " Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.317113 5129 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-9vzg5\" (UniqueName: \"kubernetes.io/projected/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-kube-api-access-9vzg5\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.317128 5129 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.317139 5129 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.317150 5129 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.317161 5129 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.317172 5129 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10-pod-info\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.334800 5129 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.334962 5129 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114") on node "crc" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.417704 5129 reconciler_common.go:293] "Volume detached for volume \"pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.973540 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10","Type":"ContainerDied","Data":"522b442f6200f468b2b54ea73223aac767631d8fd155d979d37c3385ddfdf982"} Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.973676 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 08:43:05 crc kubenswrapper[5129]: I0314 08:43:05.973720 5129 scope.go:117] "RemoveContainer" containerID="5d1c25deabebc9c9f205bf11195174ba1fa0628265f5aef6e786545b3b27a8f3" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.022255 5129 scope.go:117] "RemoveContainer" containerID="c06a639f387eab53dbd8c8f59e1311567f11fe4c81bb6c9f02776934dcbced53" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.054727 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f" path="/var/lib/kubelet/pods/37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f/volumes" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.056181 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.056239 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.084304 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 08:43:06 crc kubenswrapper[5129]: E0314 08:43:06.084930 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f" containerName="init" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.084951 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f" containerName="init" Mar 14 08:43:06 crc kubenswrapper[5129]: E0314 08:43:06.084966 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" containerName="setup-container" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.084973 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" containerName="setup-container" Mar 14 08:43:06 crc kubenswrapper[5129]: E0314 08:43:06.084991 5129 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f" containerName="dnsmasq-dns" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.084997 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f" containerName="dnsmasq-dns" Mar 14 08:43:06 crc kubenswrapper[5129]: E0314 08:43:06.085013 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" containerName="rabbitmq" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.085020 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" containerName="rabbitmq" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.085155 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" containerName="rabbitmq" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.085173 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e213e2-0c88-4a1c-aa9d-c8d99bd6c10f" containerName="dnsmasq-dns" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.085957 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.089733 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.090122 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.090253 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.090446 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.090950 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.091435 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-l6pdc" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.097031 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.117200 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.233574 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b433fc5a-ef90-4dc7-9648-f081946560f4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.233647 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsbwg\" (UniqueName: 
\"kubernetes.io/projected/b433fc5a-ef90-4dc7-9648-f081946560f4-kube-api-access-bsbwg\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.233684 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b433fc5a-ef90-4dc7-9648-f081946560f4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.233765 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b433fc5a-ef90-4dc7-9648-f081946560f4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.233820 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b433fc5a-ef90-4dc7-9648-f081946560f4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.233847 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b433fc5a-ef90-4dc7-9648-f081946560f4-config-data\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.233868 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/b433fc5a-ef90-4dc7-9648-f081946560f4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.233912 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.233941 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b433fc5a-ef90-4dc7-9648-f081946560f4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.233964 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b433fc5a-ef90-4dc7-9648-f081946560f4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.233983 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b433fc5a-ef90-4dc7-9648-f081946560f4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.335642 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/b433fc5a-ef90-4dc7-9648-f081946560f4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.335759 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.335819 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b433fc5a-ef90-4dc7-9648-f081946560f4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.335855 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b433fc5a-ef90-4dc7-9648-f081946560f4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.335927 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b433fc5a-ef90-4dc7-9648-f081946560f4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.336523 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b433fc5a-ef90-4dc7-9648-f081946560f4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" 
(UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.336550 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b433fc5a-ef90-4dc7-9648-f081946560f4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.337128 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b433fc5a-ef90-4dc7-9648-f081946560f4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.337589 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsbwg\" (UniqueName: \"kubernetes.io/projected/b433fc5a-ef90-4dc7-9648-f081946560f4-kube-api-access-bsbwg\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.337670 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b433fc5a-ef90-4dc7-9648-f081946560f4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.337704 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b433fc5a-ef90-4dc7-9648-f081946560f4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.337769 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b433fc5a-ef90-4dc7-9648-f081946560f4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.337798 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b433fc5a-ef90-4dc7-9648-f081946560f4-config-data\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.338831 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b433fc5a-ef90-4dc7-9648-f081946560f4-config-data\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.339901 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.339950 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b433fc5a-ef90-4dc7-9648-f081946560f4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.339953 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8772a841facf27c028a2777758a67103ea1c530fdaed26926545981d1b0917a9/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.341039 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b433fc5a-ef90-4dc7-9648-f081946560f4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.343034 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b433fc5a-ef90-4dc7-9648-f081946560f4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.343178 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b433fc5a-ef90-4dc7-9648-f081946560f4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " 
pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.345341 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b433fc5a-ef90-4dc7-9648-f081946560f4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.345836 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b433fc5a-ef90-4dc7-9648-f081946560f4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.378275 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsbwg\" (UniqueName: \"kubernetes.io/projected/b433fc5a-ef90-4dc7-9648-f081946560f4-kube-api-access-bsbwg\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.409203 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be7a5f33-1e0e-414a-ae08-dafd820b1114\") pod \"rabbitmq-server-0\" (UID: \"b433fc5a-ef90-4dc7-9648-f081946560f4\") " pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.470135 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.512405 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.642771 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-confd\") pod \"99e335f0-247c-4012-8b7a-67147c2cb39f\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.643261 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\") pod \"99e335f0-247c-4012-8b7a-67147c2cb39f\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.643294 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/99e335f0-247c-4012-8b7a-67147c2cb39f-server-conf\") pod \"99e335f0-247c-4012-8b7a-67147c2cb39f\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.643349 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-plugins\") pod \"99e335f0-247c-4012-8b7a-67147c2cb39f\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.643382 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/99e335f0-247c-4012-8b7a-67147c2cb39f-plugins-conf\") pod \"99e335f0-247c-4012-8b7a-67147c2cb39f\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.643432 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-erlang-cookie\") pod \"99e335f0-247c-4012-8b7a-67147c2cb39f\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.643485 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/99e335f0-247c-4012-8b7a-67147c2cb39f-erlang-cookie-secret\") pod \"99e335f0-247c-4012-8b7a-67147c2cb39f\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.643573 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rqmk\" (UniqueName: \"kubernetes.io/projected/99e335f0-247c-4012-8b7a-67147c2cb39f-kube-api-access-4rqmk\") pod \"99e335f0-247c-4012-8b7a-67147c2cb39f\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.643593 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99e335f0-247c-4012-8b7a-67147c2cb39f-config-data\") pod \"99e335f0-247c-4012-8b7a-67147c2cb39f\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.643658 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-tls\") pod \"99e335f0-247c-4012-8b7a-67147c2cb39f\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.643682 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/99e335f0-247c-4012-8b7a-67147c2cb39f-pod-info\") pod \"99e335f0-247c-4012-8b7a-67147c2cb39f\" (UID: \"99e335f0-247c-4012-8b7a-67147c2cb39f\") " 
Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.644243 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "99e335f0-247c-4012-8b7a-67147c2cb39f" (UID: "99e335f0-247c-4012-8b7a-67147c2cb39f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.644547 5129 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.644827 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "99e335f0-247c-4012-8b7a-67147c2cb39f" (UID: "99e335f0-247c-4012-8b7a-67147c2cb39f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.645444 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e335f0-247c-4012-8b7a-67147c2cb39f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "99e335f0-247c-4012-8b7a-67147c2cb39f" (UID: "99e335f0-247c-4012-8b7a-67147c2cb39f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.649054 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e335f0-247c-4012-8b7a-67147c2cb39f-kube-api-access-4rqmk" (OuterVolumeSpecName: "kube-api-access-4rqmk") pod "99e335f0-247c-4012-8b7a-67147c2cb39f" (UID: "99e335f0-247c-4012-8b7a-67147c2cb39f"). 
InnerVolumeSpecName "kube-api-access-4rqmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.659675 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99e335f0-247c-4012-8b7a-67147c2cb39f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "99e335f0-247c-4012-8b7a-67147c2cb39f" (UID: "99e335f0-247c-4012-8b7a-67147c2cb39f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.666150 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/99e335f0-247c-4012-8b7a-67147c2cb39f-pod-info" (OuterVolumeSpecName: "pod-info") pod "99e335f0-247c-4012-8b7a-67147c2cb39f" (UID: "99e335f0-247c-4012-8b7a-67147c2cb39f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.667994 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "99e335f0-247c-4012-8b7a-67147c2cb39f" (UID: "99e335f0-247c-4012-8b7a-67147c2cb39f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.686669 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02" (OuterVolumeSpecName: "persistence") pod "99e335f0-247c-4012-8b7a-67147c2cb39f" (UID: "99e335f0-247c-4012-8b7a-67147c2cb39f"). InnerVolumeSpecName "pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.692632 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e335f0-247c-4012-8b7a-67147c2cb39f-config-data" (OuterVolumeSpecName: "config-data") pod "99e335f0-247c-4012-8b7a-67147c2cb39f" (UID: "99e335f0-247c-4012-8b7a-67147c2cb39f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.726997 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e335f0-247c-4012-8b7a-67147c2cb39f-server-conf" (OuterVolumeSpecName: "server-conf") pod "99e335f0-247c-4012-8b7a-67147c2cb39f" (UID: "99e335f0-247c-4012-8b7a-67147c2cb39f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.745771 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rqmk\" (UniqueName: \"kubernetes.io/projected/99e335f0-247c-4012-8b7a-67147c2cb39f-kube-api-access-4rqmk\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.745807 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99e335f0-247c-4012-8b7a-67147c2cb39f-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.745818 5129 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.745830 5129 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/99e335f0-247c-4012-8b7a-67147c2cb39f-pod-info\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:06 
crc kubenswrapper[5129]: I0314 08:43:06.745856 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\") on node \"crc\" " Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.745868 5129 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/99e335f0-247c-4012-8b7a-67147c2cb39f-server-conf\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.745878 5129 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.745888 5129 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/99e335f0-247c-4012-8b7a-67147c2cb39f-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.745897 5129 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/99e335f0-247c-4012-8b7a-67147c2cb39f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.755870 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "99e335f0-247c-4012-8b7a-67147c2cb39f" (UID: "99e335f0-247c-4012-8b7a-67147c2cb39f"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.761031 5129 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping UnmountDevice... Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.761192 5129 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02") on node "crc" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.847883 5129 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/99e335f0-247c-4012-8b7a-67147c2cb39f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.847920 5129 reconciler_common.go:293] "Volume detached for volume \"pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.990306 5129 generic.go:334] "Generic (PLEG): container finished" podID="99e335f0-247c-4012-8b7a-67147c2cb39f" containerID="1ee2fc0d44e9a2df72f1c3df7aeaac668d2927da9da58e994e7271ec2a3d275b" exitCode=0 Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.990429 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.990451 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"99e335f0-247c-4012-8b7a-67147c2cb39f","Type":"ContainerDied","Data":"1ee2fc0d44e9a2df72f1c3df7aeaac668d2927da9da58e994e7271ec2a3d275b"} Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.990539 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"99e335f0-247c-4012-8b7a-67147c2cb39f","Type":"ContainerDied","Data":"05175ec5edd6f3b646f5bf3f4d4fc5066e05be578c504b49ff660248c423b671"} Mar 14 08:43:06 crc kubenswrapper[5129]: I0314 08:43:06.990563 5129 scope.go:117] "RemoveContainer" containerID="1ee2fc0d44e9a2df72f1c3df7aeaac668d2927da9da58e994e7271ec2a3d275b" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.030543 5129 scope.go:117] "RemoveContainer" containerID="5eac550ec9714ec97390113e2ca91079273673e5e24fb1a6e1ddb29d7a41e5bd" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.060002 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.073770 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.086451 5129 scope.go:117] "RemoveContainer" containerID="1ee2fc0d44e9a2df72f1c3df7aeaac668d2927da9da58e994e7271ec2a3d275b" Mar 14 08:43:07 crc kubenswrapper[5129]: W0314 08:43:07.086577 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb433fc5a_ef90_4dc7_9648_f081946560f4.slice/crio-78e521016cac8d1f75061bf1434f2b03a51e46a2d192cc35331ba59c9a270be7 WatchSource:0}: Error finding container 78e521016cac8d1f75061bf1434f2b03a51e46a2d192cc35331ba59c9a270be7: Status 404 returned error can't find the container with 
id 78e521016cac8d1f75061bf1434f2b03a51e46a2d192cc35331ba59c9a270be7 Mar 14 08:43:07 crc kubenswrapper[5129]: E0314 08:43:07.086886 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ee2fc0d44e9a2df72f1c3df7aeaac668d2927da9da58e994e7271ec2a3d275b\": container with ID starting with 1ee2fc0d44e9a2df72f1c3df7aeaac668d2927da9da58e994e7271ec2a3d275b not found: ID does not exist" containerID="1ee2fc0d44e9a2df72f1c3df7aeaac668d2927da9da58e994e7271ec2a3d275b" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.086933 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ee2fc0d44e9a2df72f1c3df7aeaac668d2927da9da58e994e7271ec2a3d275b"} err="failed to get container status \"1ee2fc0d44e9a2df72f1c3df7aeaac668d2927da9da58e994e7271ec2a3d275b\": rpc error: code = NotFound desc = could not find container \"1ee2fc0d44e9a2df72f1c3df7aeaac668d2927da9da58e994e7271ec2a3d275b\": container with ID starting with 1ee2fc0d44e9a2df72f1c3df7aeaac668d2927da9da58e994e7271ec2a3d275b not found: ID does not exist" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.086964 5129 scope.go:117] "RemoveContainer" containerID="5eac550ec9714ec97390113e2ca91079273673e5e24fb1a6e1ddb29d7a41e5bd" Mar 14 08:43:07 crc kubenswrapper[5129]: E0314 08:43:07.087299 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eac550ec9714ec97390113e2ca91079273673e5e24fb1a6e1ddb29d7a41e5bd\": container with ID starting with 5eac550ec9714ec97390113e2ca91079273673e5e24fb1a6e1ddb29d7a41e5bd not found: ID does not exist" containerID="5eac550ec9714ec97390113e2ca91079273673e5e24fb1a6e1ddb29d7a41e5bd" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.087327 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5eac550ec9714ec97390113e2ca91079273673e5e24fb1a6e1ddb29d7a41e5bd"} err="failed to get container status \"5eac550ec9714ec97390113e2ca91079273673e5e24fb1a6e1ddb29d7a41e5bd\": rpc error: code = NotFound desc = could not find container \"5eac550ec9714ec97390113e2ca91079273673e5e24fb1a6e1ddb29d7a41e5bd\": container with ID starting with 5eac550ec9714ec97390113e2ca91079273673e5e24fb1a6e1ddb29d7a41e5bd not found: ID does not exist" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.119081 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.133502 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 08:43:07 crc kubenswrapper[5129]: E0314 08:43:07.134379 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e335f0-247c-4012-8b7a-67147c2cb39f" containerName="setup-container" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.134417 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e335f0-247c-4012-8b7a-67147c2cb39f" containerName="setup-container" Mar 14 08:43:07 crc kubenswrapper[5129]: E0314 08:43:07.134446 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e335f0-247c-4012-8b7a-67147c2cb39f" containerName="rabbitmq" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.134456 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e335f0-247c-4012-8b7a-67147c2cb39f" containerName="rabbitmq" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.134971 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e335f0-247c-4012-8b7a-67147c2cb39f" containerName="rabbitmq" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.137045 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.140297 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.142268 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.142336 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.142497 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.142704 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7gmct" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.143114 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.143285 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.151370 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.258324 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.258388 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.258459 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.258513 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.258554 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x26l2\" (UniqueName: \"kubernetes.io/projected/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-kube-api-access-x26l2\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.258651 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.258685 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.258712 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.258738 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.258797 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.258839 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.360807 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.360857 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.360883 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.360909 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.360934 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x26l2\" (UniqueName: \"kubernetes.io/projected/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-kube-api-access-x26l2\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.360967 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.360987 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.361007 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.361031 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.361068 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.361092 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.361661 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.362344 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.364029 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.364193 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.364520 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.367555 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.368542 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.368645 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/06af7259acb5e8dba4b6de6868139e8986e4b013821acf605ac634f3b027303c/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.370016 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.370028 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc 
kubenswrapper[5129]: I0314 08:43:07.372072 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.383903 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x26l2\" (UniqueName: \"kubernetes.io/projected/1c3e6ea4-bab0-434c-82b1-d5301345b1ac-kube-api-access-x26l2\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.413980 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fc108ee3-c7ea-4d30-823f-01a01382ac02\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c3e6ea4-bab0-434c-82b1-d5301345b1ac\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:07 crc kubenswrapper[5129]: I0314 08:43:07.468437 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:08 crc kubenswrapper[5129]: I0314 08:43:08.014407 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b433fc5a-ef90-4dc7-9648-f081946560f4","Type":"ContainerStarted","Data":"78e521016cac8d1f75061bf1434f2b03a51e46a2d192cc35331ba59c9a270be7"} Mar 14 08:43:08 crc kubenswrapper[5129]: I0314 08:43:08.021242 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 08:43:08 crc kubenswrapper[5129]: W0314 08:43:08.023837 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c3e6ea4_bab0_434c_82b1_d5301345b1ac.slice/crio-0435983c69d0d6bf346a04f0f594df6faeb4e3ca4f86b9d3668eb085619ddfd2 WatchSource:0}: Error finding container 0435983c69d0d6bf346a04f0f594df6faeb4e3ca4f86b9d3668eb085619ddfd2: Status 404 returned error can't find the container with id 0435983c69d0d6bf346a04f0f594df6faeb4e3ca4f86b9d3668eb085619ddfd2 Mar 14 08:43:08 crc kubenswrapper[5129]: I0314 08:43:08.050437 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99e335f0-247c-4012-8b7a-67147c2cb39f" path="/var/lib/kubelet/pods/99e335f0-247c-4012-8b7a-67147c2cb39f/volumes" Mar 14 08:43:08 crc kubenswrapper[5129]: I0314 08:43:08.051545 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10" path="/var/lib/kubelet/pods/dd6c5cb7-d4ca-48dc-88d7-9167e79a6e10/volumes" Mar 14 08:43:09 crc kubenswrapper[5129]: I0314 08:43:09.025114 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b433fc5a-ef90-4dc7-9648-f081946560f4","Type":"ContainerStarted","Data":"446c25524db80cd93daf69185b2bfa953c29c25f0edbb468e2a13c6e708cca84"} Mar 14 08:43:09 crc kubenswrapper[5129]: I0314 08:43:09.027303 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1c3e6ea4-bab0-434c-82b1-d5301345b1ac","Type":"ContainerStarted","Data":"0435983c69d0d6bf346a04f0f594df6faeb4e3ca4f86b9d3668eb085619ddfd2"} Mar 14 08:43:10 crc kubenswrapper[5129]: I0314 08:43:10.054872 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1c3e6ea4-bab0-434c-82b1-d5301345b1ac","Type":"ContainerStarted","Data":"ed8ab4de38d1a110fd5daa8992b0bc89f57aad7743284d118b5ead5371dcb880"} Mar 14 08:43:19 crc kubenswrapper[5129]: I0314 08:43:19.574503 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:43:19 crc kubenswrapper[5129]: I0314 08:43:19.574989 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:43:19 crc kubenswrapper[5129]: I0314 08:43:19.575048 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 08:43:19 crc kubenswrapper[5129]: I0314 08:43:19.575843 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b7dd76c976b1c799fda9092d5895c529ac9cbd574cb01ccbcc80fb6c3c94c49a"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:43:19 crc kubenswrapper[5129]: I0314 08:43:19.575965 5129 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://b7dd76c976b1c799fda9092d5895c529ac9cbd574cb01ccbcc80fb6c3c94c49a" gracePeriod=600 Mar 14 08:43:20 crc kubenswrapper[5129]: I0314 08:43:20.139145 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="b7dd76c976b1c799fda9092d5895c529ac9cbd574cb01ccbcc80fb6c3c94c49a" exitCode=0 Mar 14 08:43:20 crc kubenswrapper[5129]: I0314 08:43:20.139267 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"b7dd76c976b1c799fda9092d5895c529ac9cbd574cb01ccbcc80fb6c3c94c49a"} Mar 14 08:43:20 crc kubenswrapper[5129]: I0314 08:43:20.139870 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20"} Mar 14 08:43:20 crc kubenswrapper[5129]: I0314 08:43:20.139927 5129 scope.go:117] "RemoveContainer" containerID="2111cf5c0b2deef95f6efc6b6bf0b2917b4898f7d499d0bdd20c2f6eeda32cc0" Mar 14 08:43:42 crc kubenswrapper[5129]: I0314 08:43:42.379217 5129 generic.go:334] "Generic (PLEG): container finished" podID="1c3e6ea4-bab0-434c-82b1-d5301345b1ac" containerID="ed8ab4de38d1a110fd5daa8992b0bc89f57aad7743284d118b5ead5371dcb880" exitCode=0 Mar 14 08:43:42 crc kubenswrapper[5129]: I0314 08:43:42.379406 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1c3e6ea4-bab0-434c-82b1-d5301345b1ac","Type":"ContainerDied","Data":"ed8ab4de38d1a110fd5daa8992b0bc89f57aad7743284d118b5ead5371dcb880"} Mar 14 08:43:42 crc kubenswrapper[5129]: 
I0314 08:43:42.382389 5129 generic.go:334] "Generic (PLEG): container finished" podID="b433fc5a-ef90-4dc7-9648-f081946560f4" containerID="446c25524db80cd93daf69185b2bfa953c29c25f0edbb468e2a13c6e708cca84" exitCode=0 Mar 14 08:43:42 crc kubenswrapper[5129]: I0314 08:43:42.382462 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b433fc5a-ef90-4dc7-9648-f081946560f4","Type":"ContainerDied","Data":"446c25524db80cd93daf69185b2bfa953c29c25f0edbb468e2a13c6e708cca84"} Mar 14 08:43:43 crc kubenswrapper[5129]: I0314 08:43:43.390771 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b433fc5a-ef90-4dc7-9648-f081946560f4","Type":"ContainerStarted","Data":"7ee86fc765a673a43d360734eae9d2856bf53b7d80f99aeec389bca3b2ed6040"} Mar 14 08:43:43 crc kubenswrapper[5129]: I0314 08:43:43.391562 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 14 08:43:43 crc kubenswrapper[5129]: I0314 08:43:43.393324 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1c3e6ea4-bab0-434c-82b1-d5301345b1ac","Type":"ContainerStarted","Data":"9e6a4ca85ec77467db9171593211ef5bcac9b6af3030ae536d928f8a7fe82ff1"} Mar 14 08:43:43 crc kubenswrapper[5129]: I0314 08:43:43.393474 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:43 crc kubenswrapper[5129]: I0314 08:43:43.425162 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.425134419 podStartE2EDuration="37.425134419s" podCreationTimestamp="2026-03-14 08:43:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:43:43.417544394 +0000 UTC m=+6286.169459578" watchObservedRunningTime="2026-03-14 08:43:43.425134419 
+0000 UTC m=+6286.177049603" Mar 14 08:43:56 crc kubenswrapper[5129]: I0314 08:43:56.518836 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 14 08:43:56 crc kubenswrapper[5129]: I0314 08:43:56.563037 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=49.562993807 podStartE2EDuration="49.562993807s" podCreationTimestamp="2026-03-14 08:43:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:43:43.448120601 +0000 UTC m=+6286.200035795" watchObservedRunningTime="2026-03-14 08:43:56.562993807 +0000 UTC m=+6299.314909031" Mar 14 08:43:57 crc kubenswrapper[5129]: I0314 08:43:57.472975 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:43:59 crc kubenswrapper[5129]: I0314 08:43:59.856439 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 14 08:43:59 crc kubenswrapper[5129]: I0314 08:43:59.858098 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 14 08:43:59 crc kubenswrapper[5129]: I0314 08:43:59.862728 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jslnd" Mar 14 08:43:59 crc kubenswrapper[5129]: I0314 08:43:59.863217 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhx6z\" (UniqueName: \"kubernetes.io/projected/41edad26-50a8-42b5-84fc-c201855f66ce-kube-api-access-vhx6z\") pod \"mariadb-client\" (UID: \"41edad26-50a8-42b5-84fc-c201855f66ce\") " pod="openstack/mariadb-client" Mar 14 08:43:59 crc kubenswrapper[5129]: I0314 08:43:59.877737 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 14 08:43:59 crc kubenswrapper[5129]: I0314 08:43:59.965368 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhx6z\" (UniqueName: \"kubernetes.io/projected/41edad26-50a8-42b5-84fc-c201855f66ce-kube-api-access-vhx6z\") pod \"mariadb-client\" (UID: \"41edad26-50a8-42b5-84fc-c201855f66ce\") " pod="openstack/mariadb-client" Mar 14 08:43:59 crc kubenswrapper[5129]: I0314 08:43:59.991398 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhx6z\" (UniqueName: \"kubernetes.io/projected/41edad26-50a8-42b5-84fc-c201855f66ce-kube-api-access-vhx6z\") pod \"mariadb-client\" (UID: \"41edad26-50a8-42b5-84fc-c201855f66ce\") " pod="openstack/mariadb-client" Mar 14 08:44:00 crc kubenswrapper[5129]: I0314 08:44:00.156058 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557964-xqqdk"] Mar 14 08:44:00 crc kubenswrapper[5129]: I0314 08:44:00.160528 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557964-xqqdk" Mar 14 08:44:00 crc kubenswrapper[5129]: I0314 08:44:00.163347 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:44:00 crc kubenswrapper[5129]: I0314 08:44:00.163644 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:44:00 crc kubenswrapper[5129]: I0314 08:44:00.163829 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:44:00 crc kubenswrapper[5129]: I0314 08:44:00.169109 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557964-xqqdk"] Mar 14 08:44:00 crc kubenswrapper[5129]: I0314 08:44:00.169512 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv2bs\" (UniqueName: \"kubernetes.io/projected/d0a97190-2a99-46cc-b51b-ce843d0c7dcc-kube-api-access-qv2bs\") pod \"auto-csr-approver-29557964-xqqdk\" (UID: \"d0a97190-2a99-46cc-b51b-ce843d0c7dcc\") " pod="openshift-infra/auto-csr-approver-29557964-xqqdk" Mar 14 08:44:00 crc kubenswrapper[5129]: I0314 08:44:00.202143 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 14 08:44:00 crc kubenswrapper[5129]: I0314 08:44:00.271304 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv2bs\" (UniqueName: \"kubernetes.io/projected/d0a97190-2a99-46cc-b51b-ce843d0c7dcc-kube-api-access-qv2bs\") pod \"auto-csr-approver-29557964-xqqdk\" (UID: \"d0a97190-2a99-46cc-b51b-ce843d0c7dcc\") " pod="openshift-infra/auto-csr-approver-29557964-xqqdk" Mar 14 08:44:00 crc kubenswrapper[5129]: I0314 08:44:00.297193 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv2bs\" (UniqueName: \"kubernetes.io/projected/d0a97190-2a99-46cc-b51b-ce843d0c7dcc-kube-api-access-qv2bs\") pod \"auto-csr-approver-29557964-xqqdk\" (UID: \"d0a97190-2a99-46cc-b51b-ce843d0c7dcc\") " pod="openshift-infra/auto-csr-approver-29557964-xqqdk" Mar 14 08:44:00 crc kubenswrapper[5129]: I0314 08:44:00.508079 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557964-xqqdk" Mar 14 08:44:00 crc kubenswrapper[5129]: I0314 08:44:00.777426 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 14 08:44:00 crc kubenswrapper[5129]: I0314 08:44:00.781738 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:44:01 crc kubenswrapper[5129]: I0314 08:44:01.007695 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557964-xqqdk"] Mar 14 08:44:01 crc kubenswrapper[5129]: W0314 08:44:01.022329 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0a97190_2a99_46cc_b51b_ce843d0c7dcc.slice/crio-67cb433ce4aceaf03402ccc417f22c560ccbfde2c815627db142a65317cb52ed WatchSource:0}: Error finding container 67cb433ce4aceaf03402ccc417f22c560ccbfde2c815627db142a65317cb52ed: Status 404 
returned error can't find the container with id 67cb433ce4aceaf03402ccc417f22c560ccbfde2c815627db142a65317cb52ed Mar 14 08:44:01 crc kubenswrapper[5129]: I0314 08:44:01.575894 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557964-xqqdk" event={"ID":"d0a97190-2a99-46cc-b51b-ce843d0c7dcc","Type":"ContainerStarted","Data":"67cb433ce4aceaf03402ccc417f22c560ccbfde2c815627db142a65317cb52ed"} Mar 14 08:44:01 crc kubenswrapper[5129]: I0314 08:44:01.578151 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"41edad26-50a8-42b5-84fc-c201855f66ce","Type":"ContainerStarted","Data":"8a4885128d39d10741bb9287f04b22e1f07202747e7ce209a743a12aa686112a"} Mar 14 08:44:02 crc kubenswrapper[5129]: I0314 08:44:02.593320 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557964-xqqdk" event={"ID":"d0a97190-2a99-46cc-b51b-ce843d0c7dcc","Type":"ContainerStarted","Data":"b34cb7c778702e7045381aa2e9cacf2d3de65cad2a37e49f600c4611adab8e92"} Mar 14 08:44:02 crc kubenswrapper[5129]: I0314 08:44:02.597908 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"41edad26-50a8-42b5-84fc-c201855f66ce","Type":"ContainerStarted","Data":"c24df162ccd251cef038f2cb86a691564c6df52377c7bb857544d5f6bb8bb97f"} Mar 14 08:44:02 crc kubenswrapper[5129]: I0314 08:44:02.649562 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=3.044387886 podStartE2EDuration="3.649532668s" podCreationTimestamp="2026-03-14 08:43:59 +0000 UTC" firstStartedPulling="2026-03-14 08:44:00.781352688 +0000 UTC m=+6303.533267872" lastFinishedPulling="2026-03-14 08:44:01.38649743 +0000 UTC m=+6304.138412654" observedRunningTime="2026-03-14 08:44:02.637495703 +0000 UTC m=+6305.389410937" watchObservedRunningTime="2026-03-14 08:44:02.649532668 +0000 UTC m=+6305.401447892" Mar 14 08:44:02 crc 
kubenswrapper[5129]: I0314 08:44:02.653645 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557964-xqqdk" podStartSLOduration=1.442098013 podStartE2EDuration="2.653629279s" podCreationTimestamp="2026-03-14 08:44:00 +0000 UTC" firstStartedPulling="2026-03-14 08:44:01.025255583 +0000 UTC m=+6303.777170767" lastFinishedPulling="2026-03-14 08:44:02.236786839 +0000 UTC m=+6304.988702033" observedRunningTime="2026-03-14 08:44:02.619857516 +0000 UTC m=+6305.371772720" watchObservedRunningTime="2026-03-14 08:44:02.653629279 +0000 UTC m=+6305.405544503" Mar 14 08:44:03 crc kubenswrapper[5129]: I0314 08:44:03.610969 5129 generic.go:334] "Generic (PLEG): container finished" podID="d0a97190-2a99-46cc-b51b-ce843d0c7dcc" containerID="b34cb7c778702e7045381aa2e9cacf2d3de65cad2a37e49f600c4611adab8e92" exitCode=0 Mar 14 08:44:03 crc kubenswrapper[5129]: I0314 08:44:03.611058 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557964-xqqdk" event={"ID":"d0a97190-2a99-46cc-b51b-ce843d0c7dcc","Type":"ContainerDied","Data":"b34cb7c778702e7045381aa2e9cacf2d3de65cad2a37e49f600c4611adab8e92"} Mar 14 08:44:04 crc kubenswrapper[5129]: I0314 08:44:04.993941 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557964-xqqdk" Mar 14 08:44:05 crc kubenswrapper[5129]: I0314 08:44:05.092325 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv2bs\" (UniqueName: \"kubernetes.io/projected/d0a97190-2a99-46cc-b51b-ce843d0c7dcc-kube-api-access-qv2bs\") pod \"d0a97190-2a99-46cc-b51b-ce843d0c7dcc\" (UID: \"d0a97190-2a99-46cc-b51b-ce843d0c7dcc\") " Mar 14 08:44:05 crc kubenswrapper[5129]: I0314 08:44:05.099974 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a97190-2a99-46cc-b51b-ce843d0c7dcc-kube-api-access-qv2bs" (OuterVolumeSpecName: "kube-api-access-qv2bs") pod "d0a97190-2a99-46cc-b51b-ce843d0c7dcc" (UID: "d0a97190-2a99-46cc-b51b-ce843d0c7dcc"). InnerVolumeSpecName "kube-api-access-qv2bs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:44:05 crc kubenswrapper[5129]: I0314 08:44:05.196114 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv2bs\" (UniqueName: \"kubernetes.io/projected/d0a97190-2a99-46cc-b51b-ce843d0c7dcc-kube-api-access-qv2bs\") on node \"crc\" DevicePath \"\"" Mar 14 08:44:05 crc kubenswrapper[5129]: I0314 08:44:05.630182 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557964-xqqdk" event={"ID":"d0a97190-2a99-46cc-b51b-ce843d0c7dcc","Type":"ContainerDied","Data":"67cb433ce4aceaf03402ccc417f22c560ccbfde2c815627db142a65317cb52ed"} Mar 14 08:44:05 crc kubenswrapper[5129]: I0314 08:44:05.630243 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67cb433ce4aceaf03402ccc417f22c560ccbfde2c815627db142a65317cb52ed" Mar 14 08:44:05 crc kubenswrapper[5129]: I0314 08:44:05.630279 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557964-xqqdk" Mar 14 08:44:05 crc kubenswrapper[5129]: I0314 08:44:05.694236 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557958-6nrrl"] Mar 14 08:44:05 crc kubenswrapper[5129]: I0314 08:44:05.701678 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557958-6nrrl"] Mar 14 08:44:05 crc kubenswrapper[5129]: E0314 08:44:05.732358 5129 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.113:50146->38.102.83.113:34059: write tcp 38.102.83.113:50146->38.102.83.113:34059: write: connection reset by peer Mar 14 08:44:06 crc kubenswrapper[5129]: I0314 08:44:06.060855 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b28690fd-bf49-4e65-b34f-f45051a34f2c" path="/var/lib/kubelet/pods/b28690fd-bf49-4e65-b34f-f45051a34f2c/volumes" Mar 14 08:44:18 crc kubenswrapper[5129]: I0314 08:44:18.721255 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 14 08:44:18 crc kubenswrapper[5129]: I0314 08:44:18.722174 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="41edad26-50a8-42b5-84fc-c201855f66ce" containerName="mariadb-client" containerID="cri-o://c24df162ccd251cef038f2cb86a691564c6df52377c7bb857544d5f6bb8bb97f" gracePeriod=30 Mar 14 08:44:19 crc kubenswrapper[5129]: I0314 08:44:19.266887 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 14 08:44:19 crc kubenswrapper[5129]: I0314 08:44:19.352368 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhx6z\" (UniqueName: \"kubernetes.io/projected/41edad26-50a8-42b5-84fc-c201855f66ce-kube-api-access-vhx6z\") pod \"41edad26-50a8-42b5-84fc-c201855f66ce\" (UID: \"41edad26-50a8-42b5-84fc-c201855f66ce\") " Mar 14 08:44:19 crc kubenswrapper[5129]: I0314 08:44:19.359314 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41edad26-50a8-42b5-84fc-c201855f66ce-kube-api-access-vhx6z" (OuterVolumeSpecName: "kube-api-access-vhx6z") pod "41edad26-50a8-42b5-84fc-c201855f66ce" (UID: "41edad26-50a8-42b5-84fc-c201855f66ce"). InnerVolumeSpecName "kube-api-access-vhx6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:44:19 crc kubenswrapper[5129]: I0314 08:44:19.454788 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhx6z\" (UniqueName: \"kubernetes.io/projected/41edad26-50a8-42b5-84fc-c201855f66ce-kube-api-access-vhx6z\") on node \"crc\" DevicePath \"\"" Mar 14 08:44:19 crc kubenswrapper[5129]: I0314 08:44:19.794976 5129 generic.go:334] "Generic (PLEG): container finished" podID="41edad26-50a8-42b5-84fc-c201855f66ce" containerID="c24df162ccd251cef038f2cb86a691564c6df52377c7bb857544d5f6bb8bb97f" exitCode=143 Mar 14 08:44:19 crc kubenswrapper[5129]: I0314 08:44:19.795033 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"41edad26-50a8-42b5-84fc-c201855f66ce","Type":"ContainerDied","Data":"c24df162ccd251cef038f2cb86a691564c6df52377c7bb857544d5f6bb8bb97f"} Mar 14 08:44:19 crc kubenswrapper[5129]: I0314 08:44:19.795075 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"41edad26-50a8-42b5-84fc-c201855f66ce","Type":"ContainerDied","Data":"8a4885128d39d10741bb9287f04b22e1f07202747e7ce209a743a12aa686112a"} Mar 14 08:44:19 crc kubenswrapper[5129]: I0314 08:44:19.795085 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 14 08:44:19 crc kubenswrapper[5129]: I0314 08:44:19.795099 5129 scope.go:117] "RemoveContainer" containerID="c24df162ccd251cef038f2cb86a691564c6df52377c7bb857544d5f6bb8bb97f" Mar 14 08:44:19 crc kubenswrapper[5129]: I0314 08:44:19.835903 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 14 08:44:19 crc kubenswrapper[5129]: I0314 08:44:19.837998 5129 scope.go:117] "RemoveContainer" containerID="c24df162ccd251cef038f2cb86a691564c6df52377c7bb857544d5f6bb8bb97f" Mar 14 08:44:19 crc kubenswrapper[5129]: E0314 08:44:19.838695 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c24df162ccd251cef038f2cb86a691564c6df52377c7bb857544d5f6bb8bb97f\": container with ID starting with c24df162ccd251cef038f2cb86a691564c6df52377c7bb857544d5f6bb8bb97f not found: ID does not exist" containerID="c24df162ccd251cef038f2cb86a691564c6df52377c7bb857544d5f6bb8bb97f" Mar 14 08:44:19 crc kubenswrapper[5129]: I0314 08:44:19.838760 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c24df162ccd251cef038f2cb86a691564c6df52377c7bb857544d5f6bb8bb97f"} err="failed to get container status \"c24df162ccd251cef038f2cb86a691564c6df52377c7bb857544d5f6bb8bb97f\": rpc error: code = NotFound desc = could not find container \"c24df162ccd251cef038f2cb86a691564c6df52377c7bb857544d5f6bb8bb97f\": container with ID starting with c24df162ccd251cef038f2cb86a691564c6df52377c7bb857544d5f6bb8bb97f not found: ID does not exist" Mar 14 08:44:19 crc kubenswrapper[5129]: I0314 08:44:19.841160 5129 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/mariadb-client"] Mar 14 08:44:20 crc kubenswrapper[5129]: I0314 08:44:20.046650 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41edad26-50a8-42b5-84fc-c201855f66ce" path="/var/lib/kubelet/pods/41edad26-50a8-42b5-84fc-c201855f66ce/volumes" Mar 14 08:44:26 crc kubenswrapper[5129]: I0314 08:44:26.556060 5129 scope.go:117] "RemoveContainer" containerID="dafa89101ae2bd589c432dd0531910c43f02e300b1ea7b27dcca88f636dec1f0" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.142528 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9"] Mar 14 08:45:00 crc kubenswrapper[5129]: E0314 08:45:00.143569 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a97190-2a99-46cc-b51b-ce843d0c7dcc" containerName="oc" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.143583 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a97190-2a99-46cc-b51b-ce843d0c7dcc" containerName="oc" Mar 14 08:45:00 crc kubenswrapper[5129]: E0314 08:45:00.143615 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41edad26-50a8-42b5-84fc-c201855f66ce" containerName="mariadb-client" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.143621 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="41edad26-50a8-42b5-84fc-c201855f66ce" containerName="mariadb-client" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.143755 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="41edad26-50a8-42b5-84fc-c201855f66ce" containerName="mariadb-client" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.143779 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a97190-2a99-46cc-b51b-ce843d0c7dcc" containerName="oc" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.144289 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.147068 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.147522 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.154939 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9"] Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.271868 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9rz8\" (UniqueName: \"kubernetes.io/projected/210287bb-cc01-4243-ac52-f907bb970aa4-kube-api-access-c9rz8\") pod \"collect-profiles-29557965-g7ld9\" (UID: \"210287bb-cc01-4243-ac52-f907bb970aa4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.271996 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/210287bb-cc01-4243-ac52-f907bb970aa4-config-volume\") pod \"collect-profiles-29557965-g7ld9\" (UID: \"210287bb-cc01-4243-ac52-f907bb970aa4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.272197 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/210287bb-cc01-4243-ac52-f907bb970aa4-secret-volume\") pod \"collect-profiles-29557965-g7ld9\" (UID: \"210287bb-cc01-4243-ac52-f907bb970aa4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.373706 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/210287bb-cc01-4243-ac52-f907bb970aa4-config-volume\") pod \"collect-profiles-29557965-g7ld9\" (UID: \"210287bb-cc01-4243-ac52-f907bb970aa4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.373769 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/210287bb-cc01-4243-ac52-f907bb970aa4-secret-volume\") pod \"collect-profiles-29557965-g7ld9\" (UID: \"210287bb-cc01-4243-ac52-f907bb970aa4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.373824 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9rz8\" (UniqueName: \"kubernetes.io/projected/210287bb-cc01-4243-ac52-f907bb970aa4-kube-api-access-c9rz8\") pod \"collect-profiles-29557965-g7ld9\" (UID: \"210287bb-cc01-4243-ac52-f907bb970aa4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.374645 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/210287bb-cc01-4243-ac52-f907bb970aa4-config-volume\") pod \"collect-profiles-29557965-g7ld9\" (UID: \"210287bb-cc01-4243-ac52-f907bb970aa4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.381325 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/210287bb-cc01-4243-ac52-f907bb970aa4-secret-volume\") pod \"collect-profiles-29557965-g7ld9\" (UID: \"210287bb-cc01-4243-ac52-f907bb970aa4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.399081 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9rz8\" (UniqueName: \"kubernetes.io/projected/210287bb-cc01-4243-ac52-f907bb970aa4-kube-api-access-c9rz8\") pod \"collect-profiles-29557965-g7ld9\" (UID: \"210287bb-cc01-4243-ac52-f907bb970aa4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.460512 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9" Mar 14 08:45:00 crc kubenswrapper[5129]: I0314 08:45:00.928210 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9"] Mar 14 08:45:00 crc kubenswrapper[5129]: W0314 08:45:00.976639 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod210287bb_cc01_4243_ac52_f907bb970aa4.slice/crio-452d8edec303d3834903f96308b5cab6c64ff85d8e28012426cdf5f1004de2c7 WatchSource:0}: Error finding container 452d8edec303d3834903f96308b5cab6c64ff85d8e28012426cdf5f1004de2c7: Status 404 returned error can't find the container with id 452d8edec303d3834903f96308b5cab6c64ff85d8e28012426cdf5f1004de2c7 Mar 14 08:45:01 crc kubenswrapper[5129]: I0314 08:45:01.230487 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9" event={"ID":"210287bb-cc01-4243-ac52-f907bb970aa4","Type":"ContainerStarted","Data":"3b4f9c4100a2d7748487245e7c1632f84d707d6d12c708883f11dfb9f537a6d9"} Mar 14 08:45:01 crc 
kubenswrapper[5129]: I0314 08:45:01.230952 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9" event={"ID":"210287bb-cc01-4243-ac52-f907bb970aa4","Type":"ContainerStarted","Data":"452d8edec303d3834903f96308b5cab6c64ff85d8e28012426cdf5f1004de2c7"} Mar 14 08:45:02 crc kubenswrapper[5129]: I0314 08:45:02.243822 5129 generic.go:334] "Generic (PLEG): container finished" podID="210287bb-cc01-4243-ac52-f907bb970aa4" containerID="3b4f9c4100a2d7748487245e7c1632f84d707d6d12c708883f11dfb9f537a6d9" exitCode=0 Mar 14 08:45:02 crc kubenswrapper[5129]: I0314 08:45:02.243912 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9" event={"ID":"210287bb-cc01-4243-ac52-f907bb970aa4","Type":"ContainerDied","Data":"3b4f9c4100a2d7748487245e7c1632f84d707d6d12c708883f11dfb9f537a6d9"} Mar 14 08:45:03 crc kubenswrapper[5129]: I0314 08:45:03.614147 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9" Mar 14 08:45:03 crc kubenswrapper[5129]: I0314 08:45:03.737970 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9rz8\" (UniqueName: \"kubernetes.io/projected/210287bb-cc01-4243-ac52-f907bb970aa4-kube-api-access-c9rz8\") pod \"210287bb-cc01-4243-ac52-f907bb970aa4\" (UID: \"210287bb-cc01-4243-ac52-f907bb970aa4\") " Mar 14 08:45:03 crc kubenswrapper[5129]: I0314 08:45:03.738118 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/210287bb-cc01-4243-ac52-f907bb970aa4-config-volume\") pod \"210287bb-cc01-4243-ac52-f907bb970aa4\" (UID: \"210287bb-cc01-4243-ac52-f907bb970aa4\") " Mar 14 08:45:03 crc kubenswrapper[5129]: I0314 08:45:03.738160 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/210287bb-cc01-4243-ac52-f907bb970aa4-secret-volume\") pod \"210287bb-cc01-4243-ac52-f907bb970aa4\" (UID: \"210287bb-cc01-4243-ac52-f907bb970aa4\") " Mar 14 08:45:03 crc kubenswrapper[5129]: I0314 08:45:03.738727 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210287bb-cc01-4243-ac52-f907bb970aa4-config-volume" (OuterVolumeSpecName: "config-volume") pod "210287bb-cc01-4243-ac52-f907bb970aa4" (UID: "210287bb-cc01-4243-ac52-f907bb970aa4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:45:03 crc kubenswrapper[5129]: I0314 08:45:03.744340 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210287bb-cc01-4243-ac52-f907bb970aa4-kube-api-access-c9rz8" (OuterVolumeSpecName: "kube-api-access-c9rz8") pod "210287bb-cc01-4243-ac52-f907bb970aa4" (UID: "210287bb-cc01-4243-ac52-f907bb970aa4"). 
InnerVolumeSpecName "kube-api-access-c9rz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:45:03 crc kubenswrapper[5129]: I0314 08:45:03.744555 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210287bb-cc01-4243-ac52-f907bb970aa4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "210287bb-cc01-4243-ac52-f907bb970aa4" (UID: "210287bb-cc01-4243-ac52-f907bb970aa4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:45:03 crc kubenswrapper[5129]: I0314 08:45:03.840902 5129 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/210287bb-cc01-4243-ac52-f907bb970aa4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:45:03 crc kubenswrapper[5129]: I0314 08:45:03.840943 5129 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/210287bb-cc01-4243-ac52-f907bb970aa4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:45:03 crc kubenswrapper[5129]: I0314 08:45:03.840955 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9rz8\" (UniqueName: \"kubernetes.io/projected/210287bb-cc01-4243-ac52-f907bb970aa4-kube-api-access-c9rz8\") on node \"crc\" DevicePath \"\"" Mar 14 08:45:04 crc kubenswrapper[5129]: I0314 08:45:04.265330 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9" event={"ID":"210287bb-cc01-4243-ac52-f907bb970aa4","Type":"ContainerDied","Data":"452d8edec303d3834903f96308b5cab6c64ff85d8e28012426cdf5f1004de2c7"} Mar 14 08:45:04 crc kubenswrapper[5129]: I0314 08:45:04.265384 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="452d8edec303d3834903f96308b5cab6c64ff85d8e28012426cdf5f1004de2c7" Mar 14 08:45:04 crc kubenswrapper[5129]: I0314 08:45:04.265450 5129 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9" Mar 14 08:45:04 crc kubenswrapper[5129]: I0314 08:45:04.337338 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg"] Mar 14 08:45:04 crc kubenswrapper[5129]: I0314 08:45:04.343075 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557920-rrqbg"] Mar 14 08:45:06 crc kubenswrapper[5129]: I0314 08:45:06.051200 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f53960c-a7ff-458b-a85c-bb252e8da404" path="/var/lib/kubelet/pods/6f53960c-a7ff-458b-a85c-bb252e8da404/volumes" Mar 14 08:45:19 crc kubenswrapper[5129]: I0314 08:45:19.574964 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:45:19 crc kubenswrapper[5129]: I0314 08:45:19.575760 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:45:26 crc kubenswrapper[5129]: I0314 08:45:26.694999 5129 scope.go:117] "RemoveContainer" containerID="327635a2f8d665b5ace7520c44a2bab8a326b59cc630f7dec69f389d499e6420" Mar 14 08:45:26 crc kubenswrapper[5129]: I0314 08:45:26.728536 5129 scope.go:117] "RemoveContainer" containerID="648ac160b0e5ef41c7d315fd4639e4deaf049211c1b21000a6726f06f5ede149" Mar 14 08:45:49 crc kubenswrapper[5129]: I0314 08:45:49.575690 5129 patch_prober.go:28] interesting 
pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:45:49 crc kubenswrapper[5129]: I0314 08:45:49.576747 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:46:00 crc kubenswrapper[5129]: I0314 08:46:00.158879 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557966-5z52h"] Mar 14 08:46:00 crc kubenswrapper[5129]: E0314 08:46:00.159983 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="210287bb-cc01-4243-ac52-f907bb970aa4" containerName="collect-profiles" Mar 14 08:46:00 crc kubenswrapper[5129]: I0314 08:46:00.160002 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="210287bb-cc01-4243-ac52-f907bb970aa4" containerName="collect-profiles" Mar 14 08:46:00 crc kubenswrapper[5129]: I0314 08:46:00.160224 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="210287bb-cc01-4243-ac52-f907bb970aa4" containerName="collect-profiles" Mar 14 08:46:00 crc kubenswrapper[5129]: I0314 08:46:00.160972 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557966-5z52h" Mar 14 08:46:00 crc kubenswrapper[5129]: I0314 08:46:00.163289 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:46:00 crc kubenswrapper[5129]: I0314 08:46:00.172918 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:46:00 crc kubenswrapper[5129]: I0314 08:46:00.176309 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:46:00 crc kubenswrapper[5129]: I0314 08:46:00.183784 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557966-5z52h"] Mar 14 08:46:00 crc kubenswrapper[5129]: I0314 08:46:00.304119 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zspl\" (UniqueName: \"kubernetes.io/projected/87de6820-6724-4f06-aa7d-9f19b17168be-kube-api-access-4zspl\") pod \"auto-csr-approver-29557966-5z52h\" (UID: \"87de6820-6724-4f06-aa7d-9f19b17168be\") " pod="openshift-infra/auto-csr-approver-29557966-5z52h" Mar 14 08:46:00 crc kubenswrapper[5129]: I0314 08:46:00.406599 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zspl\" (UniqueName: \"kubernetes.io/projected/87de6820-6724-4f06-aa7d-9f19b17168be-kube-api-access-4zspl\") pod \"auto-csr-approver-29557966-5z52h\" (UID: \"87de6820-6724-4f06-aa7d-9f19b17168be\") " pod="openshift-infra/auto-csr-approver-29557966-5z52h" Mar 14 08:46:00 crc kubenswrapper[5129]: I0314 08:46:00.433386 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zspl\" (UniqueName: \"kubernetes.io/projected/87de6820-6724-4f06-aa7d-9f19b17168be-kube-api-access-4zspl\") pod \"auto-csr-approver-29557966-5z52h\" (UID: \"87de6820-6724-4f06-aa7d-9f19b17168be\") " 
pod="openshift-infra/auto-csr-approver-29557966-5z52h" Mar 14 08:46:00 crc kubenswrapper[5129]: I0314 08:46:00.486787 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557966-5z52h" Mar 14 08:46:00 crc kubenswrapper[5129]: I0314 08:46:00.978757 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557966-5z52h"] Mar 14 08:46:00 crc kubenswrapper[5129]: W0314 08:46:00.984185 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87de6820_6724_4f06_aa7d_9f19b17168be.slice/crio-9d9bce098408ea250ce32efac7e8c1428a86753e9bcaf5ff14ef956b8b7586dc WatchSource:0}: Error finding container 9d9bce098408ea250ce32efac7e8c1428a86753e9bcaf5ff14ef956b8b7586dc: Status 404 returned error can't find the container with id 9d9bce098408ea250ce32efac7e8c1428a86753e9bcaf5ff14ef956b8b7586dc Mar 14 08:46:01 crc kubenswrapper[5129]: I0314 08:46:01.810866 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557966-5z52h" event={"ID":"87de6820-6724-4f06-aa7d-9f19b17168be","Type":"ContainerStarted","Data":"9d9bce098408ea250ce32efac7e8c1428a86753e9bcaf5ff14ef956b8b7586dc"} Mar 14 08:46:02 crc kubenswrapper[5129]: I0314 08:46:02.820286 5129 generic.go:334] "Generic (PLEG): container finished" podID="87de6820-6724-4f06-aa7d-9f19b17168be" containerID="c78c3127d06dbe143fb98bbb25ed91170dfb746c69c0fd486ad21bc6c6b328f8" exitCode=0 Mar 14 08:46:02 crc kubenswrapper[5129]: I0314 08:46:02.820393 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557966-5z52h" event={"ID":"87de6820-6724-4f06-aa7d-9f19b17168be","Type":"ContainerDied","Data":"c78c3127d06dbe143fb98bbb25ed91170dfb746c69c0fd486ad21bc6c6b328f8"} Mar 14 08:46:04 crc kubenswrapper[5129]: I0314 08:46:04.227835 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557966-5z52h" Mar 14 08:46:04 crc kubenswrapper[5129]: I0314 08:46:04.395350 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zspl\" (UniqueName: \"kubernetes.io/projected/87de6820-6724-4f06-aa7d-9f19b17168be-kube-api-access-4zspl\") pod \"87de6820-6724-4f06-aa7d-9f19b17168be\" (UID: \"87de6820-6724-4f06-aa7d-9f19b17168be\") " Mar 14 08:46:04 crc kubenswrapper[5129]: I0314 08:46:04.404716 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87de6820-6724-4f06-aa7d-9f19b17168be-kube-api-access-4zspl" (OuterVolumeSpecName: "kube-api-access-4zspl") pod "87de6820-6724-4f06-aa7d-9f19b17168be" (UID: "87de6820-6724-4f06-aa7d-9f19b17168be"). InnerVolumeSpecName "kube-api-access-4zspl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:46:04 crc kubenswrapper[5129]: I0314 08:46:04.497535 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zspl\" (UniqueName: \"kubernetes.io/projected/87de6820-6724-4f06-aa7d-9f19b17168be-kube-api-access-4zspl\") on node \"crc\" DevicePath \"\"" Mar 14 08:46:04 crc kubenswrapper[5129]: I0314 08:46:04.850059 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557966-5z52h" event={"ID":"87de6820-6724-4f06-aa7d-9f19b17168be","Type":"ContainerDied","Data":"9d9bce098408ea250ce32efac7e8c1428a86753e9bcaf5ff14ef956b8b7586dc"} Mar 14 08:46:04 crc kubenswrapper[5129]: I0314 08:46:04.850111 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d9bce098408ea250ce32efac7e8c1428a86753e9bcaf5ff14ef956b8b7586dc" Mar 14 08:46:04 crc kubenswrapper[5129]: I0314 08:46:04.850192 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557966-5z52h" Mar 14 08:46:05 crc kubenswrapper[5129]: I0314 08:46:05.315847 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557960-kkw5p"] Mar 14 08:46:05 crc kubenswrapper[5129]: I0314 08:46:05.323239 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557960-kkw5p"] Mar 14 08:46:06 crc kubenswrapper[5129]: I0314 08:46:06.056118 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4408d59-d046-40f4-a52d-8c87a07200f8" path="/var/lib/kubelet/pods/c4408d59-d046-40f4-a52d-8c87a07200f8/volumes" Mar 14 08:46:19 crc kubenswrapper[5129]: I0314 08:46:19.575351 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:46:19 crc kubenswrapper[5129]: I0314 08:46:19.576080 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:46:19 crc kubenswrapper[5129]: I0314 08:46:19.576139 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 08:46:19 crc kubenswrapper[5129]: I0314 08:46:19.576759 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:46:19 crc kubenswrapper[5129]: I0314 08:46:19.576808 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" gracePeriod=600 Mar 14 08:46:20 crc kubenswrapper[5129]: I0314 08:46:20.004146 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" exitCode=0 Mar 14 08:46:20 crc kubenswrapper[5129]: I0314 08:46:20.004216 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20"} Mar 14 08:46:20 crc kubenswrapper[5129]: I0314 08:46:20.004260 5129 scope.go:117] "RemoveContainer" containerID="b7dd76c976b1c799fda9092d5895c529ac9cbd574cb01ccbcc80fb6c3c94c49a" Mar 14 08:46:20 crc kubenswrapper[5129]: E0314 08:46:20.394937 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:46:21 crc kubenswrapper[5129]: I0314 08:46:21.024949 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:46:21 crc kubenswrapper[5129]: E0314 08:46:21.026063 5129 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:46:26 crc kubenswrapper[5129]: I0314 08:46:26.802826 5129 scope.go:117] "RemoveContainer" containerID="48c2044f325a7aed9472dbaca322b356ea39d6fa5711d2bb573922ff271520e9" Mar 14 08:46:35 crc kubenswrapper[5129]: I0314 08:46:35.036685 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:46:35 crc kubenswrapper[5129]: E0314 08:46:35.037578 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:46:49 crc kubenswrapper[5129]: I0314 08:46:49.036378 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:46:49 crc kubenswrapper[5129]: E0314 08:46:49.037333 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:47:01 crc kubenswrapper[5129]: I0314 
08:47:01.036554 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:47:01 crc kubenswrapper[5129]: E0314 08:47:01.037901 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:47:16 crc kubenswrapper[5129]: I0314 08:47:16.036488 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:47:16 crc kubenswrapper[5129]: E0314 08:47:16.037740 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:47:30 crc kubenswrapper[5129]: I0314 08:47:30.037245 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:47:30 crc kubenswrapper[5129]: E0314 08:47:30.038486 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:47:44 crc 
kubenswrapper[5129]: I0314 08:47:44.037207 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:47:44 crc kubenswrapper[5129]: E0314 08:47:44.039487 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:47:56 crc kubenswrapper[5129]: I0314 08:47:56.036999 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:47:56 crc kubenswrapper[5129]: E0314 08:47:56.039854 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:48:00 crc kubenswrapper[5129]: I0314 08:48:00.165244 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557968-k958l"] Mar 14 08:48:00 crc kubenswrapper[5129]: E0314 08:48:00.166489 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87de6820-6724-4f06-aa7d-9f19b17168be" containerName="oc" Mar 14 08:48:00 crc kubenswrapper[5129]: I0314 08:48:00.166529 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="87de6820-6724-4f06-aa7d-9f19b17168be" containerName="oc" Mar 14 08:48:00 crc kubenswrapper[5129]: I0314 08:48:00.167016 5129 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="87de6820-6724-4f06-aa7d-9f19b17168be" containerName="oc" Mar 14 08:48:00 crc kubenswrapper[5129]: I0314 08:48:00.168189 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557968-k958l" Mar 14 08:48:00 crc kubenswrapper[5129]: I0314 08:48:00.171436 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:48:00 crc kubenswrapper[5129]: I0314 08:48:00.171638 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:48:00 crc kubenswrapper[5129]: I0314 08:48:00.172000 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:48:00 crc kubenswrapper[5129]: I0314 08:48:00.174794 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557968-k958l"] Mar 14 08:48:00 crc kubenswrapper[5129]: I0314 08:48:00.354152 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkspw\" (UniqueName: \"kubernetes.io/projected/6c6372c0-b042-46c4-848b-c3dc35afc8ba-kube-api-access-vkspw\") pod \"auto-csr-approver-29557968-k958l\" (UID: \"6c6372c0-b042-46c4-848b-c3dc35afc8ba\") " pod="openshift-infra/auto-csr-approver-29557968-k958l" Mar 14 08:48:00 crc kubenswrapper[5129]: I0314 08:48:00.456195 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkspw\" (UniqueName: \"kubernetes.io/projected/6c6372c0-b042-46c4-848b-c3dc35afc8ba-kube-api-access-vkspw\") pod \"auto-csr-approver-29557968-k958l\" (UID: \"6c6372c0-b042-46c4-848b-c3dc35afc8ba\") " pod="openshift-infra/auto-csr-approver-29557968-k958l" Mar 14 08:48:00 crc kubenswrapper[5129]: I0314 08:48:00.483707 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkspw\" (UniqueName: 
\"kubernetes.io/projected/6c6372c0-b042-46c4-848b-c3dc35afc8ba-kube-api-access-vkspw\") pod \"auto-csr-approver-29557968-k958l\" (UID: \"6c6372c0-b042-46c4-848b-c3dc35afc8ba\") " pod="openshift-infra/auto-csr-approver-29557968-k958l" Mar 14 08:48:00 crc kubenswrapper[5129]: I0314 08:48:00.497571 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557968-k958l" Mar 14 08:48:01 crc kubenswrapper[5129]: I0314 08:48:01.008090 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557968-k958l"] Mar 14 08:48:01 crc kubenswrapper[5129]: I0314 08:48:01.059532 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557968-k958l" event={"ID":"6c6372c0-b042-46c4-848b-c3dc35afc8ba","Type":"ContainerStarted","Data":"3533b180541dd1108bc78257920535662d4f7fd765b2d320371e4dd68835ed0d"} Mar 14 08:48:03 crc kubenswrapper[5129]: I0314 08:48:03.094318 5129 generic.go:334] "Generic (PLEG): container finished" podID="6c6372c0-b042-46c4-848b-c3dc35afc8ba" containerID="f7797eb1a99fa91beda198f31bbc4a17b990033593a308b96aae9db21024380d" exitCode=0 Mar 14 08:48:03 crc kubenswrapper[5129]: I0314 08:48:03.094388 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557968-k958l" event={"ID":"6c6372c0-b042-46c4-848b-c3dc35afc8ba","Type":"ContainerDied","Data":"f7797eb1a99fa91beda198f31bbc4a17b990033593a308b96aae9db21024380d"} Mar 14 08:48:04 crc kubenswrapper[5129]: I0314 08:48:04.420421 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557968-k958l" Mar 14 08:48:04 crc kubenswrapper[5129]: I0314 08:48:04.531473 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkspw\" (UniqueName: \"kubernetes.io/projected/6c6372c0-b042-46c4-848b-c3dc35afc8ba-kube-api-access-vkspw\") pod \"6c6372c0-b042-46c4-848b-c3dc35afc8ba\" (UID: \"6c6372c0-b042-46c4-848b-c3dc35afc8ba\") " Mar 14 08:48:04 crc kubenswrapper[5129]: I0314 08:48:04.538343 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c6372c0-b042-46c4-848b-c3dc35afc8ba-kube-api-access-vkspw" (OuterVolumeSpecName: "kube-api-access-vkspw") pod "6c6372c0-b042-46c4-848b-c3dc35afc8ba" (UID: "6c6372c0-b042-46c4-848b-c3dc35afc8ba"). InnerVolumeSpecName "kube-api-access-vkspw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:48:04 crc kubenswrapper[5129]: I0314 08:48:04.633651 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkspw\" (UniqueName: \"kubernetes.io/projected/6c6372c0-b042-46c4-848b-c3dc35afc8ba-kube-api-access-vkspw\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:05 crc kubenswrapper[5129]: I0314 08:48:05.116645 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557968-k958l" event={"ID":"6c6372c0-b042-46c4-848b-c3dc35afc8ba","Type":"ContainerDied","Data":"3533b180541dd1108bc78257920535662d4f7fd765b2d320371e4dd68835ed0d"} Mar 14 08:48:05 crc kubenswrapper[5129]: I0314 08:48:05.116687 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3533b180541dd1108bc78257920535662d4f7fd765b2d320371e4dd68835ed0d" Mar 14 08:48:05 crc kubenswrapper[5129]: I0314 08:48:05.116739 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557968-k958l" Mar 14 08:48:05 crc kubenswrapper[5129]: I0314 08:48:05.494471 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557962-5zh9s"] Mar 14 08:48:05 crc kubenswrapper[5129]: I0314 08:48:05.520967 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557962-5zh9s"] Mar 14 08:48:06 crc kubenswrapper[5129]: I0314 08:48:06.050706 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="432b6523-ff6c-48e5-808f-8804f72613b8" path="/var/lib/kubelet/pods/432b6523-ff6c-48e5-808f-8804f72613b8/volumes" Mar 14 08:48:08 crc kubenswrapper[5129]: I0314 08:48:08.047948 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:48:08 crc kubenswrapper[5129]: E0314 08:48:08.048493 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:48:19 crc kubenswrapper[5129]: I0314 08:48:19.036774 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:48:19 crc kubenswrapper[5129]: E0314 08:48:19.037445 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:48:26 crc kubenswrapper[5129]: I0314 08:48:26.891973 5129 scope.go:117] "RemoveContainer" containerID="6f8d25f35db04ee5621c2b38142e20e8755a197450ff9f18bcbf0bc1548892a9" Mar 14 08:48:26 crc kubenswrapper[5129]: I0314 08:48:26.924444 5129 scope.go:117] "RemoveContainer" containerID="ae85c91f1f75bb4a2d7a9f3181fef3861049cc1dce0e6fa31dd44e32301a6f59" Mar 14 08:48:26 crc kubenswrapper[5129]: I0314 08:48:26.999516 5129 scope.go:117] "RemoveContainer" containerID="a06a86b6c2bf9a87fb67c83a6e8a76bb3bc86241b37ff0b9545d5d1c2486e570" Mar 14 08:48:27 crc kubenswrapper[5129]: I0314 08:48:27.027739 5129 scope.go:117] "RemoveContainer" containerID="46aceb00dcea7201ccb69d49331148ea33899d3634da21f6f09db104b438c6e7" Mar 14 08:48:34 crc kubenswrapper[5129]: I0314 08:48:34.037667 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:48:34 crc kubenswrapper[5129]: E0314 08:48:34.038490 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:48:47 crc kubenswrapper[5129]: I0314 08:48:47.037379 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:48:47 crc kubenswrapper[5129]: E0314 08:48:47.038782 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:48:58 crc kubenswrapper[5129]: I0314 08:48:58.037025 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:48:58 crc kubenswrapper[5129]: E0314 08:48:58.040395 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:49:09 crc kubenswrapper[5129]: I0314 08:49:09.882361 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7bz2g"] Mar 14 08:49:09 crc kubenswrapper[5129]: E0314 08:49:09.886447 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c6372c0-b042-46c4-848b-c3dc35afc8ba" containerName="oc" Mar 14 08:49:09 crc kubenswrapper[5129]: I0314 08:49:09.886487 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c6372c0-b042-46c4-848b-c3dc35afc8ba" containerName="oc" Mar 14 08:49:09 crc kubenswrapper[5129]: I0314 08:49:09.886844 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c6372c0-b042-46c4-848b-c3dc35afc8ba" containerName="oc" Mar 14 08:49:09 crc kubenswrapper[5129]: I0314 08:49:09.888814 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7bz2g" Mar 14 08:49:09 crc kubenswrapper[5129]: I0314 08:49:09.895357 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7bz2g"] Mar 14 08:49:10 crc kubenswrapper[5129]: I0314 08:49:10.037543 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kchvp\" (UniqueName: \"kubernetes.io/projected/ffc0c3e7-ba61-48f0-9481-b10aac1a858e-kube-api-access-kchvp\") pod \"community-operators-7bz2g\" (UID: \"ffc0c3e7-ba61-48f0-9481-b10aac1a858e\") " pod="openshift-marketplace/community-operators-7bz2g" Mar 14 08:49:10 crc kubenswrapper[5129]: I0314 08:49:10.037841 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffc0c3e7-ba61-48f0-9481-b10aac1a858e-catalog-content\") pod \"community-operators-7bz2g\" (UID: \"ffc0c3e7-ba61-48f0-9481-b10aac1a858e\") " pod="openshift-marketplace/community-operators-7bz2g" Mar 14 08:49:10 crc kubenswrapper[5129]: I0314 08:49:10.037876 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffc0c3e7-ba61-48f0-9481-b10aac1a858e-utilities\") pod \"community-operators-7bz2g\" (UID: \"ffc0c3e7-ba61-48f0-9481-b10aac1a858e\") " pod="openshift-marketplace/community-operators-7bz2g" Mar 14 08:49:10 crc kubenswrapper[5129]: I0314 08:49:10.139392 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kchvp\" (UniqueName: \"kubernetes.io/projected/ffc0c3e7-ba61-48f0-9481-b10aac1a858e-kube-api-access-kchvp\") pod \"community-operators-7bz2g\" (UID: \"ffc0c3e7-ba61-48f0-9481-b10aac1a858e\") " pod="openshift-marketplace/community-operators-7bz2g" Mar 14 08:49:10 crc kubenswrapper[5129]: I0314 08:49:10.139520 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffc0c3e7-ba61-48f0-9481-b10aac1a858e-catalog-content\") pod \"community-operators-7bz2g\" (UID: \"ffc0c3e7-ba61-48f0-9481-b10aac1a858e\") " pod="openshift-marketplace/community-operators-7bz2g" Mar 14 08:49:10 crc kubenswrapper[5129]: I0314 08:49:10.139541 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffc0c3e7-ba61-48f0-9481-b10aac1a858e-utilities\") pod \"community-operators-7bz2g\" (UID: \"ffc0c3e7-ba61-48f0-9481-b10aac1a858e\") " pod="openshift-marketplace/community-operators-7bz2g" Mar 14 08:49:10 crc kubenswrapper[5129]: I0314 08:49:10.140069 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffc0c3e7-ba61-48f0-9481-b10aac1a858e-utilities\") pod \"community-operators-7bz2g\" (UID: \"ffc0c3e7-ba61-48f0-9481-b10aac1a858e\") " pod="openshift-marketplace/community-operators-7bz2g" Mar 14 08:49:10 crc kubenswrapper[5129]: I0314 08:49:10.140406 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffc0c3e7-ba61-48f0-9481-b10aac1a858e-catalog-content\") pod \"community-operators-7bz2g\" (UID: \"ffc0c3e7-ba61-48f0-9481-b10aac1a858e\") " pod="openshift-marketplace/community-operators-7bz2g" Mar 14 08:49:10 crc kubenswrapper[5129]: I0314 08:49:10.184317 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kchvp\" (UniqueName: \"kubernetes.io/projected/ffc0c3e7-ba61-48f0-9481-b10aac1a858e-kube-api-access-kchvp\") pod \"community-operators-7bz2g\" (UID: \"ffc0c3e7-ba61-48f0-9481-b10aac1a858e\") " pod="openshift-marketplace/community-operators-7bz2g" Mar 14 08:49:10 crc kubenswrapper[5129]: I0314 08:49:10.219681 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7bz2g" Mar 14 08:49:10 crc kubenswrapper[5129]: I0314 08:49:10.593742 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7bz2g"] Mar 14 08:49:10 crc kubenswrapper[5129]: I0314 08:49:10.786914 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bz2g" event={"ID":"ffc0c3e7-ba61-48f0-9481-b10aac1a858e","Type":"ContainerStarted","Data":"bce725c54b1d3326a70b25ba95d9086bc0f82d4bc13a65bc625a740cb7b4a75a"} Mar 14 08:49:10 crc kubenswrapper[5129]: I0314 08:49:10.787325 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bz2g" event={"ID":"ffc0c3e7-ba61-48f0-9481-b10aac1a858e","Type":"ContainerStarted","Data":"fe76c3ebcdcc7e1499b834db707d0a0faeccf7563c1b0310cd1f51ce75df6cd5"} Mar 14 08:49:11 crc kubenswrapper[5129]: I0314 08:49:11.800715 5129 generic.go:334] "Generic (PLEG): container finished" podID="ffc0c3e7-ba61-48f0-9481-b10aac1a858e" containerID="bce725c54b1d3326a70b25ba95d9086bc0f82d4bc13a65bc625a740cb7b4a75a" exitCode=0 Mar 14 08:49:11 crc kubenswrapper[5129]: I0314 08:49:11.800804 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bz2g" event={"ID":"ffc0c3e7-ba61-48f0-9481-b10aac1a858e","Type":"ContainerDied","Data":"bce725c54b1d3326a70b25ba95d9086bc0f82d4bc13a65bc625a740cb7b4a75a"} Mar 14 08:49:11 crc kubenswrapper[5129]: I0314 08:49:11.803547 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:49:12 crc kubenswrapper[5129]: I0314 08:49:12.818896 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bz2g" event={"ID":"ffc0c3e7-ba61-48f0-9481-b10aac1a858e","Type":"ContainerStarted","Data":"50fe860eb59c3078beb27146224be6bd1f2b8ae046941837a2fa85cfb9328f16"} Mar 14 08:49:13 crc 
kubenswrapper[5129]: I0314 08:49:13.037153 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:49:13 crc kubenswrapper[5129]: E0314 08:49:13.037472 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:49:13 crc kubenswrapper[5129]: I0314 08:49:13.832081 5129 generic.go:334] "Generic (PLEG): container finished" podID="ffc0c3e7-ba61-48f0-9481-b10aac1a858e" containerID="50fe860eb59c3078beb27146224be6bd1f2b8ae046941837a2fa85cfb9328f16" exitCode=0 Mar 14 08:49:13 crc kubenswrapper[5129]: I0314 08:49:13.832246 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bz2g" event={"ID":"ffc0c3e7-ba61-48f0-9481-b10aac1a858e","Type":"ContainerDied","Data":"50fe860eb59c3078beb27146224be6bd1f2b8ae046941837a2fa85cfb9328f16"} Mar 14 08:49:14 crc kubenswrapper[5129]: I0314 08:49:14.845721 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bz2g" event={"ID":"ffc0c3e7-ba61-48f0-9481-b10aac1a858e","Type":"ContainerStarted","Data":"41feb1fef4cc308680a514017ab88a355cdc992022e6a45aa5a93789555feb41"} Mar 14 08:49:14 crc kubenswrapper[5129]: I0314 08:49:14.870704 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7bz2g" podStartSLOduration=3.321953157 podStartE2EDuration="5.870679617s" podCreationTimestamp="2026-03-14 08:49:09 +0000 UTC" firstStartedPulling="2026-03-14 08:49:11.803088009 +0000 UTC m=+6614.555003223" lastFinishedPulling="2026-03-14 08:49:14.351814469 +0000 
UTC m=+6617.103729683" observedRunningTime="2026-03-14 08:49:14.867910522 +0000 UTC m=+6617.619825746" watchObservedRunningTime="2026-03-14 08:49:14.870679617 +0000 UTC m=+6617.622594821" Mar 14 08:49:20 crc kubenswrapper[5129]: I0314 08:49:20.220090 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7bz2g" Mar 14 08:49:20 crc kubenswrapper[5129]: I0314 08:49:20.220731 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7bz2g" Mar 14 08:49:20 crc kubenswrapper[5129]: I0314 08:49:20.298309 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7bz2g" Mar 14 08:49:20 crc kubenswrapper[5129]: I0314 08:49:20.954018 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7bz2g" Mar 14 08:49:21 crc kubenswrapper[5129]: I0314 08:49:21.014321 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7bz2g"] Mar 14 08:49:22 crc kubenswrapper[5129]: I0314 08:49:22.911899 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7bz2g" podUID="ffc0c3e7-ba61-48f0-9481-b10aac1a858e" containerName="registry-server" containerID="cri-o://41feb1fef4cc308680a514017ab88a355cdc992022e6a45aa5a93789555feb41" gracePeriod=2 Mar 14 08:49:23 crc kubenswrapper[5129]: I0314 08:49:23.872680 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7bz2g" Mar 14 08:49:23 crc kubenswrapper[5129]: I0314 08:49:23.927126 5129 generic.go:334] "Generic (PLEG): container finished" podID="ffc0c3e7-ba61-48f0-9481-b10aac1a858e" containerID="41feb1fef4cc308680a514017ab88a355cdc992022e6a45aa5a93789555feb41" exitCode=0 Mar 14 08:49:23 crc kubenswrapper[5129]: I0314 08:49:23.927222 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7bz2g" Mar 14 08:49:23 crc kubenswrapper[5129]: I0314 08:49:23.927211 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bz2g" event={"ID":"ffc0c3e7-ba61-48f0-9481-b10aac1a858e","Type":"ContainerDied","Data":"41feb1fef4cc308680a514017ab88a355cdc992022e6a45aa5a93789555feb41"} Mar 14 08:49:23 crc kubenswrapper[5129]: I0314 08:49:23.927307 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bz2g" event={"ID":"ffc0c3e7-ba61-48f0-9481-b10aac1a858e","Type":"ContainerDied","Data":"fe76c3ebcdcc7e1499b834db707d0a0faeccf7563c1b0310cd1f51ce75df6cd5"} Mar 14 08:49:23 crc kubenswrapper[5129]: I0314 08:49:23.927333 5129 scope.go:117] "RemoveContainer" containerID="41feb1fef4cc308680a514017ab88a355cdc992022e6a45aa5a93789555feb41" Mar 14 08:49:23 crc kubenswrapper[5129]: I0314 08:49:23.946973 5129 scope.go:117] "RemoveContainer" containerID="50fe860eb59c3078beb27146224be6bd1f2b8ae046941837a2fa85cfb9328f16" Mar 14 08:49:23 crc kubenswrapper[5129]: I0314 08:49:23.967528 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kchvp\" (UniqueName: \"kubernetes.io/projected/ffc0c3e7-ba61-48f0-9481-b10aac1a858e-kube-api-access-kchvp\") pod \"ffc0c3e7-ba61-48f0-9481-b10aac1a858e\" (UID: \"ffc0c3e7-ba61-48f0-9481-b10aac1a858e\") " Mar 14 08:49:23 crc kubenswrapper[5129]: I0314 08:49:23.968113 5129 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffc0c3e7-ba61-48f0-9481-b10aac1a858e-catalog-content\") pod \"ffc0c3e7-ba61-48f0-9481-b10aac1a858e\" (UID: \"ffc0c3e7-ba61-48f0-9481-b10aac1a858e\") " Mar 14 08:49:23 crc kubenswrapper[5129]: I0314 08:49:23.968158 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffc0c3e7-ba61-48f0-9481-b10aac1a858e-utilities\") pod \"ffc0c3e7-ba61-48f0-9481-b10aac1a858e\" (UID: \"ffc0c3e7-ba61-48f0-9481-b10aac1a858e\") " Mar 14 08:49:23 crc kubenswrapper[5129]: I0314 08:49:23.968656 5129 scope.go:117] "RemoveContainer" containerID="bce725c54b1d3326a70b25ba95d9086bc0f82d4bc13a65bc625a740cb7b4a75a" Mar 14 08:49:23 crc kubenswrapper[5129]: I0314 08:49:23.969521 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffc0c3e7-ba61-48f0-9481-b10aac1a858e-utilities" (OuterVolumeSpecName: "utilities") pod "ffc0c3e7-ba61-48f0-9481-b10aac1a858e" (UID: "ffc0c3e7-ba61-48f0-9481-b10aac1a858e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:49:23 crc kubenswrapper[5129]: I0314 08:49:23.974357 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffc0c3e7-ba61-48f0-9481-b10aac1a858e-kube-api-access-kchvp" (OuterVolumeSpecName: "kube-api-access-kchvp") pod "ffc0c3e7-ba61-48f0-9481-b10aac1a858e" (UID: "ffc0c3e7-ba61-48f0-9481-b10aac1a858e"). InnerVolumeSpecName "kube-api-access-kchvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:24 crc kubenswrapper[5129]: I0314 08:49:24.027718 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffc0c3e7-ba61-48f0-9481-b10aac1a858e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ffc0c3e7-ba61-48f0-9481-b10aac1a858e" (UID: "ffc0c3e7-ba61-48f0-9481-b10aac1a858e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:49:24 crc kubenswrapper[5129]: I0314 08:49:24.047380 5129 scope.go:117] "RemoveContainer" containerID="41feb1fef4cc308680a514017ab88a355cdc992022e6a45aa5a93789555feb41" Mar 14 08:49:24 crc kubenswrapper[5129]: E0314 08:49:24.047846 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41feb1fef4cc308680a514017ab88a355cdc992022e6a45aa5a93789555feb41\": container with ID starting with 41feb1fef4cc308680a514017ab88a355cdc992022e6a45aa5a93789555feb41 not found: ID does not exist" containerID="41feb1fef4cc308680a514017ab88a355cdc992022e6a45aa5a93789555feb41" Mar 14 08:49:24 crc kubenswrapper[5129]: I0314 08:49:24.047900 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41feb1fef4cc308680a514017ab88a355cdc992022e6a45aa5a93789555feb41"} err="failed to get container status \"41feb1fef4cc308680a514017ab88a355cdc992022e6a45aa5a93789555feb41\": rpc error: code = NotFound desc = could not find container \"41feb1fef4cc308680a514017ab88a355cdc992022e6a45aa5a93789555feb41\": container with ID starting with 41feb1fef4cc308680a514017ab88a355cdc992022e6a45aa5a93789555feb41 not found: ID does not exist" Mar 14 08:49:24 crc kubenswrapper[5129]: I0314 08:49:24.047933 5129 scope.go:117] "RemoveContainer" containerID="50fe860eb59c3078beb27146224be6bd1f2b8ae046941837a2fa85cfb9328f16" Mar 14 08:49:24 crc kubenswrapper[5129]: E0314 08:49:24.048240 5129 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50fe860eb59c3078beb27146224be6bd1f2b8ae046941837a2fa85cfb9328f16\": container with ID starting with 50fe860eb59c3078beb27146224be6bd1f2b8ae046941837a2fa85cfb9328f16 not found: ID does not exist" containerID="50fe860eb59c3078beb27146224be6bd1f2b8ae046941837a2fa85cfb9328f16" Mar 14 08:49:24 crc kubenswrapper[5129]: I0314 08:49:24.048271 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50fe860eb59c3078beb27146224be6bd1f2b8ae046941837a2fa85cfb9328f16"} err="failed to get container status \"50fe860eb59c3078beb27146224be6bd1f2b8ae046941837a2fa85cfb9328f16\": rpc error: code = NotFound desc = could not find container \"50fe860eb59c3078beb27146224be6bd1f2b8ae046941837a2fa85cfb9328f16\": container with ID starting with 50fe860eb59c3078beb27146224be6bd1f2b8ae046941837a2fa85cfb9328f16 not found: ID does not exist" Mar 14 08:49:24 crc kubenswrapper[5129]: I0314 08:49:24.048288 5129 scope.go:117] "RemoveContainer" containerID="bce725c54b1d3326a70b25ba95d9086bc0f82d4bc13a65bc625a740cb7b4a75a" Mar 14 08:49:24 crc kubenswrapper[5129]: E0314 08:49:24.048930 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bce725c54b1d3326a70b25ba95d9086bc0f82d4bc13a65bc625a740cb7b4a75a\": container with ID starting with bce725c54b1d3326a70b25ba95d9086bc0f82d4bc13a65bc625a740cb7b4a75a not found: ID does not exist" containerID="bce725c54b1d3326a70b25ba95d9086bc0f82d4bc13a65bc625a740cb7b4a75a" Mar 14 08:49:24 crc kubenswrapper[5129]: I0314 08:49:24.049026 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bce725c54b1d3326a70b25ba95d9086bc0f82d4bc13a65bc625a740cb7b4a75a"} err="failed to get container status \"bce725c54b1d3326a70b25ba95d9086bc0f82d4bc13a65bc625a740cb7b4a75a\": rpc error: code = NotFound desc = could 
not find container \"bce725c54b1d3326a70b25ba95d9086bc0f82d4bc13a65bc625a740cb7b4a75a\": container with ID starting with bce725c54b1d3326a70b25ba95d9086bc0f82d4bc13a65bc625a740cb7b4a75a not found: ID does not exist" Mar 14 08:49:24 crc kubenswrapper[5129]: I0314 08:49:24.070379 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kchvp\" (UniqueName: \"kubernetes.io/projected/ffc0c3e7-ba61-48f0-9481-b10aac1a858e-kube-api-access-kchvp\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:24 crc kubenswrapper[5129]: I0314 08:49:24.070414 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffc0c3e7-ba61-48f0-9481-b10aac1a858e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:24 crc kubenswrapper[5129]: I0314 08:49:24.070424 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffc0c3e7-ba61-48f0-9481-b10aac1a858e-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:24 crc kubenswrapper[5129]: I0314 08:49:24.257796 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7bz2g"] Mar 14 08:49:24 crc kubenswrapper[5129]: I0314 08:49:24.269477 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7bz2g"] Mar 14 08:49:26 crc kubenswrapper[5129]: I0314 08:49:26.052993 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffc0c3e7-ba61-48f0-9481-b10aac1a858e" path="/var/lib/kubelet/pods/ffc0c3e7-ba61-48f0-9481-b10aac1a858e/volumes" Mar 14 08:49:27 crc kubenswrapper[5129]: I0314 08:49:27.037031 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:49:27 crc kubenswrapper[5129]: E0314 08:49:27.037928 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:49:42 crc kubenswrapper[5129]: I0314 08:49:42.037996 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:49:42 crc kubenswrapper[5129]: E0314 08:49:42.038927 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:49:53 crc kubenswrapper[5129]: I0314 08:49:53.036275 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:49:53 crc kubenswrapper[5129]: E0314 08:49:53.037110 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:50:00 crc kubenswrapper[5129]: I0314 08:50:00.165541 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557970-xr5ck"] Mar 14 08:50:00 crc kubenswrapper[5129]: E0314 08:50:00.168208 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc0c3e7-ba61-48f0-9481-b10aac1a858e" 
containerName="registry-server" Mar 14 08:50:00 crc kubenswrapper[5129]: I0314 08:50:00.168342 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc0c3e7-ba61-48f0-9481-b10aac1a858e" containerName="registry-server" Mar 14 08:50:00 crc kubenswrapper[5129]: E0314 08:50:00.168510 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc0c3e7-ba61-48f0-9481-b10aac1a858e" containerName="extract-utilities" Mar 14 08:50:00 crc kubenswrapper[5129]: I0314 08:50:00.168632 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc0c3e7-ba61-48f0-9481-b10aac1a858e" containerName="extract-utilities" Mar 14 08:50:00 crc kubenswrapper[5129]: E0314 08:50:00.168722 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc0c3e7-ba61-48f0-9481-b10aac1a858e" containerName="extract-content" Mar 14 08:50:00 crc kubenswrapper[5129]: I0314 08:50:00.168797 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc0c3e7-ba61-48f0-9481-b10aac1a858e" containerName="extract-content" Mar 14 08:50:00 crc kubenswrapper[5129]: I0314 08:50:00.169235 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffc0c3e7-ba61-48f0-9481-b10aac1a858e" containerName="registry-server" Mar 14 08:50:00 crc kubenswrapper[5129]: I0314 08:50:00.170022 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557970-xr5ck" Mar 14 08:50:00 crc kubenswrapper[5129]: I0314 08:50:00.173358 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:50:00 crc kubenswrapper[5129]: I0314 08:50:00.173416 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:50:00 crc kubenswrapper[5129]: I0314 08:50:00.176630 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557970-xr5ck"] Mar 14 08:50:00 crc kubenswrapper[5129]: I0314 08:50:00.176811 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:50:00 crc kubenswrapper[5129]: I0314 08:50:00.317442 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dvtl\" (UniqueName: \"kubernetes.io/projected/e66089f6-5a01-4d3a-b32c-155de9e2398b-kube-api-access-2dvtl\") pod \"auto-csr-approver-29557970-xr5ck\" (UID: \"e66089f6-5a01-4d3a-b32c-155de9e2398b\") " pod="openshift-infra/auto-csr-approver-29557970-xr5ck" Mar 14 08:50:00 crc kubenswrapper[5129]: I0314 08:50:00.419373 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dvtl\" (UniqueName: \"kubernetes.io/projected/e66089f6-5a01-4d3a-b32c-155de9e2398b-kube-api-access-2dvtl\") pod \"auto-csr-approver-29557970-xr5ck\" (UID: \"e66089f6-5a01-4d3a-b32c-155de9e2398b\") " pod="openshift-infra/auto-csr-approver-29557970-xr5ck" Mar 14 08:50:00 crc kubenswrapper[5129]: I0314 08:50:00.436648 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dvtl\" (UniqueName: \"kubernetes.io/projected/e66089f6-5a01-4d3a-b32c-155de9e2398b-kube-api-access-2dvtl\") pod \"auto-csr-approver-29557970-xr5ck\" (UID: \"e66089f6-5a01-4d3a-b32c-155de9e2398b\") " 
pod="openshift-infra/auto-csr-approver-29557970-xr5ck" Mar 14 08:50:00 crc kubenswrapper[5129]: I0314 08:50:00.492102 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557970-xr5ck" Mar 14 08:50:00 crc kubenswrapper[5129]: W0314 08:50:00.956664 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode66089f6_5a01_4d3a_b32c_155de9e2398b.slice/crio-1c141bddaf3fed296b03d485a2c6ac42f16382b27c7ff27e12fb531a2bdd652a WatchSource:0}: Error finding container 1c141bddaf3fed296b03d485a2c6ac42f16382b27c7ff27e12fb531a2bdd652a: Status 404 returned error can't find the container with id 1c141bddaf3fed296b03d485a2c6ac42f16382b27c7ff27e12fb531a2bdd652a Mar 14 08:50:00 crc kubenswrapper[5129]: I0314 08:50:00.958437 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557970-xr5ck"] Mar 14 08:50:01 crc kubenswrapper[5129]: I0314 08:50:01.261913 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557970-xr5ck" event={"ID":"e66089f6-5a01-4d3a-b32c-155de9e2398b","Type":"ContainerStarted","Data":"1c141bddaf3fed296b03d485a2c6ac42f16382b27c7ff27e12fb531a2bdd652a"} Mar 14 08:50:02 crc kubenswrapper[5129]: I0314 08:50:02.270084 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557970-xr5ck" event={"ID":"e66089f6-5a01-4d3a-b32c-155de9e2398b","Type":"ContainerStarted","Data":"97ff25cc7053f5b7d0f960e6e69c338ba100db06b017e4e29d9b56e222cc7653"} Mar 14 08:50:02 crc kubenswrapper[5129]: I0314 08:50:02.286012 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557970-xr5ck" podStartSLOduration=1.290383109 podStartE2EDuration="2.285976728s" podCreationTimestamp="2026-03-14 08:50:00 +0000 UTC" firstStartedPulling="2026-03-14 08:50:00.961504778 +0000 UTC 
m=+6663.713419962" lastFinishedPulling="2026-03-14 08:50:01.957098387 +0000 UTC m=+6664.709013581" observedRunningTime="2026-03-14 08:50:02.283238764 +0000 UTC m=+6665.035153938" watchObservedRunningTime="2026-03-14 08:50:02.285976728 +0000 UTC m=+6665.037891902" Mar 14 08:50:03 crc kubenswrapper[5129]: I0314 08:50:03.280273 5129 generic.go:334] "Generic (PLEG): container finished" podID="e66089f6-5a01-4d3a-b32c-155de9e2398b" containerID="97ff25cc7053f5b7d0f960e6e69c338ba100db06b017e4e29d9b56e222cc7653" exitCode=0 Mar 14 08:50:03 crc kubenswrapper[5129]: I0314 08:50:03.280334 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557970-xr5ck" event={"ID":"e66089f6-5a01-4d3a-b32c-155de9e2398b","Type":"ContainerDied","Data":"97ff25cc7053f5b7d0f960e6e69c338ba100db06b017e4e29d9b56e222cc7653"} Mar 14 08:50:04 crc kubenswrapper[5129]: I0314 08:50:04.639031 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557970-xr5ck" Mar 14 08:50:04 crc kubenswrapper[5129]: I0314 08:50:04.784446 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dvtl\" (UniqueName: \"kubernetes.io/projected/e66089f6-5a01-4d3a-b32c-155de9e2398b-kube-api-access-2dvtl\") pod \"e66089f6-5a01-4d3a-b32c-155de9e2398b\" (UID: \"e66089f6-5a01-4d3a-b32c-155de9e2398b\") " Mar 14 08:50:04 crc kubenswrapper[5129]: I0314 08:50:04.792374 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e66089f6-5a01-4d3a-b32c-155de9e2398b-kube-api-access-2dvtl" (OuterVolumeSpecName: "kube-api-access-2dvtl") pod "e66089f6-5a01-4d3a-b32c-155de9e2398b" (UID: "e66089f6-5a01-4d3a-b32c-155de9e2398b"). InnerVolumeSpecName "kube-api-access-2dvtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:50:04 crc kubenswrapper[5129]: I0314 08:50:04.887144 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dvtl\" (UniqueName: \"kubernetes.io/projected/e66089f6-5a01-4d3a-b32c-155de9e2398b-kube-api-access-2dvtl\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:05 crc kubenswrapper[5129]: I0314 08:50:05.036494 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:50:05 crc kubenswrapper[5129]: E0314 08:50:05.036913 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:50:05 crc kubenswrapper[5129]: I0314 08:50:05.293059 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557970-xr5ck" event={"ID":"e66089f6-5a01-4d3a-b32c-155de9e2398b","Type":"ContainerDied","Data":"1c141bddaf3fed296b03d485a2c6ac42f16382b27c7ff27e12fb531a2bdd652a"} Mar 14 08:50:05 crc kubenswrapper[5129]: I0314 08:50:05.293108 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c141bddaf3fed296b03d485a2c6ac42f16382b27c7ff27e12fb531a2bdd652a" Mar 14 08:50:05 crc kubenswrapper[5129]: I0314 08:50:05.293168 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557970-xr5ck" Mar 14 08:50:05 crc kubenswrapper[5129]: I0314 08:50:05.348880 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557964-xqqdk"] Mar 14 08:50:05 crc kubenswrapper[5129]: I0314 08:50:05.354279 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557964-xqqdk"] Mar 14 08:50:06 crc kubenswrapper[5129]: I0314 08:50:06.045220 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a97190-2a99-46cc-b51b-ce843d0c7dcc" path="/var/lib/kubelet/pods/d0a97190-2a99-46cc-b51b-ce843d0c7dcc/volumes" Mar 14 08:50:20 crc kubenswrapper[5129]: I0314 08:50:20.036667 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:50:20 crc kubenswrapper[5129]: E0314 08:50:20.037290 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:50:27 crc kubenswrapper[5129]: I0314 08:50:27.145245 5129 scope.go:117] "RemoveContainer" containerID="b34cb7c778702e7045381aa2e9cacf2d3de65cad2a37e49f600c4611adab8e92" Mar 14 08:50:34 crc kubenswrapper[5129]: I0314 08:50:34.036846 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:50:34 crc kubenswrapper[5129]: E0314 08:50:34.037352 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:50:49 crc kubenswrapper[5129]: I0314 08:50:49.037066 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:50:49 crc kubenswrapper[5129]: E0314 08:50:49.037908 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:51:01 crc kubenswrapper[5129]: I0314 08:51:01.037773 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:51:01 crc kubenswrapper[5129]: E0314 08:51:01.038896 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:51:12 crc kubenswrapper[5129]: I0314 08:51:12.036872 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:51:12 crc kubenswrapper[5129]: E0314 08:51:12.038030 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:51:23 crc kubenswrapper[5129]: I0314 08:51:23.037275 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:51:24 crc kubenswrapper[5129]: I0314 08:51:24.002248 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"08fb3f0642680c2408576300f89e73c1a127607f27f00eed4c184f6d6d9f3720"} Mar 14 08:52:00 crc kubenswrapper[5129]: I0314 08:52:00.159795 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557972-g8qfb"] Mar 14 08:52:00 crc kubenswrapper[5129]: E0314 08:52:00.160713 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e66089f6-5a01-4d3a-b32c-155de9e2398b" containerName="oc" Mar 14 08:52:00 crc kubenswrapper[5129]: I0314 08:52:00.160727 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e66089f6-5a01-4d3a-b32c-155de9e2398b" containerName="oc" Mar 14 08:52:00 crc kubenswrapper[5129]: I0314 08:52:00.160889 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e66089f6-5a01-4d3a-b32c-155de9e2398b" containerName="oc" Mar 14 08:52:00 crc kubenswrapper[5129]: I0314 08:52:00.161435 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557972-g8qfb" Mar 14 08:52:00 crc kubenswrapper[5129]: I0314 08:52:00.163562 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:52:00 crc kubenswrapper[5129]: I0314 08:52:00.164415 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:52:00 crc kubenswrapper[5129]: I0314 08:52:00.165430 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:52:00 crc kubenswrapper[5129]: I0314 08:52:00.170273 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557972-g8qfb"] Mar 14 08:52:00 crc kubenswrapper[5129]: I0314 08:52:00.284622 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wvxt\" (UniqueName: \"kubernetes.io/projected/38f1a22d-c7b8-4fd6-8111-55421a6d8b4e-kube-api-access-2wvxt\") pod \"auto-csr-approver-29557972-g8qfb\" (UID: \"38f1a22d-c7b8-4fd6-8111-55421a6d8b4e\") " pod="openshift-infra/auto-csr-approver-29557972-g8qfb" Mar 14 08:52:00 crc kubenswrapper[5129]: I0314 08:52:00.385622 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wvxt\" (UniqueName: \"kubernetes.io/projected/38f1a22d-c7b8-4fd6-8111-55421a6d8b4e-kube-api-access-2wvxt\") pod \"auto-csr-approver-29557972-g8qfb\" (UID: \"38f1a22d-c7b8-4fd6-8111-55421a6d8b4e\") " pod="openshift-infra/auto-csr-approver-29557972-g8qfb" Mar 14 08:52:00 crc kubenswrapper[5129]: I0314 08:52:00.410400 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wvxt\" (UniqueName: \"kubernetes.io/projected/38f1a22d-c7b8-4fd6-8111-55421a6d8b4e-kube-api-access-2wvxt\") pod \"auto-csr-approver-29557972-g8qfb\" (UID: \"38f1a22d-c7b8-4fd6-8111-55421a6d8b4e\") " 
pod="openshift-infra/auto-csr-approver-29557972-g8qfb" Mar 14 08:52:00 crc kubenswrapper[5129]: I0314 08:52:00.486560 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557972-g8qfb" Mar 14 08:52:00 crc kubenswrapper[5129]: I0314 08:52:00.918089 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557972-g8qfb"] Mar 14 08:52:01 crc kubenswrapper[5129]: I0314 08:52:01.354621 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557972-g8qfb" event={"ID":"38f1a22d-c7b8-4fd6-8111-55421a6d8b4e","Type":"ContainerStarted","Data":"a561c0d6a0c127892dd1b642a1d9c5cb002cd5c4c1a064a6e706541eb1c4e313"} Mar 14 08:52:03 crc kubenswrapper[5129]: I0314 08:52:03.372190 5129 generic.go:334] "Generic (PLEG): container finished" podID="38f1a22d-c7b8-4fd6-8111-55421a6d8b4e" containerID="75f40604138b8ecf4521cb7a52943b7c7bdc5060bfb188dd9eaf0ae68b6569fb" exitCode=0 Mar 14 08:52:03 crc kubenswrapper[5129]: I0314 08:52:03.372234 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557972-g8qfb" event={"ID":"38f1a22d-c7b8-4fd6-8111-55421a6d8b4e","Type":"ContainerDied","Data":"75f40604138b8ecf4521cb7a52943b7c7bdc5060bfb188dd9eaf0ae68b6569fb"} Mar 14 08:52:04 crc kubenswrapper[5129]: I0314 08:52:04.722815 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557972-g8qfb" Mar 14 08:52:04 crc kubenswrapper[5129]: I0314 08:52:04.874481 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wvxt\" (UniqueName: \"kubernetes.io/projected/38f1a22d-c7b8-4fd6-8111-55421a6d8b4e-kube-api-access-2wvxt\") pod \"38f1a22d-c7b8-4fd6-8111-55421a6d8b4e\" (UID: \"38f1a22d-c7b8-4fd6-8111-55421a6d8b4e\") " Mar 14 08:52:04 crc kubenswrapper[5129]: I0314 08:52:04.885348 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38f1a22d-c7b8-4fd6-8111-55421a6d8b4e-kube-api-access-2wvxt" (OuterVolumeSpecName: "kube-api-access-2wvxt") pod "38f1a22d-c7b8-4fd6-8111-55421a6d8b4e" (UID: "38f1a22d-c7b8-4fd6-8111-55421a6d8b4e"). InnerVolumeSpecName "kube-api-access-2wvxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:04 crc kubenswrapper[5129]: I0314 08:52:04.976739 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wvxt\" (UniqueName: \"kubernetes.io/projected/38f1a22d-c7b8-4fd6-8111-55421a6d8b4e-kube-api-access-2wvxt\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:05 crc kubenswrapper[5129]: I0314 08:52:05.393234 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557972-g8qfb" event={"ID":"38f1a22d-c7b8-4fd6-8111-55421a6d8b4e","Type":"ContainerDied","Data":"a561c0d6a0c127892dd1b642a1d9c5cb002cd5c4c1a064a6e706541eb1c4e313"} Mar 14 08:52:05 crc kubenswrapper[5129]: I0314 08:52:05.393265 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557972-g8qfb" Mar 14 08:52:05 crc kubenswrapper[5129]: I0314 08:52:05.393281 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a561c0d6a0c127892dd1b642a1d9c5cb002cd5c4c1a064a6e706541eb1c4e313" Mar 14 08:52:05 crc kubenswrapper[5129]: I0314 08:52:05.797228 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557966-5z52h"] Mar 14 08:52:05 crc kubenswrapper[5129]: I0314 08:52:05.805453 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557966-5z52h"] Mar 14 08:52:06 crc kubenswrapper[5129]: I0314 08:52:06.045966 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87de6820-6724-4f06-aa7d-9f19b17168be" path="/var/lib/kubelet/pods/87de6820-6724-4f06-aa7d-9f19b17168be/volumes" Mar 14 08:52:27 crc kubenswrapper[5129]: I0314 08:52:27.226269 5129 scope.go:117] "RemoveContainer" containerID="c78c3127d06dbe143fb98bbb25ed91170dfb746c69c0fd486ad21bc6c6b328f8" Mar 14 08:52:30 crc kubenswrapper[5129]: I0314 08:52:30.315383 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9jg8b"] Mar 14 08:52:30 crc kubenswrapper[5129]: E0314 08:52:30.316817 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f1a22d-c7b8-4fd6-8111-55421a6d8b4e" containerName="oc" Mar 14 08:52:30 crc kubenswrapper[5129]: I0314 08:52:30.316843 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f1a22d-c7b8-4fd6-8111-55421a6d8b4e" containerName="oc" Mar 14 08:52:30 crc kubenswrapper[5129]: I0314 08:52:30.317075 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f1a22d-c7b8-4fd6-8111-55421a6d8b4e" containerName="oc" Mar 14 08:52:30 crc kubenswrapper[5129]: I0314 08:52:30.318883 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9jg8b" Mar 14 08:52:30 crc kubenswrapper[5129]: I0314 08:52:30.330525 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9jg8b"] Mar 14 08:52:30 crc kubenswrapper[5129]: I0314 08:52:30.466688 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99fe8072-e11e-4959-8062-0f2f9e9d8bb0-utilities\") pod \"certified-operators-9jg8b\" (UID: \"99fe8072-e11e-4959-8062-0f2f9e9d8bb0\") " pod="openshift-marketplace/certified-operators-9jg8b" Mar 14 08:52:30 crc kubenswrapper[5129]: I0314 08:52:30.466814 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99fe8072-e11e-4959-8062-0f2f9e9d8bb0-catalog-content\") pod \"certified-operators-9jg8b\" (UID: \"99fe8072-e11e-4959-8062-0f2f9e9d8bb0\") " pod="openshift-marketplace/certified-operators-9jg8b" Mar 14 08:52:30 crc kubenswrapper[5129]: I0314 08:52:30.466872 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzhvs\" (UniqueName: \"kubernetes.io/projected/99fe8072-e11e-4959-8062-0f2f9e9d8bb0-kube-api-access-qzhvs\") pod \"certified-operators-9jg8b\" (UID: \"99fe8072-e11e-4959-8062-0f2f9e9d8bb0\") " pod="openshift-marketplace/certified-operators-9jg8b" Mar 14 08:52:30 crc kubenswrapper[5129]: I0314 08:52:30.569225 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzhvs\" (UniqueName: \"kubernetes.io/projected/99fe8072-e11e-4959-8062-0f2f9e9d8bb0-kube-api-access-qzhvs\") pod \"certified-operators-9jg8b\" (UID: \"99fe8072-e11e-4959-8062-0f2f9e9d8bb0\") " pod="openshift-marketplace/certified-operators-9jg8b" Mar 14 08:52:30 crc kubenswrapper[5129]: I0314 08:52:30.569383 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99fe8072-e11e-4959-8062-0f2f9e9d8bb0-utilities\") pod \"certified-operators-9jg8b\" (UID: \"99fe8072-e11e-4959-8062-0f2f9e9d8bb0\") " pod="openshift-marketplace/certified-operators-9jg8b" Mar 14 08:52:30 crc kubenswrapper[5129]: I0314 08:52:30.569464 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99fe8072-e11e-4959-8062-0f2f9e9d8bb0-catalog-content\") pod \"certified-operators-9jg8b\" (UID: \"99fe8072-e11e-4959-8062-0f2f9e9d8bb0\") " pod="openshift-marketplace/certified-operators-9jg8b" Mar 14 08:52:30 crc kubenswrapper[5129]: I0314 08:52:30.570156 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99fe8072-e11e-4959-8062-0f2f9e9d8bb0-catalog-content\") pod \"certified-operators-9jg8b\" (UID: \"99fe8072-e11e-4959-8062-0f2f9e9d8bb0\") " pod="openshift-marketplace/certified-operators-9jg8b" Mar 14 08:52:30 crc kubenswrapper[5129]: I0314 08:52:30.570159 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99fe8072-e11e-4959-8062-0f2f9e9d8bb0-utilities\") pod \"certified-operators-9jg8b\" (UID: \"99fe8072-e11e-4959-8062-0f2f9e9d8bb0\") " pod="openshift-marketplace/certified-operators-9jg8b" Mar 14 08:52:30 crc kubenswrapper[5129]: I0314 08:52:30.594224 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzhvs\" (UniqueName: \"kubernetes.io/projected/99fe8072-e11e-4959-8062-0f2f9e9d8bb0-kube-api-access-qzhvs\") pod \"certified-operators-9jg8b\" (UID: \"99fe8072-e11e-4959-8062-0f2f9e9d8bb0\") " pod="openshift-marketplace/certified-operators-9jg8b" Mar 14 08:52:30 crc kubenswrapper[5129]: I0314 08:52:30.641272 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9jg8b" Mar 14 08:52:31 crc kubenswrapper[5129]: I0314 08:52:31.056948 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-5vnvv"] Mar 14 08:52:31 crc kubenswrapper[5129]: I0314 08:52:31.069009 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-5vnvv"] Mar 14 08:52:31 crc kubenswrapper[5129]: I0314 08:52:31.199919 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9jg8b"] Mar 14 08:52:31 crc kubenswrapper[5129]: I0314 08:52:31.900576 5129 generic.go:334] "Generic (PLEG): container finished" podID="99fe8072-e11e-4959-8062-0f2f9e9d8bb0" containerID="2dca3d75dc6576fefd79e8a7c51e94df4d58ffbb401ac6371f67eb3a9873c48f" exitCode=0 Mar 14 08:52:31 crc kubenswrapper[5129]: I0314 08:52:31.900785 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jg8b" event={"ID":"99fe8072-e11e-4959-8062-0f2f9e9d8bb0","Type":"ContainerDied","Data":"2dca3d75dc6576fefd79e8a7c51e94df4d58ffbb401ac6371f67eb3a9873c48f"} Mar 14 08:52:31 crc kubenswrapper[5129]: I0314 08:52:31.901237 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jg8b" event={"ID":"99fe8072-e11e-4959-8062-0f2f9e9d8bb0","Type":"ContainerStarted","Data":"7fa26262bf6ba53252988997d09cd6b371f320c15cd7914b0644f1a91b63c744"} Mar 14 08:52:32 crc kubenswrapper[5129]: I0314 08:52:32.048972 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbaaa8fb-9971-4abf-aaca-46592954619d" path="/var/lib/kubelet/pods/bbaaa8fb-9971-4abf-aaca-46592954619d/volumes" Mar 14 08:52:32 crc kubenswrapper[5129]: I0314 08:52:32.136627 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mcpvc"] Mar 14 08:52:32 crc kubenswrapper[5129]: I0314 08:52:32.138526 5129 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mcpvc" Mar 14 08:52:32 crc kubenswrapper[5129]: I0314 08:52:32.181087 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mcpvc"] Mar 14 08:52:32 crc kubenswrapper[5129]: I0314 08:52:32.203751 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ee7de1-2466-4b8f-b559-66e9f11efdeb-utilities\") pod \"redhat-operators-mcpvc\" (UID: \"c8ee7de1-2466-4b8f-b559-66e9f11efdeb\") " pod="openshift-marketplace/redhat-operators-mcpvc" Mar 14 08:52:32 crc kubenswrapper[5129]: I0314 08:52:32.203819 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ee7de1-2466-4b8f-b559-66e9f11efdeb-catalog-content\") pod \"redhat-operators-mcpvc\" (UID: \"c8ee7de1-2466-4b8f-b559-66e9f11efdeb\") " pod="openshift-marketplace/redhat-operators-mcpvc" Mar 14 08:52:32 crc kubenswrapper[5129]: I0314 08:52:32.203864 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flsz5\" (UniqueName: \"kubernetes.io/projected/c8ee7de1-2466-4b8f-b559-66e9f11efdeb-kube-api-access-flsz5\") pod \"redhat-operators-mcpvc\" (UID: \"c8ee7de1-2466-4b8f-b559-66e9f11efdeb\") " pod="openshift-marketplace/redhat-operators-mcpvc" Mar 14 08:52:32 crc kubenswrapper[5129]: I0314 08:52:32.309030 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flsz5\" (UniqueName: \"kubernetes.io/projected/c8ee7de1-2466-4b8f-b559-66e9f11efdeb-kube-api-access-flsz5\") pod \"redhat-operators-mcpvc\" (UID: \"c8ee7de1-2466-4b8f-b559-66e9f11efdeb\") " pod="openshift-marketplace/redhat-operators-mcpvc" Mar 14 08:52:32 crc kubenswrapper[5129]: I0314 08:52:32.309184 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ee7de1-2466-4b8f-b559-66e9f11efdeb-utilities\") pod \"redhat-operators-mcpvc\" (UID: \"c8ee7de1-2466-4b8f-b559-66e9f11efdeb\") " pod="openshift-marketplace/redhat-operators-mcpvc" Mar 14 08:52:32 crc kubenswrapper[5129]: I0314 08:52:32.309260 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ee7de1-2466-4b8f-b559-66e9f11efdeb-catalog-content\") pod \"redhat-operators-mcpvc\" (UID: \"c8ee7de1-2466-4b8f-b559-66e9f11efdeb\") " pod="openshift-marketplace/redhat-operators-mcpvc" Mar 14 08:52:32 crc kubenswrapper[5129]: I0314 08:52:32.309920 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ee7de1-2466-4b8f-b559-66e9f11efdeb-catalog-content\") pod \"redhat-operators-mcpvc\" (UID: \"c8ee7de1-2466-4b8f-b559-66e9f11efdeb\") " pod="openshift-marketplace/redhat-operators-mcpvc" Mar 14 08:52:32 crc kubenswrapper[5129]: I0314 08:52:32.310298 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ee7de1-2466-4b8f-b559-66e9f11efdeb-utilities\") pod \"redhat-operators-mcpvc\" (UID: \"c8ee7de1-2466-4b8f-b559-66e9f11efdeb\") " pod="openshift-marketplace/redhat-operators-mcpvc" Mar 14 08:52:32 crc kubenswrapper[5129]: I0314 08:52:32.348377 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flsz5\" (UniqueName: \"kubernetes.io/projected/c8ee7de1-2466-4b8f-b559-66e9f11efdeb-kube-api-access-flsz5\") pod \"redhat-operators-mcpvc\" (UID: \"c8ee7de1-2466-4b8f-b559-66e9f11efdeb\") " pod="openshift-marketplace/redhat-operators-mcpvc" Mar 14 08:52:32 crc kubenswrapper[5129]: I0314 08:52:32.455391 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mcpvc" Mar 14 08:52:32 crc kubenswrapper[5129]: I0314 08:52:32.810705 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mcpvc"] Mar 14 08:52:32 crc kubenswrapper[5129]: I0314 08:52:32.913111 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jg8b" event={"ID":"99fe8072-e11e-4959-8062-0f2f9e9d8bb0","Type":"ContainerStarted","Data":"ef046be055e54d0ea5667afaf39909268c69367b0c2d2cfbd66e1bd087a59dfa"} Mar 14 08:52:32 crc kubenswrapper[5129]: I0314 08:52:32.918396 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcpvc" event={"ID":"c8ee7de1-2466-4b8f-b559-66e9f11efdeb","Type":"ContainerStarted","Data":"c84a1f21d555c813f24e02b9a2f28d44640de4b6705ebd906f58053231d4b0e0"} Mar 14 08:52:33 crc kubenswrapper[5129]: I0314 08:52:33.929738 5129 generic.go:334] "Generic (PLEG): container finished" podID="99fe8072-e11e-4959-8062-0f2f9e9d8bb0" containerID="ef046be055e54d0ea5667afaf39909268c69367b0c2d2cfbd66e1bd087a59dfa" exitCode=0 Mar 14 08:52:33 crc kubenswrapper[5129]: I0314 08:52:33.929845 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jg8b" event={"ID":"99fe8072-e11e-4959-8062-0f2f9e9d8bb0","Type":"ContainerDied","Data":"ef046be055e54d0ea5667afaf39909268c69367b0c2d2cfbd66e1bd087a59dfa"} Mar 14 08:52:33 crc kubenswrapper[5129]: I0314 08:52:33.934084 5129 generic.go:334] "Generic (PLEG): container finished" podID="c8ee7de1-2466-4b8f-b559-66e9f11efdeb" containerID="3e8748b9de7c94d47cddee54908ec3557cfdc93f474bab19f367f4f578ddd021" exitCode=0 Mar 14 08:52:33 crc kubenswrapper[5129]: I0314 08:52:33.934143 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcpvc" 
event={"ID":"c8ee7de1-2466-4b8f-b559-66e9f11efdeb","Type":"ContainerDied","Data":"3e8748b9de7c94d47cddee54908ec3557cfdc93f474bab19f367f4f578ddd021"} Mar 14 08:52:34 crc kubenswrapper[5129]: I0314 08:52:34.947679 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jg8b" event={"ID":"99fe8072-e11e-4959-8062-0f2f9e9d8bb0","Type":"ContainerStarted","Data":"58ec2a10c039d3a3725d139cb81955a9af406293fc11ee7f530f3b9d8b7922b6"} Mar 14 08:52:34 crc kubenswrapper[5129]: I0314 08:52:34.950423 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcpvc" event={"ID":"c8ee7de1-2466-4b8f-b559-66e9f11efdeb","Type":"ContainerStarted","Data":"97c2ed868dec0e7f8f957ad75b5d89bf220eb5509153fdf637dd21c90ae272dc"} Mar 14 08:52:34 crc kubenswrapper[5129]: I0314 08:52:34.986792 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9jg8b" podStartSLOduration=2.577582707 podStartE2EDuration="4.986754864s" podCreationTimestamp="2026-03-14 08:52:30 +0000 UTC" firstStartedPulling="2026-03-14 08:52:31.904006336 +0000 UTC m=+6814.655921520" lastFinishedPulling="2026-03-14 08:52:34.313178493 +0000 UTC m=+6817.065093677" observedRunningTime="2026-03-14 08:52:34.972958931 +0000 UTC m=+6817.724874135" watchObservedRunningTime="2026-03-14 08:52:34.986754864 +0000 UTC m=+6817.738670058" Mar 14 08:52:35 crc kubenswrapper[5129]: I0314 08:52:35.962556 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcpvc" event={"ID":"c8ee7de1-2466-4b8f-b559-66e9f11efdeb","Type":"ContainerDied","Data":"97c2ed868dec0e7f8f957ad75b5d89bf220eb5509153fdf637dd21c90ae272dc"} Mar 14 08:52:35 crc kubenswrapper[5129]: I0314 08:52:35.962482 5129 generic.go:334] "Generic (PLEG): container finished" podID="c8ee7de1-2466-4b8f-b559-66e9f11efdeb" containerID="97c2ed868dec0e7f8f957ad75b5d89bf220eb5509153fdf637dd21c90ae272dc" 
exitCode=0 Mar 14 08:52:36 crc kubenswrapper[5129]: I0314 08:52:36.974332 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcpvc" event={"ID":"c8ee7de1-2466-4b8f-b559-66e9f11efdeb","Type":"ContainerStarted","Data":"7d20f4d49c196c7b87c322876aa2fe8eb98d4d005d48ebe98a2baf2feea43288"} Mar 14 08:52:37 crc kubenswrapper[5129]: I0314 08:52:37.003658 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mcpvc" podStartSLOduration=2.613345258 podStartE2EDuration="5.003629444s" podCreationTimestamp="2026-03-14 08:52:32 +0000 UTC" firstStartedPulling="2026-03-14 08:52:33.936137448 +0000 UTC m=+6816.688052622" lastFinishedPulling="2026-03-14 08:52:36.326421624 +0000 UTC m=+6819.078336808" observedRunningTime="2026-03-14 08:52:36.997217141 +0000 UTC m=+6819.749132335" watchObservedRunningTime="2026-03-14 08:52:37.003629444 +0000 UTC m=+6819.755544628" Mar 14 08:52:38 crc kubenswrapper[5129]: I0314 08:52:38.495058 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Mar 14 08:52:38 crc kubenswrapper[5129]: I0314 08:52:38.496853 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Mar 14 08:52:38 crc kubenswrapper[5129]: I0314 08:52:38.503455 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jslnd" Mar 14 08:52:38 crc kubenswrapper[5129]: I0314 08:52:38.529118 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 14 08:52:38 crc kubenswrapper[5129]: I0314 08:52:38.548016 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx2kx\" (UniqueName: \"kubernetes.io/projected/e61aa93b-36ea-424e-b43d-ff07a45e91f5-kube-api-access-tx2kx\") pod \"mariadb-copy-data\" (UID: \"e61aa93b-36ea-424e-b43d-ff07a45e91f5\") " pod="openstack/mariadb-copy-data" Mar 14 08:52:38 crc kubenswrapper[5129]: I0314 08:52:38.548149 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d2c402f6-3b85-47df-ad54-4b5bcf5f6852\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2c402f6-3b85-47df-ad54-4b5bcf5f6852\") pod \"mariadb-copy-data\" (UID: \"e61aa93b-36ea-424e-b43d-ff07a45e91f5\") " pod="openstack/mariadb-copy-data" Mar 14 08:52:38 crc kubenswrapper[5129]: I0314 08:52:38.649473 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d2c402f6-3b85-47df-ad54-4b5bcf5f6852\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2c402f6-3b85-47df-ad54-4b5bcf5f6852\") pod \"mariadb-copy-data\" (UID: \"e61aa93b-36ea-424e-b43d-ff07a45e91f5\") " pod="openstack/mariadb-copy-data" Mar 14 08:52:38 crc kubenswrapper[5129]: I0314 08:52:38.649821 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx2kx\" (UniqueName: \"kubernetes.io/projected/e61aa93b-36ea-424e-b43d-ff07a45e91f5-kube-api-access-tx2kx\") pod \"mariadb-copy-data\" (UID: \"e61aa93b-36ea-424e-b43d-ff07a45e91f5\") " pod="openstack/mariadb-copy-data" 
Mar 14 08:52:38 crc kubenswrapper[5129]: I0314 08:52:38.656314 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 08:52:38 crc kubenswrapper[5129]: I0314 08:52:38.656381 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d2c402f6-3b85-47df-ad54-4b5bcf5f6852\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2c402f6-3b85-47df-ad54-4b5bcf5f6852\") pod \"mariadb-copy-data\" (UID: \"e61aa93b-36ea-424e-b43d-ff07a45e91f5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3ffd7b8c321dbbb3cd52b3ba340ac26972321eccc9c093b153957db1cef996b9/globalmount\"" pod="openstack/mariadb-copy-data" Mar 14 08:52:38 crc kubenswrapper[5129]: I0314 08:52:38.678768 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx2kx\" (UniqueName: \"kubernetes.io/projected/e61aa93b-36ea-424e-b43d-ff07a45e91f5-kube-api-access-tx2kx\") pod \"mariadb-copy-data\" (UID: \"e61aa93b-36ea-424e-b43d-ff07a45e91f5\") " pod="openstack/mariadb-copy-data" Mar 14 08:52:38 crc kubenswrapper[5129]: I0314 08:52:38.721702 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d2c402f6-3b85-47df-ad54-4b5bcf5f6852\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2c402f6-3b85-47df-ad54-4b5bcf5f6852\") pod \"mariadb-copy-data\" (UID: \"e61aa93b-36ea-424e-b43d-ff07a45e91f5\") " pod="openstack/mariadb-copy-data" Mar 14 08:52:38 crc kubenswrapper[5129]: I0314 08:52:38.826025 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Mar 14 08:52:39 crc kubenswrapper[5129]: I0314 08:52:39.465354 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 14 08:52:39 crc kubenswrapper[5129]: W0314 08:52:39.476317 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode61aa93b_36ea_424e_b43d_ff07a45e91f5.slice/crio-a432328f72157ceffec8575fc3e687d500ae043a3ba78122ff461f8cd183a68b WatchSource:0}: Error finding container a432328f72157ceffec8575fc3e687d500ae043a3ba78122ff461f8cd183a68b: Status 404 returned error can't find the container with id a432328f72157ceffec8575fc3e687d500ae043a3ba78122ff461f8cd183a68b Mar 14 08:52:40 crc kubenswrapper[5129]: I0314 08:52:40.069095 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.069058533 podStartE2EDuration="3.069058533s" podCreationTimestamp="2026-03-14 08:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:52:40.065256991 +0000 UTC m=+6822.817172175" watchObservedRunningTime="2026-03-14 08:52:40.069058533 +0000 UTC m=+6822.820973707" Mar 14 08:52:40 crc kubenswrapper[5129]: I0314 08:52:40.103659 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e61aa93b-36ea-424e-b43d-ff07a45e91f5","Type":"ContainerStarted","Data":"00969cec8b7b4cd1545e12370a7b226bcdc8afe3505183d3ca8595bf51c5b5e8"} Mar 14 08:52:40 crc kubenswrapper[5129]: I0314 08:52:40.103728 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e61aa93b-36ea-424e-b43d-ff07a45e91f5","Type":"ContainerStarted","Data":"a432328f72157ceffec8575fc3e687d500ae043a3ba78122ff461f8cd183a68b"} Mar 14 08:52:40 crc kubenswrapper[5129]: I0314 08:52:40.641504 5129 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9jg8b" Mar 14 08:52:40 crc kubenswrapper[5129]: I0314 08:52:40.641715 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9jg8b" Mar 14 08:52:40 crc kubenswrapper[5129]: I0314 08:52:40.705085 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9jg8b" Mar 14 08:52:41 crc kubenswrapper[5129]: I0314 08:52:41.121300 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9jg8b" Mar 14 08:52:41 crc kubenswrapper[5129]: I0314 08:52:41.894398 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9jg8b"] Mar 14 08:52:42 crc kubenswrapper[5129]: I0314 08:52:42.455710 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mcpvc" Mar 14 08:52:42 crc kubenswrapper[5129]: I0314 08:52:42.455802 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mcpvc" Mar 14 08:52:43 crc kubenswrapper[5129]: I0314 08:52:43.077807 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9jg8b" podUID="99fe8072-e11e-4959-8062-0f2f9e9d8bb0" containerName="registry-server" containerID="cri-o://58ec2a10c039d3a3725d139cb81955a9af406293fc11ee7f530f3b9d8b7922b6" gracePeriod=2 Mar 14 08:52:43 crc kubenswrapper[5129]: I0314 08:52:43.433860 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 14 08:52:43 crc kubenswrapper[5129]: I0314 08:52:43.435018 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 14 08:52:43 crc kubenswrapper[5129]: I0314 08:52:43.456250 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 14 08:52:43 crc kubenswrapper[5129]: I0314 08:52:43.497425 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mcpvc" podUID="c8ee7de1-2466-4b8f-b559-66e9f11efdeb" containerName="registry-server" probeResult="failure" output=< Mar 14 08:52:43 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 08:52:43 crc kubenswrapper[5129]: > Mar 14 08:52:43 crc kubenswrapper[5129]: I0314 08:52:43.552167 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p9v8\" (UniqueName: \"kubernetes.io/projected/73a99ff9-25d4-42bb-8b96-a8ac58b130ac-kube-api-access-4p9v8\") pod \"mariadb-client\" (UID: \"73a99ff9-25d4-42bb-8b96-a8ac58b130ac\") " pod="openstack/mariadb-client" Mar 14 08:52:43 crc kubenswrapper[5129]: I0314 08:52:43.654411 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p9v8\" (UniqueName: \"kubernetes.io/projected/73a99ff9-25d4-42bb-8b96-a8ac58b130ac-kube-api-access-4p9v8\") pod \"mariadb-client\" (UID: \"73a99ff9-25d4-42bb-8b96-a8ac58b130ac\") " pod="openstack/mariadb-client" Mar 14 08:52:43 crc kubenswrapper[5129]: I0314 08:52:43.678741 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p9v8\" (UniqueName: \"kubernetes.io/projected/73a99ff9-25d4-42bb-8b96-a8ac58b130ac-kube-api-access-4p9v8\") pod \"mariadb-client\" (UID: \"73a99ff9-25d4-42bb-8b96-a8ac58b130ac\") " pod="openstack/mariadb-client" Mar 14 08:52:43 crc kubenswrapper[5129]: I0314 08:52:43.781000 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 14 08:52:44 crc kubenswrapper[5129]: I0314 08:52:44.094014 5129 generic.go:334] "Generic (PLEG): container finished" podID="99fe8072-e11e-4959-8062-0f2f9e9d8bb0" containerID="58ec2a10c039d3a3725d139cb81955a9af406293fc11ee7f530f3b9d8b7922b6" exitCode=0 Mar 14 08:52:44 crc kubenswrapper[5129]: I0314 08:52:44.094089 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jg8b" event={"ID":"99fe8072-e11e-4959-8062-0f2f9e9d8bb0","Type":"ContainerDied","Data":"58ec2a10c039d3a3725d139cb81955a9af406293fc11ee7f530f3b9d8b7922b6"} Mar 14 08:52:44 crc kubenswrapper[5129]: I0314 08:52:44.245126 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 14 08:52:44 crc kubenswrapper[5129]: I0314 08:52:44.351568 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9jg8b" Mar 14 08:52:44 crc kubenswrapper[5129]: I0314 08:52:44.469080 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99fe8072-e11e-4959-8062-0f2f9e9d8bb0-utilities\") pod \"99fe8072-e11e-4959-8062-0f2f9e9d8bb0\" (UID: \"99fe8072-e11e-4959-8062-0f2f9e9d8bb0\") " Mar 14 08:52:44 crc kubenswrapper[5129]: I0314 08:52:44.469165 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzhvs\" (UniqueName: \"kubernetes.io/projected/99fe8072-e11e-4959-8062-0f2f9e9d8bb0-kube-api-access-qzhvs\") pod \"99fe8072-e11e-4959-8062-0f2f9e9d8bb0\" (UID: \"99fe8072-e11e-4959-8062-0f2f9e9d8bb0\") " Mar 14 08:52:44 crc kubenswrapper[5129]: I0314 08:52:44.469295 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99fe8072-e11e-4959-8062-0f2f9e9d8bb0-catalog-content\") pod \"99fe8072-e11e-4959-8062-0f2f9e9d8bb0\" 
(UID: \"99fe8072-e11e-4959-8062-0f2f9e9d8bb0\") " Mar 14 08:52:44 crc kubenswrapper[5129]: I0314 08:52:44.470788 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99fe8072-e11e-4959-8062-0f2f9e9d8bb0-utilities" (OuterVolumeSpecName: "utilities") pod "99fe8072-e11e-4959-8062-0f2f9e9d8bb0" (UID: "99fe8072-e11e-4959-8062-0f2f9e9d8bb0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:52:44 crc kubenswrapper[5129]: I0314 08:52:44.477802 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99fe8072-e11e-4959-8062-0f2f9e9d8bb0-kube-api-access-qzhvs" (OuterVolumeSpecName: "kube-api-access-qzhvs") pod "99fe8072-e11e-4959-8062-0f2f9e9d8bb0" (UID: "99fe8072-e11e-4959-8062-0f2f9e9d8bb0"). InnerVolumeSpecName "kube-api-access-qzhvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:44 crc kubenswrapper[5129]: I0314 08:52:44.540145 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99fe8072-e11e-4959-8062-0f2f9e9d8bb0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99fe8072-e11e-4959-8062-0f2f9e9d8bb0" (UID: "99fe8072-e11e-4959-8062-0f2f9e9d8bb0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:52:44 crc kubenswrapper[5129]: I0314 08:52:44.571896 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzhvs\" (UniqueName: \"kubernetes.io/projected/99fe8072-e11e-4959-8062-0f2f9e9d8bb0-kube-api-access-qzhvs\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:44 crc kubenswrapper[5129]: I0314 08:52:44.571978 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99fe8072-e11e-4959-8062-0f2f9e9d8bb0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:44 crc kubenswrapper[5129]: I0314 08:52:44.572006 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99fe8072-e11e-4959-8062-0f2f9e9d8bb0-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.105336 5129 generic.go:334] "Generic (PLEG): container finished" podID="73a99ff9-25d4-42bb-8b96-a8ac58b130ac" containerID="f31c05ff22a0a586c3c03277071c6ddb9534c1b40411b6420de498964f888eb6" exitCode=0 Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.105438 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"73a99ff9-25d4-42bb-8b96-a8ac58b130ac","Type":"ContainerDied","Data":"f31c05ff22a0a586c3c03277071c6ddb9534c1b40411b6420de498964f888eb6"} Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.105551 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"73a99ff9-25d4-42bb-8b96-a8ac58b130ac","Type":"ContainerStarted","Data":"f9c57482b723cc01f9b5855de059f014512f3fb3e7552f826b3a57121ccfc87d"} Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.110016 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jg8b" 
event={"ID":"99fe8072-e11e-4959-8062-0f2f9e9d8bb0","Type":"ContainerDied","Data":"7fa26262bf6ba53252988997d09cd6b371f320c15cd7914b0644f1a91b63c744"} Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.110093 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9jg8b" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.110098 5129 scope.go:117] "RemoveContainer" containerID="58ec2a10c039d3a3725d139cb81955a9af406293fc11ee7f530f3b9d8b7922b6" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.140118 5129 scope.go:117] "RemoveContainer" containerID="ef046be055e54d0ea5667afaf39909268c69367b0c2d2cfbd66e1bd087a59dfa" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.167766 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9jg8b"] Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.169392 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9jg8b"] Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.187719 5129 scope.go:117] "RemoveContainer" containerID="2dca3d75dc6576fefd79e8a7c51e94df4d58ffbb401ac6371f67eb3a9873c48f" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.311370 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2z7fv"] Mar 14 08:52:45 crc kubenswrapper[5129]: E0314 08:52:45.312403 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99fe8072-e11e-4959-8062-0f2f9e9d8bb0" containerName="extract-utilities" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.312449 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="99fe8072-e11e-4959-8062-0f2f9e9d8bb0" containerName="extract-utilities" Mar 14 08:52:45 crc kubenswrapper[5129]: E0314 08:52:45.312487 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99fe8072-e11e-4959-8062-0f2f9e9d8bb0" 
containerName="extract-content" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.312497 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="99fe8072-e11e-4959-8062-0f2f9e9d8bb0" containerName="extract-content" Mar 14 08:52:45 crc kubenswrapper[5129]: E0314 08:52:45.312512 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99fe8072-e11e-4959-8062-0f2f9e9d8bb0" containerName="registry-server" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.312522 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="99fe8072-e11e-4959-8062-0f2f9e9d8bb0" containerName="registry-server" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.312771 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="99fe8072-e11e-4959-8062-0f2f9e9d8bb0" containerName="registry-server" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.314543 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2z7fv" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.332865 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2z7fv"] Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.486137 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f3fb4c7-4c5e-40d5-be83-67e7a0279965-catalog-content\") pod \"redhat-marketplace-2z7fv\" (UID: \"2f3fb4c7-4c5e-40d5-be83-67e7a0279965\") " pod="openshift-marketplace/redhat-marketplace-2z7fv" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.486519 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f3fb4c7-4c5e-40d5-be83-67e7a0279965-utilities\") pod \"redhat-marketplace-2z7fv\" (UID: \"2f3fb4c7-4c5e-40d5-be83-67e7a0279965\") " 
pod="openshift-marketplace/redhat-marketplace-2z7fv" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.486567 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2wzt\" (UniqueName: \"kubernetes.io/projected/2f3fb4c7-4c5e-40d5-be83-67e7a0279965-kube-api-access-w2wzt\") pod \"redhat-marketplace-2z7fv\" (UID: \"2f3fb4c7-4c5e-40d5-be83-67e7a0279965\") " pod="openshift-marketplace/redhat-marketplace-2z7fv" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.589029 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f3fb4c7-4c5e-40d5-be83-67e7a0279965-catalog-content\") pod \"redhat-marketplace-2z7fv\" (UID: \"2f3fb4c7-4c5e-40d5-be83-67e7a0279965\") " pod="openshift-marketplace/redhat-marketplace-2z7fv" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.589168 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f3fb4c7-4c5e-40d5-be83-67e7a0279965-utilities\") pod \"redhat-marketplace-2z7fv\" (UID: \"2f3fb4c7-4c5e-40d5-be83-67e7a0279965\") " pod="openshift-marketplace/redhat-marketplace-2z7fv" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.589266 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2wzt\" (UniqueName: \"kubernetes.io/projected/2f3fb4c7-4c5e-40d5-be83-67e7a0279965-kube-api-access-w2wzt\") pod \"redhat-marketplace-2z7fv\" (UID: \"2f3fb4c7-4c5e-40d5-be83-67e7a0279965\") " pod="openshift-marketplace/redhat-marketplace-2z7fv" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.591123 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f3fb4c7-4c5e-40d5-be83-67e7a0279965-catalog-content\") pod \"redhat-marketplace-2z7fv\" (UID: \"2f3fb4c7-4c5e-40d5-be83-67e7a0279965\") " 
pod="openshift-marketplace/redhat-marketplace-2z7fv" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.591246 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f3fb4c7-4c5e-40d5-be83-67e7a0279965-utilities\") pod \"redhat-marketplace-2z7fv\" (UID: \"2f3fb4c7-4c5e-40d5-be83-67e7a0279965\") " pod="openshift-marketplace/redhat-marketplace-2z7fv" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.612147 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2wzt\" (UniqueName: \"kubernetes.io/projected/2f3fb4c7-4c5e-40d5-be83-67e7a0279965-kube-api-access-w2wzt\") pod \"redhat-marketplace-2z7fv\" (UID: \"2f3fb4c7-4c5e-40d5-be83-67e7a0279965\") " pod="openshift-marketplace/redhat-marketplace-2z7fv" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.661261 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2z7fv" Mar 14 08:52:45 crc kubenswrapper[5129]: I0314 08:52:45.952885 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2z7fv"] Mar 14 08:52:46 crc kubenswrapper[5129]: I0314 08:52:46.050898 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99fe8072-e11e-4959-8062-0f2f9e9d8bb0" path="/var/lib/kubelet/pods/99fe8072-e11e-4959-8062-0f2f9e9d8bb0/volumes" Mar 14 08:52:46 crc kubenswrapper[5129]: I0314 08:52:46.122273 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2z7fv" event={"ID":"2f3fb4c7-4c5e-40d5-be83-67e7a0279965","Type":"ContainerStarted","Data":"c1e0e6099c3269df08794c578c3dfb5a2644be2b88adc6628416faebbc7ccfcc"} Mar 14 08:52:46 crc kubenswrapper[5129]: I0314 08:52:46.422229 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 14 08:52:46 crc kubenswrapper[5129]: I0314 08:52:46.444918 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_73a99ff9-25d4-42bb-8b96-a8ac58b130ac/mariadb-client/0.log" Mar 14 08:52:46 crc kubenswrapper[5129]: I0314 08:52:46.472129 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 14 08:52:46 crc kubenswrapper[5129]: I0314 08:52:46.480132 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 14 08:52:46 crc kubenswrapper[5129]: I0314 08:52:46.504456 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p9v8\" (UniqueName: \"kubernetes.io/projected/73a99ff9-25d4-42bb-8b96-a8ac58b130ac-kube-api-access-4p9v8\") pod \"73a99ff9-25d4-42bb-8b96-a8ac58b130ac\" (UID: \"73a99ff9-25d4-42bb-8b96-a8ac58b130ac\") " Mar 14 08:52:46 crc kubenswrapper[5129]: I0314 08:52:46.511625 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a99ff9-25d4-42bb-8b96-a8ac58b130ac-kube-api-access-4p9v8" (OuterVolumeSpecName: "kube-api-access-4p9v8") pod "73a99ff9-25d4-42bb-8b96-a8ac58b130ac" (UID: "73a99ff9-25d4-42bb-8b96-a8ac58b130ac"). InnerVolumeSpecName "kube-api-access-4p9v8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:46 crc kubenswrapper[5129]: I0314 08:52:46.606919 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p9v8\" (UniqueName: \"kubernetes.io/projected/73a99ff9-25d4-42bb-8b96-a8ac58b130ac-kube-api-access-4p9v8\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:46 crc kubenswrapper[5129]: I0314 08:52:46.631774 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 14 08:52:46 crc kubenswrapper[5129]: E0314 08:52:46.632226 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a99ff9-25d4-42bb-8b96-a8ac58b130ac" containerName="mariadb-client" Mar 14 08:52:46 crc kubenswrapper[5129]: I0314 08:52:46.632272 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a99ff9-25d4-42bb-8b96-a8ac58b130ac" containerName="mariadb-client" Mar 14 08:52:46 crc kubenswrapper[5129]: I0314 08:52:46.632440 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a99ff9-25d4-42bb-8b96-a8ac58b130ac" containerName="mariadb-client" Mar 14 08:52:46 crc kubenswrapper[5129]: I0314 08:52:46.633023 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 14 08:52:46 crc kubenswrapper[5129]: I0314 08:52:46.645356 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 14 08:52:46 crc kubenswrapper[5129]: I0314 08:52:46.708588 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk2pv\" (UniqueName: \"kubernetes.io/projected/9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb-kube-api-access-qk2pv\") pod \"mariadb-client\" (UID: \"9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb\") " pod="openstack/mariadb-client" Mar 14 08:52:46 crc kubenswrapper[5129]: I0314 08:52:46.810268 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk2pv\" (UniqueName: \"kubernetes.io/projected/9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb-kube-api-access-qk2pv\") pod \"mariadb-client\" (UID: \"9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb\") " pod="openstack/mariadb-client" Mar 14 08:52:46 crc kubenswrapper[5129]: I0314 08:52:46.844577 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk2pv\" (UniqueName: \"kubernetes.io/projected/9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb-kube-api-access-qk2pv\") pod \"mariadb-client\" (UID: \"9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb\") " pod="openstack/mariadb-client" Mar 14 08:52:46 crc kubenswrapper[5129]: I0314 08:52:46.975360 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 14 08:52:47 crc kubenswrapper[5129]: I0314 08:52:47.134199 5129 generic.go:334] "Generic (PLEG): container finished" podID="2f3fb4c7-4c5e-40d5-be83-67e7a0279965" containerID="db536d3169a154ecf6cbb5478d9a6a3a89891cd93ce846fdcf56e30ace15f204" exitCode=0 Mar 14 08:52:47 crc kubenswrapper[5129]: I0314 08:52:47.134321 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2z7fv" event={"ID":"2f3fb4c7-4c5e-40d5-be83-67e7a0279965","Type":"ContainerDied","Data":"db536d3169a154ecf6cbb5478d9a6a3a89891cd93ce846fdcf56e30ace15f204"} Mar 14 08:52:47 crc kubenswrapper[5129]: I0314 08:52:47.143902 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9c57482b723cc01f9b5855de059f014512f3fb3e7552f826b3a57121ccfc87d" Mar 14 08:52:47 crc kubenswrapper[5129]: I0314 08:52:47.144012 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 14 08:52:47 crc kubenswrapper[5129]: I0314 08:52:47.189044 5129 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="73a99ff9-25d4-42bb-8b96-a8ac58b130ac" podUID="9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb" Mar 14 08:52:47 crc kubenswrapper[5129]: I0314 08:52:47.447242 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 14 08:52:47 crc kubenswrapper[5129]: W0314 08:52:47.450740 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9550bc71_abe0_4fca_bdbe_ed4c02d7d8eb.slice/crio-5af07ea5092ea8e2ee30de2fc523f441bb3f34ad49b274af2931a1f4bbf66bd1 WatchSource:0}: Error finding container 5af07ea5092ea8e2ee30de2fc523f441bb3f34ad49b274af2931a1f4bbf66bd1: Status 404 returned error can't find the container with id 5af07ea5092ea8e2ee30de2fc523f441bb3f34ad49b274af2931a1f4bbf66bd1 Mar 
14 08:52:48 crc kubenswrapper[5129]: I0314 08:52:48.051517 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a99ff9-25d4-42bb-8b96-a8ac58b130ac" path="/var/lib/kubelet/pods/73a99ff9-25d4-42bb-8b96-a8ac58b130ac/volumes" Mar 14 08:52:48 crc kubenswrapper[5129]: I0314 08:52:48.159884 5129 generic.go:334] "Generic (PLEG): container finished" podID="2f3fb4c7-4c5e-40d5-be83-67e7a0279965" containerID="6c8dce5172edc3b73456aa1b85bf6cccbaf655cfa9f81d3ad64ab891d24adb62" exitCode=0 Mar 14 08:52:48 crc kubenswrapper[5129]: I0314 08:52:48.159991 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2z7fv" event={"ID":"2f3fb4c7-4c5e-40d5-be83-67e7a0279965","Type":"ContainerDied","Data":"6c8dce5172edc3b73456aa1b85bf6cccbaf655cfa9f81d3ad64ab891d24adb62"} Mar 14 08:52:48 crc kubenswrapper[5129]: I0314 08:52:48.166869 5129 generic.go:334] "Generic (PLEG): container finished" podID="9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb" containerID="3061c746513933ee96010e2b1a741f4f421b83a078b3c1f8383ac8ce46bdbde3" exitCode=0 Mar 14 08:52:48 crc kubenswrapper[5129]: I0314 08:52:48.166966 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb","Type":"ContainerDied","Data":"3061c746513933ee96010e2b1a741f4f421b83a078b3c1f8383ac8ce46bdbde3"} Mar 14 08:52:48 crc kubenswrapper[5129]: I0314 08:52:48.167065 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb","Type":"ContainerStarted","Data":"5af07ea5092ea8e2ee30de2fc523f441bb3f34ad49b274af2931a1f4bbf66bd1"} Mar 14 08:52:49 crc kubenswrapper[5129]: I0314 08:52:49.181967 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2z7fv" event={"ID":"2f3fb4c7-4c5e-40d5-be83-67e7a0279965","Type":"ContainerStarted","Data":"c76d58534cdde3f18b59f76b462e3f49eede1f20b9559280cb2414ebac8f9697"} 
Mar 14 08:52:49 crc kubenswrapper[5129]: I0314 08:52:49.210816 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2z7fv" podStartSLOduration=2.562308998 podStartE2EDuration="4.210787658s" podCreationTimestamp="2026-03-14 08:52:45 +0000 UTC" firstStartedPulling="2026-03-14 08:52:47.138449958 +0000 UTC m=+6829.890365162" lastFinishedPulling="2026-03-14 08:52:48.786928638 +0000 UTC m=+6831.538843822" observedRunningTime="2026-03-14 08:52:49.200652924 +0000 UTC m=+6831.952568128" watchObservedRunningTime="2026-03-14 08:52:49.210787658 +0000 UTC m=+6831.962702852" Mar 14 08:52:49 crc kubenswrapper[5129]: I0314 08:52:49.563886 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 14 08:52:49 crc kubenswrapper[5129]: I0314 08:52:49.593248 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb/mariadb-client/0.log" Mar 14 08:52:49 crc kubenswrapper[5129]: I0314 08:52:49.619303 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 14 08:52:49 crc kubenswrapper[5129]: I0314 08:52:49.632234 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 14 08:52:49 crc kubenswrapper[5129]: I0314 08:52:49.673080 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk2pv\" (UniqueName: \"kubernetes.io/projected/9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb-kube-api-access-qk2pv\") pod \"9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb\" (UID: \"9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb\") " Mar 14 08:52:49 crc kubenswrapper[5129]: I0314 08:52:49.680214 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb-kube-api-access-qk2pv" (OuterVolumeSpecName: "kube-api-access-qk2pv") pod 
"9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb" (UID: "9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb"). InnerVolumeSpecName "kube-api-access-qk2pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:49 crc kubenswrapper[5129]: I0314 08:52:49.775502 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk2pv\" (UniqueName: \"kubernetes.io/projected/9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb-kube-api-access-qk2pv\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:50 crc kubenswrapper[5129]: I0314 08:52:50.047261 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb" path="/var/lib/kubelet/pods/9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb/volumes" Mar 14 08:52:50 crc kubenswrapper[5129]: I0314 08:52:50.196923 5129 scope.go:117] "RemoveContainer" containerID="3061c746513933ee96010e2b1a741f4f421b83a078b3c1f8383ac8ce46bdbde3" Mar 14 08:52:50 crc kubenswrapper[5129]: I0314 08:52:50.196985 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 14 08:52:52 crc kubenswrapper[5129]: I0314 08:52:52.532976 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mcpvc" Mar 14 08:52:52 crc kubenswrapper[5129]: I0314 08:52:52.619666 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mcpvc" Mar 14 08:52:52 crc kubenswrapper[5129]: I0314 08:52:52.784710 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mcpvc"] Mar 14 08:52:54 crc kubenswrapper[5129]: I0314 08:52:54.233036 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mcpvc" podUID="c8ee7de1-2466-4b8f-b559-66e9f11efdeb" containerName="registry-server" containerID="cri-o://7d20f4d49c196c7b87c322876aa2fe8eb98d4d005d48ebe98a2baf2feea43288" gracePeriod=2 Mar 14 08:52:54 crc kubenswrapper[5129]: I0314 08:52:54.695670 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mcpvc" Mar 14 08:52:54 crc kubenswrapper[5129]: I0314 08:52:54.768196 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ee7de1-2466-4b8f-b559-66e9f11efdeb-utilities\") pod \"c8ee7de1-2466-4b8f-b559-66e9f11efdeb\" (UID: \"c8ee7de1-2466-4b8f-b559-66e9f11efdeb\") " Mar 14 08:52:54 crc kubenswrapper[5129]: I0314 08:52:54.769091 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ee7de1-2466-4b8f-b559-66e9f11efdeb-catalog-content\") pod \"c8ee7de1-2466-4b8f-b559-66e9f11efdeb\" (UID: \"c8ee7de1-2466-4b8f-b559-66e9f11efdeb\") " Mar 14 08:52:54 crc kubenswrapper[5129]: I0314 08:52:54.769317 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8ee7de1-2466-4b8f-b559-66e9f11efdeb-utilities" (OuterVolumeSpecName: "utilities") pod "c8ee7de1-2466-4b8f-b559-66e9f11efdeb" (UID: "c8ee7de1-2466-4b8f-b559-66e9f11efdeb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:52:54 crc kubenswrapper[5129]: I0314 08:52:54.769433 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flsz5\" (UniqueName: \"kubernetes.io/projected/c8ee7de1-2466-4b8f-b559-66e9f11efdeb-kube-api-access-flsz5\") pod \"c8ee7de1-2466-4b8f-b559-66e9f11efdeb\" (UID: \"c8ee7de1-2466-4b8f-b559-66e9f11efdeb\") " Mar 14 08:52:54 crc kubenswrapper[5129]: I0314 08:52:54.769949 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ee7de1-2466-4b8f-b559-66e9f11efdeb-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:54 crc kubenswrapper[5129]: I0314 08:52:54.776922 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8ee7de1-2466-4b8f-b559-66e9f11efdeb-kube-api-access-flsz5" (OuterVolumeSpecName: "kube-api-access-flsz5") pod "c8ee7de1-2466-4b8f-b559-66e9f11efdeb" (UID: "c8ee7de1-2466-4b8f-b559-66e9f11efdeb"). InnerVolumeSpecName "kube-api-access-flsz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:54 crc kubenswrapper[5129]: I0314 08:52:54.871619 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flsz5\" (UniqueName: \"kubernetes.io/projected/c8ee7de1-2466-4b8f-b559-66e9f11efdeb-kube-api-access-flsz5\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:54 crc kubenswrapper[5129]: I0314 08:52:54.907055 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8ee7de1-2466-4b8f-b559-66e9f11efdeb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8ee7de1-2466-4b8f-b559-66e9f11efdeb" (UID: "c8ee7de1-2466-4b8f-b559-66e9f11efdeb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:52:54 crc kubenswrapper[5129]: I0314 08:52:54.973431 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ee7de1-2466-4b8f-b559-66e9f11efdeb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:55 crc kubenswrapper[5129]: I0314 08:52:55.246002 5129 generic.go:334] "Generic (PLEG): container finished" podID="c8ee7de1-2466-4b8f-b559-66e9f11efdeb" containerID="7d20f4d49c196c7b87c322876aa2fe8eb98d4d005d48ebe98a2baf2feea43288" exitCode=0 Mar 14 08:52:55 crc kubenswrapper[5129]: I0314 08:52:55.246051 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcpvc" event={"ID":"c8ee7de1-2466-4b8f-b559-66e9f11efdeb","Type":"ContainerDied","Data":"7d20f4d49c196c7b87c322876aa2fe8eb98d4d005d48ebe98a2baf2feea43288"} Mar 14 08:52:55 crc kubenswrapper[5129]: I0314 08:52:55.246079 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcpvc" event={"ID":"c8ee7de1-2466-4b8f-b559-66e9f11efdeb","Type":"ContainerDied","Data":"c84a1f21d555c813f24e02b9a2f28d44640de4b6705ebd906f58053231d4b0e0"} Mar 14 08:52:55 crc kubenswrapper[5129]: I0314 08:52:55.246129 5129 scope.go:117] "RemoveContainer" containerID="7d20f4d49c196c7b87c322876aa2fe8eb98d4d005d48ebe98a2baf2feea43288" Mar 14 08:52:55 crc kubenswrapper[5129]: I0314 08:52:55.246340 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mcpvc" Mar 14 08:52:55 crc kubenswrapper[5129]: I0314 08:52:55.278635 5129 scope.go:117] "RemoveContainer" containerID="97c2ed868dec0e7f8f957ad75b5d89bf220eb5509153fdf637dd21c90ae272dc" Mar 14 08:52:55 crc kubenswrapper[5129]: I0314 08:52:55.295191 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mcpvc"] Mar 14 08:52:55 crc kubenswrapper[5129]: I0314 08:52:55.302580 5129 scope.go:117] "RemoveContainer" containerID="3e8748b9de7c94d47cddee54908ec3557cfdc93f474bab19f367f4f578ddd021" Mar 14 08:52:55 crc kubenswrapper[5129]: I0314 08:52:55.306481 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mcpvc"] Mar 14 08:52:55 crc kubenswrapper[5129]: I0314 08:52:55.352973 5129 scope.go:117] "RemoveContainer" containerID="7d20f4d49c196c7b87c322876aa2fe8eb98d4d005d48ebe98a2baf2feea43288" Mar 14 08:52:55 crc kubenswrapper[5129]: E0314 08:52:55.353852 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d20f4d49c196c7b87c322876aa2fe8eb98d4d005d48ebe98a2baf2feea43288\": container with ID starting with 7d20f4d49c196c7b87c322876aa2fe8eb98d4d005d48ebe98a2baf2feea43288 not found: ID does not exist" containerID="7d20f4d49c196c7b87c322876aa2fe8eb98d4d005d48ebe98a2baf2feea43288" Mar 14 08:52:55 crc kubenswrapper[5129]: I0314 08:52:55.353899 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d20f4d49c196c7b87c322876aa2fe8eb98d4d005d48ebe98a2baf2feea43288"} err="failed to get container status \"7d20f4d49c196c7b87c322876aa2fe8eb98d4d005d48ebe98a2baf2feea43288\": rpc error: code = NotFound desc = could not find container \"7d20f4d49c196c7b87c322876aa2fe8eb98d4d005d48ebe98a2baf2feea43288\": container with ID starting with 7d20f4d49c196c7b87c322876aa2fe8eb98d4d005d48ebe98a2baf2feea43288 not found: ID does 
not exist" Mar 14 08:52:55 crc kubenswrapper[5129]: I0314 08:52:55.353929 5129 scope.go:117] "RemoveContainer" containerID="97c2ed868dec0e7f8f957ad75b5d89bf220eb5509153fdf637dd21c90ae272dc" Mar 14 08:52:55 crc kubenswrapper[5129]: E0314 08:52:55.356443 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97c2ed868dec0e7f8f957ad75b5d89bf220eb5509153fdf637dd21c90ae272dc\": container with ID starting with 97c2ed868dec0e7f8f957ad75b5d89bf220eb5509153fdf637dd21c90ae272dc not found: ID does not exist" containerID="97c2ed868dec0e7f8f957ad75b5d89bf220eb5509153fdf637dd21c90ae272dc" Mar 14 08:52:55 crc kubenswrapper[5129]: I0314 08:52:55.356515 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97c2ed868dec0e7f8f957ad75b5d89bf220eb5509153fdf637dd21c90ae272dc"} err="failed to get container status \"97c2ed868dec0e7f8f957ad75b5d89bf220eb5509153fdf637dd21c90ae272dc\": rpc error: code = NotFound desc = could not find container \"97c2ed868dec0e7f8f957ad75b5d89bf220eb5509153fdf637dd21c90ae272dc\": container with ID starting with 97c2ed868dec0e7f8f957ad75b5d89bf220eb5509153fdf637dd21c90ae272dc not found: ID does not exist" Mar 14 08:52:55 crc kubenswrapper[5129]: I0314 08:52:55.356559 5129 scope.go:117] "RemoveContainer" containerID="3e8748b9de7c94d47cddee54908ec3557cfdc93f474bab19f367f4f578ddd021" Mar 14 08:52:55 crc kubenswrapper[5129]: E0314 08:52:55.358072 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e8748b9de7c94d47cddee54908ec3557cfdc93f474bab19f367f4f578ddd021\": container with ID starting with 3e8748b9de7c94d47cddee54908ec3557cfdc93f474bab19f367f4f578ddd021 not found: ID does not exist" containerID="3e8748b9de7c94d47cddee54908ec3557cfdc93f474bab19f367f4f578ddd021" Mar 14 08:52:55 crc kubenswrapper[5129]: I0314 08:52:55.358105 5129 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e8748b9de7c94d47cddee54908ec3557cfdc93f474bab19f367f4f578ddd021"} err="failed to get container status \"3e8748b9de7c94d47cddee54908ec3557cfdc93f474bab19f367f4f578ddd021\": rpc error: code = NotFound desc = could not find container \"3e8748b9de7c94d47cddee54908ec3557cfdc93f474bab19f367f4f578ddd021\": container with ID starting with 3e8748b9de7c94d47cddee54908ec3557cfdc93f474bab19f367f4f578ddd021 not found: ID does not exist" Mar 14 08:52:55 crc kubenswrapper[5129]: I0314 08:52:55.662955 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2z7fv" Mar 14 08:52:55 crc kubenswrapper[5129]: I0314 08:52:55.663102 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2z7fv" Mar 14 08:52:55 crc kubenswrapper[5129]: I0314 08:52:55.723423 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2z7fv" Mar 14 08:52:56 crc kubenswrapper[5129]: I0314 08:52:56.053493 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8ee7de1-2466-4b8f-b559-66e9f11efdeb" path="/var/lib/kubelet/pods/c8ee7de1-2466-4b8f-b559-66e9f11efdeb/volumes" Mar 14 08:52:56 crc kubenswrapper[5129]: I0314 08:52:56.323142 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2z7fv" Mar 14 08:52:57 crc kubenswrapper[5129]: I0314 08:52:57.984563 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2z7fv"] Mar 14 08:52:58 crc kubenswrapper[5129]: I0314 08:52:58.284401 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2z7fv" podUID="2f3fb4c7-4c5e-40d5-be83-67e7a0279965" containerName="registry-server" 
containerID="cri-o://c76d58534cdde3f18b59f76b462e3f49eede1f20b9559280cb2414ebac8f9697" gracePeriod=2 Mar 14 08:52:58 crc kubenswrapper[5129]: I0314 08:52:58.821954 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2z7fv" Mar 14 08:52:58 crc kubenswrapper[5129]: I0314 08:52:58.854361 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f3fb4c7-4c5e-40d5-be83-67e7a0279965-catalog-content\") pod \"2f3fb4c7-4c5e-40d5-be83-67e7a0279965\" (UID: \"2f3fb4c7-4c5e-40d5-be83-67e7a0279965\") " Mar 14 08:52:58 crc kubenswrapper[5129]: I0314 08:52:58.854526 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f3fb4c7-4c5e-40d5-be83-67e7a0279965-utilities\") pod \"2f3fb4c7-4c5e-40d5-be83-67e7a0279965\" (UID: \"2f3fb4c7-4c5e-40d5-be83-67e7a0279965\") " Mar 14 08:52:58 crc kubenswrapper[5129]: I0314 08:52:58.854575 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2wzt\" (UniqueName: \"kubernetes.io/projected/2f3fb4c7-4c5e-40d5-be83-67e7a0279965-kube-api-access-w2wzt\") pod \"2f3fb4c7-4c5e-40d5-be83-67e7a0279965\" (UID: \"2f3fb4c7-4c5e-40d5-be83-67e7a0279965\") " Mar 14 08:52:58 crc kubenswrapper[5129]: I0314 08:52:58.856417 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f3fb4c7-4c5e-40d5-be83-67e7a0279965-utilities" (OuterVolumeSpecName: "utilities") pod "2f3fb4c7-4c5e-40d5-be83-67e7a0279965" (UID: "2f3fb4c7-4c5e-40d5-be83-67e7a0279965"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:52:58 crc kubenswrapper[5129]: I0314 08:52:58.860564 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f3fb4c7-4c5e-40d5-be83-67e7a0279965-kube-api-access-w2wzt" (OuterVolumeSpecName: "kube-api-access-w2wzt") pod "2f3fb4c7-4c5e-40d5-be83-67e7a0279965" (UID: "2f3fb4c7-4c5e-40d5-be83-67e7a0279965"). InnerVolumeSpecName "kube-api-access-w2wzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:58 crc kubenswrapper[5129]: I0314 08:52:58.892815 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f3fb4c7-4c5e-40d5-be83-67e7a0279965-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f3fb4c7-4c5e-40d5-be83-67e7a0279965" (UID: "2f3fb4c7-4c5e-40d5-be83-67e7a0279965"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:52:58 crc kubenswrapper[5129]: I0314 08:52:58.957106 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f3fb4c7-4c5e-40d5-be83-67e7a0279965-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:58 crc kubenswrapper[5129]: I0314 08:52:58.957153 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2wzt\" (UniqueName: \"kubernetes.io/projected/2f3fb4c7-4c5e-40d5-be83-67e7a0279965-kube-api-access-w2wzt\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:58 crc kubenswrapper[5129]: I0314 08:52:58.957187 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f3fb4c7-4c5e-40d5-be83-67e7a0279965-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:59 crc kubenswrapper[5129]: I0314 08:52:59.297822 5129 generic.go:334] "Generic (PLEG): container finished" podID="2f3fb4c7-4c5e-40d5-be83-67e7a0279965" 
containerID="c76d58534cdde3f18b59f76b462e3f49eede1f20b9559280cb2414ebac8f9697" exitCode=0 Mar 14 08:52:59 crc kubenswrapper[5129]: I0314 08:52:59.297890 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2z7fv" event={"ID":"2f3fb4c7-4c5e-40d5-be83-67e7a0279965","Type":"ContainerDied","Data":"c76d58534cdde3f18b59f76b462e3f49eede1f20b9559280cb2414ebac8f9697"} Mar 14 08:52:59 crc kubenswrapper[5129]: I0314 08:52:59.297931 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2z7fv" event={"ID":"2f3fb4c7-4c5e-40d5-be83-67e7a0279965","Type":"ContainerDied","Data":"c1e0e6099c3269df08794c578c3dfb5a2644be2b88adc6628416faebbc7ccfcc"} Mar 14 08:52:59 crc kubenswrapper[5129]: I0314 08:52:59.297957 5129 scope.go:117] "RemoveContainer" containerID="c76d58534cdde3f18b59f76b462e3f49eede1f20b9559280cb2414ebac8f9697" Mar 14 08:52:59 crc kubenswrapper[5129]: I0314 08:52:59.298101 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2z7fv" Mar 14 08:52:59 crc kubenswrapper[5129]: I0314 08:52:59.362264 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2z7fv"] Mar 14 08:52:59 crc kubenswrapper[5129]: I0314 08:52:59.365087 5129 scope.go:117] "RemoveContainer" containerID="6c8dce5172edc3b73456aa1b85bf6cccbaf655cfa9f81d3ad64ab891d24adb62" Mar 14 08:52:59 crc kubenswrapper[5129]: I0314 08:52:59.372419 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2z7fv"] Mar 14 08:52:59 crc kubenswrapper[5129]: I0314 08:52:59.391169 5129 scope.go:117] "RemoveContainer" containerID="db536d3169a154ecf6cbb5478d9a6a3a89891cd93ce846fdcf56e30ace15f204" Mar 14 08:52:59 crc kubenswrapper[5129]: I0314 08:52:59.425452 5129 scope.go:117] "RemoveContainer" containerID="c76d58534cdde3f18b59f76b462e3f49eede1f20b9559280cb2414ebac8f9697" Mar 14 08:52:59 crc kubenswrapper[5129]: E0314 08:52:59.426299 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c76d58534cdde3f18b59f76b462e3f49eede1f20b9559280cb2414ebac8f9697\": container with ID starting with c76d58534cdde3f18b59f76b462e3f49eede1f20b9559280cb2414ebac8f9697 not found: ID does not exist" containerID="c76d58534cdde3f18b59f76b462e3f49eede1f20b9559280cb2414ebac8f9697" Mar 14 08:52:59 crc kubenswrapper[5129]: I0314 08:52:59.426378 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c76d58534cdde3f18b59f76b462e3f49eede1f20b9559280cb2414ebac8f9697"} err="failed to get container status \"c76d58534cdde3f18b59f76b462e3f49eede1f20b9559280cb2414ebac8f9697\": rpc error: code = NotFound desc = could not find container \"c76d58534cdde3f18b59f76b462e3f49eede1f20b9559280cb2414ebac8f9697\": container with ID starting with c76d58534cdde3f18b59f76b462e3f49eede1f20b9559280cb2414ebac8f9697 not found: 
ID does not exist" Mar 14 08:52:59 crc kubenswrapper[5129]: I0314 08:52:59.426425 5129 scope.go:117] "RemoveContainer" containerID="6c8dce5172edc3b73456aa1b85bf6cccbaf655cfa9f81d3ad64ab891d24adb62" Mar 14 08:52:59 crc kubenswrapper[5129]: E0314 08:52:59.431194 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c8dce5172edc3b73456aa1b85bf6cccbaf655cfa9f81d3ad64ab891d24adb62\": container with ID starting with 6c8dce5172edc3b73456aa1b85bf6cccbaf655cfa9f81d3ad64ab891d24adb62 not found: ID does not exist" containerID="6c8dce5172edc3b73456aa1b85bf6cccbaf655cfa9f81d3ad64ab891d24adb62" Mar 14 08:52:59 crc kubenswrapper[5129]: I0314 08:52:59.431260 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c8dce5172edc3b73456aa1b85bf6cccbaf655cfa9f81d3ad64ab891d24adb62"} err="failed to get container status \"6c8dce5172edc3b73456aa1b85bf6cccbaf655cfa9f81d3ad64ab891d24adb62\": rpc error: code = NotFound desc = could not find container \"6c8dce5172edc3b73456aa1b85bf6cccbaf655cfa9f81d3ad64ab891d24adb62\": container with ID starting with 6c8dce5172edc3b73456aa1b85bf6cccbaf655cfa9f81d3ad64ab891d24adb62 not found: ID does not exist" Mar 14 08:52:59 crc kubenswrapper[5129]: I0314 08:52:59.431300 5129 scope.go:117] "RemoveContainer" containerID="db536d3169a154ecf6cbb5478d9a6a3a89891cd93ce846fdcf56e30ace15f204" Mar 14 08:52:59 crc kubenswrapper[5129]: E0314 08:52:59.432033 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db536d3169a154ecf6cbb5478d9a6a3a89891cd93ce846fdcf56e30ace15f204\": container with ID starting with db536d3169a154ecf6cbb5478d9a6a3a89891cd93ce846fdcf56e30ace15f204 not found: ID does not exist" containerID="db536d3169a154ecf6cbb5478d9a6a3a89891cd93ce846fdcf56e30ace15f204" Mar 14 08:52:59 crc kubenswrapper[5129]: I0314 08:52:59.432118 5129 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db536d3169a154ecf6cbb5478d9a6a3a89891cd93ce846fdcf56e30ace15f204"} err="failed to get container status \"db536d3169a154ecf6cbb5478d9a6a3a89891cd93ce846fdcf56e30ace15f204\": rpc error: code = NotFound desc = could not find container \"db536d3169a154ecf6cbb5478d9a6a3a89891cd93ce846fdcf56e30ace15f204\": container with ID starting with db536d3169a154ecf6cbb5478d9a6a3a89891cd93ce846fdcf56e30ace15f204 not found: ID does not exist" Mar 14 08:53:00 crc kubenswrapper[5129]: I0314 08:53:00.048126 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f3fb4c7-4c5e-40d5-be83-67e7a0279965" path="/var/lib/kubelet/pods/2f3fb4c7-4c5e-40d5-be83-67e7a0279965/volumes" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.656704 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 08:53:25 crc kubenswrapper[5129]: E0314 08:53:25.658284 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f3fb4c7-4c5e-40d5-be83-67e7a0279965" containerName="extract-content" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.658310 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f3fb4c7-4c5e-40d5-be83-67e7a0279965" containerName="extract-content" Mar 14 08:53:25 crc kubenswrapper[5129]: E0314 08:53:25.658336 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb" containerName="mariadb-client" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.658351 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb" containerName="mariadb-client" Mar 14 08:53:25 crc kubenswrapper[5129]: E0314 08:53:25.658385 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ee7de1-2466-4b8f-b559-66e9f11efdeb" containerName="extract-content" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.658401 5129 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c8ee7de1-2466-4b8f-b559-66e9f11efdeb" containerName="extract-content" Mar 14 08:53:25 crc kubenswrapper[5129]: E0314 08:53:25.658424 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f3fb4c7-4c5e-40d5-be83-67e7a0279965" containerName="extract-utilities" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.658437 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f3fb4c7-4c5e-40d5-be83-67e7a0279965" containerName="extract-utilities" Mar 14 08:53:25 crc kubenswrapper[5129]: E0314 08:53:25.658468 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f3fb4c7-4c5e-40d5-be83-67e7a0279965" containerName="registry-server" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.658481 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f3fb4c7-4c5e-40d5-be83-67e7a0279965" containerName="registry-server" Mar 14 08:53:25 crc kubenswrapper[5129]: E0314 08:53:25.658501 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ee7de1-2466-4b8f-b559-66e9f11efdeb" containerName="extract-utilities" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.658513 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ee7de1-2466-4b8f-b559-66e9f11efdeb" containerName="extract-utilities" Mar 14 08:53:25 crc kubenswrapper[5129]: E0314 08:53:25.658534 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ee7de1-2466-4b8f-b559-66e9f11efdeb" containerName="registry-server" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.658547 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ee7de1-2466-4b8f-b559-66e9f11efdeb" containerName="registry-server" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.658834 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f3fb4c7-4c5e-40d5-be83-67e7a0279965" containerName="registry-server" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.658872 5129 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="c8ee7de1-2466-4b8f-b559-66e9f11efdeb" containerName="registry-server" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.658895 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="9550bc71-abe0-4fca-bdbe-ed4c02d7d8eb" containerName="mariadb-client" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.660488 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.664529 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.665004 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-kzp5c" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.665188 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.666108 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.666865 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.678223 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.681692 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.693732 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.695575 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.710014 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.724988 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.745684 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.823903 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db6190b7-2620-4a00-b2bd-ce56d2c94069-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.823958 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/db6190b7-2620-4a00-b2bd-ce56d2c94069-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.823978 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3371490-9878-4c69-8390-4b2aed82dd1d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824003 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e80a5513-d89e-43d3-9d57-d95ee6b3295c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824033 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e80a5513-d89e-43d3-9d57-d95ee6b3295c-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824072 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3371490-9878-4c69-8390-4b2aed82dd1d-config\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824098 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6190b7-2620-4a00-b2bd-ce56d2c94069-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824126 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-86b9ed6a-5ead-4404-96d3-428a399174f8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86b9ed6a-5ead-4404-96d3-428a399174f8\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824156 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2ae47225-e7ff-479c-bbb0-9ac65775cf53\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ae47225-e7ff-479c-bbb0-9ac65775cf53\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824184 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e80a5513-d89e-43d3-9d57-d95ee6b3295c-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824212 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3371490-9878-4c69-8390-4b2aed82dd1d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824256 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e80a5513-d89e-43d3-9d57-d95ee6b3295c-config\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824278 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc2nj\" (UniqueName: \"kubernetes.io/projected/db6190b7-2620-4a00-b2bd-ce56d2c94069-kube-api-access-zc2nj\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824302 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c3371490-9878-4c69-8390-4b2aed82dd1d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824323 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e80a5513-d89e-43d3-9d57-d95ee6b3295c-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824346 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db6190b7-2620-4a00-b2bd-ce56d2c94069-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824368 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e80a5513-d89e-43d3-9d57-d95ee6b3295c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824402 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c3371490-9878-4c69-8390-4b2aed82dd1d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824469 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/db6190b7-2620-4a00-b2bd-ce56d2c94069-ovsdb-rundir\") pod 
\"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824495 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfzdt\" (UniqueName: \"kubernetes.io/projected/e80a5513-d89e-43d3-9d57-d95ee6b3295c-kube-api-access-pfzdt\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824526 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjl97\" (UniqueName: \"kubernetes.io/projected/c3371490-9878-4c69-8390-4b2aed82dd1d-kube-api-access-zjl97\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824546 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db6190b7-2620-4a00-b2bd-ce56d2c94069-config\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824565 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3371490-9878-4c69-8390-4b2aed82dd1d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.824591 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d3a3f319-1a0b-475f-874b-607bbc0f843d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3a3f319-1a0b-475f-874b-607bbc0f843d\") pod 
\"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.926013 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e80a5513-d89e-43d3-9d57-d95ee6b3295c-config\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.926087 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc2nj\" (UniqueName: \"kubernetes.io/projected/db6190b7-2620-4a00-b2bd-ce56d2c94069-kube-api-access-zc2nj\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.926131 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3371490-9878-4c69-8390-4b2aed82dd1d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.926176 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e80a5513-d89e-43d3-9d57-d95ee6b3295c-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.926210 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db6190b7-2620-4a00-b2bd-ce56d2c94069-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.926252 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e80a5513-d89e-43d3-9d57-d95ee6b3295c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.926516 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c3371490-9878-4c69-8390-4b2aed82dd1d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.926565 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/db6190b7-2620-4a00-b2bd-ce56d2c94069-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.926626 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfzdt\" (UniqueName: \"kubernetes.io/projected/e80a5513-d89e-43d3-9d57-d95ee6b3295c-kube-api-access-pfzdt\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.926837 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e80a5513-d89e-43d3-9d57-d95ee6b3295c-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.926679 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjl97\" (UniqueName: 
\"kubernetes.io/projected/c3371490-9878-4c69-8390-4b2aed82dd1d-kube-api-access-zjl97\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.927706 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/db6190b7-2620-4a00-b2bd-ce56d2c94069-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.927782 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c3371490-9878-4c69-8390-4b2aed82dd1d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.927808 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db6190b7-2620-4a00-b2bd-ce56d2c94069-config\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.927918 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3371490-9878-4c69-8390-4b2aed82dd1d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.928034 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d3a3f319-1a0b-475f-874b-607bbc0f843d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3a3f319-1a0b-475f-874b-607bbc0f843d\") pod \"ovsdbserver-sb-2\" (UID: 
\"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.928205 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db6190b7-2620-4a00-b2bd-ce56d2c94069-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.928301 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/db6190b7-2620-4a00-b2bd-ce56d2c94069-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.928357 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3371490-9878-4c69-8390-4b2aed82dd1d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.928440 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e80a5513-d89e-43d3-9d57-d95ee6b3295c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.928454 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e80a5513-d89e-43d3-9d57-d95ee6b3295c-config\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 
08:53:25.928492 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db6190b7-2620-4a00-b2bd-ce56d2c94069-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.928562 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e80a5513-d89e-43d3-9d57-d95ee6b3295c-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.928579 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db6190b7-2620-4a00-b2bd-ce56d2c94069-config\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.928671 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3371490-9878-4c69-8390-4b2aed82dd1d-config\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.928789 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6190b7-2620-4a00-b2bd-ce56d2c94069-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.928879 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-86b9ed6a-5ead-4404-96d3-428a399174f8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86b9ed6a-5ead-4404-96d3-428a399174f8\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.928988 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2ae47225-e7ff-479c-bbb0-9ac65775cf53\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ae47225-e7ff-479c-bbb0-9ac65775cf53\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.929098 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e80a5513-d89e-43d3-9d57-d95ee6b3295c-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.929220 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3371490-9878-4c69-8390-4b2aed82dd1d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.930015 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3371490-9878-4c69-8390-4b2aed82dd1d-config\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.932268 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e80a5513-d89e-43d3-9d57-d95ee6b3295c-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " 
pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.932388 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3371490-9878-4c69-8390-4b2aed82dd1d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.933832 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.933866 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2ae47225-e7ff-479c-bbb0-9ac65775cf53\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ae47225-e7ff-479c-bbb0-9ac65775cf53\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/61e87ff92c1d2e4e6202543ebf229856c924670e4dae8f18e99cfd3bac55e663/globalmount\"" pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.934026 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.934062 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d3a3f319-1a0b-475f-874b-607bbc0f843d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3a3f319-1a0b-475f-874b-607bbc0f843d\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/01bbee6a8dae9d5fef219e94952e93a8df45bd63e18da9a7a8a66b4daa077486/globalmount\"" pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.935111 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.935172 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-86b9ed6a-5ead-4404-96d3-428a399174f8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86b9ed6a-5ead-4404-96d3-428a399174f8\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2820b61e7b92e90d79fb4963efe4d0ea1ab096c3b2b1cf8adc94bd9cddd43a6d/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.936989 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6190b7-2620-4a00-b2bd-ce56d2c94069-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.942284 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3371490-9878-4c69-8390-4b2aed82dd1d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.942577 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e80a5513-d89e-43d3-9d57-d95ee6b3295c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.944267 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/db6190b7-2620-4a00-b2bd-ce56d2c94069-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.945111 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3371490-9878-4c69-8390-4b2aed82dd1d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.945193 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3371490-9878-4c69-8390-4b2aed82dd1d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.947209 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e80a5513-d89e-43d3-9d57-d95ee6b3295c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 
08:53:25.947898 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db6190b7-2620-4a00-b2bd-ce56d2c94069-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.948342 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc2nj\" (UniqueName: \"kubernetes.io/projected/db6190b7-2620-4a00-b2bd-ce56d2c94069-kube-api-access-zc2nj\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.953546 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e80a5513-d89e-43d3-9d57-d95ee6b3295c-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.954679 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfzdt\" (UniqueName: \"kubernetes.io/projected/e80a5513-d89e-43d3-9d57-d95ee6b3295c-kube-api-access-pfzdt\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.967130 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjl97\" (UniqueName: \"kubernetes.io/projected/c3371490-9878-4c69-8390-4b2aed82dd1d-kube-api-access-zjl97\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.977365 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2ae47225-e7ff-479c-bbb0-9ac65775cf53\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ae47225-e7ff-479c-bbb0-9ac65775cf53\") pod \"ovsdbserver-sb-1\" (UID: \"e80a5513-d89e-43d3-9d57-d95ee6b3295c\") " pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.981747 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-86b9ed6a-5ead-4404-96d3-428a399174f8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86b9ed6a-5ead-4404-96d3-428a399174f8\") pod \"ovsdbserver-sb-0\" (UID: \"c3371490-9878-4c69-8390-4b2aed82dd1d\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:25 crc kubenswrapper[5129]: I0314 08:53:25.987083 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d3a3f319-1a0b-475f-874b-607bbc0f843d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3a3f319-1a0b-475f-874b-607bbc0f843d\") pod \"ovsdbserver-sb-2\" (UID: \"db6190b7-2620-4a00-b2bd-ce56d2c94069\") " pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.021104 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.045535 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.054885 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.347751 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.349666 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.352567 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.359417 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.359855 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.360027 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-fnzrn" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.370883 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.372597 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.386209 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.394235 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.395944 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.404322 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.413754 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.447243 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecea838-e1a9-4aa0-8602-ffe9621ff137-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.447311 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxw2h\" (UniqueName: \"kubernetes.io/projected/8ecea838-e1a9-4aa0-8602-ffe9621ff137-kube-api-access-wxw2h\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.447358 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecea838-e1a9-4aa0-8602-ffe9621ff137-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.447398 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ecea838-e1a9-4aa0-8602-ffe9621ff137-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 
08:53:26.447731 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d26bdbd0-d471-4c00-91c1-c1e8b14e3012\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d26bdbd0-d471-4c00-91c1-c1e8b14e3012\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.447801 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thsk7\" (UniqueName: \"kubernetes.io/projected/92b6acaf-6100-4355-abb5-08af58c73021-kube-api-access-thsk7\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.447894 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/92b6acaf-6100-4355-abb5-08af58c73021-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.447923 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92b6acaf-6100-4355-abb5-08af58c73021-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.447984 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ecea838-e1a9-4aa0-8602-ffe9621ff137-config\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.448013 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/92b6acaf-6100-4355-abb5-08af58c73021-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.448080 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92b6acaf-6100-4355-abb5-08af58c73021-config\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.448168 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5eb6b31e-90b7-4772-8523-2e852d76069d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5eb6b31e-90b7-4772-8523-2e852d76069d\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.448189 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ecea838-e1a9-4aa0-8602-ffe9621ff137-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.448208 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8ecea838-e1a9-4aa0-8602-ffe9621ff137-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.448276 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92b6acaf-6100-4355-abb5-08af58c73021-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.448301 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/92b6acaf-6100-4355-abb5-08af58c73021-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553006 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d26bdbd0-d471-4c00-91c1-c1e8b14e3012\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d26bdbd0-d471-4c00-91c1-c1e8b14e3012\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553062 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thsk7\" (UniqueName: \"kubernetes.io/projected/92b6acaf-6100-4355-abb5-08af58c73021-kube-api-access-thsk7\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553104 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553130 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/92b6acaf-6100-4355-abb5-08af58c73021-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553162 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92b6acaf-6100-4355-abb5-08af58c73021-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553185 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ecea838-e1a9-4aa0-8602-ffe9621ff137-config\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553206 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/92b6acaf-6100-4355-abb5-08af58c73021-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553234 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92b6acaf-6100-4355-abb5-08af58c73021-config\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553257 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " 
pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553285 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5eb6b31e-90b7-4772-8523-2e852d76069d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5eb6b31e-90b7-4772-8523-2e852d76069d\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553301 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ecea838-e1a9-4aa0-8602-ffe9621ff137-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553322 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8ecea838-e1a9-4aa0-8602-ffe9621ff137-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553345 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6b08aec5-042d-4767-8c7d-bb5ed44af50d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6b08aec5-042d-4767-8c7d-bb5ed44af50d\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553366 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc 
kubenswrapper[5129]: I0314 08:53:26.553388 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-config\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553412 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92b6acaf-6100-4355-abb5-08af58c73021-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553430 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/92b6acaf-6100-4355-abb5-08af58c73021-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553455 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecea838-e1a9-4aa0-8602-ffe9621ff137-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553476 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxw2h\" (UniqueName: \"kubernetes.io/projected/8ecea838-e1a9-4aa0-8602-ffe9621ff137-kube-api-access-wxw2h\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553506 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553531 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecea838-e1a9-4aa0-8602-ffe9621ff137-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553563 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ecea838-e1a9-4aa0-8602-ffe9621ff137-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553638 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.553668 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79p8c\" (UniqueName: \"kubernetes.io/projected/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-kube-api-access-79p8c\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.554294 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/92b6acaf-6100-4355-abb5-08af58c73021-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.556654 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92b6acaf-6100-4355-abb5-08af58c73021-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.556783 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ecea838-e1a9-4aa0-8602-ffe9621ff137-config\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.556962 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8ecea838-e1a9-4aa0-8602-ffe9621ff137-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.558827 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.558863 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d26bdbd0-d471-4c00-91c1-c1e8b14e3012\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d26bdbd0-d471-4c00-91c1-c1e8b14e3012\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f48547e51ba15124c028427b57cacc40518f969ff09ab039ccc79e45d096420d/globalmount\"" pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.560243 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.560930 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5eb6b31e-90b7-4772-8523-2e852d76069d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5eb6b31e-90b7-4772-8523-2e852d76069d\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cd466ea7416ee2c141f0bfacf3c32fd48034cb01f123c37a47f5729aa143d8d0/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.562090 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecea838-e1a9-4aa0-8602-ffe9621ff137-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.562210 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92b6acaf-6100-4355-abb5-08af58c73021-combined-ca-bundle\") pod 
\"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.564312 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecea838-e1a9-4aa0-8602-ffe9621ff137-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.565297 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/92b6acaf-6100-4355-abb5-08af58c73021-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.565302 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92b6acaf-6100-4355-abb5-08af58c73021-config\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.568106 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/92b6acaf-6100-4355-abb5-08af58c73021-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.568502 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ecea838-e1a9-4aa0-8602-ffe9621ff137-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.568901 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ecea838-e1a9-4aa0-8602-ffe9621ff137-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.573454 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thsk7\" (UniqueName: \"kubernetes.io/projected/92b6acaf-6100-4355-abb5-08af58c73021-kube-api-access-thsk7\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.592994 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxw2h\" (UniqueName: \"kubernetes.io/projected/8ecea838-e1a9-4aa0-8602-ffe9621ff137-kube-api-access-wxw2h\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.609432 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d26bdbd0-d471-4c00-91c1-c1e8b14e3012\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d26bdbd0-d471-4c00-91c1-c1e8b14e3012\") pod \"ovsdbserver-nb-1\" (UID: \"92b6acaf-6100-4355-abb5-08af58c73021\") " pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.628222 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5eb6b31e-90b7-4772-8523-2e852d76069d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5eb6b31e-90b7-4772-8523-2e852d76069d\") pod \"ovsdbserver-nb-0\" (UID: \"8ecea838-e1a9-4aa0-8602-ffe9621ff137\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.655341 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.655444 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.655480 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6b08aec5-042d-4767-8c7d-bb5ed44af50d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6b08aec5-042d-4767-8c7d-bb5ed44af50d\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.655504 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.655522 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-config\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.655558 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-metrics-certs-tls-certs\") pod 
\"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.655591 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.655638 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79p8c\" (UniqueName: \"kubernetes.io/projected/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-kube-api-access-79p8c\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.656032 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.657126 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-config\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.657126 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.659882 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.661337 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.661373 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6b08aec5-042d-4767-8c7d-bb5ed44af50d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6b08aec5-042d-4767-8c7d-bb5ed44af50d\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f6ff80e3b6cee41909e703fca5e3822247e7abde11d4b1d39636d971c0f3bfde/globalmount\"" pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.661515 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.672194 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.675989 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79p8c\" (UniqueName: 
\"kubernetes.io/projected/0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa-kube-api-access-79p8c\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.682167 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.702085 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.711749 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6b08aec5-042d-4767-8c7d-bb5ed44af50d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6b08aec5-042d-4767-8c7d-bb5ed44af50d\") pod \"ovsdbserver-nb-2\" (UID: \"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa\") " pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.712919 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.728275 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:26 crc kubenswrapper[5129]: I0314 08:53:26.817400 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 14 08:53:27 crc kubenswrapper[5129]: I0314 08:53:27.087195 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 14 08:53:27 crc kubenswrapper[5129]: W0314 08:53:27.088103 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ecea838_e1a9_4aa0_8602_ffe9621ff137.slice/crio-d780e28fc98bff9614c0a56815bf71af6289a4a8ba2628f850a8ee58af41406b WatchSource:0}: Error finding container d780e28fc98bff9614c0a56815bf71af6289a4a8ba2628f850a8ee58af41406b: Status 404 returned error can't find the container with id d780e28fc98bff9614c0a56815bf71af6289a4a8ba2628f850a8ee58af41406b Mar 14 08:53:27 crc kubenswrapper[5129]: I0314 08:53:27.177986 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 14 08:53:27 crc kubenswrapper[5129]: W0314 08:53:27.180837 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92b6acaf_6100_4355_abb5_08af58c73021.slice/crio-fc595b0649691ad451b278108ea3bbb743479d158d06180777cae7c258cd83ec WatchSource:0}: Error finding container fc595b0649691ad451b278108ea3bbb743479d158d06180777cae7c258cd83ec: Status 404 returned error can't find the container with id fc595b0649691ad451b278108ea3bbb743479d158d06180777cae7c258cd83ec Mar 14 08:53:27 crc kubenswrapper[5129]: I0314 08:53:27.318694 5129 scope.go:117] "RemoveContainer" containerID="8322e2efc2a30956f62de53e5aa643432d764624734a4387ed77f572e4226c06" Mar 14 08:53:27 crc kubenswrapper[5129]: I0314 08:53:27.410755 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 14 08:53:27 crc kubenswrapper[5129]: W0314 08:53:27.420314 5129 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f7f65bf_8ebe_48e5_933e_8fa422a7dbfa.slice/crio-da667bd5353b2d97f28e2172e30bd3ce8d4f00d3346bba6b23e7f6c90c41b0a2 WatchSource:0}: Error finding container da667bd5353b2d97f28e2172e30bd3ce8d4f00d3346bba6b23e7f6c90c41b0a2: Status 404 returned error can't find the container with id da667bd5353b2d97f28e2172e30bd3ce8d4f00d3346bba6b23e7f6c90c41b0a2 Mar 14 08:53:27 crc kubenswrapper[5129]: I0314 08:53:27.586139 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c3371490-9878-4c69-8390-4b2aed82dd1d","Type":"ContainerStarted","Data":"3a15f6e96f288206d7096816f53231b6d49f3f0e81641a111aacbe3f9657bd03"} Mar 14 08:53:27 crc kubenswrapper[5129]: I0314 08:53:27.589123 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8ecea838-e1a9-4aa0-8602-ffe9621ff137","Type":"ContainerStarted","Data":"d780e28fc98bff9614c0a56815bf71af6289a4a8ba2628f850a8ee58af41406b"} Mar 14 08:53:27 crc kubenswrapper[5129]: I0314 08:53:27.590763 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa","Type":"ContainerStarted","Data":"da667bd5353b2d97f28e2172e30bd3ce8d4f00d3346bba6b23e7f6c90c41b0a2"} Mar 14 08:53:27 crc kubenswrapper[5129]: I0314 08:53:27.592215 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"e80a5513-d89e-43d3-9d57-d95ee6b3295c","Type":"ContainerStarted","Data":"8d3f9a8c30dfaf297a4eaff2f638cd72819aff71381a52b36be7c9f12630a9e4"} Mar 14 08:53:27 crc kubenswrapper[5129]: I0314 08:53:27.593798 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"92b6acaf-6100-4355-abb5-08af58c73021","Type":"ContainerStarted","Data":"fc595b0649691ad451b278108ea3bbb743479d158d06180777cae7c258cd83ec"} Mar 14 
08:53:27 crc kubenswrapper[5129]: I0314 08:53:27.635554 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 14 08:53:28 crc kubenswrapper[5129]: I0314 08:53:28.613761 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"db6190b7-2620-4a00-b2bd-ce56d2c94069","Type":"ContainerStarted","Data":"3794058d380f7c1078cfd4e845a41f45e18ddd582fd5c4a04cf89c8accf52c84"} Mar 14 08:53:35 crc kubenswrapper[5129]: I0314 08:53:35.706522 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c3371490-9878-4c69-8390-4b2aed82dd1d","Type":"ContainerStarted","Data":"824adc5afbe4e9c5ece49190b2f4cb44f07299a25d9f9a156541cefb394e0d16"} Mar 14 08:53:35 crc kubenswrapper[5129]: I0314 08:53:35.707638 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c3371490-9878-4c69-8390-4b2aed82dd1d","Type":"ContainerStarted","Data":"ca77be00df11acebde716885558f16e10481e8b1a10070a43b9a490c10f5e850"} Mar 14 08:53:35 crc kubenswrapper[5129]: I0314 08:53:35.710313 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8ecea838-e1a9-4aa0-8602-ffe9621ff137","Type":"ContainerStarted","Data":"231739c26246d92bc4ad88ab6e9620ea219787774d0d2066b4c6c47e4b2b3175"} Mar 14 08:53:35 crc kubenswrapper[5129]: I0314 08:53:35.710405 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8ecea838-e1a9-4aa0-8602-ffe9621ff137","Type":"ContainerStarted","Data":"9df60d12e2224c8c79a4fa4b804dd44298c271fcb91e5c0e896e333f79493c0c"} Mar 14 08:53:35 crc kubenswrapper[5129]: I0314 08:53:35.712800 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa","Type":"ContainerStarted","Data":"e1d8c134c24b3ecf34fb3dd424d944a90348ea6fba8756d3c1533a7425072c3c"} Mar 14 08:53:35 crc 
kubenswrapper[5129]: I0314 08:53:35.712855 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa","Type":"ContainerStarted","Data":"260d57afd505d3153d54b1199978daf12e00c76d87b2c707b0bb5c812491b6cd"} Mar 14 08:53:35 crc kubenswrapper[5129]: I0314 08:53:35.716474 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"e80a5513-d89e-43d3-9d57-d95ee6b3295c","Type":"ContainerStarted","Data":"462de9f43595f77cc745e5ba6a4e0cc9b6cd89235c43a73e151af1b2f737711d"} Mar 14 08:53:35 crc kubenswrapper[5129]: I0314 08:53:35.716504 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"e80a5513-d89e-43d3-9d57-d95ee6b3295c","Type":"ContainerStarted","Data":"9b4e26c7e3773eb507c09b3a9f6c5c486483e3952b0987db8745bc3acd493511"} Mar 14 08:53:35 crc kubenswrapper[5129]: I0314 08:53:35.718668 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"db6190b7-2620-4a00-b2bd-ce56d2c94069","Type":"ContainerStarted","Data":"d17f724368e4aaf704b731c33de4842c5503ae15f7bc25445a1febe3dc1bf791"} Mar 14 08:53:35 crc kubenswrapper[5129]: I0314 08:53:35.718742 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"db6190b7-2620-4a00-b2bd-ce56d2c94069","Type":"ContainerStarted","Data":"5204af9852f1c190194f850f30fe7347c6d14aef6b313ce4c6383be3837afb10"} Mar 14 08:53:35 crc kubenswrapper[5129]: I0314 08:53:35.720876 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"92b6acaf-6100-4355-abb5-08af58c73021","Type":"ContainerStarted","Data":"7220418bfb271313373f5726e5e79a1131435ded4548e232342e808e84d4998f"} Mar 14 08:53:35 crc kubenswrapper[5129]: I0314 08:53:35.720904 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" 
event={"ID":"92b6acaf-6100-4355-abb5-08af58c73021","Type":"ContainerStarted","Data":"3fde200fde09a0d467761fc95456da65608fbc96c53efc5b38cfa620fbbe4149"} Mar 14 08:53:35 crc kubenswrapper[5129]: I0314 08:53:35.734946 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:35 crc kubenswrapper[5129]: I0314 08:53:35.741858 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.622653466 podStartE2EDuration="11.741824574s" podCreationTimestamp="2026-03-14 08:53:24 +0000 UTC" firstStartedPulling="2026-03-14 08:53:26.735194711 +0000 UTC m=+6869.487109895" lastFinishedPulling="2026-03-14 08:53:34.854365819 +0000 UTC m=+6877.606281003" observedRunningTime="2026-03-14 08:53:35.735464171 +0000 UTC m=+6878.487379365" watchObservedRunningTime="2026-03-14 08:53:35.741824574 +0000 UTC m=+6878.493739788" Mar 14 08:53:35 crc kubenswrapper[5129]: I0314 08:53:35.772678 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.6742710800000005 podStartE2EDuration="11.772655407s" podCreationTimestamp="2026-03-14 08:53:24 +0000 UTC" firstStartedPulling="2026-03-14 08:53:27.643204982 +0000 UTC m=+6870.395120166" lastFinishedPulling="2026-03-14 08:53:34.741589279 +0000 UTC m=+6877.493504493" observedRunningTime="2026-03-14 08:53:35.760443316 +0000 UTC m=+6878.512358520" watchObservedRunningTime="2026-03-14 08:53:35.772655407 +0000 UTC m=+6878.524570591" Mar 14 08:53:35 crc kubenswrapper[5129]: I0314 08:53:35.793953 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.8838576590000002 podStartE2EDuration="11.793931512s" podCreationTimestamp="2026-03-14 08:53:24 +0000 UTC" firstStartedPulling="2026-03-14 08:53:26.844806646 +0000 UTC m=+6869.596721830" lastFinishedPulling="2026-03-14 08:53:34.754880489 +0000 UTC 
m=+6877.506795683" observedRunningTime="2026-03-14 08:53:35.787316413 +0000 UTC m=+6878.539231607" watchObservedRunningTime="2026-03-14 08:53:35.793931512 +0000 UTC m=+6878.545846706" Mar 14 08:53:35 crc kubenswrapper[5129]: I0314 08:53:35.829293 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.246516134 podStartE2EDuration="10.829275918s" podCreationTimestamp="2026-03-14 08:53:25 +0000 UTC" firstStartedPulling="2026-03-14 08:53:27.183948745 +0000 UTC m=+6869.935863929" lastFinishedPulling="2026-03-14 08:53:34.766708509 +0000 UTC m=+6877.518623713" observedRunningTime="2026-03-14 08:53:35.818512067 +0000 UTC m=+6878.570427251" watchObservedRunningTime="2026-03-14 08:53:35.829275918 +0000 UTC m=+6878.581191102" Mar 14 08:53:35 crc kubenswrapper[5129]: I0314 08:53:35.847867 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.5347649370000003 podStartE2EDuration="10.847833359s" podCreationTimestamp="2026-03-14 08:53:25 +0000 UTC" firstStartedPulling="2026-03-14 08:53:27.43007022 +0000 UTC m=+6870.181985404" lastFinishedPulling="2026-03-14 08:53:34.743138612 +0000 UTC m=+6877.495053826" observedRunningTime="2026-03-14 08:53:35.844188851 +0000 UTC m=+6878.596104045" watchObservedRunningTime="2026-03-14 08:53:35.847833359 +0000 UTC m=+6878.599748553" Mar 14 08:53:35 crc kubenswrapper[5129]: I0314 08:53:35.874646 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.232472543 podStartE2EDuration="10.874617673s" podCreationTimestamp="2026-03-14 08:53:25 +0000 UTC" firstStartedPulling="2026-03-14 08:53:27.092385329 +0000 UTC m=+6869.844300513" lastFinishedPulling="2026-03-14 08:53:34.734530449 +0000 UTC m=+6877.486445643" observedRunningTime="2026-03-14 08:53:35.86819763 +0000 UTC m=+6878.620112814" watchObservedRunningTime="2026-03-14 
08:53:35.874617673 +0000 UTC m=+6878.626532857" Mar 14 08:53:36 crc kubenswrapper[5129]: I0314 08:53:36.021269 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:36 crc kubenswrapper[5129]: I0314 08:53:36.054672 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:36 crc kubenswrapper[5129]: I0314 08:53:36.055830 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:36 crc kubenswrapper[5129]: I0314 08:53:36.683910 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:36 crc kubenswrapper[5129]: I0314 08:53:36.714718 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:36 crc kubenswrapper[5129]: I0314 08:53:36.728473 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:38 crc kubenswrapper[5129]: I0314 08:53:38.021375 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:38 crc kubenswrapper[5129]: I0314 08:53:38.052174 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:38 crc kubenswrapper[5129]: I0314 08:53:38.056677 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:38 crc kubenswrapper[5129]: I0314 08:53:38.098536 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:38 crc kubenswrapper[5129]: I0314 08:53:38.101922 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:38 crc kubenswrapper[5129]: I0314 08:53:38.108558 5129 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:38 crc kubenswrapper[5129]: I0314 08:53:38.683276 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:38 crc kubenswrapper[5129]: I0314 08:53:38.714219 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:38 crc kubenswrapper[5129]: I0314 08:53:38.735136 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:38 crc kubenswrapper[5129]: I0314 08:53:38.774007 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:38 crc kubenswrapper[5129]: I0314 08:53:38.792280 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.103986 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.110277 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.122907 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.378821 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cc994fdc5-45cpr"] Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.380434 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.383241 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.405554 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cc994fdc5-45cpr"] Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.426258 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/947a1300-3d48-44a1-b9d0-f93876a4ffab-dns-svc\") pod \"dnsmasq-dns-7cc994fdc5-45cpr\" (UID: \"947a1300-3d48-44a1-b9d0-f93876a4ffab\") " pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.426340 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/947a1300-3d48-44a1-b9d0-f93876a4ffab-config\") pod \"dnsmasq-dns-7cc994fdc5-45cpr\" (UID: \"947a1300-3d48-44a1-b9d0-f93876a4ffab\") " pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.426377 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/947a1300-3d48-44a1-b9d0-f93876a4ffab-ovsdbserver-sb\") pod \"dnsmasq-dns-7cc994fdc5-45cpr\" (UID: \"947a1300-3d48-44a1-b9d0-f93876a4ffab\") " pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.426437 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjrj4\" (UniqueName: \"kubernetes.io/projected/947a1300-3d48-44a1-b9d0-f93876a4ffab-kube-api-access-fjrj4\") pod \"dnsmasq-dns-7cc994fdc5-45cpr\" (UID: \"947a1300-3d48-44a1-b9d0-f93876a4ffab\") " 
pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.528257 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjrj4\" (UniqueName: \"kubernetes.io/projected/947a1300-3d48-44a1-b9d0-f93876a4ffab-kube-api-access-fjrj4\") pod \"dnsmasq-dns-7cc994fdc5-45cpr\" (UID: \"947a1300-3d48-44a1-b9d0-f93876a4ffab\") " pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.528337 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/947a1300-3d48-44a1-b9d0-f93876a4ffab-dns-svc\") pod \"dnsmasq-dns-7cc994fdc5-45cpr\" (UID: \"947a1300-3d48-44a1-b9d0-f93876a4ffab\") " pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.528395 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/947a1300-3d48-44a1-b9d0-f93876a4ffab-config\") pod \"dnsmasq-dns-7cc994fdc5-45cpr\" (UID: \"947a1300-3d48-44a1-b9d0-f93876a4ffab\") " pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.528431 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/947a1300-3d48-44a1-b9d0-f93876a4ffab-ovsdbserver-sb\") pod \"dnsmasq-dns-7cc994fdc5-45cpr\" (UID: \"947a1300-3d48-44a1-b9d0-f93876a4ffab\") " pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.529401 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/947a1300-3d48-44a1-b9d0-f93876a4ffab-dns-svc\") pod \"dnsmasq-dns-7cc994fdc5-45cpr\" (UID: \"947a1300-3d48-44a1-b9d0-f93876a4ffab\") " pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 
08:53:41.529448 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/947a1300-3d48-44a1-b9d0-f93876a4ffab-ovsdbserver-sb\") pod \"dnsmasq-dns-7cc994fdc5-45cpr\" (UID: \"947a1300-3d48-44a1-b9d0-f93876a4ffab\") " pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.529711 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/947a1300-3d48-44a1-b9d0-f93876a4ffab-config\") pod \"dnsmasq-dns-7cc994fdc5-45cpr\" (UID: \"947a1300-3d48-44a1-b9d0-f93876a4ffab\") " pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.548524 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjrj4\" (UniqueName: \"kubernetes.io/projected/947a1300-3d48-44a1-b9d0-f93876a4ffab-kube-api-access-fjrj4\") pod \"dnsmasq-dns-7cc994fdc5-45cpr\" (UID: \"947a1300-3d48-44a1-b9d0-f93876a4ffab\") " pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.699248 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.750370 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.764524 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Mar 14 08:53:41 crc kubenswrapper[5129]: I0314 08:53:41.778939 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.073033 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cc994fdc5-45cpr"] Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.081994 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69448b775f-qphrw"] Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.090751 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.093998 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.116380 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69448b775f-qphrw"] Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.246339 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-ovsdbserver-sb\") pod \"dnsmasq-dns-69448b775f-qphrw\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.246747 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-config\") pod \"dnsmasq-dns-69448b775f-qphrw\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.246903 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45sn4\" (UniqueName: \"kubernetes.io/projected/6543f565-2287-4972-8a36-41b99d7248fb-kube-api-access-45sn4\") pod \"dnsmasq-dns-69448b775f-qphrw\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.247036 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-ovsdbserver-nb\") pod \"dnsmasq-dns-69448b775f-qphrw\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " 
pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.247174 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-dns-svc\") pod \"dnsmasq-dns-69448b775f-qphrw\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.348980 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-dns-svc\") pod \"dnsmasq-dns-69448b775f-qphrw\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.349038 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-ovsdbserver-sb\") pod \"dnsmasq-dns-69448b775f-qphrw\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.349102 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-config\") pod \"dnsmasq-dns-69448b775f-qphrw\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.349125 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45sn4\" (UniqueName: \"kubernetes.io/projected/6543f565-2287-4972-8a36-41b99d7248fb-kube-api-access-45sn4\") pod \"dnsmasq-dns-69448b775f-qphrw\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:42 crc 
kubenswrapper[5129]: I0314 08:53:42.349184 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-ovsdbserver-nb\") pod \"dnsmasq-dns-69448b775f-qphrw\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.350221 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-ovsdbserver-nb\") pod \"dnsmasq-dns-69448b775f-qphrw\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.350305 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-dns-svc\") pod \"dnsmasq-dns-69448b775f-qphrw\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.350483 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-ovsdbserver-sb\") pod \"dnsmasq-dns-69448b775f-qphrw\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.350890 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-config\") pod \"dnsmasq-dns-69448b775f-qphrw\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.371903 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7cc994fdc5-45cpr"] Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.373793 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45sn4\" (UniqueName: \"kubernetes.io/projected/6543f565-2287-4972-8a36-41b99d7248fb-kube-api-access-45sn4\") pod \"dnsmasq-dns-69448b775f-qphrw\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.419355 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:42 crc kubenswrapper[5129]: E0314 08:53:42.749488 5129 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod947a1300_3d48_44a1_b9d0_f93876a4ffab.slice/crio-1b24fc70f218e1c9f2f1436423ea8c7b74594cecb76c143b81971cbc13d4ed01.scope\": RecentStats: unable to find data in memory cache]" Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.840063 5129 generic.go:334] "Generic (PLEG): container finished" podID="947a1300-3d48-44a1-b9d0-f93876a4ffab" containerID="1b24fc70f218e1c9f2f1436423ea8c7b74594cecb76c143b81971cbc13d4ed01" exitCode=0 Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.840170 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" event={"ID":"947a1300-3d48-44a1-b9d0-f93876a4ffab","Type":"ContainerDied","Data":"1b24fc70f218e1c9f2f1436423ea8c7b74594cecb76c143b81971cbc13d4ed01"} Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.841036 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" event={"ID":"947a1300-3d48-44a1-b9d0-f93876a4ffab","Type":"ContainerStarted","Data":"b54d83853a9b746a5690e56b7fb013d6a036f56cd749377259fd93845e2d7adb"} Mar 14 08:53:42 crc kubenswrapper[5129]: I0314 08:53:42.901843 5129 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69448b775f-qphrw"] Mar 14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.117623 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" Mar 14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.273430 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/947a1300-3d48-44a1-b9d0-f93876a4ffab-ovsdbserver-sb\") pod \"947a1300-3d48-44a1-b9d0-f93876a4ffab\" (UID: \"947a1300-3d48-44a1-b9d0-f93876a4ffab\") " Mar 14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.273490 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/947a1300-3d48-44a1-b9d0-f93876a4ffab-config\") pod \"947a1300-3d48-44a1-b9d0-f93876a4ffab\" (UID: \"947a1300-3d48-44a1-b9d0-f93876a4ffab\") " Mar 14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.273564 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/947a1300-3d48-44a1-b9d0-f93876a4ffab-dns-svc\") pod \"947a1300-3d48-44a1-b9d0-f93876a4ffab\" (UID: \"947a1300-3d48-44a1-b9d0-f93876a4ffab\") " Mar 14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.273803 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjrj4\" (UniqueName: \"kubernetes.io/projected/947a1300-3d48-44a1-b9d0-f93876a4ffab-kube-api-access-fjrj4\") pod \"947a1300-3d48-44a1-b9d0-f93876a4ffab\" (UID: \"947a1300-3d48-44a1-b9d0-f93876a4ffab\") " Mar 14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.280717 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/947a1300-3d48-44a1-b9d0-f93876a4ffab-kube-api-access-fjrj4" (OuterVolumeSpecName: "kube-api-access-fjrj4") pod 
"947a1300-3d48-44a1-b9d0-f93876a4ffab" (UID: "947a1300-3d48-44a1-b9d0-f93876a4ffab"). InnerVolumeSpecName "kube-api-access-fjrj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.295383 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/947a1300-3d48-44a1-b9d0-f93876a4ffab-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "947a1300-3d48-44a1-b9d0-f93876a4ffab" (UID: "947a1300-3d48-44a1-b9d0-f93876a4ffab"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.298236 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/947a1300-3d48-44a1-b9d0-f93876a4ffab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "947a1300-3d48-44a1-b9d0-f93876a4ffab" (UID: "947a1300-3d48-44a1-b9d0-f93876a4ffab"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.308055 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/947a1300-3d48-44a1-b9d0-f93876a4ffab-config" (OuterVolumeSpecName: "config") pod "947a1300-3d48-44a1-b9d0-f93876a4ffab" (UID: "947a1300-3d48-44a1-b9d0-f93876a4ffab"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.375953 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjrj4\" (UniqueName: \"kubernetes.io/projected/947a1300-3d48-44a1-b9d0-f93876a4ffab-kube-api-access-fjrj4\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.375996 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/947a1300-3d48-44a1-b9d0-f93876a4ffab-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.376010 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/947a1300-3d48-44a1-b9d0-f93876a4ffab-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.376022 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/947a1300-3d48-44a1-b9d0-f93876a4ffab-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.859101 5129 generic.go:334] "Generic (PLEG): container finished" podID="6543f565-2287-4972-8a36-41b99d7248fb" containerID="066a2034b27e9f457e9df59f540573ddce8281c5c528a8863f5d8fca855166e0" exitCode=0 Mar 14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.859184 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69448b775f-qphrw" event={"ID":"6543f565-2287-4972-8a36-41b99d7248fb","Type":"ContainerDied","Data":"066a2034b27e9f457e9df59f540573ddce8281c5c528a8863f5d8fca855166e0"} Mar 14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.859245 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69448b775f-qphrw" event={"ID":"6543f565-2287-4972-8a36-41b99d7248fb","Type":"ContainerStarted","Data":"6671da50617c80a8636ca7fecfc4447461c73aa02eba1e66e5af95b9425e976f"} Mar 
14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.862274 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" event={"ID":"947a1300-3d48-44a1-b9d0-f93876a4ffab","Type":"ContainerDied","Data":"b54d83853a9b746a5690e56b7fb013d6a036f56cd749377259fd93845e2d7adb"} Mar 14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.862324 5129 scope.go:117] "RemoveContainer" containerID="1b24fc70f218e1c9f2f1436423ea8c7b74594cecb76c143b81971cbc13d4ed01" Mar 14 08:53:43 crc kubenswrapper[5129]: I0314 08:53:43.862353 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cc994fdc5-45cpr" Mar 14 08:53:44 crc kubenswrapper[5129]: I0314 08:53:44.121035 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cc994fdc5-45cpr"] Mar 14 08:53:44 crc kubenswrapper[5129]: I0314 08:53:44.132589 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cc994fdc5-45cpr"] Mar 14 08:53:44 crc kubenswrapper[5129]: I0314 08:53:44.842495 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Mar 14 08:53:44 crc kubenswrapper[5129]: E0314 08:53:44.843549 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947a1300-3d48-44a1-b9d0-f93876a4ffab" containerName="init" Mar 14 08:53:44 crc kubenswrapper[5129]: I0314 08:53:44.843579 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="947a1300-3d48-44a1-b9d0-f93876a4ffab" containerName="init" Mar 14 08:53:44 crc kubenswrapper[5129]: I0314 08:53:44.843787 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="947a1300-3d48-44a1-b9d0-f93876a4ffab" containerName="init" Mar 14 08:53:44 crc kubenswrapper[5129]: I0314 08:53:44.844521 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 14 08:53:44 crc kubenswrapper[5129]: I0314 08:53:44.849448 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Mar 14 08:53:44 crc kubenswrapper[5129]: I0314 08:53:44.868351 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 14 08:53:44 crc kubenswrapper[5129]: I0314 08:53:44.887237 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69448b775f-qphrw" event={"ID":"6543f565-2287-4972-8a36-41b99d7248fb","Type":"ContainerStarted","Data":"2e150d503562c8d2b7eaa0e8557dec957bd831b2c1071495f691f01609d53b11"} Mar 14 08:53:44 crc kubenswrapper[5129]: I0314 08:53:44.887414 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:44 crc kubenswrapper[5129]: I0314 08:53:44.912435 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69448b775f-qphrw" podStartSLOduration=2.912410906 podStartE2EDuration="2.912410906s" podCreationTimestamp="2026-03-14 08:53:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:53:44.910896726 +0000 UTC m=+6887.662811910" watchObservedRunningTime="2026-03-14 08:53:44.912410906 +0000 UTC m=+6887.664326090" Mar 14 08:53:45 crc kubenswrapper[5129]: I0314 08:53:45.022938 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/25dd8792-1393-4803-9806-21f7292348fa-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"25dd8792-1393-4803-9806-21f7292348fa\") " pod="openstack/ovn-copy-data" Mar 14 08:53:45 crc kubenswrapper[5129]: I0314 08:53:45.023082 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7clt6\" (UniqueName: 
\"kubernetes.io/projected/25dd8792-1393-4803-9806-21f7292348fa-kube-api-access-7clt6\") pod \"ovn-copy-data\" (UID: \"25dd8792-1393-4803-9806-21f7292348fa\") " pod="openstack/ovn-copy-data" Mar 14 08:53:45 crc kubenswrapper[5129]: I0314 08:53:45.023125 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-aee7aa5a-3c64-4b84-9c85-04f6d2c42e33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aee7aa5a-3c64-4b84-9c85-04f6d2c42e33\") pod \"ovn-copy-data\" (UID: \"25dd8792-1393-4803-9806-21f7292348fa\") " pod="openstack/ovn-copy-data" Mar 14 08:53:45 crc kubenswrapper[5129]: I0314 08:53:45.125128 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/25dd8792-1393-4803-9806-21f7292348fa-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"25dd8792-1393-4803-9806-21f7292348fa\") " pod="openstack/ovn-copy-data" Mar 14 08:53:45 crc kubenswrapper[5129]: I0314 08:53:45.125217 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7clt6\" (UniqueName: \"kubernetes.io/projected/25dd8792-1393-4803-9806-21f7292348fa-kube-api-access-7clt6\") pod \"ovn-copy-data\" (UID: \"25dd8792-1393-4803-9806-21f7292348fa\") " pod="openstack/ovn-copy-data" Mar 14 08:53:45 crc kubenswrapper[5129]: I0314 08:53:45.125250 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-aee7aa5a-3c64-4b84-9c85-04f6d2c42e33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aee7aa5a-3c64-4b84-9c85-04f6d2c42e33\") pod \"ovn-copy-data\" (UID: \"25dd8792-1393-4803-9806-21f7292348fa\") " pod="openstack/ovn-copy-data" Mar 14 08:53:45 crc kubenswrapper[5129]: I0314 08:53:45.130808 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 08:53:45 crc kubenswrapper[5129]: I0314 08:53:45.130852 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-aee7aa5a-3c64-4b84-9c85-04f6d2c42e33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aee7aa5a-3c64-4b84-9c85-04f6d2c42e33\") pod \"ovn-copy-data\" (UID: \"25dd8792-1393-4803-9806-21f7292348fa\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4721078b5f9896934b1ed2cbd930f11559abfd3b47811042222f984817f3931f/globalmount\"" pod="openstack/ovn-copy-data" Mar 14 08:53:45 crc kubenswrapper[5129]: I0314 08:53:45.131090 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/25dd8792-1393-4803-9806-21f7292348fa-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"25dd8792-1393-4803-9806-21f7292348fa\") " pod="openstack/ovn-copy-data" Mar 14 08:53:45 crc kubenswrapper[5129]: I0314 08:53:45.145816 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7clt6\" (UniqueName: \"kubernetes.io/projected/25dd8792-1393-4803-9806-21f7292348fa-kube-api-access-7clt6\") pod \"ovn-copy-data\" (UID: \"25dd8792-1393-4803-9806-21f7292348fa\") " pod="openstack/ovn-copy-data" Mar 14 08:53:45 crc kubenswrapper[5129]: I0314 08:53:45.179553 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-aee7aa5a-3c64-4b84-9c85-04f6d2c42e33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aee7aa5a-3c64-4b84-9c85-04f6d2c42e33\") pod \"ovn-copy-data\" (UID: \"25dd8792-1393-4803-9806-21f7292348fa\") " pod="openstack/ovn-copy-data" Mar 14 08:53:45 crc kubenswrapper[5129]: I0314 08:53:45.470152 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 14 08:53:46 crc kubenswrapper[5129]: I0314 08:53:46.066158 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="947a1300-3d48-44a1-b9d0-f93876a4ffab" path="/var/lib/kubelet/pods/947a1300-3d48-44a1-b9d0-f93876a4ffab/volumes" Mar 14 08:53:46 crc kubenswrapper[5129]: I0314 08:53:46.112723 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 14 08:53:46 crc kubenswrapper[5129]: W0314 08:53:46.117140 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25dd8792_1393_4803_9806_21f7292348fa.slice/crio-983c4e5e01f2bb37d3eccfc8df582b82b32421964016bcabfa9003fddad3fa81 WatchSource:0}: Error finding container 983c4e5e01f2bb37d3eccfc8df582b82b32421964016bcabfa9003fddad3fa81: Status 404 returned error can't find the container with id 983c4e5e01f2bb37d3eccfc8df582b82b32421964016bcabfa9003fddad3fa81 Mar 14 08:53:46 crc kubenswrapper[5129]: I0314 08:53:46.908908 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"25dd8792-1393-4803-9806-21f7292348fa","Type":"ContainerStarted","Data":"3a0f42f7bf412981c76a99dff2bc54b03126680968e96ba6091f2fdc212216a1"} Mar 14 08:53:46 crc kubenswrapper[5129]: I0314 08:53:46.909485 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"25dd8792-1393-4803-9806-21f7292348fa","Type":"ContainerStarted","Data":"983c4e5e01f2bb37d3eccfc8df582b82b32421964016bcabfa9003fddad3fa81"} Mar 14 08:53:46 crc kubenswrapper[5129]: I0314 08:53:46.933803 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.711206651 podStartE2EDuration="3.933776378s" podCreationTimestamp="2026-03-14 08:53:43 +0000 UTC" firstStartedPulling="2026-03-14 08:53:46.121215859 +0000 UTC m=+6888.873131063" lastFinishedPulling="2026-03-14 
08:53:46.343785606 +0000 UTC m=+6889.095700790" observedRunningTime="2026-03-14 08:53:46.932627516 +0000 UTC m=+6889.684542710" watchObservedRunningTime="2026-03-14 08:53:46.933776378 +0000 UTC m=+6889.685691562" Mar 14 08:53:49 crc kubenswrapper[5129]: I0314 08:53:49.575206 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:53:49 crc kubenswrapper[5129]: I0314 08:53:49.575926 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:53:52 crc kubenswrapper[5129]: I0314 08:53:52.421903 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:53:52 crc kubenswrapper[5129]: I0314 08:53:52.521431 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-747f589bdf-44rx6"] Mar 14 08:53:52 crc kubenswrapper[5129]: I0314 08:53:52.521863 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-747f589bdf-44rx6" podUID="b2fc83fc-253f-436b-b34c-f8e31aaa9ebc" containerName="dnsmasq-dns" containerID="cri-o://330b60cd4436a3db2514f18a24a29aaaa77016a838850e8250931a8b47b3c13f" gracePeriod=10 Mar 14 08:53:52 crc kubenswrapper[5129]: I0314 08:53:52.990906 5129 generic.go:334] "Generic (PLEG): container finished" podID="b2fc83fc-253f-436b-b34c-f8e31aaa9ebc" containerID="330b60cd4436a3db2514f18a24a29aaaa77016a838850e8250931a8b47b3c13f" exitCode=0 Mar 14 08:53:52 crc kubenswrapper[5129]: I0314 08:53:52.990996 5129 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-747f589bdf-44rx6" event={"ID":"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc","Type":"ContainerDied","Data":"330b60cd4436a3db2514f18a24a29aaaa77016a838850e8250931a8b47b3c13f"} Mar 14 08:53:52 crc kubenswrapper[5129]: I0314 08:53:52.992200 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-747f589bdf-44rx6" event={"ID":"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc","Type":"ContainerDied","Data":"c65bf0a6e5c20c398cb7ac2060b57f69583270ae5f22757d8190591fd083bcc2"} Mar 14 08:53:52 crc kubenswrapper[5129]: I0314 08:53:52.992332 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c65bf0a6e5c20c398cb7ac2060b57f69583270ae5f22757d8190591fd083bcc2" Mar 14 08:53:52 crc kubenswrapper[5129]: I0314 08:53:52.999037 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-747f589bdf-44rx6" Mar 14 08:53:53 crc kubenswrapper[5129]: I0314 08:53:53.107579 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc-config\") pod \"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc\" (UID: \"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc\") " Mar 14 08:53:53 crc kubenswrapper[5129]: I0314 08:53:53.107939 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsx7j\" (UniqueName: \"kubernetes.io/projected/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc-kube-api-access-wsx7j\") pod \"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc\" (UID: \"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc\") " Mar 14 08:53:53 crc kubenswrapper[5129]: I0314 08:53:53.108045 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc-dns-svc\") pod \"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc\" (UID: 
\"b2fc83fc-253f-436b-b34c-f8e31aaa9ebc\") " Mar 14 08:53:53 crc kubenswrapper[5129]: I0314 08:53:53.115813 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc-kube-api-access-wsx7j" (OuterVolumeSpecName: "kube-api-access-wsx7j") pod "b2fc83fc-253f-436b-b34c-f8e31aaa9ebc" (UID: "b2fc83fc-253f-436b-b34c-f8e31aaa9ebc"). InnerVolumeSpecName "kube-api-access-wsx7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:53:53 crc kubenswrapper[5129]: I0314 08:53:53.163692 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b2fc83fc-253f-436b-b34c-f8e31aaa9ebc" (UID: "b2fc83fc-253f-436b-b34c-f8e31aaa9ebc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:53:53 crc kubenswrapper[5129]: I0314 08:53:53.170563 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc-config" (OuterVolumeSpecName: "config") pod "b2fc83fc-253f-436b-b34c-f8e31aaa9ebc" (UID: "b2fc83fc-253f-436b-b34c-f8e31aaa9ebc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:53:53 crc kubenswrapper[5129]: I0314 08:53:53.211313 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:53 crc kubenswrapper[5129]: I0314 08:53:53.211383 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsx7j\" (UniqueName: \"kubernetes.io/projected/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc-kube-api-access-wsx7j\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:53 crc kubenswrapper[5129]: I0314 08:53:53.211402 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:54 crc kubenswrapper[5129]: I0314 08:53:54.003006 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-747f589bdf-44rx6" Mar 14 08:53:54 crc kubenswrapper[5129]: I0314 08:53:54.064041 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-747f589bdf-44rx6"] Mar 14 08:53:54 crc kubenswrapper[5129]: I0314 08:53:54.069688 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-747f589bdf-44rx6"] Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.250062 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 14 08:53:55 crc kubenswrapper[5129]: E0314 08:53:55.250576 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2fc83fc-253f-436b-b34c-f8e31aaa9ebc" containerName="dnsmasq-dns" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.250594 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2fc83fc-253f-436b-b34c-f8e31aaa9ebc" containerName="dnsmasq-dns" Mar 14 08:53:55 crc kubenswrapper[5129]: E0314 08:53:55.250629 5129 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b2fc83fc-253f-436b-b34c-f8e31aaa9ebc" containerName="init" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.250637 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2fc83fc-253f-436b-b34c-f8e31aaa9ebc" containerName="init" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.250903 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2fc83fc-253f-436b-b34c-f8e31aaa9ebc" containerName="dnsmasq-dns" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.253130 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.265169 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.265191 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.265214 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.265754 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-5jkrz" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.266928 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.354894 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00027bfa-9ff4-4472-8f9d-3763d68530d3-config\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.354996 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00027bfa-9ff4-4472-8f9d-3763d68530d3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.355051 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00027bfa-9ff4-4472-8f9d-3763d68530d3-scripts\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.355072 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hthzj\" (UniqueName: \"kubernetes.io/projected/00027bfa-9ff4-4472-8f9d-3763d68530d3-kube-api-access-hthzj\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.355103 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00027bfa-9ff4-4472-8f9d-3763d68530d3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.355147 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00027bfa-9ff4-4472-8f9d-3763d68530d3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.355187 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/00027bfa-9ff4-4472-8f9d-3763d68530d3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.456742 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00027bfa-9ff4-4472-8f9d-3763d68530d3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.457115 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00027bfa-9ff4-4472-8f9d-3763d68530d3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.457289 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00027bfa-9ff4-4472-8f9d-3763d68530d3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.457445 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00027bfa-9ff4-4472-8f9d-3763d68530d3-config\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.457593 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00027bfa-9ff4-4472-8f9d-3763d68530d3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc 
kubenswrapper[5129]: I0314 08:53:55.457766 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00027bfa-9ff4-4472-8f9d-3763d68530d3-scripts\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.457864 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hthzj\" (UniqueName: \"kubernetes.io/projected/00027bfa-9ff4-4472-8f9d-3763d68530d3-kube-api-access-hthzj\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.457893 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00027bfa-9ff4-4472-8f9d-3763d68530d3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.458382 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00027bfa-9ff4-4472-8f9d-3763d68530d3-config\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.458548 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00027bfa-9ff4-4472-8f9d-3763d68530d3-scripts\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.465186 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00027bfa-9ff4-4472-8f9d-3763d68530d3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.466210 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00027bfa-9ff4-4472-8f9d-3763d68530d3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.466742 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00027bfa-9ff4-4472-8f9d-3763d68530d3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.479366 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hthzj\" (UniqueName: \"kubernetes.io/projected/00027bfa-9ff4-4472-8f9d-3763d68530d3-kube-api-access-hthzj\") pod \"ovn-northd-0\" (UID: \"00027bfa-9ff4-4472-8f9d-3763d68530d3\") " pod="openstack/ovn-northd-0" Mar 14 08:53:55 crc kubenswrapper[5129]: I0314 08:53:55.579890 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 14 08:53:56 crc kubenswrapper[5129]: I0314 08:53:56.057291 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2fc83fc-253f-436b-b34c-f8e31aaa9ebc" path="/var/lib/kubelet/pods/b2fc83fc-253f-436b-b34c-f8e31aaa9ebc/volumes" Mar 14 08:53:56 crc kubenswrapper[5129]: I0314 08:53:56.102582 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 14 08:53:56 crc kubenswrapper[5129]: W0314 08:53:56.111695 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00027bfa_9ff4_4472_8f9d_3763d68530d3.slice/crio-60ede3e51c11e35a197e4ac7d66e7e2a948379699e50865ab2c76e762ac64df1 WatchSource:0}: Error finding container 60ede3e51c11e35a197e4ac7d66e7e2a948379699e50865ab2c76e762ac64df1: Status 404 returned error can't find the container with id 60ede3e51c11e35a197e4ac7d66e7e2a948379699e50865ab2c76e762ac64df1 Mar 14 08:53:57 crc kubenswrapper[5129]: I0314 08:53:57.036775 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00027bfa-9ff4-4472-8f9d-3763d68530d3","Type":"ContainerStarted","Data":"60ede3e51c11e35a197e4ac7d66e7e2a948379699e50865ab2c76e762ac64df1"} Mar 14 08:53:58 crc kubenswrapper[5129]: I0314 08:53:58.066557 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00027bfa-9ff4-4472-8f9d-3763d68530d3","Type":"ContainerStarted","Data":"0b7de22bae169477b1fc590371770c33b0a44270f03c00f4b10ed97ff1a5aeec"} Mar 14 08:53:58 crc kubenswrapper[5129]: I0314 08:53:58.067806 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00027bfa-9ff4-4472-8f9d-3763d68530d3","Type":"ContainerStarted","Data":"55bc1f7b03f454aa956b3f65a9b84c354bf5dde3f7c6918d0b2339f2e2bad2ee"} Mar 14 08:53:58 crc kubenswrapper[5129]: I0314 08:53:58.097653 5129 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 14 08:53:58 crc kubenswrapper[5129]: I0314 08:53:58.134144 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.384031021 podStartE2EDuration="3.13409429s" podCreationTimestamp="2026-03-14 08:53:55 +0000 UTC" firstStartedPulling="2026-03-14 08:53:56.115669588 +0000 UTC m=+6898.867584782" lastFinishedPulling="2026-03-14 08:53:56.865732857 +0000 UTC m=+6899.617648051" observedRunningTime="2026-03-14 08:53:58.126963627 +0000 UTC m=+6900.878878881" watchObservedRunningTime="2026-03-14 08:53:58.13409429 +0000 UTC m=+6900.886009484" Mar 14 08:54:00 crc kubenswrapper[5129]: I0314 08:54:00.146216 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557974-jj4g6"] Mar 14 08:54:00 crc kubenswrapper[5129]: I0314 08:54:00.148498 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557974-jj4g6" Mar 14 08:54:00 crc kubenswrapper[5129]: I0314 08:54:00.151881 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:54:00 crc kubenswrapper[5129]: I0314 08:54:00.151837 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:54:00 crc kubenswrapper[5129]: I0314 08:54:00.157805 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557974-jj4g6"] Mar 14 08:54:00 crc kubenswrapper[5129]: I0314 08:54:00.157914 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:54:00 crc kubenswrapper[5129]: I0314 08:54:00.265528 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q955n\" (UniqueName: 
\"kubernetes.io/projected/6e411d5e-ec75-4c09-a8e7-ba6495e8683b-kube-api-access-q955n\") pod \"auto-csr-approver-29557974-jj4g6\" (UID: \"6e411d5e-ec75-4c09-a8e7-ba6495e8683b\") " pod="openshift-infra/auto-csr-approver-29557974-jj4g6" Mar 14 08:54:00 crc kubenswrapper[5129]: I0314 08:54:00.368211 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q955n\" (UniqueName: \"kubernetes.io/projected/6e411d5e-ec75-4c09-a8e7-ba6495e8683b-kube-api-access-q955n\") pod \"auto-csr-approver-29557974-jj4g6\" (UID: \"6e411d5e-ec75-4c09-a8e7-ba6495e8683b\") " pod="openshift-infra/auto-csr-approver-29557974-jj4g6" Mar 14 08:54:00 crc kubenswrapper[5129]: I0314 08:54:00.393735 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q955n\" (UniqueName: \"kubernetes.io/projected/6e411d5e-ec75-4c09-a8e7-ba6495e8683b-kube-api-access-q955n\") pod \"auto-csr-approver-29557974-jj4g6\" (UID: \"6e411d5e-ec75-4c09-a8e7-ba6495e8683b\") " pod="openshift-infra/auto-csr-approver-29557974-jj4g6" Mar 14 08:54:00 crc kubenswrapper[5129]: I0314 08:54:00.472045 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557974-jj4g6" Mar 14 08:54:01 crc kubenswrapper[5129]: I0314 08:54:01.043897 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557974-jj4g6"] Mar 14 08:54:01 crc kubenswrapper[5129]: I0314 08:54:01.102064 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557974-jj4g6" event={"ID":"6e411d5e-ec75-4c09-a8e7-ba6495e8683b","Type":"ContainerStarted","Data":"0d5feda9265fcab3118b9f01640da2a3fa1ebaac807307a4ca70e90089af5853"} Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.129480 5129 generic.go:334] "Generic (PLEG): container finished" podID="6e411d5e-ec75-4c09-a8e7-ba6495e8683b" containerID="81b19590fff4577e99e55f0dcee9c9e14bec6db8ab6030801260e633071123bb" exitCode=0 Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.129571 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557974-jj4g6" event={"ID":"6e411d5e-ec75-4c09-a8e7-ba6495e8683b","Type":"ContainerDied","Data":"81b19590fff4577e99e55f0dcee9c9e14bec6db8ab6030801260e633071123bb"} Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.722466 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-77ds8"] Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.724538 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-77ds8" Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.738767 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-77ds8"] Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.750226 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4e64-account-create-update-ztnfr"] Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.752348 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4e64-account-create-update-ztnfr" Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.758181 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.764127 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4e64-account-create-update-ztnfr"] Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.837745 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l87t\" (UniqueName: \"kubernetes.io/projected/8e1ef377-9a00-46b1-b508-be733492d498-kube-api-access-6l87t\") pod \"keystone-db-create-77ds8\" (UID: \"8e1ef377-9a00-46b1-b508-be733492d498\") " pod="openstack/keystone-db-create-77ds8" Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.837824 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pbzw\" (UniqueName: \"kubernetes.io/projected/8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b-kube-api-access-5pbzw\") pod \"keystone-4e64-account-create-update-ztnfr\" (UID: \"8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b\") " pod="openstack/keystone-4e64-account-create-update-ztnfr" Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.837879 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b-operator-scripts\") pod \"keystone-4e64-account-create-update-ztnfr\" (UID: \"8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b\") " pod="openstack/keystone-4e64-account-create-update-ztnfr" Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.838535 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8e1ef377-9a00-46b1-b508-be733492d498-operator-scripts\") pod \"keystone-db-create-77ds8\" (UID: \"8e1ef377-9a00-46b1-b508-be733492d498\") " pod="openstack/keystone-db-create-77ds8" Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.940238 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e1ef377-9a00-46b1-b508-be733492d498-operator-scripts\") pod \"keystone-db-create-77ds8\" (UID: \"8e1ef377-9a00-46b1-b508-be733492d498\") " pod="openstack/keystone-db-create-77ds8" Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.940326 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l87t\" (UniqueName: \"kubernetes.io/projected/8e1ef377-9a00-46b1-b508-be733492d498-kube-api-access-6l87t\") pod \"keystone-db-create-77ds8\" (UID: \"8e1ef377-9a00-46b1-b508-be733492d498\") " pod="openstack/keystone-db-create-77ds8" Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.940356 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pbzw\" (UniqueName: \"kubernetes.io/projected/8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b-kube-api-access-5pbzw\") pod \"keystone-4e64-account-create-update-ztnfr\" (UID: \"8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b\") " pod="openstack/keystone-4e64-account-create-update-ztnfr" Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.940402 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b-operator-scripts\") pod \"keystone-4e64-account-create-update-ztnfr\" (UID: \"8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b\") " pod="openstack/keystone-4e64-account-create-update-ztnfr" Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.941491 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/8e1ef377-9a00-46b1-b508-be733492d498-operator-scripts\") pod \"keystone-db-create-77ds8\" (UID: \"8e1ef377-9a00-46b1-b508-be733492d498\") " pod="openstack/keystone-db-create-77ds8" Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.941582 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b-operator-scripts\") pod \"keystone-4e64-account-create-update-ztnfr\" (UID: \"8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b\") " pod="openstack/keystone-4e64-account-create-update-ztnfr" Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.968207 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l87t\" (UniqueName: \"kubernetes.io/projected/8e1ef377-9a00-46b1-b508-be733492d498-kube-api-access-6l87t\") pod \"keystone-db-create-77ds8\" (UID: \"8e1ef377-9a00-46b1-b508-be733492d498\") " pod="openstack/keystone-db-create-77ds8" Mar 14 08:54:03 crc kubenswrapper[5129]: I0314 08:54:03.970091 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pbzw\" (UniqueName: \"kubernetes.io/projected/8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b-kube-api-access-5pbzw\") pod \"keystone-4e64-account-create-update-ztnfr\" (UID: \"8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b\") " pod="openstack/keystone-4e64-account-create-update-ztnfr" Mar 14 08:54:04 crc kubenswrapper[5129]: I0314 08:54:04.046038 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-77ds8" Mar 14 08:54:04 crc kubenswrapper[5129]: I0314 08:54:04.072461 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4e64-account-create-update-ztnfr" Mar 14 08:54:04 crc kubenswrapper[5129]: I0314 08:54:04.579796 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-77ds8"] Mar 14 08:54:04 crc kubenswrapper[5129]: W0314 08:54:04.581265 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cab510a_bf82_4dcd_bf7e_b4e0ccf3fd5b.slice/crio-a59cd5bfc206505843d2b65bf45b03dbe17557f0b8006e0375cae5fcbd9ecbb4 WatchSource:0}: Error finding container a59cd5bfc206505843d2b65bf45b03dbe17557f0b8006e0375cae5fcbd9ecbb4: Status 404 returned error can't find the container with id a59cd5bfc206505843d2b65bf45b03dbe17557f0b8006e0375cae5fcbd9ecbb4 Mar 14 08:54:04 crc kubenswrapper[5129]: I0314 08:54:04.588500 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4e64-account-create-update-ztnfr"] Mar 14 08:54:04 crc kubenswrapper[5129]: W0314 08:54:04.629214 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e1ef377_9a00_46b1_b508_be733492d498.slice/crio-5992a53514ac765dd837549719b054a95c8edfdd9722bc15ceed8e2ac92d7887 WatchSource:0}: Error finding container 5992a53514ac765dd837549719b054a95c8edfdd9722bc15ceed8e2ac92d7887: Status 404 returned error can't find the container with id 5992a53514ac765dd837549719b054a95c8edfdd9722bc15ceed8e2ac92d7887 Mar 14 08:54:04 crc kubenswrapper[5129]: I0314 08:54:04.659235 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557974-jj4g6" Mar 14 08:54:04 crc kubenswrapper[5129]: I0314 08:54:04.777072 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q955n\" (UniqueName: \"kubernetes.io/projected/6e411d5e-ec75-4c09-a8e7-ba6495e8683b-kube-api-access-q955n\") pod \"6e411d5e-ec75-4c09-a8e7-ba6495e8683b\" (UID: \"6e411d5e-ec75-4c09-a8e7-ba6495e8683b\") " Mar 14 08:54:04 crc kubenswrapper[5129]: I0314 08:54:04.784940 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e411d5e-ec75-4c09-a8e7-ba6495e8683b-kube-api-access-q955n" (OuterVolumeSpecName: "kube-api-access-q955n") pod "6e411d5e-ec75-4c09-a8e7-ba6495e8683b" (UID: "6e411d5e-ec75-4c09-a8e7-ba6495e8683b"). InnerVolumeSpecName "kube-api-access-q955n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:54:04 crc kubenswrapper[5129]: I0314 08:54:04.880330 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q955n\" (UniqueName: \"kubernetes.io/projected/6e411d5e-ec75-4c09-a8e7-ba6495e8683b-kube-api-access-q955n\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:05 crc kubenswrapper[5129]: I0314 08:54:05.173063 5129 generic.go:334] "Generic (PLEG): container finished" podID="8e1ef377-9a00-46b1-b508-be733492d498" containerID="414d9c5c05d97bc88d53f026d62517aa4927eb14d00fb3e8cbbcd138777f70dd" exitCode=0 Mar 14 08:54:05 crc kubenswrapper[5129]: I0314 08:54:05.173130 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-77ds8" event={"ID":"8e1ef377-9a00-46b1-b508-be733492d498","Type":"ContainerDied","Data":"414d9c5c05d97bc88d53f026d62517aa4927eb14d00fb3e8cbbcd138777f70dd"} Mar 14 08:54:05 crc kubenswrapper[5129]: I0314 08:54:05.173569 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-77ds8" 
event={"ID":"8e1ef377-9a00-46b1-b508-be733492d498","Type":"ContainerStarted","Data":"5992a53514ac765dd837549719b054a95c8edfdd9722bc15ceed8e2ac92d7887"} Mar 14 08:54:05 crc kubenswrapper[5129]: I0314 08:54:05.176546 5129 generic.go:334] "Generic (PLEG): container finished" podID="8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b" containerID="1385d9621809d4f434148307c54f316ee93c6020b6d6a59a8fb650aabc0b0935" exitCode=0 Mar 14 08:54:05 crc kubenswrapper[5129]: I0314 08:54:05.176658 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4e64-account-create-update-ztnfr" event={"ID":"8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b","Type":"ContainerDied","Data":"1385d9621809d4f434148307c54f316ee93c6020b6d6a59a8fb650aabc0b0935"} Mar 14 08:54:05 crc kubenswrapper[5129]: I0314 08:54:05.176684 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4e64-account-create-update-ztnfr" event={"ID":"8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b","Type":"ContainerStarted","Data":"a59cd5bfc206505843d2b65bf45b03dbe17557f0b8006e0375cae5fcbd9ecbb4"} Mar 14 08:54:05 crc kubenswrapper[5129]: I0314 08:54:05.178551 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557974-jj4g6" event={"ID":"6e411d5e-ec75-4c09-a8e7-ba6495e8683b","Type":"ContainerDied","Data":"0d5feda9265fcab3118b9f01640da2a3fa1ebaac807307a4ca70e90089af5853"} Mar 14 08:54:05 crc kubenswrapper[5129]: I0314 08:54:05.178580 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d5feda9265fcab3118b9f01640da2a3fa1ebaac807307a4ca70e90089af5853" Mar 14 08:54:05 crc kubenswrapper[5129]: I0314 08:54:05.178721 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557974-jj4g6" Mar 14 08:54:05 crc kubenswrapper[5129]: I0314 08:54:05.766641 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557968-k958l"] Mar 14 08:54:05 crc kubenswrapper[5129]: I0314 08:54:05.778053 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557968-k958l"] Mar 14 08:54:06 crc kubenswrapper[5129]: I0314 08:54:06.048612 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c6372c0-b042-46c4-848b-c3dc35afc8ba" path="/var/lib/kubelet/pods/6c6372c0-b042-46c4-848b-c3dc35afc8ba/volumes" Mar 14 08:54:06 crc kubenswrapper[5129]: I0314 08:54:06.641029 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4e64-account-create-update-ztnfr" Mar 14 08:54:06 crc kubenswrapper[5129]: I0314 08:54:06.648235 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-77ds8" Mar 14 08:54:06 crc kubenswrapper[5129]: I0314 08:54:06.818561 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pbzw\" (UniqueName: \"kubernetes.io/projected/8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b-kube-api-access-5pbzw\") pod \"8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b\" (UID: \"8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b\") " Mar 14 08:54:06 crc kubenswrapper[5129]: I0314 08:54:06.819022 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l87t\" (UniqueName: \"kubernetes.io/projected/8e1ef377-9a00-46b1-b508-be733492d498-kube-api-access-6l87t\") pod \"8e1ef377-9a00-46b1-b508-be733492d498\" (UID: \"8e1ef377-9a00-46b1-b508-be733492d498\") " Mar 14 08:54:06 crc kubenswrapper[5129]: I0314 08:54:06.819065 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8e1ef377-9a00-46b1-b508-be733492d498-operator-scripts\") pod \"8e1ef377-9a00-46b1-b508-be733492d498\" (UID: \"8e1ef377-9a00-46b1-b508-be733492d498\") " Mar 14 08:54:06 crc kubenswrapper[5129]: I0314 08:54:06.819308 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b-operator-scripts\") pod \"8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b\" (UID: \"8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b\") " Mar 14 08:54:06 crc kubenswrapper[5129]: I0314 08:54:06.820489 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e1ef377-9a00-46b1-b508-be733492d498-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8e1ef377-9a00-46b1-b508-be733492d498" (UID: "8e1ef377-9a00-46b1-b508-be733492d498"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:54:06 crc kubenswrapper[5129]: I0314 08:54:06.820636 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b" (UID: "8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:54:06 crc kubenswrapper[5129]: I0314 08:54:06.825764 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b-kube-api-access-5pbzw" (OuterVolumeSpecName: "kube-api-access-5pbzw") pod "8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b" (UID: "8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b"). InnerVolumeSpecName "kube-api-access-5pbzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:54:06 crc kubenswrapper[5129]: I0314 08:54:06.826461 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e1ef377-9a00-46b1-b508-be733492d498-kube-api-access-6l87t" (OuterVolumeSpecName: "kube-api-access-6l87t") pod "8e1ef377-9a00-46b1-b508-be733492d498" (UID: "8e1ef377-9a00-46b1-b508-be733492d498"). InnerVolumeSpecName "kube-api-access-6l87t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:54:06 crc kubenswrapper[5129]: I0314 08:54:06.922675 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pbzw\" (UniqueName: \"kubernetes.io/projected/8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b-kube-api-access-5pbzw\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:06 crc kubenswrapper[5129]: I0314 08:54:06.922737 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l87t\" (UniqueName: \"kubernetes.io/projected/8e1ef377-9a00-46b1-b508-be733492d498-kube-api-access-6l87t\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:06 crc kubenswrapper[5129]: I0314 08:54:06.922758 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e1ef377-9a00-46b1-b508-be733492d498-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:06 crc kubenswrapper[5129]: I0314 08:54:06.922781 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:07 crc kubenswrapper[5129]: I0314 08:54:07.206691 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-77ds8" event={"ID":"8e1ef377-9a00-46b1-b508-be733492d498","Type":"ContainerDied","Data":"5992a53514ac765dd837549719b054a95c8edfdd9722bc15ceed8e2ac92d7887"} Mar 14 08:54:07 crc kubenswrapper[5129]: I0314 08:54:07.206764 
5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5992a53514ac765dd837549719b054a95c8edfdd9722bc15ceed8e2ac92d7887" Mar 14 08:54:07 crc kubenswrapper[5129]: I0314 08:54:07.206786 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-77ds8" Mar 14 08:54:07 crc kubenswrapper[5129]: I0314 08:54:07.210179 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4e64-account-create-update-ztnfr" event={"ID":"8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b","Type":"ContainerDied","Data":"a59cd5bfc206505843d2b65bf45b03dbe17557f0b8006e0375cae5fcbd9ecbb4"} Mar 14 08:54:07 crc kubenswrapper[5129]: I0314 08:54:07.210224 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a59cd5bfc206505843d2b65bf45b03dbe17557f0b8006e0375cae5fcbd9ecbb4" Mar 14 08:54:07 crc kubenswrapper[5129]: I0314 08:54:07.210314 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4e64-account-create-update-ztnfr" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.328067 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-lvvc4"] Mar 14 08:54:09 crc kubenswrapper[5129]: E0314 08:54:09.329442 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b" containerName="mariadb-account-create-update" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.329474 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b" containerName="mariadb-account-create-update" Mar 14 08:54:09 crc kubenswrapper[5129]: E0314 08:54:09.329516 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e411d5e-ec75-4c09-a8e7-ba6495e8683b" containerName="oc" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.329527 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e411d5e-ec75-4c09-a8e7-ba6495e8683b" containerName="oc" Mar 14 08:54:09 crc kubenswrapper[5129]: E0314 08:54:09.329566 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1ef377-9a00-46b1-b508-be733492d498" containerName="mariadb-database-create" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.329580 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1ef377-9a00-46b1-b508-be733492d498" containerName="mariadb-database-create" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.329945 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e1ef377-9a00-46b1-b508-be733492d498" containerName="mariadb-database-create" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.329983 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e411d5e-ec75-4c09-a8e7-ba6495e8683b" containerName="oc" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.330003 5129 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b" containerName="mariadb-account-create-update" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.331128 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-lvvc4" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.337927 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.338149 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.343743 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.345675 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-g8rfd" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.346777 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-lvvc4"] Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.478477 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5241695-295b-4ce3-809e-79a7790a54c3-config-data\") pod \"keystone-db-sync-lvvc4\" (UID: \"a5241695-295b-4ce3-809e-79a7790a54c3\") " pod="openstack/keystone-db-sync-lvvc4" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.478565 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5241695-295b-4ce3-809e-79a7790a54c3-combined-ca-bundle\") pod \"keystone-db-sync-lvvc4\" (UID: \"a5241695-295b-4ce3-809e-79a7790a54c3\") " pod="openstack/keystone-db-sync-lvvc4" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.478718 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97dgf\" (UniqueName: \"kubernetes.io/projected/a5241695-295b-4ce3-809e-79a7790a54c3-kube-api-access-97dgf\") pod \"keystone-db-sync-lvvc4\" (UID: \"a5241695-295b-4ce3-809e-79a7790a54c3\") " pod="openstack/keystone-db-sync-lvvc4" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.581241 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5241695-295b-4ce3-809e-79a7790a54c3-config-data\") pod \"keystone-db-sync-lvvc4\" (UID: \"a5241695-295b-4ce3-809e-79a7790a54c3\") " pod="openstack/keystone-db-sync-lvvc4" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.581358 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5241695-295b-4ce3-809e-79a7790a54c3-combined-ca-bundle\") pod \"keystone-db-sync-lvvc4\" (UID: \"a5241695-295b-4ce3-809e-79a7790a54c3\") " pod="openstack/keystone-db-sync-lvvc4" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.581471 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97dgf\" (UniqueName: \"kubernetes.io/projected/a5241695-295b-4ce3-809e-79a7790a54c3-kube-api-access-97dgf\") pod \"keystone-db-sync-lvvc4\" (UID: \"a5241695-295b-4ce3-809e-79a7790a54c3\") " pod="openstack/keystone-db-sync-lvvc4" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.591946 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5241695-295b-4ce3-809e-79a7790a54c3-combined-ca-bundle\") pod \"keystone-db-sync-lvvc4\" (UID: \"a5241695-295b-4ce3-809e-79a7790a54c3\") " pod="openstack/keystone-db-sync-lvvc4" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.592042 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a5241695-295b-4ce3-809e-79a7790a54c3-config-data\") pod \"keystone-db-sync-lvvc4\" (UID: \"a5241695-295b-4ce3-809e-79a7790a54c3\") " pod="openstack/keystone-db-sync-lvvc4" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.613375 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97dgf\" (UniqueName: \"kubernetes.io/projected/a5241695-295b-4ce3-809e-79a7790a54c3-kube-api-access-97dgf\") pod \"keystone-db-sync-lvvc4\" (UID: \"a5241695-295b-4ce3-809e-79a7790a54c3\") " pod="openstack/keystone-db-sync-lvvc4" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.661448 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-lvvc4" Mar 14 08:54:09 crc kubenswrapper[5129]: I0314 08:54:09.928920 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-lvvc4"] Mar 14 08:54:09 crc kubenswrapper[5129]: W0314 08:54:09.942914 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5241695_295b_4ce3_809e_79a7790a54c3.slice/crio-d69841404f18892efb4ff6514dc94d11d91a67a8541932e5952004bf116642e3 WatchSource:0}: Error finding container d69841404f18892efb4ff6514dc94d11d91a67a8541932e5952004bf116642e3: Status 404 returned error can't find the container with id d69841404f18892efb4ff6514dc94d11d91a67a8541932e5952004bf116642e3 Mar 14 08:54:10 crc kubenswrapper[5129]: I0314 08:54:10.239924 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lvvc4" event={"ID":"a5241695-295b-4ce3-809e-79a7790a54c3","Type":"ContainerStarted","Data":"d69841404f18892efb4ff6514dc94d11d91a67a8541932e5952004bf116642e3"} Mar 14 08:54:15 crc kubenswrapper[5129]: I0314 08:54:15.667593 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 14 08:54:19 crc kubenswrapper[5129]: I0314 08:54:19.335552 5129 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lvvc4" event={"ID":"a5241695-295b-4ce3-809e-79a7790a54c3","Type":"ContainerStarted","Data":"382d7502da58377d9faf810b28da60562ff0d64caac9ea231ad2d6fd4230293f"} Mar 14 08:54:19 crc kubenswrapper[5129]: I0314 08:54:19.371256 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-lvvc4" podStartSLOduration=1.59040902 podStartE2EDuration="10.371223005s" podCreationTimestamp="2026-03-14 08:54:09 +0000 UTC" firstStartedPulling="2026-03-14 08:54:09.954414645 +0000 UTC m=+6912.706329839" lastFinishedPulling="2026-03-14 08:54:18.7352286 +0000 UTC m=+6921.487143824" observedRunningTime="2026-03-14 08:54:19.367234037 +0000 UTC m=+6922.119149261" watchObservedRunningTime="2026-03-14 08:54:19.371223005 +0000 UTC m=+6922.123138229" Mar 14 08:54:19 crc kubenswrapper[5129]: I0314 08:54:19.574137 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:54:19 crc kubenswrapper[5129]: I0314 08:54:19.574644 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:54:21 crc kubenswrapper[5129]: I0314 08:54:21.365657 5129 generic.go:334] "Generic (PLEG): container finished" podID="a5241695-295b-4ce3-809e-79a7790a54c3" containerID="382d7502da58377d9faf810b28da60562ff0d64caac9ea231ad2d6fd4230293f" exitCode=0 Mar 14 08:54:21 crc kubenswrapper[5129]: I0314 08:54:21.365748 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-sync-lvvc4" event={"ID":"a5241695-295b-4ce3-809e-79a7790a54c3","Type":"ContainerDied","Data":"382d7502da58377d9faf810b28da60562ff0d64caac9ea231ad2d6fd4230293f"} Mar 14 08:54:22 crc kubenswrapper[5129]: I0314 08:54:22.766260 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-lvvc4" Mar 14 08:54:22 crc kubenswrapper[5129]: I0314 08:54:22.958750 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97dgf\" (UniqueName: \"kubernetes.io/projected/a5241695-295b-4ce3-809e-79a7790a54c3-kube-api-access-97dgf\") pod \"a5241695-295b-4ce3-809e-79a7790a54c3\" (UID: \"a5241695-295b-4ce3-809e-79a7790a54c3\") " Mar 14 08:54:22 crc kubenswrapper[5129]: I0314 08:54:22.958963 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5241695-295b-4ce3-809e-79a7790a54c3-config-data\") pod \"a5241695-295b-4ce3-809e-79a7790a54c3\" (UID: \"a5241695-295b-4ce3-809e-79a7790a54c3\") " Mar 14 08:54:22 crc kubenswrapper[5129]: I0314 08:54:22.960065 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5241695-295b-4ce3-809e-79a7790a54c3-combined-ca-bundle\") pod \"a5241695-295b-4ce3-809e-79a7790a54c3\" (UID: \"a5241695-295b-4ce3-809e-79a7790a54c3\") " Mar 14 08:54:22 crc kubenswrapper[5129]: I0314 08:54:22.965925 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5241695-295b-4ce3-809e-79a7790a54c3-kube-api-access-97dgf" (OuterVolumeSpecName: "kube-api-access-97dgf") pod "a5241695-295b-4ce3-809e-79a7790a54c3" (UID: "a5241695-295b-4ce3-809e-79a7790a54c3"). InnerVolumeSpecName "kube-api-access-97dgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.010889 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5241695-295b-4ce3-809e-79a7790a54c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5241695-295b-4ce3-809e-79a7790a54c3" (UID: "a5241695-295b-4ce3-809e-79a7790a54c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.030257 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5241695-295b-4ce3-809e-79a7790a54c3-config-data" (OuterVolumeSpecName: "config-data") pod "a5241695-295b-4ce3-809e-79a7790a54c3" (UID: "a5241695-295b-4ce3-809e-79a7790a54c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.064254 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5241695-295b-4ce3-809e-79a7790a54c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.064313 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97dgf\" (UniqueName: \"kubernetes.io/projected/a5241695-295b-4ce3-809e-79a7790a54c3-kube-api-access-97dgf\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.064336 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5241695-295b-4ce3-809e-79a7790a54c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.392562 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lvvc4" 
event={"ID":"a5241695-295b-4ce3-809e-79a7790a54c3","Type":"ContainerDied","Data":"d69841404f18892efb4ff6514dc94d11d91a67a8541932e5952004bf116642e3"} Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.392655 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d69841404f18892efb4ff6514dc94d11d91a67a8541932e5952004bf116642e3" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.392698 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-lvvc4" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.784265 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dcc7fd9c5-xx776"] Mar 14 08:54:23 crc kubenswrapper[5129]: E0314 08:54:23.786830 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5241695-295b-4ce3-809e-79a7790a54c3" containerName="keystone-db-sync" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.786891 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5241695-295b-4ce3-809e-79a7790a54c3" containerName="keystone-db-sync" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.787490 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5241695-295b-4ce3-809e-79a7790a54c3" containerName="keystone-db-sync" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.792706 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.836003 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5m76h"] Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.837859 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.840334 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-g8rfd" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.842271 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.842679 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.847045 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.848034 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.849218 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dcc7fd9c5-xx776"] Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.863363 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5m76h"] Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.895838 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-scripts\") pod \"keystone-bootstrap-5m76h\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.895899 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5nfk\" (UniqueName: \"kubernetes.io/projected/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-kube-api-access-l5nfk\") pod \"dnsmasq-dns-6dcc7fd9c5-xx776\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " 
pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.895932 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-config-data\") pod \"keystone-bootstrap-5m76h\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.896216 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-combined-ca-bundle\") pod \"keystone-bootstrap-5m76h\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.896323 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-credential-keys\") pod \"keystone-bootstrap-5m76h\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.896396 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-ovsdbserver-sb\") pod \"dnsmasq-dns-6dcc7fd9c5-xx776\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.896545 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-config\") pod \"dnsmasq-dns-6dcc7fd9c5-xx776\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " 
pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.896577 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-ovsdbserver-nb\") pod \"dnsmasq-dns-6dcc7fd9c5-xx776\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.896916 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wgpp\" (UniqueName: \"kubernetes.io/projected/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-kube-api-access-2wgpp\") pod \"keystone-bootstrap-5m76h\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.897004 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-dns-svc\") pod \"dnsmasq-dns-6dcc7fd9c5-xx776\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:23 crc kubenswrapper[5129]: I0314 08:54:23.897047 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-fernet-keys\") pod \"keystone-bootstrap-5m76h\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.000150 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-scripts\") pod \"keystone-bootstrap-5m76h\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " 
pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.000253 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5nfk\" (UniqueName: \"kubernetes.io/projected/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-kube-api-access-l5nfk\") pod \"dnsmasq-dns-6dcc7fd9c5-xx776\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.000299 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-config-data\") pod \"keystone-bootstrap-5m76h\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.000361 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-combined-ca-bundle\") pod \"keystone-bootstrap-5m76h\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.000403 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-credential-keys\") pod \"keystone-bootstrap-5m76h\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.000442 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-ovsdbserver-sb\") pod \"dnsmasq-dns-6dcc7fd9c5-xx776\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:24 crc 
kubenswrapper[5129]: I0314 08:54:24.000494 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-config\") pod \"dnsmasq-dns-6dcc7fd9c5-xx776\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.000521 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-ovsdbserver-nb\") pod \"dnsmasq-dns-6dcc7fd9c5-xx776\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.000624 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wgpp\" (UniqueName: \"kubernetes.io/projected/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-kube-api-access-2wgpp\") pod \"keystone-bootstrap-5m76h\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.000654 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-dns-svc\") pod \"dnsmasq-dns-6dcc7fd9c5-xx776\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.000675 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-fernet-keys\") pod \"keystone-bootstrap-5m76h\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.002474 5129 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-ovsdbserver-sb\") pod \"dnsmasq-dns-6dcc7fd9c5-xx776\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.002505 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-dns-svc\") pod \"dnsmasq-dns-6dcc7fd9c5-xx776\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.002640 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-ovsdbserver-nb\") pod \"dnsmasq-dns-6dcc7fd9c5-xx776\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.003255 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-config\") pod \"dnsmasq-dns-6dcc7fd9c5-xx776\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.005512 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-scripts\") pod \"keystone-bootstrap-5m76h\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.006510 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-fernet-keys\") pod 
\"keystone-bootstrap-5m76h\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.007008 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-credential-keys\") pod \"keystone-bootstrap-5m76h\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.007780 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-config-data\") pod \"keystone-bootstrap-5m76h\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.009138 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-combined-ca-bundle\") pod \"keystone-bootstrap-5m76h\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.019035 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wgpp\" (UniqueName: \"kubernetes.io/projected/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-kube-api-access-2wgpp\") pod \"keystone-bootstrap-5m76h\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.023391 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5nfk\" (UniqueName: \"kubernetes.io/projected/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-kube-api-access-l5nfk\") pod \"dnsmasq-dns-6dcc7fd9c5-xx776\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " 
pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.130222 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.160252 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.490033 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5m76h"] Mar 14 08:54:24 crc kubenswrapper[5129]: I0314 08:54:24.552286 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dcc7fd9c5-xx776"] Mar 14 08:54:24 crc kubenswrapper[5129]: W0314 08:54:24.563137 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31d6fa6c_d1e7_47d2_9f15_b482e6d61975.slice/crio-bb3a36c5e1edb7338ab5f3a49259c90275b6cfad6d204227da5697c125c78f56 WatchSource:0}: Error finding container bb3a36c5e1edb7338ab5f3a49259c90275b6cfad6d204227da5697c125c78f56: Status 404 returned error can't find the container with id bb3a36c5e1edb7338ab5f3a49259c90275b6cfad6d204227da5697c125c78f56 Mar 14 08:54:25 crc kubenswrapper[5129]: I0314 08:54:25.418305 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5m76h" event={"ID":"d5dc8699-66d4-4ed6-ab61-a7b15a15249f","Type":"ContainerStarted","Data":"86d45daa624ee8fe82612390d6e7f96cb88e6cb16950f3ac0d74dc50b5c805f8"} Mar 14 08:54:25 crc kubenswrapper[5129]: I0314 08:54:25.419155 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5m76h" event={"ID":"d5dc8699-66d4-4ed6-ab61-a7b15a15249f","Type":"ContainerStarted","Data":"e595204412ad1fc8a3bdb2d7e79028d807c7ea7c6bc834f67c0f3e489a2694d0"} Mar 14 08:54:25 crc kubenswrapper[5129]: I0314 08:54:25.421145 5129 generic.go:334] "Generic (PLEG): 
container finished" podID="31d6fa6c-d1e7-47d2-9f15-b482e6d61975" containerID="7f336ec20dd0c117c58d41547a262b708c4159207646c854ab1ada0c6040d0b8" exitCode=0 Mar 14 08:54:25 crc kubenswrapper[5129]: I0314 08:54:25.421211 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" event={"ID":"31d6fa6c-d1e7-47d2-9f15-b482e6d61975","Type":"ContainerDied","Data":"7f336ec20dd0c117c58d41547a262b708c4159207646c854ab1ada0c6040d0b8"} Mar 14 08:54:25 crc kubenswrapper[5129]: I0314 08:54:25.421273 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" event={"ID":"31d6fa6c-d1e7-47d2-9f15-b482e6d61975","Type":"ContainerStarted","Data":"bb3a36c5e1edb7338ab5f3a49259c90275b6cfad6d204227da5697c125c78f56"} Mar 14 08:54:25 crc kubenswrapper[5129]: I0314 08:54:25.457553 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5m76h" podStartSLOduration=2.457516399 podStartE2EDuration="2.457516399s" podCreationTimestamp="2026-03-14 08:54:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:54:25.455064263 +0000 UTC m=+6928.206979457" watchObservedRunningTime="2026-03-14 08:54:25.457516399 +0000 UTC m=+6928.209431623" Mar 14 08:54:26 crc kubenswrapper[5129]: I0314 08:54:26.435411 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" event={"ID":"31d6fa6c-d1e7-47d2-9f15-b482e6d61975","Type":"ContainerStarted","Data":"e52c828c4c947bb847c88584cc67f0e447afabeb1af44ca5cdd3e1c6a5960b72"} Mar 14 08:54:26 crc kubenswrapper[5129]: I0314 08:54:26.436020 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:27 crc kubenswrapper[5129]: I0314 08:54:27.492052 5129 scope.go:117] "RemoveContainer" 
containerID="330b60cd4436a3db2514f18a24a29aaaa77016a838850e8250931a8b47b3c13f" Mar 14 08:54:27 crc kubenswrapper[5129]: I0314 08:54:27.521323 5129 scope.go:117] "RemoveContainer" containerID="f7797eb1a99fa91beda198f31bbc4a17b990033593a308b96aae9db21024380d" Mar 14 08:54:27 crc kubenswrapper[5129]: I0314 08:54:27.584651 5129 scope.go:117] "RemoveContainer" containerID="4777da02711b53675af955a8a09b74acefa8843e11c53e62343f8e797638e915" Mar 14 08:54:28 crc kubenswrapper[5129]: I0314 08:54:28.464185 5129 generic.go:334] "Generic (PLEG): container finished" podID="d5dc8699-66d4-4ed6-ab61-a7b15a15249f" containerID="86d45daa624ee8fe82612390d6e7f96cb88e6cb16950f3ac0d74dc50b5c805f8" exitCode=0 Mar 14 08:54:28 crc kubenswrapper[5129]: I0314 08:54:28.464316 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5m76h" event={"ID":"d5dc8699-66d4-4ed6-ab61-a7b15a15249f","Type":"ContainerDied","Data":"86d45daa624ee8fe82612390d6e7f96cb88e6cb16950f3ac0d74dc50b5c805f8"} Mar 14 08:54:28 crc kubenswrapper[5129]: I0314 08:54:28.495560 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" podStartSLOduration=5.495529898 podStartE2EDuration="5.495529898s" podCreationTimestamp="2026-03-14 08:54:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:54:26.472737258 +0000 UTC m=+6929.224652462" watchObservedRunningTime="2026-03-14 08:54:28.495529898 +0000 UTC m=+6931.247445082" Mar 14 08:54:29 crc kubenswrapper[5129]: I0314 08:54:29.914046 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.023622 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-scripts\") pod \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.023777 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-combined-ca-bundle\") pod \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.023877 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-config-data\") pod \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.023942 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-credential-keys\") pod \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.024080 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wgpp\" (UniqueName: \"kubernetes.io/projected/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-kube-api-access-2wgpp\") pod \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.024142 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-fernet-keys\") pod \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\" (UID: \"d5dc8699-66d4-4ed6-ab61-a7b15a15249f\") " Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.031877 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d5dc8699-66d4-4ed6-ab61-a7b15a15249f" (UID: "d5dc8699-66d4-4ed6-ab61-a7b15a15249f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.032465 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-kube-api-access-2wgpp" (OuterVolumeSpecName: "kube-api-access-2wgpp") pod "d5dc8699-66d4-4ed6-ab61-a7b15a15249f" (UID: "d5dc8699-66d4-4ed6-ab61-a7b15a15249f"). InnerVolumeSpecName "kube-api-access-2wgpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.033788 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-scripts" (OuterVolumeSpecName: "scripts") pod "d5dc8699-66d4-4ed6-ab61-a7b15a15249f" (UID: "d5dc8699-66d4-4ed6-ab61-a7b15a15249f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.034163 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d5dc8699-66d4-4ed6-ab61-a7b15a15249f" (UID: "d5dc8699-66d4-4ed6-ab61-a7b15a15249f"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.059570 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-config-data" (OuterVolumeSpecName: "config-data") pod "d5dc8699-66d4-4ed6-ab61-a7b15a15249f" (UID: "d5dc8699-66d4-4ed6-ab61-a7b15a15249f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.065792 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5dc8699-66d4-4ed6-ab61-a7b15a15249f" (UID: "d5dc8699-66d4-4ed6-ab61-a7b15a15249f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.127660 5129 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.127722 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wgpp\" (UniqueName: \"kubernetes.io/projected/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-kube-api-access-2wgpp\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.127744 5129 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.127765 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 
08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.127784 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.127802 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5dc8699-66d4-4ed6-ab61-a7b15a15249f-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.490523 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5m76h" event={"ID":"d5dc8699-66d4-4ed6-ab61-a7b15a15249f","Type":"ContainerDied","Data":"e595204412ad1fc8a3bdb2d7e79028d807c7ea7c6bc834f67c0f3e489a2694d0"} Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.490566 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e595204412ad1fc8a3bdb2d7e79028d807c7ea7c6bc834f67c0f3e489a2694d0" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.490789 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5m76h" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.583909 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5m76h"] Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.589929 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5m76h"] Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.682199 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-lfzdp"] Mar 14 08:54:30 crc kubenswrapper[5129]: E0314 08:54:30.683029 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5dc8699-66d4-4ed6-ab61-a7b15a15249f" containerName="keystone-bootstrap" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.683131 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5dc8699-66d4-4ed6-ab61-a7b15a15249f" containerName="keystone-bootstrap" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.683489 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5dc8699-66d4-4ed6-ab61-a7b15a15249f" containerName="keystone-bootstrap" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.684370 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.688734 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.688936 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.689121 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.690378 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.690773 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-g8rfd" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.699811 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lfzdp"] Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.842803 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-combined-ca-bundle\") pod \"keystone-bootstrap-lfzdp\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.842872 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-credential-keys\") pod \"keystone-bootstrap-lfzdp\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.843009 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-4p9m4\" (UniqueName: \"kubernetes.io/projected/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-kube-api-access-4p9m4\") pod \"keystone-bootstrap-lfzdp\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.843099 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-fernet-keys\") pod \"keystone-bootstrap-lfzdp\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.843146 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-config-data\") pod \"keystone-bootstrap-lfzdp\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.843220 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-scripts\") pod \"keystone-bootstrap-lfzdp\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.945777 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-combined-ca-bundle\") pod \"keystone-bootstrap-lfzdp\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.945852 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-credential-keys\") pod \"keystone-bootstrap-lfzdp\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.945953 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p9m4\" (UniqueName: \"kubernetes.io/projected/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-kube-api-access-4p9m4\") pod \"keystone-bootstrap-lfzdp\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.946049 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-fernet-keys\") pod \"keystone-bootstrap-lfzdp\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.946097 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-config-data\") pod \"keystone-bootstrap-lfzdp\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.946139 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-scripts\") pod \"keystone-bootstrap-lfzdp\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.952474 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-combined-ca-bundle\") pod \"keystone-bootstrap-lfzdp\" (UID: 
\"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.952895 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-credential-keys\") pod \"keystone-bootstrap-lfzdp\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.953908 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-fernet-keys\") pod \"keystone-bootstrap-lfzdp\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.954503 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-config-data\") pod \"keystone-bootstrap-lfzdp\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.955225 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-scripts\") pod \"keystone-bootstrap-lfzdp\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:30 crc kubenswrapper[5129]: I0314 08:54:30.973881 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p9m4\" (UniqueName: \"kubernetes.io/projected/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-kube-api-access-4p9m4\") pod \"keystone-bootstrap-lfzdp\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:31 crc kubenswrapper[5129]: I0314 08:54:31.017075 5129 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:31 crc kubenswrapper[5129]: I0314 08:54:31.545368 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lfzdp"] Mar 14 08:54:32 crc kubenswrapper[5129]: I0314 08:54:32.047272 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5dc8699-66d4-4ed6-ab61-a7b15a15249f" path="/var/lib/kubelet/pods/d5dc8699-66d4-4ed6-ab61-a7b15a15249f/volumes" Mar 14 08:54:32 crc kubenswrapper[5129]: I0314 08:54:32.511198 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lfzdp" event={"ID":"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9","Type":"ContainerStarted","Data":"9aaa0f15556aaa444aa66ef937a46622e6b85e36772b38a226cbe5609d296c72"} Mar 14 08:54:32 crc kubenswrapper[5129]: I0314 08:54:32.511673 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lfzdp" event={"ID":"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9","Type":"ContainerStarted","Data":"224359ef4600986ed27207ccbbbde1c298976f27572b65d5944e8c229faec5a5"} Mar 14 08:54:32 crc kubenswrapper[5129]: I0314 08:54:32.555960 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-lfzdp" podStartSLOduration=2.555930269 podStartE2EDuration="2.555930269s" podCreationTimestamp="2026-03-14 08:54:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:54:32.54559643 +0000 UTC m=+6935.297511614" watchObservedRunningTime="2026-03-14 08:54:32.555930269 +0000 UTC m=+6935.307845463" Mar 14 08:54:34 crc kubenswrapper[5129]: I0314 08:54:34.133031 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:54:34 crc kubenswrapper[5129]: I0314 08:54:34.243030 5129 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-69448b775f-qphrw"] Mar 14 08:54:34 crc kubenswrapper[5129]: I0314 08:54:34.244518 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69448b775f-qphrw" podUID="6543f565-2287-4972-8a36-41b99d7248fb" containerName="dnsmasq-dns" containerID="cri-o://2e150d503562c8d2b7eaa0e8557dec957bd831b2c1071495f691f01609d53b11" gracePeriod=10 Mar 14 08:54:34 crc kubenswrapper[5129]: I0314 08:54:34.539009 5129 generic.go:334] "Generic (PLEG): container finished" podID="6543f565-2287-4972-8a36-41b99d7248fb" containerID="2e150d503562c8d2b7eaa0e8557dec957bd831b2c1071495f691f01609d53b11" exitCode=0 Mar 14 08:54:34 crc kubenswrapper[5129]: I0314 08:54:34.539104 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69448b775f-qphrw" event={"ID":"6543f565-2287-4972-8a36-41b99d7248fb","Type":"ContainerDied","Data":"2e150d503562c8d2b7eaa0e8557dec957bd831b2c1071495f691f01609d53b11"} Mar 14 08:54:34 crc kubenswrapper[5129]: I0314 08:54:34.541089 5129 generic.go:334] "Generic (PLEG): container finished" podID="b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9" containerID="9aaa0f15556aaa444aa66ef937a46622e6b85e36772b38a226cbe5609d296c72" exitCode=0 Mar 14 08:54:34 crc kubenswrapper[5129]: I0314 08:54:34.541148 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lfzdp" event={"ID":"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9","Type":"ContainerDied","Data":"9aaa0f15556aaa444aa66ef937a46622e6b85e36772b38a226cbe5609d296c72"} Mar 14 08:54:34 crc kubenswrapper[5129]: I0314 08:54:34.919032 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.037803 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-config\") pod \"6543f565-2287-4972-8a36-41b99d7248fb\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.037891 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-ovsdbserver-sb\") pod \"6543f565-2287-4972-8a36-41b99d7248fb\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.038078 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-dns-svc\") pod \"6543f565-2287-4972-8a36-41b99d7248fb\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.038117 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45sn4\" (UniqueName: \"kubernetes.io/projected/6543f565-2287-4972-8a36-41b99d7248fb-kube-api-access-45sn4\") pod \"6543f565-2287-4972-8a36-41b99d7248fb\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.038231 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-ovsdbserver-nb\") pod \"6543f565-2287-4972-8a36-41b99d7248fb\" (UID: \"6543f565-2287-4972-8a36-41b99d7248fb\") " Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.045343 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6543f565-2287-4972-8a36-41b99d7248fb-kube-api-access-45sn4" (OuterVolumeSpecName: "kube-api-access-45sn4") pod "6543f565-2287-4972-8a36-41b99d7248fb" (UID: "6543f565-2287-4972-8a36-41b99d7248fb"). InnerVolumeSpecName "kube-api-access-45sn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.081490 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6543f565-2287-4972-8a36-41b99d7248fb" (UID: "6543f565-2287-4972-8a36-41b99d7248fb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.082566 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-config" (OuterVolumeSpecName: "config") pod "6543f565-2287-4972-8a36-41b99d7248fb" (UID: "6543f565-2287-4972-8a36-41b99d7248fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.089657 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6543f565-2287-4972-8a36-41b99d7248fb" (UID: "6543f565-2287-4972-8a36-41b99d7248fb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.092402 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6543f565-2287-4972-8a36-41b99d7248fb" (UID: "6543f565-2287-4972-8a36-41b99d7248fb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.141742 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.141776 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.141803 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45sn4\" (UniqueName: \"kubernetes.io/projected/6543f565-2287-4972-8a36-41b99d7248fb-kube-api-access-45sn4\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.141815 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.141823 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6543f565-2287-4972-8a36-41b99d7248fb-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.555173 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69448b775f-qphrw" event={"ID":"6543f565-2287-4972-8a36-41b99d7248fb","Type":"ContainerDied","Data":"6671da50617c80a8636ca7fecfc4447461c73aa02eba1e66e5af95b9425e976f"} Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.555303 5129 scope.go:117] "RemoveContainer" containerID="2e150d503562c8d2b7eaa0e8557dec957bd831b2c1071495f691f01609d53b11" Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.555809 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69448b775f-qphrw" Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.587325 5129 scope.go:117] "RemoveContainer" containerID="066a2034b27e9f457e9df59f540573ddce8281c5c528a8863f5d8fca855166e0" Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.646114 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69448b775f-qphrw"] Mar 14 08:54:35 crc kubenswrapper[5129]: I0314 08:54:35.655155 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69448b775f-qphrw"] Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.043149 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.049782 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6543f565-2287-4972-8a36-41b99d7248fb" path="/var/lib/kubelet/pods/6543f565-2287-4972-8a36-41b99d7248fb/volumes" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.165086 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-credential-keys\") pod \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.165143 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-combined-ca-bundle\") pod \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.165217 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p9m4\" (UniqueName: \"kubernetes.io/projected/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-kube-api-access-4p9m4\") 
pod \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.165349 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-fernet-keys\") pod \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.165395 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-config-data\") pod \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.165491 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-scripts\") pod \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\" (UID: \"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9\") " Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.171661 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-kube-api-access-4p9m4" (OuterVolumeSpecName: "kube-api-access-4p9m4") pod "b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9" (UID: "b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9"). InnerVolumeSpecName "kube-api-access-4p9m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.173830 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-scripts" (OuterVolumeSpecName: "scripts") pod "b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9" (UID: "b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.173873 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9" (UID: "b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.173966 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9" (UID: "b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.192822 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9" (UID: "b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.197871 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-config-data" (OuterVolumeSpecName: "config-data") pod "b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9" (UID: "b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.267068 5129 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.267098 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.267108 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p9m4\" (UniqueName: \"kubernetes.io/projected/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-kube-api-access-4p9m4\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.267118 5129 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.267129 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.267137 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.570358 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lfzdp" event={"ID":"b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9","Type":"ContainerDied","Data":"224359ef4600986ed27207ccbbbde1c298976f27572b65d5944e8c229faec5a5"} Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 
08:54:36.570437 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="224359ef4600986ed27207ccbbbde1c298976f27572b65d5944e8c229faec5a5" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.571837 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lfzdp" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.761798 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6b78d4f6dc-hl89h"] Mar 14 08:54:36 crc kubenswrapper[5129]: E0314 08:54:36.762254 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6543f565-2287-4972-8a36-41b99d7248fb" containerName="dnsmasq-dns" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.762276 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6543f565-2287-4972-8a36-41b99d7248fb" containerName="dnsmasq-dns" Mar 14 08:54:36 crc kubenswrapper[5129]: E0314 08:54:36.762296 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6543f565-2287-4972-8a36-41b99d7248fb" containerName="init" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.762304 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6543f565-2287-4972-8a36-41b99d7248fb" containerName="init" Mar 14 08:54:36 crc kubenswrapper[5129]: E0314 08:54:36.762333 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9" containerName="keystone-bootstrap" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.762343 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9" containerName="keystone-bootstrap" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.762554 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9" containerName="keystone-bootstrap" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.762572 5129 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="6543f565-2287-4972-8a36-41b99d7248fb" containerName="dnsmasq-dns" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.763989 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.773358 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.774182 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.774298 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.774194 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-g8rfd" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.774412 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.774450 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.815969 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6b78d4f6dc-hl89h"] Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.879458 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-public-tls-certs\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.879517 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-credential-keys\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.879541 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-fernet-keys\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.879619 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2mw7\" (UniqueName: \"kubernetes.io/projected/59ccb607-3439-4284-b171-93d68e3ee432-kube-api-access-c2mw7\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.879666 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-combined-ca-bundle\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.879696 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-scripts\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.879730 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-internal-tls-certs\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.879772 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-config-data\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.981104 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2mw7\" (UniqueName: \"kubernetes.io/projected/59ccb607-3439-4284-b171-93d68e3ee432-kube-api-access-c2mw7\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.981752 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-combined-ca-bundle\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.981800 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-scripts\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.981834 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-internal-tls-certs\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.981875 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-config-data\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.981917 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-public-tls-certs\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.981934 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-credential-keys\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.981950 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-fernet-keys\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.989222 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-public-tls-certs\") pod 
\"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.989688 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-internal-tls-certs\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.989837 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-credential-keys\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.990431 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-combined-ca-bundle\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.992246 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-scripts\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc kubenswrapper[5129]: I0314 08:54:36.993204 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-config-data\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:36 crc 
kubenswrapper[5129]: I0314 08:54:36.994444 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/59ccb607-3439-4284-b171-93d68e3ee432-fernet-keys\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:37 crc kubenswrapper[5129]: I0314 08:54:37.001411 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2mw7\" (UniqueName: \"kubernetes.io/projected/59ccb607-3439-4284-b171-93d68e3ee432-kube-api-access-c2mw7\") pod \"keystone-6b78d4f6dc-hl89h\" (UID: \"59ccb607-3439-4284-b171-93d68e3ee432\") " pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:37 crc kubenswrapper[5129]: I0314 08:54:37.119203 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:37 crc kubenswrapper[5129]: I0314 08:54:37.631842 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6b78d4f6dc-hl89h"] Mar 14 08:54:38 crc kubenswrapper[5129]: I0314 08:54:38.599834 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6b78d4f6dc-hl89h" event={"ID":"59ccb607-3439-4284-b171-93d68e3ee432","Type":"ContainerStarted","Data":"8b2d0bf9ed6b49d2f503cbb9c3ecf09b2b5121506524fab49af7a855bd5b1544"} Mar 14 08:54:38 crc kubenswrapper[5129]: I0314 08:54:38.600403 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6b78d4f6dc-hl89h" event={"ID":"59ccb607-3439-4284-b171-93d68e3ee432","Type":"ContainerStarted","Data":"391474929c7c6c9a345c0534191f02b31134c069be880d7f256067281be68e1a"} Mar 14 08:54:38 crc kubenswrapper[5129]: I0314 08:54:38.600430 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:54:38 crc kubenswrapper[5129]: I0314 08:54:38.629109 5129 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/keystone-6b78d4f6dc-hl89h" podStartSLOduration=2.629074628 podStartE2EDuration="2.629074628s" podCreationTimestamp="2026-03-14 08:54:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:54:38.62208911 +0000 UTC m=+6941.374004304" watchObservedRunningTime="2026-03-14 08:54:38.629074628 +0000 UTC m=+6941.380989812" Mar 14 08:54:49 crc kubenswrapper[5129]: I0314 08:54:49.575147 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:54:49 crc kubenswrapper[5129]: I0314 08:54:49.576112 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:54:49 crc kubenswrapper[5129]: I0314 08:54:49.576286 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 08:54:49 crc kubenswrapper[5129]: I0314 08:54:49.578391 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"08fb3f0642680c2408576300f89e73c1a127607f27f00eed4c184f6d6d9f3720"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:54:49 crc kubenswrapper[5129]: I0314 08:54:49.578555 5129 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://08fb3f0642680c2408576300f89e73c1a127607f27f00eed4c184f6d6d9f3720" gracePeriod=600 Mar 14 08:54:50 crc kubenswrapper[5129]: I0314 08:54:50.730784 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="08fb3f0642680c2408576300f89e73c1a127607f27f00eed4c184f6d6d9f3720" exitCode=0 Mar 14 08:54:50 crc kubenswrapper[5129]: I0314 08:54:50.730935 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"08fb3f0642680c2408576300f89e73c1a127607f27f00eed4c184f6d6d9f3720"} Mar 14 08:54:50 crc kubenswrapper[5129]: I0314 08:54:50.731732 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d"} Mar 14 08:54:50 crc kubenswrapper[5129]: I0314 08:54:50.731811 5129 scope.go:117] "RemoveContainer" containerID="436e315030bbeab719847d430a994aa2accd65b3ebcbc216c5f15c633e06ef20" Mar 14 08:55:08 crc kubenswrapper[5129]: I0314 08:55:08.802130 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6b78d4f6dc-hl89h" Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.661710 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.664272 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.668194 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.668192 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.668325 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-l2cx5" Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.683366 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.781000 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/092fbd6e-d672-4e5b-8513-ac36f7b7615d-openstack-config-secret\") pod \"openstackclient\" (UID: \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\") " pod="openstack/openstackclient" Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.781677 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzghg\" (UniqueName: \"kubernetes.io/projected/092fbd6e-d672-4e5b-8513-ac36f7b7615d-kube-api-access-lzghg\") pod \"openstackclient\" (UID: \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\") " pod="openstack/openstackclient" Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.781848 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092fbd6e-d672-4e5b-8513-ac36f7b7615d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\") " pod="openstack/openstackclient" Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.782248 5129 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/092fbd6e-d672-4e5b-8513-ac36f7b7615d-openstack-config\") pod \"openstackclient\" (UID: \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\") " pod="openstack/openstackclient" Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.884567 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/092fbd6e-d672-4e5b-8513-ac36f7b7615d-openstack-config-secret\") pod \"openstackclient\" (UID: \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\") " pod="openstack/openstackclient" Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.884704 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzghg\" (UniqueName: \"kubernetes.io/projected/092fbd6e-d672-4e5b-8513-ac36f7b7615d-kube-api-access-lzghg\") pod \"openstackclient\" (UID: \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\") " pod="openstack/openstackclient" Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.884730 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092fbd6e-d672-4e5b-8513-ac36f7b7615d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\") " pod="openstack/openstackclient" Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.884810 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/092fbd6e-d672-4e5b-8513-ac36f7b7615d-openstack-config\") pod \"openstackclient\" (UID: \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\") " pod="openstack/openstackclient" Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.885813 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/092fbd6e-d672-4e5b-8513-ac36f7b7615d-openstack-config\") pod \"openstackclient\" (UID: \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\") " pod="openstack/openstackclient" Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.894291 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/092fbd6e-d672-4e5b-8513-ac36f7b7615d-openstack-config-secret\") pod \"openstackclient\" (UID: \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\") " pod="openstack/openstackclient" Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.898492 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092fbd6e-d672-4e5b-8513-ac36f7b7615d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\") " pod="openstack/openstackclient" Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.911695 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzghg\" (UniqueName: \"kubernetes.io/projected/092fbd6e-d672-4e5b-8513-ac36f7b7615d-kube-api-access-lzghg\") pod \"openstackclient\" (UID: \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\") " pod="openstack/openstackclient" Mar 14 08:55:13 crc kubenswrapper[5129]: I0314 08:55:13.990917 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 14 08:55:14 crc kubenswrapper[5129]: I0314 08:55:14.507762 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 14 08:55:14 crc kubenswrapper[5129]: I0314 08:55:14.518088 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:55:14 crc kubenswrapper[5129]: I0314 08:55:14.998838 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"092fbd6e-d672-4e5b-8513-ac36f7b7615d","Type":"ContainerStarted","Data":"d5318d614a748ada90e6950c22f04e654e21a607b97012d1d0261b9731bc0b96"} Mar 14 08:55:26 crc kubenswrapper[5129]: I0314 08:55:26.145034 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"092fbd6e-d672-4e5b-8513-ac36f7b7615d","Type":"ContainerStarted","Data":"8d34aebd949e9603448c825fba99cca36d7eeee59c17c94c872162391000965e"} Mar 14 08:55:26 crc kubenswrapper[5129]: I0314 08:55:26.176505 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.236889629 podStartE2EDuration="13.176480832s" podCreationTimestamp="2026-03-14 08:55:13 +0000 UTC" firstStartedPulling="2026-03-14 08:55:14.517840019 +0000 UTC m=+6977.269755193" lastFinishedPulling="2026-03-14 08:55:25.457431222 +0000 UTC m=+6988.209346396" observedRunningTime="2026-03-14 08:55:26.167481229 +0000 UTC m=+6988.919396423" watchObservedRunningTime="2026-03-14 08:55:26.176480832 +0000 UTC m=+6988.928396026" Mar 14 08:56:00 crc kubenswrapper[5129]: I0314 08:56:00.156966 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557976-rdx55"] Mar 14 08:56:00 crc kubenswrapper[5129]: I0314 08:56:00.159405 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557976-rdx55" Mar 14 08:56:00 crc kubenswrapper[5129]: I0314 08:56:00.163381 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:56:00 crc kubenswrapper[5129]: I0314 08:56:00.163764 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:56:00 crc kubenswrapper[5129]: I0314 08:56:00.163929 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:56:00 crc kubenswrapper[5129]: I0314 08:56:00.171770 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557976-rdx55"] Mar 14 08:56:00 crc kubenswrapper[5129]: I0314 08:56:00.241862 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj8nk\" (UniqueName: \"kubernetes.io/projected/eefd94af-b3c6-42db-9fdd-186870c8a943-kube-api-access-dj8nk\") pod \"auto-csr-approver-29557976-rdx55\" (UID: \"eefd94af-b3c6-42db-9fdd-186870c8a943\") " pod="openshift-infra/auto-csr-approver-29557976-rdx55" Mar 14 08:56:00 crc kubenswrapper[5129]: I0314 08:56:00.343641 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj8nk\" (UniqueName: \"kubernetes.io/projected/eefd94af-b3c6-42db-9fdd-186870c8a943-kube-api-access-dj8nk\") pod \"auto-csr-approver-29557976-rdx55\" (UID: \"eefd94af-b3c6-42db-9fdd-186870c8a943\") " pod="openshift-infra/auto-csr-approver-29557976-rdx55" Mar 14 08:56:00 crc kubenswrapper[5129]: I0314 08:56:00.369413 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj8nk\" (UniqueName: \"kubernetes.io/projected/eefd94af-b3c6-42db-9fdd-186870c8a943-kube-api-access-dj8nk\") pod \"auto-csr-approver-29557976-rdx55\" (UID: \"eefd94af-b3c6-42db-9fdd-186870c8a943\") " 
pod="openshift-infra/auto-csr-approver-29557976-rdx55" Mar 14 08:56:00 crc kubenswrapper[5129]: I0314 08:56:00.481520 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557976-rdx55" Mar 14 08:56:00 crc kubenswrapper[5129]: I0314 08:56:00.963681 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557976-rdx55"] Mar 14 08:56:01 crc kubenswrapper[5129]: I0314 08:56:01.559394 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557976-rdx55" event={"ID":"eefd94af-b3c6-42db-9fdd-186870c8a943","Type":"ContainerStarted","Data":"eb04c83ac41b6dc3b0f547fdb47729ec476953b1ec5370e623f801baa5a61377"} Mar 14 08:56:02 crc kubenswrapper[5129]: I0314 08:56:02.579002 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557976-rdx55" event={"ID":"eefd94af-b3c6-42db-9fdd-186870c8a943","Type":"ContainerStarted","Data":"f67b6b39b3736a02fcef7bb8a9cb3062492b089bba26db5fd87f34580a741cbe"} Mar 14 08:56:02 crc kubenswrapper[5129]: I0314 08:56:02.609183 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557976-rdx55" podStartSLOduration=1.512835787 podStartE2EDuration="2.609155948s" podCreationTimestamp="2026-03-14 08:56:00 +0000 UTC" firstStartedPulling="2026-03-14 08:56:00.972906429 +0000 UTC m=+7023.724821613" lastFinishedPulling="2026-03-14 08:56:02.06922658 +0000 UTC m=+7024.821141774" observedRunningTime="2026-03-14 08:56:02.597065591 +0000 UTC m=+7025.348980775" watchObservedRunningTime="2026-03-14 08:56:02.609155948 +0000 UTC m=+7025.361071132" Mar 14 08:56:03 crc kubenswrapper[5129]: I0314 08:56:03.589524 5129 generic.go:334] "Generic (PLEG): container finished" podID="eefd94af-b3c6-42db-9fdd-186870c8a943" containerID="f67b6b39b3736a02fcef7bb8a9cb3062492b089bba26db5fd87f34580a741cbe" exitCode=0 Mar 14 08:56:03 crc 
kubenswrapper[5129]: I0314 08:56:03.589714 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557976-rdx55" event={"ID":"eefd94af-b3c6-42db-9fdd-186870c8a943","Type":"ContainerDied","Data":"f67b6b39b3736a02fcef7bb8a9cb3062492b089bba26db5fd87f34580a741cbe"} Mar 14 08:56:05 crc kubenswrapper[5129]: I0314 08:56:05.028729 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557976-rdx55" Mar 14 08:56:05 crc kubenswrapper[5129]: I0314 08:56:05.157091 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj8nk\" (UniqueName: \"kubernetes.io/projected/eefd94af-b3c6-42db-9fdd-186870c8a943-kube-api-access-dj8nk\") pod \"eefd94af-b3c6-42db-9fdd-186870c8a943\" (UID: \"eefd94af-b3c6-42db-9fdd-186870c8a943\") " Mar 14 08:56:05 crc kubenswrapper[5129]: I0314 08:56:05.163134 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eefd94af-b3c6-42db-9fdd-186870c8a943-kube-api-access-dj8nk" (OuterVolumeSpecName: "kube-api-access-dj8nk") pod "eefd94af-b3c6-42db-9fdd-186870c8a943" (UID: "eefd94af-b3c6-42db-9fdd-186870c8a943"). InnerVolumeSpecName "kube-api-access-dj8nk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:56:05 crc kubenswrapper[5129]: I0314 08:56:05.260362 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj8nk\" (UniqueName: \"kubernetes.io/projected/eefd94af-b3c6-42db-9fdd-186870c8a943-kube-api-access-dj8nk\") on node \"crc\" DevicePath \"\"" Mar 14 08:56:05 crc kubenswrapper[5129]: I0314 08:56:05.610978 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557976-rdx55" event={"ID":"eefd94af-b3c6-42db-9fdd-186870c8a943","Type":"ContainerDied","Data":"eb04c83ac41b6dc3b0f547fdb47729ec476953b1ec5370e623f801baa5a61377"} Mar 14 08:56:05 crc kubenswrapper[5129]: I0314 08:56:05.611026 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb04c83ac41b6dc3b0f547fdb47729ec476953b1ec5370e623f801baa5a61377" Mar 14 08:56:05 crc kubenswrapper[5129]: I0314 08:56:05.611151 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557976-rdx55" Mar 14 08:56:05 crc kubenswrapper[5129]: I0314 08:56:05.669124 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557970-xr5ck"] Mar 14 08:56:05 crc kubenswrapper[5129]: I0314 08:56:05.675474 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557970-xr5ck"] Mar 14 08:56:06 crc kubenswrapper[5129]: I0314 08:56:06.053831 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e66089f6-5a01-4d3a-b32c-155de9e2398b" path="/var/lib/kubelet/pods/e66089f6-5a01-4d3a-b32c-155de9e2398b/volumes" Mar 14 08:56:27 crc kubenswrapper[5129]: I0314 08:56:27.716014 5129 scope.go:117] "RemoveContainer" containerID="97ff25cc7053f5b7d0f960e6e69c338ba100db06b017e4e29d9b56e222cc7653" Mar 14 08:56:46 crc kubenswrapper[5129]: E0314 08:56:46.080176 5129 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 
38.102.83.113:55218->38.102.83.113:34059: read tcp 38.102.83.113:55218->38.102.83.113:34059: read: connection reset by peer Mar 14 08:56:49 crc kubenswrapper[5129]: I0314 08:56:49.574754 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:56:49 crc kubenswrapper[5129]: I0314 08:56:49.575436 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.613457 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-xb87m"] Mar 14 08:56:52 crc kubenswrapper[5129]: E0314 08:56:52.615576 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eefd94af-b3c6-42db-9fdd-186870c8a943" containerName="oc" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.615623 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="eefd94af-b3c6-42db-9fdd-186870c8a943" containerName="oc" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.615988 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="eefd94af-b3c6-42db-9fdd-186870c8a943" containerName="oc" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.616875 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-xb87m" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.624776 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2l9b\" (UniqueName: \"kubernetes.io/projected/117258eb-cadd-4d21-b2fe-6b2901131e5f-kube-api-access-n2l9b\") pod \"neutron-db-create-xb87m\" (UID: \"117258eb-cadd-4d21-b2fe-6b2901131e5f\") " pod="openstack/neutron-db-create-xb87m" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.624853 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/117258eb-cadd-4d21-b2fe-6b2901131e5f-operator-scripts\") pod \"neutron-db-create-xb87m\" (UID: \"117258eb-cadd-4d21-b2fe-6b2901131e5f\") " pod="openstack/neutron-db-create-xb87m" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.644992 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xb87m"] Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.709061 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8475-account-create-update-nq4b8"] Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.710266 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8475-account-create-update-nq4b8" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.715086 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.726135 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c95dn\" (UniqueName: \"kubernetes.io/projected/da2020c4-b598-4f6d-9867-337f481bab41-kube-api-access-c95dn\") pod \"neutron-8475-account-create-update-nq4b8\" (UID: \"da2020c4-b598-4f6d-9867-337f481bab41\") " pod="openstack/neutron-8475-account-create-update-nq4b8" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.726215 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da2020c4-b598-4f6d-9867-337f481bab41-operator-scripts\") pod \"neutron-8475-account-create-update-nq4b8\" (UID: \"da2020c4-b598-4f6d-9867-337f481bab41\") " pod="openstack/neutron-8475-account-create-update-nq4b8" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.726245 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2l9b\" (UniqueName: \"kubernetes.io/projected/117258eb-cadd-4d21-b2fe-6b2901131e5f-kube-api-access-n2l9b\") pod \"neutron-db-create-xb87m\" (UID: \"117258eb-cadd-4d21-b2fe-6b2901131e5f\") " pod="openstack/neutron-db-create-xb87m" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.726281 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/117258eb-cadd-4d21-b2fe-6b2901131e5f-operator-scripts\") pod \"neutron-db-create-xb87m\" (UID: \"117258eb-cadd-4d21-b2fe-6b2901131e5f\") " pod="openstack/neutron-db-create-xb87m" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.727267 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/117258eb-cadd-4d21-b2fe-6b2901131e5f-operator-scripts\") pod \"neutron-db-create-xb87m\" (UID: \"117258eb-cadd-4d21-b2fe-6b2901131e5f\") " pod="openstack/neutron-db-create-xb87m" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.729109 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8475-account-create-update-nq4b8"] Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.755105 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2l9b\" (UniqueName: \"kubernetes.io/projected/117258eb-cadd-4d21-b2fe-6b2901131e5f-kube-api-access-n2l9b\") pod \"neutron-db-create-xb87m\" (UID: \"117258eb-cadd-4d21-b2fe-6b2901131e5f\") " pod="openstack/neutron-db-create-xb87m" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.827904 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da2020c4-b598-4f6d-9867-337f481bab41-operator-scripts\") pod \"neutron-8475-account-create-update-nq4b8\" (UID: \"da2020c4-b598-4f6d-9867-337f481bab41\") " pod="openstack/neutron-8475-account-create-update-nq4b8" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.828059 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c95dn\" (UniqueName: \"kubernetes.io/projected/da2020c4-b598-4f6d-9867-337f481bab41-kube-api-access-c95dn\") pod \"neutron-8475-account-create-update-nq4b8\" (UID: \"da2020c4-b598-4f6d-9867-337f481bab41\") " pod="openstack/neutron-8475-account-create-update-nq4b8" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.828652 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da2020c4-b598-4f6d-9867-337f481bab41-operator-scripts\") pod \"neutron-8475-account-create-update-nq4b8\" (UID: 
\"da2020c4-b598-4f6d-9867-337f481bab41\") " pod="openstack/neutron-8475-account-create-update-nq4b8" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.849885 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c95dn\" (UniqueName: \"kubernetes.io/projected/da2020c4-b598-4f6d-9867-337f481bab41-kube-api-access-c95dn\") pod \"neutron-8475-account-create-update-nq4b8\" (UID: \"da2020c4-b598-4f6d-9867-337f481bab41\") " pod="openstack/neutron-8475-account-create-update-nq4b8" Mar 14 08:56:52 crc kubenswrapper[5129]: I0314 08:56:52.948441 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xb87m" Mar 14 08:56:53 crc kubenswrapper[5129]: I0314 08:56:53.034162 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8475-account-create-update-nq4b8" Mar 14 08:56:53 crc kubenswrapper[5129]: I0314 08:56:53.385328 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xb87m"] Mar 14 08:56:53 crc kubenswrapper[5129]: I0314 08:56:53.501362 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8475-account-create-update-nq4b8"] Mar 14 08:56:53 crc kubenswrapper[5129]: W0314 08:56:53.507062 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda2020c4_b598_4f6d_9867_337f481bab41.slice/crio-9a8d82015a1c19d67f37fa0097bd04001537ae71bb9d50aab3527e0775994486 WatchSource:0}: Error finding container 9a8d82015a1c19d67f37fa0097bd04001537ae71bb9d50aab3527e0775994486: Status 404 returned error can't find the container with id 9a8d82015a1c19d67f37fa0097bd04001537ae71bb9d50aab3527e0775994486 Mar 14 08:56:54 crc kubenswrapper[5129]: I0314 08:56:54.095212 5129 generic.go:334] "Generic (PLEG): container finished" podID="117258eb-cadd-4d21-b2fe-6b2901131e5f" 
containerID="3654335946cffa3fff6a4ba9084ce2f1a20dafab44a3e605e0cb811b09e860cd" exitCode=0 Mar 14 08:56:54 crc kubenswrapper[5129]: I0314 08:56:54.095205 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xb87m" event={"ID":"117258eb-cadd-4d21-b2fe-6b2901131e5f","Type":"ContainerDied","Data":"3654335946cffa3fff6a4ba9084ce2f1a20dafab44a3e605e0cb811b09e860cd"} Mar 14 08:56:54 crc kubenswrapper[5129]: I0314 08:56:54.095494 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xb87m" event={"ID":"117258eb-cadd-4d21-b2fe-6b2901131e5f","Type":"ContainerStarted","Data":"5294d5a35f0f0342f91d9bc3c782c5a2bc261e5edb22011444277e74f22d3da7"} Mar 14 08:56:54 crc kubenswrapper[5129]: I0314 08:56:54.097489 5129 generic.go:334] "Generic (PLEG): container finished" podID="da2020c4-b598-4f6d-9867-337f481bab41" containerID="d084c43162e9879770e569b167ca69ab6a90086fc13f85f53cac8b93f0e4bc80" exitCode=0 Mar 14 08:56:54 crc kubenswrapper[5129]: I0314 08:56:54.097519 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8475-account-create-update-nq4b8" event={"ID":"da2020c4-b598-4f6d-9867-337f481bab41","Type":"ContainerDied","Data":"d084c43162e9879770e569b167ca69ab6a90086fc13f85f53cac8b93f0e4bc80"} Mar 14 08:56:54 crc kubenswrapper[5129]: I0314 08:56:54.097537 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8475-account-create-update-nq4b8" event={"ID":"da2020c4-b598-4f6d-9867-337f481bab41","Type":"ContainerStarted","Data":"9a8d82015a1c19d67f37fa0097bd04001537ae71bb9d50aab3527e0775994486"} Mar 14 08:56:55 crc kubenswrapper[5129]: I0314 08:56:55.557729 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8475-account-create-update-nq4b8" Mar 14 08:56:55 crc kubenswrapper[5129]: I0314 08:56:55.565853 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-xb87m" Mar 14 08:56:55 crc kubenswrapper[5129]: I0314 08:56:55.590110 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c95dn\" (UniqueName: \"kubernetes.io/projected/da2020c4-b598-4f6d-9867-337f481bab41-kube-api-access-c95dn\") pod \"da2020c4-b598-4f6d-9867-337f481bab41\" (UID: \"da2020c4-b598-4f6d-9867-337f481bab41\") " Mar 14 08:56:55 crc kubenswrapper[5129]: I0314 08:56:55.590471 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/117258eb-cadd-4d21-b2fe-6b2901131e5f-operator-scripts\") pod \"117258eb-cadd-4d21-b2fe-6b2901131e5f\" (UID: \"117258eb-cadd-4d21-b2fe-6b2901131e5f\") " Mar 14 08:56:55 crc kubenswrapper[5129]: I0314 08:56:55.590510 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da2020c4-b598-4f6d-9867-337f481bab41-operator-scripts\") pod \"da2020c4-b598-4f6d-9867-337f481bab41\" (UID: \"da2020c4-b598-4f6d-9867-337f481bab41\") " Mar 14 08:56:55 crc kubenswrapper[5129]: I0314 08:56:55.590555 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2l9b\" (UniqueName: \"kubernetes.io/projected/117258eb-cadd-4d21-b2fe-6b2901131e5f-kube-api-access-n2l9b\") pod \"117258eb-cadd-4d21-b2fe-6b2901131e5f\" (UID: \"117258eb-cadd-4d21-b2fe-6b2901131e5f\") " Mar 14 08:56:55 crc kubenswrapper[5129]: I0314 08:56:55.591772 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da2020c4-b598-4f6d-9867-337f481bab41-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "da2020c4-b598-4f6d-9867-337f481bab41" (UID: "da2020c4-b598-4f6d-9867-337f481bab41"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:56:55 crc kubenswrapper[5129]: I0314 08:56:55.591914 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/117258eb-cadd-4d21-b2fe-6b2901131e5f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "117258eb-cadd-4d21-b2fe-6b2901131e5f" (UID: "117258eb-cadd-4d21-b2fe-6b2901131e5f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:56:55 crc kubenswrapper[5129]: I0314 08:56:55.600414 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da2020c4-b598-4f6d-9867-337f481bab41-kube-api-access-c95dn" (OuterVolumeSpecName: "kube-api-access-c95dn") pod "da2020c4-b598-4f6d-9867-337f481bab41" (UID: "da2020c4-b598-4f6d-9867-337f481bab41"). InnerVolumeSpecName "kube-api-access-c95dn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:56:55 crc kubenswrapper[5129]: I0314 08:56:55.606359 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/117258eb-cadd-4d21-b2fe-6b2901131e5f-kube-api-access-n2l9b" (OuterVolumeSpecName: "kube-api-access-n2l9b") pod "117258eb-cadd-4d21-b2fe-6b2901131e5f" (UID: "117258eb-cadd-4d21-b2fe-6b2901131e5f"). InnerVolumeSpecName "kube-api-access-n2l9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:56:55 crc kubenswrapper[5129]: I0314 08:56:55.692416 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c95dn\" (UniqueName: \"kubernetes.io/projected/da2020c4-b598-4f6d-9867-337f481bab41-kube-api-access-c95dn\") on node \"crc\" DevicePath \"\"" Mar 14 08:56:55 crc kubenswrapper[5129]: I0314 08:56:55.692492 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/117258eb-cadd-4d21-b2fe-6b2901131e5f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:56:55 crc kubenswrapper[5129]: I0314 08:56:55.692506 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da2020c4-b598-4f6d-9867-337f481bab41-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:56:55 crc kubenswrapper[5129]: I0314 08:56:55.692552 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2l9b\" (UniqueName: \"kubernetes.io/projected/117258eb-cadd-4d21-b2fe-6b2901131e5f-kube-api-access-n2l9b\") on node \"crc\" DevicePath \"\"" Mar 14 08:56:56 crc kubenswrapper[5129]: I0314 08:56:56.127565 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xb87m" event={"ID":"117258eb-cadd-4d21-b2fe-6b2901131e5f","Type":"ContainerDied","Data":"5294d5a35f0f0342f91d9bc3c782c5a2bc261e5edb22011444277e74f22d3da7"} Mar 14 08:56:56 crc kubenswrapper[5129]: I0314 08:56:56.127619 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-xb87m" Mar 14 08:56:56 crc kubenswrapper[5129]: I0314 08:56:56.127641 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5294d5a35f0f0342f91d9bc3c782c5a2bc261e5edb22011444277e74f22d3da7" Mar 14 08:56:56 crc kubenswrapper[5129]: I0314 08:56:56.133633 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8475-account-create-update-nq4b8" event={"ID":"da2020c4-b598-4f6d-9867-337f481bab41","Type":"ContainerDied","Data":"9a8d82015a1c19d67f37fa0097bd04001537ae71bb9d50aab3527e0775994486"} Mar 14 08:56:56 crc kubenswrapper[5129]: I0314 08:56:56.133657 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a8d82015a1c19d67f37fa0097bd04001537ae71bb9d50aab3527e0775994486" Mar 14 08:56:56 crc kubenswrapper[5129]: I0314 08:56:56.133700 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8475-account-create-update-nq4b8" Mar 14 08:56:57 crc kubenswrapper[5129]: I0314 08:56:57.895219 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-prd8k"] Mar 14 08:56:57 crc kubenswrapper[5129]: E0314 08:56:57.896214 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="117258eb-cadd-4d21-b2fe-6b2901131e5f" containerName="mariadb-database-create" Mar 14 08:56:57 crc kubenswrapper[5129]: I0314 08:56:57.896234 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="117258eb-cadd-4d21-b2fe-6b2901131e5f" containerName="mariadb-database-create" Mar 14 08:56:57 crc kubenswrapper[5129]: E0314 08:56:57.896263 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2020c4-b598-4f6d-9867-337f481bab41" containerName="mariadb-account-create-update" Mar 14 08:56:57 crc kubenswrapper[5129]: I0314 08:56:57.896280 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2020c4-b598-4f6d-9867-337f481bab41" 
containerName="mariadb-account-create-update" Mar 14 08:56:57 crc kubenswrapper[5129]: I0314 08:56:57.896541 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="da2020c4-b598-4f6d-9867-337f481bab41" containerName="mariadb-account-create-update" Mar 14 08:56:57 crc kubenswrapper[5129]: I0314 08:56:57.896574 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="117258eb-cadd-4d21-b2fe-6b2901131e5f" containerName="mariadb-database-create" Mar 14 08:56:57 crc kubenswrapper[5129]: I0314 08:56:57.897451 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-prd8k" Mar 14 08:56:57 crc kubenswrapper[5129]: I0314 08:56:57.899950 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 14 08:56:57 crc kubenswrapper[5129]: I0314 08:56:57.900717 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bsbgb" Mar 14 08:56:57 crc kubenswrapper[5129]: I0314 08:56:57.901127 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 14 08:56:57 crc kubenswrapper[5129]: I0314 08:56:57.914556 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-prd8k"] Mar 14 08:56:57 crc kubenswrapper[5129]: I0314 08:56:57.939064 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7fa745c2-ed72-4a7a-8a5a-ee1733246f11-config\") pod \"neutron-db-sync-prd8k\" (UID: \"7fa745c2-ed72-4a7a-8a5a-ee1733246f11\") " pod="openstack/neutron-db-sync-prd8k" Mar 14 08:56:57 crc kubenswrapper[5129]: I0314 08:56:57.939167 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa745c2-ed72-4a7a-8a5a-ee1733246f11-combined-ca-bundle\") pod \"neutron-db-sync-prd8k\" (UID: 
\"7fa745c2-ed72-4a7a-8a5a-ee1733246f11\") " pod="openstack/neutron-db-sync-prd8k" Mar 14 08:56:57 crc kubenswrapper[5129]: I0314 08:56:57.939269 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc7fg\" (UniqueName: \"kubernetes.io/projected/7fa745c2-ed72-4a7a-8a5a-ee1733246f11-kube-api-access-gc7fg\") pod \"neutron-db-sync-prd8k\" (UID: \"7fa745c2-ed72-4a7a-8a5a-ee1733246f11\") " pod="openstack/neutron-db-sync-prd8k" Mar 14 08:56:58 crc kubenswrapper[5129]: I0314 08:56:58.042339 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7fa745c2-ed72-4a7a-8a5a-ee1733246f11-config\") pod \"neutron-db-sync-prd8k\" (UID: \"7fa745c2-ed72-4a7a-8a5a-ee1733246f11\") " pod="openstack/neutron-db-sync-prd8k" Mar 14 08:56:58 crc kubenswrapper[5129]: I0314 08:56:58.042433 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa745c2-ed72-4a7a-8a5a-ee1733246f11-combined-ca-bundle\") pod \"neutron-db-sync-prd8k\" (UID: \"7fa745c2-ed72-4a7a-8a5a-ee1733246f11\") " pod="openstack/neutron-db-sync-prd8k" Mar 14 08:56:58 crc kubenswrapper[5129]: I0314 08:56:58.042488 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc7fg\" (UniqueName: \"kubernetes.io/projected/7fa745c2-ed72-4a7a-8a5a-ee1733246f11-kube-api-access-gc7fg\") pod \"neutron-db-sync-prd8k\" (UID: \"7fa745c2-ed72-4a7a-8a5a-ee1733246f11\") " pod="openstack/neutron-db-sync-prd8k" Mar 14 08:56:58 crc kubenswrapper[5129]: I0314 08:56:58.051816 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa745c2-ed72-4a7a-8a5a-ee1733246f11-combined-ca-bundle\") pod \"neutron-db-sync-prd8k\" (UID: \"7fa745c2-ed72-4a7a-8a5a-ee1733246f11\") " pod="openstack/neutron-db-sync-prd8k" Mar 14 
08:56:58 crc kubenswrapper[5129]: I0314 08:56:58.054947 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7fa745c2-ed72-4a7a-8a5a-ee1733246f11-config\") pod \"neutron-db-sync-prd8k\" (UID: \"7fa745c2-ed72-4a7a-8a5a-ee1733246f11\") " pod="openstack/neutron-db-sync-prd8k" Mar 14 08:56:58 crc kubenswrapper[5129]: I0314 08:56:58.061113 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc7fg\" (UniqueName: \"kubernetes.io/projected/7fa745c2-ed72-4a7a-8a5a-ee1733246f11-kube-api-access-gc7fg\") pod \"neutron-db-sync-prd8k\" (UID: \"7fa745c2-ed72-4a7a-8a5a-ee1733246f11\") " pod="openstack/neutron-db-sync-prd8k" Mar 14 08:56:58 crc kubenswrapper[5129]: I0314 08:56:58.235309 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bsbgb" Mar 14 08:56:58 crc kubenswrapper[5129]: I0314 08:56:58.243623 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-prd8k" Mar 14 08:56:58 crc kubenswrapper[5129]: I0314 08:56:58.767385 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-prd8k"] Mar 14 08:56:59 crc kubenswrapper[5129]: I0314 08:56:59.171435 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-prd8k" event={"ID":"7fa745c2-ed72-4a7a-8a5a-ee1733246f11","Type":"ContainerStarted","Data":"1e0d55ae3db73d445f78dead6c5674087358766ca6767a11c9401d3213a475a7"} Mar 14 08:56:59 crc kubenswrapper[5129]: I0314 08:56:59.171879 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-prd8k" event={"ID":"7fa745c2-ed72-4a7a-8a5a-ee1733246f11","Type":"ContainerStarted","Data":"8265e3ff18a991c10ade2c4e78b3cc88abd46abd485a2dffad2c6367e52738b2"} Mar 14 08:57:04 crc kubenswrapper[5129]: I0314 08:57:04.235779 5129 generic.go:334] "Generic (PLEG): container finished" podID="7fa745c2-ed72-4a7a-8a5a-ee1733246f11" containerID="1e0d55ae3db73d445f78dead6c5674087358766ca6767a11c9401d3213a475a7" exitCode=0 Mar 14 08:57:04 crc kubenswrapper[5129]: I0314 08:57:04.235952 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-prd8k" event={"ID":"7fa745c2-ed72-4a7a-8a5a-ee1733246f11","Type":"ContainerDied","Data":"1e0d55ae3db73d445f78dead6c5674087358766ca6767a11c9401d3213a475a7"} Mar 14 08:57:05 crc kubenswrapper[5129]: I0314 08:57:05.694056 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-prd8k" Mar 14 08:57:05 crc kubenswrapper[5129]: I0314 08:57:05.808629 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa745c2-ed72-4a7a-8a5a-ee1733246f11-combined-ca-bundle\") pod \"7fa745c2-ed72-4a7a-8a5a-ee1733246f11\" (UID: \"7fa745c2-ed72-4a7a-8a5a-ee1733246f11\") " Mar 14 08:57:05 crc kubenswrapper[5129]: I0314 08:57:05.808761 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc7fg\" (UniqueName: \"kubernetes.io/projected/7fa745c2-ed72-4a7a-8a5a-ee1733246f11-kube-api-access-gc7fg\") pod \"7fa745c2-ed72-4a7a-8a5a-ee1733246f11\" (UID: \"7fa745c2-ed72-4a7a-8a5a-ee1733246f11\") " Mar 14 08:57:05 crc kubenswrapper[5129]: I0314 08:57:05.808919 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7fa745c2-ed72-4a7a-8a5a-ee1733246f11-config\") pod \"7fa745c2-ed72-4a7a-8a5a-ee1733246f11\" (UID: \"7fa745c2-ed72-4a7a-8a5a-ee1733246f11\") " Mar 14 08:57:05 crc kubenswrapper[5129]: I0314 08:57:05.817145 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fa745c2-ed72-4a7a-8a5a-ee1733246f11-kube-api-access-gc7fg" (OuterVolumeSpecName: "kube-api-access-gc7fg") pod "7fa745c2-ed72-4a7a-8a5a-ee1733246f11" (UID: "7fa745c2-ed72-4a7a-8a5a-ee1733246f11"). InnerVolumeSpecName "kube-api-access-gc7fg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:05 crc kubenswrapper[5129]: I0314 08:57:05.837672 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa745c2-ed72-4a7a-8a5a-ee1733246f11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fa745c2-ed72-4a7a-8a5a-ee1733246f11" (UID: "7fa745c2-ed72-4a7a-8a5a-ee1733246f11"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:05 crc kubenswrapper[5129]: I0314 08:57:05.842299 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa745c2-ed72-4a7a-8a5a-ee1733246f11-config" (OuterVolumeSpecName: "config") pod "7fa745c2-ed72-4a7a-8a5a-ee1733246f11" (UID: "7fa745c2-ed72-4a7a-8a5a-ee1733246f11"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:05 crc kubenswrapper[5129]: I0314 08:57:05.910866 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7fa745c2-ed72-4a7a-8a5a-ee1733246f11-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:05 crc kubenswrapper[5129]: I0314 08:57:05.910904 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa745c2-ed72-4a7a-8a5a-ee1733246f11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:05 crc kubenswrapper[5129]: I0314 08:57:05.910917 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc7fg\" (UniqueName: \"kubernetes.io/projected/7fa745c2-ed72-4a7a-8a5a-ee1733246f11-kube-api-access-gc7fg\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.260508 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-prd8k" event={"ID":"7fa745c2-ed72-4a7a-8a5a-ee1733246f11","Type":"ContainerDied","Data":"8265e3ff18a991c10ade2c4e78b3cc88abd46abd485a2dffad2c6367e52738b2"} Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.260556 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-prd8k" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.260559 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8265e3ff18a991c10ade2c4e78b3cc88abd46abd485a2dffad2c6367e52738b2" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.538284 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5868879f8c-t66nx"] Mar 14 08:57:06 crc kubenswrapper[5129]: E0314 08:57:06.541978 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa745c2-ed72-4a7a-8a5a-ee1733246f11" containerName="neutron-db-sync" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.542022 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa745c2-ed72-4a7a-8a5a-ee1733246f11" containerName="neutron-db-sync" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.542425 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fa745c2-ed72-4a7a-8a5a-ee1733246f11" containerName="neutron-db-sync" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.543944 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.573358 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5868879f8c-t66nx"] Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.610404 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f7cdfcdd4-848dw"] Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.618232 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.625468 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bsbgb" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.630468 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.630803 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.630964 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.635356 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f7cdfcdd4-848dw"] Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.729662 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5slc5\" (UniqueName: \"kubernetes.io/projected/43757357-92fb-43c6-8b43-05b6b3439600-kube-api-access-5slc5\") pod \"neutron-f7cdfcdd4-848dw\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.729757 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-config\") pod \"dnsmasq-dns-5868879f8c-t66nx\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.729814 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5868879f8c-t66nx\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.729841 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-ovndb-tls-certs\") pod \"neutron-f7cdfcdd4-848dw\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.729869 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g9nd\" (UniqueName: \"kubernetes.io/projected/f96e7ded-86c9-404c-b278-262796e10cfb-kube-api-access-4g9nd\") pod \"dnsmasq-dns-5868879f8c-t66nx\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.729915 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-config\") pod \"neutron-f7cdfcdd4-848dw\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.729954 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-combined-ca-bundle\") pod \"neutron-f7cdfcdd4-848dw\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.730057 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-ovsdbserver-sb\") pod \"dnsmasq-dns-5868879f8c-t66nx\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.730097 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-dns-svc\") pod \"dnsmasq-dns-5868879f8c-t66nx\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.730133 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-httpd-config\") pod \"neutron-f7cdfcdd4-848dw\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.832277 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-ovsdbserver-sb\") pod \"dnsmasq-dns-5868879f8c-t66nx\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.832418 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-dns-svc\") pod \"dnsmasq-dns-5868879f8c-t66nx\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.832459 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-httpd-config\") pod \"neutron-f7cdfcdd4-848dw\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.832523 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5slc5\" (UniqueName: \"kubernetes.io/projected/43757357-92fb-43c6-8b43-05b6b3439600-kube-api-access-5slc5\") pod \"neutron-f7cdfcdd4-848dw\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.832581 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-config\") pod \"dnsmasq-dns-5868879f8c-t66nx\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.832643 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-ovsdbserver-nb\") pod \"dnsmasq-dns-5868879f8c-t66nx\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.832664 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-ovndb-tls-certs\") pod \"neutron-f7cdfcdd4-848dw\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.832691 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g9nd\" (UniqueName: 
\"kubernetes.io/projected/f96e7ded-86c9-404c-b278-262796e10cfb-kube-api-access-4g9nd\") pod \"dnsmasq-dns-5868879f8c-t66nx\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.832730 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-config\") pod \"neutron-f7cdfcdd4-848dw\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.832763 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-combined-ca-bundle\") pod \"neutron-f7cdfcdd4-848dw\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.834692 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-config\") pod \"dnsmasq-dns-5868879f8c-t66nx\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.834718 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-ovsdbserver-sb\") pod \"dnsmasq-dns-5868879f8c-t66nx\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.834824 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-dns-svc\") pod \"dnsmasq-dns-5868879f8c-t66nx\" (UID: 
\"f96e7ded-86c9-404c-b278-262796e10cfb\") " pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.835952 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-ovsdbserver-nb\") pod \"dnsmasq-dns-5868879f8c-t66nx\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.841012 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-httpd-config\") pod \"neutron-f7cdfcdd4-848dw\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.842435 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-combined-ca-bundle\") pod \"neutron-f7cdfcdd4-848dw\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.843402 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-config\") pod \"neutron-f7cdfcdd4-848dw\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.853542 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-ovndb-tls-certs\") pod \"neutron-f7cdfcdd4-848dw\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.854194 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5slc5\" (UniqueName: \"kubernetes.io/projected/43757357-92fb-43c6-8b43-05b6b3439600-kube-api-access-5slc5\") pod \"neutron-f7cdfcdd4-848dw\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.862573 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g9nd\" (UniqueName: \"kubernetes.io/projected/f96e7ded-86c9-404c-b278-262796e10cfb-kube-api-access-4g9nd\") pod \"dnsmasq-dns-5868879f8c-t66nx\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.870404 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:06 crc kubenswrapper[5129]: I0314 08:57:06.942471 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:07 crc kubenswrapper[5129]: I0314 08:57:07.451447 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5868879f8c-t66nx"] Mar 14 08:57:07 crc kubenswrapper[5129]: I0314 08:57:07.590644 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f7cdfcdd4-848dw"] Mar 14 08:57:07 crc kubenswrapper[5129]: W0314 08:57:07.609691 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43757357_92fb_43c6_8b43_05b6b3439600.slice/crio-c04994c050377519c021c6abbadfc6414e2e317b0e49f694f443253f21ce72d7 WatchSource:0}: Error finding container c04994c050377519c021c6abbadfc6414e2e317b0e49f694f443253f21ce72d7: Status 404 returned error can't find the container with id c04994c050377519c021c6abbadfc6414e2e317b0e49f694f443253f21ce72d7 Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.288948 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f7cdfcdd4-848dw" event={"ID":"43757357-92fb-43c6-8b43-05b6b3439600","Type":"ContainerStarted","Data":"62899c3d8c7646448789d2141a3f7265377cc859e59aadf403207eef5a44ac36"} Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.289340 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f7cdfcdd4-848dw" event={"ID":"43757357-92fb-43c6-8b43-05b6b3439600","Type":"ContainerStarted","Data":"6912105d25d76f0d8456fa3cb6756ccc5eaf3da189333ed5a3afbda9df593a21"} Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.289355 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f7cdfcdd4-848dw" event={"ID":"43757357-92fb-43c6-8b43-05b6b3439600","Type":"ContainerStarted","Data":"c04994c050377519c021c6abbadfc6414e2e317b0e49f694f443253f21ce72d7"} Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.289373 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.290795 5129 generic.go:334] "Generic (PLEG): container finished" podID="f96e7ded-86c9-404c-b278-262796e10cfb" containerID="98afaca0e50c7c3d0a4a7f87d926966c129d5304443e2f3e95664c2142c20f99" exitCode=0 Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.290975 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5868879f8c-t66nx" event={"ID":"f96e7ded-86c9-404c-b278-262796e10cfb","Type":"ContainerDied","Data":"98afaca0e50c7c3d0a4a7f87d926966c129d5304443e2f3e95664c2142c20f99"} Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.291079 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5868879f8c-t66nx" event={"ID":"f96e7ded-86c9-404c-b278-262796e10cfb","Type":"ContainerStarted","Data":"9f7b0a2c0d2f92d6cd5565f604d6cc5c42fefde1aade29d360cbd3803aa04c08"} Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.330349 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f7cdfcdd4-848dw" podStartSLOduration=2.330307023 podStartE2EDuration="2.330307023s" podCreationTimestamp="2026-03-14 08:57:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:57:08.311579567 +0000 UTC m=+7091.063494751" watchObservedRunningTime="2026-03-14 08:57:08.330307023 +0000 UTC m=+7091.082222217" Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.905395 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-76f68cc57f-g29rv"] Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.906863 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.911502 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.911502 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.936032 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76f68cc57f-g29rv"] Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.994159 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/764337bd-5689-4897-82aa-cc5d7e55e39f-internal-tls-certs\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.994347 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/764337bd-5689-4897-82aa-cc5d7e55e39f-config\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.994504 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/764337bd-5689-4897-82aa-cc5d7e55e39f-public-tls-certs\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.994682 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/764337bd-5689-4897-82aa-cc5d7e55e39f-combined-ca-bundle\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.994760 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lhnq\" (UniqueName: \"kubernetes.io/projected/764337bd-5689-4897-82aa-cc5d7e55e39f-kube-api-access-7lhnq\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.994805 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/764337bd-5689-4897-82aa-cc5d7e55e39f-ovndb-tls-certs\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:08 crc kubenswrapper[5129]: I0314 08:57:08.994846 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/764337bd-5689-4897-82aa-cc5d7e55e39f-httpd-config\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:09 crc kubenswrapper[5129]: I0314 08:57:09.096763 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/764337bd-5689-4897-82aa-cc5d7e55e39f-internal-tls-certs\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:09 crc kubenswrapper[5129]: I0314 08:57:09.096845 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/764337bd-5689-4897-82aa-cc5d7e55e39f-config\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:09 crc kubenswrapper[5129]: I0314 08:57:09.096881 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/764337bd-5689-4897-82aa-cc5d7e55e39f-public-tls-certs\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:09 crc kubenswrapper[5129]: I0314 08:57:09.096925 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/764337bd-5689-4897-82aa-cc5d7e55e39f-combined-ca-bundle\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:09 crc kubenswrapper[5129]: I0314 08:57:09.096983 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lhnq\" (UniqueName: \"kubernetes.io/projected/764337bd-5689-4897-82aa-cc5d7e55e39f-kube-api-access-7lhnq\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:09 crc kubenswrapper[5129]: I0314 08:57:09.097022 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/764337bd-5689-4897-82aa-cc5d7e55e39f-ovndb-tls-certs\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:09 crc kubenswrapper[5129]: I0314 08:57:09.097062 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/764337bd-5689-4897-82aa-cc5d7e55e39f-httpd-config\") pod 
\"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:09 crc kubenswrapper[5129]: I0314 08:57:09.106338 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/764337bd-5689-4897-82aa-cc5d7e55e39f-combined-ca-bundle\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:09 crc kubenswrapper[5129]: I0314 08:57:09.108304 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/764337bd-5689-4897-82aa-cc5d7e55e39f-ovndb-tls-certs\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:09 crc kubenswrapper[5129]: I0314 08:57:09.113343 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/764337bd-5689-4897-82aa-cc5d7e55e39f-httpd-config\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:09 crc kubenswrapper[5129]: I0314 08:57:09.119351 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/764337bd-5689-4897-82aa-cc5d7e55e39f-public-tls-certs\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:09 crc kubenswrapper[5129]: I0314 08:57:09.120234 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/764337bd-5689-4897-82aa-cc5d7e55e39f-internal-tls-certs\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 
08:57:09 crc kubenswrapper[5129]: I0314 08:57:09.135495 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/764337bd-5689-4897-82aa-cc5d7e55e39f-config\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:09 crc kubenswrapper[5129]: I0314 08:57:09.181346 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lhnq\" (UniqueName: \"kubernetes.io/projected/764337bd-5689-4897-82aa-cc5d7e55e39f-kube-api-access-7lhnq\") pod \"neutron-76f68cc57f-g29rv\" (UID: \"764337bd-5689-4897-82aa-cc5d7e55e39f\") " pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:09 crc kubenswrapper[5129]: I0314 08:57:09.228875 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:09 crc kubenswrapper[5129]: I0314 08:57:09.317674 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5868879f8c-t66nx" event={"ID":"f96e7ded-86c9-404c-b278-262796e10cfb","Type":"ContainerStarted","Data":"f0f001c9284d292c7e01f02618ac8ee6d3bb86384f98bc58274fce992dcbe2f0"} Mar 14 08:57:09 crc kubenswrapper[5129]: I0314 08:57:09.317787 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:09 crc kubenswrapper[5129]: I0314 08:57:09.357531 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5868879f8c-t66nx" podStartSLOduration=3.357511126 podStartE2EDuration="3.357511126s" podCreationTimestamp="2026-03-14 08:57:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:57:09.345948833 +0000 UTC m=+7092.097864017" watchObservedRunningTime="2026-03-14 08:57:09.357511126 +0000 UTC m=+7092.109426310" Mar 14 08:57:09 crc 
kubenswrapper[5129]: I0314 08:57:09.898723 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76f68cc57f-g29rv"] Mar 14 08:57:09 crc kubenswrapper[5129]: W0314 08:57:09.904752 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod764337bd_5689_4897_82aa_cc5d7e55e39f.slice/crio-eac32ef6295ca3481fe4ea6d5188d7541e7e78a6e4db701d515c834ec09fe755 WatchSource:0}: Error finding container eac32ef6295ca3481fe4ea6d5188d7541e7e78a6e4db701d515c834ec09fe755: Status 404 returned error can't find the container with id eac32ef6295ca3481fe4ea6d5188d7541e7e78a6e4db701d515c834ec09fe755 Mar 14 08:57:10 crc kubenswrapper[5129]: I0314 08:57:10.327519 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f68cc57f-g29rv" event={"ID":"764337bd-5689-4897-82aa-cc5d7e55e39f","Type":"ContainerStarted","Data":"10accb8477b47e3d1abc383746af5feb9105069f40c7861e19f345255dc8606f"} Mar 14 08:57:10 crc kubenswrapper[5129]: I0314 08:57:10.327836 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f68cc57f-g29rv" event={"ID":"764337bd-5689-4897-82aa-cc5d7e55e39f","Type":"ContainerStarted","Data":"eac32ef6295ca3481fe4ea6d5188d7541e7e78a6e4db701d515c834ec09fe755"} Mar 14 08:57:11 crc kubenswrapper[5129]: I0314 08:57:11.348984 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f68cc57f-g29rv" event={"ID":"764337bd-5689-4897-82aa-cc5d7e55e39f","Type":"ContainerStarted","Data":"31f4c1ef8467125c5cda4813b07ad3601a0093f22440402d47bc110f2891d436"} Mar 14 08:57:11 crc kubenswrapper[5129]: I0314 08:57:11.349675 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:11 crc kubenswrapper[5129]: I0314 08:57:11.395516 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-76f68cc57f-g29rv" podStartSLOduration=3.395482147 
podStartE2EDuration="3.395482147s" podCreationTimestamp="2026-03-14 08:57:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:57:11.385240589 +0000 UTC m=+7094.137155913" watchObservedRunningTime="2026-03-14 08:57:11.395482147 +0000 UTC m=+7094.147397361" Mar 14 08:57:16 crc kubenswrapper[5129]: I0314 08:57:16.871921 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:57:16 crc kubenswrapper[5129]: I0314 08:57:16.934182 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dcc7fd9c5-xx776"] Mar 14 08:57:16 crc kubenswrapper[5129]: I0314 08:57:16.934459 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" podUID="31d6fa6c-d1e7-47d2-9f15-b482e6d61975" containerName="dnsmasq-dns" containerID="cri-o://e52c828c4c947bb847c88584cc67f0e447afabeb1af44ca5cdd3e1c6a5960b72" gracePeriod=10 Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.406523 5129 generic.go:334] "Generic (PLEG): container finished" podID="31d6fa6c-d1e7-47d2-9f15-b482e6d61975" containerID="e52c828c4c947bb847c88584cc67f0e447afabeb1af44ca5cdd3e1c6a5960b72" exitCode=0 Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.406617 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" event={"ID":"31d6fa6c-d1e7-47d2-9f15-b482e6d61975","Type":"ContainerDied","Data":"e52c828c4c947bb847c88584cc67f0e447afabeb1af44ca5cdd3e1c6a5960b72"} Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.407071 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" event={"ID":"31d6fa6c-d1e7-47d2-9f15-b482e6d61975","Type":"ContainerDied","Data":"bb3a36c5e1edb7338ab5f3a49259c90275b6cfad6d204227da5697c125c78f56"} Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 
08:57:17.407096 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb3a36c5e1edb7338ab5f3a49259c90275b6cfad6d204227da5697c125c78f56" Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.446497 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.609501 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-dns-svc\") pod \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.609795 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-config\") pod \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.609972 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5nfk\" (UniqueName: \"kubernetes.io/projected/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-kube-api-access-l5nfk\") pod \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.610038 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-ovsdbserver-sb\") pod \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.610145 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-ovsdbserver-nb\") pod \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\" (UID: \"31d6fa6c-d1e7-47d2-9f15-b482e6d61975\") " Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.628906 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-kube-api-access-l5nfk" (OuterVolumeSpecName: "kube-api-access-l5nfk") pod "31d6fa6c-d1e7-47d2-9f15-b482e6d61975" (UID: "31d6fa6c-d1e7-47d2-9f15-b482e6d61975"). InnerVolumeSpecName "kube-api-access-l5nfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.697155 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "31d6fa6c-d1e7-47d2-9f15-b482e6d61975" (UID: "31d6fa6c-d1e7-47d2-9f15-b482e6d61975"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.697487 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-config" (OuterVolumeSpecName: "config") pod "31d6fa6c-d1e7-47d2-9f15-b482e6d61975" (UID: "31d6fa6c-d1e7-47d2-9f15-b482e6d61975"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.706009 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "31d6fa6c-d1e7-47d2-9f15-b482e6d61975" (UID: "31d6fa6c-d1e7-47d2-9f15-b482e6d61975"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.715042 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.715074 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.715089 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5nfk\" (UniqueName: \"kubernetes.io/projected/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-kube-api-access-l5nfk\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.715103 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.737535 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "31d6fa6c-d1e7-47d2-9f15-b482e6d61975" (UID: "31d6fa6c-d1e7-47d2-9f15-b482e6d61975"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:57:17 crc kubenswrapper[5129]: I0314 08:57:17.816738 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31d6fa6c-d1e7-47d2-9f15-b482e6d61975-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:18 crc kubenswrapper[5129]: I0314 08:57:18.414536 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dcc7fd9c5-xx776" Mar 14 08:57:18 crc kubenswrapper[5129]: I0314 08:57:18.445032 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dcc7fd9c5-xx776"] Mar 14 08:57:18 crc kubenswrapper[5129]: I0314 08:57:18.453455 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dcc7fd9c5-xx776"] Mar 14 08:57:19 crc kubenswrapper[5129]: I0314 08:57:19.574348 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:57:19 crc kubenswrapper[5129]: I0314 08:57:19.575027 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:57:20 crc kubenswrapper[5129]: I0314 08:57:20.048280 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d6fa6c-d1e7-47d2-9f15-b482e6d61975" path="/var/lib/kubelet/pods/31d6fa6c-d1e7-47d2-9f15-b482e6d61975/volumes" Mar 14 08:57:36 crc kubenswrapper[5129]: I0314 08:57:36.957439 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:39 crc kubenswrapper[5129]: I0314 08:57:39.252882 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-76f68cc57f-g29rv" Mar 14 08:57:39 crc kubenswrapper[5129]: I0314 08:57:39.328210 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f7cdfcdd4-848dw"] Mar 14 08:57:39 crc kubenswrapper[5129]: I0314 08:57:39.328549 5129 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f7cdfcdd4-848dw" podUID="43757357-92fb-43c6-8b43-05b6b3439600" containerName="neutron-api" containerID="cri-o://6912105d25d76f0d8456fa3cb6756ccc5eaf3da189333ed5a3afbda9df593a21" gracePeriod=30 Mar 14 08:57:39 crc kubenswrapper[5129]: I0314 08:57:39.328709 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f7cdfcdd4-848dw" podUID="43757357-92fb-43c6-8b43-05b6b3439600" containerName="neutron-httpd" containerID="cri-o://62899c3d8c7646448789d2141a3f7265377cc859e59aadf403207eef5a44ac36" gracePeriod=30 Mar 14 08:57:39 crc kubenswrapper[5129]: I0314 08:57:39.749078 5129 generic.go:334] "Generic (PLEG): container finished" podID="43757357-92fb-43c6-8b43-05b6b3439600" containerID="62899c3d8c7646448789d2141a3f7265377cc859e59aadf403207eef5a44ac36" exitCode=0 Mar 14 08:57:39 crc kubenswrapper[5129]: I0314 08:57:39.749148 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f7cdfcdd4-848dw" event={"ID":"43757357-92fb-43c6-8b43-05b6b3439600","Type":"ContainerDied","Data":"62899c3d8c7646448789d2141a3f7265377cc859e59aadf403207eef5a44ac36"} Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.479218 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.551390 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-combined-ca-bundle\") pod \"43757357-92fb-43c6-8b43-05b6b3439600\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.551591 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-config\") pod \"43757357-92fb-43c6-8b43-05b6b3439600\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.551779 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5slc5\" (UniqueName: \"kubernetes.io/projected/43757357-92fb-43c6-8b43-05b6b3439600-kube-api-access-5slc5\") pod \"43757357-92fb-43c6-8b43-05b6b3439600\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.551814 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-ovndb-tls-certs\") pod \"43757357-92fb-43c6-8b43-05b6b3439600\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.551866 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-httpd-config\") pod \"43757357-92fb-43c6-8b43-05b6b3439600\" (UID: \"43757357-92fb-43c6-8b43-05b6b3439600\") " Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.559650 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "43757357-92fb-43c6-8b43-05b6b3439600" (UID: "43757357-92fb-43c6-8b43-05b6b3439600"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.559673 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43757357-92fb-43c6-8b43-05b6b3439600-kube-api-access-5slc5" (OuterVolumeSpecName: "kube-api-access-5slc5") pod "43757357-92fb-43c6-8b43-05b6b3439600" (UID: "43757357-92fb-43c6-8b43-05b6b3439600"). InnerVolumeSpecName "kube-api-access-5slc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.616385 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43757357-92fb-43c6-8b43-05b6b3439600" (UID: "43757357-92fb-43c6-8b43-05b6b3439600"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.616560 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-config" (OuterVolumeSpecName: "config") pod "43757357-92fb-43c6-8b43-05b6b3439600" (UID: "43757357-92fb-43c6-8b43-05b6b3439600"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.650562 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "43757357-92fb-43c6-8b43-05b6b3439600" (UID: "43757357-92fb-43c6-8b43-05b6b3439600"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.654050 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.654077 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.654090 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5slc5\" (UniqueName: \"kubernetes.io/projected/43757357-92fb-43c6-8b43-05b6b3439600-kube-api-access-5slc5\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.654100 5129 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.654109 5129 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/43757357-92fb-43c6-8b43-05b6b3439600-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.789460 5129 generic.go:334] "Generic (PLEG): container finished" podID="43757357-92fb-43c6-8b43-05b6b3439600" containerID="6912105d25d76f0d8456fa3cb6756ccc5eaf3da189333ed5a3afbda9df593a21" exitCode=0 Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.789528 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f7cdfcdd4-848dw" event={"ID":"43757357-92fb-43c6-8b43-05b6b3439600","Type":"ContainerDied","Data":"6912105d25d76f0d8456fa3cb6756ccc5eaf3da189333ed5a3afbda9df593a21"} Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 
08:57:43.789566 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f7cdfcdd4-848dw" event={"ID":"43757357-92fb-43c6-8b43-05b6b3439600","Type":"ContainerDied","Data":"c04994c050377519c021c6abbadfc6414e2e317b0e49f694f443253f21ce72d7"} Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.789574 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f7cdfcdd4-848dw" Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.789585 5129 scope.go:117] "RemoveContainer" containerID="62899c3d8c7646448789d2141a3f7265377cc859e59aadf403207eef5a44ac36" Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.813873 5129 scope.go:117] "RemoveContainer" containerID="6912105d25d76f0d8456fa3cb6756ccc5eaf3da189333ed5a3afbda9df593a21" Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.829275 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f7cdfcdd4-848dw"] Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.839558 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f7cdfcdd4-848dw"] Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.847339 5129 scope.go:117] "RemoveContainer" containerID="62899c3d8c7646448789d2141a3f7265377cc859e59aadf403207eef5a44ac36" Mar 14 08:57:43 crc kubenswrapper[5129]: E0314 08:57:43.848198 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62899c3d8c7646448789d2141a3f7265377cc859e59aadf403207eef5a44ac36\": container with ID starting with 62899c3d8c7646448789d2141a3f7265377cc859e59aadf403207eef5a44ac36 not found: ID does not exist" containerID="62899c3d8c7646448789d2141a3f7265377cc859e59aadf403207eef5a44ac36" Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.848326 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62899c3d8c7646448789d2141a3f7265377cc859e59aadf403207eef5a44ac36"} 
err="failed to get container status \"62899c3d8c7646448789d2141a3f7265377cc859e59aadf403207eef5a44ac36\": rpc error: code = NotFound desc = could not find container \"62899c3d8c7646448789d2141a3f7265377cc859e59aadf403207eef5a44ac36\": container with ID starting with 62899c3d8c7646448789d2141a3f7265377cc859e59aadf403207eef5a44ac36 not found: ID does not exist" Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.848475 5129 scope.go:117] "RemoveContainer" containerID="6912105d25d76f0d8456fa3cb6756ccc5eaf3da189333ed5a3afbda9df593a21" Mar 14 08:57:43 crc kubenswrapper[5129]: E0314 08:57:43.849480 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6912105d25d76f0d8456fa3cb6756ccc5eaf3da189333ed5a3afbda9df593a21\": container with ID starting with 6912105d25d76f0d8456fa3cb6756ccc5eaf3da189333ed5a3afbda9df593a21 not found: ID does not exist" containerID="6912105d25d76f0d8456fa3cb6756ccc5eaf3da189333ed5a3afbda9df593a21" Mar 14 08:57:43 crc kubenswrapper[5129]: I0314 08:57:43.849533 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6912105d25d76f0d8456fa3cb6756ccc5eaf3da189333ed5a3afbda9df593a21"} err="failed to get container status \"6912105d25d76f0d8456fa3cb6756ccc5eaf3da189333ed5a3afbda9df593a21\": rpc error: code = NotFound desc = could not find container \"6912105d25d76f0d8456fa3cb6756ccc5eaf3da189333ed5a3afbda9df593a21\": container with ID starting with 6912105d25d76f0d8456fa3cb6756ccc5eaf3da189333ed5a3afbda9df593a21 not found: ID does not exist" Mar 14 08:57:44 crc kubenswrapper[5129]: I0314 08:57:44.053243 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43757357-92fb-43c6-8b43-05b6b3439600" path="/var/lib/kubelet/pods/43757357-92fb-43c6-8b43-05b6b3439600/volumes" Mar 14 08:57:49 crc kubenswrapper[5129]: I0314 08:57:49.574409 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:57:49 crc kubenswrapper[5129]: I0314 08:57:49.574837 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:57:49 crc kubenswrapper[5129]: I0314 08:57:49.574903 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 08:57:49 crc kubenswrapper[5129]: I0314 08:57:49.575879 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:57:49 crc kubenswrapper[5129]: I0314 08:57:49.575941 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" gracePeriod=600 Mar 14 08:57:49 crc kubenswrapper[5129]: E0314 08:57:49.705354 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:57:49 crc kubenswrapper[5129]: I0314 08:57:49.849454 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" exitCode=0 Mar 14 08:57:49 crc kubenswrapper[5129]: I0314 08:57:49.849505 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d"} Mar 14 08:57:49 crc kubenswrapper[5129]: I0314 08:57:49.849582 5129 scope.go:117] "RemoveContainer" containerID="08fb3f0642680c2408576300f89e73c1a127607f27f00eed4c184f6d6d9f3720" Mar 14 08:57:49 crc kubenswrapper[5129]: I0314 08:57:49.850446 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 08:57:49 crc kubenswrapper[5129]: E0314 08:57:49.850755 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.147737 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557978-7j4sn"] Mar 14 08:58:00 crc kubenswrapper[5129]: E0314 08:58:00.148895 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="43757357-92fb-43c6-8b43-05b6b3439600" containerName="neutron-httpd" Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.148912 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="43757357-92fb-43c6-8b43-05b6b3439600" containerName="neutron-httpd" Mar 14 08:58:00 crc kubenswrapper[5129]: E0314 08:58:00.148935 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43757357-92fb-43c6-8b43-05b6b3439600" containerName="neutron-api" Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.148942 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="43757357-92fb-43c6-8b43-05b6b3439600" containerName="neutron-api" Mar 14 08:58:00 crc kubenswrapper[5129]: E0314 08:58:00.148959 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d6fa6c-d1e7-47d2-9f15-b482e6d61975" containerName="dnsmasq-dns" Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.148967 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d6fa6c-d1e7-47d2-9f15-b482e6d61975" containerName="dnsmasq-dns" Mar 14 08:58:00 crc kubenswrapper[5129]: E0314 08:58:00.148984 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d6fa6c-d1e7-47d2-9f15-b482e6d61975" containerName="init" Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.148990 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d6fa6c-d1e7-47d2-9f15-b482e6d61975" containerName="init" Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.149160 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="43757357-92fb-43c6-8b43-05b6b3439600" containerName="neutron-httpd" Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.149179 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="43757357-92fb-43c6-8b43-05b6b3439600" containerName="neutron-api" Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.149189 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="31d6fa6c-d1e7-47d2-9f15-b482e6d61975" 
containerName="dnsmasq-dns" Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.150076 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557978-7j4sn" Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.153210 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.153389 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.153397 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.164795 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557978-7j4sn"] Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.249945 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j47x9\" (UniqueName: \"kubernetes.io/projected/116977c9-6bcd-4783-9885-f1b73bda04e6-kube-api-access-j47x9\") pod \"auto-csr-approver-29557978-7j4sn\" (UID: \"116977c9-6bcd-4783-9885-f1b73bda04e6\") " pod="openshift-infra/auto-csr-approver-29557978-7j4sn" Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.352142 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j47x9\" (UniqueName: \"kubernetes.io/projected/116977c9-6bcd-4783-9885-f1b73bda04e6-kube-api-access-j47x9\") pod \"auto-csr-approver-29557978-7j4sn\" (UID: \"116977c9-6bcd-4783-9885-f1b73bda04e6\") " pod="openshift-infra/auto-csr-approver-29557978-7j4sn" Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.372338 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j47x9\" (UniqueName: 
\"kubernetes.io/projected/116977c9-6bcd-4783-9885-f1b73bda04e6-kube-api-access-j47x9\") pod \"auto-csr-approver-29557978-7j4sn\" (UID: \"116977c9-6bcd-4783-9885-f1b73bda04e6\") " pod="openshift-infra/auto-csr-approver-29557978-7j4sn" Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.469328 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557978-7j4sn" Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.727089 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557978-7j4sn"] Mar 14 08:58:00 crc kubenswrapper[5129]: W0314 08:58:00.743102 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod116977c9_6bcd_4783_9885_f1b73bda04e6.slice/crio-efddb0ad9066b18cd6611cf3a51ea80e812b19a84007d56207782a71c17d9e98 WatchSource:0}: Error finding container efddb0ad9066b18cd6611cf3a51ea80e812b19a84007d56207782a71c17d9e98: Status 404 returned error can't find the container with id efddb0ad9066b18cd6611cf3a51ea80e812b19a84007d56207782a71c17d9e98 Mar 14 08:58:00 crc kubenswrapper[5129]: I0314 08:58:00.971706 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557978-7j4sn" event={"ID":"116977c9-6bcd-4783-9885-f1b73bda04e6","Type":"ContainerStarted","Data":"efddb0ad9066b18cd6611cf3a51ea80e812b19a84007d56207782a71c17d9e98"} Mar 14 08:58:01 crc kubenswrapper[5129]: I0314 08:58:01.986431 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557978-7j4sn" event={"ID":"116977c9-6bcd-4783-9885-f1b73bda04e6","Type":"ContainerStarted","Data":"9984b133d4c499298b9d153d514aec2a47f57568d35839d7ac4a95078c392cdc"} Mar 14 08:58:02 crc kubenswrapper[5129]: I0314 08:58:02.006492 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557978-7j4sn" 
podStartSLOduration=1.228529176 podStartE2EDuration="2.006472269s" podCreationTimestamp="2026-03-14 08:58:00 +0000 UTC" firstStartedPulling="2026-03-14 08:58:00.74524671 +0000 UTC m=+7143.497161894" lastFinishedPulling="2026-03-14 08:58:01.523189803 +0000 UTC m=+7144.275104987" observedRunningTime="2026-03-14 08:58:01.999659255 +0000 UTC m=+7144.751574459" watchObservedRunningTime="2026-03-14 08:58:02.006472269 +0000 UTC m=+7144.758387453" Mar 14 08:58:03 crc kubenswrapper[5129]: I0314 08:58:02.999994 5129 generic.go:334] "Generic (PLEG): container finished" podID="116977c9-6bcd-4783-9885-f1b73bda04e6" containerID="9984b133d4c499298b9d153d514aec2a47f57568d35839d7ac4a95078c392cdc" exitCode=0 Mar 14 08:58:03 crc kubenswrapper[5129]: I0314 08:58:03.000085 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557978-7j4sn" event={"ID":"116977c9-6bcd-4783-9885-f1b73bda04e6","Type":"ContainerDied","Data":"9984b133d4c499298b9d153d514aec2a47f57568d35839d7ac4a95078c392cdc"} Mar 14 08:58:03 crc kubenswrapper[5129]: I0314 08:58:03.036701 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 08:58:03 crc kubenswrapper[5129]: E0314 08:58:03.037404 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:58:04 crc kubenswrapper[5129]: I0314 08:58:04.342664 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557978-7j4sn" Mar 14 08:58:04 crc kubenswrapper[5129]: I0314 08:58:04.441206 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j47x9\" (UniqueName: \"kubernetes.io/projected/116977c9-6bcd-4783-9885-f1b73bda04e6-kube-api-access-j47x9\") pod \"116977c9-6bcd-4783-9885-f1b73bda04e6\" (UID: \"116977c9-6bcd-4783-9885-f1b73bda04e6\") " Mar 14 08:58:04 crc kubenswrapper[5129]: I0314 08:58:04.449944 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/116977c9-6bcd-4783-9885-f1b73bda04e6-kube-api-access-j47x9" (OuterVolumeSpecName: "kube-api-access-j47x9") pod "116977c9-6bcd-4783-9885-f1b73bda04e6" (UID: "116977c9-6bcd-4783-9885-f1b73bda04e6"). InnerVolumeSpecName "kube-api-access-j47x9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:58:04 crc kubenswrapper[5129]: I0314 08:58:04.544555 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j47x9\" (UniqueName: \"kubernetes.io/projected/116977c9-6bcd-4783-9885-f1b73bda04e6-kube-api-access-j47x9\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:05 crc kubenswrapper[5129]: I0314 08:58:05.022933 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557978-7j4sn" event={"ID":"116977c9-6bcd-4783-9885-f1b73bda04e6","Type":"ContainerDied","Data":"efddb0ad9066b18cd6611cf3a51ea80e812b19a84007d56207782a71c17d9e98"} Mar 14 08:58:05 crc kubenswrapper[5129]: I0314 08:58:05.022975 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efddb0ad9066b18cd6611cf3a51ea80e812b19a84007d56207782a71c17d9e98" Mar 14 08:58:05 crc kubenswrapper[5129]: I0314 08:58:05.023030 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557978-7j4sn" Mar 14 08:58:05 crc kubenswrapper[5129]: I0314 08:58:05.090069 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557972-g8qfb"] Mar 14 08:58:05 crc kubenswrapper[5129]: I0314 08:58:05.101813 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557972-g8qfb"] Mar 14 08:58:06 crc kubenswrapper[5129]: I0314 08:58:06.052642 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f1a22d-c7b8-4fd6-8111-55421a6d8b4e" path="/var/lib/kubelet/pods/38f1a22d-c7b8-4fd6-8111-55421a6d8b4e/volumes" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.728121 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-s6fsl"] Mar 14 08:58:07 crc kubenswrapper[5129]: E0314 08:58:07.728574 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="116977c9-6bcd-4783-9885-f1b73bda04e6" containerName="oc" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.728589 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="116977c9-6bcd-4783-9885-f1b73bda04e6" containerName="oc" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.728827 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="116977c9-6bcd-4783-9885-f1b73bda04e6" containerName="oc" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.729482 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.733035 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.733892 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.733978 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.734186 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.741131 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-wxqjh" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.745499 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-s6fsl"] Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.778373 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-s6fsl"] Mar 14 08:58:07 crc kubenswrapper[5129]: E0314 08:58:07.780975 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-nwgf8 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-s6fsl" podUID="ae799352-0bbc-4fe3-9cd2-e7d38b938f91" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.818766 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-ring-data-devices\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " 
pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.818818 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-dispersionconf\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.818850 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-scripts\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.818894 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-combined-ca-bundle\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.818927 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-etc-swift\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.818981 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-swiftconf\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " 
pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.819106 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwgf8\" (UniqueName: \"kubernetes.io/projected/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-kube-api-access-nwgf8\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.921837 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-scripts\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.921918 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-combined-ca-bundle\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.921938 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-etc-swift\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.921983 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-swiftconf\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc 
kubenswrapper[5129]: I0314 08:58:07.922086 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwgf8\" (UniqueName: \"kubernetes.io/projected/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-kube-api-access-nwgf8\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.922116 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-ring-data-devices\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.922130 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-dispersionconf\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.923665 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-scripts\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.923977 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-etc-swift\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.924187 5129 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-ring-data-devices\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.945162 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b6f5d8457-qpds4"] Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.947359 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.964474 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-swiftconf\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.967441 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-combined-ca-bundle\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.967845 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b6f5d8457-qpds4"] Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.978257 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-dispersionconf\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:07 crc kubenswrapper[5129]: I0314 08:58:07.986897 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nwgf8\" (UniqueName: \"kubernetes.io/projected/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-kube-api-access-nwgf8\") pod \"swift-ring-rebalance-s6fsl\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.027609 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nktz\" (UniqueName: \"kubernetes.io/projected/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-kube-api-access-6nktz\") pod \"dnsmasq-dns-6b6f5d8457-qpds4\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.027671 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-config\") pod \"dnsmasq-dns-6b6f5d8457-qpds4\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.027885 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6f5d8457-qpds4\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.028025 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6f5d8457-qpds4\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.028250 5129 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-dns-svc\") pod \"dnsmasq-dns-6b6f5d8457-qpds4\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.057526 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.090488 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.129999 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-dns-svc\") pod \"dnsmasq-dns-6b6f5d8457-qpds4\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.130086 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nktz\" (UniqueName: \"kubernetes.io/projected/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-kube-api-access-6nktz\") pod \"dnsmasq-dns-6b6f5d8457-qpds4\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.130106 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-config\") pod \"dnsmasq-dns-6b6f5d8457-qpds4\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.130136 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6f5d8457-qpds4\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.130214 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6f5d8457-qpds4\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.131352 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6f5d8457-qpds4\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.131368 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-config\") pod \"dnsmasq-dns-6b6f5d8457-qpds4\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.134333 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6f5d8457-qpds4\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.137197 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-dns-svc\") pod 
\"dnsmasq-dns-6b6f5d8457-qpds4\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.149272 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nktz\" (UniqueName: \"kubernetes.io/projected/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-kube-api-access-6nktz\") pod \"dnsmasq-dns-6b6f5d8457-qpds4\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.234394 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-dispersionconf\") pod \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.234461 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-combined-ca-bundle\") pod \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.234529 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-scripts\") pod \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.234590 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-etc-swift\") pod \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.234763 5129 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-swiftconf\") pod \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.234782 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwgf8\" (UniqueName: \"kubernetes.io/projected/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-kube-api-access-nwgf8\") pod \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.234803 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-ring-data-devices\") pod \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\" (UID: \"ae799352-0bbc-4fe3-9cd2-e7d38b938f91\") " Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.237359 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ae799352-0bbc-4fe3-9cd2-e7d38b938f91" (UID: "ae799352-0bbc-4fe3-9cd2-e7d38b938f91"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.238368 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-scripts" (OuterVolumeSpecName: "scripts") pod "ae799352-0bbc-4fe3-9cd2-e7d38b938f91" (UID: "ae799352-0bbc-4fe3-9cd2-e7d38b938f91"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.240196 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ae799352-0bbc-4fe3-9cd2-e7d38b938f91" (UID: "ae799352-0bbc-4fe3-9cd2-e7d38b938f91"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.243205 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ae799352-0bbc-4fe3-9cd2-e7d38b938f91" (UID: "ae799352-0bbc-4fe3-9cd2-e7d38b938f91"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.243623 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-kube-api-access-nwgf8" (OuterVolumeSpecName: "kube-api-access-nwgf8") pod "ae799352-0bbc-4fe3-9cd2-e7d38b938f91" (UID: "ae799352-0bbc-4fe3-9cd2-e7d38b938f91"). InnerVolumeSpecName "kube-api-access-nwgf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.246711 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ae799352-0bbc-4fe3-9cd2-e7d38b938f91" (UID: "ae799352-0bbc-4fe3-9cd2-e7d38b938f91"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.260883 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae799352-0bbc-4fe3-9cd2-e7d38b938f91" (UID: "ae799352-0bbc-4fe3-9cd2-e7d38b938f91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.337036 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwgf8\" (UniqueName: \"kubernetes.io/projected/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-kube-api-access-nwgf8\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.337075 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.337090 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.337099 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.337108 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.337117 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.337126 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae799352-0bbc-4fe3-9cd2-e7d38b938f91-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.382411 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:08 crc kubenswrapper[5129]: I0314 08:58:08.894695 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b6f5d8457-qpds4"] Mar 14 08:58:09 crc kubenswrapper[5129]: I0314 08:58:09.073238 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s6fsl" Mar 14 08:58:09 crc kubenswrapper[5129]: I0314 08:58:09.073493 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" event={"ID":"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d","Type":"ContainerStarted","Data":"eb5e5e97fdb4c7992416869d7a4b11ff504781453c83fb8dc91df2c6df227614"} Mar 14 08:58:09 crc kubenswrapper[5129]: I0314 08:58:09.150350 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-s6fsl"] Mar 14 08:58:09 crc kubenswrapper[5129]: I0314 08:58:09.165298 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-s6fsl"] Mar 14 08:58:10 crc kubenswrapper[5129]: I0314 08:58:10.047490 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae799352-0bbc-4fe3-9cd2-e7d38b938f91" path="/var/lib/kubelet/pods/ae799352-0bbc-4fe3-9cd2-e7d38b938f91/volumes" Mar 14 08:58:10 crc kubenswrapper[5129]: I0314 08:58:10.085593 5129 generic.go:334] "Generic (PLEG): container finished" podID="09f2e1cd-1b80-4b94-8187-a21ebbe5f93d" 
containerID="337c017805d22dfec9e7bf121dae1bd5d163d72834197b9372a9914c2415d144" exitCode=0 Mar 14 08:58:10 crc kubenswrapper[5129]: I0314 08:58:10.085655 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" event={"ID":"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d","Type":"ContainerDied","Data":"337c017805d22dfec9e7bf121dae1bd5d163d72834197b9372a9914c2415d144"} Mar 14 08:58:11 crc kubenswrapper[5129]: I0314 08:58:11.107394 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" event={"ID":"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d","Type":"ContainerStarted","Data":"b1460d89c7377d038a7aa5adbec8de2f2d4c5afdbb0f08d952b2ec09be869e1f"} Mar 14 08:58:11 crc kubenswrapper[5129]: I0314 08:58:11.108910 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:11 crc kubenswrapper[5129]: I0314 08:58:11.131085 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" podStartSLOduration=4.13106457 podStartE2EDuration="4.13106457s" podCreationTimestamp="2026-03-14 08:58:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:58:11.127249607 +0000 UTC m=+7153.879164801" watchObservedRunningTime="2026-03-14 08:58:11.13106457 +0000 UTC m=+7153.882979754" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.442761 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5679cc5964-ql2sl"] Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.444974 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.448329 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.448874 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-wxqjh" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.449333 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.449681 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.452700 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.453837 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5679cc5964-ql2sl"] Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.463036 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.530766 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-internal-tls-certs\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.530840 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nsxf\" (UniqueName: \"kubernetes.io/projected/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-kube-api-access-2nsxf\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: 
\"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.530883 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-public-tls-certs\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.531295 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-combined-ca-bundle\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.531346 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-log-httpd\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.531383 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-etc-swift\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.531402 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-run-httpd\") pod 
\"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.531557 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-config-data\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.633386 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-combined-ca-bundle\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.633452 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-log-httpd\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.633475 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-etc-swift\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.633499 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-run-httpd\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: 
\"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.633528 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-config-data\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.633581 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-internal-tls-certs\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.633634 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nsxf\" (UniqueName: \"kubernetes.io/projected/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-kube-api-access-2nsxf\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.633669 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-public-tls-certs\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.634170 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-log-httpd\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " 
pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.634540 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-run-httpd\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.642791 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-public-tls-certs\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.644477 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-internal-tls-certs\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.645458 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-etc-swift\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.652108 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-config-data\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.656254 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-combined-ca-bundle\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.656927 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nsxf\" (UniqueName: \"kubernetes.io/projected/fd82fc6b-2f78-4b93-9c8a-2135600be3e0-kube-api-access-2nsxf\") pod \"swift-proxy-5679cc5964-ql2sl\" (UID: \"fd82fc6b-2f78-4b93-9c8a-2135600be3e0\") " pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:12 crc kubenswrapper[5129]: I0314 08:58:12.765542 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:13 crc kubenswrapper[5129]: I0314 08:58:13.524534 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5679cc5964-ql2sl"] Mar 14 08:58:13 crc kubenswrapper[5129]: W0314 08:58:13.547105 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd82fc6b_2f78_4b93_9c8a_2135600be3e0.slice/crio-6b241b788397933ae1394837366ba1ba31f00fa076a05910c8d68eb1f24c4bca WatchSource:0}: Error finding container 6b241b788397933ae1394837366ba1ba31f00fa076a05910c8d68eb1f24c4bca: Status 404 returned error can't find the container with id 6b241b788397933ae1394837366ba1ba31f00fa076a05910c8d68eb1f24c4bca Mar 14 08:58:14 crc kubenswrapper[5129]: I0314 08:58:14.149578 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5679cc5964-ql2sl" event={"ID":"fd82fc6b-2f78-4b93-9c8a-2135600be3e0","Type":"ContainerStarted","Data":"6b241b788397933ae1394837366ba1ba31f00fa076a05910c8d68eb1f24c4bca"} Mar 14 08:58:15 crc kubenswrapper[5129]: I0314 08:58:15.036978 5129 scope.go:117] 
"RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 08:58:15 crc kubenswrapper[5129]: E0314 08:58:15.037228 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:58:18 crc kubenswrapper[5129]: I0314 08:58:18.206125 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5679cc5964-ql2sl" event={"ID":"fd82fc6b-2f78-4b93-9c8a-2135600be3e0","Type":"ContainerStarted","Data":"61bf9aed35f76f09d1d0411ad53895c06ca1251d47a3648b7115b563eec1a9e3"} Mar 14 08:58:18 crc kubenswrapper[5129]: I0314 08:58:18.383807 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:58:18 crc kubenswrapper[5129]: I0314 08:58:18.459077 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5868879f8c-t66nx"] Mar 14 08:58:18 crc kubenswrapper[5129]: I0314 08:58:18.459362 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5868879f8c-t66nx" podUID="f96e7ded-86c9-404c-b278-262796e10cfb" containerName="dnsmasq-dns" containerID="cri-o://f0f001c9284d292c7e01f02618ac8ee6d3bb86384f98bc58274fce992dcbe2f0" gracePeriod=10 Mar 14 08:58:18 crc kubenswrapper[5129]: I0314 08:58:18.972829 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.109350 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-ovsdbserver-nb\") pod \"f96e7ded-86c9-404c-b278-262796e10cfb\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.109531 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-ovsdbserver-sb\") pod \"f96e7ded-86c9-404c-b278-262796e10cfb\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.109626 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g9nd\" (UniqueName: \"kubernetes.io/projected/f96e7ded-86c9-404c-b278-262796e10cfb-kube-api-access-4g9nd\") pod \"f96e7ded-86c9-404c-b278-262796e10cfb\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.109814 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-dns-svc\") pod \"f96e7ded-86c9-404c-b278-262796e10cfb\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.109879 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-config\") pod \"f96e7ded-86c9-404c-b278-262796e10cfb\" (UID: \"f96e7ded-86c9-404c-b278-262796e10cfb\") " Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.116356 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f96e7ded-86c9-404c-b278-262796e10cfb-kube-api-access-4g9nd" (OuterVolumeSpecName: "kube-api-access-4g9nd") pod "f96e7ded-86c9-404c-b278-262796e10cfb" (UID: "f96e7ded-86c9-404c-b278-262796e10cfb"). InnerVolumeSpecName "kube-api-access-4g9nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.156342 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f96e7ded-86c9-404c-b278-262796e10cfb" (UID: "f96e7ded-86c9-404c-b278-262796e10cfb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.160102 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f96e7ded-86c9-404c-b278-262796e10cfb" (UID: "f96e7ded-86c9-404c-b278-262796e10cfb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.165126 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f96e7ded-86c9-404c-b278-262796e10cfb" (UID: "f96e7ded-86c9-404c-b278-262796e10cfb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.187952 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-config" (OuterVolumeSpecName: "config") pod "f96e7ded-86c9-404c-b278-262796e10cfb" (UID: "f96e7ded-86c9-404c-b278-262796e10cfb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.214192 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.214256 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.214272 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.214289 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f96e7ded-86c9-404c-b278-262796e10cfb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.214305 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g9nd\" (UniqueName: \"kubernetes.io/projected/f96e7ded-86c9-404c-b278-262796e10cfb-kube-api-access-4g9nd\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.232084 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5679cc5964-ql2sl" event={"ID":"fd82fc6b-2f78-4b93-9c8a-2135600be3e0","Type":"ContainerStarted","Data":"0a519cfceb7bbaed0e4e5276d4933ab37c6817f158ef9e39559b807a8fa478f1"} Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.234755 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.234831 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.238750 5129 generic.go:334] "Generic (PLEG): container finished" podID="f96e7ded-86c9-404c-b278-262796e10cfb" containerID="f0f001c9284d292c7e01f02618ac8ee6d3bb86384f98bc58274fce992dcbe2f0" exitCode=0 Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.238799 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5868879f8c-t66nx" event={"ID":"f96e7ded-86c9-404c-b278-262796e10cfb","Type":"ContainerDied","Data":"f0f001c9284d292c7e01f02618ac8ee6d3bb86384f98bc58274fce992dcbe2f0"} Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.238849 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5868879f8c-t66nx" event={"ID":"f96e7ded-86c9-404c-b278-262796e10cfb","Type":"ContainerDied","Data":"9f7b0a2c0d2f92d6cd5565f604d6cc5c42fefde1aade29d360cbd3803aa04c08"} Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.238869 5129 scope.go:117] "RemoveContainer" containerID="f0f001c9284d292c7e01f02618ac8ee6d3bb86384f98bc58274fce992dcbe2f0" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.239086 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5868879f8c-t66nx" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.261531 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5679cc5964-ql2sl" podStartSLOduration=3.864855028 podStartE2EDuration="7.261513132s" podCreationTimestamp="2026-03-14 08:58:12 +0000 UTC" firstStartedPulling="2026-03-14 08:58:13.553314841 +0000 UTC m=+7156.305230025" lastFinishedPulling="2026-03-14 08:58:16.949972945 +0000 UTC m=+7159.701888129" observedRunningTime="2026-03-14 08:58:19.258573583 +0000 UTC m=+7162.010488787" watchObservedRunningTime="2026-03-14 08:58:19.261513132 +0000 UTC m=+7162.013428316" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.279123 5129 scope.go:117] "RemoveContainer" containerID="98afaca0e50c7c3d0a4a7f87d926966c129d5304443e2f3e95664c2142c20f99" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.295518 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5868879f8c-t66nx"] Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.304265 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5868879f8c-t66nx"] Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.321398 5129 scope.go:117] "RemoveContainer" containerID="f0f001c9284d292c7e01f02618ac8ee6d3bb86384f98bc58274fce992dcbe2f0" Mar 14 08:58:19 crc kubenswrapper[5129]: E0314 08:58:19.322274 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0f001c9284d292c7e01f02618ac8ee6d3bb86384f98bc58274fce992dcbe2f0\": container with ID starting with f0f001c9284d292c7e01f02618ac8ee6d3bb86384f98bc58274fce992dcbe2f0 not found: ID does not exist" containerID="f0f001c9284d292c7e01f02618ac8ee6d3bb86384f98bc58274fce992dcbe2f0" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.322312 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f0f001c9284d292c7e01f02618ac8ee6d3bb86384f98bc58274fce992dcbe2f0"} err="failed to get container status \"f0f001c9284d292c7e01f02618ac8ee6d3bb86384f98bc58274fce992dcbe2f0\": rpc error: code = NotFound desc = could not find container \"f0f001c9284d292c7e01f02618ac8ee6d3bb86384f98bc58274fce992dcbe2f0\": container with ID starting with f0f001c9284d292c7e01f02618ac8ee6d3bb86384f98bc58274fce992dcbe2f0 not found: ID does not exist" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.322339 5129 scope.go:117] "RemoveContainer" containerID="98afaca0e50c7c3d0a4a7f87d926966c129d5304443e2f3e95664c2142c20f99" Mar 14 08:58:19 crc kubenswrapper[5129]: E0314 08:58:19.322547 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98afaca0e50c7c3d0a4a7f87d926966c129d5304443e2f3e95664c2142c20f99\": container with ID starting with 98afaca0e50c7c3d0a4a7f87d926966c129d5304443e2f3e95664c2142c20f99 not found: ID does not exist" containerID="98afaca0e50c7c3d0a4a7f87d926966c129d5304443e2f3e95664c2142c20f99" Mar 14 08:58:19 crc kubenswrapper[5129]: I0314 08:58:19.322569 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98afaca0e50c7c3d0a4a7f87d926966c129d5304443e2f3e95664c2142c20f99"} err="failed to get container status \"98afaca0e50c7c3d0a4a7f87d926966c129d5304443e2f3e95664c2142c20f99\": rpc error: code = NotFound desc = could not find container \"98afaca0e50c7c3d0a4a7f87d926966c129d5304443e2f3e95664c2142c20f99\": container with ID starting with 98afaca0e50c7c3d0a4a7f87d926966c129d5304443e2f3e95664c2142c20f99 not found: ID does not exist" Mar 14 08:58:19 crc kubenswrapper[5129]: E0314 08:58:19.407207 5129 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf96e7ded_86c9_404c_b278_262796e10cfb.slice/crio-9f7b0a2c0d2f92d6cd5565f604d6cc5c42fefde1aade29d360cbd3803aa04c08\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf96e7ded_86c9_404c_b278_262796e10cfb.slice\": RecentStats: unable to find data in memory cache]" Mar 14 08:58:20 crc kubenswrapper[5129]: I0314 08:58:20.047831 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f96e7ded-86c9-404c-b278-262796e10cfb" path="/var/lib/kubelet/pods/f96e7ded-86c9-404c-b278-262796e10cfb/volumes" Mar 14 08:58:27 crc kubenswrapper[5129]: I0314 08:58:27.037534 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 08:58:27 crc kubenswrapper[5129]: E0314 08:58:27.039260 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:58:27 crc kubenswrapper[5129]: I0314 08:58:27.771242 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:27 crc kubenswrapper[5129]: I0314 08:58:27.773869 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5679cc5964-ql2sl" Mar 14 08:58:27 crc kubenswrapper[5129]: I0314 08:58:27.850227 5129 scope.go:117] "RemoveContainer" containerID="75f40604138b8ecf4521cb7a52943b7c7bdc5060bfb188dd9eaf0ae68b6569fb" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.596649 5129 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-db-create-qss72"] Mar 14 08:58:33 crc kubenswrapper[5129]: E0314 08:58:33.597691 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f96e7ded-86c9-404c-b278-262796e10cfb" containerName="init" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.597708 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f96e7ded-86c9-404c-b278-262796e10cfb" containerName="init" Mar 14 08:58:33 crc kubenswrapper[5129]: E0314 08:58:33.597731 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f96e7ded-86c9-404c-b278-262796e10cfb" containerName="dnsmasq-dns" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.597749 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f96e7ded-86c9-404c-b278-262796e10cfb" containerName="dnsmasq-dns" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.598059 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f96e7ded-86c9-404c-b278-262796e10cfb" containerName="dnsmasq-dns" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.598854 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qss72" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.608627 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qss72"] Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.702010 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-670c-account-create-update-7bmhp"] Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.703568 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-670c-account-create-update-7bmhp" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.707772 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.711469 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-670c-account-create-update-7bmhp"] Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.799043 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxwxl\" (UniqueName: \"kubernetes.io/projected/ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0-kube-api-access-zxwxl\") pod \"cinder-670c-account-create-update-7bmhp\" (UID: \"ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0\") " pod="openstack/cinder-670c-account-create-update-7bmhp" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.799228 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9lkg\" (UniqueName: \"kubernetes.io/projected/0e0ce82f-aea7-41ff-abe9-fae70dc076ed-kube-api-access-c9lkg\") pod \"cinder-db-create-qss72\" (UID: \"0e0ce82f-aea7-41ff-abe9-fae70dc076ed\") " pod="openstack/cinder-db-create-qss72" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.799364 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e0ce82f-aea7-41ff-abe9-fae70dc076ed-operator-scripts\") pod \"cinder-db-create-qss72\" (UID: \"0e0ce82f-aea7-41ff-abe9-fae70dc076ed\") " pod="openstack/cinder-db-create-qss72" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.799439 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0-operator-scripts\") pod \"cinder-670c-account-create-update-7bmhp\" 
(UID: \"ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0\") " pod="openstack/cinder-670c-account-create-update-7bmhp" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.900845 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxwxl\" (UniqueName: \"kubernetes.io/projected/ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0-kube-api-access-zxwxl\") pod \"cinder-670c-account-create-update-7bmhp\" (UID: \"ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0\") " pod="openstack/cinder-670c-account-create-update-7bmhp" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.900946 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9lkg\" (UniqueName: \"kubernetes.io/projected/0e0ce82f-aea7-41ff-abe9-fae70dc076ed-kube-api-access-c9lkg\") pod \"cinder-db-create-qss72\" (UID: \"0e0ce82f-aea7-41ff-abe9-fae70dc076ed\") " pod="openstack/cinder-db-create-qss72" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.901012 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e0ce82f-aea7-41ff-abe9-fae70dc076ed-operator-scripts\") pod \"cinder-db-create-qss72\" (UID: \"0e0ce82f-aea7-41ff-abe9-fae70dc076ed\") " pod="openstack/cinder-db-create-qss72" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.901046 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0-operator-scripts\") pod \"cinder-670c-account-create-update-7bmhp\" (UID: \"ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0\") " pod="openstack/cinder-670c-account-create-update-7bmhp" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.901892 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0-operator-scripts\") pod 
\"cinder-670c-account-create-update-7bmhp\" (UID: \"ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0\") " pod="openstack/cinder-670c-account-create-update-7bmhp" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.901917 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e0ce82f-aea7-41ff-abe9-fae70dc076ed-operator-scripts\") pod \"cinder-db-create-qss72\" (UID: \"0e0ce82f-aea7-41ff-abe9-fae70dc076ed\") " pod="openstack/cinder-db-create-qss72" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.922337 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxwxl\" (UniqueName: \"kubernetes.io/projected/ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0-kube-api-access-zxwxl\") pod \"cinder-670c-account-create-update-7bmhp\" (UID: \"ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0\") " pod="openstack/cinder-670c-account-create-update-7bmhp" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.933281 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9lkg\" (UniqueName: \"kubernetes.io/projected/0e0ce82f-aea7-41ff-abe9-fae70dc076ed-kube-api-access-c9lkg\") pod \"cinder-db-create-qss72\" (UID: \"0e0ce82f-aea7-41ff-abe9-fae70dc076ed\") " pod="openstack/cinder-db-create-qss72" Mar 14 08:58:33 crc kubenswrapper[5129]: I0314 08:58:33.937447 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qss72" Mar 14 08:58:34 crc kubenswrapper[5129]: I0314 08:58:34.024636 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-670c-account-create-update-7bmhp" Mar 14 08:58:34 crc kubenswrapper[5129]: I0314 08:58:34.442366 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qss72"] Mar 14 08:58:34 crc kubenswrapper[5129]: I0314 08:58:34.582768 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-670c-account-create-update-7bmhp"] Mar 14 08:58:35 crc kubenswrapper[5129]: I0314 08:58:35.385946 5129 generic.go:334] "Generic (PLEG): container finished" podID="ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0" containerID="1f3c731844a0a51497e284a579f1961efea26e1c4cce48a26e6d64a5fe3690bb" exitCode=0 Mar 14 08:58:35 crc kubenswrapper[5129]: I0314 08:58:35.386021 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-670c-account-create-update-7bmhp" event={"ID":"ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0","Type":"ContainerDied","Data":"1f3c731844a0a51497e284a579f1961efea26e1c4cce48a26e6d64a5fe3690bb"} Mar 14 08:58:35 crc kubenswrapper[5129]: I0314 08:58:35.387172 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-670c-account-create-update-7bmhp" event={"ID":"ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0","Type":"ContainerStarted","Data":"ef57c2435f8f6319279e528c797340fbcd61538408d2594ba57ec4d8df74b65c"} Mar 14 08:58:35 crc kubenswrapper[5129]: I0314 08:58:35.391593 5129 generic.go:334] "Generic (PLEG): container finished" podID="0e0ce82f-aea7-41ff-abe9-fae70dc076ed" containerID="f742be5079eb882e6135b4262407da57eb4a26191b27deacddb634561c9cb989" exitCode=0 Mar 14 08:58:35 crc kubenswrapper[5129]: I0314 08:58:35.391675 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qss72" event={"ID":"0e0ce82f-aea7-41ff-abe9-fae70dc076ed","Type":"ContainerDied","Data":"f742be5079eb882e6135b4262407da57eb4a26191b27deacddb634561c9cb989"} Mar 14 08:58:35 crc kubenswrapper[5129]: I0314 08:58:35.391733 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-create-qss72" event={"ID":"0e0ce82f-aea7-41ff-abe9-fae70dc076ed","Type":"ContainerStarted","Data":"552c59b487e4e0fc1a9ce9704cec9417f06a866b370a3f1cf0de2f532766ee83"} Mar 14 08:58:36 crc kubenswrapper[5129]: I0314 08:58:36.833673 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-670c-account-create-update-7bmhp" Mar 14 08:58:36 crc kubenswrapper[5129]: I0314 08:58:36.840064 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qss72" Mar 14 08:58:36 crc kubenswrapper[5129]: I0314 08:58:36.894919 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0-operator-scripts\") pod \"ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0\" (UID: \"ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0\") " Mar 14 08:58:36 crc kubenswrapper[5129]: I0314 08:58:36.895012 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9lkg\" (UniqueName: \"kubernetes.io/projected/0e0ce82f-aea7-41ff-abe9-fae70dc076ed-kube-api-access-c9lkg\") pod \"0e0ce82f-aea7-41ff-abe9-fae70dc076ed\" (UID: \"0e0ce82f-aea7-41ff-abe9-fae70dc076ed\") " Mar 14 08:58:36 crc kubenswrapper[5129]: I0314 08:58:36.895044 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxwxl\" (UniqueName: \"kubernetes.io/projected/ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0-kube-api-access-zxwxl\") pod \"ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0\" (UID: \"ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0\") " Mar 14 08:58:36 crc kubenswrapper[5129]: I0314 08:58:36.895119 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e0ce82f-aea7-41ff-abe9-fae70dc076ed-operator-scripts\") pod \"0e0ce82f-aea7-41ff-abe9-fae70dc076ed\" (UID: 
\"0e0ce82f-aea7-41ff-abe9-fae70dc076ed\") " Mar 14 08:58:36 crc kubenswrapper[5129]: I0314 08:58:36.895780 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e0ce82f-aea7-41ff-abe9-fae70dc076ed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e0ce82f-aea7-41ff-abe9-fae70dc076ed" (UID: "0e0ce82f-aea7-41ff-abe9-fae70dc076ed"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:58:36 crc kubenswrapper[5129]: I0314 08:58:36.895807 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0" (UID: "ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:58:36 crc kubenswrapper[5129]: I0314 08:58:36.902403 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0-kube-api-access-zxwxl" (OuterVolumeSpecName: "kube-api-access-zxwxl") pod "ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0" (UID: "ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0"). InnerVolumeSpecName "kube-api-access-zxwxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:58:36 crc kubenswrapper[5129]: I0314 08:58:36.908131 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e0ce82f-aea7-41ff-abe9-fae70dc076ed-kube-api-access-c9lkg" (OuterVolumeSpecName: "kube-api-access-c9lkg") pod "0e0ce82f-aea7-41ff-abe9-fae70dc076ed" (UID: "0e0ce82f-aea7-41ff-abe9-fae70dc076ed"). InnerVolumeSpecName "kube-api-access-c9lkg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:58:36 crc kubenswrapper[5129]: I0314 08:58:36.997466 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:36 crc kubenswrapper[5129]: I0314 08:58:36.997500 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9lkg\" (UniqueName: \"kubernetes.io/projected/0e0ce82f-aea7-41ff-abe9-fae70dc076ed-kube-api-access-c9lkg\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:36 crc kubenswrapper[5129]: I0314 08:58:36.997514 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxwxl\" (UniqueName: \"kubernetes.io/projected/ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0-kube-api-access-zxwxl\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:36 crc kubenswrapper[5129]: I0314 08:58:36.997523 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e0ce82f-aea7-41ff-abe9-fae70dc076ed-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:37 crc kubenswrapper[5129]: I0314 08:58:37.426352 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-670c-account-create-update-7bmhp" event={"ID":"ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0","Type":"ContainerDied","Data":"ef57c2435f8f6319279e528c797340fbcd61538408d2594ba57ec4d8df74b65c"} Mar 14 08:58:37 crc kubenswrapper[5129]: I0314 08:58:37.426457 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef57c2435f8f6319279e528c797340fbcd61538408d2594ba57ec4d8df74b65c" Mar 14 08:58:37 crc kubenswrapper[5129]: I0314 08:58:37.426408 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-670c-account-create-update-7bmhp" Mar 14 08:58:37 crc kubenswrapper[5129]: I0314 08:58:37.431466 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qss72" event={"ID":"0e0ce82f-aea7-41ff-abe9-fae70dc076ed","Type":"ContainerDied","Data":"552c59b487e4e0fc1a9ce9704cec9417f06a866b370a3f1cf0de2f532766ee83"} Mar 14 08:58:37 crc kubenswrapper[5129]: I0314 08:58:37.431550 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="552c59b487e4e0fc1a9ce9704cec9417f06a866b370a3f1cf0de2f532766ee83" Mar 14 08:58:37 crc kubenswrapper[5129]: I0314 08:58:37.431506 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qss72" Mar 14 08:58:38 crc kubenswrapper[5129]: I0314 08:58:38.887642 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-2zxg6"] Mar 14 08:58:38 crc kubenswrapper[5129]: E0314 08:58:38.888250 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0ce82f-aea7-41ff-abe9-fae70dc076ed" containerName="mariadb-database-create" Mar 14 08:58:38 crc kubenswrapper[5129]: I0314 08:58:38.888262 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0ce82f-aea7-41ff-abe9-fae70dc076ed" containerName="mariadb-database-create" Mar 14 08:58:38 crc kubenswrapper[5129]: E0314 08:58:38.888284 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0" containerName="mariadb-account-create-update" Mar 14 08:58:38 crc kubenswrapper[5129]: I0314 08:58:38.888291 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0" containerName="mariadb-account-create-update" Mar 14 08:58:38 crc kubenswrapper[5129]: I0314 08:58:38.888444 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e0ce82f-aea7-41ff-abe9-fae70dc076ed" containerName="mariadb-database-create" Mar 14 
08:58:38 crc kubenswrapper[5129]: I0314 08:58:38.888470 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0" containerName="mariadb-account-create-update" Mar 14 08:58:38 crc kubenswrapper[5129]: I0314 08:58:38.889068 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:38 crc kubenswrapper[5129]: I0314 08:58:38.892048 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xsf56" Mar 14 08:58:38 crc kubenswrapper[5129]: I0314 08:58:38.892298 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 14 08:58:38 crc kubenswrapper[5129]: I0314 08:58:38.892728 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 14 08:58:38 crc kubenswrapper[5129]: I0314 08:58:38.905426 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-2zxg6"] Mar 14 08:58:38 crc kubenswrapper[5129]: I0314 08:58:38.937704 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-scripts\") pod \"cinder-db-sync-2zxg6\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:38 crc kubenswrapper[5129]: I0314 08:58:38.937758 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bebf2c98-990d-442f-9e7e-a0ee1bf73280-etc-machine-id\") pod \"cinder-db-sync-2zxg6\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:38 crc kubenswrapper[5129]: I0314 08:58:38.937839 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htzcr\" 
(UniqueName: \"kubernetes.io/projected/bebf2c98-990d-442f-9e7e-a0ee1bf73280-kube-api-access-htzcr\") pod \"cinder-db-sync-2zxg6\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:38 crc kubenswrapper[5129]: I0314 08:58:38.937892 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-db-sync-config-data\") pod \"cinder-db-sync-2zxg6\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:38 crc kubenswrapper[5129]: I0314 08:58:38.937947 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-combined-ca-bundle\") pod \"cinder-db-sync-2zxg6\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:38 crc kubenswrapper[5129]: I0314 08:58:38.938035 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-config-data\") pod \"cinder-db-sync-2zxg6\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:39 crc kubenswrapper[5129]: I0314 08:58:39.039428 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-config-data\") pod \"cinder-db-sync-2zxg6\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:39 crc kubenswrapper[5129]: I0314 08:58:39.039518 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-scripts\") 
pod \"cinder-db-sync-2zxg6\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:39 crc kubenswrapper[5129]: I0314 08:58:39.039551 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bebf2c98-990d-442f-9e7e-a0ee1bf73280-etc-machine-id\") pod \"cinder-db-sync-2zxg6\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:39 crc kubenswrapper[5129]: I0314 08:58:39.039620 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htzcr\" (UniqueName: \"kubernetes.io/projected/bebf2c98-990d-442f-9e7e-a0ee1bf73280-kube-api-access-htzcr\") pod \"cinder-db-sync-2zxg6\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:39 crc kubenswrapper[5129]: I0314 08:58:39.039674 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-db-sync-config-data\") pod \"cinder-db-sync-2zxg6\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:39 crc kubenswrapper[5129]: I0314 08:58:39.039719 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-combined-ca-bundle\") pod \"cinder-db-sync-2zxg6\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:39 crc kubenswrapper[5129]: I0314 08:58:39.040463 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bebf2c98-990d-442f-9e7e-a0ee1bf73280-etc-machine-id\") pod \"cinder-db-sync-2zxg6\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " pod="openstack/cinder-db-sync-2zxg6" 
Mar 14 08:58:39 crc kubenswrapper[5129]: I0314 08:58:39.046092 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-combined-ca-bundle\") pod \"cinder-db-sync-2zxg6\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:39 crc kubenswrapper[5129]: I0314 08:58:39.046401 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-config-data\") pod \"cinder-db-sync-2zxg6\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:39 crc kubenswrapper[5129]: I0314 08:58:39.047690 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-db-sync-config-data\") pod \"cinder-db-sync-2zxg6\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:39 crc kubenswrapper[5129]: I0314 08:58:39.047966 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-scripts\") pod \"cinder-db-sync-2zxg6\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:39 crc kubenswrapper[5129]: I0314 08:58:39.057394 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htzcr\" (UniqueName: \"kubernetes.io/projected/bebf2c98-990d-442f-9e7e-a0ee1bf73280-kube-api-access-htzcr\") pod \"cinder-db-sync-2zxg6\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:39 crc kubenswrapper[5129]: I0314 08:58:39.222456 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:58:39 crc kubenswrapper[5129]: I0314 08:58:39.687267 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-2zxg6"] Mar 14 08:58:40 crc kubenswrapper[5129]: I0314 08:58:40.463336 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2zxg6" event={"ID":"bebf2c98-990d-442f-9e7e-a0ee1bf73280","Type":"ContainerStarted","Data":"029a854c2e920ebbb954788b1895eba278a0495dccca1cff7133c5b6e53d5f7e"} Mar 14 08:58:42 crc kubenswrapper[5129]: I0314 08:58:42.037030 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 08:58:42 crc kubenswrapper[5129]: E0314 08:58:42.037647 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:58:56 crc kubenswrapper[5129]: I0314 08:58:56.040157 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 08:58:56 crc kubenswrapper[5129]: E0314 08:58:56.040993 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:59:03 crc kubenswrapper[5129]: E0314 08:59:03.473480 5129 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:7002c9136b77c6990bfebf085d6871b3" Mar 14 08:59:03 crc kubenswrapper[5129]: E0314 08:59:03.474038 5129 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:7002c9136b77c6990bfebf085d6871b3" Mar 14 08:59:03 crc kubenswrapper[5129]: E0314 08:59:03.474200 5129 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:7002c9136b77c6990bfebf085d6871b3,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:
nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-htzcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-2zxg6_openstack(bebf2c98-990d-442f-9e7e-a0ee1bf73280): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 08:59:03 crc kubenswrapper[5129]: E0314 08:59:03.476149 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-2zxg6" podUID="bebf2c98-990d-442f-9e7e-a0ee1bf73280" Mar 14 08:59:03 crc kubenswrapper[5129]: E0314 08:59:03.663514 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:7002c9136b77c6990bfebf085d6871b3\\\"\"" pod="openstack/cinder-db-sync-2zxg6" podUID="bebf2c98-990d-442f-9e7e-a0ee1bf73280" Mar 14 08:59:07 crc 
kubenswrapper[5129]: I0314 08:59:07.037482 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 08:59:07 crc kubenswrapper[5129]: E0314 08:59:07.038407 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:59:16 crc kubenswrapper[5129]: I0314 08:59:16.812072 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2zxg6" event={"ID":"bebf2c98-990d-442f-9e7e-a0ee1bf73280","Type":"ContainerStarted","Data":"b4f01e7b018f2109951d066ddc8241a08eb628110228c9223f3c0295305d9cf4"} Mar 14 08:59:16 crc kubenswrapper[5129]: I0314 08:59:16.846288 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-2zxg6" podStartSLOduration=3.02752929 podStartE2EDuration="38.846265169s" podCreationTimestamp="2026-03-14 08:58:38 +0000 UTC" firstStartedPulling="2026-03-14 08:58:39.695387001 +0000 UTC m=+7182.447302185" lastFinishedPulling="2026-03-14 08:59:15.51412287 +0000 UTC m=+7218.266038064" observedRunningTime="2026-03-14 08:59:16.842980941 +0000 UTC m=+7219.594896125" watchObservedRunningTime="2026-03-14 08:59:16.846265169 +0000 UTC m=+7219.598180363" Mar 14 08:59:21 crc kubenswrapper[5129]: I0314 08:59:21.860237 5129 generic.go:334] "Generic (PLEG): container finished" podID="bebf2c98-990d-442f-9e7e-a0ee1bf73280" containerID="b4f01e7b018f2109951d066ddc8241a08eb628110228c9223f3c0295305d9cf4" exitCode=0 Mar 14 08:59:21 crc kubenswrapper[5129]: I0314 08:59:21.860360 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2zxg6" 
event={"ID":"bebf2c98-990d-442f-9e7e-a0ee1bf73280","Type":"ContainerDied","Data":"b4f01e7b018f2109951d066ddc8241a08eb628110228c9223f3c0295305d9cf4"} Mar 14 08:59:22 crc kubenswrapper[5129]: I0314 08:59:22.037340 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 08:59:22 crc kubenswrapper[5129]: E0314 08:59:22.037858 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.153376 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.238510 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-combined-ca-bundle\") pod \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.238632 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htzcr\" (UniqueName: \"kubernetes.io/projected/bebf2c98-990d-442f-9e7e-a0ee1bf73280-kube-api-access-htzcr\") pod \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.238713 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-scripts\") pod 
\"bebf2c98-990d-442f-9e7e-a0ee1bf73280\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.238808 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-db-sync-config-data\") pod \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.238848 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-config-data\") pod \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.238900 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bebf2c98-990d-442f-9e7e-a0ee1bf73280-etc-machine-id\") pod \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\" (UID: \"bebf2c98-990d-442f-9e7e-a0ee1bf73280\") " Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.239078 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bebf2c98-990d-442f-9e7e-a0ee1bf73280-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bebf2c98-990d-442f-9e7e-a0ee1bf73280" (UID: "bebf2c98-990d-442f-9e7e-a0ee1bf73280"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.239401 5129 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bebf2c98-990d-442f-9e7e-a0ee1bf73280-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.244807 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bebf2c98-990d-442f-9e7e-a0ee1bf73280" (UID: "bebf2c98-990d-442f-9e7e-a0ee1bf73280"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.244831 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-scripts" (OuterVolumeSpecName: "scripts") pod "bebf2c98-990d-442f-9e7e-a0ee1bf73280" (UID: "bebf2c98-990d-442f-9e7e-a0ee1bf73280"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.245251 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bebf2c98-990d-442f-9e7e-a0ee1bf73280-kube-api-access-htzcr" (OuterVolumeSpecName: "kube-api-access-htzcr") pod "bebf2c98-990d-442f-9e7e-a0ee1bf73280" (UID: "bebf2c98-990d-442f-9e7e-a0ee1bf73280"). InnerVolumeSpecName "kube-api-access-htzcr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.263227 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bebf2c98-990d-442f-9e7e-a0ee1bf73280" (UID: "bebf2c98-990d-442f-9e7e-a0ee1bf73280"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.283798 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-config-data" (OuterVolumeSpecName: "config-data") pod "bebf2c98-990d-442f-9e7e-a0ee1bf73280" (UID: "bebf2c98-990d-442f-9e7e-a0ee1bf73280"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.341515 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.341568 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htzcr\" (UniqueName: \"kubernetes.io/projected/bebf2c98-990d-442f-9e7e-a0ee1bf73280-kube-api-access-htzcr\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.341584 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.341619 5129 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-db-sync-config-data\") on node \"crc\" 
DevicePath \"\"" Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.341632 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bebf2c98-990d-442f-9e7e-a0ee1bf73280-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.880161 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2zxg6" event={"ID":"bebf2c98-990d-442f-9e7e-a0ee1bf73280","Type":"ContainerDied","Data":"029a854c2e920ebbb954788b1895eba278a0495dccca1cff7133c5b6e53d5f7e"} Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.880220 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="029a854c2e920ebbb954788b1895eba278a0495dccca1cff7133c5b6e53d5f7e" Mar 14 08:59:23 crc kubenswrapper[5129]: I0314 08:59:23.880312 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2zxg6" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.236957 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84667d55b7-tbvxl"] Mar 14 08:59:24 crc kubenswrapper[5129]: E0314 08:59:24.237906 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bebf2c98-990d-442f-9e7e-a0ee1bf73280" containerName="cinder-db-sync" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.237927 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="bebf2c98-990d-442f-9e7e-a0ee1bf73280" containerName="cinder-db-sync" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.238199 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="bebf2c98-990d-442f-9e7e-a0ee1bf73280" containerName="cinder-db-sync" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.239491 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.267373 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84667d55b7-tbvxl"] Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.368889 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-ovsdbserver-sb\") pod \"dnsmasq-dns-84667d55b7-tbvxl\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.368993 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-config\") pod \"dnsmasq-dns-84667d55b7-tbvxl\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.369085 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-ovsdbserver-nb\") pod \"dnsmasq-dns-84667d55b7-tbvxl\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.369137 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-664j6\" (UniqueName: \"kubernetes.io/projected/5db4cd12-35cc-4168-9e0c-d966640d5a78-kube-api-access-664j6\") pod \"dnsmasq-dns-84667d55b7-tbvxl\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.369159 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-dns-svc\") pod \"dnsmasq-dns-84667d55b7-tbvxl\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.369934 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.371527 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.375027 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.377201 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xsf56" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.377492 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.380026 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.387492 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.471255 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-664j6\" (UniqueName: \"kubernetes.io/projected/5db4cd12-35cc-4168-9e0c-d966640d5a78-kube-api-access-664j6\") pod \"dnsmasq-dns-84667d55b7-tbvxl\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.471317 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-dns-svc\") pod \"dnsmasq-dns-84667d55b7-tbvxl\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.471367 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-ovsdbserver-sb\") pod \"dnsmasq-dns-84667d55b7-tbvxl\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.471425 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-config\") pod \"dnsmasq-dns-84667d55b7-tbvxl\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.471468 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-ovsdbserver-nb\") pod \"dnsmasq-dns-84667d55b7-tbvxl\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.472516 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-ovsdbserver-nb\") pod \"dnsmasq-dns-84667d55b7-tbvxl\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.472542 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-ovsdbserver-sb\") pod 
\"dnsmasq-dns-84667d55b7-tbvxl\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.472568 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-dns-svc\") pod \"dnsmasq-dns-84667d55b7-tbvxl\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.473071 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-config\") pod \"dnsmasq-dns-84667d55b7-tbvxl\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.488988 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-664j6\" (UniqueName: \"kubernetes.io/projected/5db4cd12-35cc-4168-9e0c-d966640d5a78-kube-api-access-664j6\") pod \"dnsmasq-dns-84667d55b7-tbvxl\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.561843 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.573210 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-scripts\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.573263 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-etc-machine-id\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.573289 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-logs\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.573423 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqcz4\" (UniqueName: \"kubernetes.io/projected/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-kube-api-access-kqcz4\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.573464 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-config-data\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.573521 5129 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.573556 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-config-data-custom\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.676236 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqcz4\" (UniqueName: \"kubernetes.io/projected/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-kube-api-access-kqcz4\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.676305 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-config-data\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.676332 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.676359 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-config-data-custom\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.676430 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-scripts\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.676455 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-etc-machine-id\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.676475 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-logs\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.677239 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-logs\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.677308 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-etc-machine-id\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.683731 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-config-data-custom\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.687351 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-scripts\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.688360 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.688451 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-config-data\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.753499 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqcz4\" (UniqueName: \"kubernetes.io/projected/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-kube-api-access-kqcz4\") pod \"cinder-api-0\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " pod="openstack/cinder-api-0" Mar 14 08:59:24 crc kubenswrapper[5129]: I0314 08:59:24.988270 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 14 08:59:25 crc kubenswrapper[5129]: I0314 08:59:25.097092 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84667d55b7-tbvxl"] Mar 14 08:59:25 crc kubenswrapper[5129]: W0314 08:59:25.119349 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5db4cd12_35cc_4168_9e0c_d966640d5a78.slice/crio-4778f023cf0417bdfa0fd621dbc511d7ab939dbcf2ec2f9676f9c5391718ccd8 WatchSource:0}: Error finding container 4778f023cf0417bdfa0fd621dbc511d7ab939dbcf2ec2f9676f9c5391718ccd8: Status 404 returned error can't find the container with id 4778f023cf0417bdfa0fd621dbc511d7ab939dbcf2ec2f9676f9c5391718ccd8 Mar 14 08:59:25 crc kubenswrapper[5129]: W0314 08:59:25.445042 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73c90a8b_9f9f_4527_9306_62ba15eaa4ba.slice/crio-b63c87747f610aab602547696496e09dfd9d5a393b0e9e24ce9c31ab07c77b16 WatchSource:0}: Error finding container b63c87747f610aab602547696496e09dfd9d5a393b0e9e24ce9c31ab07c77b16: Status 404 returned error can't find the container with id b63c87747f610aab602547696496e09dfd9d5a393b0e9e24ce9c31ab07c77b16 Mar 14 08:59:25 crc kubenswrapper[5129]: I0314 08:59:25.447540 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 14 08:59:25 crc kubenswrapper[5129]: I0314 08:59:25.916589 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"73c90a8b-9f9f-4527-9306-62ba15eaa4ba","Type":"ContainerStarted","Data":"b63c87747f610aab602547696496e09dfd9d5a393b0e9e24ce9c31ab07c77b16"} Mar 14 08:59:25 crc kubenswrapper[5129]: I0314 08:59:25.918574 5129 generic.go:334] "Generic (PLEG): container finished" podID="5db4cd12-35cc-4168-9e0c-d966640d5a78" containerID="31b4b18dc2e389e1ea0861d8d9c46699992d1c9bdd5b3207e3194f3a1b09a456" 
exitCode=0 Mar 14 08:59:25 crc kubenswrapper[5129]: I0314 08:59:25.918635 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" event={"ID":"5db4cd12-35cc-4168-9e0c-d966640d5a78","Type":"ContainerDied","Data":"31b4b18dc2e389e1ea0861d8d9c46699992d1c9bdd5b3207e3194f3a1b09a456"} Mar 14 08:59:25 crc kubenswrapper[5129]: I0314 08:59:25.918693 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" event={"ID":"5db4cd12-35cc-4168-9e0c-d966640d5a78","Type":"ContainerStarted","Data":"4778f023cf0417bdfa0fd621dbc511d7ab939dbcf2ec2f9676f9c5391718ccd8"} Mar 14 08:59:26 crc kubenswrapper[5129]: I0314 08:59:26.664845 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 14 08:59:26 crc kubenswrapper[5129]: I0314 08:59:26.930549 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"73c90a8b-9f9f-4527-9306-62ba15eaa4ba","Type":"ContainerStarted","Data":"c0193abded00c76572576ba079a2e89579232fdcf1e12b41363a4bc2d12e7e14"} Mar 14 08:59:26 crc kubenswrapper[5129]: I0314 08:59:26.930948 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"73c90a8b-9f9f-4527-9306-62ba15eaa4ba","Type":"ContainerStarted","Data":"47151a473f4b42d16ecef9cf44def8caca19a4c65f1654951fd70f3bd534883e"} Mar 14 08:59:26 crc kubenswrapper[5129]: I0314 08:59:26.930746 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="73c90a8b-9f9f-4527-9306-62ba15eaa4ba" containerName="cinder-api-log" containerID="cri-o://47151a473f4b42d16ecef9cf44def8caca19a4c65f1654951fd70f3bd534883e" gracePeriod=30 Mar 14 08:59:26 crc kubenswrapper[5129]: I0314 08:59:26.930977 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 14 08:59:26 crc kubenswrapper[5129]: I0314 08:59:26.930857 5129 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/cinder-api-0" podUID="73c90a8b-9f9f-4527-9306-62ba15eaa4ba" containerName="cinder-api" containerID="cri-o://c0193abded00c76572576ba079a2e89579232fdcf1e12b41363a4bc2d12e7e14" gracePeriod=30 Mar 14 08:59:26 crc kubenswrapper[5129]: I0314 08:59:26.934687 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" event={"ID":"5db4cd12-35cc-4168-9e0c-d966640d5a78","Type":"ContainerStarted","Data":"ee8e8fde444c65972529b5a42bc21d7f38174e4edeb3492278cf8f3a62d4367a"} Mar 14 08:59:26 crc kubenswrapper[5129]: I0314 08:59:26.935031 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:26 crc kubenswrapper[5129]: I0314 08:59:26.967729 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.967695253 podStartE2EDuration="2.967695253s" podCreationTimestamp="2026-03-14 08:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:26.964159507 +0000 UTC m=+7229.716074701" watchObservedRunningTime="2026-03-14 08:59:26.967695253 +0000 UTC m=+7229.719610437" Mar 14 08:59:27 crc kubenswrapper[5129]: I0314 08:59:27.005123 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" podStartSLOduration=3.005094238 podStartE2EDuration="3.005094238s" podCreationTimestamp="2026-03-14 08:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:59:26.993184934 +0000 UTC m=+7229.745100118" watchObservedRunningTime="2026-03-14 08:59:27.005094238 +0000 UTC m=+7229.757009422" Mar 14 08:59:27 crc kubenswrapper[5129]: I0314 08:59:27.948959 5129 generic.go:334] "Generic (PLEG): container finished" 
podID="73c90a8b-9f9f-4527-9306-62ba15eaa4ba" containerID="47151a473f4b42d16ecef9cf44def8caca19a4c65f1654951fd70f3bd534883e" exitCode=143 Mar 14 08:59:27 crc kubenswrapper[5129]: I0314 08:59:27.949062 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"73c90a8b-9f9f-4527-9306-62ba15eaa4ba","Type":"ContainerDied","Data":"47151a473f4b42d16ecef9cf44def8caca19a4c65f1654951fd70f3bd534883e"} Mar 14 08:59:27 crc kubenswrapper[5129]: I0314 08:59:27.957405 5129 scope.go:117] "RemoveContainer" containerID="f31c05ff22a0a586c3c03277071c6ddb9534c1b40411b6420de498964f888eb6" Mar 14 08:59:34 crc kubenswrapper[5129]: I0314 08:59:34.036399 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 08:59:34 crc kubenswrapper[5129]: E0314 08:59:34.037281 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:59:34 crc kubenswrapper[5129]: I0314 08:59:34.563812 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 08:59:34 crc kubenswrapper[5129]: I0314 08:59:34.628310 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b6f5d8457-qpds4"] Mar 14 08:59:34 crc kubenswrapper[5129]: I0314 08:59:34.628580 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" podUID="09f2e1cd-1b80-4b94-8187-a21ebbe5f93d" containerName="dnsmasq-dns" containerID="cri-o://b1460d89c7377d038a7aa5adbec8de2f2d4c5afdbb0f08d952b2ec09be869e1f" gracePeriod=10 Mar 14 
08:59:35 crc kubenswrapper[5129]: I0314 08:59:35.034260 5129 generic.go:334] "Generic (PLEG): container finished" podID="09f2e1cd-1b80-4b94-8187-a21ebbe5f93d" containerID="b1460d89c7377d038a7aa5adbec8de2f2d4c5afdbb0f08d952b2ec09be869e1f" exitCode=0 Mar 14 08:59:35 crc kubenswrapper[5129]: I0314 08:59:35.034538 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" event={"ID":"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d","Type":"ContainerDied","Data":"b1460d89c7377d038a7aa5adbec8de2f2d4c5afdbb0f08d952b2ec09be869e1f"} Mar 14 08:59:35 crc kubenswrapper[5129]: I0314 08:59:35.130112 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:59:35 crc kubenswrapper[5129]: I0314 08:59:35.327428 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-ovsdbserver-nb\") pod \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " Mar 14 08:59:35 crc kubenswrapper[5129]: I0314 08:59:35.327625 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-config\") pod \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " Mar 14 08:59:35 crc kubenswrapper[5129]: I0314 08:59:35.327704 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-dns-svc\") pod \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " Mar 14 08:59:35 crc kubenswrapper[5129]: I0314 08:59:35.327769 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nktz\" (UniqueName: 
\"kubernetes.io/projected/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-kube-api-access-6nktz\") pod \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " Mar 14 08:59:35 crc kubenswrapper[5129]: I0314 08:59:35.327845 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-ovsdbserver-sb\") pod \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\" (UID: \"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d\") " Mar 14 08:59:35 crc kubenswrapper[5129]: I0314 08:59:35.341197 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-kube-api-access-6nktz" (OuterVolumeSpecName: "kube-api-access-6nktz") pod "09f2e1cd-1b80-4b94-8187-a21ebbe5f93d" (UID: "09f2e1cd-1b80-4b94-8187-a21ebbe5f93d"). InnerVolumeSpecName "kube-api-access-6nktz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:59:35 crc kubenswrapper[5129]: I0314 08:59:35.383433 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "09f2e1cd-1b80-4b94-8187-a21ebbe5f93d" (UID: "09f2e1cd-1b80-4b94-8187-a21ebbe5f93d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:59:35 crc kubenswrapper[5129]: I0314 08:59:35.387619 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "09f2e1cd-1b80-4b94-8187-a21ebbe5f93d" (UID: "09f2e1cd-1b80-4b94-8187-a21ebbe5f93d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:59:35 crc kubenswrapper[5129]: I0314 08:59:35.388392 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "09f2e1cd-1b80-4b94-8187-a21ebbe5f93d" (UID: "09f2e1cd-1b80-4b94-8187-a21ebbe5f93d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:59:35 crc kubenswrapper[5129]: I0314 08:59:35.394456 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-config" (OuterVolumeSpecName: "config") pod "09f2e1cd-1b80-4b94-8187-a21ebbe5f93d" (UID: "09f2e1cd-1b80-4b94-8187-a21ebbe5f93d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:59:35 crc kubenswrapper[5129]: I0314 08:59:35.431161 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:35 crc kubenswrapper[5129]: I0314 08:59:35.431204 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:35 crc kubenswrapper[5129]: I0314 08:59:35.431213 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:35 crc kubenswrapper[5129]: I0314 08:59:35.431225 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nktz\" (UniqueName: \"kubernetes.io/projected/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-kube-api-access-6nktz\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:35 crc 
kubenswrapper[5129]: I0314 08:59:35.431236 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:36 crc kubenswrapper[5129]: I0314 08:59:36.064788 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" Mar 14 08:59:36 crc kubenswrapper[5129]: I0314 08:59:36.067417 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6f5d8457-qpds4" event={"ID":"09f2e1cd-1b80-4b94-8187-a21ebbe5f93d","Type":"ContainerDied","Data":"eb5e5e97fdb4c7992416869d7a4b11ff504781453c83fb8dc91df2c6df227614"} Mar 14 08:59:36 crc kubenswrapper[5129]: I0314 08:59:36.067469 5129 scope.go:117] "RemoveContainer" containerID="b1460d89c7377d038a7aa5adbec8de2f2d4c5afdbb0f08d952b2ec09be869e1f" Mar 14 08:59:36 crc kubenswrapper[5129]: I0314 08:59:36.110468 5129 scope.go:117] "RemoveContainer" containerID="337c017805d22dfec9e7bf121dae1bd5d163d72834197b9372a9914c2415d144" Mar 14 08:59:36 crc kubenswrapper[5129]: I0314 08:59:36.116017 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b6f5d8457-qpds4"] Mar 14 08:59:36 crc kubenswrapper[5129]: I0314 08:59:36.128853 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b6f5d8457-qpds4"] Mar 14 08:59:36 crc kubenswrapper[5129]: I0314 08:59:36.947260 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 14 08:59:38 crc kubenswrapper[5129]: I0314 08:59:38.050962 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09f2e1cd-1b80-4b94-8187-a21ebbe5f93d" path="/var/lib/kubelet/pods/09f2e1cd-1b80-4b94-8187-a21ebbe5f93d/volumes" Mar 14 08:59:48 crc kubenswrapper[5129]: I0314 08:59:48.045178 5129 scope.go:117] "RemoveContainer" 
containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 08:59:48 crc kubenswrapper[5129]: E0314 08:59:48.045905 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.311850 5129 generic.go:334] "Generic (PLEG): container finished" podID="73c90a8b-9f9f-4527-9306-62ba15eaa4ba" containerID="c0193abded00c76572576ba079a2e89579232fdcf1e12b41363a4bc2d12e7e14" exitCode=137 Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.311947 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"73c90a8b-9f9f-4527-9306-62ba15eaa4ba","Type":"ContainerDied","Data":"c0193abded00c76572576ba079a2e89579232fdcf1e12b41363a4bc2d12e7e14"} Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.457085 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.622586 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-config-data\") pod \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.622669 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-combined-ca-bundle\") pod \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.622747 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-logs\") pod \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.622808 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-config-data-custom\") pod \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.622896 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqcz4\" (UniqueName: \"kubernetes.io/projected/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-kube-api-access-kqcz4\") pod \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.622921 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-etc-machine-id\") pod \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.622983 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-scripts\") pod \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\" (UID: \"73c90a8b-9f9f-4527-9306-62ba15eaa4ba\") " Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.623407 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-logs" (OuterVolumeSpecName: "logs") pod "73c90a8b-9f9f-4527-9306-62ba15eaa4ba" (UID: "73c90a8b-9f9f-4527-9306-62ba15eaa4ba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.623445 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "73c90a8b-9f9f-4527-9306-62ba15eaa4ba" (UID: "73c90a8b-9f9f-4527-9306-62ba15eaa4ba"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.624014 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.624036 5129 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.631241 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-scripts" (OuterVolumeSpecName: "scripts") pod "73c90a8b-9f9f-4527-9306-62ba15eaa4ba" (UID: "73c90a8b-9f9f-4527-9306-62ba15eaa4ba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.631336 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "73c90a8b-9f9f-4527-9306-62ba15eaa4ba" (UID: "73c90a8b-9f9f-4527-9306-62ba15eaa4ba"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.632038 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-kube-api-access-kqcz4" (OuterVolumeSpecName: "kube-api-access-kqcz4") pod "73c90a8b-9f9f-4527-9306-62ba15eaa4ba" (UID: "73c90a8b-9f9f-4527-9306-62ba15eaa4ba"). InnerVolumeSpecName "kube-api-access-kqcz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.655533 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73c90a8b-9f9f-4527-9306-62ba15eaa4ba" (UID: "73c90a8b-9f9f-4527-9306-62ba15eaa4ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.678383 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-config-data" (OuterVolumeSpecName: "config-data") pod "73c90a8b-9f9f-4527-9306-62ba15eaa4ba" (UID: "73c90a8b-9f9f-4527-9306-62ba15eaa4ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.726644 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.726704 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.726720 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.726732 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqcz4\" (UniqueName: \"kubernetes.io/projected/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-kube-api-access-kqcz4\") on node 
\"crc\" DevicePath \"\"" Mar 14 08:59:57 crc kubenswrapper[5129]: I0314 08:59:57.726743 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c90a8b-9f9f-4527-9306-62ba15eaa4ba-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.323963 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"73c90a8b-9f9f-4527-9306-62ba15eaa4ba","Type":"ContainerDied","Data":"b63c87747f610aab602547696496e09dfd9d5a393b0e9e24ce9c31ab07c77b16"} Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.324031 5129 scope.go:117] "RemoveContainer" containerID="c0193abded00c76572576ba079a2e89579232fdcf1e12b41363a4bc2d12e7e14" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.324054 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.357643 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.359155 5129 scope.go:117] "RemoveContainer" containerID="47151a473f4b42d16ecef9cf44def8caca19a4c65f1654951fd70f3bd534883e" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.367590 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.391503 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 14 08:59:58 crc kubenswrapper[5129]: E0314 08:59:58.392008 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c90a8b-9f9f-4527-9306-62ba15eaa4ba" containerName="cinder-api" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.392030 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="73c90a8b-9f9f-4527-9306-62ba15eaa4ba" containerName="cinder-api" Mar 14 08:59:58 crc kubenswrapper[5129]: E0314 08:59:58.392051 5129 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c90a8b-9f9f-4527-9306-62ba15eaa4ba" containerName="cinder-api-log" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.392061 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="73c90a8b-9f9f-4527-9306-62ba15eaa4ba" containerName="cinder-api-log" Mar 14 08:59:58 crc kubenswrapper[5129]: E0314 08:59:58.392085 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f2e1cd-1b80-4b94-8187-a21ebbe5f93d" containerName="init" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.392094 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f2e1cd-1b80-4b94-8187-a21ebbe5f93d" containerName="init" Mar 14 08:59:58 crc kubenswrapper[5129]: E0314 08:59:58.392107 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f2e1cd-1b80-4b94-8187-a21ebbe5f93d" containerName="dnsmasq-dns" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.392114 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f2e1cd-1b80-4b94-8187-a21ebbe5f93d" containerName="dnsmasq-dns" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.392273 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="73c90a8b-9f9f-4527-9306-62ba15eaa4ba" containerName="cinder-api-log" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.392290 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="09f2e1cd-1b80-4b94-8187-a21ebbe5f93d" containerName="dnsmasq-dns" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.392310 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="73c90a8b-9f9f-4527-9306-62ba15eaa4ba" containerName="cinder-api" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.395557 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.402321 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.402704 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.402903 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xsf56" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.408045 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.408799 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.409439 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.413118 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.542621 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.543111 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-scripts\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 
08:59:58.543211 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.543275 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bf00d63-a196-4bc1-8176-521b4e0c4695-logs\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.543361 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-config-data\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.543425 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4tfw\" (UniqueName: \"kubernetes.io/projected/3bf00d63-a196-4bc1-8176-521b4e0c4695-kube-api-access-g4tfw\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.543493 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3bf00d63-a196-4bc1-8176-521b4e0c4695-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.543843 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.543932 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-config-data-custom\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.645879 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.645998 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-config-data-custom\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.646057 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.646107 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-scripts\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " 
pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.646155 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.646188 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bf00d63-a196-4bc1-8176-521b4e0c4695-logs\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.646224 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-config-data\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.646257 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4tfw\" (UniqueName: \"kubernetes.io/projected/3bf00d63-a196-4bc1-8176-521b4e0c4695-kube-api-access-g4tfw\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.646295 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3bf00d63-a196-4bc1-8176-521b4e0c4695-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.646461 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/3bf00d63-a196-4bc1-8176-521b4e0c4695-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.647701 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bf00d63-a196-4bc1-8176-521b4e0c4695-logs\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.653860 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-scripts\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.654587 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.655885 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.661775 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.661889 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-config-data-custom\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.663787 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-config-data\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.673859 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4tfw\" (UniqueName: \"kubernetes.io/projected/3bf00d63-a196-4bc1-8176-521b4e0c4695-kube-api-access-g4tfw\") pod \"cinder-api-0\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " pod="openstack/cinder-api-0" Mar 14 08:59:58 crc kubenswrapper[5129]: I0314 08:59:58.726588 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 14 08:59:59 crc kubenswrapper[5129]: I0314 08:59:59.219195 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 14 08:59:59 crc kubenswrapper[5129]: W0314 08:59:59.222301 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bf00d63_a196_4bc1_8176_521b4e0c4695.slice/crio-5c2d0a9e7c41a6a48a936132651c2d994a8668381121b214e074283968f81581 WatchSource:0}: Error finding container 5c2d0a9e7c41a6a48a936132651c2d994a8668381121b214e074283968f81581: Status 404 returned error can't find the container with id 5c2d0a9e7c41a6a48a936132651c2d994a8668381121b214e074283968f81581 Mar 14 08:59:59 crc kubenswrapper[5129]: I0314 08:59:59.335000 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3bf00d63-a196-4bc1-8176-521b4e0c4695","Type":"ContainerStarted","Data":"5c2d0a9e7c41a6a48a936132651c2d994a8668381121b214e074283968f81581"} Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.070190 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73c90a8b-9f9f-4527-9306-62ba15eaa4ba" path="/var/lib/kubelet/pods/73c90a8b-9f9f-4527-9306-62ba15eaa4ba/volumes" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.150762 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557980-hc7f4"] Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.152574 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557980-hc7f4" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.154909 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.155529 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.155923 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.160495 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv"] Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.162113 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.166914 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.167118 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.172631 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557980-hc7f4"] Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.186775 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv"] Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.206596 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm9mh\" (UniqueName: 
\"kubernetes.io/projected/b8fc6210-3f18-4ea6-9c7e-040906265e9b-kube-api-access-xm9mh\") pod \"auto-csr-approver-29557980-hc7f4\" (UID: \"b8fc6210-3f18-4ea6-9c7e-040906265e9b\") " pod="openshift-infra/auto-csr-approver-29557980-hc7f4" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.206744 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6fee75f5-d499-42e6-92a2-794e1b325cd4-config-volume\") pod \"collect-profiles-29557980-w4tqv\" (UID: \"6fee75f5-d499-42e6-92a2-794e1b325cd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.206799 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmpc2\" (UniqueName: \"kubernetes.io/projected/6fee75f5-d499-42e6-92a2-794e1b325cd4-kube-api-access-cmpc2\") pod \"collect-profiles-29557980-w4tqv\" (UID: \"6fee75f5-d499-42e6-92a2-794e1b325cd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.207197 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6fee75f5-d499-42e6-92a2-794e1b325cd4-secret-volume\") pod \"collect-profiles-29557980-w4tqv\" (UID: \"6fee75f5-d499-42e6-92a2-794e1b325cd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.308552 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6fee75f5-d499-42e6-92a2-794e1b325cd4-secret-volume\") pod \"collect-profiles-29557980-w4tqv\" (UID: \"6fee75f5-d499-42e6-92a2-794e1b325cd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv" Mar 14 09:00:00 
crc kubenswrapper[5129]: I0314 09:00:00.309056 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm9mh\" (UniqueName: \"kubernetes.io/projected/b8fc6210-3f18-4ea6-9c7e-040906265e9b-kube-api-access-xm9mh\") pod \"auto-csr-approver-29557980-hc7f4\" (UID: \"b8fc6210-3f18-4ea6-9c7e-040906265e9b\") " pod="openshift-infra/auto-csr-approver-29557980-hc7f4" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.309230 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6fee75f5-d499-42e6-92a2-794e1b325cd4-config-volume\") pod \"collect-profiles-29557980-w4tqv\" (UID: \"6fee75f5-d499-42e6-92a2-794e1b325cd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.309324 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmpc2\" (UniqueName: \"kubernetes.io/projected/6fee75f5-d499-42e6-92a2-794e1b325cd4-kube-api-access-cmpc2\") pod \"collect-profiles-29557980-w4tqv\" (UID: \"6fee75f5-d499-42e6-92a2-794e1b325cd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.310127 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6fee75f5-d499-42e6-92a2-794e1b325cd4-config-volume\") pod \"collect-profiles-29557980-w4tqv\" (UID: \"6fee75f5-d499-42e6-92a2-794e1b325cd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.316484 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6fee75f5-d499-42e6-92a2-794e1b325cd4-secret-volume\") pod \"collect-profiles-29557980-w4tqv\" (UID: 
\"6fee75f5-d499-42e6-92a2-794e1b325cd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.325853 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm9mh\" (UniqueName: \"kubernetes.io/projected/b8fc6210-3f18-4ea6-9c7e-040906265e9b-kube-api-access-xm9mh\") pod \"auto-csr-approver-29557980-hc7f4\" (UID: \"b8fc6210-3f18-4ea6-9c7e-040906265e9b\") " pod="openshift-infra/auto-csr-approver-29557980-hc7f4" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.326872 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmpc2\" (UniqueName: \"kubernetes.io/projected/6fee75f5-d499-42e6-92a2-794e1b325cd4-kube-api-access-cmpc2\") pod \"collect-profiles-29557980-w4tqv\" (UID: \"6fee75f5-d499-42e6-92a2-794e1b325cd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.364831 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3bf00d63-a196-4bc1-8176-521b4e0c4695","Type":"ContainerStarted","Data":"85a78aab5990b7a81cb85e989d70a35e7c1b6edf4bf01854ab57a1f901042230"} Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.494832 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557980-hc7f4" Mar 14 09:00:00 crc kubenswrapper[5129]: I0314 09:00:00.513794 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv" Mar 14 09:00:01 crc kubenswrapper[5129]: I0314 09:00:01.018234 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557980-hc7f4"] Mar 14 09:00:01 crc kubenswrapper[5129]: I0314 09:00:01.028077 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv"] Mar 14 09:00:01 crc kubenswrapper[5129]: I0314 09:00:01.038142 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 09:00:01 crc kubenswrapper[5129]: E0314 09:00:01.038365 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:00:01 crc kubenswrapper[5129]: I0314 09:00:01.375648 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3bf00d63-a196-4bc1-8176-521b4e0c4695","Type":"ContainerStarted","Data":"37580e194ea61e67ffabd35b6b99d298c8c19eb374ab7337eaebd24fe5e695fa"} Mar 14 09:00:01 crc kubenswrapper[5129]: I0314 09:00:01.376053 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 14 09:00:01 crc kubenswrapper[5129]: I0314 09:00:01.377247 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv" event={"ID":"6fee75f5-d499-42e6-92a2-794e1b325cd4","Type":"ContainerStarted","Data":"d8362e92a7ee0f3cab0ef02001ba9898a72aeb469bcf6d5766fa8ddb3d5486af"} Mar 14 09:00:01 crc kubenswrapper[5129]: I0314 
09:00:01.377285 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv" event={"ID":"6fee75f5-d499-42e6-92a2-794e1b325cd4","Type":"ContainerStarted","Data":"995ffbe13c33ddb065d6a5d37616bdeee889ef29a599a88604891102d0ffa498"} Mar 14 09:00:01 crc kubenswrapper[5129]: I0314 09:00:01.378790 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557980-hc7f4" event={"ID":"b8fc6210-3f18-4ea6-9c7e-040906265e9b","Type":"ContainerStarted","Data":"a5d55bcf2eaf364b8d0eb04c9810be0950a481b25b882564e84e41317e260dfa"} Mar 14 09:00:01 crc kubenswrapper[5129]: I0314 09:00:01.407659 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.4076370750000002 podStartE2EDuration="3.407637075s" podCreationTimestamp="2026-03-14 08:59:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:00:01.401693294 +0000 UTC m=+7264.153608498" watchObservedRunningTime="2026-03-14 09:00:01.407637075 +0000 UTC m=+7264.159552259" Mar 14 09:00:01 crc kubenswrapper[5129]: I0314 09:00:01.423820 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv" podStartSLOduration=1.4238039040000001 podStartE2EDuration="1.423803904s" podCreationTimestamp="2026-03-14 09:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:00:01.418913521 +0000 UTC m=+7264.170828705" watchObservedRunningTime="2026-03-14 09:00:01.423803904 +0000 UTC m=+7264.175719088" Mar 14 09:00:01 crc kubenswrapper[5129]: E0314 09:00:01.620543 5129 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fee75f5_d499_42e6_92a2_794e1b325cd4.slice/crio-d8362e92a7ee0f3cab0ef02001ba9898a72aeb469bcf6d5766fa8ddb3d5486af.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fee75f5_d499_42e6_92a2_794e1b325cd4.slice/crio-conmon-d8362e92a7ee0f3cab0ef02001ba9898a72aeb469bcf6d5766fa8ddb3d5486af.scope\": RecentStats: unable to find data in memory cache]" Mar 14 09:00:02 crc kubenswrapper[5129]: I0314 09:00:02.392354 5129 generic.go:334] "Generic (PLEG): container finished" podID="6fee75f5-d499-42e6-92a2-794e1b325cd4" containerID="d8362e92a7ee0f3cab0ef02001ba9898a72aeb469bcf6d5766fa8ddb3d5486af" exitCode=0 Mar 14 09:00:02 crc kubenswrapper[5129]: I0314 09:00:02.392522 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv" event={"ID":"6fee75f5-d499-42e6-92a2-794e1b325cd4","Type":"ContainerDied","Data":"d8362e92a7ee0f3cab0ef02001ba9898a72aeb469bcf6d5766fa8ddb3d5486af"} Mar 14 09:00:03 crc kubenswrapper[5129]: I0314 09:00:03.781903 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv" Mar 14 09:00:03 crc kubenswrapper[5129]: I0314 09:00:03.883074 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmpc2\" (UniqueName: \"kubernetes.io/projected/6fee75f5-d499-42e6-92a2-794e1b325cd4-kube-api-access-cmpc2\") pod \"6fee75f5-d499-42e6-92a2-794e1b325cd4\" (UID: \"6fee75f5-d499-42e6-92a2-794e1b325cd4\") " Mar 14 09:00:03 crc kubenswrapper[5129]: I0314 09:00:03.883656 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6fee75f5-d499-42e6-92a2-794e1b325cd4-secret-volume\") pod \"6fee75f5-d499-42e6-92a2-794e1b325cd4\" (UID: \"6fee75f5-d499-42e6-92a2-794e1b325cd4\") " Mar 14 09:00:03 crc kubenswrapper[5129]: I0314 09:00:03.883799 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6fee75f5-d499-42e6-92a2-794e1b325cd4-config-volume\") pod \"6fee75f5-d499-42e6-92a2-794e1b325cd4\" (UID: \"6fee75f5-d499-42e6-92a2-794e1b325cd4\") " Mar 14 09:00:03 crc kubenswrapper[5129]: I0314 09:00:03.885049 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fee75f5-d499-42e6-92a2-794e1b325cd4-config-volume" (OuterVolumeSpecName: "config-volume") pod "6fee75f5-d499-42e6-92a2-794e1b325cd4" (UID: "6fee75f5-d499-42e6-92a2-794e1b325cd4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:00:03 crc kubenswrapper[5129]: I0314 09:00:03.890109 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fee75f5-d499-42e6-92a2-794e1b325cd4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6fee75f5-d499-42e6-92a2-794e1b325cd4" (UID: "6fee75f5-d499-42e6-92a2-794e1b325cd4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:03 crc kubenswrapper[5129]: I0314 09:00:03.897729 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fee75f5-d499-42e6-92a2-794e1b325cd4-kube-api-access-cmpc2" (OuterVolumeSpecName: "kube-api-access-cmpc2") pod "6fee75f5-d499-42e6-92a2-794e1b325cd4" (UID: "6fee75f5-d499-42e6-92a2-794e1b325cd4"). InnerVolumeSpecName "kube-api-access-cmpc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:03 crc kubenswrapper[5129]: I0314 09:00:03.987559 5129 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6fee75f5-d499-42e6-92a2-794e1b325cd4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:03 crc kubenswrapper[5129]: I0314 09:00:03.987626 5129 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6fee75f5-d499-42e6-92a2-794e1b325cd4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:03 crc kubenswrapper[5129]: I0314 09:00:03.987643 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmpc2\" (UniqueName: \"kubernetes.io/projected/6fee75f5-d499-42e6-92a2-794e1b325cd4-kube-api-access-cmpc2\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:04 crc kubenswrapper[5129]: I0314 09:00:04.415924 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv" event={"ID":"6fee75f5-d499-42e6-92a2-794e1b325cd4","Type":"ContainerDied","Data":"995ffbe13c33ddb065d6a5d37616bdeee889ef29a599a88604891102d0ffa498"} Mar 14 09:00:04 crc kubenswrapper[5129]: I0314 09:00:04.415962 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="995ffbe13c33ddb065d6a5d37616bdeee889ef29a599a88604891102d0ffa498" Mar 14 09:00:04 crc kubenswrapper[5129]: I0314 09:00:04.416555 5129 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv" Mar 14 09:00:04 crc kubenswrapper[5129]: I0314 09:00:04.418114 5129 generic.go:334] "Generic (PLEG): container finished" podID="b8fc6210-3f18-4ea6-9c7e-040906265e9b" containerID="3b1ac07c3a71fa0fae21fa07a816f3032d78e1a6056566d8b59b9a7ababb3b2f" exitCode=0 Mar 14 09:00:04 crc kubenswrapper[5129]: I0314 09:00:04.418167 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557980-hc7f4" event={"ID":"b8fc6210-3f18-4ea6-9c7e-040906265e9b","Type":"ContainerDied","Data":"3b1ac07c3a71fa0fae21fa07a816f3032d78e1a6056566d8b59b9a7ababb3b2f"} Mar 14 09:00:04 crc kubenswrapper[5129]: I0314 09:00:04.898130 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5"] Mar 14 09:00:04 crc kubenswrapper[5129]: I0314 09:00:04.910548 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557935-tctc5"] Mar 14 09:00:05 crc kubenswrapper[5129]: I0314 09:00:05.850814 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557980-hc7f4" Mar 14 09:00:06 crc kubenswrapper[5129]: I0314 09:00:06.028929 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm9mh\" (UniqueName: \"kubernetes.io/projected/b8fc6210-3f18-4ea6-9c7e-040906265e9b-kube-api-access-xm9mh\") pod \"b8fc6210-3f18-4ea6-9c7e-040906265e9b\" (UID: \"b8fc6210-3f18-4ea6-9c7e-040906265e9b\") " Mar 14 09:00:06 crc kubenswrapper[5129]: I0314 09:00:06.037558 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8fc6210-3f18-4ea6-9c7e-040906265e9b-kube-api-access-xm9mh" (OuterVolumeSpecName: "kube-api-access-xm9mh") pod "b8fc6210-3f18-4ea6-9c7e-040906265e9b" (UID: "b8fc6210-3f18-4ea6-9c7e-040906265e9b"). InnerVolumeSpecName "kube-api-access-xm9mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:06 crc kubenswrapper[5129]: I0314 09:00:06.054728 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42a9b5e2-d646-4c22-930e-a5ac08cf3e56" path="/var/lib/kubelet/pods/42a9b5e2-d646-4c22-930e-a5ac08cf3e56/volumes" Mar 14 09:00:06 crc kubenswrapper[5129]: I0314 09:00:06.131702 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm9mh\" (UniqueName: \"kubernetes.io/projected/b8fc6210-3f18-4ea6-9c7e-040906265e9b-kube-api-access-xm9mh\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:06 crc kubenswrapper[5129]: I0314 09:00:06.438003 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557980-hc7f4" event={"ID":"b8fc6210-3f18-4ea6-9c7e-040906265e9b","Type":"ContainerDied","Data":"a5d55bcf2eaf364b8d0eb04c9810be0950a481b25b882564e84e41317e260dfa"} Mar 14 09:00:06 crc kubenswrapper[5129]: I0314 09:00:06.438362 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5d55bcf2eaf364b8d0eb04c9810be0950a481b25b882564e84e41317e260dfa" Mar 14 09:00:06 
crc kubenswrapper[5129]: I0314 09:00:06.438056 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557980-hc7f4" Mar 14 09:00:06 crc kubenswrapper[5129]: I0314 09:00:06.929516 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557974-jj4g6"] Mar 14 09:00:06 crc kubenswrapper[5129]: I0314 09:00:06.941930 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557974-jj4g6"] Mar 14 09:00:08 crc kubenswrapper[5129]: I0314 09:00:08.051470 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e411d5e-ec75-4c09-a8e7-ba6495e8683b" path="/var/lib/kubelet/pods/6e411d5e-ec75-4c09-a8e7-ba6495e8683b/volumes" Mar 14 09:00:10 crc kubenswrapper[5129]: I0314 09:00:10.564279 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 14 09:00:11 crc kubenswrapper[5129]: I0314 09:00:11.230023 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5vp9r"] Mar 14 09:00:11 crc kubenswrapper[5129]: E0314 09:00:11.233684 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fee75f5-d499-42e6-92a2-794e1b325cd4" containerName="collect-profiles" Mar 14 09:00:11 crc kubenswrapper[5129]: I0314 09:00:11.233762 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fee75f5-d499-42e6-92a2-794e1b325cd4" containerName="collect-profiles" Mar 14 09:00:11 crc kubenswrapper[5129]: E0314 09:00:11.233793 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8fc6210-3f18-4ea6-9c7e-040906265e9b" containerName="oc" Mar 14 09:00:11 crc kubenswrapper[5129]: I0314 09:00:11.233809 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8fc6210-3f18-4ea6-9c7e-040906265e9b" containerName="oc" Mar 14 09:00:11 crc kubenswrapper[5129]: I0314 09:00:11.235776 5129 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="b8fc6210-3f18-4ea6-9c7e-040906265e9b" containerName="oc" Mar 14 09:00:11 crc kubenswrapper[5129]: I0314 09:00:11.235826 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fee75f5-d499-42e6-92a2-794e1b325cd4" containerName="collect-profiles" Mar 14 09:00:11 crc kubenswrapper[5129]: I0314 09:00:11.241490 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5vp9r" Mar 14 09:00:11 crc kubenswrapper[5129]: I0314 09:00:11.275841 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5vp9r"] Mar 14 09:00:11 crc kubenswrapper[5129]: I0314 09:00:11.334218 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c0f8626-b4e6-48fe-80eb-b17efc43d28c-utilities\") pod \"community-operators-5vp9r\" (UID: \"3c0f8626-b4e6-48fe-80eb-b17efc43d28c\") " pod="openshift-marketplace/community-operators-5vp9r" Mar 14 09:00:11 crc kubenswrapper[5129]: I0314 09:00:11.334285 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsmqg\" (UniqueName: \"kubernetes.io/projected/3c0f8626-b4e6-48fe-80eb-b17efc43d28c-kube-api-access-xsmqg\") pod \"community-operators-5vp9r\" (UID: \"3c0f8626-b4e6-48fe-80eb-b17efc43d28c\") " pod="openshift-marketplace/community-operators-5vp9r" Mar 14 09:00:11 crc kubenswrapper[5129]: I0314 09:00:11.334326 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c0f8626-b4e6-48fe-80eb-b17efc43d28c-catalog-content\") pod \"community-operators-5vp9r\" (UID: \"3c0f8626-b4e6-48fe-80eb-b17efc43d28c\") " pod="openshift-marketplace/community-operators-5vp9r" Mar 14 09:00:11 crc kubenswrapper[5129]: I0314 09:00:11.435941 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xsmqg\" (UniqueName: \"kubernetes.io/projected/3c0f8626-b4e6-48fe-80eb-b17efc43d28c-kube-api-access-xsmqg\") pod \"community-operators-5vp9r\" (UID: \"3c0f8626-b4e6-48fe-80eb-b17efc43d28c\") " pod="openshift-marketplace/community-operators-5vp9r" Mar 14 09:00:11 crc kubenswrapper[5129]: I0314 09:00:11.436031 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c0f8626-b4e6-48fe-80eb-b17efc43d28c-catalog-content\") pod \"community-operators-5vp9r\" (UID: \"3c0f8626-b4e6-48fe-80eb-b17efc43d28c\") " pod="openshift-marketplace/community-operators-5vp9r" Mar 14 09:00:11 crc kubenswrapper[5129]: I0314 09:00:11.436136 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c0f8626-b4e6-48fe-80eb-b17efc43d28c-utilities\") pod \"community-operators-5vp9r\" (UID: \"3c0f8626-b4e6-48fe-80eb-b17efc43d28c\") " pod="openshift-marketplace/community-operators-5vp9r" Mar 14 09:00:11 crc kubenswrapper[5129]: I0314 09:00:11.436597 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c0f8626-b4e6-48fe-80eb-b17efc43d28c-catalog-content\") pod \"community-operators-5vp9r\" (UID: \"3c0f8626-b4e6-48fe-80eb-b17efc43d28c\") " pod="openshift-marketplace/community-operators-5vp9r" Mar 14 09:00:11 crc kubenswrapper[5129]: I0314 09:00:11.436661 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c0f8626-b4e6-48fe-80eb-b17efc43d28c-utilities\") pod \"community-operators-5vp9r\" (UID: \"3c0f8626-b4e6-48fe-80eb-b17efc43d28c\") " pod="openshift-marketplace/community-operators-5vp9r" Mar 14 09:00:11 crc kubenswrapper[5129]: I0314 09:00:11.469632 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xsmqg\" (UniqueName: \"kubernetes.io/projected/3c0f8626-b4e6-48fe-80eb-b17efc43d28c-kube-api-access-xsmqg\") pod \"community-operators-5vp9r\" (UID: \"3c0f8626-b4e6-48fe-80eb-b17efc43d28c\") " pod="openshift-marketplace/community-operators-5vp9r" Mar 14 09:00:11 crc kubenswrapper[5129]: I0314 09:00:11.585512 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5vp9r" Mar 14 09:00:12 crc kubenswrapper[5129]: I0314 09:00:12.036813 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 09:00:12 crc kubenswrapper[5129]: E0314 09:00:12.037648 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:00:12 crc kubenswrapper[5129]: I0314 09:00:12.214890 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5vp9r"] Mar 14 09:00:12 crc kubenswrapper[5129]: I0314 09:00:12.504828 5129 generic.go:334] "Generic (PLEG): container finished" podID="3c0f8626-b4e6-48fe-80eb-b17efc43d28c" containerID="e1980bcb79bdc1adb7ec0737b6f1fbc73db4cc9333cb683125bd16500688fe01" exitCode=0 Mar 14 09:00:12 crc kubenswrapper[5129]: I0314 09:00:12.504904 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vp9r" event={"ID":"3c0f8626-b4e6-48fe-80eb-b17efc43d28c","Type":"ContainerDied","Data":"e1980bcb79bdc1adb7ec0737b6f1fbc73db4cc9333cb683125bd16500688fe01"} Mar 14 09:00:12 crc kubenswrapper[5129]: I0314 09:00:12.505541 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-5vp9r" event={"ID":"3c0f8626-b4e6-48fe-80eb-b17efc43d28c","Type":"ContainerStarted","Data":"de12b9b7f64ad6687a4ab63dc753139a776d92d242637b0f7842312e61752fbd"} Mar 14 09:00:14 crc kubenswrapper[5129]: I0314 09:00:14.523213 5129 generic.go:334] "Generic (PLEG): container finished" podID="3c0f8626-b4e6-48fe-80eb-b17efc43d28c" containerID="3ed10affd3f93246ed8211faabd10ece6398b333d9dadd3a3cd7388a0e2b193c" exitCode=0 Mar 14 09:00:14 crc kubenswrapper[5129]: I0314 09:00:14.523288 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vp9r" event={"ID":"3c0f8626-b4e6-48fe-80eb-b17efc43d28c","Type":"ContainerDied","Data":"3ed10affd3f93246ed8211faabd10ece6398b333d9dadd3a3cd7388a0e2b193c"} Mar 14 09:00:14 crc kubenswrapper[5129]: I0314 09:00:14.526194 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:00:15 crc kubenswrapper[5129]: I0314 09:00:15.534829 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vp9r" event={"ID":"3c0f8626-b4e6-48fe-80eb-b17efc43d28c","Type":"ContainerStarted","Data":"1bebd37f60fd82c031231482cca174d1ecab9709735e8c96a86aed4b5c334f7b"} Mar 14 09:00:21 crc kubenswrapper[5129]: I0314 09:00:21.586308 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5vp9r" Mar 14 09:00:21 crc kubenswrapper[5129]: I0314 09:00:21.586867 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5vp9r" Mar 14 09:00:22 crc kubenswrapper[5129]: I0314 09:00:22.658522 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-5vp9r" podUID="3c0f8626-b4e6-48fe-80eb-b17efc43d28c" containerName="registry-server" probeResult="failure" output=< Mar 14 09:00:22 crc kubenswrapper[5129]: timeout: failed to connect 
service ":50051" within 1s Mar 14 09:00:22 crc kubenswrapper[5129]: > Mar 14 09:00:25 crc kubenswrapper[5129]: I0314 09:00:25.038064 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 09:00:25 crc kubenswrapper[5129]: E0314 09:00:25.039123 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:00:28 crc kubenswrapper[5129]: I0314 09:00:28.025277 5129 scope.go:117] "RemoveContainer" containerID="51836c45836211810ac84637e54672b86910a37bb85e1ab59e243670db7c6c65" Mar 14 09:00:28 crc kubenswrapper[5129]: I0314 09:00:28.065342 5129 scope.go:117] "RemoveContainer" containerID="7f336ec20dd0c117c58d41547a262b708c4159207646c854ab1ada0c6040d0b8" Mar 14 09:00:28 crc kubenswrapper[5129]: I0314 09:00:28.104746 5129 scope.go:117] "RemoveContainer" containerID="86d45daa624ee8fe82612390d6e7f96cb88e6cb16950f3ac0d74dc50b5c805f8" Mar 14 09:00:28 crc kubenswrapper[5129]: I0314 09:00:28.171521 5129 scope.go:117] "RemoveContainer" containerID="e52c828c4c947bb847c88584cc67f0e447afabeb1af44ca5cdd3e1c6a5960b72" Mar 14 09:00:28 crc kubenswrapper[5129]: I0314 09:00:28.194978 5129 scope.go:117] "RemoveContainer" containerID="81b19590fff4577e99e55f0dcee9c9e14bec6db8ab6030801260e633071123bb" Mar 14 09:00:31 crc kubenswrapper[5129]: I0314 09:00:31.633922 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5vp9r" Mar 14 09:00:31 crc kubenswrapper[5129]: I0314 09:00:31.672196 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-5vp9r" podStartSLOduration=18.255929098 podStartE2EDuration="20.672171818s" podCreationTimestamp="2026-03-14 09:00:11 +0000 UTC" firstStartedPulling="2026-03-14 09:00:12.507031908 +0000 UTC m=+7275.258947092" lastFinishedPulling="2026-03-14 09:00:14.923274628 +0000 UTC m=+7277.675189812" observedRunningTime="2026-03-14 09:00:15.565321121 +0000 UTC m=+7278.317236305" watchObservedRunningTime="2026-03-14 09:00:31.672171818 +0000 UTC m=+7294.424087002" Mar 14 09:00:31 crc kubenswrapper[5129]: I0314 09:00:31.689677 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5vp9r" Mar 14 09:00:31 crc kubenswrapper[5129]: I0314 09:00:31.879137 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5vp9r"] Mar 14 09:00:32 crc kubenswrapper[5129]: I0314 09:00:32.745689 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5vp9r" podUID="3c0f8626-b4e6-48fe-80eb-b17efc43d28c" containerName="registry-server" containerID="cri-o://1bebd37f60fd82c031231482cca174d1ecab9709735e8c96a86aed4b5c334f7b" gracePeriod=2 Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.196999 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5vp9r" Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.286912 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c0f8626-b4e6-48fe-80eb-b17efc43d28c-utilities\") pod \"3c0f8626-b4e6-48fe-80eb-b17efc43d28c\" (UID: \"3c0f8626-b4e6-48fe-80eb-b17efc43d28c\") " Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.287003 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c0f8626-b4e6-48fe-80eb-b17efc43d28c-catalog-content\") pod \"3c0f8626-b4e6-48fe-80eb-b17efc43d28c\" (UID: \"3c0f8626-b4e6-48fe-80eb-b17efc43d28c\") " Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.287159 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsmqg\" (UniqueName: \"kubernetes.io/projected/3c0f8626-b4e6-48fe-80eb-b17efc43d28c-kube-api-access-xsmqg\") pod \"3c0f8626-b4e6-48fe-80eb-b17efc43d28c\" (UID: \"3c0f8626-b4e6-48fe-80eb-b17efc43d28c\") " Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.288373 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c0f8626-b4e6-48fe-80eb-b17efc43d28c-utilities" (OuterVolumeSpecName: "utilities") pod "3c0f8626-b4e6-48fe-80eb-b17efc43d28c" (UID: "3c0f8626-b4e6-48fe-80eb-b17efc43d28c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.294508 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c0f8626-b4e6-48fe-80eb-b17efc43d28c-kube-api-access-xsmqg" (OuterVolumeSpecName: "kube-api-access-xsmqg") pod "3c0f8626-b4e6-48fe-80eb-b17efc43d28c" (UID: "3c0f8626-b4e6-48fe-80eb-b17efc43d28c"). InnerVolumeSpecName "kube-api-access-xsmqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.346408 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c0f8626-b4e6-48fe-80eb-b17efc43d28c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c0f8626-b4e6-48fe-80eb-b17efc43d28c" (UID: "3c0f8626-b4e6-48fe-80eb-b17efc43d28c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.389082 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c0f8626-b4e6-48fe-80eb-b17efc43d28c-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.389566 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c0f8626-b4e6-48fe-80eb-b17efc43d28c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.389583 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsmqg\" (UniqueName: \"kubernetes.io/projected/3c0f8626-b4e6-48fe-80eb-b17efc43d28c-kube-api-access-xsmqg\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.757932 5129 generic.go:334] "Generic (PLEG): container finished" podID="3c0f8626-b4e6-48fe-80eb-b17efc43d28c" containerID="1bebd37f60fd82c031231482cca174d1ecab9709735e8c96a86aed4b5c334f7b" exitCode=0 Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.757980 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vp9r" event={"ID":"3c0f8626-b4e6-48fe-80eb-b17efc43d28c","Type":"ContainerDied","Data":"1bebd37f60fd82c031231482cca174d1ecab9709735e8c96a86aed4b5c334f7b"} Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.757994 5129 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-5vp9r" Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.758012 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vp9r" event={"ID":"3c0f8626-b4e6-48fe-80eb-b17efc43d28c","Type":"ContainerDied","Data":"de12b9b7f64ad6687a4ab63dc753139a776d92d242637b0f7842312e61752fbd"} Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.758030 5129 scope.go:117] "RemoveContainer" containerID="1bebd37f60fd82c031231482cca174d1ecab9709735e8c96a86aed4b5c334f7b" Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.778990 5129 scope.go:117] "RemoveContainer" containerID="3ed10affd3f93246ed8211faabd10ece6398b333d9dadd3a3cd7388a0e2b193c" Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.802145 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5vp9r"] Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.811792 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5vp9r"] Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.835137 5129 scope.go:117] "RemoveContainer" containerID="e1980bcb79bdc1adb7ec0737b6f1fbc73db4cc9333cb683125bd16500688fe01" Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.850884 5129 scope.go:117] "RemoveContainer" containerID="1bebd37f60fd82c031231482cca174d1ecab9709735e8c96a86aed4b5c334f7b" Mar 14 09:00:33 crc kubenswrapper[5129]: E0314 09:00:33.852210 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bebd37f60fd82c031231482cca174d1ecab9709735e8c96a86aed4b5c334f7b\": container with ID starting with 1bebd37f60fd82c031231482cca174d1ecab9709735e8c96a86aed4b5c334f7b not found: ID does not exist" containerID="1bebd37f60fd82c031231482cca174d1ecab9709735e8c96a86aed4b5c334f7b" Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.852251 
5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bebd37f60fd82c031231482cca174d1ecab9709735e8c96a86aed4b5c334f7b"} err="failed to get container status \"1bebd37f60fd82c031231482cca174d1ecab9709735e8c96a86aed4b5c334f7b\": rpc error: code = NotFound desc = could not find container \"1bebd37f60fd82c031231482cca174d1ecab9709735e8c96a86aed4b5c334f7b\": container with ID starting with 1bebd37f60fd82c031231482cca174d1ecab9709735e8c96a86aed4b5c334f7b not found: ID does not exist" Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.852276 5129 scope.go:117] "RemoveContainer" containerID="3ed10affd3f93246ed8211faabd10ece6398b333d9dadd3a3cd7388a0e2b193c" Mar 14 09:00:33 crc kubenswrapper[5129]: E0314 09:00:33.852688 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ed10affd3f93246ed8211faabd10ece6398b333d9dadd3a3cd7388a0e2b193c\": container with ID starting with 3ed10affd3f93246ed8211faabd10ece6398b333d9dadd3a3cd7388a0e2b193c not found: ID does not exist" containerID="3ed10affd3f93246ed8211faabd10ece6398b333d9dadd3a3cd7388a0e2b193c" Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.852742 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ed10affd3f93246ed8211faabd10ece6398b333d9dadd3a3cd7388a0e2b193c"} err="failed to get container status \"3ed10affd3f93246ed8211faabd10ece6398b333d9dadd3a3cd7388a0e2b193c\": rpc error: code = NotFound desc = could not find container \"3ed10affd3f93246ed8211faabd10ece6398b333d9dadd3a3cd7388a0e2b193c\": container with ID starting with 3ed10affd3f93246ed8211faabd10ece6398b333d9dadd3a3cd7388a0e2b193c not found: ID does not exist" Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.852767 5129 scope.go:117] "RemoveContainer" containerID="e1980bcb79bdc1adb7ec0737b6f1fbc73db4cc9333cb683125bd16500688fe01" Mar 14 09:00:33 crc kubenswrapper[5129]: E0314 
09:00:33.853224 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1980bcb79bdc1adb7ec0737b6f1fbc73db4cc9333cb683125bd16500688fe01\": container with ID starting with e1980bcb79bdc1adb7ec0737b6f1fbc73db4cc9333cb683125bd16500688fe01 not found: ID does not exist" containerID="e1980bcb79bdc1adb7ec0737b6f1fbc73db4cc9333cb683125bd16500688fe01" Mar 14 09:00:33 crc kubenswrapper[5129]: I0314 09:00:33.853271 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1980bcb79bdc1adb7ec0737b6f1fbc73db4cc9333cb683125bd16500688fe01"} err="failed to get container status \"e1980bcb79bdc1adb7ec0737b6f1fbc73db4cc9333cb683125bd16500688fe01\": rpc error: code = NotFound desc = could not find container \"e1980bcb79bdc1adb7ec0737b6f1fbc73db4cc9333cb683125bd16500688fe01\": container with ID starting with e1980bcb79bdc1adb7ec0737b6f1fbc73db4cc9333cb683125bd16500688fe01 not found: ID does not exist" Mar 14 09:00:34 crc kubenswrapper[5129]: I0314 09:00:34.046131 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c0f8626-b4e6-48fe-80eb-b17efc43d28c" path="/var/lib/kubelet/pods/3c0f8626-b4e6-48fe-80eb-b17efc43d28c/volumes" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:36.999814 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 09:00:37 crc kubenswrapper[5129]: E0314 09:00:37.000498 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c0f8626-b4e6-48fe-80eb-b17efc43d28c" containerName="extract-content" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.000514 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0f8626-b4e6-48fe-80eb-b17efc43d28c" containerName="extract-content" Mar 14 09:00:37 crc kubenswrapper[5129]: E0314 09:00:37.000532 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c0f8626-b4e6-48fe-80eb-b17efc43d28c" 
containerName="extract-utilities" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.000540 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0f8626-b4e6-48fe-80eb-b17efc43d28c" containerName="extract-utilities" Mar 14 09:00:37 crc kubenswrapper[5129]: E0314 09:00:37.000560 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c0f8626-b4e6-48fe-80eb-b17efc43d28c" containerName="registry-server" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.000566 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0f8626-b4e6-48fe-80eb-b17efc43d28c" containerName="registry-server" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.000794 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c0f8626-b4e6-48fe-80eb-b17efc43d28c" containerName="registry-server" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.001998 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.006058 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.014458 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.059081 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.059158 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.059255 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.059282 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.059304 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-scripts\") pod \"cinder-scheduler-0\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.059334 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw5jm\" (UniqueName: \"kubernetes.io/projected/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-kube-api-access-jw5jm\") pod \"cinder-scheduler-0\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.160657 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") 
" pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.160745 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-config-data\") pod \"cinder-scheduler-0\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.160845 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.160863 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.160882 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-scripts\") pod \"cinder-scheduler-0\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.160910 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw5jm\" (UniqueName: \"kubernetes.io/projected/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-kube-api-access-jw5jm\") pod \"cinder-scheduler-0\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.160972 5129 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.166216 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-scripts\") pod \"cinder-scheduler-0\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.166410 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-config-data\") pod \"cinder-scheduler-0\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.169086 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.175277 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.178183 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw5jm\" (UniqueName: \"kubernetes.io/projected/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-kube-api-access-jw5jm\") pod \"cinder-scheduler-0\" (UID: 
\"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.321478 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 09:00:37 crc kubenswrapper[5129]: I0314 09:00:37.804283 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 09:00:38 crc kubenswrapper[5129]: I0314 09:00:38.575712 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 14 09:00:38 crc kubenswrapper[5129]: I0314 09:00:38.576277 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3bf00d63-a196-4bc1-8176-521b4e0c4695" containerName="cinder-api-log" containerID="cri-o://85a78aab5990b7a81cb85e989d70a35e7c1b6edf4bf01854ab57a1f901042230" gracePeriod=30 Mar 14 09:00:38 crc kubenswrapper[5129]: I0314 09:00:38.576775 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3bf00d63-a196-4bc1-8176-521b4e0c4695" containerName="cinder-api" containerID="cri-o://37580e194ea61e67ffabd35b6b99d298c8c19eb374ab7337eaebd24fe5e695fa" gracePeriod=30 Mar 14 09:00:38 crc kubenswrapper[5129]: I0314 09:00:38.802181 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9","Type":"ContainerStarted","Data":"7bc378dc342896c252903822fc15dba46ccd13a6722dc3fd08f3283eb7be1165"} Mar 14 09:00:38 crc kubenswrapper[5129]: I0314 09:00:38.804419 5129 generic.go:334] "Generic (PLEG): container finished" podID="3bf00d63-a196-4bc1-8176-521b4e0c4695" containerID="85a78aab5990b7a81cb85e989d70a35e7c1b6edf4bf01854ab57a1f901042230" exitCode=143 Mar 14 09:00:38 crc kubenswrapper[5129]: I0314 09:00:38.804451 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"3bf00d63-a196-4bc1-8176-521b4e0c4695","Type":"ContainerDied","Data":"85a78aab5990b7a81cb85e989d70a35e7c1b6edf4bf01854ab57a1f901042230"} Mar 14 09:00:39 crc kubenswrapper[5129]: I0314 09:00:39.825374 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9","Type":"ContainerStarted","Data":"b6d2426e9c1a31feb2380441f11dce8a056f2a28bafb90e6f42c2e204707d05e"} Mar 14 09:00:39 crc kubenswrapper[5129]: I0314 09:00:39.826323 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9","Type":"ContainerStarted","Data":"07a79c27a6c3c17847eb5748cc651900dfce4eaf195fe2a5e3434887104da3d0"} Mar 14 09:00:39 crc kubenswrapper[5129]: I0314 09:00:39.851330 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.360481054 podStartE2EDuration="3.851313833s" podCreationTimestamp="2026-03-14 09:00:36 +0000 UTC" firstStartedPulling="2026-03-14 09:00:37.80780195 +0000 UTC m=+7300.559717134" lastFinishedPulling="2026-03-14 09:00:38.298634729 +0000 UTC m=+7301.050549913" observedRunningTime="2026-03-14 09:00:39.849143175 +0000 UTC m=+7302.601058359" watchObservedRunningTime="2026-03-14 09:00:39.851313833 +0000 UTC m=+7302.603229017" Mar 14 09:00:40 crc kubenswrapper[5129]: I0314 09:00:40.037248 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 09:00:40 crc kubenswrapper[5129]: E0314 09:00:40.037940 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:00:41 crc kubenswrapper[5129]: I0314 09:00:41.728719 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="3bf00d63-a196-4bc1-8176-521b4e0c4695" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.102:8776/healthcheck\": read tcp 10.217.0.2:52260->10.217.1.102:8776: read: connection reset by peer" Mar 14 09:00:41 crc kubenswrapper[5129]: I0314 09:00:41.857382 5129 generic.go:334] "Generic (PLEG): container finished" podID="3bf00d63-a196-4bc1-8176-521b4e0c4695" containerID="37580e194ea61e67ffabd35b6b99d298c8c19eb374ab7337eaebd24fe5e695fa" exitCode=0 Mar 14 09:00:41 crc kubenswrapper[5129]: I0314 09:00:41.857446 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3bf00d63-a196-4bc1-8176-521b4e0c4695","Type":"ContainerDied","Data":"37580e194ea61e67ffabd35b6b99d298c8c19eb374ab7337eaebd24fe5e695fa"} Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.145493 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.266139 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-config-data\") pod \"3bf00d63-a196-4bc1-8176-521b4e0c4695\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.266246 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-config-data-custom\") pod \"3bf00d63-a196-4bc1-8176-521b4e0c4695\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.266315 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-public-tls-certs\") pod \"3bf00d63-a196-4bc1-8176-521b4e0c4695\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.266360 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-combined-ca-bundle\") pod \"3bf00d63-a196-4bc1-8176-521b4e0c4695\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.266414 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bf00d63-a196-4bc1-8176-521b4e0c4695-logs\") pod \"3bf00d63-a196-4bc1-8176-521b4e0c4695\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.266446 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-scripts\") pod \"3bf00d63-a196-4bc1-8176-521b4e0c4695\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.266474 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-internal-tls-certs\") pod \"3bf00d63-a196-4bc1-8176-521b4e0c4695\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.266498 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4tfw\" (UniqueName: \"kubernetes.io/projected/3bf00d63-a196-4bc1-8176-521b4e0c4695-kube-api-access-g4tfw\") pod \"3bf00d63-a196-4bc1-8176-521b4e0c4695\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.266521 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3bf00d63-a196-4bc1-8176-521b4e0c4695-etc-machine-id\") pod \"3bf00d63-a196-4bc1-8176-521b4e0c4695\" (UID: \"3bf00d63-a196-4bc1-8176-521b4e0c4695\") " Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.266986 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3bf00d63-a196-4bc1-8176-521b4e0c4695-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3bf00d63-a196-4bc1-8176-521b4e0c4695" (UID: "3bf00d63-a196-4bc1-8176-521b4e0c4695"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.272699 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3bf00d63-a196-4bc1-8176-521b4e0c4695" (UID: "3bf00d63-a196-4bc1-8176-521b4e0c4695"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.282992 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bf00d63-a196-4bc1-8176-521b4e0c4695-logs" (OuterVolumeSpecName: "logs") pod "3bf00d63-a196-4bc1-8176-521b4e0c4695" (UID: "3bf00d63-a196-4bc1-8176-521b4e0c4695"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.291848 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bf00d63-a196-4bc1-8176-521b4e0c4695-kube-api-access-g4tfw" (OuterVolumeSpecName: "kube-api-access-g4tfw") pod "3bf00d63-a196-4bc1-8176-521b4e0c4695" (UID: "3bf00d63-a196-4bc1-8176-521b4e0c4695"). InnerVolumeSpecName "kube-api-access-g4tfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.295963 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-scripts" (OuterVolumeSpecName: "scripts") pod "3bf00d63-a196-4bc1-8176-521b4e0c4695" (UID: "3bf00d63-a196-4bc1-8176-521b4e0c4695"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.322200 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.322725 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bf00d63-a196-4bc1-8176-521b4e0c4695" (UID: "3bf00d63-a196-4bc1-8176-521b4e0c4695"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.334263 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3bf00d63-a196-4bc1-8176-521b4e0c4695" (UID: "3bf00d63-a196-4bc1-8176-521b4e0c4695"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.336876 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-config-data" (OuterVolumeSpecName: "config-data") pod "3bf00d63-a196-4bc1-8176-521b4e0c4695" (UID: "3bf00d63-a196-4bc1-8176-521b4e0c4695"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.360812 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3bf00d63-a196-4bc1-8176-521b4e0c4695" (UID: "3bf00d63-a196-4bc1-8176-521b4e0c4695"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.369089 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.369128 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bf00d63-a196-4bc1-8176-521b4e0c4695-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.369140 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.369154 5129 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.369166 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4tfw\" (UniqueName: \"kubernetes.io/projected/3bf00d63-a196-4bc1-8176-521b4e0c4695-kube-api-access-g4tfw\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.369177 5129 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3bf00d63-a196-4bc1-8176-521b4e0c4695-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.369187 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.369247 5129 
reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.369262 5129 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bf00d63-a196-4bc1-8176-521b4e0c4695-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.870884 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3bf00d63-a196-4bc1-8176-521b4e0c4695","Type":"ContainerDied","Data":"5c2d0a9e7c41a6a48a936132651c2d994a8668381121b214e074283968f81581"} Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.870934 5129 scope.go:117] "RemoveContainer" containerID="37580e194ea61e67ffabd35b6b99d298c8c19eb374ab7337eaebd24fe5e695fa" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.871929 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.924915 5129 scope.go:117] "RemoveContainer" containerID="85a78aab5990b7a81cb85e989d70a35e7c1b6edf4bf01854ab57a1f901042230" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.932008 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.947078 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.977772 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 14 09:00:42 crc kubenswrapper[5129]: E0314 09:00:42.992417 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bf00d63-a196-4bc1-8176-521b4e0c4695" containerName="cinder-api" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.992459 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf00d63-a196-4bc1-8176-521b4e0c4695" containerName="cinder-api" Mar 14 09:00:42 crc kubenswrapper[5129]: E0314 09:00:42.992476 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bf00d63-a196-4bc1-8176-521b4e0c4695" containerName="cinder-api-log" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.992483 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf00d63-a196-4bc1-8176-521b4e0c4695" containerName="cinder-api-log" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.993469 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bf00d63-a196-4bc1-8176-521b4e0c4695" containerName="cinder-api-log" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.993522 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bf00d63-a196-4bc1-8176-521b4e0c4695" containerName="cinder-api" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.995331 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-api-0"] Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.995447 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 14 09:00:42 crc kubenswrapper[5129]: I0314 09:00:42.999224 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:42.999616 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:42.999821 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.087311 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ef9df1-2d5c-406a-9adf-26fd7bd95731-public-tls-certs\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.087360 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ef9df1-2d5c-406a-9adf-26fd7bd95731-logs\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.087404 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49ef9df1-2d5c-406a-9adf-26fd7bd95731-config-data-custom\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.087457 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ef9df1-2d5c-406a-9adf-26fd7bd95731-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.087487 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ef9df1-2d5c-406a-9adf-26fd7bd95731-scripts\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.087524 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ef9df1-2d5c-406a-9adf-26fd7bd95731-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.087544 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6vrb\" (UniqueName: \"kubernetes.io/projected/49ef9df1-2d5c-406a-9adf-26fd7bd95731-kube-api-access-r6vrb\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.087564 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ef9df1-2d5c-406a-9adf-26fd7bd95731-config-data\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.087583 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49ef9df1-2d5c-406a-9adf-26fd7bd95731-etc-machine-id\") pod 
\"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.189620 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49ef9df1-2d5c-406a-9adf-26fd7bd95731-config-data-custom\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.189691 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ef9df1-2d5c-406a-9adf-26fd7bd95731-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.189717 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ef9df1-2d5c-406a-9adf-26fd7bd95731-scripts\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.189758 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ef9df1-2d5c-406a-9adf-26fd7bd95731-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.189807 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6vrb\" (UniqueName: \"kubernetes.io/projected/49ef9df1-2d5c-406a-9adf-26fd7bd95731-kube-api-access-r6vrb\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.189833 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ef9df1-2d5c-406a-9adf-26fd7bd95731-config-data\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.190331 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49ef9df1-2d5c-406a-9adf-26fd7bd95731-etc-machine-id\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.190395 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ef9df1-2d5c-406a-9adf-26fd7bd95731-public-tls-certs\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.190432 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ef9df1-2d5c-406a-9adf-26fd7bd95731-logs\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.190395 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49ef9df1-2d5c-406a-9adf-26fd7bd95731-etc-machine-id\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.190943 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ef9df1-2d5c-406a-9adf-26fd7bd95731-logs\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 
crc kubenswrapper[5129]: I0314 09:00:43.193871 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ef9df1-2d5c-406a-9adf-26fd7bd95731-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.194190 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ef9df1-2d5c-406a-9adf-26fd7bd95731-config-data\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.196876 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ef9df1-2d5c-406a-9adf-26fd7bd95731-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.197437 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49ef9df1-2d5c-406a-9adf-26fd7bd95731-config-data-custom\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.198107 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ef9df1-2d5c-406a-9adf-26fd7bd95731-public-tls-certs\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.199114 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ef9df1-2d5c-406a-9adf-26fd7bd95731-scripts\") pod \"cinder-api-0\" (UID: 
\"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.210029 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6vrb\" (UniqueName: \"kubernetes.io/projected/49ef9df1-2d5c-406a-9adf-26fd7bd95731-kube-api-access-r6vrb\") pod \"cinder-api-0\" (UID: \"49ef9df1-2d5c-406a-9adf-26fd7bd95731\") " pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.318277 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.747950 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 14 09:00:43 crc kubenswrapper[5129]: W0314 09:00:43.754511 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49ef9df1_2d5c_406a_9adf_26fd7bd95731.slice/crio-0ff26d12bf89c7e407d9f54a1f3538f1386eda01fd24e8dfe674cda4beb8b31a WatchSource:0}: Error finding container 0ff26d12bf89c7e407d9f54a1f3538f1386eda01fd24e8dfe674cda4beb8b31a: Status 404 returned error can't find the container with id 0ff26d12bf89c7e407d9f54a1f3538f1386eda01fd24e8dfe674cda4beb8b31a Mar 14 09:00:43 crc kubenswrapper[5129]: I0314 09:00:43.891279 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"49ef9df1-2d5c-406a-9adf-26fd7bd95731","Type":"ContainerStarted","Data":"0ff26d12bf89c7e407d9f54a1f3538f1386eda01fd24e8dfe674cda4beb8b31a"} Mar 14 09:00:44 crc kubenswrapper[5129]: I0314 09:00:44.049160 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bf00d63-a196-4bc1-8176-521b4e0c4695" path="/var/lib/kubelet/pods/3bf00d63-a196-4bc1-8176-521b4e0c4695/volumes" Mar 14 09:00:44 crc kubenswrapper[5129]: I0314 09:00:44.905770 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"49ef9df1-2d5c-406a-9adf-26fd7bd95731","Type":"ContainerStarted","Data":"18760e1a8f01c04c470e54a2da4880bfbc1ad471de8a3cd93d6de534b96c58f5"} Mar 14 09:00:44 crc kubenswrapper[5129]: I0314 09:00:44.906207 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"49ef9df1-2d5c-406a-9adf-26fd7bd95731","Type":"ContainerStarted","Data":"21b2fa34713dbaf2de156d9c55a33a5965665f6260dc744436be21bb53153051"} Mar 14 09:00:44 crc kubenswrapper[5129]: I0314 09:00:44.906247 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 14 09:00:44 crc kubenswrapper[5129]: I0314 09:00:44.929287 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.929262054 podStartE2EDuration="2.929262054s" podCreationTimestamp="2026-03-14 09:00:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:00:44.928207856 +0000 UTC m=+7307.680123060" watchObservedRunningTime="2026-03-14 09:00:44.929262054 +0000 UTC m=+7307.681177248" Mar 14 09:00:47 crc kubenswrapper[5129]: I0314 09:00:47.559976 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 14 09:00:47 crc kubenswrapper[5129]: I0314 09:00:47.632070 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 09:00:47 crc kubenswrapper[5129]: I0314 09:00:47.937336 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9" containerName="cinder-scheduler" containerID="cri-o://07a79c27a6c3c17847eb5748cc651900dfce4eaf195fe2a5e3434887104da3d0" gracePeriod=30 Mar 14 09:00:47 crc kubenswrapper[5129]: I0314 09:00:47.937464 5129 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-scheduler-0" podUID="56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9" containerName="probe" containerID="cri-o://b6d2426e9c1a31feb2380441f11dce8a056f2a28bafb90e6f42c2e204707d05e" gracePeriod=30 Mar 14 09:00:48 crc kubenswrapper[5129]: I0314 09:00:48.948787 5129 generic.go:334] "Generic (PLEG): container finished" podID="56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9" containerID="b6d2426e9c1a31feb2380441f11dce8a056f2a28bafb90e6f42c2e204707d05e" exitCode=0 Mar 14 09:00:48 crc kubenswrapper[5129]: I0314 09:00:48.948832 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9","Type":"ContainerDied","Data":"b6d2426e9c1a31feb2380441f11dce8a056f2a28bafb90e6f42c2e204707d05e"} Mar 14 09:00:49 crc kubenswrapper[5129]: I0314 09:00:49.963804 5129 generic.go:334] "Generic (PLEG): container finished" podID="56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9" containerID="07a79c27a6c3c17847eb5748cc651900dfce4eaf195fe2a5e3434887104da3d0" exitCode=0 Mar 14 09:00:49 crc kubenswrapper[5129]: I0314 09:00:49.963984 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9","Type":"ContainerDied","Data":"07a79c27a6c3c17847eb5748cc651900dfce4eaf195fe2a5e3434887104da3d0"} Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.196815 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.248551 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-combined-ca-bundle\") pod \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.248676 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-config-data-custom\") pod \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.248745 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-etc-machine-id\") pod \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.248769 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw5jm\" (UniqueName: \"kubernetes.io/projected/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-kube-api-access-jw5jm\") pod \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.248858 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9" (UID: "56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.248904 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-scripts\") pod \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.249012 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-config-data\") pod \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\" (UID: \"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9\") " Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.249469 5129 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.255425 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-kube-api-access-jw5jm" (OuterVolumeSpecName: "kube-api-access-jw5jm") pod "56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9" (UID: "56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9"). InnerVolumeSpecName "kube-api-access-jw5jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.260931 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-scripts" (OuterVolumeSpecName: "scripts") pod "56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9" (UID: "56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.266915 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9" (UID: "56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.308629 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9" (UID: "56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.337550 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-config-data" (OuterVolumeSpecName: "config-data") pod "56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9" (UID: "56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.356680 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.356992 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.357099 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.357172 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw5jm\" (UniqueName: \"kubernetes.io/projected/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-kube-api-access-jw5jm\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.357252 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.976815 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9","Type":"ContainerDied","Data":"7bc378dc342896c252903822fc15dba46ccd13a6722dc3fd08f3283eb7be1165"} Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.976891 5129 scope.go:117] "RemoveContainer" containerID="b6d2426e9c1a31feb2380441f11dce8a056f2a28bafb90e6f42c2e204707d05e" Mar 14 09:00:50 crc kubenswrapper[5129]: I0314 09:00:50.976889 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.004896 5129 scope.go:117] "RemoveContainer" containerID="07a79c27a6c3c17847eb5748cc651900dfce4eaf195fe2a5e3434887104da3d0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.023683 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.032103 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.077767 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 09:00:51 crc kubenswrapper[5129]: E0314 09:00:51.078530 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9" containerName="probe" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.078566 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9" containerName="probe" Mar 14 09:00:51 crc kubenswrapper[5129]: E0314 09:00:51.078645 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9" containerName="cinder-scheduler" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.078660 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9" containerName="cinder-scheduler" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.078985 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9" containerName="probe" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.079031 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9" containerName="cinder-scheduler" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.082836 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.086781 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.087901 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.174567 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96bk4\" (UniqueName: \"kubernetes.io/projected/346b9da9-f8da-416b-aa79-e42409eb111a-kube-api-access-96bk4\") pod \"cinder-scheduler-0\" (UID: \"346b9da9-f8da-416b-aa79-e42409eb111a\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.174866 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/346b9da9-f8da-416b-aa79-e42409eb111a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"346b9da9-f8da-416b-aa79-e42409eb111a\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.174907 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346b9da9-f8da-416b-aa79-e42409eb111a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"346b9da9-f8da-416b-aa79-e42409eb111a\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.174944 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/346b9da9-f8da-416b-aa79-e42409eb111a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"346b9da9-f8da-416b-aa79-e42409eb111a\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc 
kubenswrapper[5129]: I0314 09:00:51.174981 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/346b9da9-f8da-416b-aa79-e42409eb111a-scripts\") pod \"cinder-scheduler-0\" (UID: \"346b9da9-f8da-416b-aa79-e42409eb111a\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.175031 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346b9da9-f8da-416b-aa79-e42409eb111a-config-data\") pod \"cinder-scheduler-0\" (UID: \"346b9da9-f8da-416b-aa79-e42409eb111a\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.277003 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/346b9da9-f8da-416b-aa79-e42409eb111a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"346b9da9-f8da-416b-aa79-e42409eb111a\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.277053 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346b9da9-f8da-416b-aa79-e42409eb111a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"346b9da9-f8da-416b-aa79-e42409eb111a\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.277076 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/346b9da9-f8da-416b-aa79-e42409eb111a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"346b9da9-f8da-416b-aa79-e42409eb111a\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.277093 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/346b9da9-f8da-416b-aa79-e42409eb111a-scripts\") pod \"cinder-scheduler-0\" (UID: \"346b9da9-f8da-416b-aa79-e42409eb111a\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.277120 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346b9da9-f8da-416b-aa79-e42409eb111a-config-data\") pod \"cinder-scheduler-0\" (UID: \"346b9da9-f8da-416b-aa79-e42409eb111a\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.277173 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96bk4\" (UniqueName: \"kubernetes.io/projected/346b9da9-f8da-416b-aa79-e42409eb111a-kube-api-access-96bk4\") pod \"cinder-scheduler-0\" (UID: \"346b9da9-f8da-416b-aa79-e42409eb111a\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.277544 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/346b9da9-f8da-416b-aa79-e42409eb111a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"346b9da9-f8da-416b-aa79-e42409eb111a\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.283671 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346b9da9-f8da-416b-aa79-e42409eb111a-config-data\") pod \"cinder-scheduler-0\" (UID: \"346b9da9-f8da-416b-aa79-e42409eb111a\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.283973 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/346b9da9-f8da-416b-aa79-e42409eb111a-scripts\") pod \"cinder-scheduler-0\" (UID: \"346b9da9-f8da-416b-aa79-e42409eb111a\") " 
pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.288570 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/346b9da9-f8da-416b-aa79-e42409eb111a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"346b9da9-f8da-416b-aa79-e42409eb111a\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.297543 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346b9da9-f8da-416b-aa79-e42409eb111a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"346b9da9-f8da-416b-aa79-e42409eb111a\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.302244 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96bk4\" (UniqueName: \"kubernetes.io/projected/346b9da9-f8da-416b-aa79-e42409eb111a-kube-api-access-96bk4\") pod \"cinder-scheduler-0\" (UID: \"346b9da9-f8da-416b-aa79-e42409eb111a\") " pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.411279 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.913504 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 09:00:51 crc kubenswrapper[5129]: I0314 09:00:51.989988 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"346b9da9-f8da-416b-aa79-e42409eb111a","Type":"ContainerStarted","Data":"3c60eaa3d50ed2914ecc2fa2f71573f561b95cbb96bcf2485223633136747a39"} Mar 14 09:00:52 crc kubenswrapper[5129]: I0314 09:00:52.050077 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9" path="/var/lib/kubelet/pods/56e7a7fa-3f97-4c8c-9dfb-f16e8c030cf9/volumes" Mar 14 09:00:53 crc kubenswrapper[5129]: I0314 09:00:53.000793 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"346b9da9-f8da-416b-aa79-e42409eb111a","Type":"ContainerStarted","Data":"e7138c3b859ccad2f60373d422aa06e09384e21a3dbe9b5d9a378bb23d2abcc5"} Mar 14 09:00:54 crc kubenswrapper[5129]: I0314 09:00:54.014424 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"346b9da9-f8da-416b-aa79-e42409eb111a","Type":"ContainerStarted","Data":"3ed9d4b57911d293c42cddc830d9fd8afb0535177f797f25029fbf365f0b4fca"} Mar 14 09:00:54 crc kubenswrapper[5129]: I0314 09:00:54.040462 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.040440043 podStartE2EDuration="3.040440043s" podCreationTimestamp="2026-03-14 09:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:00:54.03551254 +0000 UTC m=+7316.787427724" watchObservedRunningTime="2026-03-14 09:00:54.040440043 +0000 UTC m=+7316.792355227" Mar 14 09:00:55 crc kubenswrapper[5129]: I0314 
09:00:55.037636 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 09:00:55 crc kubenswrapper[5129]: E0314 09:00:55.038428 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:00:55 crc kubenswrapper[5129]: I0314 09:00:55.207305 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 14 09:00:56 crc kubenswrapper[5129]: I0314 09:00:56.412315 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 14 09:01:00 crc kubenswrapper[5129]: I0314 09:01:00.157760 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29557981-mvkq7"] Mar 14 09:01:00 crc kubenswrapper[5129]: I0314 09:01:00.159714 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29557981-mvkq7" Mar 14 09:01:00 crc kubenswrapper[5129]: I0314 09:01:00.170071 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29557981-mvkq7"] Mar 14 09:01:00 crc kubenswrapper[5129]: I0314 09:01:00.285138 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb6hj\" (UniqueName: \"kubernetes.io/projected/06f8fa26-0897-4a17-a055-86534de558f7-kube-api-access-rb6hj\") pod \"keystone-cron-29557981-mvkq7\" (UID: \"06f8fa26-0897-4a17-a055-86534de558f7\") " pod="openstack/keystone-cron-29557981-mvkq7" Mar 14 09:01:00 crc kubenswrapper[5129]: I0314 09:01:00.285200 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f8fa26-0897-4a17-a055-86534de558f7-combined-ca-bundle\") pod \"keystone-cron-29557981-mvkq7\" (UID: \"06f8fa26-0897-4a17-a055-86534de558f7\") " pod="openstack/keystone-cron-29557981-mvkq7" Mar 14 09:01:00 crc kubenswrapper[5129]: I0314 09:01:00.285271 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f8fa26-0897-4a17-a055-86534de558f7-config-data\") pod \"keystone-cron-29557981-mvkq7\" (UID: \"06f8fa26-0897-4a17-a055-86534de558f7\") " pod="openstack/keystone-cron-29557981-mvkq7" Mar 14 09:01:00 crc kubenswrapper[5129]: I0314 09:01:00.285295 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06f8fa26-0897-4a17-a055-86534de558f7-fernet-keys\") pod \"keystone-cron-29557981-mvkq7\" (UID: \"06f8fa26-0897-4a17-a055-86534de558f7\") " pod="openstack/keystone-cron-29557981-mvkq7" Mar 14 09:01:00 crc kubenswrapper[5129]: I0314 09:01:00.387301 5129 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f8fa26-0897-4a17-a055-86534de558f7-config-data\") pod \"keystone-cron-29557981-mvkq7\" (UID: \"06f8fa26-0897-4a17-a055-86534de558f7\") " pod="openstack/keystone-cron-29557981-mvkq7" Mar 14 09:01:00 crc kubenswrapper[5129]: I0314 09:01:00.387368 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06f8fa26-0897-4a17-a055-86534de558f7-fernet-keys\") pod \"keystone-cron-29557981-mvkq7\" (UID: \"06f8fa26-0897-4a17-a055-86534de558f7\") " pod="openstack/keystone-cron-29557981-mvkq7" Mar 14 09:01:00 crc kubenswrapper[5129]: I0314 09:01:00.387442 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb6hj\" (UniqueName: \"kubernetes.io/projected/06f8fa26-0897-4a17-a055-86534de558f7-kube-api-access-rb6hj\") pod \"keystone-cron-29557981-mvkq7\" (UID: \"06f8fa26-0897-4a17-a055-86534de558f7\") " pod="openstack/keystone-cron-29557981-mvkq7" Mar 14 09:01:00 crc kubenswrapper[5129]: I0314 09:01:00.387469 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f8fa26-0897-4a17-a055-86534de558f7-combined-ca-bundle\") pod \"keystone-cron-29557981-mvkq7\" (UID: \"06f8fa26-0897-4a17-a055-86534de558f7\") " pod="openstack/keystone-cron-29557981-mvkq7" Mar 14 09:01:00 crc kubenswrapper[5129]: I0314 09:01:00.394441 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06f8fa26-0897-4a17-a055-86534de558f7-fernet-keys\") pod \"keystone-cron-29557981-mvkq7\" (UID: \"06f8fa26-0897-4a17-a055-86534de558f7\") " pod="openstack/keystone-cron-29557981-mvkq7" Mar 14 09:01:00 crc kubenswrapper[5129]: I0314 09:01:00.395304 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/06f8fa26-0897-4a17-a055-86534de558f7-combined-ca-bundle\") pod \"keystone-cron-29557981-mvkq7\" (UID: \"06f8fa26-0897-4a17-a055-86534de558f7\") " pod="openstack/keystone-cron-29557981-mvkq7" Mar 14 09:01:00 crc kubenswrapper[5129]: I0314 09:01:00.397006 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f8fa26-0897-4a17-a055-86534de558f7-config-data\") pod \"keystone-cron-29557981-mvkq7\" (UID: \"06f8fa26-0897-4a17-a055-86534de558f7\") " pod="openstack/keystone-cron-29557981-mvkq7" Mar 14 09:01:00 crc kubenswrapper[5129]: I0314 09:01:00.406152 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb6hj\" (UniqueName: \"kubernetes.io/projected/06f8fa26-0897-4a17-a055-86534de558f7-kube-api-access-rb6hj\") pod \"keystone-cron-29557981-mvkq7\" (UID: \"06f8fa26-0897-4a17-a055-86534de558f7\") " pod="openstack/keystone-cron-29557981-mvkq7" Mar 14 09:01:00 crc kubenswrapper[5129]: I0314 09:01:00.491119 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29557981-mvkq7" Mar 14 09:01:00 crc kubenswrapper[5129]: W0314 09:01:00.960507 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06f8fa26_0897_4a17_a055_86534de558f7.slice/crio-8d69cb12f70e34a8fe1efbc0947677835d5707610a5c3a27d06dd44a6b8152bf WatchSource:0}: Error finding container 8d69cb12f70e34a8fe1efbc0947677835d5707610a5c3a27d06dd44a6b8152bf: Status 404 returned error can't find the container with id 8d69cb12f70e34a8fe1efbc0947677835d5707610a5c3a27d06dd44a6b8152bf Mar 14 09:01:00 crc kubenswrapper[5129]: I0314 09:01:00.969757 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29557981-mvkq7"] Mar 14 09:01:01 crc kubenswrapper[5129]: I0314 09:01:01.108933 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557981-mvkq7" event={"ID":"06f8fa26-0897-4a17-a055-86534de558f7","Type":"ContainerStarted","Data":"8d69cb12f70e34a8fe1efbc0947677835d5707610a5c3a27d06dd44a6b8152bf"} Mar 14 09:01:01 crc kubenswrapper[5129]: I0314 09:01:01.630131 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 14 09:01:01 crc kubenswrapper[5129]: I0314 09:01:01.973321 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-7rjql"] Mar 14 09:01:01 crc kubenswrapper[5129]: I0314 09:01:01.974845 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-7rjql" Mar 14 09:01:01 crc kubenswrapper[5129]: I0314 09:01:01.999726 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7rjql"] Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.089706 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-276d-account-create-update-v486s"] Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.093707 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-276d-account-create-update-v486s" Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.099406 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.102925 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-276d-account-create-update-v486s"] Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.126543 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8ml9\" (UniqueName: \"kubernetes.io/projected/7c093d97-b823-4772-a6ac-e324f8c64188-kube-api-access-q8ml9\") pod \"glance-db-create-7rjql\" (UID: \"7c093d97-b823-4772-a6ac-e324f8c64188\") " pod="openstack/glance-db-create-7rjql" Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.126578 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c093d97-b823-4772-a6ac-e324f8c64188-operator-scripts\") pod \"glance-db-create-7rjql\" (UID: \"7c093d97-b823-4772-a6ac-e324f8c64188\") " pod="openstack/glance-db-create-7rjql" Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.130555 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557981-mvkq7" 
event={"ID":"06f8fa26-0897-4a17-a055-86534de558f7","Type":"ContainerStarted","Data":"e5729746d4e6c0bfc00a64d2c99f87b6b29bbfc8cdff32c77996fc35b55d4fc6"} Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.156390 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29557981-mvkq7" podStartSLOduration=2.156363962 podStartE2EDuration="2.156363962s" podCreationTimestamp="2026-03-14 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:01:02.149693342 +0000 UTC m=+7324.901608526" watchObservedRunningTime="2026-03-14 09:01:02.156363962 +0000 UTC m=+7324.908279146" Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.228244 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c093d97-b823-4772-a6ac-e324f8c64188-operator-scripts\") pod \"glance-db-create-7rjql\" (UID: \"7c093d97-b823-4772-a6ac-e324f8c64188\") " pod="openstack/glance-db-create-7rjql" Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.228368 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61da916-484d-43c8-98a1-b7d5846218cf-operator-scripts\") pod \"glance-276d-account-create-update-v486s\" (UID: \"b61da916-484d-43c8-98a1-b7d5846218cf\") " pod="openstack/glance-276d-account-create-update-v486s" Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.228501 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rr94\" (UniqueName: \"kubernetes.io/projected/b61da916-484d-43c8-98a1-b7d5846218cf-kube-api-access-5rr94\") pod \"glance-276d-account-create-update-v486s\" (UID: \"b61da916-484d-43c8-98a1-b7d5846218cf\") " pod="openstack/glance-276d-account-create-update-v486s" Mar 14 09:01:02 
crc kubenswrapper[5129]: I0314 09:01:02.229693 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8ml9\" (UniqueName: \"kubernetes.io/projected/7c093d97-b823-4772-a6ac-e324f8c64188-kube-api-access-q8ml9\") pod \"glance-db-create-7rjql\" (UID: \"7c093d97-b823-4772-a6ac-e324f8c64188\") " pod="openstack/glance-db-create-7rjql" Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.230129 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c093d97-b823-4772-a6ac-e324f8c64188-operator-scripts\") pod \"glance-db-create-7rjql\" (UID: \"7c093d97-b823-4772-a6ac-e324f8c64188\") " pod="openstack/glance-db-create-7rjql" Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.248742 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8ml9\" (UniqueName: \"kubernetes.io/projected/7c093d97-b823-4772-a6ac-e324f8c64188-kube-api-access-q8ml9\") pod \"glance-db-create-7rjql\" (UID: \"7c093d97-b823-4772-a6ac-e324f8c64188\") " pod="openstack/glance-db-create-7rjql" Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.306028 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-7rjql" Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.331747 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rr94\" (UniqueName: \"kubernetes.io/projected/b61da916-484d-43c8-98a1-b7d5846218cf-kube-api-access-5rr94\") pod \"glance-276d-account-create-update-v486s\" (UID: \"b61da916-484d-43c8-98a1-b7d5846218cf\") " pod="openstack/glance-276d-account-create-update-v486s" Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.332027 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61da916-484d-43c8-98a1-b7d5846218cf-operator-scripts\") pod \"glance-276d-account-create-update-v486s\" (UID: \"b61da916-484d-43c8-98a1-b7d5846218cf\") " pod="openstack/glance-276d-account-create-update-v486s" Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.332865 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61da916-484d-43c8-98a1-b7d5846218cf-operator-scripts\") pod \"glance-276d-account-create-update-v486s\" (UID: \"b61da916-484d-43c8-98a1-b7d5846218cf\") " pod="openstack/glance-276d-account-create-update-v486s" Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.356468 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rr94\" (UniqueName: \"kubernetes.io/projected/b61da916-484d-43c8-98a1-b7d5846218cf-kube-api-access-5rr94\") pod \"glance-276d-account-create-update-v486s\" (UID: \"b61da916-484d-43c8-98a1-b7d5846218cf\") " pod="openstack/glance-276d-account-create-update-v486s" Mar 14 09:01:02 crc kubenswrapper[5129]: I0314 09:01:02.425869 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-276d-account-create-update-v486s" Mar 14 09:01:03 crc kubenswrapper[5129]: W0314 09:01:02.782175 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c093d97_b823_4772_a6ac_e324f8c64188.slice/crio-a717a9ae34b57aa8e60a202979271f899781bfa6d470641a45081d6d5c48f153 WatchSource:0}: Error finding container a717a9ae34b57aa8e60a202979271f899781bfa6d470641a45081d6d5c48f153: Status 404 returned error can't find the container with id a717a9ae34b57aa8e60a202979271f899781bfa6d470641a45081d6d5c48f153 Mar 14 09:01:03 crc kubenswrapper[5129]: I0314 09:01:02.782665 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7rjql"] Mar 14 09:01:03 crc kubenswrapper[5129]: I0314 09:01:03.141039 5129 generic.go:334] "Generic (PLEG): container finished" podID="7c093d97-b823-4772-a6ac-e324f8c64188" containerID="b04c7ca870d112e206a7ab30467f1f8f7dc9350cc8b0b085ee81ae36772ae0fd" exitCode=0 Mar 14 09:01:03 crc kubenswrapper[5129]: I0314 09:01:03.141194 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7rjql" event={"ID":"7c093d97-b823-4772-a6ac-e324f8c64188","Type":"ContainerDied","Data":"b04c7ca870d112e206a7ab30467f1f8f7dc9350cc8b0b085ee81ae36772ae0fd"} Mar 14 09:01:03 crc kubenswrapper[5129]: I0314 09:01:03.142128 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7rjql" event={"ID":"7c093d97-b823-4772-a6ac-e324f8c64188","Type":"ContainerStarted","Data":"a717a9ae34b57aa8e60a202979271f899781bfa6d470641a45081d6d5c48f153"} Mar 14 09:01:03 crc kubenswrapper[5129]: I0314 09:01:03.528371 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-276d-account-create-update-v486s"] Mar 14 09:01:03 crc kubenswrapper[5129]: W0314 09:01:03.538148 5129 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb61da916_484d_43c8_98a1_b7d5846218cf.slice/crio-5921aad7e1e5f205586eabee1ccc1f2dbc8490ac617bb472b6884cd61b78b2ca WatchSource:0}: Error finding container 5921aad7e1e5f205586eabee1ccc1f2dbc8490ac617bb472b6884cd61b78b2ca: Status 404 returned error can't find the container with id 5921aad7e1e5f205586eabee1ccc1f2dbc8490ac617bb472b6884cd61b78b2ca Mar 14 09:01:04 crc kubenswrapper[5129]: I0314 09:01:04.154384 5129 generic.go:334] "Generic (PLEG): container finished" podID="06f8fa26-0897-4a17-a055-86534de558f7" containerID="e5729746d4e6c0bfc00a64d2c99f87b6b29bbfc8cdff32c77996fc35b55d4fc6" exitCode=0 Mar 14 09:01:04 crc kubenswrapper[5129]: I0314 09:01:04.154460 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557981-mvkq7" event={"ID":"06f8fa26-0897-4a17-a055-86534de558f7","Type":"ContainerDied","Data":"e5729746d4e6c0bfc00a64d2c99f87b6b29bbfc8cdff32c77996fc35b55d4fc6"} Mar 14 09:01:04 crc kubenswrapper[5129]: I0314 09:01:04.157097 5129 generic.go:334] "Generic (PLEG): container finished" podID="b61da916-484d-43c8-98a1-b7d5846218cf" containerID="414a4b03f77f957031a3dcc3fa489ae983a68da4a1c13fb14da0287016ae3aee" exitCode=0 Mar 14 09:01:04 crc kubenswrapper[5129]: I0314 09:01:04.157171 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-276d-account-create-update-v486s" event={"ID":"b61da916-484d-43c8-98a1-b7d5846218cf","Type":"ContainerDied","Data":"414a4b03f77f957031a3dcc3fa489ae983a68da4a1c13fb14da0287016ae3aee"} Mar 14 09:01:04 crc kubenswrapper[5129]: I0314 09:01:04.157239 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-276d-account-create-update-v486s" event={"ID":"b61da916-484d-43c8-98a1-b7d5846218cf","Type":"ContainerStarted","Data":"5921aad7e1e5f205586eabee1ccc1f2dbc8490ac617bb472b6884cd61b78b2ca"} Mar 14 09:01:04 crc kubenswrapper[5129]: I0314 09:01:04.522786 5129 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-db-create-7rjql" Mar 14 09:01:04 crc kubenswrapper[5129]: I0314 09:01:04.697843 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8ml9\" (UniqueName: \"kubernetes.io/projected/7c093d97-b823-4772-a6ac-e324f8c64188-kube-api-access-q8ml9\") pod \"7c093d97-b823-4772-a6ac-e324f8c64188\" (UID: \"7c093d97-b823-4772-a6ac-e324f8c64188\") " Mar 14 09:01:04 crc kubenswrapper[5129]: I0314 09:01:04.698119 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c093d97-b823-4772-a6ac-e324f8c64188-operator-scripts\") pod \"7c093d97-b823-4772-a6ac-e324f8c64188\" (UID: \"7c093d97-b823-4772-a6ac-e324f8c64188\") " Mar 14 09:01:04 crc kubenswrapper[5129]: I0314 09:01:04.699150 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c093d97-b823-4772-a6ac-e324f8c64188-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c093d97-b823-4772-a6ac-e324f8c64188" (UID: "7c093d97-b823-4772-a6ac-e324f8c64188"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:01:04 crc kubenswrapper[5129]: I0314 09:01:04.705999 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c093d97-b823-4772-a6ac-e324f8c64188-kube-api-access-q8ml9" (OuterVolumeSpecName: "kube-api-access-q8ml9") pod "7c093d97-b823-4772-a6ac-e324f8c64188" (UID: "7c093d97-b823-4772-a6ac-e324f8c64188"). InnerVolumeSpecName "kube-api-access-q8ml9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:01:04 crc kubenswrapper[5129]: I0314 09:01:04.801139 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c093d97-b823-4772-a6ac-e324f8c64188-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:04 crc kubenswrapper[5129]: I0314 09:01:04.801225 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8ml9\" (UniqueName: \"kubernetes.io/projected/7c093d97-b823-4772-a6ac-e324f8c64188-kube-api-access-q8ml9\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.169353 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7rjql" event={"ID":"7c093d97-b823-4772-a6ac-e324f8c64188","Type":"ContainerDied","Data":"a717a9ae34b57aa8e60a202979271f899781bfa6d470641a45081d6d5c48f153"} Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.169514 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a717a9ae34b57aa8e60a202979271f899781bfa6d470641a45081d6d5c48f153" Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.169387 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7rjql" Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.620137 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-276d-account-create-update-v486s" Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.628461 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29557981-mvkq7" Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.732058 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rr94\" (UniqueName: \"kubernetes.io/projected/b61da916-484d-43c8-98a1-b7d5846218cf-kube-api-access-5rr94\") pod \"b61da916-484d-43c8-98a1-b7d5846218cf\" (UID: \"b61da916-484d-43c8-98a1-b7d5846218cf\") " Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.732163 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06f8fa26-0897-4a17-a055-86534de558f7-fernet-keys\") pod \"06f8fa26-0897-4a17-a055-86534de558f7\" (UID: \"06f8fa26-0897-4a17-a055-86534de558f7\") " Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.732271 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61da916-484d-43c8-98a1-b7d5846218cf-operator-scripts\") pod \"b61da916-484d-43c8-98a1-b7d5846218cf\" (UID: \"b61da916-484d-43c8-98a1-b7d5846218cf\") " Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.732978 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b61da916-484d-43c8-98a1-b7d5846218cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b61da916-484d-43c8-98a1-b7d5846218cf" (UID: "b61da916-484d-43c8-98a1-b7d5846218cf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.733059 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb6hj\" (UniqueName: \"kubernetes.io/projected/06f8fa26-0897-4a17-a055-86534de558f7-kube-api-access-rb6hj\") pod \"06f8fa26-0897-4a17-a055-86534de558f7\" (UID: \"06f8fa26-0897-4a17-a055-86534de558f7\") " Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.733096 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f8fa26-0897-4a17-a055-86534de558f7-config-data\") pod \"06f8fa26-0897-4a17-a055-86534de558f7\" (UID: \"06f8fa26-0897-4a17-a055-86534de558f7\") " Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.733432 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f8fa26-0897-4a17-a055-86534de558f7-combined-ca-bundle\") pod \"06f8fa26-0897-4a17-a055-86534de558f7\" (UID: \"06f8fa26-0897-4a17-a055-86534de558f7\") " Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.733828 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61da916-484d-43c8-98a1-b7d5846218cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.740680 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f8fa26-0897-4a17-a055-86534de558f7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "06f8fa26-0897-4a17-a055-86534de558f7" (UID: "06f8fa26-0897-4a17-a055-86534de558f7"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.742424 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06f8fa26-0897-4a17-a055-86534de558f7-kube-api-access-rb6hj" (OuterVolumeSpecName: "kube-api-access-rb6hj") pod "06f8fa26-0897-4a17-a055-86534de558f7" (UID: "06f8fa26-0897-4a17-a055-86534de558f7"). InnerVolumeSpecName "kube-api-access-rb6hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.742788 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b61da916-484d-43c8-98a1-b7d5846218cf-kube-api-access-5rr94" (OuterVolumeSpecName: "kube-api-access-5rr94") pod "b61da916-484d-43c8-98a1-b7d5846218cf" (UID: "b61da916-484d-43c8-98a1-b7d5846218cf"). InnerVolumeSpecName "kube-api-access-5rr94". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.761731 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f8fa26-0897-4a17-a055-86534de558f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06f8fa26-0897-4a17-a055-86534de558f7" (UID: "06f8fa26-0897-4a17-a055-86534de558f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.789933 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f8fa26-0897-4a17-a055-86534de558f7-config-data" (OuterVolumeSpecName: "config-data") pod "06f8fa26-0897-4a17-a055-86534de558f7" (UID: "06f8fa26-0897-4a17-a055-86534de558f7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.835232 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb6hj\" (UniqueName: \"kubernetes.io/projected/06f8fa26-0897-4a17-a055-86534de558f7-kube-api-access-rb6hj\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.835269 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f8fa26-0897-4a17-a055-86534de558f7-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.835281 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f8fa26-0897-4a17-a055-86534de558f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.835289 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rr94\" (UniqueName: \"kubernetes.io/projected/b61da916-484d-43c8-98a1-b7d5846218cf-kube-api-access-5rr94\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:05 crc kubenswrapper[5129]: I0314 09:01:05.835297 5129 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06f8fa26-0897-4a17-a055-86534de558f7-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:06 crc kubenswrapper[5129]: I0314 09:01:06.036282 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 09:01:06 crc kubenswrapper[5129]: E0314 09:01:06.036859 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:01:06 crc kubenswrapper[5129]: I0314 09:01:06.179417 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557981-mvkq7" event={"ID":"06f8fa26-0897-4a17-a055-86534de558f7","Type":"ContainerDied","Data":"8d69cb12f70e34a8fe1efbc0947677835d5707610a5c3a27d06dd44a6b8152bf"} Mar 14 09:01:06 crc kubenswrapper[5129]: I0314 09:01:06.179691 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d69cb12f70e34a8fe1efbc0947677835d5707610a5c3a27d06dd44a6b8152bf" Mar 14 09:01:06 crc kubenswrapper[5129]: I0314 09:01:06.179830 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29557981-mvkq7" Mar 14 09:01:06 crc kubenswrapper[5129]: I0314 09:01:06.181942 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-276d-account-create-update-v486s" event={"ID":"b61da916-484d-43c8-98a1-b7d5846218cf","Type":"ContainerDied","Data":"5921aad7e1e5f205586eabee1ccc1f2dbc8490ac617bb472b6884cd61b78b2ca"} Mar 14 09:01:06 crc kubenswrapper[5129]: I0314 09:01:06.181967 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5921aad7e1e5f205586eabee1ccc1f2dbc8490ac617bb472b6884cd61b78b2ca" Mar 14 09:01:06 crc kubenswrapper[5129]: I0314 09:01:06.182434 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-276d-account-create-update-v486s" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.151707 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-q2sjz"] Mar 14 09:01:07 crc kubenswrapper[5129]: E0314 09:01:07.152158 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b61da916-484d-43c8-98a1-b7d5846218cf" containerName="mariadb-account-create-update" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.152173 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61da916-484d-43c8-98a1-b7d5846218cf" containerName="mariadb-account-create-update" Mar 14 09:01:07 crc kubenswrapper[5129]: E0314 09:01:07.152184 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c093d97-b823-4772-a6ac-e324f8c64188" containerName="mariadb-database-create" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.152192 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c093d97-b823-4772-a6ac-e324f8c64188" containerName="mariadb-database-create" Mar 14 09:01:07 crc kubenswrapper[5129]: E0314 09:01:07.152236 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f8fa26-0897-4a17-a055-86534de558f7" containerName="keystone-cron" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.152246 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f8fa26-0897-4a17-a055-86534de558f7" containerName="keystone-cron" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.152539 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b61da916-484d-43c8-98a1-b7d5846218cf" containerName="mariadb-account-create-update" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.152557 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c093d97-b823-4772-a6ac-e324f8c64188" containerName="mariadb-database-create" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.152568 5129 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="06f8fa26-0897-4a17-a055-86534de558f7" containerName="keystone-cron" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.153161 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-q2sjz" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.156302 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.159258 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-p4sfk" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.169385 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-q2sjz"] Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.259842 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7gqt\" (UniqueName: \"kubernetes.io/projected/2d7cfecd-212c-45e6-b678-5e673ff37968-kube-api-access-v7gqt\") pod \"glance-db-sync-q2sjz\" (UID: \"2d7cfecd-212c-45e6-b678-5e673ff37968\") " pod="openstack/glance-db-sync-q2sjz" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.260365 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2d7cfecd-212c-45e6-b678-5e673ff37968-db-sync-config-data\") pod \"glance-db-sync-q2sjz\" (UID: \"2d7cfecd-212c-45e6-b678-5e673ff37968\") " pod="openstack/glance-db-sync-q2sjz" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.260390 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d7cfecd-212c-45e6-b678-5e673ff37968-config-data\") pod \"glance-db-sync-q2sjz\" (UID: \"2d7cfecd-212c-45e6-b678-5e673ff37968\") " pod="openstack/glance-db-sync-q2sjz" Mar 14 09:01:07 crc 
kubenswrapper[5129]: I0314 09:01:07.260429 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d7cfecd-212c-45e6-b678-5e673ff37968-combined-ca-bundle\") pod \"glance-db-sync-q2sjz\" (UID: \"2d7cfecd-212c-45e6-b678-5e673ff37968\") " pod="openstack/glance-db-sync-q2sjz" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.362748 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2d7cfecd-212c-45e6-b678-5e673ff37968-db-sync-config-data\") pod \"glance-db-sync-q2sjz\" (UID: \"2d7cfecd-212c-45e6-b678-5e673ff37968\") " pod="openstack/glance-db-sync-q2sjz" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.362813 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d7cfecd-212c-45e6-b678-5e673ff37968-config-data\") pod \"glance-db-sync-q2sjz\" (UID: \"2d7cfecd-212c-45e6-b678-5e673ff37968\") " pod="openstack/glance-db-sync-q2sjz" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.362862 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d7cfecd-212c-45e6-b678-5e673ff37968-combined-ca-bundle\") pod \"glance-db-sync-q2sjz\" (UID: \"2d7cfecd-212c-45e6-b678-5e673ff37968\") " pod="openstack/glance-db-sync-q2sjz" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.362968 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7gqt\" (UniqueName: \"kubernetes.io/projected/2d7cfecd-212c-45e6-b678-5e673ff37968-kube-api-access-v7gqt\") pod \"glance-db-sync-q2sjz\" (UID: \"2d7cfecd-212c-45e6-b678-5e673ff37968\") " pod="openstack/glance-db-sync-q2sjz" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.371728 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2d7cfecd-212c-45e6-b678-5e673ff37968-db-sync-config-data\") pod \"glance-db-sync-q2sjz\" (UID: \"2d7cfecd-212c-45e6-b678-5e673ff37968\") " pod="openstack/glance-db-sync-q2sjz" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.371852 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d7cfecd-212c-45e6-b678-5e673ff37968-combined-ca-bundle\") pod \"glance-db-sync-q2sjz\" (UID: \"2d7cfecd-212c-45e6-b678-5e673ff37968\") " pod="openstack/glance-db-sync-q2sjz" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.372326 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d7cfecd-212c-45e6-b678-5e673ff37968-config-data\") pod \"glance-db-sync-q2sjz\" (UID: \"2d7cfecd-212c-45e6-b678-5e673ff37968\") " pod="openstack/glance-db-sync-q2sjz" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.396124 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7gqt\" (UniqueName: \"kubernetes.io/projected/2d7cfecd-212c-45e6-b678-5e673ff37968-kube-api-access-v7gqt\") pod \"glance-db-sync-q2sjz\" (UID: \"2d7cfecd-212c-45e6-b678-5e673ff37968\") " pod="openstack/glance-db-sync-q2sjz" Mar 14 09:01:07 crc kubenswrapper[5129]: I0314 09:01:07.479085 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-q2sjz" Mar 14 09:01:08 crc kubenswrapper[5129]: I0314 09:01:08.032863 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-q2sjz"] Mar 14 09:01:08 crc kubenswrapper[5129]: I0314 09:01:08.205290 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-q2sjz" event={"ID":"2d7cfecd-212c-45e6-b678-5e673ff37968","Type":"ContainerStarted","Data":"519dd4ad5e5b03ff4a02a522eb2b18ed26672b9c2d26604092b6c5885768993d"} Mar 14 09:01:21 crc kubenswrapper[5129]: I0314 09:01:21.036700 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 09:01:21 crc kubenswrapper[5129]: E0314 09:01:21.037962 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:01:25 crc kubenswrapper[5129]: I0314 09:01:25.354326 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-q2sjz" event={"ID":"2d7cfecd-212c-45e6-b678-5e673ff37968","Type":"ContainerStarted","Data":"b7ec48e09f49897d15d62ac69c4f6b5157fde2f77f6ab2e3d6369c7f088652c2"} Mar 14 09:01:25 crc kubenswrapper[5129]: I0314 09:01:25.388308 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-q2sjz" podStartSLOduration=1.9474577910000002 podStartE2EDuration="18.388286164s" podCreationTimestamp="2026-03-14 09:01:07 +0000 UTC" firstStartedPulling="2026-03-14 09:01:08.044651132 +0000 UTC m=+7330.796566316" lastFinishedPulling="2026-03-14 09:01:24.485479505 +0000 UTC m=+7347.237394689" observedRunningTime="2026-03-14 
09:01:25.382276881 +0000 UTC m=+7348.134192075" watchObservedRunningTime="2026-03-14 09:01:25.388286164 +0000 UTC m=+7348.140201348" Mar 14 09:01:28 crc kubenswrapper[5129]: I0314 09:01:28.390327 5129 generic.go:334] "Generic (PLEG): container finished" podID="2d7cfecd-212c-45e6-b678-5e673ff37968" containerID="b7ec48e09f49897d15d62ac69c4f6b5157fde2f77f6ab2e3d6369c7f088652c2" exitCode=0 Mar 14 09:01:28 crc kubenswrapper[5129]: I0314 09:01:28.390421 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-q2sjz" event={"ID":"2d7cfecd-212c-45e6-b678-5e673ff37968","Type":"ContainerDied","Data":"b7ec48e09f49897d15d62ac69c4f6b5157fde2f77f6ab2e3d6369c7f088652c2"} Mar 14 09:01:29 crc kubenswrapper[5129]: I0314 09:01:29.862293 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-q2sjz" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.035890 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d7cfecd-212c-45e6-b678-5e673ff37968-combined-ca-bundle\") pod \"2d7cfecd-212c-45e6-b678-5e673ff37968\" (UID: \"2d7cfecd-212c-45e6-b678-5e673ff37968\") " Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.035982 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2d7cfecd-212c-45e6-b678-5e673ff37968-db-sync-config-data\") pod \"2d7cfecd-212c-45e6-b678-5e673ff37968\" (UID: \"2d7cfecd-212c-45e6-b678-5e673ff37968\") " Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.036217 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d7cfecd-212c-45e6-b678-5e673ff37968-config-data\") pod \"2d7cfecd-212c-45e6-b678-5e673ff37968\" (UID: \"2d7cfecd-212c-45e6-b678-5e673ff37968\") " Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 
09:01:30.036448 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7gqt\" (UniqueName: \"kubernetes.io/projected/2d7cfecd-212c-45e6-b678-5e673ff37968-kube-api-access-v7gqt\") pod \"2d7cfecd-212c-45e6-b678-5e673ff37968\" (UID: \"2d7cfecd-212c-45e6-b678-5e673ff37968\") " Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.041897 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d7cfecd-212c-45e6-b678-5e673ff37968-kube-api-access-v7gqt" (OuterVolumeSpecName: "kube-api-access-v7gqt") pod "2d7cfecd-212c-45e6-b678-5e673ff37968" (UID: "2d7cfecd-212c-45e6-b678-5e673ff37968"). InnerVolumeSpecName "kube-api-access-v7gqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.047846 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d7cfecd-212c-45e6-b678-5e673ff37968-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2d7cfecd-212c-45e6-b678-5e673ff37968" (UID: "2d7cfecd-212c-45e6-b678-5e673ff37968"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.082134 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d7cfecd-212c-45e6-b678-5e673ff37968-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d7cfecd-212c-45e6-b678-5e673ff37968" (UID: "2d7cfecd-212c-45e6-b678-5e673ff37968"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.086666 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d7cfecd-212c-45e6-b678-5e673ff37968-config-data" (OuterVolumeSpecName: "config-data") pod "2d7cfecd-212c-45e6-b678-5e673ff37968" (UID: "2d7cfecd-212c-45e6-b678-5e673ff37968"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.139155 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7gqt\" (UniqueName: \"kubernetes.io/projected/2d7cfecd-212c-45e6-b678-5e673ff37968-kube-api-access-v7gqt\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.139203 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d7cfecd-212c-45e6-b678-5e673ff37968-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.139216 5129 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2d7cfecd-212c-45e6-b678-5e673ff37968-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.139226 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d7cfecd-212c-45e6-b678-5e673ff37968-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.413126 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-q2sjz" event={"ID":"2d7cfecd-212c-45e6-b678-5e673ff37968","Type":"ContainerDied","Data":"519dd4ad5e5b03ff4a02a522eb2b18ed26672b9c2d26604092b6c5885768993d"} Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.413759 5129 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="519dd4ad5e5b03ff4a02a522eb2b18ed26672b9c2d26604092b6c5885768993d" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.413228 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-q2sjz" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.713182 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 09:01:30 crc kubenswrapper[5129]: E0314 09:01:30.714969 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d7cfecd-212c-45e6-b678-5e673ff37968" containerName="glance-db-sync" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.715000 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d7cfecd-212c-45e6-b678-5e673ff37968" containerName="glance-db-sync" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.715195 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d7cfecd-212c-45e6-b678-5e673ff37968" containerName="glance-db-sync" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.716170 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.719048 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.719431 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.719639 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-p4sfk" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.799573 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.851135 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56c4b58fb7-5mf7r"] Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.852510 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.856496 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/268b884c-e047-4ce4-b6e3-122fdd2a76b3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.856549 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq8dm\" (UniqueName: \"kubernetes.io/projected/268b884c-e047-4ce4-b6e3-122fdd2a76b3-kube-api-access-sq8dm\") pod \"glance-default-external-api-0\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.856628 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/268b884c-e047-4ce4-b6e3-122fdd2a76b3-logs\") pod \"glance-default-external-api-0\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.856692 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/268b884c-e047-4ce4-b6e3-122fdd2a76b3-config-data\") pod \"glance-default-external-api-0\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.856886 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/268b884c-e047-4ce4-b6e3-122fdd2a76b3-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.857040 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268b884c-e047-4ce4-b6e3-122fdd2a76b3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.905069 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56c4b58fb7-5mf7r"] Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.950179 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.951678 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.954160 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.959683 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268b884c-e047-4ce4-b6e3-122fdd2a76b3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.959747 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-dns-svc\") pod \"dnsmasq-dns-56c4b58fb7-5mf7r\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 
09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.959799 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/268b884c-e047-4ce4-b6e3-122fdd2a76b3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.960281 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/268b884c-e047-4ce4-b6e3-122fdd2a76b3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.959828 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq8dm\" (UniqueName: \"kubernetes.io/projected/268b884c-e047-4ce4-b6e3-122fdd2a76b3-kube-api-access-sq8dm\") pod \"glance-default-external-api-0\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.960409 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/268b884c-e047-4ce4-b6e3-122fdd2a76b3-logs\") pod \"glance-default-external-api-0\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.960778 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/268b884c-e047-4ce4-b6e3-122fdd2a76b3-logs\") pod \"glance-default-external-api-0\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.960860 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzn5h\" (UniqueName: \"kubernetes.io/projected/c4715224-6c27-4dc1-9380-fe174dcc9521-kube-api-access-gzn5h\") pod \"dnsmasq-dns-56c4b58fb7-5mf7r\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.960943 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/268b884c-e047-4ce4-b6e3-122fdd2a76b3-config-data\") pod \"glance-default-external-api-0\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.960997 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-ovsdbserver-sb\") pod \"dnsmasq-dns-56c4b58fb7-5mf7r\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.961112 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-config\") pod \"dnsmasq-dns-56c4b58fb7-5mf7r\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.961154 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-ovsdbserver-nb\") pod \"dnsmasq-dns-56c4b58fb7-5mf7r\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.961198 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/268b884c-e047-4ce4-b6e3-122fdd2a76b3-scripts\") pod \"glance-default-external-api-0\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.966296 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268b884c-e047-4ce4-b6e3-122fdd2a76b3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.970557 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/268b884c-e047-4ce4-b6e3-122fdd2a76b3-config-data\") pod \"glance-default-external-api-0\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.981342 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 09:01:30 crc kubenswrapper[5129]: I0314 09:01:30.982068 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/268b884c-e047-4ce4-b6e3-122fdd2a76b3-scripts\") pod \"glance-default-external-api-0\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.007138 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq8dm\" (UniqueName: \"kubernetes.io/projected/268b884c-e047-4ce4-b6e3-122fdd2a76b3-kube-api-access-sq8dm\") pod \"glance-default-external-api-0\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " pod="openstack/glance-default-external-api-0" Mar 
14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.033084 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.063771 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831185d4-1300-45aa-8999-8f4a5230ffd7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.063852 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvnlm\" (UniqueName: \"kubernetes.io/projected/831185d4-1300-45aa-8999-8f4a5230ffd7-kube-api-access-xvnlm\") pod \"glance-default-internal-api-0\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.063913 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831185d4-1300-45aa-8999-8f4a5230ffd7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.063958 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzn5h\" (UniqueName: \"kubernetes.io/projected/c4715224-6c27-4dc1-9380-fe174dcc9521-kube-api-access-gzn5h\") pod \"dnsmasq-dns-56c4b58fb7-5mf7r\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.063988 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/831185d4-1300-45aa-8999-8f4a5230ffd7-logs\") pod \"glance-default-internal-api-0\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.064036 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-ovsdbserver-sb\") pod \"dnsmasq-dns-56c4b58fb7-5mf7r\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.064084 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-config\") pod \"dnsmasq-dns-56c4b58fb7-5mf7r\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.064114 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/831185d4-1300-45aa-8999-8f4a5230ffd7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.064145 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-ovsdbserver-nb\") pod \"dnsmasq-dns-56c4b58fb7-5mf7r\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.064229 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/831185d4-1300-45aa-8999-8f4a5230ffd7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.064299 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-dns-svc\") pod \"dnsmasq-dns-56c4b58fb7-5mf7r\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.065685 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-dns-svc\") pod \"dnsmasq-dns-56c4b58fb7-5mf7r\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.067014 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-ovsdbserver-sb\") pod \"dnsmasq-dns-56c4b58fb7-5mf7r\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.067576 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-ovsdbserver-nb\") pod \"dnsmasq-dns-56c4b58fb7-5mf7r\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.071575 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-config\") pod \"dnsmasq-dns-56c4b58fb7-5mf7r\" (UID: 
\"c4715224-6c27-4dc1-9380-fe174dcc9521\") " pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.090600 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzn5h\" (UniqueName: \"kubernetes.io/projected/c4715224-6c27-4dc1-9380-fe174dcc9521-kube-api-access-gzn5h\") pod \"dnsmasq-dns-56c4b58fb7-5mf7r\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.165775 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831185d4-1300-45aa-8999-8f4a5230ffd7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.165928 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831185d4-1300-45aa-8999-8f4a5230ffd7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.165950 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvnlm\" (UniqueName: \"kubernetes.io/projected/831185d4-1300-45aa-8999-8f4a5230ffd7-kube-api-access-xvnlm\") pod \"glance-default-internal-api-0\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.166006 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831185d4-1300-45aa-8999-8f4a5230ffd7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " 
pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.166039 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/831185d4-1300-45aa-8999-8f4a5230ffd7-logs\") pod \"glance-default-internal-api-0\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.166100 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/831185d4-1300-45aa-8999-8f4a5230ffd7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.167802 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/831185d4-1300-45aa-8999-8f4a5230ffd7-logs\") pod \"glance-default-internal-api-0\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.168301 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/831185d4-1300-45aa-8999-8f4a5230ffd7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.169541 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.173281 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831185d4-1300-45aa-8999-8f4a5230ffd7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.175964 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831185d4-1300-45aa-8999-8f4a5230ffd7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.192880 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvnlm\" (UniqueName: \"kubernetes.io/projected/831185d4-1300-45aa-8999-8f4a5230ffd7-kube-api-access-xvnlm\") pod \"glance-default-internal-api-0\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.210532 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831185d4-1300-45aa-8999-8f4a5230ffd7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.361060 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.703489 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 09:01:31 crc kubenswrapper[5129]: I0314 09:01:31.800956 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56c4b58fb7-5mf7r"] Mar 14 09:01:32 crc kubenswrapper[5129]: I0314 09:01:32.086725 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 09:01:32 crc kubenswrapper[5129]: W0314 09:01:32.103826 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod831185d4_1300_45aa_8999_8f4a5230ffd7.slice/crio-6ea15becd4ea94dc00024369bc0024047425e5177df75889636afe1bbbacc71d WatchSource:0}: Error finding container 6ea15becd4ea94dc00024369bc0024047425e5177df75889636afe1bbbacc71d: Status 404 returned error can't find the container with id 6ea15becd4ea94dc00024369bc0024047425e5177df75889636afe1bbbacc71d Mar 14 09:01:32 crc kubenswrapper[5129]: I0314 09:01:32.461005 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" event={"ID":"c4715224-6c27-4dc1-9380-fe174dcc9521","Type":"ContainerStarted","Data":"891b230714423c951308bf0a7d46f706930bb8d895184572fba2660cb06c6cae"} Mar 14 09:01:32 crc kubenswrapper[5129]: I0314 09:01:32.464493 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"268b884c-e047-4ce4-b6e3-122fdd2a76b3","Type":"ContainerStarted","Data":"033b297acfc4a4be6b84f9c28c117ddc1110490e095f9e59e21853c594fb52a2"} Mar 14 09:01:32 crc kubenswrapper[5129]: I0314 09:01:32.466224 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"831185d4-1300-45aa-8999-8f4a5230ffd7","Type":"ContainerStarted","Data":"6ea15becd4ea94dc00024369bc0024047425e5177df75889636afe1bbbacc71d"} Mar 14 09:01:32 crc kubenswrapper[5129]: I0314 09:01:32.655249 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 09:01:33 crc kubenswrapper[5129]: I0314 09:01:33.194027 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 09:01:33 crc kubenswrapper[5129]: I0314 09:01:33.476008 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"831185d4-1300-45aa-8999-8f4a5230ffd7","Type":"ContainerStarted","Data":"90cda46eceb0c66974ba9c94907cfd6311e71d8b12e44f6d2de8be97bb2eef26"} Mar 14 09:01:33 crc kubenswrapper[5129]: I0314 09:01:33.478132 5129 generic.go:334] "Generic (PLEG): container finished" podID="c4715224-6c27-4dc1-9380-fe174dcc9521" containerID="e213f634472cb1b1e8e23692c3610ecd0936e58bdefd080ee41e3d5dd1a53f44" exitCode=0 Mar 14 09:01:33 crc kubenswrapper[5129]: I0314 09:01:33.478169 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" event={"ID":"c4715224-6c27-4dc1-9380-fe174dcc9521","Type":"ContainerDied","Data":"e213f634472cb1b1e8e23692c3610ecd0936e58bdefd080ee41e3d5dd1a53f44"} Mar 14 09:01:33 crc kubenswrapper[5129]: I0314 09:01:33.481325 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"268b884c-e047-4ce4-b6e3-122fdd2a76b3","Type":"ContainerStarted","Data":"6fb775ebae952897aebd29a9c3e2e726db5a3cdb0938457571da084e0d12af0e"} Mar 14 09:01:33 crc kubenswrapper[5129]: I0314 09:01:33.481367 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"268b884c-e047-4ce4-b6e3-122fdd2a76b3","Type":"ContainerStarted","Data":"04cb67186b62d00e264e435df3d7845a66d510a1bb42d7ded1128bd4c907ef31"} 
Mar 14 09:01:33 crc kubenswrapper[5129]: I0314 09:01:33.481435 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="268b884c-e047-4ce4-b6e3-122fdd2a76b3" containerName="glance-log" containerID="cri-o://04cb67186b62d00e264e435df3d7845a66d510a1bb42d7ded1128bd4c907ef31" gracePeriod=30 Mar 14 09:01:33 crc kubenswrapper[5129]: I0314 09:01:33.481470 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="268b884c-e047-4ce4-b6e3-122fdd2a76b3" containerName="glance-httpd" containerID="cri-o://6fb775ebae952897aebd29a9c3e2e726db5a3cdb0938457571da084e0d12af0e" gracePeriod=30 Mar 14 09:01:33 crc kubenswrapper[5129]: I0314 09:01:33.541396 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.541377754 podStartE2EDuration="3.541377754s" podCreationTimestamp="2026-03-14 09:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:01:33.528939506 +0000 UTC m=+7356.280854690" watchObservedRunningTime="2026-03-14 09:01:33.541377754 +0000 UTC m=+7356.293292938" Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.495055 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"831185d4-1300-45aa-8999-8f4a5230ffd7","Type":"ContainerStarted","Data":"fe0df757fa945bc4d59452aaae14c503c1d0d91e3041396aec2156e819692907"} Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.495357 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="831185d4-1300-45aa-8999-8f4a5230ffd7" containerName="glance-log" containerID="cri-o://90cda46eceb0c66974ba9c94907cfd6311e71d8b12e44f6d2de8be97bb2eef26" gracePeriod=30 Mar 14 09:01:34 crc 
kubenswrapper[5129]: I0314 09:01:34.495578 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="831185d4-1300-45aa-8999-8f4a5230ffd7" containerName="glance-httpd" containerID="cri-o://fe0df757fa945bc4d59452aaae14c503c1d0d91e3041396aec2156e819692907" gracePeriod=30 Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.501396 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" event={"ID":"c4715224-6c27-4dc1-9380-fe174dcc9521","Type":"ContainerStarted","Data":"7ec56ff253cfd4101b689a17581b810258402e3c148ba662dc2cdaee3848685f"} Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.502431 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.507801 5129 generic.go:334] "Generic (PLEG): container finished" podID="268b884c-e047-4ce4-b6e3-122fdd2a76b3" containerID="6fb775ebae952897aebd29a9c3e2e726db5a3cdb0938457571da084e0d12af0e" exitCode=143 Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.507840 5129 generic.go:334] "Generic (PLEG): container finished" podID="268b884c-e047-4ce4-b6e3-122fdd2a76b3" containerID="04cb67186b62d00e264e435df3d7845a66d510a1bb42d7ded1128bd4c907ef31" exitCode=143 Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.507883 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"268b884c-e047-4ce4-b6e3-122fdd2a76b3","Type":"ContainerDied","Data":"6fb775ebae952897aebd29a9c3e2e726db5a3cdb0938457571da084e0d12af0e"} Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.507938 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"268b884c-e047-4ce4-b6e3-122fdd2a76b3","Type":"ContainerDied","Data":"04cb67186b62d00e264e435df3d7845a66d510a1bb42d7ded1128bd4c907ef31"} Mar 14 09:01:34 crc 
kubenswrapper[5129]: I0314 09:01:34.529023 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.528991734 podStartE2EDuration="4.528991734s" podCreationTimestamp="2026-03-14 09:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:01:34.520319449 +0000 UTC m=+7357.272234633" watchObservedRunningTime="2026-03-14 09:01:34.528991734 +0000 UTC m=+7357.280906918" Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.551068 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" podStartSLOduration=4.551040983 podStartE2EDuration="4.551040983s" podCreationTimestamp="2026-03-14 09:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:01:34.543138778 +0000 UTC m=+7357.295053972" watchObservedRunningTime="2026-03-14 09:01:34.551040983 +0000 UTC m=+7357.302956177" Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.836437 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.982500 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268b884c-e047-4ce4-b6e3-122fdd2a76b3-combined-ca-bundle\") pod \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.982554 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/268b884c-e047-4ce4-b6e3-122fdd2a76b3-logs\") pod \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.982585 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/268b884c-e047-4ce4-b6e3-122fdd2a76b3-httpd-run\") pod \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.982717 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq8dm\" (UniqueName: \"kubernetes.io/projected/268b884c-e047-4ce4-b6e3-122fdd2a76b3-kube-api-access-sq8dm\") pod \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.982762 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/268b884c-e047-4ce4-b6e3-122fdd2a76b3-config-data\") pod \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.982796 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/268b884c-e047-4ce4-b6e3-122fdd2a76b3-scripts\") pod \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\" (UID: \"268b884c-e047-4ce4-b6e3-122fdd2a76b3\") " Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.983072 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/268b884c-e047-4ce4-b6e3-122fdd2a76b3-logs" (OuterVolumeSpecName: "logs") pod "268b884c-e047-4ce4-b6e3-122fdd2a76b3" (UID: "268b884c-e047-4ce4-b6e3-122fdd2a76b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.983684 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/268b884c-e047-4ce4-b6e3-122fdd2a76b3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "268b884c-e047-4ce4-b6e3-122fdd2a76b3" (UID: "268b884c-e047-4ce4-b6e3-122fdd2a76b3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.983964 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/268b884c-e047-4ce4-b6e3-122fdd2a76b3-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.983978 5129 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/268b884c-e047-4ce4-b6e3-122fdd2a76b3-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:34 crc kubenswrapper[5129]: I0314 09:01:34.989857 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268b884c-e047-4ce4-b6e3-122fdd2a76b3-kube-api-access-sq8dm" (OuterVolumeSpecName: "kube-api-access-sq8dm") pod "268b884c-e047-4ce4-b6e3-122fdd2a76b3" (UID: "268b884c-e047-4ce4-b6e3-122fdd2a76b3"). InnerVolumeSpecName "kube-api-access-sq8dm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.000382 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268b884c-e047-4ce4-b6e3-122fdd2a76b3-scripts" (OuterVolumeSpecName: "scripts") pod "268b884c-e047-4ce4-b6e3-122fdd2a76b3" (UID: "268b884c-e047-4ce4-b6e3-122fdd2a76b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.014958 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268b884c-e047-4ce4-b6e3-122fdd2a76b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "268b884c-e047-4ce4-b6e3-122fdd2a76b3" (UID: "268b884c-e047-4ce4-b6e3-122fdd2a76b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.045720 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 09:01:35 crc kubenswrapper[5129]: E0314 09:01:35.045961 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.061566 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268b884c-e047-4ce4-b6e3-122fdd2a76b3-config-data" (OuterVolumeSpecName: "config-data") pod "268b884c-e047-4ce4-b6e3-122fdd2a76b3" (UID: "268b884c-e047-4ce4-b6e3-122fdd2a76b3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.086382 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq8dm\" (UniqueName: \"kubernetes.io/projected/268b884c-e047-4ce4-b6e3-122fdd2a76b3-kube-api-access-sq8dm\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.086420 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/268b884c-e047-4ce4-b6e3-122fdd2a76b3-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.086431 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/268b884c-e047-4ce4-b6e3-122fdd2a76b3-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.086442 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268b884c-e047-4ce4-b6e3-122fdd2a76b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.114461 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.197351 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/831185d4-1300-45aa-8999-8f4a5230ffd7-httpd-run\") pod \"831185d4-1300-45aa-8999-8f4a5230ffd7\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.197417 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvnlm\" (UniqueName: \"kubernetes.io/projected/831185d4-1300-45aa-8999-8f4a5230ffd7-kube-api-access-xvnlm\") pod \"831185d4-1300-45aa-8999-8f4a5230ffd7\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.197454 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831185d4-1300-45aa-8999-8f4a5230ffd7-combined-ca-bundle\") pod \"831185d4-1300-45aa-8999-8f4a5230ffd7\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.197505 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/831185d4-1300-45aa-8999-8f4a5230ffd7-logs\") pod \"831185d4-1300-45aa-8999-8f4a5230ffd7\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.197602 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831185d4-1300-45aa-8999-8f4a5230ffd7-scripts\") pod \"831185d4-1300-45aa-8999-8f4a5230ffd7\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.197675 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/831185d4-1300-45aa-8999-8f4a5230ffd7-config-data\") pod \"831185d4-1300-45aa-8999-8f4a5230ffd7\" (UID: \"831185d4-1300-45aa-8999-8f4a5230ffd7\") " Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.198008 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/831185d4-1300-45aa-8999-8f4a5230ffd7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "831185d4-1300-45aa-8999-8f4a5230ffd7" (UID: "831185d4-1300-45aa-8999-8f4a5230ffd7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.198112 5129 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/831185d4-1300-45aa-8999-8f4a5230ffd7-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.198179 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/831185d4-1300-45aa-8999-8f4a5230ffd7-logs" (OuterVolumeSpecName: "logs") pod "831185d4-1300-45aa-8999-8f4a5230ffd7" (UID: "831185d4-1300-45aa-8999-8f4a5230ffd7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.203813 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831185d4-1300-45aa-8999-8f4a5230ffd7-scripts" (OuterVolumeSpecName: "scripts") pod "831185d4-1300-45aa-8999-8f4a5230ffd7" (UID: "831185d4-1300-45aa-8999-8f4a5230ffd7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.213827 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/831185d4-1300-45aa-8999-8f4a5230ffd7-kube-api-access-xvnlm" (OuterVolumeSpecName: "kube-api-access-xvnlm") pod "831185d4-1300-45aa-8999-8f4a5230ffd7" (UID: "831185d4-1300-45aa-8999-8f4a5230ffd7"). InnerVolumeSpecName "kube-api-access-xvnlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.224825 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831185d4-1300-45aa-8999-8f4a5230ffd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "831185d4-1300-45aa-8999-8f4a5230ffd7" (UID: "831185d4-1300-45aa-8999-8f4a5230ffd7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.277788 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831185d4-1300-45aa-8999-8f4a5230ffd7-config-data" (OuterVolumeSpecName: "config-data") pod "831185d4-1300-45aa-8999-8f4a5230ffd7" (UID: "831185d4-1300-45aa-8999-8f4a5230ffd7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.299578 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831185d4-1300-45aa-8999-8f4a5230ffd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.299629 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/831185d4-1300-45aa-8999-8f4a5230ffd7-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.299641 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831185d4-1300-45aa-8999-8f4a5230ffd7-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.299650 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831185d4-1300-45aa-8999-8f4a5230ffd7-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.299659 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvnlm\" (UniqueName: \"kubernetes.io/projected/831185d4-1300-45aa-8999-8f4a5230ffd7-kube-api-access-xvnlm\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.520446 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.521476 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"268b884c-e047-4ce4-b6e3-122fdd2a76b3","Type":"ContainerDied","Data":"033b297acfc4a4be6b84f9c28c117ddc1110490e095f9e59e21853c594fb52a2"} Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.521517 5129 scope.go:117] "RemoveContainer" containerID="6fb775ebae952897aebd29a9c3e2e726db5a3cdb0938457571da084e0d12af0e" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.523504 5129 generic.go:334] "Generic (PLEG): container finished" podID="831185d4-1300-45aa-8999-8f4a5230ffd7" containerID="fe0df757fa945bc4d59452aaae14c503c1d0d91e3041396aec2156e819692907" exitCode=0 Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.523533 5129 generic.go:334] "Generic (PLEG): container finished" podID="831185d4-1300-45aa-8999-8f4a5230ffd7" containerID="90cda46eceb0c66974ba9c94907cfd6311e71d8b12e44f6d2de8be97bb2eef26" exitCode=143 Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.524536 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.527553 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"831185d4-1300-45aa-8999-8f4a5230ffd7","Type":"ContainerDied","Data":"fe0df757fa945bc4d59452aaae14c503c1d0d91e3041396aec2156e819692907"} Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.527626 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"831185d4-1300-45aa-8999-8f4a5230ffd7","Type":"ContainerDied","Data":"90cda46eceb0c66974ba9c94907cfd6311e71d8b12e44f6d2de8be97bb2eef26"} Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.527641 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"831185d4-1300-45aa-8999-8f4a5230ffd7","Type":"ContainerDied","Data":"6ea15becd4ea94dc00024369bc0024047425e5177df75889636afe1bbbacc71d"} Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.545728 5129 scope.go:117] "RemoveContainer" containerID="04cb67186b62d00e264e435df3d7845a66d510a1bb42d7ded1128bd4c907ef31" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.567976 5129 scope.go:117] "RemoveContainer" containerID="fe0df757fa945bc4d59452aaae14c503c1d0d91e3041396aec2156e819692907" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.570993 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.585948 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.594314 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.618438 5129 scope.go:117] "RemoveContainer" 
containerID="90cda46eceb0c66974ba9c94907cfd6311e71d8b12e44f6d2de8be97bb2eef26" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.624397 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.632106 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 09:01:35 crc kubenswrapper[5129]: E0314 09:01:35.632624 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="831185d4-1300-45aa-8999-8f4a5230ffd7" containerName="glance-httpd" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.632648 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="831185d4-1300-45aa-8999-8f4a5230ffd7" containerName="glance-httpd" Mar 14 09:01:35 crc kubenswrapper[5129]: E0314 09:01:35.632674 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="831185d4-1300-45aa-8999-8f4a5230ffd7" containerName="glance-log" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.632683 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="831185d4-1300-45aa-8999-8f4a5230ffd7" containerName="glance-log" Mar 14 09:01:35 crc kubenswrapper[5129]: E0314 09:01:35.632701 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268b884c-e047-4ce4-b6e3-122fdd2a76b3" containerName="glance-log" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.632709 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="268b884c-e047-4ce4-b6e3-122fdd2a76b3" containerName="glance-log" Mar 14 09:01:35 crc kubenswrapper[5129]: E0314 09:01:35.632720 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268b884c-e047-4ce4-b6e3-122fdd2a76b3" containerName="glance-httpd" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.632730 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="268b884c-e047-4ce4-b6e3-122fdd2a76b3" containerName="glance-httpd" Mar 14 09:01:35 crc 
kubenswrapper[5129]: I0314 09:01:35.632923 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="831185d4-1300-45aa-8999-8f4a5230ffd7" containerName="glance-log" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.632946 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="831185d4-1300-45aa-8999-8f4a5230ffd7" containerName="glance-httpd" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.632960 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="268b884c-e047-4ce4-b6e3-122fdd2a76b3" containerName="glance-log" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.632983 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="268b884c-e047-4ce4-b6e3-122fdd2a76b3" containerName="glance-httpd" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.633917 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.636000 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.636449 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-p4sfk" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.636639 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.638052 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.645806 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.668853 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 09:01:35 crc 
kubenswrapper[5129]: I0314 09:01:35.671537 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.686138 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.694465 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.721167 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-config-data\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.721227 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab9f7b05-3883-45e6-a278-9986e3047ccb-logs\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.721257 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-scripts\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.721328 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.721408 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.721463 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3243555c-242c-4b68-b367-4dc4e3237487-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.721547 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kxdn\" (UniqueName: \"kubernetes.io/projected/ab9f7b05-3883-45e6-a278-9986e3047ccb-kube-api-access-8kxdn\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.721592 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3243555c-242c-4b68-b367-4dc4e3237487-logs\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.721689 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.721716 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkm6l\" (UniqueName: \"kubernetes.io/projected/3243555c-242c-4b68-b367-4dc4e3237487-kube-api-access-pkm6l\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.721738 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.721759 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab9f7b05-3883-45e6-a278-9986e3047ccb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.721797 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.721818 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.722759 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.729027 5129 scope.go:117] "RemoveContainer" containerID="fe0df757fa945bc4d59452aaae14c503c1d0d91e3041396aec2156e819692907" Mar 14 09:01:35 crc kubenswrapper[5129]: E0314 09:01:35.739800 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe0df757fa945bc4d59452aaae14c503c1d0d91e3041396aec2156e819692907\": container with ID starting with fe0df757fa945bc4d59452aaae14c503c1d0d91e3041396aec2156e819692907 not found: ID does not exist" containerID="fe0df757fa945bc4d59452aaae14c503c1d0d91e3041396aec2156e819692907" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.739869 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe0df757fa945bc4d59452aaae14c503c1d0d91e3041396aec2156e819692907"} err="failed to get container status \"fe0df757fa945bc4d59452aaae14c503c1d0d91e3041396aec2156e819692907\": rpc error: code = NotFound desc = could not find container \"fe0df757fa945bc4d59452aaae14c503c1d0d91e3041396aec2156e819692907\": container with ID starting with fe0df757fa945bc4d59452aaae14c503c1d0d91e3041396aec2156e819692907 not found: ID does not exist" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.739969 5129 scope.go:117] "RemoveContainer" containerID="90cda46eceb0c66974ba9c94907cfd6311e71d8b12e44f6d2de8be97bb2eef26" Mar 14 09:01:35 crc kubenswrapper[5129]: E0314 09:01:35.745630 5129 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90cda46eceb0c66974ba9c94907cfd6311e71d8b12e44f6d2de8be97bb2eef26\": container with ID starting with 90cda46eceb0c66974ba9c94907cfd6311e71d8b12e44f6d2de8be97bb2eef26 not found: ID does not exist" containerID="90cda46eceb0c66974ba9c94907cfd6311e71d8b12e44f6d2de8be97bb2eef26" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.745680 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90cda46eceb0c66974ba9c94907cfd6311e71d8b12e44f6d2de8be97bb2eef26"} err="failed to get container status \"90cda46eceb0c66974ba9c94907cfd6311e71d8b12e44f6d2de8be97bb2eef26\": rpc error: code = NotFound desc = could not find container \"90cda46eceb0c66974ba9c94907cfd6311e71d8b12e44f6d2de8be97bb2eef26\": container with ID starting with 90cda46eceb0c66974ba9c94907cfd6311e71d8b12e44f6d2de8be97bb2eef26 not found: ID does not exist" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.745702 5129 scope.go:117] "RemoveContainer" containerID="fe0df757fa945bc4d59452aaae14c503c1d0d91e3041396aec2156e819692907" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.746418 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe0df757fa945bc4d59452aaae14c503c1d0d91e3041396aec2156e819692907"} err="failed to get container status \"fe0df757fa945bc4d59452aaae14c503c1d0d91e3041396aec2156e819692907\": rpc error: code = NotFound desc = could not find container \"fe0df757fa945bc4d59452aaae14c503c1d0d91e3041396aec2156e819692907\": container with ID starting with fe0df757fa945bc4d59452aaae14c503c1d0d91e3041396aec2156e819692907 not found: ID does not exist" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.746459 5129 scope.go:117] "RemoveContainer" containerID="90cda46eceb0c66974ba9c94907cfd6311e71d8b12e44f6d2de8be97bb2eef26" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.749033 5129 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90cda46eceb0c66974ba9c94907cfd6311e71d8b12e44f6d2de8be97bb2eef26"} err="failed to get container status \"90cda46eceb0c66974ba9c94907cfd6311e71d8b12e44f6d2de8be97bb2eef26\": rpc error: code = NotFound desc = could not find container \"90cda46eceb0c66974ba9c94907cfd6311e71d8b12e44f6d2de8be97bb2eef26\": container with ID starting with 90cda46eceb0c66974ba9c94907cfd6311e71d8b12e44f6d2de8be97bb2eef26 not found: ID does not exist" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.823525 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.823573 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkm6l\" (UniqueName: \"kubernetes.io/projected/3243555c-242c-4b68-b367-4dc4e3237487-kube-api-access-pkm6l\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.823596 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.823629 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab9f7b05-3883-45e6-a278-9986e3047ccb-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.823657 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.823673 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.823864 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-config-data\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.823890 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab9f7b05-3883-45e6-a278-9986e3047ccb-logs\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.824181 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab9f7b05-3883-45e6-a278-9986e3047ccb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " 
pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.823907 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-scripts\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.824413 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.824465 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.824511 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3243555c-242c-4b68-b367-4dc4e3237487-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.824566 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kxdn\" (UniqueName: \"kubernetes.io/projected/ab9f7b05-3883-45e6-a278-9986e3047ccb-kube-api-access-8kxdn\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc 
kubenswrapper[5129]: I0314 09:01:35.824596 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3243555c-242c-4b68-b367-4dc4e3237487-logs\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.825113 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3243555c-242c-4b68-b367-4dc4e3237487-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.824416 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab9f7b05-3883-45e6-a278-9986e3047ccb-logs\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.825459 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3243555c-242c-4b68-b367-4dc4e3237487-logs\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.828067 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.828806 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.828910 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.830807 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-config-data\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.830999 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.832026 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-scripts\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.832550 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.839442 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkm6l\" (UniqueName: \"kubernetes.io/projected/3243555c-242c-4b68-b367-4dc4e3237487-kube-api-access-pkm6l\") pod \"glance-default-internal-api-0\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.842032 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kxdn\" (UniqueName: \"kubernetes.io/projected/ab9f7b05-3883-45e6-a278-9986e3047ccb-kube-api-access-8kxdn\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:35 crc kubenswrapper[5129]: I0314 09:01:35.846116 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " pod="openstack/glance-default-external-api-0" Mar 14 09:01:36 crc kubenswrapper[5129]: I0314 09:01:36.015627 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 09:01:36 crc kubenswrapper[5129]: I0314 09:01:36.024098 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 09:01:36 crc kubenswrapper[5129]: I0314 09:01:36.068364 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="268b884c-e047-4ce4-b6e3-122fdd2a76b3" path="/var/lib/kubelet/pods/268b884c-e047-4ce4-b6e3-122fdd2a76b3/volumes" Mar 14 09:01:36 crc kubenswrapper[5129]: I0314 09:01:36.069259 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="831185d4-1300-45aa-8999-8f4a5230ffd7" path="/var/lib/kubelet/pods/831185d4-1300-45aa-8999-8f4a5230ffd7/volumes" Mar 14 09:01:36 crc kubenswrapper[5129]: I0314 09:01:36.558932 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 09:01:36 crc kubenswrapper[5129]: W0314 09:01:36.562992 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab9f7b05_3883_45e6_a278_9986e3047ccb.slice/crio-8422976023f65cdaaf1a80e6fbe8ba97248d55abc8f6eed9f7e00625971e57d7 WatchSource:0}: Error finding container 8422976023f65cdaaf1a80e6fbe8ba97248d55abc8f6eed9f7e00625971e57d7: Status 404 returned error can't find the container with id 8422976023f65cdaaf1a80e6fbe8ba97248d55abc8f6eed9f7e00625971e57d7 Mar 14 09:01:37 crc kubenswrapper[5129]: I0314 09:01:37.543858 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ab9f7b05-3883-45e6-a278-9986e3047ccb","Type":"ContainerStarted","Data":"2d5d763524ba3a9771e52ad844c84d1f978247e8f3b9ab2e88bfe3140a05d8be"} Mar 14 09:01:37 crc kubenswrapper[5129]: I0314 09:01:37.544204 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ab9f7b05-3883-45e6-a278-9986e3047ccb","Type":"ContainerStarted","Data":"50a4835c6be65acd958bcc98db8f78b72b28015bc8412e95c1959cc664dea41e"} Mar 14 09:01:37 crc kubenswrapper[5129]: I0314 09:01:37.544219 5129 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ab9f7b05-3883-45e6-a278-9986e3047ccb","Type":"ContainerStarted","Data":"8422976023f65cdaaf1a80e6fbe8ba97248d55abc8f6eed9f7e00625971e57d7"} Mar 14 09:01:37 crc kubenswrapper[5129]: I0314 09:01:37.570527 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.570508212 podStartE2EDuration="2.570508212s" podCreationTimestamp="2026-03-14 09:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:01:37.561069726 +0000 UTC m=+7360.312984920" watchObservedRunningTime="2026-03-14 09:01:37.570508212 +0000 UTC m=+7360.322423396" Mar 14 09:01:37 crc kubenswrapper[5129]: I0314 09:01:37.600756 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 09:01:37 crc kubenswrapper[5129]: W0314 09:01:37.604290 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3243555c_242c_4b68_b367_4dc4e3237487.slice/crio-d117f87b84a5ad305a2330bcd4499646e0cc051a5d25a4a0cb211c31f3996ccc WatchSource:0}: Error finding container d117f87b84a5ad305a2330bcd4499646e0cc051a5d25a4a0cb211c31f3996ccc: Status 404 returned error can't find the container with id d117f87b84a5ad305a2330bcd4499646e0cc051a5d25a4a0cb211c31f3996ccc Mar 14 09:01:38 crc kubenswrapper[5129]: I0314 09:01:38.553573 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3243555c-242c-4b68-b367-4dc4e3237487","Type":"ContainerStarted","Data":"0ed26f7b2dfedecfdbf3b3dade751cc41f7c1b770f6c1525d44d42c663df6e46"} Mar 14 09:01:38 crc kubenswrapper[5129]: I0314 09:01:38.554408 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"3243555c-242c-4b68-b367-4dc4e3237487","Type":"ContainerStarted","Data":"263da2fa39ed26cef4853bb7060d968dda8405bb3146ddb942c8698852b983cb"} Mar 14 09:01:38 crc kubenswrapper[5129]: I0314 09:01:38.554428 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3243555c-242c-4b68-b367-4dc4e3237487","Type":"ContainerStarted","Data":"d117f87b84a5ad305a2330bcd4499646e0cc051a5d25a4a0cb211c31f3996ccc"} Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.171571 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.192583 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.19252437 podStartE2EDuration="6.19252437s" podCreationTimestamp="2026-03-14 09:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:01:38.583636115 +0000 UTC m=+7361.335551299" watchObservedRunningTime="2026-03-14 09:01:41.19252437 +0000 UTC m=+7363.944439554" Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.243212 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84667d55b7-tbvxl"] Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.243485 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" podUID="5db4cd12-35cc-4168-9e0c-d966640d5a78" containerName="dnsmasq-dns" containerID="cri-o://ee8e8fde444c65972529b5a42bc21d7f38174e4edeb3492278cf8f3a62d4367a" gracePeriod=10 Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.585668 5129 generic.go:334] "Generic (PLEG): container finished" podID="5db4cd12-35cc-4168-9e0c-d966640d5a78" containerID="ee8e8fde444c65972529b5a42bc21d7f38174e4edeb3492278cf8f3a62d4367a" exitCode=0 
Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.585939 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" event={"ID":"5db4cd12-35cc-4168-9e0c-d966640d5a78","Type":"ContainerDied","Data":"ee8e8fde444c65972529b5a42bc21d7f38174e4edeb3492278cf8f3a62d4367a"} Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.690699 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.846674 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-664j6\" (UniqueName: \"kubernetes.io/projected/5db4cd12-35cc-4168-9e0c-d966640d5a78-kube-api-access-664j6\") pod \"5db4cd12-35cc-4168-9e0c-d966640d5a78\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.846728 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-config\") pod \"5db4cd12-35cc-4168-9e0c-d966640d5a78\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.846755 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-ovsdbserver-nb\") pod \"5db4cd12-35cc-4168-9e0c-d966640d5a78\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.846817 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-ovsdbserver-sb\") pod \"5db4cd12-35cc-4168-9e0c-d966640d5a78\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.846860 5129 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-dns-svc\") pod \"5db4cd12-35cc-4168-9e0c-d966640d5a78\" (UID: \"5db4cd12-35cc-4168-9e0c-d966640d5a78\") " Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.867077 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5db4cd12-35cc-4168-9e0c-d966640d5a78-kube-api-access-664j6" (OuterVolumeSpecName: "kube-api-access-664j6") pod "5db4cd12-35cc-4168-9e0c-d966640d5a78" (UID: "5db4cd12-35cc-4168-9e0c-d966640d5a78"). InnerVolumeSpecName "kube-api-access-664j6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.892305 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5db4cd12-35cc-4168-9e0c-d966640d5a78" (UID: "5db4cd12-35cc-4168-9e0c-d966640d5a78"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.896245 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5db4cd12-35cc-4168-9e0c-d966640d5a78" (UID: "5db4cd12-35cc-4168-9e0c-d966640d5a78"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.897548 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5db4cd12-35cc-4168-9e0c-d966640d5a78" (UID: "5db4cd12-35cc-4168-9e0c-d966640d5a78"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.905581 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-config" (OuterVolumeSpecName: "config") pod "5db4cd12-35cc-4168-9e0c-d966640d5a78" (UID: "5db4cd12-35cc-4168-9e0c-d966640d5a78"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.948449 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-664j6\" (UniqueName: \"kubernetes.io/projected/5db4cd12-35cc-4168-9e0c-d966640d5a78-kube-api-access-664j6\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.948497 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.949099 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.949365 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:41 crc kubenswrapper[5129]: I0314 09:01:41.949382 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5db4cd12-35cc-4168-9e0c-d966640d5a78-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:42 crc kubenswrapper[5129]: I0314 09:01:42.600011 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" 
event={"ID":"5db4cd12-35cc-4168-9e0c-d966640d5a78","Type":"ContainerDied","Data":"4778f023cf0417bdfa0fd621dbc511d7ab939dbcf2ec2f9676f9c5391718ccd8"} Mar 14 09:01:42 crc kubenswrapper[5129]: I0314 09:01:42.600063 5129 scope.go:117] "RemoveContainer" containerID="ee8e8fde444c65972529b5a42bc21d7f38174e4edeb3492278cf8f3a62d4367a" Mar 14 09:01:42 crc kubenswrapper[5129]: I0314 09:01:42.600140 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84667d55b7-tbvxl" Mar 14 09:01:42 crc kubenswrapper[5129]: I0314 09:01:42.635042 5129 scope.go:117] "RemoveContainer" containerID="31b4b18dc2e389e1ea0861d8d9c46699992d1c9bdd5b3207e3194f3a1b09a456" Mar 14 09:01:42 crc kubenswrapper[5129]: I0314 09:01:42.644312 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84667d55b7-tbvxl"] Mar 14 09:01:42 crc kubenswrapper[5129]: I0314 09:01:42.660093 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84667d55b7-tbvxl"] Mar 14 09:01:44 crc kubenswrapper[5129]: I0314 09:01:44.048698 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5db4cd12-35cc-4168-9e0c-d966640d5a78" path="/var/lib/kubelet/pods/5db4cd12-35cc-4168-9e0c-d966640d5a78/volumes" Mar 14 09:01:46 crc kubenswrapper[5129]: I0314 09:01:46.016235 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 14 09:01:46 crc kubenswrapper[5129]: I0314 09:01:46.016283 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 14 09:01:46 crc kubenswrapper[5129]: I0314 09:01:46.024788 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 14 09:01:46 crc kubenswrapper[5129]: I0314 09:01:46.024827 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0" Mar 14 09:01:46 crc kubenswrapper[5129]: I0314 09:01:46.058567 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 14 09:01:46 crc kubenswrapper[5129]: I0314 09:01:46.058793 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 14 09:01:46 crc kubenswrapper[5129]: I0314 09:01:46.090416 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 14 09:01:46 crc kubenswrapper[5129]: I0314 09:01:46.105946 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 14 09:01:46 crc kubenswrapper[5129]: I0314 09:01:46.643369 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 14 09:01:46 crc kubenswrapper[5129]: I0314 09:01:46.643657 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 14 09:01:46 crc kubenswrapper[5129]: I0314 09:01:46.643683 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 14 09:01:46 crc kubenswrapper[5129]: I0314 09:01:46.643693 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 14 09:01:48 crc kubenswrapper[5129]: I0314 09:01:48.575213 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 14 09:01:48 crc kubenswrapper[5129]: I0314 09:01:48.575765 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 14 09:01:48 crc kubenswrapper[5129]: I0314 09:01:48.576253 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Mar 14 09:01:48 crc kubenswrapper[5129]: I0314 09:01:48.612648 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 14 09:01:49 crc kubenswrapper[5129]: I0314 09:01:49.039479 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 09:01:49 crc kubenswrapper[5129]: E0314 09:01:49.040015 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:01:54 crc kubenswrapper[5129]: I0314 09:01:54.729956 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6cbzr"] Mar 14 09:01:54 crc kubenswrapper[5129]: E0314 09:01:54.731265 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5db4cd12-35cc-4168-9e0c-d966640d5a78" containerName="init" Mar 14 09:01:54 crc kubenswrapper[5129]: I0314 09:01:54.731280 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db4cd12-35cc-4168-9e0c-d966640d5a78" containerName="init" Mar 14 09:01:54 crc kubenswrapper[5129]: E0314 09:01:54.731291 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5db4cd12-35cc-4168-9e0c-d966640d5a78" containerName="dnsmasq-dns" Mar 14 09:01:54 crc kubenswrapper[5129]: I0314 09:01:54.731297 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db4cd12-35cc-4168-9e0c-d966640d5a78" containerName="dnsmasq-dns" Mar 14 09:01:54 crc kubenswrapper[5129]: I0314 09:01:54.731505 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="5db4cd12-35cc-4168-9e0c-d966640d5a78" 
containerName="dnsmasq-dns" Mar 14 09:01:54 crc kubenswrapper[5129]: I0314 09:01:54.732177 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6cbzr" Mar 14 09:01:54 crc kubenswrapper[5129]: I0314 09:01:54.748810 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6cbzr"] Mar 14 09:01:54 crc kubenswrapper[5129]: I0314 09:01:54.831675 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-0580-account-create-update-2gjwf"] Mar 14 09:01:54 crc kubenswrapper[5129]: I0314 09:01:54.832713 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0580-account-create-update-2gjwf" Mar 14 09:01:54 crc kubenswrapper[5129]: I0314 09:01:54.833766 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2xk9\" (UniqueName: \"kubernetes.io/projected/3c4d5345-6c9c-4eb4-872b-8674628d408c-kube-api-access-j2xk9\") pod \"placement-db-create-6cbzr\" (UID: \"3c4d5345-6c9c-4eb4-872b-8674628d408c\") " pod="openstack/placement-db-create-6cbzr" Mar 14 09:01:54 crc kubenswrapper[5129]: I0314 09:01:54.833830 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c4d5345-6c9c-4eb4-872b-8674628d408c-operator-scripts\") pod \"placement-db-create-6cbzr\" (UID: \"3c4d5345-6c9c-4eb4-872b-8674628d408c\") " pod="openstack/placement-db-create-6cbzr" Mar 14 09:01:54 crc kubenswrapper[5129]: I0314 09:01:54.834767 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 14 09:01:54 crc kubenswrapper[5129]: I0314 09:01:54.854827 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0580-account-create-update-2gjwf"] Mar 14 09:01:54 crc kubenswrapper[5129]: I0314 09:01:54.935918 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2xk9\" (UniqueName: \"kubernetes.io/projected/3c4d5345-6c9c-4eb4-872b-8674628d408c-kube-api-access-j2xk9\") pod \"placement-db-create-6cbzr\" (UID: \"3c4d5345-6c9c-4eb4-872b-8674628d408c\") " pod="openstack/placement-db-create-6cbzr" Mar 14 09:01:54 crc kubenswrapper[5129]: I0314 09:01:54.935999 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8-operator-scripts\") pod \"placement-0580-account-create-update-2gjwf\" (UID: \"4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8\") " pod="openstack/placement-0580-account-create-update-2gjwf" Mar 14 09:01:54 crc kubenswrapper[5129]: I0314 09:01:54.936047 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c4d5345-6c9c-4eb4-872b-8674628d408c-operator-scripts\") pod \"placement-db-create-6cbzr\" (UID: \"3c4d5345-6c9c-4eb4-872b-8674628d408c\") " pod="openstack/placement-db-create-6cbzr" Mar 14 09:01:54 crc kubenswrapper[5129]: I0314 09:01:54.936092 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8sq6\" (UniqueName: \"kubernetes.io/projected/4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8-kube-api-access-p8sq6\") pod \"placement-0580-account-create-update-2gjwf\" (UID: \"4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8\") " pod="openstack/placement-0580-account-create-update-2gjwf" Mar 14 09:01:54 crc kubenswrapper[5129]: I0314 09:01:54.936963 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c4d5345-6c9c-4eb4-872b-8674628d408c-operator-scripts\") pod \"placement-db-create-6cbzr\" (UID: \"3c4d5345-6c9c-4eb4-872b-8674628d408c\") " pod="openstack/placement-db-create-6cbzr" Mar 14 
09:01:54 crc kubenswrapper[5129]: I0314 09:01:54.962476 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2xk9\" (UniqueName: \"kubernetes.io/projected/3c4d5345-6c9c-4eb4-872b-8674628d408c-kube-api-access-j2xk9\") pod \"placement-db-create-6cbzr\" (UID: \"3c4d5345-6c9c-4eb4-872b-8674628d408c\") " pod="openstack/placement-db-create-6cbzr" Mar 14 09:01:55 crc kubenswrapper[5129]: I0314 09:01:55.037516 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8sq6\" (UniqueName: \"kubernetes.io/projected/4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8-kube-api-access-p8sq6\") pod \"placement-0580-account-create-update-2gjwf\" (UID: \"4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8\") " pod="openstack/placement-0580-account-create-update-2gjwf" Mar 14 09:01:55 crc kubenswrapper[5129]: I0314 09:01:55.037680 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8-operator-scripts\") pod \"placement-0580-account-create-update-2gjwf\" (UID: \"4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8\") " pod="openstack/placement-0580-account-create-update-2gjwf" Mar 14 09:01:55 crc kubenswrapper[5129]: I0314 09:01:55.038462 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8-operator-scripts\") pod \"placement-0580-account-create-update-2gjwf\" (UID: \"4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8\") " pod="openstack/placement-0580-account-create-update-2gjwf" Mar 14 09:01:55 crc kubenswrapper[5129]: I0314 09:01:55.053628 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8sq6\" (UniqueName: \"kubernetes.io/projected/4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8-kube-api-access-p8sq6\") pod \"placement-0580-account-create-update-2gjwf\" (UID: 
\"4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8\") " pod="openstack/placement-0580-account-create-update-2gjwf" Mar 14 09:01:55 crc kubenswrapper[5129]: I0314 09:01:55.062824 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6cbzr" Mar 14 09:01:55 crc kubenswrapper[5129]: I0314 09:01:55.156621 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0580-account-create-update-2gjwf" Mar 14 09:01:55 crc kubenswrapper[5129]: I0314 09:01:55.500668 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6cbzr"] Mar 14 09:01:55 crc kubenswrapper[5129]: W0314 09:01:55.501806 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c4d5345_6c9c_4eb4_872b_8674628d408c.slice/crio-4ca0b4fca9ade5aa9118195202e217dcd4da7822d93a2065624beb189eab7d06 WatchSource:0}: Error finding container 4ca0b4fca9ade5aa9118195202e217dcd4da7822d93a2065624beb189eab7d06: Status 404 returned error can't find the container with id 4ca0b4fca9ade5aa9118195202e217dcd4da7822d93a2065624beb189eab7d06 Mar 14 09:01:55 crc kubenswrapper[5129]: I0314 09:01:55.648535 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0580-account-create-update-2gjwf"] Mar 14 09:01:55 crc kubenswrapper[5129]: I0314 09:01:55.726821 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6cbzr" event={"ID":"3c4d5345-6c9c-4eb4-872b-8674628d408c","Type":"ContainerStarted","Data":"28730beda4fa722818cea34bf81d00d50fed21bce4d72acf14170a6574183945"} Mar 14 09:01:55 crc kubenswrapper[5129]: I0314 09:01:55.726865 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6cbzr" event={"ID":"3c4d5345-6c9c-4eb4-872b-8674628d408c","Type":"ContainerStarted","Data":"4ca0b4fca9ade5aa9118195202e217dcd4da7822d93a2065624beb189eab7d06"} Mar 14 
09:01:55 crc kubenswrapper[5129]: I0314 09:01:55.728897 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0580-account-create-update-2gjwf" event={"ID":"4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8","Type":"ContainerStarted","Data":"bc2cbe62b01d72814287cb0c8934360f39a983b1339328c88ac31e4752fa4494"} Mar 14 09:01:55 crc kubenswrapper[5129]: I0314 09:01:55.744424 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-6cbzr" podStartSLOduration=1.744403921 podStartE2EDuration="1.744403921s" podCreationTimestamp="2026-03-14 09:01:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:01:55.737837172 +0000 UTC m=+7378.489752356" watchObservedRunningTime="2026-03-14 09:01:55.744403921 +0000 UTC m=+7378.496319105" Mar 14 09:01:56 crc kubenswrapper[5129]: I0314 09:01:56.742124 5129 generic.go:334] "Generic (PLEG): container finished" podID="4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8" containerID="423a028bb5506319a7f6682b3693842a761b8815e832c155f3e31dbb92185903" exitCode=0 Mar 14 09:01:56 crc kubenswrapper[5129]: I0314 09:01:56.742527 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0580-account-create-update-2gjwf" event={"ID":"4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8","Type":"ContainerDied","Data":"423a028bb5506319a7f6682b3693842a761b8815e832c155f3e31dbb92185903"} Mar 14 09:01:56 crc kubenswrapper[5129]: I0314 09:01:56.748535 5129 generic.go:334] "Generic (PLEG): container finished" podID="3c4d5345-6c9c-4eb4-872b-8674628d408c" containerID="28730beda4fa722818cea34bf81d00d50fed21bce4d72acf14170a6574183945" exitCode=0 Mar 14 09:01:56 crc kubenswrapper[5129]: I0314 09:01:56.748638 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6cbzr" 
event={"ID":"3c4d5345-6c9c-4eb4-872b-8674628d408c","Type":"ContainerDied","Data":"28730beda4fa722818cea34bf81d00d50fed21bce4d72acf14170a6574183945"} Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.231397 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0580-account-create-update-2gjwf" Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.241568 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6cbzr" Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.400344 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2xk9\" (UniqueName: \"kubernetes.io/projected/3c4d5345-6c9c-4eb4-872b-8674628d408c-kube-api-access-j2xk9\") pod \"3c4d5345-6c9c-4eb4-872b-8674628d408c\" (UID: \"3c4d5345-6c9c-4eb4-872b-8674628d408c\") " Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.400421 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c4d5345-6c9c-4eb4-872b-8674628d408c-operator-scripts\") pod \"3c4d5345-6c9c-4eb4-872b-8674628d408c\" (UID: \"3c4d5345-6c9c-4eb4-872b-8674628d408c\") " Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.400474 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8sq6\" (UniqueName: \"kubernetes.io/projected/4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8-kube-api-access-p8sq6\") pod \"4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8\" (UID: \"4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8\") " Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.400696 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8-operator-scripts\") pod \"4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8\" (UID: \"4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8\") " Mar 14 
09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.401101 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c4d5345-6c9c-4eb4-872b-8674628d408c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c4d5345-6c9c-4eb4-872b-8674628d408c" (UID: "3c4d5345-6c9c-4eb4-872b-8674628d408c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.401714 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8" (UID: "4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.405821 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8-kube-api-access-p8sq6" (OuterVolumeSpecName: "kube-api-access-p8sq6") pod "4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8" (UID: "4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8"). InnerVolumeSpecName "kube-api-access-p8sq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.406488 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c4d5345-6c9c-4eb4-872b-8674628d408c-kube-api-access-j2xk9" (OuterVolumeSpecName: "kube-api-access-j2xk9") pod "3c4d5345-6c9c-4eb4-872b-8674628d408c" (UID: "3c4d5345-6c9c-4eb4-872b-8674628d408c"). InnerVolumeSpecName "kube-api-access-j2xk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.503463 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.503507 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2xk9\" (UniqueName: \"kubernetes.io/projected/3c4d5345-6c9c-4eb4-872b-8674628d408c-kube-api-access-j2xk9\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.503523 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c4d5345-6c9c-4eb4-872b-8674628d408c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.503535 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8sq6\" (UniqueName: \"kubernetes.io/projected/4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8-kube-api-access-p8sq6\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.782394 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0580-account-create-update-2gjwf" Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.782795 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0580-account-create-update-2gjwf" event={"ID":"4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8","Type":"ContainerDied","Data":"bc2cbe62b01d72814287cb0c8934360f39a983b1339328c88ac31e4752fa4494"} Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.783001 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc2cbe62b01d72814287cb0c8934360f39a983b1339328c88ac31e4752fa4494" Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.784583 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6cbzr" event={"ID":"3c4d5345-6c9c-4eb4-872b-8674628d408c","Type":"ContainerDied","Data":"4ca0b4fca9ade5aa9118195202e217dcd4da7822d93a2065624beb189eab7d06"} Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.784628 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ca0b4fca9ade5aa9118195202e217dcd4da7822d93a2065624beb189eab7d06" Mar 14 09:01:58 crc kubenswrapper[5129]: I0314 09:01:58.784681 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6cbzr" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.171401 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557982-p9lb8"] Mar 14 09:02:00 crc kubenswrapper[5129]: E0314 09:02:00.172250 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8" containerName="mariadb-account-create-update" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.172264 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8" containerName="mariadb-account-create-update" Mar 14 09:02:00 crc kubenswrapper[5129]: E0314 09:02:00.172299 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4d5345-6c9c-4eb4-872b-8674628d408c" containerName="mariadb-database-create" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.172306 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4d5345-6c9c-4eb4-872b-8674628d408c" containerName="mariadb-database-create" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.172509 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4d5345-6c9c-4eb4-872b-8674628d408c" containerName="mariadb-database-create" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.172546 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8" containerName="mariadb-account-create-update" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.173481 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557982-p9lb8" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.178292 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.178515 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.178784 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.181852 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557982-p9lb8"] Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.234047 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f86dd9c67-9xx9q"] Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.236755 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.299568 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f86dd9c67-9xx9q"] Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.324699 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-fdncd"] Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.326001 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.331853 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fdncd"] Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.332776 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.333393 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.334397 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-22j74" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.355307 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-config\") pod \"dnsmasq-dns-6f86dd9c67-9xx9q\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") " pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.355349 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-dns-svc\") pod \"dnsmasq-dns-6f86dd9c67-9xx9q\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") " pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.355368 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f55z2\" (UniqueName: \"kubernetes.io/projected/55864dec-a088-4577-a4ed-9588a43de404-kube-api-access-f55z2\") pod \"auto-csr-approver-29557982-p9lb8\" (UID: \"55864dec-a088-4577-a4ed-9588a43de404\") " pod="openshift-infra/auto-csr-approver-29557982-p9lb8" Mar 14 09:02:00 crc kubenswrapper[5129]: 
I0314 09:02:00.355415 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22rzp\" (UniqueName: \"kubernetes.io/projected/f969c069-87cb-4571-85ef-3d88d8f510c5-kube-api-access-22rzp\") pod \"dnsmasq-dns-6f86dd9c67-9xx9q\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") " pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.355458 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-ovsdbserver-nb\") pod \"dnsmasq-dns-6f86dd9c67-9xx9q\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") " pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.355489 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-ovsdbserver-sb\") pod \"dnsmasq-dns-6f86dd9c67-9xx9q\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") " pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.457841 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hprcq\" (UniqueName: \"kubernetes.io/projected/3f5125ed-86c0-489c-892f-03f76a8ecc42-kube-api-access-hprcq\") pod \"placement-db-sync-fdncd\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.457984 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-config\") pod \"dnsmasq-dns-6f86dd9c67-9xx9q\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") " pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:00 
crc kubenswrapper[5129]: I0314 09:02:00.458014 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-dns-svc\") pod \"dnsmasq-dns-6f86dd9c67-9xx9q\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") " pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.458062 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f55z2\" (UniqueName: \"kubernetes.io/projected/55864dec-a088-4577-a4ed-9588a43de404-kube-api-access-f55z2\") pod \"auto-csr-approver-29557982-p9lb8\" (UID: \"55864dec-a088-4577-a4ed-9588a43de404\") " pod="openshift-infra/auto-csr-approver-29557982-p9lb8" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.458963 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-dns-svc\") pod \"dnsmasq-dns-6f86dd9c67-9xx9q\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") " pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.459066 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22rzp\" (UniqueName: \"kubernetes.io/projected/f969c069-87cb-4571-85ef-3d88d8f510c5-kube-api-access-22rzp\") pod \"dnsmasq-dns-6f86dd9c67-9xx9q\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") " pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.459104 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f5125ed-86c0-489c-892f-03f76a8ecc42-scripts\") pod \"placement-db-sync-fdncd\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.459140 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-ovsdbserver-nb\") pod \"dnsmasq-dns-6f86dd9c67-9xx9q\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") " pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.459169 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f5125ed-86c0-489c-892f-03f76a8ecc42-combined-ca-bundle\") pod \"placement-db-sync-fdncd\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.459193 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-ovsdbserver-sb\") pod \"dnsmasq-dns-6f86dd9c67-9xx9q\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") " pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.459208 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-config\") pod \"dnsmasq-dns-6f86dd9c67-9xx9q\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") " pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.459215 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f5125ed-86c0-489c-892f-03f76a8ecc42-config-data\") pod \"placement-db-sync-fdncd\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.459439 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f5125ed-86c0-489c-892f-03f76a8ecc42-logs\") pod \"placement-db-sync-fdncd\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.460190 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-ovsdbserver-sb\") pod \"dnsmasq-dns-6f86dd9c67-9xx9q\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") " pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.460374 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-ovsdbserver-nb\") pod \"dnsmasq-dns-6f86dd9c67-9xx9q\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") " pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.488303 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f55z2\" (UniqueName: \"kubernetes.io/projected/55864dec-a088-4577-a4ed-9588a43de404-kube-api-access-f55z2\") pod \"auto-csr-approver-29557982-p9lb8\" (UID: \"55864dec-a088-4577-a4ed-9588a43de404\") " pod="openshift-infra/auto-csr-approver-29557982-p9lb8" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.493490 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22rzp\" (UniqueName: \"kubernetes.io/projected/f969c069-87cb-4571-85ef-3d88d8f510c5-kube-api-access-22rzp\") pod \"dnsmasq-dns-6f86dd9c67-9xx9q\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") " pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.502109 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557982-p9lb8" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.560772 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f5125ed-86c0-489c-892f-03f76a8ecc42-combined-ca-bundle\") pod \"placement-db-sync-fdncd\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.562150 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f5125ed-86c0-489c-892f-03f76a8ecc42-config-data\") pod \"placement-db-sync-fdncd\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.562227 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f5125ed-86c0-489c-892f-03f76a8ecc42-logs\") pod \"placement-db-sync-fdncd\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.562260 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hprcq\" (UniqueName: \"kubernetes.io/projected/3f5125ed-86c0-489c-892f-03f76a8ecc42-kube-api-access-hprcq\") pod \"placement-db-sync-fdncd\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.562359 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f5125ed-86c0-489c-892f-03f76a8ecc42-scripts\") pod \"placement-db-sync-fdncd\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 
09:02:00.562806 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f5125ed-86c0-489c-892f-03f76a8ecc42-logs\") pod \"placement-db-sync-fdncd\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.564715 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.566165 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f5125ed-86c0-489c-892f-03f76a8ecc42-scripts\") pod \"placement-db-sync-fdncd\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.566707 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f5125ed-86c0-489c-892f-03f76a8ecc42-config-data\") pod \"placement-db-sync-fdncd\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.570003 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f5125ed-86c0-489c-892f-03f76a8ecc42-combined-ca-bundle\") pod \"placement-db-sync-fdncd\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:00 crc kubenswrapper[5129]: I0314 09:02:00.583303 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hprcq\" (UniqueName: \"kubernetes.io/projected/3f5125ed-86c0-489c-892f-03f76a8ecc42-kube-api-access-hprcq\") pod \"placement-db-sync-fdncd\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:00 crc 
kubenswrapper[5129]: I0314 09:02:00.660127 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:01 crc kubenswrapper[5129]: I0314 09:02:01.015763 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557982-p9lb8"] Mar 14 09:02:01 crc kubenswrapper[5129]: W0314 09:02:01.016453 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55864dec_a088_4577_a4ed_9588a43de404.slice/crio-1378ab223ca0acebc3dce153914f5f20af6311b20dc93855bbe0775c77ef5d85 WatchSource:0}: Error finding container 1378ab223ca0acebc3dce153914f5f20af6311b20dc93855bbe0775c77ef5d85: Status 404 returned error can't find the container with id 1378ab223ca0acebc3dce153914f5f20af6311b20dc93855bbe0775c77ef5d85 Mar 14 09:02:01 crc kubenswrapper[5129]: I0314 09:02:01.147203 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f86dd9c67-9xx9q"] Mar 14 09:02:01 crc kubenswrapper[5129]: I0314 09:02:01.157225 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fdncd"] Mar 14 09:02:01 crc kubenswrapper[5129]: I0314 09:02:01.817272 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557982-p9lb8" event={"ID":"55864dec-a088-4577-a4ed-9588a43de404","Type":"ContainerStarted","Data":"1378ab223ca0acebc3dce153914f5f20af6311b20dc93855bbe0775c77ef5d85"} Mar 14 09:02:01 crc kubenswrapper[5129]: I0314 09:02:01.826077 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fdncd" event={"ID":"3f5125ed-86c0-489c-892f-03f76a8ecc42","Type":"ContainerStarted","Data":"39bf95e96a6d2a58e86de85e85dc1e7ef0459494c89b6faecb9ee478d722fdf5"} Mar 14 09:02:01 crc kubenswrapper[5129]: I0314 09:02:01.835135 5129 generic.go:334] "Generic (PLEG): container finished" podID="f969c069-87cb-4571-85ef-3d88d8f510c5" 
containerID="4727a4f876c81f431758412f374975c2b24c9697056e67a73dfbbaa41aca8019" exitCode=0 Mar 14 09:02:01 crc kubenswrapper[5129]: I0314 09:02:01.835242 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" event={"ID":"f969c069-87cb-4571-85ef-3d88d8f510c5","Type":"ContainerDied","Data":"4727a4f876c81f431758412f374975c2b24c9697056e67a73dfbbaa41aca8019"} Mar 14 09:02:01 crc kubenswrapper[5129]: I0314 09:02:01.835306 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" event={"ID":"f969c069-87cb-4571-85ef-3d88d8f510c5","Type":"ContainerStarted","Data":"81ff6188262e538ca2312ea442a7c693d59c4587620fc45e2fb5660599728a93"} Mar 14 09:02:02 crc kubenswrapper[5129]: I0314 09:02:02.846429 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" event={"ID":"f969c069-87cb-4571-85ef-3d88d8f510c5","Type":"ContainerStarted","Data":"0e0608d16261557b94aa308a8804718cfe2aa72a71bb674e79a5a9e20e88b097"} Mar 14 09:02:02 crc kubenswrapper[5129]: I0314 09:02:02.846999 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:02 crc kubenswrapper[5129]: I0314 09:02:02.850719 5129 generic.go:334] "Generic (PLEG): container finished" podID="55864dec-a088-4577-a4ed-9588a43de404" containerID="c75ba01acb4672caba99ec3711556674795f55f48bb665875d0d94d48f1f3c9e" exitCode=0 Mar 14 09:02:02 crc kubenswrapper[5129]: I0314 09:02:02.850768 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557982-p9lb8" event={"ID":"55864dec-a088-4577-a4ed-9588a43de404","Type":"ContainerDied","Data":"c75ba01acb4672caba99ec3711556674795f55f48bb665875d0d94d48f1f3c9e"} Mar 14 09:02:02 crc kubenswrapper[5129]: I0314 09:02:02.880578 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" podStartSLOduration=2.880554144 
podStartE2EDuration="2.880554144s" podCreationTimestamp="2026-03-14 09:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:02:02.869108143 +0000 UTC m=+7385.621023327" watchObservedRunningTime="2026-03-14 09:02:02.880554144 +0000 UTC m=+7385.632469338" Mar 14 09:02:03 crc kubenswrapper[5129]: I0314 09:02:03.037081 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 09:02:03 crc kubenswrapper[5129]: E0314 09:02:03.037321 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:02:04 crc kubenswrapper[5129]: I0314 09:02:04.819834 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557982-p9lb8" Mar 14 09:02:04 crc kubenswrapper[5129]: I0314 09:02:04.885523 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557982-p9lb8" event={"ID":"55864dec-a088-4577-a4ed-9588a43de404","Type":"ContainerDied","Data":"1378ab223ca0acebc3dce153914f5f20af6311b20dc93855bbe0775c77ef5d85"} Mar 14 09:02:04 crc kubenswrapper[5129]: I0314 09:02:04.885552 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557982-p9lb8" Mar 14 09:02:04 crc kubenswrapper[5129]: I0314 09:02:04.885564 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1378ab223ca0acebc3dce153914f5f20af6311b20dc93855bbe0775c77ef5d85" Mar 14 09:02:04 crc kubenswrapper[5129]: I0314 09:02:04.917401 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-fdncd" podStartSLOduration=1.402774931 podStartE2EDuration="4.917380957s" podCreationTimestamp="2026-03-14 09:02:00 +0000 UTC" firstStartedPulling="2026-03-14 09:02:01.174493076 +0000 UTC m=+7383.926408260" lastFinishedPulling="2026-03-14 09:02:04.689099112 +0000 UTC m=+7387.441014286" observedRunningTime="2026-03-14 09:02:04.90903144 +0000 UTC m=+7387.660946654" watchObservedRunningTime="2026-03-14 09:02:04.917380957 +0000 UTC m=+7387.669296151" Mar 14 09:02:04 crc kubenswrapper[5129]: I0314 09:02:04.960673 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f55z2\" (UniqueName: \"kubernetes.io/projected/55864dec-a088-4577-a4ed-9588a43de404-kube-api-access-f55z2\") pod \"55864dec-a088-4577-a4ed-9588a43de404\" (UID: \"55864dec-a088-4577-a4ed-9588a43de404\") " Mar 14 09:02:04 crc kubenswrapper[5129]: I0314 09:02:04.965017 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55864dec-a088-4577-a4ed-9588a43de404-kube-api-access-f55z2" (OuterVolumeSpecName: "kube-api-access-f55z2") pod "55864dec-a088-4577-a4ed-9588a43de404" (UID: "55864dec-a088-4577-a4ed-9588a43de404"). InnerVolumeSpecName "kube-api-access-f55z2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:02:05 crc kubenswrapper[5129]: I0314 09:02:05.064346 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f55z2\" (UniqueName: \"kubernetes.io/projected/55864dec-a088-4577-a4ed-9588a43de404-kube-api-access-f55z2\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:05 crc kubenswrapper[5129]: I0314 09:02:05.888731 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557976-rdx55"] Mar 14 09:02:05 crc kubenswrapper[5129]: I0314 09:02:05.897522 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557976-rdx55"] Mar 14 09:02:05 crc kubenswrapper[5129]: I0314 09:02:05.900417 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fdncd" event={"ID":"3f5125ed-86c0-489c-892f-03f76a8ecc42","Type":"ContainerStarted","Data":"bd48ded1d58df0ee5ea93d33c74a1fb6e912acc35b6a610adedfd18e7584ed0f"} Mar 14 09:02:06 crc kubenswrapper[5129]: I0314 09:02:06.050691 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eefd94af-b3c6-42db-9fdd-186870c8a943" path="/var/lib/kubelet/pods/eefd94af-b3c6-42db-9fdd-186870c8a943/volumes" Mar 14 09:02:06 crc kubenswrapper[5129]: I0314 09:02:06.909521 5129 generic.go:334] "Generic (PLEG): container finished" podID="3f5125ed-86c0-489c-892f-03f76a8ecc42" containerID="bd48ded1d58df0ee5ea93d33c74a1fb6e912acc35b6a610adedfd18e7584ed0f" exitCode=0 Mar 14 09:02:06 crc kubenswrapper[5129]: I0314 09:02:06.909765 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fdncd" event={"ID":"3f5125ed-86c0-489c-892f-03f76a8ecc42","Type":"ContainerDied","Data":"bd48ded1d58df0ee5ea93d33c74a1fb6e912acc35b6a610adedfd18e7584ed0f"} Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.256572 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.329374 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f5125ed-86c0-489c-892f-03f76a8ecc42-config-data\") pod \"3f5125ed-86c0-489c-892f-03f76a8ecc42\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.329450 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f5125ed-86c0-489c-892f-03f76a8ecc42-combined-ca-bundle\") pod \"3f5125ed-86c0-489c-892f-03f76a8ecc42\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.329548 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hprcq\" (UniqueName: \"kubernetes.io/projected/3f5125ed-86c0-489c-892f-03f76a8ecc42-kube-api-access-hprcq\") pod \"3f5125ed-86c0-489c-892f-03f76a8ecc42\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.329586 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f5125ed-86c0-489c-892f-03f76a8ecc42-scripts\") pod \"3f5125ed-86c0-489c-892f-03f76a8ecc42\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.329643 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f5125ed-86c0-489c-892f-03f76a8ecc42-logs\") pod \"3f5125ed-86c0-489c-892f-03f76a8ecc42\" (UID: \"3f5125ed-86c0-489c-892f-03f76a8ecc42\") " Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.330481 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3f5125ed-86c0-489c-892f-03f76a8ecc42-logs" (OuterVolumeSpecName: "logs") pod "3f5125ed-86c0-489c-892f-03f76a8ecc42" (UID: "3f5125ed-86c0-489c-892f-03f76a8ecc42"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.337263 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f5125ed-86c0-489c-892f-03f76a8ecc42-kube-api-access-hprcq" (OuterVolumeSpecName: "kube-api-access-hprcq") pod "3f5125ed-86c0-489c-892f-03f76a8ecc42" (UID: "3f5125ed-86c0-489c-892f-03f76a8ecc42"). InnerVolumeSpecName "kube-api-access-hprcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.338754 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f5125ed-86c0-489c-892f-03f76a8ecc42-scripts" (OuterVolumeSpecName: "scripts") pod "3f5125ed-86c0-489c-892f-03f76a8ecc42" (UID: "3f5125ed-86c0-489c-892f-03f76a8ecc42"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.353881 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f5125ed-86c0-489c-892f-03f76a8ecc42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f5125ed-86c0-489c-892f-03f76a8ecc42" (UID: "3f5125ed-86c0-489c-892f-03f76a8ecc42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.355568 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f5125ed-86c0-489c-892f-03f76a8ecc42-config-data" (OuterVolumeSpecName: "config-data") pod "3f5125ed-86c0-489c-892f-03f76a8ecc42" (UID: "3f5125ed-86c0-489c-892f-03f76a8ecc42"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.432385 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f5125ed-86c0-489c-892f-03f76a8ecc42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.432663 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hprcq\" (UniqueName: \"kubernetes.io/projected/3f5125ed-86c0-489c-892f-03f76a8ecc42-kube-api-access-hprcq\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.432727 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f5125ed-86c0-489c-892f-03f76a8ecc42-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.432791 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f5125ed-86c0-489c-892f-03f76a8ecc42-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.432848 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f5125ed-86c0-489c-892f-03f76a8ecc42-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.931871 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fdncd" event={"ID":"3f5125ed-86c0-489c-892f-03f76a8ecc42","Type":"ContainerDied","Data":"39bf95e96a6d2a58e86de85e85dc1e7ef0459494c89b6faecb9ee478d722fdf5"} Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.931949 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39bf95e96a6d2a58e86de85e85dc1e7ef0459494c89b6faecb9ee478d722fdf5" Mar 14 09:02:08 crc kubenswrapper[5129]: I0314 09:02:08.932053 5129 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-db-sync-fdncd" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.023690 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7b7d77b964-ktb5v"] Mar 14 09:02:09 crc kubenswrapper[5129]: E0314 09:02:09.024300 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f5125ed-86c0-489c-892f-03f76a8ecc42" containerName="placement-db-sync" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.024336 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f5125ed-86c0-489c-892f-03f76a8ecc42" containerName="placement-db-sync" Mar 14 09:02:09 crc kubenswrapper[5129]: E0314 09:02:09.024367 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55864dec-a088-4577-a4ed-9588a43de404" containerName="oc" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.024377 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="55864dec-a088-4577-a4ed-9588a43de404" containerName="oc" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.024676 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f5125ed-86c0-489c-892f-03f76a8ecc42" containerName="placement-db-sync" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.024710 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="55864dec-a088-4577-a4ed-9588a43de404" containerName="oc" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.026878 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.030955 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.031867 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-22j74" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.035925 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.036308 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.036742 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.043863 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7b7d77b964-ktb5v"] Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.150346 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9b25878-27e7-4295-8d72-0011b2031ba3-public-tls-certs\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.150482 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzhcq\" (UniqueName: \"kubernetes.io/projected/e9b25878-27e7-4295-8d72-0011b2031ba3-kube-api-access-zzhcq\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.150571 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9b25878-27e7-4295-8d72-0011b2031ba3-scripts\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.150627 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9b25878-27e7-4295-8d72-0011b2031ba3-logs\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.150699 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9b25878-27e7-4295-8d72-0011b2031ba3-internal-tls-certs\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.150817 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b25878-27e7-4295-8d72-0011b2031ba3-config-data\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.150883 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b25878-27e7-4295-8d72-0011b2031ba3-combined-ca-bundle\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.252744 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b25878-27e7-4295-8d72-0011b2031ba3-config-data\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.252818 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b25878-27e7-4295-8d72-0011b2031ba3-combined-ca-bundle\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.252858 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9b25878-27e7-4295-8d72-0011b2031ba3-public-tls-certs\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.252897 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzhcq\" (UniqueName: \"kubernetes.io/projected/e9b25878-27e7-4295-8d72-0011b2031ba3-kube-api-access-zzhcq\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.252932 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9b25878-27e7-4295-8d72-0011b2031ba3-scripts\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.252955 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e9b25878-27e7-4295-8d72-0011b2031ba3-logs\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.252998 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9b25878-27e7-4295-8d72-0011b2031ba3-internal-tls-certs\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.254323 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9b25878-27e7-4295-8d72-0011b2031ba3-logs\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.257790 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9b25878-27e7-4295-8d72-0011b2031ba3-internal-tls-certs\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.258450 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b25878-27e7-4295-8d72-0011b2031ba3-config-data\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.258799 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9b25878-27e7-4295-8d72-0011b2031ba3-scripts\") pod \"placement-7b7d77b964-ktb5v\" (UID: 
\"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.259178 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9b25878-27e7-4295-8d72-0011b2031ba3-public-tls-certs\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.259773 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b25878-27e7-4295-8d72-0011b2031ba3-combined-ca-bundle\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.274173 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzhcq\" (UniqueName: \"kubernetes.io/projected/e9b25878-27e7-4295-8d72-0011b2031ba3-kube-api-access-zzhcq\") pod \"placement-7b7d77b964-ktb5v\" (UID: \"e9b25878-27e7-4295-8d72-0011b2031ba3\") " pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.378239 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.873464 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7b7d77b964-ktb5v"] Mar 14 09:02:09 crc kubenswrapper[5129]: I0314 09:02:09.946656 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b7d77b964-ktb5v" event={"ID":"e9b25878-27e7-4295-8d72-0011b2031ba3","Type":"ContainerStarted","Data":"ee2ea6aaea00a11c2bccadd95c1ce89140583867f2999ea45ae84fef97a6c738"} Mar 14 09:02:10 crc kubenswrapper[5129]: I0314 09:02:10.567707 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" Mar 14 09:02:10 crc kubenswrapper[5129]: I0314 09:02:10.676975 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56c4b58fb7-5mf7r"] Mar 14 09:02:10 crc kubenswrapper[5129]: I0314 09:02:10.677224 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" podUID="c4715224-6c27-4dc1-9380-fe174dcc9521" containerName="dnsmasq-dns" containerID="cri-o://7ec56ff253cfd4101b689a17581b810258402e3c148ba662dc2cdaee3848685f" gracePeriod=10 Mar 14 09:02:10 crc kubenswrapper[5129]: I0314 09:02:10.961698 5129 generic.go:334] "Generic (PLEG): container finished" podID="c4715224-6c27-4dc1-9380-fe174dcc9521" containerID="7ec56ff253cfd4101b689a17581b810258402e3c148ba662dc2cdaee3848685f" exitCode=0 Mar 14 09:02:10 crc kubenswrapper[5129]: I0314 09:02:10.961822 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" event={"ID":"c4715224-6c27-4dc1-9380-fe174dcc9521","Type":"ContainerDied","Data":"7ec56ff253cfd4101b689a17581b810258402e3c148ba662dc2cdaee3848685f"} Mar 14 09:02:10 crc kubenswrapper[5129]: I0314 09:02:10.965728 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b7d77b964-ktb5v" 
event={"ID":"e9b25878-27e7-4295-8d72-0011b2031ba3","Type":"ContainerStarted","Data":"139527862af07a54d4784960034a5707fefe46629e011b1b88d9109a63ad8e3e"} Mar 14 09:02:10 crc kubenswrapper[5129]: I0314 09:02:10.965783 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b7d77b964-ktb5v" event={"ID":"e9b25878-27e7-4295-8d72-0011b2031ba3","Type":"ContainerStarted","Data":"e0ac67de00b6b787e92c6192da6d89eff09b94a7eabd95ef6f18a0297b7b723d"} Mar 14 09:02:10 crc kubenswrapper[5129]: I0314 09:02:10.965953 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.011386 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7b7d77b964-ktb5v" podStartSLOduration=3.011358518 podStartE2EDuration="3.011358518s" podCreationTimestamp="2026-03-14 09:02:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:02:11.006688022 +0000 UTC m=+7393.758603236" watchObservedRunningTime="2026-03-14 09:02:11.011358518 +0000 UTC m=+7393.763273702" Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.198766 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.314703 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-ovsdbserver-sb\") pod \"c4715224-6c27-4dc1-9380-fe174dcc9521\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.314858 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-ovsdbserver-nb\") pod \"c4715224-6c27-4dc1-9380-fe174dcc9521\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.314986 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzn5h\" (UniqueName: \"kubernetes.io/projected/c4715224-6c27-4dc1-9380-fe174dcc9521-kube-api-access-gzn5h\") pod \"c4715224-6c27-4dc1-9380-fe174dcc9521\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.315078 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-dns-svc\") pod \"c4715224-6c27-4dc1-9380-fe174dcc9521\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.315148 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-config\") pod \"c4715224-6c27-4dc1-9380-fe174dcc9521\" (UID: \"c4715224-6c27-4dc1-9380-fe174dcc9521\") " Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.324769 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c4715224-6c27-4dc1-9380-fe174dcc9521-kube-api-access-gzn5h" (OuterVolumeSpecName: "kube-api-access-gzn5h") pod "c4715224-6c27-4dc1-9380-fe174dcc9521" (UID: "c4715224-6c27-4dc1-9380-fe174dcc9521"). InnerVolumeSpecName "kube-api-access-gzn5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.369735 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c4715224-6c27-4dc1-9380-fe174dcc9521" (UID: "c4715224-6c27-4dc1-9380-fe174dcc9521"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.373136 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c4715224-6c27-4dc1-9380-fe174dcc9521" (UID: "c4715224-6c27-4dc1-9380-fe174dcc9521"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.377094 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-config" (OuterVolumeSpecName: "config") pod "c4715224-6c27-4dc1-9380-fe174dcc9521" (UID: "c4715224-6c27-4dc1-9380-fe174dcc9521"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.383312 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c4715224-6c27-4dc1-9380-fe174dcc9521" (UID: "c4715224-6c27-4dc1-9380-fe174dcc9521"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.418173 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.418942 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzn5h\" (UniqueName: \"kubernetes.io/projected/c4715224-6c27-4dc1-9380-fe174dcc9521-kube-api-access-gzn5h\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.418967 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.418980 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.418992 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4715224-6c27-4dc1-9380-fe174dcc9521-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.988034 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" event={"ID":"c4715224-6c27-4dc1-9380-fe174dcc9521","Type":"ContainerDied","Data":"891b230714423c951308bf0a7d46f706930bb8d895184572fba2660cb06c6cae"} Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.988105 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.988865 5129 scope.go:117] "RemoveContainer" containerID="7ec56ff253cfd4101b689a17581b810258402e3c148ba662dc2cdaee3848685f" Mar 14 09:02:11 crc kubenswrapper[5129]: I0314 09:02:11.988833 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:12 crc kubenswrapper[5129]: I0314 09:02:12.039171 5129 scope.go:117] "RemoveContainer" containerID="e213f634472cb1b1e8e23692c3610ecd0936e58bdefd080ee41e3d5dd1a53f44" Mar 14 09:02:12 crc kubenswrapper[5129]: I0314 09:02:12.051851 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56c4b58fb7-5mf7r"] Mar 14 09:02:12 crc kubenswrapper[5129]: I0314 09:02:12.058535 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56c4b58fb7-5mf7r"] Mar 14 09:02:14 crc kubenswrapper[5129]: I0314 09:02:14.050710 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4715224-6c27-4dc1-9380-fe174dcc9521" path="/var/lib/kubelet/pods/c4715224-6c27-4dc1-9380-fe174dcc9521/volumes" Mar 14 09:02:16 crc kubenswrapper[5129]: I0314 09:02:16.170929 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56c4b58fb7-5mf7r" podUID="c4715224-6c27-4dc1-9380-fe174dcc9521" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.114:5353: i/o timeout" Mar 14 09:02:17 crc kubenswrapper[5129]: I0314 09:02:17.037421 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 09:02:17 crc kubenswrapper[5129]: E0314 09:02:17.037818 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:02:28 crc kubenswrapper[5129]: I0314 09:02:28.571595 5129 scope.go:117] "RemoveContainer" containerID="f67b6b39b3736a02fcef7bb8a9cb3062492b089bba26db5fd87f34580a741cbe" Mar 14 09:02:29 crc kubenswrapper[5129]: I0314 09:02:29.037569 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 09:02:29 crc kubenswrapper[5129]: E0314 09:02:29.038441 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:02:40 crc kubenswrapper[5129]: I0314 09:02:40.412368 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:40 crc kubenswrapper[5129]: I0314 09:02:40.413436 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7b7d77b964-ktb5v" Mar 14 09:02:42 crc kubenswrapper[5129]: I0314 09:02:42.038347 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 09:02:42 crc kubenswrapper[5129]: E0314 09:02:42.039155 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:02:48 crc kubenswrapper[5129]: I0314 09:02:48.011731 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cghvs"] Mar 14 09:02:48 crc kubenswrapper[5129]: E0314 09:02:48.012712 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4715224-6c27-4dc1-9380-fe174dcc9521" containerName="dnsmasq-dns" Mar 14 09:02:48 crc kubenswrapper[5129]: I0314 09:02:48.012730 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4715224-6c27-4dc1-9380-fe174dcc9521" containerName="dnsmasq-dns" Mar 14 09:02:48 crc kubenswrapper[5129]: E0314 09:02:48.012790 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4715224-6c27-4dc1-9380-fe174dcc9521" containerName="init" Mar 14 09:02:48 crc kubenswrapper[5129]: I0314 09:02:48.012799 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4715224-6c27-4dc1-9380-fe174dcc9521" containerName="init" Mar 14 09:02:48 crc kubenswrapper[5129]: I0314 09:02:48.013024 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4715224-6c27-4dc1-9380-fe174dcc9521" containerName="dnsmasq-dns" Mar 14 09:02:48 crc kubenswrapper[5129]: I0314 09:02:48.022005 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cghvs" Mar 14 09:02:48 crc kubenswrapper[5129]: I0314 09:02:48.075272 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cghvs"] Mar 14 09:02:48 crc kubenswrapper[5129]: I0314 09:02:48.116386 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4631ea27-30b8-46d6-bd48-bc2e49cf60f0-utilities\") pod \"redhat-marketplace-cghvs\" (UID: \"4631ea27-30b8-46d6-bd48-bc2e49cf60f0\") " pod="openshift-marketplace/redhat-marketplace-cghvs" Mar 14 09:02:48 crc kubenswrapper[5129]: I0314 09:02:48.116441 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4631ea27-30b8-46d6-bd48-bc2e49cf60f0-catalog-content\") pod \"redhat-marketplace-cghvs\" (UID: \"4631ea27-30b8-46d6-bd48-bc2e49cf60f0\") " pod="openshift-marketplace/redhat-marketplace-cghvs" Mar 14 09:02:48 crc kubenswrapper[5129]: I0314 09:02:48.116471 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgkx6\" (UniqueName: \"kubernetes.io/projected/4631ea27-30b8-46d6-bd48-bc2e49cf60f0-kube-api-access-cgkx6\") pod \"redhat-marketplace-cghvs\" (UID: \"4631ea27-30b8-46d6-bd48-bc2e49cf60f0\") " pod="openshift-marketplace/redhat-marketplace-cghvs" Mar 14 09:02:48 crc kubenswrapper[5129]: I0314 09:02:48.218434 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4631ea27-30b8-46d6-bd48-bc2e49cf60f0-catalog-content\") pod \"redhat-marketplace-cghvs\" (UID: \"4631ea27-30b8-46d6-bd48-bc2e49cf60f0\") " pod="openshift-marketplace/redhat-marketplace-cghvs" Mar 14 09:02:48 crc kubenswrapper[5129]: I0314 09:02:48.218489 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cgkx6\" (UniqueName: \"kubernetes.io/projected/4631ea27-30b8-46d6-bd48-bc2e49cf60f0-kube-api-access-cgkx6\") pod \"redhat-marketplace-cghvs\" (UID: \"4631ea27-30b8-46d6-bd48-bc2e49cf60f0\") " pod="openshift-marketplace/redhat-marketplace-cghvs" Mar 14 09:02:48 crc kubenswrapper[5129]: I0314 09:02:48.218661 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4631ea27-30b8-46d6-bd48-bc2e49cf60f0-utilities\") pod \"redhat-marketplace-cghvs\" (UID: \"4631ea27-30b8-46d6-bd48-bc2e49cf60f0\") " pod="openshift-marketplace/redhat-marketplace-cghvs" Mar 14 09:02:48 crc kubenswrapper[5129]: I0314 09:02:48.219021 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4631ea27-30b8-46d6-bd48-bc2e49cf60f0-catalog-content\") pod \"redhat-marketplace-cghvs\" (UID: \"4631ea27-30b8-46d6-bd48-bc2e49cf60f0\") " pod="openshift-marketplace/redhat-marketplace-cghvs" Mar 14 09:02:48 crc kubenswrapper[5129]: I0314 09:02:48.219122 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4631ea27-30b8-46d6-bd48-bc2e49cf60f0-utilities\") pod \"redhat-marketplace-cghvs\" (UID: \"4631ea27-30b8-46d6-bd48-bc2e49cf60f0\") " pod="openshift-marketplace/redhat-marketplace-cghvs" Mar 14 09:02:48 crc kubenswrapper[5129]: I0314 09:02:48.246707 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgkx6\" (UniqueName: \"kubernetes.io/projected/4631ea27-30b8-46d6-bd48-bc2e49cf60f0-kube-api-access-cgkx6\") pod \"redhat-marketplace-cghvs\" (UID: \"4631ea27-30b8-46d6-bd48-bc2e49cf60f0\") " pod="openshift-marketplace/redhat-marketplace-cghvs" Mar 14 09:02:48 crc kubenswrapper[5129]: I0314 09:02:48.359857 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cghvs" Mar 14 09:02:48 crc kubenswrapper[5129]: I0314 09:02:48.829417 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cghvs"] Mar 14 09:02:49 crc kubenswrapper[5129]: I0314 09:02:49.379972 5129 generic.go:334] "Generic (PLEG): container finished" podID="4631ea27-30b8-46d6-bd48-bc2e49cf60f0" containerID="d91d929da0eed08e52a7cbb38d6198a6a445b354cfd5f72bedf9af4888ab7f91" exitCode=0 Mar 14 09:02:49 crc kubenswrapper[5129]: I0314 09:02:49.380029 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cghvs" event={"ID":"4631ea27-30b8-46d6-bd48-bc2e49cf60f0","Type":"ContainerDied","Data":"d91d929da0eed08e52a7cbb38d6198a6a445b354cfd5f72bedf9af4888ab7f91"} Mar 14 09:02:49 crc kubenswrapper[5129]: I0314 09:02:49.380355 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cghvs" event={"ID":"4631ea27-30b8-46d6-bd48-bc2e49cf60f0","Type":"ContainerStarted","Data":"9efe6e796bce6a6cb0f5fb58826b7fdf85efae4794529baa8e241f17581a56dd"} Mar 14 09:02:50 crc kubenswrapper[5129]: I0314 09:02:50.390172 5129 generic.go:334] "Generic (PLEG): container finished" podID="4631ea27-30b8-46d6-bd48-bc2e49cf60f0" containerID="c261f4ce2cfde9f7c327b2b6a2a720e1b0c6f7057d9693efa5e01f41488505b9" exitCode=0 Mar 14 09:02:50 crc kubenswrapper[5129]: I0314 09:02:50.390282 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cghvs" event={"ID":"4631ea27-30b8-46d6-bd48-bc2e49cf60f0","Type":"ContainerDied","Data":"c261f4ce2cfde9f7c327b2b6a2a720e1b0c6f7057d9693efa5e01f41488505b9"} Mar 14 09:02:51 crc kubenswrapper[5129]: I0314 09:02:51.400350 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cghvs" 
event={"ID":"4631ea27-30b8-46d6-bd48-bc2e49cf60f0","Type":"ContainerStarted","Data":"ce2b9273cd1cc1921ba6e69e57ad6f2d833bb06acc84ad4c5850dcbdeca00c01"} Mar 14 09:02:51 crc kubenswrapper[5129]: I0314 09:02:51.427571 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cghvs" podStartSLOduration=3.047178097 podStartE2EDuration="4.427552296s" podCreationTimestamp="2026-03-14 09:02:47 +0000 UTC" firstStartedPulling="2026-03-14 09:02:49.381890172 +0000 UTC m=+7432.133805376" lastFinishedPulling="2026-03-14 09:02:50.762264391 +0000 UTC m=+7433.514179575" observedRunningTime="2026-03-14 09:02:51.417819761 +0000 UTC m=+7434.169734945" watchObservedRunningTime="2026-03-14 09:02:51.427552296 +0000 UTC m=+7434.179467480" Mar 14 09:02:53 crc kubenswrapper[5129]: I0314 09:02:53.036019 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 09:02:53 crc kubenswrapper[5129]: I0314 09:02:53.423531 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"3d4cd803a025626b83183f6bb1c4666176a5fbed36f07324bbfa359be7b98a8d"} Mar 14 09:02:58 crc kubenswrapper[5129]: I0314 09:02:58.360914 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cghvs" Mar 14 09:02:58 crc kubenswrapper[5129]: I0314 09:02:58.361920 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cghvs" Mar 14 09:02:58 crc kubenswrapper[5129]: I0314 09:02:58.411632 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cghvs" Mar 14 09:02:58 crc kubenswrapper[5129]: I0314 09:02:58.521651 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-marketplace-cghvs" Mar 14 09:02:58 crc kubenswrapper[5129]: I0314 09:02:58.647111 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cghvs"] Mar 14 09:03:00 crc kubenswrapper[5129]: I0314 09:03:00.479380 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cghvs" podUID="4631ea27-30b8-46d6-bd48-bc2e49cf60f0" containerName="registry-server" containerID="cri-o://ce2b9273cd1cc1921ba6e69e57ad6f2d833bb06acc84ad4c5850dcbdeca00c01" gracePeriod=2 Mar 14 09:03:00 crc kubenswrapper[5129]: I0314 09:03:00.967772 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cghvs" Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.034218 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgkx6\" (UniqueName: \"kubernetes.io/projected/4631ea27-30b8-46d6-bd48-bc2e49cf60f0-kube-api-access-cgkx6\") pod \"4631ea27-30b8-46d6-bd48-bc2e49cf60f0\" (UID: \"4631ea27-30b8-46d6-bd48-bc2e49cf60f0\") " Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.034344 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4631ea27-30b8-46d6-bd48-bc2e49cf60f0-utilities\") pod \"4631ea27-30b8-46d6-bd48-bc2e49cf60f0\" (UID: \"4631ea27-30b8-46d6-bd48-bc2e49cf60f0\") " Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.034433 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4631ea27-30b8-46d6-bd48-bc2e49cf60f0-catalog-content\") pod \"4631ea27-30b8-46d6-bd48-bc2e49cf60f0\" (UID: \"4631ea27-30b8-46d6-bd48-bc2e49cf60f0\") " Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.036247 5129 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/4631ea27-30b8-46d6-bd48-bc2e49cf60f0-utilities" (OuterVolumeSpecName: "utilities") pod "4631ea27-30b8-46d6-bd48-bc2e49cf60f0" (UID: "4631ea27-30b8-46d6-bd48-bc2e49cf60f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.055565 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4631ea27-30b8-46d6-bd48-bc2e49cf60f0-kube-api-access-cgkx6" (OuterVolumeSpecName: "kube-api-access-cgkx6") pod "4631ea27-30b8-46d6-bd48-bc2e49cf60f0" (UID: "4631ea27-30b8-46d6-bd48-bc2e49cf60f0"). InnerVolumeSpecName "kube-api-access-cgkx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.084928 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4631ea27-30b8-46d6-bd48-bc2e49cf60f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4631ea27-30b8-46d6-bd48-bc2e49cf60f0" (UID: "4631ea27-30b8-46d6-bd48-bc2e49cf60f0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.137123 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgkx6\" (UniqueName: \"kubernetes.io/projected/4631ea27-30b8-46d6-bd48-bc2e49cf60f0-kube-api-access-cgkx6\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.137168 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4631ea27-30b8-46d6-bd48-bc2e49cf60f0-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.137188 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4631ea27-30b8-46d6-bd48-bc2e49cf60f0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.497906 5129 generic.go:334] "Generic (PLEG): container finished" podID="4631ea27-30b8-46d6-bd48-bc2e49cf60f0" containerID="ce2b9273cd1cc1921ba6e69e57ad6f2d833bb06acc84ad4c5850dcbdeca00c01" exitCode=0 Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.497920 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cghvs" Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.497971 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cghvs" event={"ID":"4631ea27-30b8-46d6-bd48-bc2e49cf60f0","Type":"ContainerDied","Data":"ce2b9273cd1cc1921ba6e69e57ad6f2d833bb06acc84ad4c5850dcbdeca00c01"} Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.498462 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cghvs" event={"ID":"4631ea27-30b8-46d6-bd48-bc2e49cf60f0","Type":"ContainerDied","Data":"9efe6e796bce6a6cb0f5fb58826b7fdf85efae4794529baa8e241f17581a56dd"} Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.498494 5129 scope.go:117] "RemoveContainer" containerID="ce2b9273cd1cc1921ba6e69e57ad6f2d833bb06acc84ad4c5850dcbdeca00c01" Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.539181 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cghvs"] Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.540109 5129 scope.go:117] "RemoveContainer" containerID="c261f4ce2cfde9f7c327b2b6a2a720e1b0c6f7057d9693efa5e01f41488505b9" Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.555345 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cghvs"] Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.563028 5129 scope.go:117] "RemoveContainer" containerID="d91d929da0eed08e52a7cbb38d6198a6a445b354cfd5f72bedf9af4888ab7f91" Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.621228 5129 scope.go:117] "RemoveContainer" containerID="ce2b9273cd1cc1921ba6e69e57ad6f2d833bb06acc84ad4c5850dcbdeca00c01" Mar 14 09:03:01 crc kubenswrapper[5129]: E0314 09:03:01.621911 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ce2b9273cd1cc1921ba6e69e57ad6f2d833bb06acc84ad4c5850dcbdeca00c01\": container with ID starting with ce2b9273cd1cc1921ba6e69e57ad6f2d833bb06acc84ad4c5850dcbdeca00c01 not found: ID does not exist" containerID="ce2b9273cd1cc1921ba6e69e57ad6f2d833bb06acc84ad4c5850dcbdeca00c01" Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.621967 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce2b9273cd1cc1921ba6e69e57ad6f2d833bb06acc84ad4c5850dcbdeca00c01"} err="failed to get container status \"ce2b9273cd1cc1921ba6e69e57ad6f2d833bb06acc84ad4c5850dcbdeca00c01\": rpc error: code = NotFound desc = could not find container \"ce2b9273cd1cc1921ba6e69e57ad6f2d833bb06acc84ad4c5850dcbdeca00c01\": container with ID starting with ce2b9273cd1cc1921ba6e69e57ad6f2d833bb06acc84ad4c5850dcbdeca00c01 not found: ID does not exist" Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.622001 5129 scope.go:117] "RemoveContainer" containerID="c261f4ce2cfde9f7c327b2b6a2a720e1b0c6f7057d9693efa5e01f41488505b9" Mar 14 09:03:01 crc kubenswrapper[5129]: E0314 09:03:01.622430 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c261f4ce2cfde9f7c327b2b6a2a720e1b0c6f7057d9693efa5e01f41488505b9\": container with ID starting with c261f4ce2cfde9f7c327b2b6a2a720e1b0c6f7057d9693efa5e01f41488505b9 not found: ID does not exist" containerID="c261f4ce2cfde9f7c327b2b6a2a720e1b0c6f7057d9693efa5e01f41488505b9" Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.622458 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c261f4ce2cfde9f7c327b2b6a2a720e1b0c6f7057d9693efa5e01f41488505b9"} err="failed to get container status \"c261f4ce2cfde9f7c327b2b6a2a720e1b0c6f7057d9693efa5e01f41488505b9\": rpc error: code = NotFound desc = could not find container \"c261f4ce2cfde9f7c327b2b6a2a720e1b0c6f7057d9693efa5e01f41488505b9\": container with ID 
starting with c261f4ce2cfde9f7c327b2b6a2a720e1b0c6f7057d9693efa5e01f41488505b9 not found: ID does not exist" Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.622476 5129 scope.go:117] "RemoveContainer" containerID="d91d929da0eed08e52a7cbb38d6198a6a445b354cfd5f72bedf9af4888ab7f91" Mar 14 09:03:01 crc kubenswrapper[5129]: E0314 09:03:01.624100 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d91d929da0eed08e52a7cbb38d6198a6a445b354cfd5f72bedf9af4888ab7f91\": container with ID starting with d91d929da0eed08e52a7cbb38d6198a6a445b354cfd5f72bedf9af4888ab7f91 not found: ID does not exist" containerID="d91d929da0eed08e52a7cbb38d6198a6a445b354cfd5f72bedf9af4888ab7f91" Mar 14 09:03:01 crc kubenswrapper[5129]: I0314 09:03:01.624135 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d91d929da0eed08e52a7cbb38d6198a6a445b354cfd5f72bedf9af4888ab7f91"} err="failed to get container status \"d91d929da0eed08e52a7cbb38d6198a6a445b354cfd5f72bedf9af4888ab7f91\": rpc error: code = NotFound desc = could not find container \"d91d929da0eed08e52a7cbb38d6198a6a445b354cfd5f72bedf9af4888ab7f91\": container with ID starting with d91d929da0eed08e52a7cbb38d6198a6a445b354cfd5f72bedf9af4888ab7f91 not found: ID does not exist" Mar 14 09:03:02 crc kubenswrapper[5129]: I0314 09:03:02.050113 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4631ea27-30b8-46d6-bd48-bc2e49cf60f0" path="/var/lib/kubelet/pods/4631ea27-30b8-46d6-bd48-bc2e49cf60f0/volumes" Mar 14 09:03:04 crc kubenswrapper[5129]: I0314 09:03:04.810975 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-cgdm4"] Mar 14 09:03:04 crc kubenswrapper[5129]: E0314 09:03:04.811647 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4631ea27-30b8-46d6-bd48-bc2e49cf60f0" containerName="extract-content" Mar 14 09:03:04 crc kubenswrapper[5129]: 
I0314 09:03:04.811660 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4631ea27-30b8-46d6-bd48-bc2e49cf60f0" containerName="extract-content" Mar 14 09:03:04 crc kubenswrapper[5129]: E0314 09:03:04.811677 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4631ea27-30b8-46d6-bd48-bc2e49cf60f0" containerName="extract-utilities" Mar 14 09:03:04 crc kubenswrapper[5129]: I0314 09:03:04.811683 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4631ea27-30b8-46d6-bd48-bc2e49cf60f0" containerName="extract-utilities" Mar 14 09:03:04 crc kubenswrapper[5129]: E0314 09:03:04.811701 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4631ea27-30b8-46d6-bd48-bc2e49cf60f0" containerName="registry-server" Mar 14 09:03:04 crc kubenswrapper[5129]: I0314 09:03:04.811708 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4631ea27-30b8-46d6-bd48-bc2e49cf60f0" containerName="registry-server" Mar 14 09:03:04 crc kubenswrapper[5129]: I0314 09:03:04.811882 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="4631ea27-30b8-46d6-bd48-bc2e49cf60f0" containerName="registry-server" Mar 14 09:03:04 crc kubenswrapper[5129]: I0314 09:03:04.812462 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-cgdm4" Mar 14 09:03:04 crc kubenswrapper[5129]: I0314 09:03:04.834370 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-cgdm4"] Mar 14 09:03:04 crc kubenswrapper[5129]: I0314 09:03:04.906224 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-tls68"] Mar 14 09:03:04 crc kubenswrapper[5129]: I0314 09:03:04.906637 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fswpp\" (UniqueName: \"kubernetes.io/projected/298699ac-5f93-42c8-aecb-d98ef33e5d0c-kube-api-access-fswpp\") pod \"nova-api-db-create-cgdm4\" (UID: \"298699ac-5f93-42c8-aecb-d98ef33e5d0c\") " pod="openstack/nova-api-db-create-cgdm4" Mar 14 09:03:04 crc kubenswrapper[5129]: I0314 09:03:04.906878 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/298699ac-5f93-42c8-aecb-d98ef33e5d0c-operator-scripts\") pod \"nova-api-db-create-cgdm4\" (UID: \"298699ac-5f93-42c8-aecb-d98ef33e5d0c\") " pod="openstack/nova-api-db-create-cgdm4" Mar 14 09:03:04 crc kubenswrapper[5129]: I0314 09:03:04.907433 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tls68" Mar 14 09:03:04 crc kubenswrapper[5129]: I0314 09:03:04.916655 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tls68"] Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.008718 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fswpp\" (UniqueName: \"kubernetes.io/projected/298699ac-5f93-42c8-aecb-d98ef33e5d0c-kube-api-access-fswpp\") pod \"nova-api-db-create-cgdm4\" (UID: \"298699ac-5f93-42c8-aecb-d98ef33e5d0c\") " pod="openstack/nova-api-db-create-cgdm4" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.008780 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b843f393-c69e-418c-9fde-9c694dba8294-operator-scripts\") pod \"nova-cell0-db-create-tls68\" (UID: \"b843f393-c69e-418c-9fde-9c694dba8294\") " pod="openstack/nova-cell0-db-create-tls68" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.008812 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btpk4\" (UniqueName: \"kubernetes.io/projected/b843f393-c69e-418c-9fde-9c694dba8294-kube-api-access-btpk4\") pod \"nova-cell0-db-create-tls68\" (UID: \"b843f393-c69e-418c-9fde-9c694dba8294\") " pod="openstack/nova-cell0-db-create-tls68" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.008848 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/298699ac-5f93-42c8-aecb-d98ef33e5d0c-operator-scripts\") pod \"nova-api-db-create-cgdm4\" (UID: \"298699ac-5f93-42c8-aecb-d98ef33e5d0c\") " pod="openstack/nova-api-db-create-cgdm4" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.009793 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/298699ac-5f93-42c8-aecb-d98ef33e5d0c-operator-scripts\") pod \"nova-api-db-create-cgdm4\" (UID: \"298699ac-5f93-42c8-aecb-d98ef33e5d0c\") " pod="openstack/nova-api-db-create-cgdm4" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.013787 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wq6v2"] Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.015468 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wq6v2" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.024421 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-d443-account-create-update-25726"] Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.025902 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d443-account-create-update-25726" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.030434 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.042388 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fswpp\" (UniqueName: \"kubernetes.io/projected/298699ac-5f93-42c8-aecb-d98ef33e5d0c-kube-api-access-fswpp\") pod \"nova-api-db-create-cgdm4\" (UID: \"298699ac-5f93-42c8-aecb-d98ef33e5d0c\") " pod="openstack/nova-api-db-create-cgdm4" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.051366 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d443-account-create-update-25726"] Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.075836 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wq6v2"] Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.111419 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c95390a-bb53-4827-8c22-07ebbd28ab75-operator-scripts\") pod \"nova-cell1-db-create-wq6v2\" (UID: \"1c95390a-bb53-4827-8c22-07ebbd28ab75\") " pod="openstack/nova-cell1-db-create-wq6v2" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.111487 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c114c0ae-fd24-4e0e-86d2-0586efa897cb-operator-scripts\") pod \"nova-api-d443-account-create-update-25726\" (UID: \"c114c0ae-fd24-4e0e-86d2-0586efa897cb\") " pod="openstack/nova-api-d443-account-create-update-25726" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.111540 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlnv8\" (UniqueName: \"kubernetes.io/projected/1c95390a-bb53-4827-8c22-07ebbd28ab75-kube-api-access-hlnv8\") pod \"nova-cell1-db-create-wq6v2\" (UID: \"1c95390a-bb53-4827-8c22-07ebbd28ab75\") " pod="openstack/nova-cell1-db-create-wq6v2" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.111587 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmbkl\" (UniqueName: \"kubernetes.io/projected/c114c0ae-fd24-4e0e-86d2-0586efa897cb-kube-api-access-mmbkl\") pod \"nova-api-d443-account-create-update-25726\" (UID: \"c114c0ae-fd24-4e0e-86d2-0586efa897cb\") " pod="openstack/nova-api-d443-account-create-update-25726" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.111759 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b843f393-c69e-418c-9fde-9c694dba8294-operator-scripts\") pod \"nova-cell0-db-create-tls68\" (UID: \"b843f393-c69e-418c-9fde-9c694dba8294\") " pod="openstack/nova-cell0-db-create-tls68" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 
09:03:05.112474 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b843f393-c69e-418c-9fde-9c694dba8294-operator-scripts\") pod \"nova-cell0-db-create-tls68\" (UID: \"b843f393-c69e-418c-9fde-9c694dba8294\") " pod="openstack/nova-cell0-db-create-tls68" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.112567 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btpk4\" (UniqueName: \"kubernetes.io/projected/b843f393-c69e-418c-9fde-9c694dba8294-kube-api-access-btpk4\") pod \"nova-cell0-db-create-tls68\" (UID: \"b843f393-c69e-418c-9fde-9c694dba8294\") " pod="openstack/nova-cell0-db-create-tls68" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.131415 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-cgdm4" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.133067 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btpk4\" (UniqueName: \"kubernetes.io/projected/b843f393-c69e-418c-9fde-9c694dba8294-kube-api-access-btpk4\") pod \"nova-cell0-db-create-tls68\" (UID: \"b843f393-c69e-418c-9fde-9c694dba8294\") " pod="openstack/nova-cell0-db-create-tls68" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.212520 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-2020-account-create-update-lxdjn"] Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.214165 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2020-account-create-update-lxdjn" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.214095 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c114c0ae-fd24-4e0e-86d2-0586efa897cb-operator-scripts\") pod \"nova-api-d443-account-create-update-25726\" (UID: \"c114c0ae-fd24-4e0e-86d2-0586efa897cb\") " pod="openstack/nova-api-d443-account-create-update-25726" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.214328 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlnv8\" (UniqueName: \"kubernetes.io/projected/1c95390a-bb53-4827-8c22-07ebbd28ab75-kube-api-access-hlnv8\") pod \"nova-cell1-db-create-wq6v2\" (UID: \"1c95390a-bb53-4827-8c22-07ebbd28ab75\") " pod="openstack/nova-cell1-db-create-wq6v2" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.214407 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmbkl\" (UniqueName: \"kubernetes.io/projected/c114c0ae-fd24-4e0e-86d2-0586efa897cb-kube-api-access-mmbkl\") pod \"nova-api-d443-account-create-update-25726\" (UID: \"c114c0ae-fd24-4e0e-86d2-0586efa897cb\") " pod="openstack/nova-api-d443-account-create-update-25726" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.214726 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c114c0ae-fd24-4e0e-86d2-0586efa897cb-operator-scripts\") pod \"nova-api-d443-account-create-update-25726\" (UID: \"c114c0ae-fd24-4e0e-86d2-0586efa897cb\") " pod="openstack/nova-api-d443-account-create-update-25726" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.214763 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1c95390a-bb53-4827-8c22-07ebbd28ab75-operator-scripts\") pod \"nova-cell1-db-create-wq6v2\" (UID: \"1c95390a-bb53-4827-8c22-07ebbd28ab75\") " pod="openstack/nova-cell1-db-create-wq6v2" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.215495 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c95390a-bb53-4827-8c22-07ebbd28ab75-operator-scripts\") pod \"nova-cell1-db-create-wq6v2\" (UID: \"1c95390a-bb53-4827-8c22-07ebbd28ab75\") " pod="openstack/nova-cell1-db-create-wq6v2" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.222362 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.232889 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmbkl\" (UniqueName: \"kubernetes.io/projected/c114c0ae-fd24-4e0e-86d2-0586efa897cb-kube-api-access-mmbkl\") pod \"nova-api-d443-account-create-update-25726\" (UID: \"c114c0ae-fd24-4e0e-86d2-0586efa897cb\") " pod="openstack/nova-api-d443-account-create-update-25726" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.232917 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2020-account-create-update-lxdjn"] Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.238292 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlnv8\" (UniqueName: \"kubernetes.io/projected/1c95390a-bb53-4827-8c22-07ebbd28ab75-kube-api-access-hlnv8\") pod \"nova-cell1-db-create-wq6v2\" (UID: \"1c95390a-bb53-4827-8c22-07ebbd28ab75\") " pod="openstack/nova-cell1-db-create-wq6v2" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.241955 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tls68" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.316861 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeeeb0dd-39ba-4a10-a223-6c0d1079c766-operator-scripts\") pod \"nova-cell0-2020-account-create-update-lxdjn\" (UID: \"eeeeb0dd-39ba-4a10-a223-6c0d1079c766\") " pod="openstack/nova-cell0-2020-account-create-update-lxdjn" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.316947 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt745\" (UniqueName: \"kubernetes.io/projected/eeeeb0dd-39ba-4a10-a223-6c0d1079c766-kube-api-access-xt745\") pod \"nova-cell0-2020-account-create-update-lxdjn\" (UID: \"eeeeb0dd-39ba-4a10-a223-6c0d1079c766\") " pod="openstack/nova-cell0-2020-account-create-update-lxdjn" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.381400 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wq6v2" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.411991 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d443-account-create-update-25726" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.418989 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeeeb0dd-39ba-4a10-a223-6c0d1079c766-operator-scripts\") pod \"nova-cell0-2020-account-create-update-lxdjn\" (UID: \"eeeeb0dd-39ba-4a10-a223-6c0d1079c766\") " pod="openstack/nova-cell0-2020-account-create-update-lxdjn" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.419068 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt745\" (UniqueName: \"kubernetes.io/projected/eeeeb0dd-39ba-4a10-a223-6c0d1079c766-kube-api-access-xt745\") pod \"nova-cell0-2020-account-create-update-lxdjn\" (UID: \"eeeeb0dd-39ba-4a10-a223-6c0d1079c766\") " pod="openstack/nova-cell0-2020-account-create-update-lxdjn" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.420250 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeeeb0dd-39ba-4a10-a223-6c0d1079c766-operator-scripts\") pod \"nova-cell0-2020-account-create-update-lxdjn\" (UID: \"eeeeb0dd-39ba-4a10-a223-6c0d1079c766\") " pod="openstack/nova-cell0-2020-account-create-update-lxdjn" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.426106 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-adc2-account-create-update-pcn6r"] Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.427454 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-adc2-account-create-update-pcn6r" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.429760 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.439412 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt745\" (UniqueName: \"kubernetes.io/projected/eeeeb0dd-39ba-4a10-a223-6c0d1079c766-kube-api-access-xt745\") pod \"nova-cell0-2020-account-create-update-lxdjn\" (UID: \"eeeeb0dd-39ba-4a10-a223-6c0d1079c766\") " pod="openstack/nova-cell0-2020-account-create-update-lxdjn" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.449672 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-adc2-account-create-update-pcn6r"] Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.521282 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45gvg\" (UniqueName: \"kubernetes.io/projected/aab6a3ac-9b76-4c9d-8ec6-a132f4d19697-kube-api-access-45gvg\") pod \"nova-cell1-adc2-account-create-update-pcn6r\" (UID: \"aab6a3ac-9b76-4c9d-8ec6-a132f4d19697\") " pod="openstack/nova-cell1-adc2-account-create-update-pcn6r" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.521947 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aab6a3ac-9b76-4c9d-8ec6-a132f4d19697-operator-scripts\") pod \"nova-cell1-adc2-account-create-update-pcn6r\" (UID: \"aab6a3ac-9b76-4c9d-8ec6-a132f4d19697\") " pod="openstack/nova-cell1-adc2-account-create-update-pcn6r" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.624961 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45gvg\" (UniqueName: 
\"kubernetes.io/projected/aab6a3ac-9b76-4c9d-8ec6-a132f4d19697-kube-api-access-45gvg\") pod \"nova-cell1-adc2-account-create-update-pcn6r\" (UID: \"aab6a3ac-9b76-4c9d-8ec6-a132f4d19697\") " pod="openstack/nova-cell1-adc2-account-create-update-pcn6r" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.625086 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aab6a3ac-9b76-4c9d-8ec6-a132f4d19697-operator-scripts\") pod \"nova-cell1-adc2-account-create-update-pcn6r\" (UID: \"aab6a3ac-9b76-4c9d-8ec6-a132f4d19697\") " pod="openstack/nova-cell1-adc2-account-create-update-pcn6r" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.626224 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aab6a3ac-9b76-4c9d-8ec6-a132f4d19697-operator-scripts\") pod \"nova-cell1-adc2-account-create-update-pcn6r\" (UID: \"aab6a3ac-9b76-4c9d-8ec6-a132f4d19697\") " pod="openstack/nova-cell1-adc2-account-create-update-pcn6r" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.628739 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2020-account-create-update-lxdjn" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.646393 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45gvg\" (UniqueName: \"kubernetes.io/projected/aab6a3ac-9b76-4c9d-8ec6-a132f4d19697-kube-api-access-45gvg\") pod \"nova-cell1-adc2-account-create-update-pcn6r\" (UID: \"aab6a3ac-9b76-4c9d-8ec6-a132f4d19697\") " pod="openstack/nova-cell1-adc2-account-create-update-pcn6r" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.735779 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-cgdm4"] Mar 14 09:03:05 crc kubenswrapper[5129]: W0314 09:03:05.741507 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod298699ac_5f93_42c8_aecb_d98ef33e5d0c.slice/crio-f547c42f5cdaed0a1851883be7b2f3d2ddc06f26d23b0e5db26f43fc122cb0dd WatchSource:0}: Error finding container f547c42f5cdaed0a1851883be7b2f3d2ddc06f26d23b0e5db26f43fc122cb0dd: Status 404 returned error can't find the container with id f547c42f5cdaed0a1851883be7b2f3d2ddc06f26d23b0e5db26f43fc122cb0dd Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.757933 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-adc2-account-create-update-pcn6r" Mar 14 09:03:05 crc kubenswrapper[5129]: I0314 09:03:05.857103 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tls68"] Mar 14 09:03:05 crc kubenswrapper[5129]: W0314 09:03:05.869270 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb843f393_c69e_418c_9fde_9c694dba8294.slice/crio-b1536dbaf2f5ed4695dd76eb11f2d368763770aff798c85c911ac58485f1c7de WatchSource:0}: Error finding container b1536dbaf2f5ed4695dd76eb11f2d368763770aff798c85c911ac58485f1c7de: Status 404 returned error can't find the container with id b1536dbaf2f5ed4695dd76eb11f2d368763770aff798c85c911ac58485f1c7de Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.022740 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wq6v2"] Mar 14 09:03:06 crc kubenswrapper[5129]: W0314 09:03:06.023371 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c95390a_bb53_4827_8c22_07ebbd28ab75.slice/crio-a2d4b73351ef801c462a59d66ef406fe8b39e240f13faa64982b1418b5925178 WatchSource:0}: Error finding container a2d4b73351ef801c462a59d66ef406fe8b39e240f13faa64982b1418b5925178: Status 404 returned error can't find the container with id a2d4b73351ef801c462a59d66ef406fe8b39e240f13faa64982b1418b5925178 Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.090521 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d443-account-create-update-25726"] Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.195017 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2020-account-create-update-lxdjn"] Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.325519 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-adc2-account-create-update-pcn6r"] Mar 14 09:03:06 crc kubenswrapper[5129]: W0314 09:03:06.446120 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaab6a3ac_9b76_4c9d_8ec6_a132f4d19697.slice/crio-1f6dc858879a50522eea1b6c06026573be007feddd5d0a02c83bb2c63b6af2b5 WatchSource:0}: Error finding container 1f6dc858879a50522eea1b6c06026573be007feddd5d0a02c83bb2c63b6af2b5: Status 404 returned error can't find the container with id 1f6dc858879a50522eea1b6c06026573be007feddd5d0a02c83bb2c63b6af2b5 Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.598667 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-adc2-account-create-update-pcn6r" event={"ID":"aab6a3ac-9b76-4c9d-8ec6-a132f4d19697","Type":"ContainerStarted","Data":"1f6dc858879a50522eea1b6c06026573be007feddd5d0a02c83bb2c63b6af2b5"} Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.610578 5129 generic.go:334] "Generic (PLEG): container finished" podID="298699ac-5f93-42c8-aecb-d98ef33e5d0c" containerID="bc9ce5c0f454c94ae8dbd2d23c4efa43351db43bed5bbcc474aa72f1d9bce281" exitCode=0 Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.610741 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cgdm4" event={"ID":"298699ac-5f93-42c8-aecb-d98ef33e5d0c","Type":"ContainerDied","Data":"bc9ce5c0f454c94ae8dbd2d23c4efa43351db43bed5bbcc474aa72f1d9bce281"} Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.610782 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cgdm4" event={"ID":"298699ac-5f93-42c8-aecb-d98ef33e5d0c","Type":"ContainerStarted","Data":"f547c42f5cdaed0a1851883be7b2f3d2ddc06f26d23b0e5db26f43fc122cb0dd"} Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.613337 5129 generic.go:334] "Generic (PLEG): container finished" podID="b843f393-c69e-418c-9fde-9c694dba8294" 
containerID="e3ae1c978c21c7c3c941a7bd0b3c16150c402fc62f311c3e5c8b036c013135b2" exitCode=0 Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.613385 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tls68" event={"ID":"b843f393-c69e-418c-9fde-9c694dba8294","Type":"ContainerDied","Data":"e3ae1c978c21c7c3c941a7bd0b3c16150c402fc62f311c3e5c8b036c013135b2"} Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.613403 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tls68" event={"ID":"b843f393-c69e-418c-9fde-9c694dba8294","Type":"ContainerStarted","Data":"b1536dbaf2f5ed4695dd76eb11f2d368763770aff798c85c911ac58485f1c7de"} Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.616549 5129 generic.go:334] "Generic (PLEG): container finished" podID="c114c0ae-fd24-4e0e-86d2-0586efa897cb" containerID="c8804c5e26971b8475808ad1b883f48f0d56e512ebade61e2da21953ffa44c28" exitCode=0 Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.616591 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d443-account-create-update-25726" event={"ID":"c114c0ae-fd24-4e0e-86d2-0586efa897cb","Type":"ContainerDied","Data":"c8804c5e26971b8475808ad1b883f48f0d56e512ebade61e2da21953ffa44c28"} Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.616623 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d443-account-create-update-25726" event={"ID":"c114c0ae-fd24-4e0e-86d2-0586efa897cb","Type":"ContainerStarted","Data":"4930e5545e678103ff59cfe8200a0f1f42ce1358cefc8a189bc8e4369330bd95"} Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.619918 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2020-account-create-update-lxdjn" event={"ID":"eeeeb0dd-39ba-4a10-a223-6c0d1079c766","Type":"ContainerStarted","Data":"50deee67b4488f6a60d6560ebf294e6cbfb76833bb2c4e7a2626445b081cde39"} Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 
09:03:06.619949 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2020-account-create-update-lxdjn" event={"ID":"eeeeb0dd-39ba-4a10-a223-6c0d1079c766","Type":"ContainerStarted","Data":"d484387487da26635353e1963a4aa4eaf6b4e023148c3efd54c3405149826d64"} Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.624099 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wq6v2" event={"ID":"1c95390a-bb53-4827-8c22-07ebbd28ab75","Type":"ContainerStarted","Data":"6b84d60870179db615b3da66187291eb69748aa57bf32c6c0c0ecb423866f1b6"} Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.624139 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wq6v2" event={"ID":"1c95390a-bb53-4827-8c22-07ebbd28ab75","Type":"ContainerStarted","Data":"a2d4b73351ef801c462a59d66ef406fe8b39e240f13faa64982b1418b5925178"} Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.694342 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-wq6v2" podStartSLOduration=2.694308816 podStartE2EDuration="2.694308816s" podCreationTimestamp="2026-03-14 09:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:03:06.664219511 +0000 UTC m=+7449.416134695" watchObservedRunningTime="2026-03-14 09:03:06.694308816 +0000 UTC m=+7449.446224000" Mar 14 09:03:06 crc kubenswrapper[5129]: I0314 09:03:06.708512 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-2020-account-create-update-lxdjn" podStartSLOduration=1.708484871 podStartE2EDuration="1.708484871s" podCreationTimestamp="2026-03-14 09:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:03:06.681712255 +0000 UTC m=+7449.433627449" 
watchObservedRunningTime="2026-03-14 09:03:06.708484871 +0000 UTC m=+7449.460400055" Mar 14 09:03:07 crc kubenswrapper[5129]: I0314 09:03:07.633482 5129 generic.go:334] "Generic (PLEG): container finished" podID="eeeeb0dd-39ba-4a10-a223-6c0d1079c766" containerID="50deee67b4488f6a60d6560ebf294e6cbfb76833bb2c4e7a2626445b081cde39" exitCode=0 Mar 14 09:03:07 crc kubenswrapper[5129]: I0314 09:03:07.633533 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2020-account-create-update-lxdjn" event={"ID":"eeeeb0dd-39ba-4a10-a223-6c0d1079c766","Type":"ContainerDied","Data":"50deee67b4488f6a60d6560ebf294e6cbfb76833bb2c4e7a2626445b081cde39"} Mar 14 09:03:07 crc kubenswrapper[5129]: I0314 09:03:07.636563 5129 generic.go:334] "Generic (PLEG): container finished" podID="1c95390a-bb53-4827-8c22-07ebbd28ab75" containerID="6b84d60870179db615b3da66187291eb69748aa57bf32c6c0c0ecb423866f1b6" exitCode=0 Mar 14 09:03:07 crc kubenswrapper[5129]: I0314 09:03:07.636634 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wq6v2" event={"ID":"1c95390a-bb53-4827-8c22-07ebbd28ab75","Type":"ContainerDied","Data":"6b84d60870179db615b3da66187291eb69748aa57bf32c6c0c0ecb423866f1b6"} Mar 14 09:03:07 crc kubenswrapper[5129]: I0314 09:03:07.639036 5129 generic.go:334] "Generic (PLEG): container finished" podID="aab6a3ac-9b76-4c9d-8ec6-a132f4d19697" containerID="e7eea9eaa7f6b76f08bded4a614e18255db83f1f1acb2f4a436c98a06f0c917f" exitCode=0 Mar 14 09:03:07 crc kubenswrapper[5129]: I0314 09:03:07.639082 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-adc2-account-create-update-pcn6r" event={"ID":"aab6a3ac-9b76-4c9d-8ec6-a132f4d19697","Type":"ContainerDied","Data":"e7eea9eaa7f6b76f08bded4a614e18255db83f1f1acb2f4a436c98a06f0c917f"} Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.115014 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d443-account-create-update-25726" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.122469 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-cgdm4" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.127757 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tls68" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.292504 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmbkl\" (UniqueName: \"kubernetes.io/projected/c114c0ae-fd24-4e0e-86d2-0586efa897cb-kube-api-access-mmbkl\") pod \"c114c0ae-fd24-4e0e-86d2-0586efa897cb\" (UID: \"c114c0ae-fd24-4e0e-86d2-0586efa897cb\") " Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.292712 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/298699ac-5f93-42c8-aecb-d98ef33e5d0c-operator-scripts\") pod \"298699ac-5f93-42c8-aecb-d98ef33e5d0c\" (UID: \"298699ac-5f93-42c8-aecb-d98ef33e5d0c\") " Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.292741 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b843f393-c69e-418c-9fde-9c694dba8294-operator-scripts\") pod \"b843f393-c69e-418c-9fde-9c694dba8294\" (UID: \"b843f393-c69e-418c-9fde-9c694dba8294\") " Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.292778 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c114c0ae-fd24-4e0e-86d2-0586efa897cb-operator-scripts\") pod \"c114c0ae-fd24-4e0e-86d2-0586efa897cb\" (UID: \"c114c0ae-fd24-4e0e-86d2-0586efa897cb\") " Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.292880 5129 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-fswpp\" (UniqueName: \"kubernetes.io/projected/298699ac-5f93-42c8-aecb-d98ef33e5d0c-kube-api-access-fswpp\") pod \"298699ac-5f93-42c8-aecb-d98ef33e5d0c\" (UID: \"298699ac-5f93-42c8-aecb-d98ef33e5d0c\") " Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.293107 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btpk4\" (UniqueName: \"kubernetes.io/projected/b843f393-c69e-418c-9fde-9c694dba8294-kube-api-access-btpk4\") pod \"b843f393-c69e-418c-9fde-9c694dba8294\" (UID: \"b843f393-c69e-418c-9fde-9c694dba8294\") " Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.293847 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b843f393-c69e-418c-9fde-9c694dba8294-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b843f393-c69e-418c-9fde-9c694dba8294" (UID: "b843f393-c69e-418c-9fde-9c694dba8294"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.294286 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/298699ac-5f93-42c8-aecb-d98ef33e5d0c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "298699ac-5f93-42c8-aecb-d98ef33e5d0c" (UID: "298699ac-5f93-42c8-aecb-d98ef33e5d0c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.294575 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c114c0ae-fd24-4e0e-86d2-0586efa897cb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c114c0ae-fd24-4e0e-86d2-0586efa897cb" (UID: "c114c0ae-fd24-4e0e-86d2-0586efa897cb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.298686 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c114c0ae-fd24-4e0e-86d2-0586efa897cb-kube-api-access-mmbkl" (OuterVolumeSpecName: "kube-api-access-mmbkl") pod "c114c0ae-fd24-4e0e-86d2-0586efa897cb" (UID: "c114c0ae-fd24-4e0e-86d2-0586efa897cb"). InnerVolumeSpecName "kube-api-access-mmbkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.298954 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b843f393-c69e-418c-9fde-9c694dba8294-kube-api-access-btpk4" (OuterVolumeSpecName: "kube-api-access-btpk4") pod "b843f393-c69e-418c-9fde-9c694dba8294" (UID: "b843f393-c69e-418c-9fde-9c694dba8294"). InnerVolumeSpecName "kube-api-access-btpk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.299519 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/298699ac-5f93-42c8-aecb-d98ef33e5d0c-kube-api-access-fswpp" (OuterVolumeSpecName: "kube-api-access-fswpp") pod "298699ac-5f93-42c8-aecb-d98ef33e5d0c" (UID: "298699ac-5f93-42c8-aecb-d98ef33e5d0c"). InnerVolumeSpecName "kube-api-access-fswpp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.396508 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btpk4\" (UniqueName: \"kubernetes.io/projected/b843f393-c69e-418c-9fde-9c694dba8294-kube-api-access-btpk4\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.396573 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmbkl\" (UniqueName: \"kubernetes.io/projected/c114c0ae-fd24-4e0e-86d2-0586efa897cb-kube-api-access-mmbkl\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.396587 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/298699ac-5f93-42c8-aecb-d98ef33e5d0c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.396624 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b843f393-c69e-418c-9fde-9c694dba8294-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.396646 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c114c0ae-fd24-4e0e-86d2-0586efa897cb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.396659 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fswpp\" (UniqueName: \"kubernetes.io/projected/298699ac-5f93-42c8-aecb-d98ef33e5d0c-kube-api-access-fswpp\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.649339 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-cgdm4" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.649338 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cgdm4" event={"ID":"298699ac-5f93-42c8-aecb-d98ef33e5d0c","Type":"ContainerDied","Data":"f547c42f5cdaed0a1851883be7b2f3d2ddc06f26d23b0e5db26f43fc122cb0dd"} Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.649498 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f547c42f5cdaed0a1851883be7b2f3d2ddc06f26d23b0e5db26f43fc122cb0dd" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.651117 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tls68" event={"ID":"b843f393-c69e-418c-9fde-9c694dba8294","Type":"ContainerDied","Data":"b1536dbaf2f5ed4695dd76eb11f2d368763770aff798c85c911ac58485f1c7de"} Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.651149 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tls68" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.651352 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1536dbaf2f5ed4695dd76eb11f2d368763770aff798c85c911ac58485f1c7de" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.652831 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d443-account-create-update-25726" event={"ID":"c114c0ae-fd24-4e0e-86d2-0586efa897cb","Type":"ContainerDied","Data":"4930e5545e678103ff59cfe8200a0f1f42ce1358cefc8a189bc8e4369330bd95"} Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.652864 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4930e5545e678103ff59cfe8200a0f1f42ce1358cefc8a189bc8e4369330bd95" Mar 14 09:03:08 crc kubenswrapper[5129]: I0314 09:03:08.652941 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d443-account-create-update-25726" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.019324 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2020-account-create-update-lxdjn" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.026137 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-adc2-account-create-update-pcn6r" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.062820 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeeeb0dd-39ba-4a10-a223-6c0d1079c766-operator-scripts\") pod \"eeeeb0dd-39ba-4a10-a223-6c0d1079c766\" (UID: \"eeeeb0dd-39ba-4a10-a223-6c0d1079c766\") " Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.065102 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeeeb0dd-39ba-4a10-a223-6c0d1079c766-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eeeeb0dd-39ba-4a10-a223-6c0d1079c766" (UID: "eeeeb0dd-39ba-4a10-a223-6c0d1079c766"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.065675 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45gvg\" (UniqueName: \"kubernetes.io/projected/aab6a3ac-9b76-4c9d-8ec6-a132f4d19697-kube-api-access-45gvg\") pod \"aab6a3ac-9b76-4c9d-8ec6-a132f4d19697\" (UID: \"aab6a3ac-9b76-4c9d-8ec6-a132f4d19697\") " Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.066411 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt745\" (UniqueName: \"kubernetes.io/projected/eeeeb0dd-39ba-4a10-a223-6c0d1079c766-kube-api-access-xt745\") pod \"eeeeb0dd-39ba-4a10-a223-6c0d1079c766\" (UID: \"eeeeb0dd-39ba-4a10-a223-6c0d1079c766\") " Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.068310 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aab6a3ac-9b76-4c9d-8ec6-a132f4d19697-operator-scripts\") pod \"aab6a3ac-9b76-4c9d-8ec6-a132f4d19697\" (UID: \"aab6a3ac-9b76-4c9d-8ec6-a132f4d19697\") " Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.071590 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aab6a3ac-9b76-4c9d-8ec6-a132f4d19697-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aab6a3ac-9b76-4c9d-8ec6-a132f4d19697" (UID: "aab6a3ac-9b76-4c9d-8ec6-a132f4d19697"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.075651 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeeeb0dd-39ba-4a10-a223-6c0d1079c766-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.075826 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aab6a3ac-9b76-4c9d-8ec6-a132f4d19697-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.086161 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeeeb0dd-39ba-4a10-a223-6c0d1079c766-kube-api-access-xt745" (OuterVolumeSpecName: "kube-api-access-xt745") pod "eeeeb0dd-39ba-4a10-a223-6c0d1079c766" (UID: "eeeeb0dd-39ba-4a10-a223-6c0d1079c766"). InnerVolumeSpecName "kube-api-access-xt745". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.115860 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aab6a3ac-9b76-4c9d-8ec6-a132f4d19697-kube-api-access-45gvg" (OuterVolumeSpecName: "kube-api-access-45gvg") pod "aab6a3ac-9b76-4c9d-8ec6-a132f4d19697" (UID: "aab6a3ac-9b76-4c9d-8ec6-a132f4d19697"). InnerVolumeSpecName "kube-api-access-45gvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.171799 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wq6v2" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.177737 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45gvg\" (UniqueName: \"kubernetes.io/projected/aab6a3ac-9b76-4c9d-8ec6-a132f4d19697-kube-api-access-45gvg\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.177783 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt745\" (UniqueName: \"kubernetes.io/projected/eeeeb0dd-39ba-4a10-a223-6c0d1079c766-kube-api-access-xt745\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.279951 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlnv8\" (UniqueName: \"kubernetes.io/projected/1c95390a-bb53-4827-8c22-07ebbd28ab75-kube-api-access-hlnv8\") pod \"1c95390a-bb53-4827-8c22-07ebbd28ab75\" (UID: \"1c95390a-bb53-4827-8c22-07ebbd28ab75\") " Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.281223 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c95390a-bb53-4827-8c22-07ebbd28ab75-operator-scripts\") pod \"1c95390a-bb53-4827-8c22-07ebbd28ab75\" (UID: \"1c95390a-bb53-4827-8c22-07ebbd28ab75\") " Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.281717 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c95390a-bb53-4827-8c22-07ebbd28ab75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c95390a-bb53-4827-8c22-07ebbd28ab75" (UID: "1c95390a-bb53-4827-8c22-07ebbd28ab75"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.282347 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c95390a-bb53-4827-8c22-07ebbd28ab75-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.285056 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c95390a-bb53-4827-8c22-07ebbd28ab75-kube-api-access-hlnv8" (OuterVolumeSpecName: "kube-api-access-hlnv8") pod "1c95390a-bb53-4827-8c22-07ebbd28ab75" (UID: "1c95390a-bb53-4827-8c22-07ebbd28ab75"). InnerVolumeSpecName "kube-api-access-hlnv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.384081 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlnv8\" (UniqueName: \"kubernetes.io/projected/1c95390a-bb53-4827-8c22-07ebbd28ab75-kube-api-access-hlnv8\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.665492 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2020-account-create-update-lxdjn" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.665476 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2020-account-create-update-lxdjn" event={"ID":"eeeeb0dd-39ba-4a10-a223-6c0d1079c766","Type":"ContainerDied","Data":"d484387487da26635353e1963a4aa4eaf6b4e023148c3efd54c3405149826d64"} Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.666257 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d484387487da26635353e1963a4aa4eaf6b4e023148c3efd54c3405149826d64" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.667088 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wq6v2" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.667074 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wq6v2" event={"ID":"1c95390a-bb53-4827-8c22-07ebbd28ab75","Type":"ContainerDied","Data":"a2d4b73351ef801c462a59d66ef406fe8b39e240f13faa64982b1418b5925178"} Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.667240 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2d4b73351ef801c462a59d66ef406fe8b39e240f13faa64982b1418b5925178" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.670088 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-adc2-account-create-update-pcn6r" event={"ID":"aab6a3ac-9b76-4c9d-8ec6-a132f4d19697","Type":"ContainerDied","Data":"1f6dc858879a50522eea1b6c06026573be007feddd5d0a02c83bb2c63b6af2b5"} Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.670198 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-adc2-account-create-update-pcn6r" Mar 14 09:03:09 crc kubenswrapper[5129]: I0314 09:03:09.670204 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f6dc858879a50522eea1b6c06026573be007feddd5d0a02c83bb2c63b6af2b5" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.511120 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x2lss"] Mar 14 09:03:10 crc kubenswrapper[5129]: E0314 09:03:10.511559 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeeeb0dd-39ba-4a10-a223-6c0d1079c766" containerName="mariadb-account-create-update" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.511583 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeeeb0dd-39ba-4a10-a223-6c0d1079c766" containerName="mariadb-account-create-update" Mar 14 09:03:10 crc kubenswrapper[5129]: E0314 09:03:10.511621 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c95390a-bb53-4827-8c22-07ebbd28ab75" containerName="mariadb-database-create" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.511632 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c95390a-bb53-4827-8c22-07ebbd28ab75" containerName="mariadb-database-create" Mar 14 09:03:10 crc kubenswrapper[5129]: E0314 09:03:10.511654 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b843f393-c69e-418c-9fde-9c694dba8294" containerName="mariadb-database-create" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.511665 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b843f393-c69e-418c-9fde-9c694dba8294" containerName="mariadb-database-create" Mar 14 09:03:10 crc kubenswrapper[5129]: E0314 09:03:10.511676 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab6a3ac-9b76-4c9d-8ec6-a132f4d19697" containerName="mariadb-account-create-update" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 
09:03:10.511685 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab6a3ac-9b76-4c9d-8ec6-a132f4d19697" containerName="mariadb-account-create-update" Mar 14 09:03:10 crc kubenswrapper[5129]: E0314 09:03:10.511697 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="298699ac-5f93-42c8-aecb-d98ef33e5d0c" containerName="mariadb-database-create" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.511704 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="298699ac-5f93-42c8-aecb-d98ef33e5d0c" containerName="mariadb-database-create" Mar 14 09:03:10 crc kubenswrapper[5129]: E0314 09:03:10.511714 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c114c0ae-fd24-4e0e-86d2-0586efa897cb" containerName="mariadb-account-create-update" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.511724 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c114c0ae-fd24-4e0e-86d2-0586efa897cb" containerName="mariadb-account-create-update" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.511953 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="298699ac-5f93-42c8-aecb-d98ef33e5d0c" containerName="mariadb-database-create" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.511974 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b843f393-c69e-418c-9fde-9c694dba8294" containerName="mariadb-database-create" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.511984 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="aab6a3ac-9b76-4c9d-8ec6-a132f4d19697" containerName="mariadb-account-create-update" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.511994 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeeeb0dd-39ba-4a10-a223-6c0d1079c766" containerName="mariadb-account-create-update" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.512007 5129 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c114c0ae-fd24-4e0e-86d2-0586efa897cb" containerName="mariadb-account-create-update" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.512029 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c95390a-bb53-4827-8c22-07ebbd28ab75" containerName="mariadb-database-create" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.512803 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x2lss" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.515265 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.515368 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-xjd96" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.515592 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.536187 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x2lss"] Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.607641 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-scripts\") pod \"nova-cell0-conductor-db-sync-x2lss\" (UID: \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\") " pod="openstack/nova-cell0-conductor-db-sync-x2lss" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.607725 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbb6z\" (UniqueName: \"kubernetes.io/projected/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-kube-api-access-nbb6z\") pod \"nova-cell0-conductor-db-sync-x2lss\" (UID: \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\") " 
pod="openstack/nova-cell0-conductor-db-sync-x2lss" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.607799 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x2lss\" (UID: \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\") " pod="openstack/nova-cell0-conductor-db-sync-x2lss" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.608005 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-config-data\") pod \"nova-cell0-conductor-db-sync-x2lss\" (UID: \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\") " pod="openstack/nova-cell0-conductor-db-sync-x2lss" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.710072 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbb6z\" (UniqueName: \"kubernetes.io/projected/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-kube-api-access-nbb6z\") pod \"nova-cell0-conductor-db-sync-x2lss\" (UID: \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\") " pod="openstack/nova-cell0-conductor-db-sync-x2lss" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.710283 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x2lss\" (UID: \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\") " pod="openstack/nova-cell0-conductor-db-sync-x2lss" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.710344 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-config-data\") pod \"nova-cell0-conductor-db-sync-x2lss\" 
(UID: \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\") " pod="openstack/nova-cell0-conductor-db-sync-x2lss" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.710386 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-scripts\") pod \"nova-cell0-conductor-db-sync-x2lss\" (UID: \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\") " pod="openstack/nova-cell0-conductor-db-sync-x2lss" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.717477 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-config-data\") pod \"nova-cell0-conductor-db-sync-x2lss\" (UID: \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\") " pod="openstack/nova-cell0-conductor-db-sync-x2lss" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.721391 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x2lss\" (UID: \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\") " pod="openstack/nova-cell0-conductor-db-sync-x2lss" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.723027 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-scripts\") pod \"nova-cell0-conductor-db-sync-x2lss\" (UID: \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\") " pod="openstack/nova-cell0-conductor-db-sync-x2lss" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.729861 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbb6z\" (UniqueName: \"kubernetes.io/projected/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-kube-api-access-nbb6z\") pod \"nova-cell0-conductor-db-sync-x2lss\" (UID: \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\") " 
pod="openstack/nova-cell0-conductor-db-sync-x2lss" Mar 14 09:03:10 crc kubenswrapper[5129]: I0314 09:03:10.833654 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x2lss" Mar 14 09:03:11 crc kubenswrapper[5129]: W0314 09:03:11.453834 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c8f48bc_be9e_4c7f_93a2_1ed9fabb641b.slice/crio-3f5d25f5376ab3d310771d83969950e988170f01d224e58be7804db649e37dd5 WatchSource:0}: Error finding container 3f5d25f5376ab3d310771d83969950e988170f01d224e58be7804db649e37dd5: Status 404 returned error can't find the container with id 3f5d25f5376ab3d310771d83969950e988170f01d224e58be7804db649e37dd5 Mar 14 09:03:11 crc kubenswrapper[5129]: I0314 09:03:11.454994 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x2lss"] Mar 14 09:03:11 crc kubenswrapper[5129]: I0314 09:03:11.690624 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x2lss" event={"ID":"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b","Type":"ContainerStarted","Data":"3f5d25f5376ab3d310771d83969950e988170f01d224e58be7804db649e37dd5"} Mar 14 09:03:23 crc kubenswrapper[5129]: I0314 09:03:23.804242 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x2lss" event={"ID":"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b","Type":"ContainerStarted","Data":"da19bf0fcb2a790ee491ccd4ea17e5a675924cf478b0af80ac7159eea0c5989e"} Mar 14 09:03:23 crc kubenswrapper[5129]: I0314 09:03:23.829537 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-x2lss" podStartSLOduration=1.7732697929999999 podStartE2EDuration="13.829518403s" podCreationTimestamp="2026-03-14 09:03:10 +0000 UTC" firstStartedPulling="2026-03-14 09:03:11.456060036 +0000 UTC m=+7454.207975220" 
lastFinishedPulling="2026-03-14 09:03:23.512308636 +0000 UTC m=+7466.264223830" observedRunningTime="2026-03-14 09:03:23.824440595 +0000 UTC m=+7466.576355769" watchObservedRunningTime="2026-03-14 09:03:23.829518403 +0000 UTC m=+7466.581433587" Mar 14 09:03:29 crc kubenswrapper[5129]: I0314 09:03:29.867817 5129 generic.go:334] "Generic (PLEG): container finished" podID="5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b" containerID="da19bf0fcb2a790ee491ccd4ea17e5a675924cf478b0af80ac7159eea0c5989e" exitCode=0 Mar 14 09:03:29 crc kubenswrapper[5129]: I0314 09:03:29.867929 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x2lss" event={"ID":"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b","Type":"ContainerDied","Data":"da19bf0fcb2a790ee491ccd4ea17e5a675924cf478b0af80ac7159eea0c5989e"} Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.238267 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x2lss" Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.323574 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbb6z\" (UniqueName: \"kubernetes.io/projected/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-kube-api-access-nbb6z\") pod \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\" (UID: \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\") " Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.323734 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-config-data\") pod \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\" (UID: \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\") " Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.323779 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-scripts\") pod 
\"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\" (UID: \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\") " Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.323804 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-combined-ca-bundle\") pod \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\" (UID: \"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b\") " Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.332857 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-scripts" (OuterVolumeSpecName: "scripts") pod "5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b" (UID: "5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.332929 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-kube-api-access-nbb6z" (OuterVolumeSpecName: "kube-api-access-nbb6z") pod "5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b" (UID: "5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b"). InnerVolumeSpecName "kube-api-access-nbb6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.349666 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b" (UID: "5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.367074 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-config-data" (OuterVolumeSpecName: "config-data") pod "5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b" (UID: "5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.425743 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbb6z\" (UniqueName: \"kubernetes.io/projected/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-kube-api-access-nbb6z\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.425808 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.425820 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.425829 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.885427 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x2lss" event={"ID":"5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b","Type":"ContainerDied","Data":"3f5d25f5376ab3d310771d83969950e988170f01d224e58be7804db649e37dd5"} Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.885765 5129 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="3f5d25f5376ab3d310771d83969950e988170f01d224e58be7804db649e37dd5" Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.885821 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x2lss" Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.968466 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 09:03:31 crc kubenswrapper[5129]: E0314 09:03:31.968832 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b" containerName="nova-cell0-conductor-db-sync" Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.968848 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b" containerName="nova-cell0-conductor-db-sync" Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.969163 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b" containerName="nova-cell0-conductor-db-sync" Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.969780 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.972790 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-xjd96" Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.973117 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 14 09:03:31 crc kubenswrapper[5129]: I0314 09:03:31.985802 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 09:03:32 crc kubenswrapper[5129]: I0314 09:03:32.137801 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq52d\" (UniqueName: \"kubernetes.io/projected/6f79de3c-8258-4bcf-a312-057813424e32-kube-api-access-kq52d\") pod \"nova-cell0-conductor-0\" (UID: \"6f79de3c-8258-4bcf-a312-057813424e32\") " pod="openstack/nova-cell0-conductor-0" Mar 14 09:03:32 crc kubenswrapper[5129]: I0314 09:03:32.137855 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f79de3c-8258-4bcf-a312-057813424e32-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6f79de3c-8258-4bcf-a312-057813424e32\") " pod="openstack/nova-cell0-conductor-0" Mar 14 09:03:32 crc kubenswrapper[5129]: I0314 09:03:32.137870 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f79de3c-8258-4bcf-a312-057813424e32-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6f79de3c-8258-4bcf-a312-057813424e32\") " pod="openstack/nova-cell0-conductor-0" Mar 14 09:03:32 crc kubenswrapper[5129]: I0314 09:03:32.239666 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq52d\" (UniqueName: 
\"kubernetes.io/projected/6f79de3c-8258-4bcf-a312-057813424e32-kube-api-access-kq52d\") pod \"nova-cell0-conductor-0\" (UID: \"6f79de3c-8258-4bcf-a312-057813424e32\") " pod="openstack/nova-cell0-conductor-0" Mar 14 09:03:32 crc kubenswrapper[5129]: I0314 09:03:32.239725 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f79de3c-8258-4bcf-a312-057813424e32-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6f79de3c-8258-4bcf-a312-057813424e32\") " pod="openstack/nova-cell0-conductor-0" Mar 14 09:03:32 crc kubenswrapper[5129]: I0314 09:03:32.239746 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f79de3c-8258-4bcf-a312-057813424e32-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6f79de3c-8258-4bcf-a312-057813424e32\") " pod="openstack/nova-cell0-conductor-0" Mar 14 09:03:32 crc kubenswrapper[5129]: I0314 09:03:32.242936 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f79de3c-8258-4bcf-a312-057813424e32-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6f79de3c-8258-4bcf-a312-057813424e32\") " pod="openstack/nova-cell0-conductor-0" Mar 14 09:03:32 crc kubenswrapper[5129]: I0314 09:03:32.243225 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f79de3c-8258-4bcf-a312-057813424e32-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6f79de3c-8258-4bcf-a312-057813424e32\") " pod="openstack/nova-cell0-conductor-0" Mar 14 09:03:32 crc kubenswrapper[5129]: I0314 09:03:32.260846 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq52d\" (UniqueName: \"kubernetes.io/projected/6f79de3c-8258-4bcf-a312-057813424e32-kube-api-access-kq52d\") pod \"nova-cell0-conductor-0\" (UID: 
\"6f79de3c-8258-4bcf-a312-057813424e32\") " pod="openstack/nova-cell0-conductor-0" Mar 14 09:03:32 crc kubenswrapper[5129]: I0314 09:03:32.295168 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 14 09:03:32 crc kubenswrapper[5129]: I0314 09:03:32.716043 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 09:03:32 crc kubenswrapper[5129]: I0314 09:03:32.898520 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6f79de3c-8258-4bcf-a312-057813424e32","Type":"ContainerStarted","Data":"49c3271b49cf9e621325f1d86c01adba6c37200663e868c3663ef37fe49a2e0d"} Mar 14 09:03:33 crc kubenswrapper[5129]: I0314 09:03:33.907378 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6f79de3c-8258-4bcf-a312-057813424e32","Type":"ContainerStarted","Data":"574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088"} Mar 14 09:03:33 crc kubenswrapper[5129]: I0314 09:03:33.907562 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 14 09:03:33 crc kubenswrapper[5129]: I0314 09:03:33.931962 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.931939831 podStartE2EDuration="2.931939831s" podCreationTimestamp="2026-03-14 09:03:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:03:33.925352703 +0000 UTC m=+7476.677267897" watchObservedRunningTime="2026-03-14 09:03:33.931939831 +0000 UTC m=+7476.683855015" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.324230 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 
09:03:37.739337 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-2jltq"] Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.741439 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2jltq" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.743918 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.744253 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.766979 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-2jltq"] Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.842078 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-config-data\") pod \"nova-cell0-cell-mapping-2jltq\" (UID: \"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\") " pod="openstack/nova-cell0-cell-mapping-2jltq" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.842130 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm9kx\" (UniqueName: \"kubernetes.io/projected/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-kube-api-access-hm9kx\") pod \"nova-cell0-cell-mapping-2jltq\" (UID: \"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\") " pod="openstack/nova-cell0-cell-mapping-2jltq" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.842247 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-2jltq\" (UID: 
\"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\") " pod="openstack/nova-cell0-cell-mapping-2jltq" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.842277 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-scripts\") pod \"nova-cell0-cell-mapping-2jltq\" (UID: \"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\") " pod="openstack/nova-cell0-cell-mapping-2jltq" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.862718 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.864166 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.869387 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.888524 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.904155 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.905817 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.914301 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.953437 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35914c22-9d64-4c0d-a4ff-090f61ede928-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"35914c22-9d64-4c0d-a4ff-090f61ede928\") " pod="openstack/nova-api-0" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.953533 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-2jltq\" (UID: \"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\") " pod="openstack/nova-cell0-cell-mapping-2jltq" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.953575 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-494dg\" (UniqueName: \"kubernetes.io/projected/35914c22-9d64-4c0d-a4ff-090f61ede928-kube-api-access-494dg\") pod \"nova-api-0\" (UID: \"35914c22-9d64-4c0d-a4ff-090f61ede928\") " pod="openstack/nova-api-0" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.953628 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-scripts\") pod \"nova-cell0-cell-mapping-2jltq\" (UID: \"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\") " pod="openstack/nova-cell0-cell-mapping-2jltq" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.953659 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/35914c22-9d64-4c0d-a4ff-090f61ede928-config-data\") pod \"nova-api-0\" (UID: \"35914c22-9d64-4c0d-a4ff-090f61ede928\") " pod="openstack/nova-api-0" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.953714 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35914c22-9d64-4c0d-a4ff-090f61ede928-logs\") pod \"nova-api-0\" (UID: \"35914c22-9d64-4c0d-a4ff-090f61ede928\") " pod="openstack/nova-api-0" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.953750 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-config-data\") pod \"nova-cell0-cell-mapping-2jltq\" (UID: \"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\") " pod="openstack/nova-cell0-cell-mapping-2jltq" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.953783 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm9kx\" (UniqueName: \"kubernetes.io/projected/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-kube-api-access-hm9kx\") pod \"nova-cell0-cell-mapping-2jltq\" (UID: \"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\") " pod="openstack/nova-cell0-cell-mapping-2jltq" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.969315 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-config-data\") pod \"nova-cell0-cell-mapping-2jltq\" (UID: \"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\") " pod="openstack/nova-cell0-cell-mapping-2jltq" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.982936 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-2jltq\" (UID: 
\"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\") " pod="openstack/nova-cell0-cell-mapping-2jltq" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.990295 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-scripts\") pod \"nova-cell0-cell-mapping-2jltq\" (UID: \"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\") " pod="openstack/nova-cell0-cell-mapping-2jltq" Mar 14 09:03:37 crc kubenswrapper[5129]: I0314 09:03:37.997719 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.015838 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jxmcs"] Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.017250 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm9kx\" (UniqueName: \"kubernetes.io/projected/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-kube-api-access-hm9kx\") pod \"nova-cell0-cell-mapping-2jltq\" (UID: \"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\") " pod="openstack/nova-cell0-cell-mapping-2jltq" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.018461 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jxmcs" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.057797 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec6278c3-c4c9-4131-8805-457c57e9aea6-utilities\") pod \"certified-operators-jxmcs\" (UID: \"ec6278c3-c4c9-4131-8805-457c57e9aea6\") " pod="openshift-marketplace/certified-operators-jxmcs" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.057845 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52657509-057d-4e07-9e8f-6c59e2b9ebc9-config-data\") pod \"nova-metadata-0\" (UID: \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\") " pod="openstack/nova-metadata-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.057874 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52657509-057d-4e07-9e8f-6c59e2b9ebc9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\") " pod="openstack/nova-metadata-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.057890 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjjcv\" (UniqueName: \"kubernetes.io/projected/ec6278c3-c4c9-4131-8805-457c57e9aea6-kube-api-access-tjjcv\") pod \"certified-operators-jxmcs\" (UID: \"ec6278c3-c4c9-4131-8805-457c57e9aea6\") " pod="openshift-marketplace/certified-operators-jxmcs" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.057911 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35914c22-9d64-4c0d-a4ff-090f61ede928-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"35914c22-9d64-4c0d-a4ff-090f61ede928\") 
" pod="openstack/nova-api-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.057938 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52657509-057d-4e07-9e8f-6c59e2b9ebc9-logs\") pod \"nova-metadata-0\" (UID: \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\") " pod="openstack/nova-metadata-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.057977 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwqmv\" (UniqueName: \"kubernetes.io/projected/52657509-057d-4e07-9e8f-6c59e2b9ebc9-kube-api-access-kwqmv\") pod \"nova-metadata-0\" (UID: \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\") " pod="openstack/nova-metadata-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.058003 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-494dg\" (UniqueName: \"kubernetes.io/projected/35914c22-9d64-4c0d-a4ff-090f61ede928-kube-api-access-494dg\") pod \"nova-api-0\" (UID: \"35914c22-9d64-4c0d-a4ff-090f61ede928\") " pod="openstack/nova-api-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.058032 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35914c22-9d64-4c0d-a4ff-090f61ede928-config-data\") pod \"nova-api-0\" (UID: \"35914c22-9d64-4c0d-a4ff-090f61ede928\") " pod="openstack/nova-api-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.058070 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35914c22-9d64-4c0d-a4ff-090f61ede928-logs\") pod \"nova-api-0\" (UID: \"35914c22-9d64-4c0d-a4ff-090f61ede928\") " pod="openstack/nova-api-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.058089 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec6278c3-c4c9-4131-8805-457c57e9aea6-catalog-content\") pod \"certified-operators-jxmcs\" (UID: \"ec6278c3-c4c9-4131-8805-457c57e9aea6\") " pod="openshift-marketplace/certified-operators-jxmcs" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.067156 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35914c22-9d64-4c0d-a4ff-090f61ede928-logs\") pod \"nova-api-0\" (UID: \"35914c22-9d64-4c0d-a4ff-090f61ede928\") " pod="openstack/nova-api-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.075732 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2jltq" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.085520 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35914c22-9d64-4c0d-a4ff-090f61ede928-config-data\") pod \"nova-api-0\" (UID: \"35914c22-9d64-4c0d-a4ff-090f61ede928\") " pod="openstack/nova-api-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.104978 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35914c22-9d64-4c0d-a4ff-090f61ede928-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"35914c22-9d64-4c0d-a4ff-090f61ede928\") " pod="openstack/nova-api-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.106333 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jxmcs"] Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.106364 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.107473 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.111659 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.143382 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-494dg\" (UniqueName: \"kubernetes.io/projected/35914c22-9d64-4c0d-a4ff-090f61ede928-kube-api-access-494dg\") pod \"nova-api-0\" (UID: \"35914c22-9d64-4c0d-a4ff-090f61ede928\") " pod="openstack/nova-api-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.143465 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.155541 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64c7b9766c-5pbn9"] Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.162100 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.165777 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52657509-057d-4e07-9e8f-6c59e2b9ebc9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\") " pod="openstack/nova-metadata-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.165812 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjjcv\" (UniqueName: \"kubernetes.io/projected/ec6278c3-c4c9-4131-8805-457c57e9aea6-kube-api-access-tjjcv\") pod \"certified-operators-jxmcs\" (UID: \"ec6278c3-c4c9-4131-8805-457c57e9aea6\") " pod="openshift-marketplace/certified-operators-jxmcs" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.180456 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52657509-057d-4e07-9e8f-6c59e2b9ebc9-logs\") pod \"nova-metadata-0\" (UID: \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\") " pod="openstack/nova-metadata-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.180521 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bade4404-af66-46e3-8b88-401b7347b40e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bade4404-af66-46e3-8b88-401b7347b40e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.180649 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwqmv\" (UniqueName: \"kubernetes.io/projected/52657509-057d-4e07-9e8f-6c59e2b9ebc9-kube-api-access-kwqmv\") pod \"nova-metadata-0\" (UID: \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\") " pod="openstack/nova-metadata-0" Mar 14 09:03:38 crc 
kubenswrapper[5129]: I0314 09:03:38.180790 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j58qz\" (UniqueName: \"kubernetes.io/projected/bade4404-af66-46e3-8b88-401b7347b40e-kube-api-access-j58qz\") pod \"nova-cell1-novncproxy-0\" (UID: \"bade4404-af66-46e3-8b88-401b7347b40e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.180884 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec6278c3-c4c9-4131-8805-457c57e9aea6-catalog-content\") pod \"certified-operators-jxmcs\" (UID: \"ec6278c3-c4c9-4131-8805-457c57e9aea6\") " pod="openshift-marketplace/certified-operators-jxmcs" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.180924 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bade4404-af66-46e3-8b88-401b7347b40e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bade4404-af66-46e3-8b88-401b7347b40e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.181005 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec6278c3-c4c9-4131-8805-457c57e9aea6-utilities\") pod \"certified-operators-jxmcs\" (UID: \"ec6278c3-c4c9-4131-8805-457c57e9aea6\") " pod="openshift-marketplace/certified-operators-jxmcs" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.181051 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52657509-057d-4e07-9e8f-6c59e2b9ebc9-config-data\") pod \"nova-metadata-0\" (UID: \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\") " pod="openstack/nova-metadata-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.182002 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52657509-057d-4e07-9e8f-6c59e2b9ebc9-logs\") pod \"nova-metadata-0\" (UID: \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\") " pod="openstack/nova-metadata-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.183197 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec6278c3-c4c9-4131-8805-457c57e9aea6-utilities\") pod \"certified-operators-jxmcs\" (UID: \"ec6278c3-c4c9-4131-8805-457c57e9aea6\") " pod="openshift-marketplace/certified-operators-jxmcs" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.184461 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52657509-057d-4e07-9e8f-6c59e2b9ebc9-config-data\") pod \"nova-metadata-0\" (UID: \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\") " pod="openstack/nova-metadata-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.185125 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.186000 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec6278c3-c4c9-4131-8805-457c57e9aea6-catalog-content\") pod \"certified-operators-jxmcs\" (UID: \"ec6278c3-c4c9-4131-8805-457c57e9aea6\") " pod="openshift-marketplace/certified-operators-jxmcs" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.196191 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52657509-057d-4e07-9e8f-6c59e2b9ebc9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\") " pod="openstack/nova-metadata-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.212243 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwqmv\" (UniqueName: \"kubernetes.io/projected/52657509-057d-4e07-9e8f-6c59e2b9ebc9-kube-api-access-kwqmv\") pod \"nova-metadata-0\" (UID: \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\") " pod="openstack/nova-metadata-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.223969 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.233566 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64c7b9766c-5pbn9"] Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.234190 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjjcv\" (UniqueName: \"kubernetes.io/projected/ec6278c3-c4c9-4131-8805-457c57e9aea6-kube-api-access-tjjcv\") pod \"certified-operators-jxmcs\" (UID: \"ec6278c3-c4c9-4131-8805-457c57e9aea6\") " pod="openshift-marketplace/certified-operators-jxmcs" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.251680 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.253119 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.255361 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.260334 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.281864 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-ovsdbserver-nb\") pod \"dnsmasq-dns-64c7b9766c-5pbn9\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.281916 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bade4404-af66-46e3-8b88-401b7347b40e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"bade4404-af66-46e3-8b88-401b7347b40e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.281945 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47qvs\" (UniqueName: \"kubernetes.io/projected/0ecedc81-c8c1-426c-8d77-5281d664ab2a-kube-api-access-47qvs\") pod \"dnsmasq-dns-64c7b9766c-5pbn9\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.281965 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-config\") pod \"dnsmasq-dns-64c7b9766c-5pbn9\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.281995 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-dns-svc\") pod \"dnsmasq-dns-64c7b9766c-5pbn9\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.282042 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bade4404-af66-46e3-8b88-401b7347b40e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bade4404-af66-46e3-8b88-401b7347b40e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.282062 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-ovsdbserver-sb\") pod \"dnsmasq-dns-64c7b9766c-5pbn9\" (UID: 
\"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.282123 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j58qz\" (UniqueName: \"kubernetes.io/projected/bade4404-af66-46e3-8b88-401b7347b40e-kube-api-access-j58qz\") pod \"nova-cell1-novncproxy-0\" (UID: \"bade4404-af66-46e3-8b88-401b7347b40e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.291819 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bade4404-af66-46e3-8b88-401b7347b40e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bade4404-af66-46e3-8b88-401b7347b40e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.297053 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bade4404-af66-46e3-8b88-401b7347b40e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bade4404-af66-46e3-8b88-401b7347b40e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.341304 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j58qz\" (UniqueName: \"kubernetes.io/projected/bade4404-af66-46e3-8b88-401b7347b40e-kube-api-access-j58qz\") pod \"nova-cell1-novncproxy-0\" (UID: \"bade4404-af66-46e3-8b88-401b7347b40e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.386199 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-ovsdbserver-sb\") pod \"dnsmasq-dns-64c7b9766c-5pbn9\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 
14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.386253 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2hbj\" (UniqueName: \"kubernetes.io/projected/6e8803c1-008f-4327-8ab4-a91fbc9861b2-kube-api-access-m2hbj\") pod \"nova-scheduler-0\" (UID: \"6e8803c1-008f-4327-8ab4-a91fbc9861b2\") " pod="openstack/nova-scheduler-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.386303 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e8803c1-008f-4327-8ab4-a91fbc9861b2-config-data\") pod \"nova-scheduler-0\" (UID: \"6e8803c1-008f-4327-8ab4-a91fbc9861b2\") " pod="openstack/nova-scheduler-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.386343 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-ovsdbserver-nb\") pod \"dnsmasq-dns-64c7b9766c-5pbn9\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.386387 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47qvs\" (UniqueName: \"kubernetes.io/projected/0ecedc81-c8c1-426c-8d77-5281d664ab2a-kube-api-access-47qvs\") pod \"dnsmasq-dns-64c7b9766c-5pbn9\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.386408 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-config\") pod \"dnsmasq-dns-64c7b9766c-5pbn9\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.386426 
5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8803c1-008f-4327-8ab4-a91fbc9861b2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6e8803c1-008f-4327-8ab4-a91fbc9861b2\") " pod="openstack/nova-scheduler-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.386451 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-dns-svc\") pod \"dnsmasq-dns-64c7b9766c-5pbn9\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.387833 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-ovsdbserver-sb\") pod \"dnsmasq-dns-64c7b9766c-5pbn9\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.387974 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-ovsdbserver-nb\") pod \"dnsmasq-dns-64c7b9766c-5pbn9\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.388056 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-config\") pod \"dnsmasq-dns-64c7b9766c-5pbn9\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.389058 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-dns-svc\") pod \"dnsmasq-dns-64c7b9766c-5pbn9\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.415116 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47qvs\" (UniqueName: \"kubernetes.io/projected/0ecedc81-c8c1-426c-8d77-5281d664ab2a-kube-api-access-47qvs\") pod \"dnsmasq-dns-64c7b9766c-5pbn9\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.488723 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2hbj\" (UniqueName: \"kubernetes.io/projected/6e8803c1-008f-4327-8ab4-a91fbc9861b2-kube-api-access-m2hbj\") pod \"nova-scheduler-0\" (UID: \"6e8803c1-008f-4327-8ab4-a91fbc9861b2\") " pod="openstack/nova-scheduler-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.489086 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e8803c1-008f-4327-8ab4-a91fbc9861b2-config-data\") pod \"nova-scheduler-0\" (UID: \"6e8803c1-008f-4327-8ab4-a91fbc9861b2\") " pod="openstack/nova-scheduler-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.489157 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8803c1-008f-4327-8ab4-a91fbc9861b2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6e8803c1-008f-4327-8ab4-a91fbc9861b2\") " pod="openstack/nova-scheduler-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.497812 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e8803c1-008f-4327-8ab4-a91fbc9861b2-config-data\") pod \"nova-scheduler-0\" (UID: 
\"6e8803c1-008f-4327-8ab4-a91fbc9861b2\") " pod="openstack/nova-scheduler-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.499724 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8803c1-008f-4327-8ab4-a91fbc9861b2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6e8803c1-008f-4327-8ab4-a91fbc9861b2\") " pod="openstack/nova-scheduler-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.512312 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jxmcs" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.524234 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2hbj\" (UniqueName: \"kubernetes.io/projected/6e8803c1-008f-4327-8ab4-a91fbc9861b2-kube-api-access-m2hbj\") pod \"nova-scheduler-0\" (UID: \"6e8803c1-008f-4327-8ab4-a91fbc9861b2\") " pod="openstack/nova-scheduler-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.536100 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.546366 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.580424 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 09:03:38 crc kubenswrapper[5129]: I0314 09:03:38.793725 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-2jltq"] Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.008381 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.056879 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2jltq" event={"ID":"7f7e6d99-1b35-48f2-aad6-6724ffe3629e","Type":"ContainerStarted","Data":"ea56365f04e1d169df0c254274c2b2ba9aa8f91b53f7386ad7a26e5547dae248"} Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.081534 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.223995 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jxmcs"] Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.534286 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2nsmt"] Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.535866 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2nsmt" Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.539225 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.545725 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.554446 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2nsmt"] Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.570378 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.610824 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 09:03:39 crc kubenswrapper[5129]: W0314 09:03:39.619350 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e8803c1_008f_4327_8ab4_a91fbc9861b2.slice/crio-9b62ef0ddaaf00d0c78154a2af7496d3658db4b69f43cb6c306b7fe2b223d09b WatchSource:0}: Error finding container 9b62ef0ddaaf00d0c78154a2af7496d3658db4b69f43cb6c306b7fe2b223d09b: Status 404 returned error can't find the container with id 9b62ef0ddaaf00d0c78154a2af7496d3658db4b69f43cb6c306b7fe2b223d09b Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.649836 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2nsmt\" (UID: \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\") " pod="openstack/nova-cell1-conductor-db-sync-2nsmt" Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.649956 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldf7h\" (UniqueName: \"kubernetes.io/projected/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-kube-api-access-ldf7h\") pod \"nova-cell1-conductor-db-sync-2nsmt\" (UID: \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\") " pod="openstack/nova-cell1-conductor-db-sync-2nsmt" Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.650443 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-scripts\") pod \"nova-cell1-conductor-db-sync-2nsmt\" (UID: \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\") " pod="openstack/nova-cell1-conductor-db-sync-2nsmt" Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.650556 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-config-data\") pod \"nova-cell1-conductor-db-sync-2nsmt\" (UID: \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\") " pod="openstack/nova-cell1-conductor-db-sync-2nsmt" Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.664854 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64c7b9766c-5pbn9"] Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.757027 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2nsmt\" (UID: \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\") " pod="openstack/nova-cell1-conductor-db-sync-2nsmt" Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.757670 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldf7h\" (UniqueName: \"kubernetes.io/projected/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-kube-api-access-ldf7h\") pod 
\"nova-cell1-conductor-db-sync-2nsmt\" (UID: \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\") " pod="openstack/nova-cell1-conductor-db-sync-2nsmt" Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.758206 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-scripts\") pod \"nova-cell1-conductor-db-sync-2nsmt\" (UID: \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\") " pod="openstack/nova-cell1-conductor-db-sync-2nsmt" Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.758324 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-config-data\") pod \"nova-cell1-conductor-db-sync-2nsmt\" (UID: \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\") " pod="openstack/nova-cell1-conductor-db-sync-2nsmt" Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.764299 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2nsmt\" (UID: \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\") " pod="openstack/nova-cell1-conductor-db-sync-2nsmt" Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.765352 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-scripts\") pod \"nova-cell1-conductor-db-sync-2nsmt\" (UID: \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\") " pod="openstack/nova-cell1-conductor-db-sync-2nsmt" Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.768658 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-config-data\") pod \"nova-cell1-conductor-db-sync-2nsmt\" (UID: 
\"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\") " pod="openstack/nova-cell1-conductor-db-sync-2nsmt" Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.774822 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldf7h\" (UniqueName: \"kubernetes.io/projected/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-kube-api-access-ldf7h\") pod \"nova-cell1-conductor-db-sync-2nsmt\" (UID: \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\") " pod="openstack/nova-cell1-conductor-db-sync-2nsmt" Mar 14 09:03:39 crc kubenswrapper[5129]: I0314 09:03:39.859695 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2nsmt" Mar 14 09:03:40 crc kubenswrapper[5129]: I0314 09:03:40.117373 5129 generic.go:334] "Generic (PLEG): container finished" podID="0ecedc81-c8c1-426c-8d77-5281d664ab2a" containerID="99dc6367703ae6c03abd45aa852b42eb76ee553d29759d6a227a0f9dca0238bb" exitCode=0 Mar 14 09:03:40 crc kubenswrapper[5129]: I0314 09:03:40.117952 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" event={"ID":"0ecedc81-c8c1-426c-8d77-5281d664ab2a","Type":"ContainerDied","Data":"99dc6367703ae6c03abd45aa852b42eb76ee553d29759d6a227a0f9dca0238bb"} Mar 14 09:03:40 crc kubenswrapper[5129]: I0314 09:03:40.117990 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" event={"ID":"0ecedc81-c8c1-426c-8d77-5281d664ab2a","Type":"ContainerStarted","Data":"c621b34e6db3430ecfd55a8a4b4a1f5c704c7cca71d90838d67d980153384db9"} Mar 14 09:03:40 crc kubenswrapper[5129]: I0314 09:03:40.128822 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52657509-057d-4e07-9e8f-6c59e2b9ebc9","Type":"ContainerStarted","Data":"f1a1b08f2e3cea8a441036a63589521226d9e6e04de0fd1fe40278c8e78349b5"} Mar 14 09:03:40 crc kubenswrapper[5129]: I0314 09:03:40.133838 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"35914c22-9d64-4c0d-a4ff-090f61ede928","Type":"ContainerStarted","Data":"26ba367af6540fbffc3266b72260b0d3fa6ee3dfa32a53b0e47f65499097b86f"} Mar 14 09:03:40 crc kubenswrapper[5129]: I0314 09:03:40.175871 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2jltq" event={"ID":"7f7e6d99-1b35-48f2-aad6-6724ffe3629e","Type":"ContainerStarted","Data":"e016520c9199256e4a26a49ccdffb63f5b0cc903389cd9db3ab67bd4cb64a8c8"} Mar 14 09:03:40 crc kubenswrapper[5129]: I0314 09:03:40.235771 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-2jltq" podStartSLOduration=3.235753296 podStartE2EDuration="3.235753296s" podCreationTimestamp="2026-03-14 09:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:03:40.206873113 +0000 UTC m=+7482.958788297" watchObservedRunningTime="2026-03-14 09:03:40.235753296 +0000 UTC m=+7482.987668480" Mar 14 09:03:40 crc kubenswrapper[5129]: I0314 09:03:40.241928 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6e8803c1-008f-4327-8ab4-a91fbc9861b2","Type":"ContainerStarted","Data":"9b62ef0ddaaf00d0c78154a2af7496d3658db4b69f43cb6c306b7fe2b223d09b"} Mar 14 09:03:40 crc kubenswrapper[5129]: I0314 09:03:40.269392 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bade4404-af66-46e3-8b88-401b7347b40e","Type":"ContainerStarted","Data":"5b9daaab6f1fb9548bf0fba46937a0081738d4534d76b35cfb5ed3ef413d5f3f"} Mar 14 09:03:40 crc kubenswrapper[5129]: I0314 09:03:40.305041 5129 generic.go:334] "Generic (PLEG): container finished" podID="ec6278c3-c4c9-4131-8805-457c57e9aea6" containerID="fffd483c0061adb7bb16cc1044c37dbc065df794ad8960af6cf39a2cc47bf0da" exitCode=0 Mar 14 09:03:40 crc kubenswrapper[5129]: I0314 09:03:40.305089 
5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxmcs" event={"ID":"ec6278c3-c4c9-4131-8805-457c57e9aea6","Type":"ContainerDied","Data":"fffd483c0061adb7bb16cc1044c37dbc065df794ad8960af6cf39a2cc47bf0da"} Mar 14 09:03:40 crc kubenswrapper[5129]: I0314 09:03:40.305113 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxmcs" event={"ID":"ec6278c3-c4c9-4131-8805-457c57e9aea6","Type":"ContainerStarted","Data":"79dbe62d064d4b66db7dc3df32878460cef161e55834c3a54ecc617d6ec27b52"} Mar 14 09:03:40 crc kubenswrapper[5129]: I0314 09:03:40.463694 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2nsmt"] Mar 14 09:03:40 crc kubenswrapper[5129]: W0314 09:03:40.476768 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc66b44f9_85eb_4e9f_b6ca_9b1d0a43c682.slice/crio-c9df959fc54f4fa70122dc16d3374f872a2a3b4abe818c71dd18c5ada2d2edac WatchSource:0}: Error finding container c9df959fc54f4fa70122dc16d3374f872a2a3b4abe818c71dd18c5ada2d2edac: Status 404 returned error can't find the container with id c9df959fc54f4fa70122dc16d3374f872a2a3b4abe818c71dd18c5ada2d2edac Mar 14 09:03:41 crc kubenswrapper[5129]: I0314 09:03:41.322914 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2nsmt" event={"ID":"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682","Type":"ContainerStarted","Data":"4fd366e6b4f84257df2dc2e4364d0afea5c792265d18ffa6544d62f22641b294"} Mar 14 09:03:41 crc kubenswrapper[5129]: I0314 09:03:41.323538 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2nsmt" event={"ID":"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682","Type":"ContainerStarted","Data":"c9df959fc54f4fa70122dc16d3374f872a2a3b4abe818c71dd18c5ada2d2edac"} Mar 14 09:03:41 crc kubenswrapper[5129]: I0314 09:03:41.325776 
5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" event={"ID":"0ecedc81-c8c1-426c-8d77-5281d664ab2a","Type":"ContainerStarted","Data":"1fbaf02f0b38939f54dde95de534c0e790056fe23ec0f18a57c79d1bfbec3b72"} Mar 14 09:03:41 crc kubenswrapper[5129]: I0314 09:03:41.343572 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-2nsmt" podStartSLOduration=2.343555489 podStartE2EDuration="2.343555489s" podCreationTimestamp="2026-03-14 09:03:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:03:41.337476684 +0000 UTC m=+7484.089391868" watchObservedRunningTime="2026-03-14 09:03:41.343555489 +0000 UTC m=+7484.095470673" Mar 14 09:03:41 crc kubenswrapper[5129]: I0314 09:03:41.362897 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" podStartSLOduration=3.362875513 podStartE2EDuration="3.362875513s" podCreationTimestamp="2026-03-14 09:03:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:03:41.353365635 +0000 UTC m=+7484.105280819" watchObservedRunningTime="2026-03-14 09:03:41.362875513 +0000 UTC m=+7484.114790697" Mar 14 09:03:41 crc kubenswrapper[5129]: I0314 09:03:41.910221 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:03:41 crc kubenswrapper[5129]: I0314 09:03:41.921890 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 09:03:42 crc kubenswrapper[5129]: I0314 09:03:42.334638 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.378766 5129 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-metadata-0" event={"ID":"52657509-057d-4e07-9e8f-6c59e2b9ebc9","Type":"ContainerStarted","Data":"74ca91bbb48501e4b6b4826079e994733cfb46399fc93c22247304debbc68e53"} Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.379489 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52657509-057d-4e07-9e8f-6c59e2b9ebc9","Type":"ContainerStarted","Data":"abf0bc486647a3e0c9cc5ce09ab3f795d1d4ce48e4febfcf65928606a7648f7c"} Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.379655 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="52657509-057d-4e07-9e8f-6c59e2b9ebc9" containerName="nova-metadata-log" containerID="cri-o://abf0bc486647a3e0c9cc5ce09ab3f795d1d4ce48e4febfcf65928606a7648f7c" gracePeriod=30 Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.380199 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="52657509-057d-4e07-9e8f-6c59e2b9ebc9" containerName="nova-metadata-metadata" containerID="cri-o://74ca91bbb48501e4b6b4826079e994733cfb46399fc93c22247304debbc68e53" gracePeriod=30 Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.385528 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35914c22-9d64-4c0d-a4ff-090f61ede928","Type":"ContainerStarted","Data":"b8435fbfba0bb6f87271a0b6118bc85eeab38a4c5c44f275a70db45c85e63620"} Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.385577 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35914c22-9d64-4c0d-a4ff-090f61ede928","Type":"ContainerStarted","Data":"ef6658eceda084d0cbceae53bc455b861d3374aa04e3ed4480c8ea9c1e427c5b"} Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.394741 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"6e8803c1-008f-4327-8ab4-a91fbc9861b2","Type":"ContainerStarted","Data":"3d77cb7287cb90a6f89f7af4ee76a80f227449b858a496d3c90b2c68b846fb8c"} Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.401511 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bade4404-af66-46e3-8b88-401b7347b40e","Type":"ContainerStarted","Data":"dddcc0f8b88b14babbdff4091b07ebbdfcaee288e79a846ff02b5275acfaace2"} Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.401732 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="bade4404-af66-46e3-8b88-401b7347b40e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://dddcc0f8b88b14babbdff4091b07ebbdfcaee288e79a846ff02b5275acfaace2" gracePeriod=30 Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.406135 5129 generic.go:334] "Generic (PLEG): container finished" podID="ec6278c3-c4c9-4131-8805-457c57e9aea6" containerID="10a3e1f3a4ea2ee7c0204ea0975930a5e6df1c5a4a82c64570fc47d5e4ee0b61" exitCode=0 Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.406501 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxmcs" event={"ID":"ec6278c3-c4c9-4131-8805-457c57e9aea6","Type":"ContainerDied","Data":"10a3e1f3a4ea2ee7c0204ea0975930a5e6df1c5a4a82c64570fc47d5e4ee0b61"} Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.410041 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.114871935 podStartE2EDuration="7.410005263s" podCreationTimestamp="2026-03-14 09:03:37 +0000 UTC" firstStartedPulling="2026-03-14 09:03:39.056369741 +0000 UTC m=+7481.808284925" lastFinishedPulling="2026-03-14 09:03:43.351503069 +0000 UTC m=+7486.103418253" observedRunningTime="2026-03-14 09:03:44.399096597 +0000 UTC m=+7487.151011781" watchObservedRunningTime="2026-03-14 09:03:44.410005263 
+0000 UTC m=+7487.161920447" Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.428976 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.699185962 podStartE2EDuration="6.428946366s" podCreationTimestamp="2026-03-14 09:03:38 +0000 UTC" firstStartedPulling="2026-03-14 09:03:39.622294639 +0000 UTC m=+7482.374209813" lastFinishedPulling="2026-03-14 09:03:43.352055043 +0000 UTC m=+7486.103970217" observedRunningTime="2026-03-14 09:03:44.42576821 +0000 UTC m=+7487.177683404" watchObservedRunningTime="2026-03-14 09:03:44.428946366 +0000 UTC m=+7487.180861550" Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.459699 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.27605121 podStartE2EDuration="7.45967269s" podCreationTimestamp="2026-03-14 09:03:37 +0000 UTC" firstStartedPulling="2026-03-14 09:03:39.166426219 +0000 UTC m=+7481.918341403" lastFinishedPulling="2026-03-14 09:03:43.350047709 +0000 UTC m=+7486.101962883" observedRunningTime="2026-03-14 09:03:44.451510479 +0000 UTC m=+7487.203425673" watchObservedRunningTime="2026-03-14 09:03:44.45967269 +0000 UTC m=+7487.211587864" Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.477811 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.700947041 podStartE2EDuration="7.477786302s" podCreationTimestamp="2026-03-14 09:03:37 +0000 UTC" firstStartedPulling="2026-03-14 09:03:39.57919848 +0000 UTC m=+7482.331113654" lastFinishedPulling="2026-03-14 09:03:43.356037731 +0000 UTC m=+7486.107952915" observedRunningTime="2026-03-14 09:03:44.465684893 +0000 UTC m=+7487.217600077" watchObservedRunningTime="2026-03-14 09:03:44.477786302 +0000 UTC m=+7487.229701486" Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.942322 5129 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-rw772"] Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.944894 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rw772" Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.953548 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rw772"] Mar 14 09:03:44 crc kubenswrapper[5129]: I0314 09:03:44.976336 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.095804 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52657509-057d-4e07-9e8f-6c59e2b9ebc9-combined-ca-bundle\") pod \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\" (UID: \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\") " Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.095883 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwqmv\" (UniqueName: \"kubernetes.io/projected/52657509-057d-4e07-9e8f-6c59e2b9ebc9-kube-api-access-kwqmv\") pod \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\" (UID: \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\") " Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.095911 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52657509-057d-4e07-9e8f-6c59e2b9ebc9-config-data\") pod \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\" (UID: \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\") " Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.095976 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52657509-057d-4e07-9e8f-6c59e2b9ebc9-logs\") pod \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\" (UID: \"52657509-057d-4e07-9e8f-6c59e2b9ebc9\") " 
Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.096574 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttw42\" (UniqueName: \"kubernetes.io/projected/8d643556-e564-494a-be43-3769b1001f23-kube-api-access-ttw42\") pod \"redhat-operators-rw772\" (UID: \"8d643556-e564-494a-be43-3769b1001f23\") " pod="openshift-marketplace/redhat-operators-rw772" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.096641 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d643556-e564-494a-be43-3769b1001f23-catalog-content\") pod \"redhat-operators-rw772\" (UID: \"8d643556-e564-494a-be43-3769b1001f23\") " pod="openshift-marketplace/redhat-operators-rw772" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.096753 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d643556-e564-494a-be43-3769b1001f23-utilities\") pod \"redhat-operators-rw772\" (UID: \"8d643556-e564-494a-be43-3769b1001f23\") " pod="openshift-marketplace/redhat-operators-rw772" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.097361 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52657509-057d-4e07-9e8f-6c59e2b9ebc9-logs" (OuterVolumeSpecName: "logs") pod "52657509-057d-4e07-9e8f-6c59e2b9ebc9" (UID: "52657509-057d-4e07-9e8f-6c59e2b9ebc9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.103810 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52657509-057d-4e07-9e8f-6c59e2b9ebc9-kube-api-access-kwqmv" (OuterVolumeSpecName: "kube-api-access-kwqmv") pod "52657509-057d-4e07-9e8f-6c59e2b9ebc9" (UID: "52657509-057d-4e07-9e8f-6c59e2b9ebc9"). InnerVolumeSpecName "kube-api-access-kwqmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.129204 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52657509-057d-4e07-9e8f-6c59e2b9ebc9-config-data" (OuterVolumeSpecName: "config-data") pod "52657509-057d-4e07-9e8f-6c59e2b9ebc9" (UID: "52657509-057d-4e07-9e8f-6c59e2b9ebc9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.150904 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52657509-057d-4e07-9e8f-6c59e2b9ebc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52657509-057d-4e07-9e8f-6c59e2b9ebc9" (UID: "52657509-057d-4e07-9e8f-6c59e2b9ebc9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.199088 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttw42\" (UniqueName: \"kubernetes.io/projected/8d643556-e564-494a-be43-3769b1001f23-kube-api-access-ttw42\") pod \"redhat-operators-rw772\" (UID: \"8d643556-e564-494a-be43-3769b1001f23\") " pod="openshift-marketplace/redhat-operators-rw772" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.199163 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d643556-e564-494a-be43-3769b1001f23-catalog-content\") pod \"redhat-operators-rw772\" (UID: \"8d643556-e564-494a-be43-3769b1001f23\") " pod="openshift-marketplace/redhat-operators-rw772" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.199257 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d643556-e564-494a-be43-3769b1001f23-utilities\") pod \"redhat-operators-rw772\" (UID: \"8d643556-e564-494a-be43-3769b1001f23\") " pod="openshift-marketplace/redhat-operators-rw772" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.199336 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52657509-057d-4e07-9e8f-6c59e2b9ebc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.199348 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwqmv\" (UniqueName: \"kubernetes.io/projected/52657509-057d-4e07-9e8f-6c59e2b9ebc9-kube-api-access-kwqmv\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.199358 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/52657509-057d-4e07-9e8f-6c59e2b9ebc9-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.199367 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52657509-057d-4e07-9e8f-6c59e2b9ebc9-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.199879 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d643556-e564-494a-be43-3769b1001f23-catalog-content\") pod \"redhat-operators-rw772\" (UID: \"8d643556-e564-494a-be43-3769b1001f23\") " pod="openshift-marketplace/redhat-operators-rw772" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.199989 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d643556-e564-494a-be43-3769b1001f23-utilities\") pod \"redhat-operators-rw772\" (UID: \"8d643556-e564-494a-be43-3769b1001f23\") " pod="openshift-marketplace/redhat-operators-rw772" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.224140 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttw42\" (UniqueName: \"kubernetes.io/projected/8d643556-e564-494a-be43-3769b1001f23-kube-api-access-ttw42\") pod \"redhat-operators-rw772\" (UID: \"8d643556-e564-494a-be43-3769b1001f23\") " pod="openshift-marketplace/redhat-operators-rw772" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.285766 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rw772" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.447215 5129 generic.go:334] "Generic (PLEG): container finished" podID="52657509-057d-4e07-9e8f-6c59e2b9ebc9" containerID="74ca91bbb48501e4b6b4826079e994733cfb46399fc93c22247304debbc68e53" exitCode=0 Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.447576 5129 generic.go:334] "Generic (PLEG): container finished" podID="52657509-057d-4e07-9e8f-6c59e2b9ebc9" containerID="abf0bc486647a3e0c9cc5ce09ab3f795d1d4ce48e4febfcf65928606a7648f7c" exitCode=143 Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.447761 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.447767 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52657509-057d-4e07-9e8f-6c59e2b9ebc9","Type":"ContainerDied","Data":"74ca91bbb48501e4b6b4826079e994733cfb46399fc93c22247304debbc68e53"} Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.447874 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52657509-057d-4e07-9e8f-6c59e2b9ebc9","Type":"ContainerDied","Data":"abf0bc486647a3e0c9cc5ce09ab3f795d1d4ce48e4febfcf65928606a7648f7c"} Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.447897 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52657509-057d-4e07-9e8f-6c59e2b9ebc9","Type":"ContainerDied","Data":"f1a1b08f2e3cea8a441036a63589521226d9e6e04de0fd1fe40278c8e78349b5"} Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.447961 5129 scope.go:117] "RemoveContainer" containerID="74ca91bbb48501e4b6b4826079e994733cfb46399fc93c22247304debbc68e53" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.456228 5129 generic.go:334] "Generic (PLEG): container finished" 
podID="7f7e6d99-1b35-48f2-aad6-6724ffe3629e" containerID="e016520c9199256e4a26a49ccdffb63f5b0cc903389cd9db3ab67bd4cb64a8c8" exitCode=0 Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.456349 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2jltq" event={"ID":"7f7e6d99-1b35-48f2-aad6-6724ffe3629e","Type":"ContainerDied","Data":"e016520c9199256e4a26a49ccdffb63f5b0cc903389cd9db3ab67bd4cb64a8c8"} Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.461962 5129 generic.go:334] "Generic (PLEG): container finished" podID="c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682" containerID="4fd366e6b4f84257df2dc2e4364d0afea5c792265d18ffa6544d62f22641b294" exitCode=0 Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.462252 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2nsmt" event={"ID":"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682","Type":"ContainerDied","Data":"4fd366e6b4f84257df2dc2e4364d0afea5c792265d18ffa6544d62f22641b294"} Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.475405 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxmcs" event={"ID":"ec6278c3-c4c9-4131-8805-457c57e9aea6","Type":"ContainerStarted","Data":"78eea08aba2100afa5e44b6c6ff177957c8878bcacf063208fed23b05ec22f77"} Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.531788 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jxmcs" podStartSLOduration=3.859313349 podStartE2EDuration="8.531761854s" podCreationTimestamp="2026-03-14 09:03:37 +0000 UTC" firstStartedPulling="2026-03-14 09:03:40.350297315 +0000 UTC m=+7483.102212499" lastFinishedPulling="2026-03-14 09:03:45.02274582 +0000 UTC m=+7487.774661004" observedRunningTime="2026-03-14 09:03:45.524073365 +0000 UTC m=+7488.275988549" watchObservedRunningTime="2026-03-14 09:03:45.531761854 +0000 UTC m=+7488.283677038" Mar 14 09:03:45 crc 
kubenswrapper[5129]: I0314 09:03:45.570290 5129 scope.go:117] "RemoveContainer" containerID="abf0bc486647a3e0c9cc5ce09ab3f795d1d4ce48e4febfcf65928606a7648f7c" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.581845 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.605084 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.610665 5129 scope.go:117] "RemoveContainer" containerID="74ca91bbb48501e4b6b4826079e994733cfb46399fc93c22247304debbc68e53" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.617550 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:03:45 crc kubenswrapper[5129]: E0314 09:03:45.617970 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52657509-057d-4e07-9e8f-6c59e2b9ebc9" containerName="nova-metadata-metadata" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.617986 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="52657509-057d-4e07-9e8f-6c59e2b9ebc9" containerName="nova-metadata-metadata" Mar 14 09:03:45 crc kubenswrapper[5129]: E0314 09:03:45.618011 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52657509-057d-4e07-9e8f-6c59e2b9ebc9" containerName="nova-metadata-log" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.618017 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="52657509-057d-4e07-9e8f-6c59e2b9ebc9" containerName="nova-metadata-log" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.618186 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="52657509-057d-4e07-9e8f-6c59e2b9ebc9" containerName="nova-metadata-log" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.618201 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="52657509-057d-4e07-9e8f-6c59e2b9ebc9" 
containerName="nova-metadata-metadata" Mar 14 09:03:45 crc kubenswrapper[5129]: E0314 09:03:45.618338 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74ca91bbb48501e4b6b4826079e994733cfb46399fc93c22247304debbc68e53\": container with ID starting with 74ca91bbb48501e4b6b4826079e994733cfb46399fc93c22247304debbc68e53 not found: ID does not exist" containerID="74ca91bbb48501e4b6b4826079e994733cfb46399fc93c22247304debbc68e53" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.618378 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ca91bbb48501e4b6b4826079e994733cfb46399fc93c22247304debbc68e53"} err="failed to get container status \"74ca91bbb48501e4b6b4826079e994733cfb46399fc93c22247304debbc68e53\": rpc error: code = NotFound desc = could not find container \"74ca91bbb48501e4b6b4826079e994733cfb46399fc93c22247304debbc68e53\": container with ID starting with 74ca91bbb48501e4b6b4826079e994733cfb46399fc93c22247304debbc68e53 not found: ID does not exist" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.618406 5129 scope.go:117] "RemoveContainer" containerID="abf0bc486647a3e0c9cc5ce09ab3f795d1d4ce48e4febfcf65928606a7648f7c" Mar 14 09:03:45 crc kubenswrapper[5129]: E0314 09:03:45.618991 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abf0bc486647a3e0c9cc5ce09ab3f795d1d4ce48e4febfcf65928606a7648f7c\": container with ID starting with abf0bc486647a3e0c9cc5ce09ab3f795d1d4ce48e4febfcf65928606a7648f7c not found: ID does not exist" containerID="abf0bc486647a3e0c9cc5ce09ab3f795d1d4ce48e4febfcf65928606a7648f7c" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.619019 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abf0bc486647a3e0c9cc5ce09ab3f795d1d4ce48e4febfcf65928606a7648f7c"} err="failed to get 
container status \"abf0bc486647a3e0c9cc5ce09ab3f795d1d4ce48e4febfcf65928606a7648f7c\": rpc error: code = NotFound desc = could not find container \"abf0bc486647a3e0c9cc5ce09ab3f795d1d4ce48e4febfcf65928606a7648f7c\": container with ID starting with abf0bc486647a3e0c9cc5ce09ab3f795d1d4ce48e4febfcf65928606a7648f7c not found: ID does not exist" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.619036 5129 scope.go:117] "RemoveContainer" containerID="74ca91bbb48501e4b6b4826079e994733cfb46399fc93c22247304debbc68e53" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.619324 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.619860 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ca91bbb48501e4b6b4826079e994733cfb46399fc93c22247304debbc68e53"} err="failed to get container status \"74ca91bbb48501e4b6b4826079e994733cfb46399fc93c22247304debbc68e53\": rpc error: code = NotFound desc = could not find container \"74ca91bbb48501e4b6b4826079e994733cfb46399fc93c22247304debbc68e53\": container with ID starting with 74ca91bbb48501e4b6b4826079e994733cfb46399fc93c22247304debbc68e53 not found: ID does not exist" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.619886 5129 scope.go:117] "RemoveContainer" containerID="abf0bc486647a3e0c9cc5ce09ab3f795d1d4ce48e4febfcf65928606a7648f7c" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.620096 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abf0bc486647a3e0c9cc5ce09ab3f795d1d4ce48e4febfcf65928606a7648f7c"} err="failed to get container status \"abf0bc486647a3e0c9cc5ce09ab3f795d1d4ce48e4febfcf65928606a7648f7c\": rpc error: code = NotFound desc = could not find container \"abf0bc486647a3e0c9cc5ce09ab3f795d1d4ce48e4febfcf65928606a7648f7c\": container with ID starting with 
abf0bc486647a3e0c9cc5ce09ab3f795d1d4ce48e4febfcf65928606a7648f7c not found: ID does not exist" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.622209 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.622402 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.640119 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.717126 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbc39c79-b1f7-4c2a-9641-aba19b71b678-logs\") pod \"nova-metadata-0\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") " pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.717169 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbc39c79-b1f7-4c2a-9641-aba19b71b678-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") " pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.717249 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbc39c79-b1f7-4c2a-9641-aba19b71b678-config-data\") pod \"nova-metadata-0\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") " pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.717269 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dbc39c79-b1f7-4c2a-9641-aba19b71b678-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") " pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.717296 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pspwg\" (UniqueName: \"kubernetes.io/projected/dbc39c79-b1f7-4c2a-9641-aba19b71b678-kube-api-access-pspwg\") pod \"nova-metadata-0\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") " pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.818681 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbc39c79-b1f7-4c2a-9641-aba19b71b678-logs\") pod \"nova-metadata-0\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") " pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.818728 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbc39c79-b1f7-4c2a-9641-aba19b71b678-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") " pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.818803 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbc39c79-b1f7-4c2a-9641-aba19b71b678-config-data\") pod \"nova-metadata-0\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") " pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.818824 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc39c79-b1f7-4c2a-9641-aba19b71b678-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") " 
pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.818851 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pspwg\" (UniqueName: \"kubernetes.io/projected/dbc39c79-b1f7-4c2a-9641-aba19b71b678-kube-api-access-pspwg\") pod \"nova-metadata-0\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") " pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.819282 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbc39c79-b1f7-4c2a-9641-aba19b71b678-logs\") pod \"nova-metadata-0\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") " pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.826107 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbc39c79-b1f7-4c2a-9641-aba19b71b678-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") " pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.831279 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc39c79-b1f7-4c2a-9641-aba19b71b678-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") " pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.833000 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbc39c79-b1f7-4c2a-9641-aba19b71b678-config-data\") pod \"nova-metadata-0\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") " pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.836347 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pspwg\" (UniqueName: 
\"kubernetes.io/projected/dbc39c79-b1f7-4c2a-9641-aba19b71b678-kube-api-access-pspwg\") pod \"nova-metadata-0\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") " pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.951532 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 09:03:45 crc kubenswrapper[5129]: I0314 09:03:45.971841 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rw772"] Mar 14 09:03:46 crc kubenswrapper[5129]: I0314 09:03:46.061362 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52657509-057d-4e07-9e8f-6c59e2b9ebc9" path="/var/lib/kubelet/pods/52657509-057d-4e07-9e8f-6c59e2b9ebc9/volumes" Mar 14 09:03:46 crc kubenswrapper[5129]: I0314 09:03:46.436445 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:03:46 crc kubenswrapper[5129]: I0314 09:03:46.484485 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dbc39c79-b1f7-4c2a-9641-aba19b71b678","Type":"ContainerStarted","Data":"ad0e553ce03c985d270b2670c0491b85c3435ba3c1cdc2ca766ad4ce80787d62"} Mar 14 09:03:46 crc kubenswrapper[5129]: I0314 09:03:46.487480 5129 generic.go:334] "Generic (PLEG): container finished" podID="8d643556-e564-494a-be43-3769b1001f23" containerID="3760d89cafc8f7a3ca247e5e3d98540cb9d5dce2ffbf740a5b708686d4b7ddf1" exitCode=0 Mar 14 09:03:46 crc kubenswrapper[5129]: I0314 09:03:46.487820 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw772" event={"ID":"8d643556-e564-494a-be43-3769b1001f23","Type":"ContainerDied","Data":"3760d89cafc8f7a3ca247e5e3d98540cb9d5dce2ffbf740a5b708686d4b7ddf1"} Mar 14 09:03:46 crc kubenswrapper[5129]: I0314 09:03:46.487889 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw772" 
event={"ID":"8d643556-e564-494a-be43-3769b1001f23","Type":"ContainerStarted","Data":"c2c42c4fc1f3a17c94d0014c2dc3e39a14dc95ada31c06b07608d966b0a69f8d"} Mar 14 09:03:46 crc kubenswrapper[5129]: I0314 09:03:46.961998 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2jltq" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.030787 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2nsmt" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.060262 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-combined-ca-bundle\") pod \"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\" (UID: \"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\") " Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.060363 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-scripts\") pod \"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\" (UID: \"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\") " Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.060388 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-config-data\") pod \"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\" (UID: \"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\") " Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.060431 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm9kx\" (UniqueName: \"kubernetes.io/projected/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-kube-api-access-hm9kx\") pod \"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\" (UID: \"7f7e6d99-1b35-48f2-aad6-6724ffe3629e\") " Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 
09:03:47.067809 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-scripts" (OuterVolumeSpecName: "scripts") pod "7f7e6d99-1b35-48f2-aad6-6724ffe3629e" (UID: "7f7e6d99-1b35-48f2-aad6-6724ffe3629e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.071478 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-kube-api-access-hm9kx" (OuterVolumeSpecName: "kube-api-access-hm9kx") pod "7f7e6d99-1b35-48f2-aad6-6724ffe3629e" (UID: "7f7e6d99-1b35-48f2-aad6-6724ffe3629e"). InnerVolumeSpecName "kube-api-access-hm9kx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.112969 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-config-data" (OuterVolumeSpecName: "config-data") pod "7f7e6d99-1b35-48f2-aad6-6724ffe3629e" (UID: "7f7e6d99-1b35-48f2-aad6-6724ffe3629e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.119723 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f7e6d99-1b35-48f2-aad6-6724ffe3629e" (UID: "7f7e6d99-1b35-48f2-aad6-6724ffe3629e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.161508 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-config-data\") pod \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\" (UID: \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\") " Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.161620 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-combined-ca-bundle\") pod \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\" (UID: \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\") " Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.161665 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldf7h\" (UniqueName: \"kubernetes.io/projected/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-kube-api-access-ldf7h\") pod \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\" (UID: \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\") " Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.161794 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-scripts\") pod \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\" (UID: \"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682\") " Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.162222 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.162240 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 
09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.162251 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.162260 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm9kx\" (UniqueName: \"kubernetes.io/projected/7f7e6d99-1b35-48f2-aad6-6724ffe3629e-kube-api-access-hm9kx\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.168397 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-scripts" (OuterVolumeSpecName: "scripts") pod "c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682" (UID: "c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.168487 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-kube-api-access-ldf7h" (OuterVolumeSpecName: "kube-api-access-ldf7h") pod "c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682" (UID: "c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682"). InnerVolumeSpecName "kube-api-access-ldf7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.189624 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-config-data" (OuterVolumeSpecName: "config-data") pod "c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682" (UID: "c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.192927 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682" (UID: "c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.266867 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.267300 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldf7h\" (UniqueName: \"kubernetes.io/projected/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-kube-api-access-ldf7h\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.267316 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.267328 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.496534 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2nsmt" event={"ID":"c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682","Type":"ContainerDied","Data":"c9df959fc54f4fa70122dc16d3374f872a2a3b4abe818c71dd18c5ada2d2edac"} Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.496580 5129 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="c9df959fc54f4fa70122dc16d3374f872a2a3b4abe818c71dd18c5ada2d2edac" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.496668 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2nsmt" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.498480 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2jltq" event={"ID":"7f7e6d99-1b35-48f2-aad6-6724ffe3629e","Type":"ContainerDied","Data":"ea56365f04e1d169df0c254274c2b2ba9aa8f91b53f7386ad7a26e5547dae248"} Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.498503 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea56365f04e1d169df0c254274c2b2ba9aa8f91b53f7386ad7a26e5547dae248" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.498553 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2jltq" Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.505651 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dbc39c79-b1f7-4c2a-9641-aba19b71b678","Type":"ContainerStarted","Data":"41324406f6570c83075d42f69a3334ad4493789f019670c400a6c767a567c8d9"} Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.505790 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dbc39c79-b1f7-4c2a-9641-aba19b71b678","Type":"ContainerStarted","Data":"0e61e80deb368196d527d85e57f0afd35f54001ecf6d766c09aed3b99700b5e5"} Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.554690 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.554671239 podStartE2EDuration="2.554671239s" podCreationTimestamp="2026-03-14 09:03:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:03:47.550298651 +0000 UTC m=+7490.302213835" watchObservedRunningTime="2026-03-14 09:03:47.554671239 +0000 UTC m=+7490.306586423"
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.622535 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 14 09:03:47 crc kubenswrapper[5129]: E0314 09:03:47.623084 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f7e6d99-1b35-48f2-aad6-6724ffe3629e" containerName="nova-manage"
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.623155 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f7e6d99-1b35-48f2-aad6-6724ffe3629e" containerName="nova-manage"
Mar 14 09:03:47 crc kubenswrapper[5129]: E0314 09:03:47.623219 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682" containerName="nova-cell1-conductor-db-sync"
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.623270 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682" containerName="nova-cell1-conductor-db-sync"
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.623487 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682" containerName="nova-cell1-conductor-db-sync"
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.623555 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f7e6d99-1b35-48f2-aad6-6724ffe3629e" containerName="nova-manage"
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.624212 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.626776 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.644089 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.777621 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v592m\" (UniqueName: \"kubernetes.io/projected/21a58896-9f4d-4489-9153-a3b8afe3cf4d-kube-api-access-v592m\") pod \"nova-cell1-conductor-0\" (UID: \"21a58896-9f4d-4489-9153-a3b8afe3cf4d\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.777736 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a58896-9f4d-4489-9153-a3b8afe3cf4d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"21a58896-9f4d-4489-9153-a3b8afe3cf4d\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.777780 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a58896-9f4d-4489-9153-a3b8afe3cf4d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"21a58896-9f4d-4489-9153-a3b8afe3cf4d\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.880780 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v592m\" (UniqueName: \"kubernetes.io/projected/21a58896-9f4d-4489-9153-a3b8afe3cf4d-kube-api-access-v592m\") pod \"nova-cell1-conductor-0\" (UID: \"21a58896-9f4d-4489-9153-a3b8afe3cf4d\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.881517 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a58896-9f4d-4489-9153-a3b8afe3cf4d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"21a58896-9f4d-4489-9153-a3b8afe3cf4d\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.881573 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a58896-9f4d-4489-9153-a3b8afe3cf4d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"21a58896-9f4d-4489-9153-a3b8afe3cf4d\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.891804 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a58896-9f4d-4489-9153-a3b8afe3cf4d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"21a58896-9f4d-4489-9153-a3b8afe3cf4d\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.896879 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a58896-9f4d-4489-9153-a3b8afe3cf4d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"21a58896-9f4d-4489-9153-a3b8afe3cf4d\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.912694 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v592m\" (UniqueName: \"kubernetes.io/projected/21a58896-9f4d-4489-9153-a3b8afe3cf4d-kube-api-access-v592m\") pod \"nova-cell1-conductor-0\" (UID: \"21a58896-9f4d-4489-9153-a3b8afe3cf4d\") " pod="openstack/nova-cell1-conductor-0"
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.945394 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.945860 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6e8803c1-008f-4327-8ab4-a91fbc9861b2" containerName="nova-scheduler-scheduler" containerID="cri-o://3d77cb7287cb90a6f89f7af4ee76a80f227449b858a496d3c90b2c68b846fb8c" gracePeriod=30
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.963674 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.966418 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="35914c22-9d64-4c0d-a4ff-090f61ede928" containerName="nova-api-log" containerID="cri-o://ef6658eceda084d0cbceae53bc455b861d3374aa04e3ed4480c8ea9c1e427c5b" gracePeriod=30
Mar 14 09:03:47 crc kubenswrapper[5129]: I0314 09:03:47.966666 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="35914c22-9d64-4c0d-a4ff-090f61ede928" containerName="nova-api-api" containerID="cri-o://b8435fbfba0bb6f87271a0b6118bc85eeab38a4c5c44f275a70db45c85e63620" gracePeriod=30
Mar 14 09:03:48 crc kubenswrapper[5129]: I0314 09:03:48.006408 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 14 09:03:48 crc kubenswrapper[5129]: I0314 09:03:48.023052 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 14 09:03:48 crc kubenswrapper[5129]: I0314 09:03:48.516492 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jxmcs"
Mar 14 09:03:48 crc kubenswrapper[5129]: I0314 09:03:48.517116 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jxmcs"
Mar 14 09:03:48 crc kubenswrapper[5129]: I0314 09:03:48.524085 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw772" event={"ID":"8d643556-e564-494a-be43-3769b1001f23","Type":"ContainerStarted","Data":"fe3f28efe9a9c4d987adda9ecf41281aa168b9297cb665015f3efce4931afa38"}
Mar 14 09:03:48 crc kubenswrapper[5129]: I0314 09:03:48.528794 5129 generic.go:334] "Generic (PLEG): container finished" podID="35914c22-9d64-4c0d-a4ff-090f61ede928" containerID="b8435fbfba0bb6f87271a0b6118bc85eeab38a4c5c44f275a70db45c85e63620" exitCode=0
Mar 14 09:03:48 crc kubenswrapper[5129]: I0314 09:03:48.528819 5129 generic.go:334] "Generic (PLEG): container finished" podID="35914c22-9d64-4c0d-a4ff-090f61ede928" containerID="ef6658eceda084d0cbceae53bc455b861d3374aa04e3ed4480c8ea9c1e427c5b" exitCode=143
Mar 14 09:03:48 crc kubenswrapper[5129]: I0314 09:03:48.529731 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35914c22-9d64-4c0d-a4ff-090f61ede928","Type":"ContainerDied","Data":"b8435fbfba0bb6f87271a0b6118bc85eeab38a4c5c44f275a70db45c85e63620"}
Mar 14 09:03:48 crc kubenswrapper[5129]: I0314 09:03:48.529764 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35914c22-9d64-4c0d-a4ff-090f61ede928","Type":"ContainerDied","Data":"ef6658eceda084d0cbceae53bc455b861d3374aa04e3ed4480c8ea9c1e427c5b"}
Mar 14 09:03:48 crc kubenswrapper[5129]: I0314 09:03:48.537858 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 14 09:03:48 crc kubenswrapper[5129]: I0314 09:03:48.537915 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:03:48 crc kubenswrapper[5129]: I0314 09:03:48.550731 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9"
Mar 14 09:03:48 crc kubenswrapper[5129]: I0314 09:03:48.581307 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 14 09:03:48 crc kubenswrapper[5129]: I0314 09:03:48.593693 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jxmcs"
Mar 14 09:03:48 crc kubenswrapper[5129]: I0314 09:03:48.623280 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f86dd9c67-9xx9q"]
Mar 14 09:03:48 crc kubenswrapper[5129]: I0314 09:03:48.623762 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" podUID="f969c069-87cb-4571-85ef-3d88d8f510c5" containerName="dnsmasq-dns" containerID="cri-o://0e0608d16261557b94aa308a8804718cfe2aa72a71bb674e79a5a9e20e88b097" gracePeriod=10
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.188634 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.313545 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35914c22-9d64-4c0d-a4ff-090f61ede928-logs\") pod \"35914c22-9d64-4c0d-a4ff-090f61ede928\" (UID: \"35914c22-9d64-4c0d-a4ff-090f61ede928\") "
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.313744 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35914c22-9d64-4c0d-a4ff-090f61ede928-combined-ca-bundle\") pod \"35914c22-9d64-4c0d-a4ff-090f61ede928\" (UID: \"35914c22-9d64-4c0d-a4ff-090f61ede928\") "
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.313767 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35914c22-9d64-4c0d-a4ff-090f61ede928-config-data\") pod \"35914c22-9d64-4c0d-a4ff-090f61ede928\" (UID: \"35914c22-9d64-4c0d-a4ff-090f61ede928\") "
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.313871 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-494dg\" (UniqueName: \"kubernetes.io/projected/35914c22-9d64-4c0d-a4ff-090f61ede928-kube-api-access-494dg\") pod \"35914c22-9d64-4c0d-a4ff-090f61ede928\" (UID: \"35914c22-9d64-4c0d-a4ff-090f61ede928\") "
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.315405 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35914c22-9d64-4c0d-a4ff-090f61ede928-logs" (OuterVolumeSpecName: "logs") pod "35914c22-9d64-4c0d-a4ff-090f61ede928" (UID: "35914c22-9d64-4c0d-a4ff-090f61ede928"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.319328 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35914c22-9d64-4c0d-a4ff-090f61ede928-kube-api-access-494dg" (OuterVolumeSpecName: "kube-api-access-494dg") pod "35914c22-9d64-4c0d-a4ff-090f61ede928" (UID: "35914c22-9d64-4c0d-a4ff-090f61ede928"). InnerVolumeSpecName "kube-api-access-494dg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.355797 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35914c22-9d64-4c0d-a4ff-090f61ede928-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35914c22-9d64-4c0d-a4ff-090f61ede928" (UID: "35914c22-9d64-4c0d-a4ff-090f61ede928"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.357510 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35914c22-9d64-4c0d-a4ff-090f61ede928-config-data" (OuterVolumeSpecName: "config-data") pod "35914c22-9d64-4c0d-a4ff-090f61ede928" (UID: "35914c22-9d64-4c0d-a4ff-090f61ede928"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.416461 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35914c22-9d64-4c0d-a4ff-090f61ede928-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.416488 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35914c22-9d64-4c0d-a4ff-090f61ede928-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.416499 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-494dg\" (UniqueName: \"kubernetes.io/projected/35914c22-9d64-4c0d-a4ff-090f61ede928-kube-api-access-494dg\") on node \"crc\" DevicePath \"\""
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.416508 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35914c22-9d64-4c0d-a4ff-090f61ede928-logs\") on node \"crc\" DevicePath \"\""
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.540675 5129 generic.go:334] "Generic (PLEG): container finished" podID="f969c069-87cb-4571-85ef-3d88d8f510c5" containerID="0e0608d16261557b94aa308a8804718cfe2aa72a71bb674e79a5a9e20e88b097" exitCode=0
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.540728 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" event={"ID":"f969c069-87cb-4571-85ef-3d88d8f510c5","Type":"ContainerDied","Data":"0e0608d16261557b94aa308a8804718cfe2aa72a71bb674e79a5a9e20e88b097"}
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.540774 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q" event={"ID":"f969c069-87cb-4571-85ef-3d88d8f510c5","Type":"ContainerDied","Data":"81ff6188262e538ca2312ea442a7c693d59c4587620fc45e2fb5660599728a93"}
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.540791 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81ff6188262e538ca2312ea442a7c693d59c4587620fc45e2fb5660599728a93"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.542422 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"21a58896-9f4d-4489-9153-a3b8afe3cf4d","Type":"ContainerStarted","Data":"83c7ffc57bf3bf2e0bf3ecf8b5a8730366d28b41616007b4398844ae1165bbff"}
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.542454 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"21a58896-9f4d-4489-9153-a3b8afe3cf4d","Type":"ContainerStarted","Data":"903bb0fc346f245841ed75c5f73cadcd8387c83124e697ca8f88a5c6754e4418"}
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.543908 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.548447 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35914c22-9d64-4c0d-a4ff-090f61ede928","Type":"ContainerDied","Data":"26ba367af6540fbffc3266b72260b0d3fa6ee3dfa32a53b0e47f65499097b86f"}
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.548495 5129 scope.go:117] "RemoveContainer" containerID="b8435fbfba0bb6f87271a0b6118bc85eeab38a4c5c44f275a70db45c85e63620"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.548818 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.548960 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dbc39c79-b1f7-4c2a-9641-aba19b71b678" containerName="nova-metadata-log" containerID="cri-o://0e61e80deb368196d527d85e57f0afd35f54001ecf6d766c09aed3b99700b5e5" gracePeriod=30
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.549142 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dbc39c79-b1f7-4c2a-9641-aba19b71b678" containerName="nova-metadata-metadata" containerID="cri-o://41324406f6570c83075d42f69a3334ad4493789f019670c400a6c767a567c8d9" gracePeriod=30
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.574724 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.574702226 podStartE2EDuration="2.574702226s" podCreationTimestamp="2026-03-14 09:03:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:03:49.571783427 +0000 UTC m=+7492.323698611" watchObservedRunningTime="2026-03-14 09:03:49.574702226 +0000 UTC m=+7492.326617420"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.638837 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.641056 5129 scope.go:117] "RemoveContainer" containerID="ef6658eceda084d0cbceae53bc455b861d3374aa04e3ed4480c8ea9c1e427c5b"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.665063 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.700998 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.709259 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 14 09:03:49 crc kubenswrapper[5129]: E0314 09:03:49.709953 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35914c22-9d64-4c0d-a4ff-090f61ede928" containerName="nova-api-api"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.709968 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="35914c22-9d64-4c0d-a4ff-090f61ede928" containerName="nova-api-api"
Mar 14 09:03:49 crc kubenswrapper[5129]: E0314 09:03:49.709996 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f969c069-87cb-4571-85ef-3d88d8f510c5" containerName="dnsmasq-dns"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.710002 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f969c069-87cb-4571-85ef-3d88d8f510c5" containerName="dnsmasq-dns"
Mar 14 09:03:49 crc kubenswrapper[5129]: E0314 09:03:49.710020 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35914c22-9d64-4c0d-a4ff-090f61ede928" containerName="nova-api-log"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.710075 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="35914c22-9d64-4c0d-a4ff-090f61ede928" containerName="nova-api-log"
Mar 14 09:03:49 crc kubenswrapper[5129]: E0314 09:03:49.710098 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f969c069-87cb-4571-85ef-3d88d8f510c5" containerName="init"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.710105 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f969c069-87cb-4571-85ef-3d88d8f510c5" containerName="init"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.710306 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="35914c22-9d64-4c0d-a4ff-090f61ede928" containerName="nova-api-log"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.710333 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f969c069-87cb-4571-85ef-3d88d8f510c5" containerName="dnsmasq-dns"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.710349 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="35914c22-9d64-4c0d-a4ff-090f61ede928" containerName="nova-api-api"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.711464 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.715838 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.722120 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22rzp\" (UniqueName: \"kubernetes.io/projected/f969c069-87cb-4571-85ef-3d88d8f510c5-kube-api-access-22rzp\") pod \"f969c069-87cb-4571-85ef-3d88d8f510c5\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") "
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.724946 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-ovsdbserver-nb\") pod \"f969c069-87cb-4571-85ef-3d88d8f510c5\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") "
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.725022 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-config\") pod \"f969c069-87cb-4571-85ef-3d88d8f510c5\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") "
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.725119 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-dns-svc\") pod \"f969c069-87cb-4571-85ef-3d88d8f510c5\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") "
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.725142 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-ovsdbserver-sb\") pod \"f969c069-87cb-4571-85ef-3d88d8f510c5\" (UID: \"f969c069-87cb-4571-85ef-3d88d8f510c5\") "
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.732029 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f969c069-87cb-4571-85ef-3d88d8f510c5-kube-api-access-22rzp" (OuterVolumeSpecName: "kube-api-access-22rzp") pod "f969c069-87cb-4571-85ef-3d88d8f510c5" (UID: "f969c069-87cb-4571-85ef-3d88d8f510c5"). InnerVolumeSpecName "kube-api-access-22rzp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.732137 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.784180 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f969c069-87cb-4571-85ef-3d88d8f510c5" (UID: "f969c069-87cb-4571-85ef-3d88d8f510c5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.792884 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f969c069-87cb-4571-85ef-3d88d8f510c5" (UID: "f969c069-87cb-4571-85ef-3d88d8f510c5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.806155 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-config" (OuterVolumeSpecName: "config") pod "f969c069-87cb-4571-85ef-3d88d8f510c5" (UID: "f969c069-87cb-4571-85ef-3d88d8f510c5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.821320 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f969c069-87cb-4571-85ef-3d88d8f510c5" (UID: "f969c069-87cb-4571-85ef-3d88d8f510c5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.827694 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22976fae-93cc-4036-8595-8d793ff0d00a-logs\") pod \"nova-api-0\" (UID: \"22976fae-93cc-4036-8595-8d793ff0d00a\") " pod="openstack/nova-api-0"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.827760 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z6cp\" (UniqueName: \"kubernetes.io/projected/22976fae-93cc-4036-8595-8d793ff0d00a-kube-api-access-7z6cp\") pod \"nova-api-0\" (UID: \"22976fae-93cc-4036-8595-8d793ff0d00a\") " pod="openstack/nova-api-0"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.828135 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22976fae-93cc-4036-8595-8d793ff0d00a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"22976fae-93cc-4036-8595-8d793ff0d00a\") " pod="openstack/nova-api-0"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.828256 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22976fae-93cc-4036-8595-8d793ff0d00a-config-data\") pod \"nova-api-0\" (UID: \"22976fae-93cc-4036-8595-8d793ff0d00a\") " pod="openstack/nova-api-0"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.828622 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.828657 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-config\") on node \"crc\" DevicePath \"\""
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.828673 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.828686 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f969c069-87cb-4571-85ef-3d88d8f510c5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.828701 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22rzp\" (UniqueName: \"kubernetes.io/projected/f969c069-87cb-4571-85ef-3d88d8f510c5-kube-api-access-22rzp\") on node \"crc\" DevicePath \"\""
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.932548 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22976fae-93cc-4036-8595-8d793ff0d00a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"22976fae-93cc-4036-8595-8d793ff0d00a\") " pod="openstack/nova-api-0"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.932898 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22976fae-93cc-4036-8595-8d793ff0d00a-config-data\") pod \"nova-api-0\" (UID: \"22976fae-93cc-4036-8595-8d793ff0d00a\") " pod="openstack/nova-api-0"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.933015 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22976fae-93cc-4036-8595-8d793ff0d00a-logs\") pod \"nova-api-0\" (UID: \"22976fae-93cc-4036-8595-8d793ff0d00a\") " pod="openstack/nova-api-0"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.933057 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z6cp\" (UniqueName: \"kubernetes.io/projected/22976fae-93cc-4036-8595-8d793ff0d00a-kube-api-access-7z6cp\") pod \"nova-api-0\" (UID: \"22976fae-93cc-4036-8595-8d793ff0d00a\") " pod="openstack/nova-api-0"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.934891 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22976fae-93cc-4036-8595-8d793ff0d00a-logs\") pod \"nova-api-0\" (UID: \"22976fae-93cc-4036-8595-8d793ff0d00a\") " pod="openstack/nova-api-0"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.938903 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22976fae-93cc-4036-8595-8d793ff0d00a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"22976fae-93cc-4036-8595-8d793ff0d00a\") " pod="openstack/nova-api-0"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.938944 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22976fae-93cc-4036-8595-8d793ff0d00a-config-data\") pod \"nova-api-0\" (UID: \"22976fae-93cc-4036-8595-8d793ff0d00a\") " pod="openstack/nova-api-0"
Mar 14 09:03:49 crc kubenswrapper[5129]: I0314 09:03:49.950169 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z6cp\" (UniqueName: \"kubernetes.io/projected/22976fae-93cc-4036-8595-8d793ff0d00a-kube-api-access-7z6cp\") pod \"nova-api-0\" (UID: \"22976fae-93cc-4036-8595-8d793ff0d00a\") " pod="openstack/nova-api-0"
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.047470 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35914c22-9d64-4c0d-a4ff-090f61ede928" path="/var/lib/kubelet/pods/35914c22-9d64-4c0d-a4ff-090f61ede928/volumes"
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.084993 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.528802 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.561122 5129 generic.go:334] "Generic (PLEG): container finished" podID="8d643556-e564-494a-be43-3769b1001f23" containerID="fe3f28efe9a9c4d987adda9ecf41281aa168b9297cb665015f3efce4931afa38" exitCode=0
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.561176 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw772" event={"ID":"8d643556-e564-494a-be43-3769b1001f23","Type":"ContainerDied","Data":"fe3f28efe9a9c4d987adda9ecf41281aa168b9297cb665015f3efce4931afa38"}
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.568788 5129 generic.go:334] "Generic (PLEG): container finished" podID="dbc39c79-b1f7-4c2a-9641-aba19b71b678" containerID="41324406f6570c83075d42f69a3334ad4493789f019670c400a6c767a567c8d9" exitCode=0
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.568813 5129 generic.go:334] "Generic (PLEG): container finished" podID="dbc39c79-b1f7-4c2a-9641-aba19b71b678" containerID="0e61e80deb368196d527d85e57f0afd35f54001ecf6d766c09aed3b99700b5e5" exitCode=143
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.568865 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dbc39c79-b1f7-4c2a-9641-aba19b71b678","Type":"ContainerDied","Data":"41324406f6570c83075d42f69a3334ad4493789f019670c400a6c767a567c8d9"}
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.568894 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dbc39c79-b1f7-4c2a-9641-aba19b71b678","Type":"ContainerDied","Data":"0e61e80deb368196d527d85e57f0afd35f54001ecf6d766c09aed3b99700b5e5"}
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.571387 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22976fae-93cc-4036-8595-8d793ff0d00a","Type":"ContainerStarted","Data":"8548046d2ce8156bc5d9d916333d0092a73689f444873a6de50e4d11bfe227cc"}
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.571482 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f86dd9c67-9xx9q"
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.688100 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.709212 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f86dd9c67-9xx9q"]
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.717947 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f86dd9c67-9xx9q"]
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.745547 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc39c79-b1f7-4c2a-9641-aba19b71b678-combined-ca-bundle\") pod \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") "
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.745648 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pspwg\" (UniqueName: \"kubernetes.io/projected/dbc39c79-b1f7-4c2a-9641-aba19b71b678-kube-api-access-pspwg\") pod \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") "
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.745711 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbc39c79-b1f7-4c2a-9641-aba19b71b678-logs\") pod \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") "
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.745800 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbc39c79-b1f7-4c2a-9641-aba19b71b678-config-data\") pod \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") "
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.745854 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbc39c79-b1f7-4c2a-9641-aba19b71b678-nova-metadata-tls-certs\") pod \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\" (UID: \"dbc39c79-b1f7-4c2a-9641-aba19b71b678\") "
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.746317 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbc39c79-b1f7-4c2a-9641-aba19b71b678-logs" (OuterVolumeSpecName: "logs") pod "dbc39c79-b1f7-4c2a-9641-aba19b71b678" (UID: "dbc39c79-b1f7-4c2a-9641-aba19b71b678"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.765292 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbc39c79-b1f7-4c2a-9641-aba19b71b678-kube-api-access-pspwg" (OuterVolumeSpecName: "kube-api-access-pspwg") pod "dbc39c79-b1f7-4c2a-9641-aba19b71b678" (UID: "dbc39c79-b1f7-4c2a-9641-aba19b71b678"). InnerVolumeSpecName "kube-api-access-pspwg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.788662 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc39c79-b1f7-4c2a-9641-aba19b71b678-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbc39c79-b1f7-4c2a-9641-aba19b71b678" (UID: "dbc39c79-b1f7-4c2a-9641-aba19b71b678"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.789616 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc39c79-b1f7-4c2a-9641-aba19b71b678-config-data" (OuterVolumeSpecName: "config-data") pod "dbc39c79-b1f7-4c2a-9641-aba19b71b678" (UID: "dbc39c79-b1f7-4c2a-9641-aba19b71b678"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.826544 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc39c79-b1f7-4c2a-9641-aba19b71b678-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "dbc39c79-b1f7-4c2a-9641-aba19b71b678" (UID: "dbc39c79-b1f7-4c2a-9641-aba19b71b678"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.847804 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbc39c79-b1f7-4c2a-9641-aba19b71b678-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.847853 5129 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbc39c79-b1f7-4c2a-9641-aba19b71b678-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.847866 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc39c79-b1f7-4c2a-9641-aba19b71b678-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.847879 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pspwg\" (UniqueName:
\"kubernetes.io/projected/dbc39c79-b1f7-4c2a-9641-aba19b71b678-kube-api-access-pspwg\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:50 crc kubenswrapper[5129]: I0314 09:03:50.847890 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbc39c79-b1f7-4c2a-9641-aba19b71b678-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.592949 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22976fae-93cc-4036-8595-8d793ff0d00a","Type":"ContainerStarted","Data":"588221254b003d105919c7f82445142cafd963c150049d826976ece79e42ba54"} Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.595350 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dbc39c79-b1f7-4c2a-9641-aba19b71b678","Type":"ContainerDied","Data":"ad0e553ce03c985d270b2670c0491b85c3435ba3c1cdc2ca766ad4ce80787d62"} Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.595386 5129 scope.go:117] "RemoveContainer" containerID="41324406f6570c83075d42f69a3334ad4493789f019670c400a6c767a567c8d9" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.595409 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.632054 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.644445 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.657317 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:03:51 crc kubenswrapper[5129]: E0314 09:03:51.657849 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc39c79-b1f7-4c2a-9641-aba19b71b678" containerName="nova-metadata-log" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.657878 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc39c79-b1f7-4c2a-9641-aba19b71b678" containerName="nova-metadata-log" Mar 14 09:03:51 crc kubenswrapper[5129]: E0314 09:03:51.657898 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc39c79-b1f7-4c2a-9641-aba19b71b678" containerName="nova-metadata-metadata" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.657905 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc39c79-b1f7-4c2a-9641-aba19b71b678" containerName="nova-metadata-metadata" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.658106 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbc39c79-b1f7-4c2a-9641-aba19b71b678" containerName="nova-metadata-log" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.658139 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbc39c79-b1f7-4c2a-9641-aba19b71b678" containerName="nova-metadata-metadata" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.659096 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.661300 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.663462 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.665502 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.763527 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b61cf66-0088-4e61-b1a7-54816b9db5c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " pod="openstack/nova-metadata-0" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.763791 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b61cf66-0088-4e61-b1a7-54816b9db5c8-config-data\") pod \"nova-metadata-0\" (UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " pod="openstack/nova-metadata-0" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.763832 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf4pd\" (UniqueName: \"kubernetes.io/projected/5b61cf66-0088-4e61-b1a7-54816b9db5c8-kube-api-access-pf4pd\") pod \"nova-metadata-0\" (UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " pod="openstack/nova-metadata-0" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.763860 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b61cf66-0088-4e61-b1a7-54816b9db5c8-logs\") pod \"nova-metadata-0\" 
(UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " pod="openstack/nova-metadata-0" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.764117 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b61cf66-0088-4e61-b1a7-54816b9db5c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " pod="openstack/nova-metadata-0" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.866174 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b61cf66-0088-4e61-b1a7-54816b9db5c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " pod="openstack/nova-metadata-0" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.866286 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b61cf66-0088-4e61-b1a7-54816b9db5c8-config-data\") pod \"nova-metadata-0\" (UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " pod="openstack/nova-metadata-0" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.866339 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf4pd\" (UniqueName: \"kubernetes.io/projected/5b61cf66-0088-4e61-b1a7-54816b9db5c8-kube-api-access-pf4pd\") pod \"nova-metadata-0\" (UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " pod="openstack/nova-metadata-0" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.866374 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b61cf66-0088-4e61-b1a7-54816b9db5c8-logs\") pod \"nova-metadata-0\" (UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " pod="openstack/nova-metadata-0" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.866457 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b61cf66-0088-4e61-b1a7-54816b9db5c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " pod="openstack/nova-metadata-0" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.868557 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b61cf66-0088-4e61-b1a7-54816b9db5c8-logs\") pod \"nova-metadata-0\" (UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " pod="openstack/nova-metadata-0" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.870153 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b61cf66-0088-4e61-b1a7-54816b9db5c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " pod="openstack/nova-metadata-0" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.870307 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b61cf66-0088-4e61-b1a7-54816b9db5c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " pod="openstack/nova-metadata-0" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.871489 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b61cf66-0088-4e61-b1a7-54816b9db5c8-config-data\") pod \"nova-metadata-0\" (UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " pod="openstack/nova-metadata-0" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.885245 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf4pd\" (UniqueName: \"kubernetes.io/projected/5b61cf66-0088-4e61-b1a7-54816b9db5c8-kube-api-access-pf4pd\") pod \"nova-metadata-0\" 
(UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " pod="openstack/nova-metadata-0" Mar 14 09:03:51 crc kubenswrapper[5129]: I0314 09:03:51.994110 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 09:03:52 crc kubenswrapper[5129]: I0314 09:03:52.024260 5129 scope.go:117] "RemoveContainer" containerID="0e61e80deb368196d527d85e57f0afd35f54001ecf6d766c09aed3b99700b5e5" Mar 14 09:03:52 crc kubenswrapper[5129]: I0314 09:03:52.046638 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbc39c79-b1f7-4c2a-9641-aba19b71b678" path="/var/lib/kubelet/pods/dbc39c79-b1f7-4c2a-9641-aba19b71b678/volumes" Mar 14 09:03:52 crc kubenswrapper[5129]: I0314 09:03:52.047218 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f969c069-87cb-4571-85ef-3d88d8f510c5" path="/var/lib/kubelet/pods/f969c069-87cb-4571-85ef-3d88d8f510c5/volumes" Mar 14 09:03:52 crc kubenswrapper[5129]: I0314 09:03:52.475086 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:03:52 crc kubenswrapper[5129]: W0314 09:03:52.479097 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b61cf66_0088_4e61_b1a7_54816b9db5c8.slice/crio-0a2893d314ef2a12c9f1517d632de4def7092e1b4dff0ab70e9e58c210616a9a WatchSource:0}: Error finding container 0a2893d314ef2a12c9f1517d632de4def7092e1b4dff0ab70e9e58c210616a9a: Status 404 returned error can't find the container with id 0a2893d314ef2a12c9f1517d632de4def7092e1b4dff0ab70e9e58c210616a9a Mar 14 09:03:52 crc kubenswrapper[5129]: I0314 09:03:52.605913 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b61cf66-0088-4e61-b1a7-54816b9db5c8","Type":"ContainerStarted","Data":"0a2893d314ef2a12c9f1517d632de4def7092e1b4dff0ab70e9e58c210616a9a"} Mar 14 09:03:52 crc kubenswrapper[5129]: I0314 09:03:52.608132 5129 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22976fae-93cc-4036-8595-8d793ff0d00a","Type":"ContainerStarted","Data":"e80bb1dda50866d20650a784c76bff1b62612577687259889f9f0c1ec58ffdd8"} Mar 14 09:03:52 crc kubenswrapper[5129]: I0314 09:03:52.612287 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw772" event={"ID":"8d643556-e564-494a-be43-3769b1001f23","Type":"ContainerStarted","Data":"281ae190f0d48cb5dc7055c1cd9a4fb183af5ebff19bee776205b9713e266100"} Mar 14 09:03:52 crc kubenswrapper[5129]: I0314 09:03:52.629254 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.6292297270000002 podStartE2EDuration="3.629229727s" podCreationTimestamp="2026-03-14 09:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:03:52.626272276 +0000 UTC m=+7495.378187480" watchObservedRunningTime="2026-03-14 09:03:52.629229727 +0000 UTC m=+7495.381144911" Mar 14 09:03:52 crc kubenswrapper[5129]: I0314 09:03:52.646794 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rw772" podStartSLOduration=3.112297555 podStartE2EDuration="8.646754513s" podCreationTimestamp="2026-03-14 09:03:44 +0000 UTC" firstStartedPulling="2026-03-14 09:03:46.489796172 +0000 UTC m=+7489.241711346" lastFinishedPulling="2026-03-14 09:03:52.02425312 +0000 UTC m=+7494.776168304" observedRunningTime="2026-03-14 09:03:52.644879062 +0000 UTC m=+7495.396794266" watchObservedRunningTime="2026-03-14 09:03:52.646754513 +0000 UTC m=+7495.398669707" Mar 14 09:03:53 crc kubenswrapper[5129]: I0314 09:03:53.042975 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 14 09:03:53 crc kubenswrapper[5129]: I0314 09:03:53.624551 5129 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b61cf66-0088-4e61-b1a7-54816b9db5c8","Type":"ContainerStarted","Data":"5953e78670d22d327077632a3d8f1a6ff552120883c447e1f403ab185d4c5f94"} Mar 14 09:03:53 crc kubenswrapper[5129]: I0314 09:03:53.624961 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b61cf66-0088-4e61-b1a7-54816b9db5c8","Type":"ContainerStarted","Data":"7c6a3836a45c8831f6e0a63a19a0beba8404ea9e9b282a30a3a39fd7feb8fa98"} Mar 14 09:03:55 crc kubenswrapper[5129]: I0314 09:03:55.286274 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rw772" Mar 14 09:03:55 crc kubenswrapper[5129]: I0314 09:03:55.286352 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rw772" Mar 14 09:03:56 crc kubenswrapper[5129]: I0314 09:03:56.339350 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rw772" podUID="8d643556-e564-494a-be43-3769b1001f23" containerName="registry-server" probeResult="failure" output=< Mar 14 09:03:56 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 09:03:56 crc kubenswrapper[5129]: > Mar 14 09:03:58 crc kubenswrapper[5129]: I0314 09:03:58.564493 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jxmcs" Mar 14 09:03:58 crc kubenswrapper[5129]: I0314 09:03:58.581042 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=7.58102481 podStartE2EDuration="7.58102481s" podCreationTimestamp="2026-03-14 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:03:53.645753772 +0000 UTC m=+7496.397668966" 
watchObservedRunningTime="2026-03-14 09:03:58.58102481 +0000 UTC m=+7501.332939994" Mar 14 09:03:58 crc kubenswrapper[5129]: I0314 09:03:58.614104 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jxmcs"] Mar 14 09:03:58 crc kubenswrapper[5129]: I0314 09:03:58.669883 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jxmcs" podUID="ec6278c3-c4c9-4131-8805-457c57e9aea6" containerName="registry-server" containerID="cri-o://78eea08aba2100afa5e44b6c6ff177957c8878bcacf063208fed23b05ec22f77" gracePeriod=2 Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.210918 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jxmcs" Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.311784 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec6278c3-c4c9-4131-8805-457c57e9aea6-catalog-content\") pod \"ec6278c3-c4c9-4131-8805-457c57e9aea6\" (UID: \"ec6278c3-c4c9-4131-8805-457c57e9aea6\") " Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.312214 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjjcv\" (UniqueName: \"kubernetes.io/projected/ec6278c3-c4c9-4131-8805-457c57e9aea6-kube-api-access-tjjcv\") pod \"ec6278c3-c4c9-4131-8805-457c57e9aea6\" (UID: \"ec6278c3-c4c9-4131-8805-457c57e9aea6\") " Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.312285 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec6278c3-c4c9-4131-8805-457c57e9aea6-utilities\") pod \"ec6278c3-c4c9-4131-8805-457c57e9aea6\" (UID: \"ec6278c3-c4c9-4131-8805-457c57e9aea6\") " Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.312973 5129 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec6278c3-c4c9-4131-8805-457c57e9aea6-utilities" (OuterVolumeSpecName: "utilities") pod "ec6278c3-c4c9-4131-8805-457c57e9aea6" (UID: "ec6278c3-c4c9-4131-8805-457c57e9aea6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.319760 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec6278c3-c4c9-4131-8805-457c57e9aea6-kube-api-access-tjjcv" (OuterVolumeSpecName: "kube-api-access-tjjcv") pod "ec6278c3-c4c9-4131-8805-457c57e9aea6" (UID: "ec6278c3-c4c9-4131-8805-457c57e9aea6"). InnerVolumeSpecName "kube-api-access-tjjcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.358521 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec6278c3-c4c9-4131-8805-457c57e9aea6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec6278c3-c4c9-4131-8805-457c57e9aea6" (UID: "ec6278c3-c4c9-4131-8805-457c57e9aea6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.414233 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjjcv\" (UniqueName: \"kubernetes.io/projected/ec6278c3-c4c9-4131-8805-457c57e9aea6-kube-api-access-tjjcv\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.414443 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec6278c3-c4c9-4131-8805-457c57e9aea6-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.414462 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec6278c3-c4c9-4131-8805-457c57e9aea6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.682742 5129 generic.go:334] "Generic (PLEG): container finished" podID="ec6278c3-c4c9-4131-8805-457c57e9aea6" containerID="78eea08aba2100afa5e44b6c6ff177957c8878bcacf063208fed23b05ec22f77" exitCode=0 Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.682832 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxmcs" event={"ID":"ec6278c3-c4c9-4131-8805-457c57e9aea6","Type":"ContainerDied","Data":"78eea08aba2100afa5e44b6c6ff177957c8878bcacf063208fed23b05ec22f77"} Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.682837 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jxmcs" Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.682864 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxmcs" event={"ID":"ec6278c3-c4c9-4131-8805-457c57e9aea6","Type":"ContainerDied","Data":"79dbe62d064d4b66db7dc3df32878460cef161e55834c3a54ecc617d6ec27b52"} Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.682885 5129 scope.go:117] "RemoveContainer" containerID="78eea08aba2100afa5e44b6c6ff177957c8878bcacf063208fed23b05ec22f77" Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.729078 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jxmcs"] Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.735489 5129 scope.go:117] "RemoveContainer" containerID="10a3e1f3a4ea2ee7c0204ea0975930a5e6df1c5a4a82c64570fc47d5e4ee0b61" Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.736886 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jxmcs"] Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.757619 5129 scope.go:117] "RemoveContainer" containerID="fffd483c0061adb7bb16cc1044c37dbc065df794ad8960af6cf39a2cc47bf0da" Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.799403 5129 scope.go:117] "RemoveContainer" containerID="78eea08aba2100afa5e44b6c6ff177957c8878bcacf063208fed23b05ec22f77" Mar 14 09:03:59 crc kubenswrapper[5129]: E0314 09:03:59.799958 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78eea08aba2100afa5e44b6c6ff177957c8878bcacf063208fed23b05ec22f77\": container with ID starting with 78eea08aba2100afa5e44b6c6ff177957c8878bcacf063208fed23b05ec22f77 not found: ID does not exist" containerID="78eea08aba2100afa5e44b6c6ff177957c8878bcacf063208fed23b05ec22f77" Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.799996 5129 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78eea08aba2100afa5e44b6c6ff177957c8878bcacf063208fed23b05ec22f77"} err="failed to get container status \"78eea08aba2100afa5e44b6c6ff177957c8878bcacf063208fed23b05ec22f77\": rpc error: code = NotFound desc = could not find container \"78eea08aba2100afa5e44b6c6ff177957c8878bcacf063208fed23b05ec22f77\": container with ID starting with 78eea08aba2100afa5e44b6c6ff177957c8878bcacf063208fed23b05ec22f77 not found: ID does not exist" Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.800016 5129 scope.go:117] "RemoveContainer" containerID="10a3e1f3a4ea2ee7c0204ea0975930a5e6df1c5a4a82c64570fc47d5e4ee0b61" Mar 14 09:03:59 crc kubenswrapper[5129]: E0314 09:03:59.800409 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10a3e1f3a4ea2ee7c0204ea0975930a5e6df1c5a4a82c64570fc47d5e4ee0b61\": container with ID starting with 10a3e1f3a4ea2ee7c0204ea0975930a5e6df1c5a4a82c64570fc47d5e4ee0b61 not found: ID does not exist" containerID="10a3e1f3a4ea2ee7c0204ea0975930a5e6df1c5a4a82c64570fc47d5e4ee0b61" Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.800447 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a3e1f3a4ea2ee7c0204ea0975930a5e6df1c5a4a82c64570fc47d5e4ee0b61"} err="failed to get container status \"10a3e1f3a4ea2ee7c0204ea0975930a5e6df1c5a4a82c64570fc47d5e4ee0b61\": rpc error: code = NotFound desc = could not find container \"10a3e1f3a4ea2ee7c0204ea0975930a5e6df1c5a4a82c64570fc47d5e4ee0b61\": container with ID starting with 10a3e1f3a4ea2ee7c0204ea0975930a5e6df1c5a4a82c64570fc47d5e4ee0b61 not found: ID does not exist" Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.800464 5129 scope.go:117] "RemoveContainer" containerID="fffd483c0061adb7bb16cc1044c37dbc065df794ad8960af6cf39a2cc47bf0da" Mar 14 09:03:59 crc kubenswrapper[5129]: E0314 
09:03:59.800725 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fffd483c0061adb7bb16cc1044c37dbc065df794ad8960af6cf39a2cc47bf0da\": container with ID starting with fffd483c0061adb7bb16cc1044c37dbc065df794ad8960af6cf39a2cc47bf0da not found: ID does not exist" containerID="fffd483c0061adb7bb16cc1044c37dbc065df794ad8960af6cf39a2cc47bf0da" Mar 14 09:03:59 crc kubenswrapper[5129]: I0314 09:03:59.800759 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fffd483c0061adb7bb16cc1044c37dbc065df794ad8960af6cf39a2cc47bf0da"} err="failed to get container status \"fffd483c0061adb7bb16cc1044c37dbc065df794ad8960af6cf39a2cc47bf0da\": rpc error: code = NotFound desc = could not find container \"fffd483c0061adb7bb16cc1044c37dbc065df794ad8960af6cf39a2cc47bf0da\": container with ID starting with fffd483c0061adb7bb16cc1044c37dbc065df794ad8960af6cf39a2cc47bf0da not found: ID does not exist" Mar 14 09:04:00 crc kubenswrapper[5129]: I0314 09:04:00.051101 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec6278c3-c4c9-4131-8805-457c57e9aea6" path="/var/lib/kubelet/pods/ec6278c3-c4c9-4131-8805-457c57e9aea6/volumes" Mar 14 09:04:00 crc kubenswrapper[5129]: I0314 09:04:00.086303 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 09:04:00 crc kubenswrapper[5129]: I0314 09:04:00.086363 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 09:04:00 crc kubenswrapper[5129]: I0314 09:04:00.132930 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557984-t2fgw"] Mar 14 09:04:00 crc kubenswrapper[5129]: E0314 09:04:00.133350 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec6278c3-c4c9-4131-8805-457c57e9aea6" containerName="registry-server" Mar 14 09:04:00 crc 
kubenswrapper[5129]: I0314 09:04:00.133365 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec6278c3-c4c9-4131-8805-457c57e9aea6" containerName="registry-server"
Mar 14 09:04:00 crc kubenswrapper[5129]: E0314 09:04:00.133397 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec6278c3-c4c9-4131-8805-457c57e9aea6" containerName="extract-content"
Mar 14 09:04:00 crc kubenswrapper[5129]: I0314 09:04:00.133404 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec6278c3-c4c9-4131-8805-457c57e9aea6" containerName="extract-content"
Mar 14 09:04:00 crc kubenswrapper[5129]: E0314 09:04:00.133413 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec6278c3-c4c9-4131-8805-457c57e9aea6" containerName="extract-utilities"
Mar 14 09:04:00 crc kubenswrapper[5129]: I0314 09:04:00.133419 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec6278c3-c4c9-4131-8805-457c57e9aea6" containerName="extract-utilities"
Mar 14 09:04:00 crc kubenswrapper[5129]: I0314 09:04:00.133591 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec6278c3-c4c9-4131-8805-457c57e9aea6" containerName="registry-server"
Mar 14 09:04:00 crc kubenswrapper[5129]: I0314 09:04:00.134208 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557984-t2fgw"
Mar 14 09:04:00 crc kubenswrapper[5129]: I0314 09:04:00.136597 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 09:04:00 crc kubenswrapper[5129]: I0314 09:04:00.136850 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf"
Mar 14 09:04:00 crc kubenswrapper[5129]: I0314 09:04:00.142269 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 09:04:00 crc kubenswrapper[5129]: I0314 09:04:00.143770 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557984-t2fgw"]
Mar 14 09:04:00 crc kubenswrapper[5129]: I0314 09:04:00.241812 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbt9r\" (UniqueName: \"kubernetes.io/projected/3376bd41-1051-449e-a7cf-1e3500a04bcf-kube-api-access-pbt9r\") pod \"auto-csr-approver-29557984-t2fgw\" (UID: \"3376bd41-1051-449e-a7cf-1e3500a04bcf\") " pod="openshift-infra/auto-csr-approver-29557984-t2fgw"
Mar 14 09:04:00 crc kubenswrapper[5129]: I0314 09:04:00.343149 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbt9r\" (UniqueName: \"kubernetes.io/projected/3376bd41-1051-449e-a7cf-1e3500a04bcf-kube-api-access-pbt9r\") pod \"auto-csr-approver-29557984-t2fgw\" (UID: \"3376bd41-1051-449e-a7cf-1e3500a04bcf\") " pod="openshift-infra/auto-csr-approver-29557984-t2fgw"
Mar 14 09:04:00 crc kubenswrapper[5129]: I0314 09:04:00.365153 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbt9r\" (UniqueName: \"kubernetes.io/projected/3376bd41-1051-449e-a7cf-1e3500a04bcf-kube-api-access-pbt9r\") pod \"auto-csr-approver-29557984-t2fgw\" (UID: \"3376bd41-1051-449e-a7cf-1e3500a04bcf\") " pod="openshift-infra/auto-csr-approver-29557984-t2fgw"
Mar 14 09:04:00 crc kubenswrapper[5129]: I0314 09:04:00.458812 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557984-t2fgw"
Mar 14 09:04:00 crc kubenswrapper[5129]: I0314 09:04:00.895952 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557984-t2fgw"]
Mar 14 09:04:01 crc kubenswrapper[5129]: I0314 09:04:01.171995 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="22976fae-93cc-4036-8595-8d793ff0d00a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.144:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 09:04:01 crc kubenswrapper[5129]: I0314 09:04:01.172116 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="22976fae-93cc-4036-8595-8d793ff0d00a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.144:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 09:04:01 crc kubenswrapper[5129]: I0314 09:04:01.707652 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557984-t2fgw" event={"ID":"3376bd41-1051-449e-a7cf-1e3500a04bcf","Type":"ContainerStarted","Data":"149ec17e1038bf1220c34a124b59fcf7e89da38ce372dc84468b37fb4c959547"}
Mar 14 09:04:01 crc kubenswrapper[5129]: I0314 09:04:01.995037 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 14 09:04:01 crc kubenswrapper[5129]: I0314 09:04:01.996584 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 14 09:04:02 crc kubenswrapper[5129]: I0314 09:04:02.723185 5129 generic.go:334] "Generic (PLEG): container finished" podID="3376bd41-1051-449e-a7cf-1e3500a04bcf" containerID="5bd98513322662e1f7e7fc825437c846359baf0cb15c38a39da6ba98b9e0960b" exitCode=0
Mar 14 09:04:02 crc kubenswrapper[5129]: I0314 09:04:02.723556 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557984-t2fgw" event={"ID":"3376bd41-1051-449e-a7cf-1e3500a04bcf","Type":"ContainerDied","Data":"5bd98513322662e1f7e7fc825437c846359baf0cb15c38a39da6ba98b9e0960b"}
Mar 14 09:04:03 crc kubenswrapper[5129]: I0314 09:04:03.006708 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.145:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 09:04:03 crc kubenswrapper[5129]: I0314 09:04:03.006714 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.145:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 09:04:04 crc kubenswrapper[5129]: I0314 09:04:04.212659 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557984-t2fgw"
Mar 14 09:04:04 crc kubenswrapper[5129]: I0314 09:04:04.318808 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbt9r\" (UniqueName: \"kubernetes.io/projected/3376bd41-1051-449e-a7cf-1e3500a04bcf-kube-api-access-pbt9r\") pod \"3376bd41-1051-449e-a7cf-1e3500a04bcf\" (UID: \"3376bd41-1051-449e-a7cf-1e3500a04bcf\") "
Mar 14 09:04:04 crc kubenswrapper[5129]: I0314 09:04:04.327700 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3376bd41-1051-449e-a7cf-1e3500a04bcf-kube-api-access-pbt9r" (OuterVolumeSpecName: "kube-api-access-pbt9r") pod "3376bd41-1051-449e-a7cf-1e3500a04bcf" (UID: "3376bd41-1051-449e-a7cf-1e3500a04bcf"). InnerVolumeSpecName "kube-api-access-pbt9r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:04:04 crc kubenswrapper[5129]: I0314 09:04:04.422095 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbt9r\" (UniqueName: \"kubernetes.io/projected/3376bd41-1051-449e-a7cf-1e3500a04bcf-kube-api-access-pbt9r\") on node \"crc\" DevicePath \"\""
Mar 14 09:04:04 crc kubenswrapper[5129]: I0314 09:04:04.744046 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557984-t2fgw" event={"ID":"3376bd41-1051-449e-a7cf-1e3500a04bcf","Type":"ContainerDied","Data":"149ec17e1038bf1220c34a124b59fcf7e89da38ce372dc84468b37fb4c959547"}
Mar 14 09:04:04 crc kubenswrapper[5129]: I0314 09:04:04.744088 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="149ec17e1038bf1220c34a124b59fcf7e89da38ce372dc84468b37fb4c959547"
Mar 14 09:04:04 crc kubenswrapper[5129]: I0314 09:04:04.744129 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557984-t2fgw"
Mar 14 09:04:05 crc kubenswrapper[5129]: I0314 09:04:05.285633 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557978-7j4sn"]
Mar 14 09:04:05 crc kubenswrapper[5129]: I0314 09:04:05.294736 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557978-7j4sn"]
Mar 14 09:04:06 crc kubenswrapper[5129]: I0314 09:04:06.051406 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="116977c9-6bcd-4783-9885-f1b73bda04e6" path="/var/lib/kubelet/pods/116977c9-6bcd-4783-9885-f1b73bda04e6/volumes"
Mar 14 09:04:06 crc kubenswrapper[5129]: I0314 09:04:06.348179 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rw772" podUID="8d643556-e564-494a-be43-3769b1001f23" containerName="registry-server" probeResult="failure" output=<
Mar 14 09:04:06 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s
Mar 14 09:04:06 crc kubenswrapper[5129]: >
Mar 14 09:04:07 crc kubenswrapper[5129]: I0314 09:04:07.045370 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-77ds8"]
Mar 14 09:04:07 crc kubenswrapper[5129]: I0314 09:04:07.058294 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4e64-account-create-update-ztnfr"]
Mar 14 09:04:07 crc kubenswrapper[5129]: I0314 09:04:07.069799 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-77ds8"]
Mar 14 09:04:07 crc kubenswrapper[5129]: I0314 09:04:07.083013 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4e64-account-create-update-ztnfr"]
Mar 14 09:04:08 crc kubenswrapper[5129]: I0314 09:04:08.049119 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b" path="/var/lib/kubelet/pods/8cab510a-bf82-4dcd-bf7e-b4e0ccf3fd5b/volumes"
Mar 14 09:04:08 crc kubenswrapper[5129]: I0314 09:04:08.050415 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e1ef377-9a00-46b1-b508-be733492d498" path="/var/lib/kubelet/pods/8e1ef377-9a00-46b1-b508-be733492d498/volumes"
Mar 14 09:04:08 crc kubenswrapper[5129]: I0314 09:04:08.086392 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 14 09:04:08 crc kubenswrapper[5129]: I0314 09:04:08.087114 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 14 09:04:09 crc kubenswrapper[5129]: I0314 09:04:09.994565 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 14 09:04:09 crc kubenswrapper[5129]: I0314 09:04:09.994679 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 14 09:04:11 crc kubenswrapper[5129]: I0314 09:04:11.128878 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="22976fae-93cc-4036-8595-8d793ff0d00a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.144:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 09:04:11 crc kubenswrapper[5129]: I0314 09:04:11.169822 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="22976fae-93cc-4036-8595-8d793ff0d00a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.144:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 09:04:13 crc kubenswrapper[5129]: I0314 09:04:13.003775 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.145:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 09:04:13 crc kubenswrapper[5129]: I0314 09:04:13.003827 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.145:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 14 09:04:14 crc kubenswrapper[5129]: I0314 09:04:14.836744 5129 generic.go:334] "Generic (PLEG): container finished" podID="bade4404-af66-46e3-8b88-401b7347b40e" containerID="dddcc0f8b88b14babbdff4091b07ebbdfcaee288e79a846ff02b5275acfaace2" exitCode=137
Mar 14 09:04:14 crc kubenswrapper[5129]: I0314 09:04:14.836776 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bade4404-af66-46e3-8b88-401b7347b40e","Type":"ContainerDied","Data":"dddcc0f8b88b14babbdff4091b07ebbdfcaee288e79a846ff02b5275acfaace2"}
Mar 14 09:04:14 crc kubenswrapper[5129]: I0314 09:04:14.925520 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.017794 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bade4404-af66-46e3-8b88-401b7347b40e-combined-ca-bundle\") pod \"bade4404-af66-46e3-8b88-401b7347b40e\" (UID: \"bade4404-af66-46e3-8b88-401b7347b40e\") "
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.017938 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j58qz\" (UniqueName: \"kubernetes.io/projected/bade4404-af66-46e3-8b88-401b7347b40e-kube-api-access-j58qz\") pod \"bade4404-af66-46e3-8b88-401b7347b40e\" (UID: \"bade4404-af66-46e3-8b88-401b7347b40e\") "
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.018001 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bade4404-af66-46e3-8b88-401b7347b40e-config-data\") pod \"bade4404-af66-46e3-8b88-401b7347b40e\" (UID: \"bade4404-af66-46e3-8b88-401b7347b40e\") "
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.023807 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bade4404-af66-46e3-8b88-401b7347b40e-kube-api-access-j58qz" (OuterVolumeSpecName: "kube-api-access-j58qz") pod "bade4404-af66-46e3-8b88-401b7347b40e" (UID: "bade4404-af66-46e3-8b88-401b7347b40e"). InnerVolumeSpecName "kube-api-access-j58qz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.043485 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bade4404-af66-46e3-8b88-401b7347b40e-config-data" (OuterVolumeSpecName: "config-data") pod "bade4404-af66-46e3-8b88-401b7347b40e" (UID: "bade4404-af66-46e3-8b88-401b7347b40e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.049928 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bade4404-af66-46e3-8b88-401b7347b40e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bade4404-af66-46e3-8b88-401b7347b40e" (UID: "bade4404-af66-46e3-8b88-401b7347b40e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.120118 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bade4404-af66-46e3-8b88-401b7347b40e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.120684 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j58qz\" (UniqueName: \"kubernetes.io/projected/bade4404-af66-46e3-8b88-401b7347b40e-kube-api-access-j58qz\") on node \"crc\" DevicePath \"\""
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.120767 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bade4404-af66-46e3-8b88-401b7347b40e-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.847986 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bade4404-af66-46e3-8b88-401b7347b40e","Type":"ContainerDied","Data":"5b9daaab6f1fb9548bf0fba46937a0081738d4534d76b35cfb5ed3ef413d5f3f"}
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.848039 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.848038 5129 scope.go:117] "RemoveContainer" containerID="dddcc0f8b88b14babbdff4091b07ebbdfcaee288e79a846ff02b5275acfaace2"
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.888489 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.896697 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.927031 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 14 09:04:15 crc kubenswrapper[5129]: E0314 09:04:15.927872 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bade4404-af66-46e3-8b88-401b7347b40e" containerName="nova-cell1-novncproxy-novncproxy"
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.927893 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="bade4404-af66-46e3-8b88-401b7347b40e" containerName="nova-cell1-novncproxy-novncproxy"
Mar 14 09:04:15 crc kubenswrapper[5129]: E0314 09:04:15.927932 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3376bd41-1051-449e-a7cf-1e3500a04bcf" containerName="oc"
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.927940 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3376bd41-1051-449e-a7cf-1e3500a04bcf" containerName="oc"
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.928198 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3376bd41-1051-449e-a7cf-1e3500a04bcf" containerName="oc"
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.928241 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="bade4404-af66-46e3-8b88-401b7347b40e" containerName="nova-cell1-novncproxy-novncproxy"
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.931069 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.933701 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.934030 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.934114 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 14 09:04:15 crc kubenswrapper[5129]: I0314 09:04:15.944145 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.036577 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/407815b2-d340-427a-8b62-b87fab475772-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"407815b2-d340-427a-8b62-b87fab475772\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.036719 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/407815b2-d340-427a-8b62-b87fab475772-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"407815b2-d340-427a-8b62-b87fab475772\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.036809 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxv65\" (UniqueName: \"kubernetes.io/projected/407815b2-d340-427a-8b62-b87fab475772-kube-api-access-vxv65\") pod \"nova-cell1-novncproxy-0\" (UID: \"407815b2-d340-427a-8b62-b87fab475772\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.037902 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407815b2-d340-427a-8b62-b87fab475772-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"407815b2-d340-427a-8b62-b87fab475772\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.038040 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407815b2-d340-427a-8b62-b87fab475772-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"407815b2-d340-427a-8b62-b87fab475772\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.047950 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bade4404-af66-46e3-8b88-401b7347b40e" path="/var/lib/kubelet/pods/bade4404-af66-46e3-8b88-401b7347b40e/volumes"
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.139866 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/407815b2-d340-427a-8b62-b87fab475772-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"407815b2-d340-427a-8b62-b87fab475772\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.140320 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxv65\" (UniqueName: \"kubernetes.io/projected/407815b2-d340-427a-8b62-b87fab475772-kube-api-access-vxv65\") pod \"nova-cell1-novncproxy-0\" (UID: \"407815b2-d340-427a-8b62-b87fab475772\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.140426 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407815b2-d340-427a-8b62-b87fab475772-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"407815b2-d340-427a-8b62-b87fab475772\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.140510 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407815b2-d340-427a-8b62-b87fab475772-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"407815b2-d340-427a-8b62-b87fab475772\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.140574 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/407815b2-d340-427a-8b62-b87fab475772-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"407815b2-d340-427a-8b62-b87fab475772\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.152126 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407815b2-d340-427a-8b62-b87fab475772-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"407815b2-d340-427a-8b62-b87fab475772\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.152200 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/407815b2-d340-427a-8b62-b87fab475772-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"407815b2-d340-427a-8b62-b87fab475772\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.154949 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/407815b2-d340-427a-8b62-b87fab475772-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"407815b2-d340-427a-8b62-b87fab475772\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.159395 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407815b2-d340-427a-8b62-b87fab475772-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"407815b2-d340-427a-8b62-b87fab475772\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.163944 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxv65\" (UniqueName: \"kubernetes.io/projected/407815b2-d340-427a-8b62-b87fab475772-kube-api-access-vxv65\") pod \"nova-cell1-novncproxy-0\" (UID: \"407815b2-d340-427a-8b62-b87fab475772\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.261747 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.348075 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rw772" podUID="8d643556-e564-494a-be43-3769b1001f23" containerName="registry-server" probeResult="failure" output=<
Mar 14 09:04:16 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s
Mar 14 09:04:16 crc kubenswrapper[5129]: >
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.706170 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 14 09:04:16 crc kubenswrapper[5129]: I0314 09:04:16.856072 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"407815b2-d340-427a-8b62-b87fab475772","Type":"ContainerStarted","Data":"ce3ae7f4cac0a21a0521efd8930e5c3b7a07a98ed67c03ee3c8f5d3f3c8c56c9"}
Mar 14 09:04:17 crc kubenswrapper[5129]: I0314 09:04:17.870487 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"407815b2-d340-427a-8b62-b87fab475772","Type":"ContainerStarted","Data":"1f048eb6c2c910a8a0a68291cbb9fc4b9b92be6650da1d74f8acd571aa3a090c"}
Mar 14 09:04:17 crc kubenswrapper[5129]: I0314 09:04:17.911378 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.911356755 podStartE2EDuration="2.911356755s" podCreationTimestamp="2026-03-14 09:04:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:04:17.902195756 +0000 UTC m=+7520.654110950" watchObservedRunningTime="2026-03-14 09:04:17.911356755 +0000 UTC m=+7520.663271929"
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.470073 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.610286 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2hbj\" (UniqueName: \"kubernetes.io/projected/6e8803c1-008f-4327-8ab4-a91fbc9861b2-kube-api-access-m2hbj\") pod \"6e8803c1-008f-4327-8ab4-a91fbc9861b2\" (UID: \"6e8803c1-008f-4327-8ab4-a91fbc9861b2\") "
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.610424 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8803c1-008f-4327-8ab4-a91fbc9861b2-combined-ca-bundle\") pod \"6e8803c1-008f-4327-8ab4-a91fbc9861b2\" (UID: \"6e8803c1-008f-4327-8ab4-a91fbc9861b2\") "
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.610655 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e8803c1-008f-4327-8ab4-a91fbc9861b2-config-data\") pod \"6e8803c1-008f-4327-8ab4-a91fbc9861b2\" (UID: \"6e8803c1-008f-4327-8ab4-a91fbc9861b2\") "
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.616877 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e8803c1-008f-4327-8ab4-a91fbc9861b2-kube-api-access-m2hbj" (OuterVolumeSpecName: "kube-api-access-m2hbj") pod "6e8803c1-008f-4327-8ab4-a91fbc9861b2" (UID: "6e8803c1-008f-4327-8ab4-a91fbc9861b2"). InnerVolumeSpecName "kube-api-access-m2hbj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.637384 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e8803c1-008f-4327-8ab4-a91fbc9861b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e8803c1-008f-4327-8ab4-a91fbc9861b2" (UID: "6e8803c1-008f-4327-8ab4-a91fbc9861b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.639841 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e8803c1-008f-4327-8ab4-a91fbc9861b2-config-data" (OuterVolumeSpecName: "config-data") pod "6e8803c1-008f-4327-8ab4-a91fbc9861b2" (UID: "6e8803c1-008f-4327-8ab4-a91fbc9861b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.713370 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e8803c1-008f-4327-8ab4-a91fbc9861b2-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.713403 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2hbj\" (UniqueName: \"kubernetes.io/projected/6e8803c1-008f-4327-8ab4-a91fbc9861b2-kube-api-access-m2hbj\") on node \"crc\" DevicePath \"\""
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.713415 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8803c1-008f-4327-8ab4-a91fbc9861b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.882908 5129 generic.go:334] "Generic (PLEG): container finished" podID="6e8803c1-008f-4327-8ab4-a91fbc9861b2" containerID="3d77cb7287cb90a6f89f7af4ee76a80f227449b858a496d3c90b2c68b846fb8c" exitCode=137
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.883029 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.882964 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6e8803c1-008f-4327-8ab4-a91fbc9861b2","Type":"ContainerDied","Data":"3d77cb7287cb90a6f89f7af4ee76a80f227449b858a496d3c90b2c68b846fb8c"}
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.883078 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6e8803c1-008f-4327-8ab4-a91fbc9861b2","Type":"ContainerDied","Data":"9b62ef0ddaaf00d0c78154a2af7496d3658db4b69f43cb6c306b7fe2b223d09b"}
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.883096 5129 scope.go:117] "RemoveContainer" containerID="3d77cb7287cb90a6f89f7af4ee76a80f227449b858a496d3c90b2c68b846fb8c"
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.905703 5129 scope.go:117] "RemoveContainer" containerID="3d77cb7287cb90a6f89f7af4ee76a80f227449b858a496d3c90b2c68b846fb8c"
Mar 14 09:04:18 crc kubenswrapper[5129]: E0314 09:04:18.906205 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d77cb7287cb90a6f89f7af4ee76a80f227449b858a496d3c90b2c68b846fb8c\": container with ID starting with 3d77cb7287cb90a6f89f7af4ee76a80f227449b858a496d3c90b2c68b846fb8c not found: ID does not exist" containerID="3d77cb7287cb90a6f89f7af4ee76a80f227449b858a496d3c90b2c68b846fb8c"
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.906234 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d77cb7287cb90a6f89f7af4ee76a80f227449b858a496d3c90b2c68b846fb8c"} err="failed to get container status \"3d77cb7287cb90a6f89f7af4ee76a80f227449b858a496d3c90b2c68b846fb8c\": rpc error: code = NotFound desc = could not find container \"3d77cb7287cb90a6f89f7af4ee76a80f227449b858a496d3c90b2c68b846fb8c\": container with ID starting with 3d77cb7287cb90a6f89f7af4ee76a80f227449b858a496d3c90b2c68b846fb8c not found: ID does not exist"
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.924203 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.937083 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.950118 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 14 09:04:18 crc kubenswrapper[5129]: E0314 09:04:18.950874 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e8803c1-008f-4327-8ab4-a91fbc9861b2" containerName="nova-scheduler-scheduler"
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.951016 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e8803c1-008f-4327-8ab4-a91fbc9861b2" containerName="nova-scheduler-scheduler"
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.951359 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e8803c1-008f-4327-8ab4-a91fbc9861b2" containerName="nova-scheduler-scheduler"
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.952208 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.954275 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 14 09:04:18 crc kubenswrapper[5129]: I0314 09:04:18.970208 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 14 09:04:19 crc kubenswrapper[5129]: I0314 09:04:19.125496 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0becf0c4-d0bc-40b1-9677-1024fcb4a525-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0becf0c4-d0bc-40b1-9677-1024fcb4a525\") " pod="openstack/nova-scheduler-0"
Mar 14 09:04:19 crc kubenswrapper[5129]: I0314 09:04:19.125630 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0becf0c4-d0bc-40b1-9677-1024fcb4a525-config-data\") pod \"nova-scheduler-0\" (UID: \"0becf0c4-d0bc-40b1-9677-1024fcb4a525\") " pod="openstack/nova-scheduler-0"
Mar 14 09:04:19 crc kubenswrapper[5129]: I0314 09:04:19.125707 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4qrt\" (UniqueName: \"kubernetes.io/projected/0becf0c4-d0bc-40b1-9677-1024fcb4a525-kube-api-access-n4qrt\") pod \"nova-scheduler-0\" (UID: \"0becf0c4-d0bc-40b1-9677-1024fcb4a525\") " pod="openstack/nova-scheduler-0"
Mar 14 09:04:19 crc kubenswrapper[5129]: I0314 09:04:19.226908 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4qrt\" (UniqueName: \"kubernetes.io/projected/0becf0c4-d0bc-40b1-9677-1024fcb4a525-kube-api-access-n4qrt\") pod \"nova-scheduler-0\" (UID: \"0becf0c4-d0bc-40b1-9677-1024fcb4a525\") " pod="openstack/nova-scheduler-0"
Mar 14 09:04:19 crc kubenswrapper[5129]: I0314 09:04:19.227023 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0becf0c4-d0bc-40b1-9677-1024fcb4a525-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0becf0c4-d0bc-40b1-9677-1024fcb4a525\") " pod="openstack/nova-scheduler-0"
Mar 14 09:04:19 crc kubenswrapper[5129]: I0314 09:04:19.227097 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0becf0c4-d0bc-40b1-9677-1024fcb4a525-config-data\") pod \"nova-scheduler-0\" (UID: \"0becf0c4-d0bc-40b1-9677-1024fcb4a525\") " pod="openstack/nova-scheduler-0"
Mar 14 09:04:19 crc kubenswrapper[5129]: I0314 09:04:19.231562 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0becf0c4-d0bc-40b1-9677-1024fcb4a525-config-data\") pod \"nova-scheduler-0\" (UID: \"0becf0c4-d0bc-40b1-9677-1024fcb4a525\") " pod="openstack/nova-scheduler-0"
Mar 14 09:04:19 crc kubenswrapper[5129]: I0314 09:04:19.232100 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0becf0c4-d0bc-40b1-9677-1024fcb4a525-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0becf0c4-d0bc-40b1-9677-1024fcb4a525\") " pod="openstack/nova-scheduler-0"
Mar 14 09:04:19 crc kubenswrapper[5129]: I0314 09:04:19.251030 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4qrt\" (UniqueName: \"kubernetes.io/projected/0becf0c4-d0bc-40b1-9677-1024fcb4a525-kube-api-access-n4qrt\") pod \"nova-scheduler-0\" (UID: \"0becf0c4-d0bc-40b1-9677-1024fcb4a525\") " pod="openstack/nova-scheduler-0"
Mar 14 09:04:19 crc kubenswrapper[5129]: I0314 09:04:19.270525 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 14 09:04:19 crc kubenswrapper[5129]: I0314 09:04:19.771165 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 14 09:04:19 crc kubenswrapper[5129]: I0314 09:04:19.894956 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0becf0c4-d0bc-40b1-9677-1024fcb4a525","Type":"ContainerStarted","Data":"66bfc1dbd08db6eeee0a4d795bbda570b6f19d3d641ff14a9b183175025bf3d6"}
Mar 14 09:04:20 crc kubenswrapper[5129]: I0314 09:04:20.055179 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e8803c1-008f-4327-8ab4-a91fbc9861b2" path="/var/lib/kubelet/pods/6e8803c1-008f-4327-8ab4-a91fbc9861b2/volumes"
Mar 14 09:04:20 crc kubenswrapper[5129]: I0314 09:04:20.912454 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0becf0c4-d0bc-40b1-9677-1024fcb4a525","Type":"ContainerStarted","Data":"47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef"}
Mar 14 09:04:20 crc kubenswrapper[5129]: I0314 09:04:20.950163 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.950132438 podStartE2EDuration="2.950132438s" podCreationTimestamp="2026-03-14 09:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:04:20.938658327 +0000 UTC m=+7523.690573571" watchObservedRunningTime="2026-03-14 09:04:20.950132438 +0000 UTC m=+7523.702047642"
Mar 14 09:04:21 crc kubenswrapper[5129]: I0314 09:04:21.167927 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="22976fae-93cc-4036-8595-8d793ff0d00a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.144:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14
09:04:21 crc kubenswrapper[5129]: I0314 09:04:21.167925 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="22976fae-93cc-4036-8595-8d793ff0d00a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.144:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 09:04:21 crc kubenswrapper[5129]: I0314 09:04:21.262585 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 14 09:04:23 crc kubenswrapper[5129]: I0314 09:04:23.002782 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.145:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 09:04:23 crc kubenswrapper[5129]: I0314 09:04:23.002845 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.145:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 09:04:23 crc kubenswrapper[5129]: I0314 09:04:23.045233 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-lvvc4"] Mar 14 09:04:23 crc kubenswrapper[5129]: I0314 09:04:23.058565 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-lvvc4"] Mar 14 09:04:24 crc kubenswrapper[5129]: I0314 09:04:24.060240 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5241695-295b-4ce3-809e-79a7790a54c3" path="/var/lib/kubelet/pods/a5241695-295b-4ce3-809e-79a7790a54c3/volumes" Mar 14 09:04:24 crc kubenswrapper[5129]: I0314 09:04:24.271064 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-scheduler-0" Mar 14 09:04:25 crc kubenswrapper[5129]: I0314 09:04:25.356174 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rw772" Mar 14 09:04:25 crc kubenswrapper[5129]: I0314 09:04:25.403789 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rw772" Mar 14 09:04:25 crc kubenswrapper[5129]: I0314 09:04:25.601798 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rw772"] Mar 14 09:04:26 crc kubenswrapper[5129]: I0314 09:04:26.262137 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 14 09:04:26 crc kubenswrapper[5129]: I0314 09:04:26.283159 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 14 09:04:26 crc kubenswrapper[5129]: I0314 09:04:26.981507 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rw772" podUID="8d643556-e564-494a-be43-3769b1001f23" containerName="registry-server" containerID="cri-o://281ae190f0d48cb5dc7055c1cd9a4fb183af5ebff19bee776205b9713e266100" gracePeriod=2 Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.006366 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.182179 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-hznb2"] Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.183501 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hznb2" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.186080 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.186579 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.203817 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hznb2"] Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.296722 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5ebad0-4050-4467-918f-18b373bd269a-config-data\") pod \"nova-cell1-cell-mapping-hznb2\" (UID: \"bf5ebad0-4050-4467-918f-18b373bd269a\") " pod="openstack/nova-cell1-cell-mapping-hznb2" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.296771 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5ebad0-4050-4467-918f-18b373bd269a-scripts\") pod \"nova-cell1-cell-mapping-hznb2\" (UID: \"bf5ebad0-4050-4467-918f-18b373bd269a\") " pod="openstack/nova-cell1-cell-mapping-hznb2" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.296810 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5ebad0-4050-4467-918f-18b373bd269a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hznb2\" (UID: \"bf5ebad0-4050-4467-918f-18b373bd269a\") " pod="openstack/nova-cell1-cell-mapping-hznb2" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.296985 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csld2\" (UniqueName: 
\"kubernetes.io/projected/bf5ebad0-4050-4467-918f-18b373bd269a-kube-api-access-csld2\") pod \"nova-cell1-cell-mapping-hznb2\" (UID: \"bf5ebad0-4050-4467-918f-18b373bd269a\") " pod="openstack/nova-cell1-cell-mapping-hznb2" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.398736 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5ebad0-4050-4467-918f-18b373bd269a-config-data\") pod \"nova-cell1-cell-mapping-hznb2\" (UID: \"bf5ebad0-4050-4467-918f-18b373bd269a\") " pod="openstack/nova-cell1-cell-mapping-hznb2" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.398784 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5ebad0-4050-4467-918f-18b373bd269a-scripts\") pod \"nova-cell1-cell-mapping-hznb2\" (UID: \"bf5ebad0-4050-4467-918f-18b373bd269a\") " pod="openstack/nova-cell1-cell-mapping-hznb2" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.398836 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5ebad0-4050-4467-918f-18b373bd269a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hznb2\" (UID: \"bf5ebad0-4050-4467-918f-18b373bd269a\") " pod="openstack/nova-cell1-cell-mapping-hznb2" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.398874 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csld2\" (UniqueName: \"kubernetes.io/projected/bf5ebad0-4050-4467-918f-18b373bd269a-kube-api-access-csld2\") pod \"nova-cell1-cell-mapping-hznb2\" (UID: \"bf5ebad0-4050-4467-918f-18b373bd269a\") " pod="openstack/nova-cell1-cell-mapping-hznb2" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.405082 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bf5ebad0-4050-4467-918f-18b373bd269a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hznb2\" (UID: \"bf5ebad0-4050-4467-918f-18b373bd269a\") " pod="openstack/nova-cell1-cell-mapping-hznb2" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.405274 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5ebad0-4050-4467-918f-18b373bd269a-config-data\") pod \"nova-cell1-cell-mapping-hznb2\" (UID: \"bf5ebad0-4050-4467-918f-18b373bd269a\") " pod="openstack/nova-cell1-cell-mapping-hznb2" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.405728 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5ebad0-4050-4467-918f-18b373bd269a-scripts\") pod \"nova-cell1-cell-mapping-hznb2\" (UID: \"bf5ebad0-4050-4467-918f-18b373bd269a\") " pod="openstack/nova-cell1-cell-mapping-hznb2" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.419354 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csld2\" (UniqueName: \"kubernetes.io/projected/bf5ebad0-4050-4467-918f-18b373bd269a-kube-api-access-csld2\") pod \"nova-cell1-cell-mapping-hznb2\" (UID: \"bf5ebad0-4050-4467-918f-18b373bd269a\") " pod="openstack/nova-cell1-cell-mapping-hznb2" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.491464 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rw772" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.520246 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hznb2" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.602381 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d643556-e564-494a-be43-3769b1001f23-catalog-content\") pod \"8d643556-e564-494a-be43-3769b1001f23\" (UID: \"8d643556-e564-494a-be43-3769b1001f23\") " Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.602789 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttw42\" (UniqueName: \"kubernetes.io/projected/8d643556-e564-494a-be43-3769b1001f23-kube-api-access-ttw42\") pod \"8d643556-e564-494a-be43-3769b1001f23\" (UID: \"8d643556-e564-494a-be43-3769b1001f23\") " Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.602818 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d643556-e564-494a-be43-3769b1001f23-utilities\") pod \"8d643556-e564-494a-be43-3769b1001f23\" (UID: \"8d643556-e564-494a-be43-3769b1001f23\") " Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.603512 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d643556-e564-494a-be43-3769b1001f23-utilities" (OuterVolumeSpecName: "utilities") pod "8d643556-e564-494a-be43-3769b1001f23" (UID: "8d643556-e564-494a-be43-3769b1001f23"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.604207 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d643556-e564-494a-be43-3769b1001f23-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.606681 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d643556-e564-494a-be43-3769b1001f23-kube-api-access-ttw42" (OuterVolumeSpecName: "kube-api-access-ttw42") pod "8d643556-e564-494a-be43-3769b1001f23" (UID: "8d643556-e564-494a-be43-3769b1001f23"). InnerVolumeSpecName "kube-api-access-ttw42". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.705952 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttw42\" (UniqueName: \"kubernetes.io/projected/8d643556-e564-494a-be43-3769b1001f23-kube-api-access-ttw42\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.746492 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d643556-e564-494a-be43-3769b1001f23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d643556-e564-494a-be43-3769b1001f23" (UID: "8d643556-e564-494a-be43-3769b1001f23"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.807983 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d643556-e564-494a-be43-3769b1001f23-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:27 crc kubenswrapper[5129]: W0314 09:04:27.971652 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf5ebad0_4050_4467_918f_18b373bd269a.slice/crio-fc532e5a45f3b501bcefa56633d33562287eb89b5f9636254588fcd88e30c5e2 WatchSource:0}: Error finding container fc532e5a45f3b501bcefa56633d33562287eb89b5f9636254588fcd88e30c5e2: Status 404 returned error can't find the container with id fc532e5a45f3b501bcefa56633d33562287eb89b5f9636254588fcd88e30c5e2 Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.974812 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hznb2"] Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.990710 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hznb2" event={"ID":"bf5ebad0-4050-4467-918f-18b373bd269a","Type":"ContainerStarted","Data":"fc532e5a45f3b501bcefa56633d33562287eb89b5f9636254588fcd88e30c5e2"} Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.994434 5129 generic.go:334] "Generic (PLEG): container finished" podID="8d643556-e564-494a-be43-3769b1001f23" containerID="281ae190f0d48cb5dc7055c1cd9a4fb183af5ebff19bee776205b9713e266100" exitCode=0 Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.994476 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw772" event={"ID":"8d643556-e564-494a-be43-3769b1001f23","Type":"ContainerDied","Data":"281ae190f0d48cb5dc7055c1cd9a4fb183af5ebff19bee776205b9713e266100"} Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.994491 5129 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rw772" Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.994539 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw772" event={"ID":"8d643556-e564-494a-be43-3769b1001f23","Type":"ContainerDied","Data":"c2c42c4fc1f3a17c94d0014c2dc3e39a14dc95ada31c06b07608d966b0a69f8d"} Mar 14 09:04:27 crc kubenswrapper[5129]: I0314 09:04:27.994573 5129 scope.go:117] "RemoveContainer" containerID="281ae190f0d48cb5dc7055c1cd9a4fb183af5ebff19bee776205b9713e266100" Mar 14 09:04:28 crc kubenswrapper[5129]: I0314 09:04:28.030033 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rw772"] Mar 14 09:04:28 crc kubenswrapper[5129]: I0314 09:04:28.033657 5129 scope.go:117] "RemoveContainer" containerID="fe3f28efe9a9c4d987adda9ecf41281aa168b9297cb665015f3efce4931afa38" Mar 14 09:04:28 crc kubenswrapper[5129]: I0314 09:04:28.050487 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rw772"] Mar 14 09:04:28 crc kubenswrapper[5129]: I0314 09:04:28.074682 5129 scope.go:117] "RemoveContainer" containerID="3760d89cafc8f7a3ca247e5e3d98540cb9d5dce2ffbf740a5b708686d4b7ddf1" Mar 14 09:04:28 crc kubenswrapper[5129]: I0314 09:04:28.096431 5129 scope.go:117] "RemoveContainer" containerID="281ae190f0d48cb5dc7055c1cd9a4fb183af5ebff19bee776205b9713e266100" Mar 14 09:04:28 crc kubenswrapper[5129]: E0314 09:04:28.097018 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"281ae190f0d48cb5dc7055c1cd9a4fb183af5ebff19bee776205b9713e266100\": container with ID starting with 281ae190f0d48cb5dc7055c1cd9a4fb183af5ebff19bee776205b9713e266100 not found: ID does not exist" containerID="281ae190f0d48cb5dc7055c1cd9a4fb183af5ebff19bee776205b9713e266100" Mar 14 09:04:28 crc 
kubenswrapper[5129]: I0314 09:04:28.097069 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"281ae190f0d48cb5dc7055c1cd9a4fb183af5ebff19bee776205b9713e266100"} err="failed to get container status \"281ae190f0d48cb5dc7055c1cd9a4fb183af5ebff19bee776205b9713e266100\": rpc error: code = NotFound desc = could not find container \"281ae190f0d48cb5dc7055c1cd9a4fb183af5ebff19bee776205b9713e266100\": container with ID starting with 281ae190f0d48cb5dc7055c1cd9a4fb183af5ebff19bee776205b9713e266100 not found: ID does not exist" Mar 14 09:04:28 crc kubenswrapper[5129]: I0314 09:04:28.097102 5129 scope.go:117] "RemoveContainer" containerID="fe3f28efe9a9c4d987adda9ecf41281aa168b9297cb665015f3efce4931afa38" Mar 14 09:04:28 crc kubenswrapper[5129]: E0314 09:04:28.097460 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe3f28efe9a9c4d987adda9ecf41281aa168b9297cb665015f3efce4931afa38\": container with ID starting with fe3f28efe9a9c4d987adda9ecf41281aa168b9297cb665015f3efce4931afa38 not found: ID does not exist" containerID="fe3f28efe9a9c4d987adda9ecf41281aa168b9297cb665015f3efce4931afa38" Mar 14 09:04:28 crc kubenswrapper[5129]: I0314 09:04:28.097489 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe3f28efe9a9c4d987adda9ecf41281aa168b9297cb665015f3efce4931afa38"} err="failed to get container status \"fe3f28efe9a9c4d987adda9ecf41281aa168b9297cb665015f3efce4931afa38\": rpc error: code = NotFound desc = could not find container \"fe3f28efe9a9c4d987adda9ecf41281aa168b9297cb665015f3efce4931afa38\": container with ID starting with fe3f28efe9a9c4d987adda9ecf41281aa168b9297cb665015f3efce4931afa38 not found: ID does not exist" Mar 14 09:04:28 crc kubenswrapper[5129]: I0314 09:04:28.097504 5129 scope.go:117] "RemoveContainer" containerID="3760d89cafc8f7a3ca247e5e3d98540cb9d5dce2ffbf740a5b708686d4b7ddf1" Mar 14 
09:04:28 crc kubenswrapper[5129]: E0314 09:04:28.097798 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3760d89cafc8f7a3ca247e5e3d98540cb9d5dce2ffbf740a5b708686d4b7ddf1\": container with ID starting with 3760d89cafc8f7a3ca247e5e3d98540cb9d5dce2ffbf740a5b708686d4b7ddf1 not found: ID does not exist" containerID="3760d89cafc8f7a3ca247e5e3d98540cb9d5dce2ffbf740a5b708686d4b7ddf1" Mar 14 09:04:28 crc kubenswrapper[5129]: I0314 09:04:28.097841 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3760d89cafc8f7a3ca247e5e3d98540cb9d5dce2ffbf740a5b708686d4b7ddf1"} err="failed to get container status \"3760d89cafc8f7a3ca247e5e3d98540cb9d5dce2ffbf740a5b708686d4b7ddf1\": rpc error: code = NotFound desc = could not find container \"3760d89cafc8f7a3ca247e5e3d98540cb9d5dce2ffbf740a5b708686d4b7ddf1\": container with ID starting with 3760d89cafc8f7a3ca247e5e3d98540cb9d5dce2ffbf740a5b708686d4b7ddf1 not found: ID does not exist" Mar 14 09:04:28 crc kubenswrapper[5129]: I0314 09:04:28.792408 5129 scope.go:117] "RemoveContainer" containerID="414d9c5c05d97bc88d53f026d62517aa4927eb14d00fb3e8cbbcd138777f70dd" Mar 14 09:04:28 crc kubenswrapper[5129]: I0314 09:04:28.821787 5129 scope.go:117] "RemoveContainer" containerID="382d7502da58377d9faf810b28da60562ff0d64caac9ea231ad2d6fd4230293f" Mar 14 09:04:28 crc kubenswrapper[5129]: I0314 09:04:28.880019 5129 scope.go:117] "RemoveContainer" containerID="1385d9621809d4f434148307c54f316ee93c6020b6d6a59a8fb650aabc0b0935" Mar 14 09:04:28 crc kubenswrapper[5129]: I0314 09:04:28.904465 5129 scope.go:117] "RemoveContainer" containerID="9984b133d4c499298b9d153d514aec2a47f57568d35839d7ac4a95078c392cdc" Mar 14 09:04:29 crc kubenswrapper[5129]: I0314 09:04:29.006364 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hznb2" 
event={"ID":"bf5ebad0-4050-4467-918f-18b373bd269a","Type":"ContainerStarted","Data":"e5d963130402a99df6b3b5de440dd2ced4e4272adf9e84093aee663e35af0886"} Mar 14 09:04:29 crc kubenswrapper[5129]: I0314 09:04:29.024430 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-hznb2" podStartSLOduration=2.024413309 podStartE2EDuration="2.024413309s" podCreationTimestamp="2026-03-14 09:04:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:04:29.020452322 +0000 UTC m=+7531.772367536" watchObservedRunningTime="2026-03-14 09:04:29.024413309 +0000 UTC m=+7531.776328493" Mar 14 09:04:29 crc kubenswrapper[5129]: I0314 09:04:29.271387 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 14 09:04:29 crc kubenswrapper[5129]: I0314 09:04:29.310082 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 14 09:04:30 crc kubenswrapper[5129]: I0314 09:04:30.057006 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d643556-e564-494a-be43-3769b1001f23" path="/var/lib/kubelet/pods/8d643556-e564-494a-be43-3769b1001f23/volumes" Mar 14 09:04:30 crc kubenswrapper[5129]: I0314 09:04:30.057840 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 14 09:04:31 crc kubenswrapper[5129]: I0314 09:04:31.126850 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="22976fae-93cc-4036-8595-8d793ff0d00a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.144:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 09:04:31 crc kubenswrapper[5129]: I0314 09:04:31.168802 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="22976fae-93cc-4036-8595-8d793ff0d00a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.144:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.637549 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-kzhjd"] Mar 14 09:04:32 crc kubenswrapper[5129]: E0314 09:04:32.638268 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d643556-e564-494a-be43-3769b1001f23" containerName="registry-server" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.638287 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d643556-e564-494a-be43-3769b1001f23" containerName="registry-server" Mar 14 09:04:32 crc kubenswrapper[5129]: E0314 09:04:32.638309 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d643556-e564-494a-be43-3769b1001f23" containerName="extract-content" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.638317 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d643556-e564-494a-be43-3769b1001f23" containerName="extract-content" Mar 14 09:04:32 crc kubenswrapper[5129]: E0314 09:04:32.638350 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d643556-e564-494a-be43-3769b1001f23" containerName="extract-utilities" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.638358 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d643556-e564-494a-be43-3769b1001f23" containerName="extract-utilities" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.638594 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d643556-e564-494a-be43-3769b1001f23" containerName="registry-server" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.639286 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.645226 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.645512 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.665143 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-kzhjd"] Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.722538 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqsm5\" (UniqueName: \"kubernetes.io/projected/7f98c195-d11b-4583-bdc6-5708e9e72c03-kube-api-access-jqsm5\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.722663 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f98c195-d11b-4583-bdc6-5708e9e72c03-combined-ca-bundle\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.722812 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f98c195-d11b-4583-bdc6-5708e9e72c03-dispersionconf\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.722887 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/7f98c195-d11b-4583-bdc6-5708e9e72c03-etc-swift\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.722976 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f98c195-d11b-4583-bdc6-5708e9e72c03-scripts\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.723058 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f98c195-d11b-4583-bdc6-5708e9e72c03-swiftconf\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.723103 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f98c195-d11b-4583-bdc6-5708e9e72c03-ring-data-devices\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.824023 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f98c195-d11b-4583-bdc6-5708e9e72c03-dispersionconf\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.824070 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/7f98c195-d11b-4583-bdc6-5708e9e72c03-etc-swift\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.824110 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f98c195-d11b-4583-bdc6-5708e9e72c03-scripts\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.824151 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f98c195-d11b-4583-bdc6-5708e9e72c03-swiftconf\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.824178 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f98c195-d11b-4583-bdc6-5708e9e72c03-ring-data-devices\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.824207 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqsm5\" (UniqueName: \"kubernetes.io/projected/7f98c195-d11b-4583-bdc6-5708e9e72c03-kube-api-access-jqsm5\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.824227 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f98c195-d11b-4583-bdc6-5708e9e72c03-combined-ca-bundle\") 
pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.825442 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f98c195-d11b-4583-bdc6-5708e9e72c03-etc-swift\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.826082 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f98c195-d11b-4583-bdc6-5708e9e72c03-ring-data-devices\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.826681 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f98c195-d11b-4583-bdc6-5708e9e72c03-scripts\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.829632 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f98c195-d11b-4583-bdc6-5708e9e72c03-dispersionconf\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.829997 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f98c195-d11b-4583-bdc6-5708e9e72c03-swiftconf\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc 
kubenswrapper[5129]: I0314 09:04:32.830151 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f98c195-d11b-4583-bdc6-5708e9e72c03-combined-ca-bundle\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.845103 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqsm5\" (UniqueName: \"kubernetes.io/projected/7f98c195-d11b-4583-bdc6-5708e9e72c03-kube-api-access-jqsm5\") pod \"swift-ring-rebalance-kzhjd\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:32 crc kubenswrapper[5129]: I0314 09:04:32.955521 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:33 crc kubenswrapper[5129]: I0314 09:04:33.002752 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.145:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 09:04:33 crc kubenswrapper[5129]: I0314 09:04:33.002771 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.145:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 09:04:33 crc kubenswrapper[5129]: I0314 09:04:33.403276 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-kzhjd"] Mar 14 09:04:33 crc kubenswrapper[5129]: W0314 09:04:33.404075 5129 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f98c195_d11b_4583_bdc6_5708e9e72c03.slice/crio-b07b8d4d264d2366c03add8622d31a97623c802c17c1f57d834a8d3d9cd56bc7 WatchSource:0}: Error finding container b07b8d4d264d2366c03add8622d31a97623c802c17c1f57d834a8d3d9cd56bc7: Status 404 returned error can't find the container with id b07b8d4d264d2366c03add8622d31a97623c802c17c1f57d834a8d3d9cd56bc7 Mar 14 09:04:34 crc kubenswrapper[5129]: I0314 09:04:34.056660 5129 generic.go:334] "Generic (PLEG): container finished" podID="bf5ebad0-4050-4467-918f-18b373bd269a" containerID="e5d963130402a99df6b3b5de440dd2ced4e4272adf9e84093aee663e35af0886" exitCode=0 Mar 14 09:04:34 crc kubenswrapper[5129]: I0314 09:04:34.056752 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hznb2" event={"ID":"bf5ebad0-4050-4467-918f-18b373bd269a","Type":"ContainerDied","Data":"e5d963130402a99df6b3b5de440dd2ced4e4272adf9e84093aee663e35af0886"} Mar 14 09:04:34 crc kubenswrapper[5129]: I0314 09:04:34.061162 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kzhjd" event={"ID":"7f98c195-d11b-4583-bdc6-5708e9e72c03","Type":"ContainerStarted","Data":"0f35e53b5e6b5b25ca0e102dffddff07fca6d934bf209786fa74497323d689fc"} Mar 14 09:04:34 crc kubenswrapper[5129]: I0314 09:04:34.061199 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kzhjd" event={"ID":"7f98c195-d11b-4583-bdc6-5708e9e72c03","Type":"ContainerStarted","Data":"b07b8d4d264d2366c03add8622d31a97623c802c17c1f57d834a8d3d9cd56bc7"} Mar 14 09:04:34 crc kubenswrapper[5129]: I0314 09:04:34.098487 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-kzhjd" podStartSLOduration=2.098468704 podStartE2EDuration="2.098468704s" podCreationTimestamp="2026-03-14 09:04:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-14 09:04:34.096740117 +0000 UTC m=+7536.848655311" watchObservedRunningTime="2026-03-14 09:04:34.098468704 +0000 UTC m=+7536.850383888" Mar 14 09:04:35 crc kubenswrapper[5129]: I0314 09:04:35.513557 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hznb2" Mar 14 09:04:35 crc kubenswrapper[5129]: I0314 09:04:35.613306 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5ebad0-4050-4467-918f-18b373bd269a-scripts\") pod \"bf5ebad0-4050-4467-918f-18b373bd269a\" (UID: \"bf5ebad0-4050-4467-918f-18b373bd269a\") " Mar 14 09:04:35 crc kubenswrapper[5129]: I0314 09:04:35.613373 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csld2\" (UniqueName: \"kubernetes.io/projected/bf5ebad0-4050-4467-918f-18b373bd269a-kube-api-access-csld2\") pod \"bf5ebad0-4050-4467-918f-18b373bd269a\" (UID: \"bf5ebad0-4050-4467-918f-18b373bd269a\") " Mar 14 09:04:35 crc kubenswrapper[5129]: I0314 09:04:35.613557 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5ebad0-4050-4467-918f-18b373bd269a-combined-ca-bundle\") pod \"bf5ebad0-4050-4467-918f-18b373bd269a\" (UID: \"bf5ebad0-4050-4467-918f-18b373bd269a\") " Mar 14 09:04:35 crc kubenswrapper[5129]: I0314 09:04:35.614416 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5ebad0-4050-4467-918f-18b373bd269a-config-data\") pod \"bf5ebad0-4050-4467-918f-18b373bd269a\" (UID: \"bf5ebad0-4050-4467-918f-18b373bd269a\") " Mar 14 09:04:35 crc kubenswrapper[5129]: I0314 09:04:35.619168 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bf5ebad0-4050-4467-918f-18b373bd269a-kube-api-access-csld2" (OuterVolumeSpecName: "kube-api-access-csld2") pod "bf5ebad0-4050-4467-918f-18b373bd269a" (UID: "bf5ebad0-4050-4467-918f-18b373bd269a"). InnerVolumeSpecName "kube-api-access-csld2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:04:35 crc kubenswrapper[5129]: I0314 09:04:35.620733 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5ebad0-4050-4467-918f-18b373bd269a-scripts" (OuterVolumeSpecName: "scripts") pod "bf5ebad0-4050-4467-918f-18b373bd269a" (UID: "bf5ebad0-4050-4467-918f-18b373bd269a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:35 crc kubenswrapper[5129]: I0314 09:04:35.639352 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5ebad0-4050-4467-918f-18b373bd269a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf5ebad0-4050-4467-918f-18b373bd269a" (UID: "bf5ebad0-4050-4467-918f-18b373bd269a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:35 crc kubenswrapper[5129]: I0314 09:04:35.647430 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5ebad0-4050-4467-918f-18b373bd269a-config-data" (OuterVolumeSpecName: "config-data") pod "bf5ebad0-4050-4467-918f-18b373bd269a" (UID: "bf5ebad0-4050-4467-918f-18b373bd269a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:35 crc kubenswrapper[5129]: I0314 09:04:35.722160 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5ebad0-4050-4467-918f-18b373bd269a-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:35 crc kubenswrapper[5129]: I0314 09:04:35.722206 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csld2\" (UniqueName: \"kubernetes.io/projected/bf5ebad0-4050-4467-918f-18b373bd269a-kube-api-access-csld2\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:35 crc kubenswrapper[5129]: I0314 09:04:35.722222 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5ebad0-4050-4467-918f-18b373bd269a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:35 crc kubenswrapper[5129]: I0314 09:04:35.722234 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5ebad0-4050-4467-918f-18b373bd269a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:36 crc kubenswrapper[5129]: I0314 09:04:36.075657 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-lfzdp"] Mar 14 09:04:36 crc kubenswrapper[5129]: I0314 09:04:36.147434 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hznb2" event={"ID":"bf5ebad0-4050-4467-918f-18b373bd269a","Type":"ContainerDied","Data":"fc532e5a45f3b501bcefa56633d33562287eb89b5f9636254588fcd88e30c5e2"} Mar 14 09:04:36 crc kubenswrapper[5129]: I0314 09:04:36.147482 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc532e5a45f3b501bcefa56633d33562287eb89b5f9636254588fcd88e30c5e2" Mar 14 09:04:36 crc kubenswrapper[5129]: I0314 09:04:36.147570 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hznb2" Mar 14 09:04:36 crc kubenswrapper[5129]: I0314 09:04:36.152872 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-lfzdp"] Mar 14 09:04:36 crc kubenswrapper[5129]: I0314 09:04:36.309944 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 09:04:36 crc kubenswrapper[5129]: I0314 09:04:36.310164 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="22976fae-93cc-4036-8595-8d793ff0d00a" containerName="nova-api-log" containerID="cri-o://588221254b003d105919c7f82445142cafd963c150049d826976ece79e42ba54" gracePeriod=30 Mar 14 09:04:36 crc kubenswrapper[5129]: I0314 09:04:36.310293 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="22976fae-93cc-4036-8595-8d793ff0d00a" containerName="nova-api-api" containerID="cri-o://e80bb1dda50866d20650a784c76bff1b62612577687259889f9f0c1ec58ffdd8" gracePeriod=30 Mar 14 09:04:36 crc kubenswrapper[5129]: I0314 09:04:36.351007 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 09:04:36 crc kubenswrapper[5129]: I0314 09:04:36.351353 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0becf0c4-d0bc-40b1-9677-1024fcb4a525" containerName="nova-scheduler-scheduler" containerID="cri-o://47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" gracePeriod=30 Mar 14 09:04:36 crc kubenswrapper[5129]: I0314 09:04:36.452912 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:04:36 crc kubenswrapper[5129]: I0314 09:04:36.453140 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" containerName="nova-metadata-log" 
containerID="cri-o://7c6a3836a45c8831f6e0a63a19a0beba8404ea9e9b282a30a3a39fd7feb8fa98" gracePeriod=30 Mar 14 09:04:36 crc kubenswrapper[5129]: I0314 09:04:36.453283 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" containerName="nova-metadata-metadata" containerID="cri-o://5953e78670d22d327077632a3d8f1a6ff552120883c447e1f403ab185d4c5f94" gracePeriod=30 Mar 14 09:04:37 crc kubenswrapper[5129]: I0314 09:04:37.157169 5129 generic.go:334] "Generic (PLEG): container finished" podID="22976fae-93cc-4036-8595-8d793ff0d00a" containerID="588221254b003d105919c7f82445142cafd963c150049d826976ece79e42ba54" exitCode=143 Mar 14 09:04:37 crc kubenswrapper[5129]: I0314 09:04:37.157253 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22976fae-93cc-4036-8595-8d793ff0d00a","Type":"ContainerDied","Data":"588221254b003d105919c7f82445142cafd963c150049d826976ece79e42ba54"} Mar 14 09:04:37 crc kubenswrapper[5129]: I0314 09:04:37.160520 5129 generic.go:334] "Generic (PLEG): container finished" podID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" containerID="7c6a3836a45c8831f6e0a63a19a0beba8404ea9e9b282a30a3a39fd7feb8fa98" exitCode=143 Mar 14 09:04:37 crc kubenswrapper[5129]: I0314 09:04:37.160554 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b61cf66-0088-4e61-b1a7-54816b9db5c8","Type":"ContainerDied","Data":"7c6a3836a45c8831f6e0a63a19a0beba8404ea9e9b282a30a3a39fd7feb8fa98"} Mar 14 09:04:38 crc kubenswrapper[5129]: I0314 09:04:38.046993 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9" path="/var/lib/kubelet/pods/b24e9f8c-2cf7-45a8-bc1c-524ed5de8de9/volumes" Mar 14 09:04:38 crc kubenswrapper[5129]: I0314 09:04:38.170358 5129 generic.go:334] "Generic (PLEG): container finished" podID="7f98c195-d11b-4583-bdc6-5708e9e72c03" 
containerID="0f35e53b5e6b5b25ca0e102dffddff07fca6d934bf209786fa74497323d689fc" exitCode=0 Mar 14 09:04:38 crc kubenswrapper[5129]: I0314 09:04:38.170404 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kzhjd" event={"ID":"7f98c195-d11b-4583-bdc6-5708e9e72c03","Type":"ContainerDied","Data":"0f35e53b5e6b5b25ca0e102dffddff07fca6d934bf209786fa74497323d689fc"} Mar 14 09:04:39 crc kubenswrapper[5129]: E0314 09:04:39.273751 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 09:04:39 crc kubenswrapper[5129]: E0314 09:04:39.275664 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 09:04:39 crc kubenswrapper[5129]: E0314 09:04:39.277118 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 09:04:39 crc kubenswrapper[5129]: E0314 09:04:39.277186 5129 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0becf0c4-d0bc-40b1-9677-1024fcb4a525" containerName="nova-scheduler-scheduler" Mar 14 09:04:39 crc kubenswrapper[5129]: 
I0314 09:04:39.543045 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.631191 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f98c195-d11b-4583-bdc6-5708e9e72c03-ring-data-devices\") pod \"7f98c195-d11b-4583-bdc6-5708e9e72c03\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.631522 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f98c195-d11b-4583-bdc6-5708e9e72c03-dispersionconf\") pod \"7f98c195-d11b-4583-bdc6-5708e9e72c03\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.631554 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqsm5\" (UniqueName: \"kubernetes.io/projected/7f98c195-d11b-4583-bdc6-5708e9e72c03-kube-api-access-jqsm5\") pod \"7f98c195-d11b-4583-bdc6-5708e9e72c03\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.631617 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f98c195-d11b-4583-bdc6-5708e9e72c03-combined-ca-bundle\") pod \"7f98c195-d11b-4583-bdc6-5708e9e72c03\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.631665 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f98c195-d11b-4583-bdc6-5708e9e72c03-scripts\") pod \"7f98c195-d11b-4583-bdc6-5708e9e72c03\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.631689 5129 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f98c195-d11b-4583-bdc6-5708e9e72c03-swiftconf\") pod \"7f98c195-d11b-4583-bdc6-5708e9e72c03\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.631809 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f98c195-d11b-4583-bdc6-5708e9e72c03-etc-swift\") pod \"7f98c195-d11b-4583-bdc6-5708e9e72c03\" (UID: \"7f98c195-d11b-4583-bdc6-5708e9e72c03\") " Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.635423 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f98c195-d11b-4583-bdc6-5708e9e72c03-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7f98c195-d11b-4583-bdc6-5708e9e72c03" (UID: "7f98c195-d11b-4583-bdc6-5708e9e72c03"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.638214 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f98c195-d11b-4583-bdc6-5708e9e72c03-kube-api-access-jqsm5" (OuterVolumeSpecName: "kube-api-access-jqsm5") pod "7f98c195-d11b-4583-bdc6-5708e9e72c03" (UID: "7f98c195-d11b-4583-bdc6-5708e9e72c03"). InnerVolumeSpecName "kube-api-access-jqsm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.638615 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f98c195-d11b-4583-bdc6-5708e9e72c03-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7f98c195-d11b-4583-bdc6-5708e9e72c03" (UID: "7f98c195-d11b-4583-bdc6-5708e9e72c03"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.678874 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f98c195-d11b-4583-bdc6-5708e9e72c03-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7f98c195-d11b-4583-bdc6-5708e9e72c03" (UID: "7f98c195-d11b-4583-bdc6-5708e9e72c03"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.697758 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f98c195-d11b-4583-bdc6-5708e9e72c03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f98c195-d11b-4583-bdc6-5708e9e72c03" (UID: "7f98c195-d11b-4583-bdc6-5708e9e72c03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.709420 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f98c195-d11b-4583-bdc6-5708e9e72c03-scripts" (OuterVolumeSpecName: "scripts") pod "7f98c195-d11b-4583-bdc6-5708e9e72c03" (UID: "7f98c195-d11b-4583-bdc6-5708e9e72c03"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.730464 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f98c195-d11b-4583-bdc6-5708e9e72c03-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7f98c195-d11b-4583-bdc6-5708e9e72c03" (UID: "7f98c195-d11b-4583-bdc6-5708e9e72c03"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.732967 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f98c195-d11b-4583-bdc6-5708e9e72c03-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.732996 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f98c195-d11b-4583-bdc6-5708e9e72c03-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.733006 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f98c195-d11b-4583-bdc6-5708e9e72c03-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.733016 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqsm5\" (UniqueName: \"kubernetes.io/projected/7f98c195-d11b-4583-bdc6-5708e9e72c03-kube-api-access-jqsm5\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.733024 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f98c195-d11b-4583-bdc6-5708e9e72c03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.733032 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f98c195-d11b-4583-bdc6-5708e9e72c03-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc kubenswrapper[5129]: I0314 09:04:39.733041 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f98c195-d11b-4583-bdc6-5708e9e72c03-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:40 crc kubenswrapper[5129]: I0314 09:04:40.185966 5129 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kzhjd" event={"ID":"7f98c195-d11b-4583-bdc6-5708e9e72c03","Type":"ContainerDied","Data":"b07b8d4d264d2366c03add8622d31a97623c802c17c1f57d834a8d3d9cd56bc7"} Mar 14 09:04:40 crc kubenswrapper[5129]: I0314 09:04:40.186227 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b07b8d4d264d2366c03add8622d31a97623c802c17c1f57d834a8d3d9cd56bc7" Mar 14 09:04:40 crc kubenswrapper[5129]: I0314 09:04:40.186044 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kzhjd" Mar 14 09:04:44 crc kubenswrapper[5129]: E0314 09:04:44.273255 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 09:04:44 crc kubenswrapper[5129]: E0314 09:04:44.275680 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 09:04:44 crc kubenswrapper[5129]: E0314 09:04:44.277423 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 09:04:44 crc kubenswrapper[5129]: E0314 09:04:44.277498 5129 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container 
is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0becf0c4-d0bc-40b1-9677-1024fcb4a525" containerName="nova-scheduler-scheduler" Mar 14 09:04:49 crc kubenswrapper[5129]: E0314 09:04:49.274430 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 09:04:49 crc kubenswrapper[5129]: E0314 09:04:49.279186 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 09:04:49 crc kubenswrapper[5129]: E0314 09:04:49.280377 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 09:04:49 crc kubenswrapper[5129]: E0314 09:04:49.280419 5129 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0becf0c4-d0bc-40b1-9677-1024fcb4a525" containerName="nova-scheduler-scheduler" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.205737 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.282205 5129 generic.go:334] "Generic (PLEG): container finished" podID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" containerID="5953e78670d22d327077632a3d8f1a6ff552120883c447e1f403ab185d4c5f94" exitCode=0 Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.282272 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b61cf66-0088-4e61-b1a7-54816b9db5c8","Type":"ContainerDied","Data":"5953e78670d22d327077632a3d8f1a6ff552120883c447e1f403ab185d4c5f94"} Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.288164 5129 generic.go:334] "Generic (PLEG): container finished" podID="22976fae-93cc-4036-8595-8d793ff0d00a" containerID="e80bb1dda50866d20650a784c76bff1b62612577687259889f9f0c1ec58ffdd8" exitCode=0 Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.288235 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22976fae-93cc-4036-8595-8d793ff0d00a","Type":"ContainerDied","Data":"e80bb1dda50866d20650a784c76bff1b62612577687259889f9f0c1ec58ffdd8"} Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.288341 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.288459 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22976fae-93cc-4036-8595-8d793ff0d00a","Type":"ContainerDied","Data":"8548046d2ce8156bc5d9d916333d0092a73689f444873a6de50e4d11bfe227cc"} Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.288472 5129 scope.go:117] "RemoveContainer" containerID="e80bb1dda50866d20650a784c76bff1b62612577687259889f9f0c1ec58ffdd8" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.334988 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22976fae-93cc-4036-8595-8d793ff0d00a-config-data\") pod \"22976fae-93cc-4036-8595-8d793ff0d00a\" (UID: \"22976fae-93cc-4036-8595-8d793ff0d00a\") " Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.335157 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22976fae-93cc-4036-8595-8d793ff0d00a-combined-ca-bundle\") pod \"22976fae-93cc-4036-8595-8d793ff0d00a\" (UID: \"22976fae-93cc-4036-8595-8d793ff0d00a\") " Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.335226 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22976fae-93cc-4036-8595-8d793ff0d00a-logs\") pod \"22976fae-93cc-4036-8595-8d793ff0d00a\" (UID: \"22976fae-93cc-4036-8595-8d793ff0d00a\") " Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.335302 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z6cp\" (UniqueName: \"kubernetes.io/projected/22976fae-93cc-4036-8595-8d793ff0d00a-kube-api-access-7z6cp\") pod \"22976fae-93cc-4036-8595-8d793ff0d00a\" (UID: \"22976fae-93cc-4036-8595-8d793ff0d00a\") " Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.335778 5129 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22976fae-93cc-4036-8595-8d793ff0d00a-logs" (OuterVolumeSpecName: "logs") pod "22976fae-93cc-4036-8595-8d793ff0d00a" (UID: "22976fae-93cc-4036-8595-8d793ff0d00a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.340296 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.344190 5129 scope.go:117] "RemoveContainer" containerID="588221254b003d105919c7f82445142cafd963c150049d826976ece79e42ba54" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.345233 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22976fae-93cc-4036-8595-8d793ff0d00a-kube-api-access-7z6cp" (OuterVolumeSpecName: "kube-api-access-7z6cp") pod "22976fae-93cc-4036-8595-8d793ff0d00a" (UID: "22976fae-93cc-4036-8595-8d793ff0d00a"). InnerVolumeSpecName "kube-api-access-7z6cp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.366197 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22976fae-93cc-4036-8595-8d793ff0d00a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22976fae-93cc-4036-8595-8d793ff0d00a" (UID: "22976fae-93cc-4036-8595-8d793ff0d00a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.373282 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22976fae-93cc-4036-8595-8d793ff0d00a-config-data" (OuterVolumeSpecName: "config-data") pod "22976fae-93cc-4036-8595-8d793ff0d00a" (UID: "22976fae-93cc-4036-8595-8d793ff0d00a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.379034 5129 scope.go:117] "RemoveContainer" containerID="e80bb1dda50866d20650a784c76bff1b62612577687259889f9f0c1ec58ffdd8" Mar 14 09:04:50 crc kubenswrapper[5129]: E0314 09:04:50.379690 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e80bb1dda50866d20650a784c76bff1b62612577687259889f9f0c1ec58ffdd8\": container with ID starting with e80bb1dda50866d20650a784c76bff1b62612577687259889f9f0c1ec58ffdd8 not found: ID does not exist" containerID="e80bb1dda50866d20650a784c76bff1b62612577687259889f9f0c1ec58ffdd8" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.379741 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e80bb1dda50866d20650a784c76bff1b62612577687259889f9f0c1ec58ffdd8"} err="failed to get container status \"e80bb1dda50866d20650a784c76bff1b62612577687259889f9f0c1ec58ffdd8\": rpc error: code = NotFound desc = could not find container \"e80bb1dda50866d20650a784c76bff1b62612577687259889f9f0c1ec58ffdd8\": container with ID starting with e80bb1dda50866d20650a784c76bff1b62612577687259889f9f0c1ec58ffdd8 not found: ID does not exist" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.379793 5129 scope.go:117] "RemoveContainer" containerID="588221254b003d105919c7f82445142cafd963c150049d826976ece79e42ba54" Mar 14 09:04:50 crc kubenswrapper[5129]: E0314 09:04:50.380222 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"588221254b003d105919c7f82445142cafd963c150049d826976ece79e42ba54\": container with ID starting with 588221254b003d105919c7f82445142cafd963c150049d826976ece79e42ba54 not found: ID does not exist" containerID="588221254b003d105919c7f82445142cafd963c150049d826976ece79e42ba54" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.380263 
5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"588221254b003d105919c7f82445142cafd963c150049d826976ece79e42ba54"} err="failed to get container status \"588221254b003d105919c7f82445142cafd963c150049d826976ece79e42ba54\": rpc error: code = NotFound desc = could not find container \"588221254b003d105919c7f82445142cafd963c150049d826976ece79e42ba54\": container with ID starting with 588221254b003d105919c7f82445142cafd963c150049d826976ece79e42ba54 not found: ID does not exist" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.436851 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b61cf66-0088-4e61-b1a7-54816b9db5c8-logs\") pod \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\" (UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.437086 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf4pd\" (UniqueName: \"kubernetes.io/projected/5b61cf66-0088-4e61-b1a7-54816b9db5c8-kube-api-access-pf4pd\") pod \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\" (UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.437139 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b61cf66-0088-4e61-b1a7-54816b9db5c8-combined-ca-bundle\") pod \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\" (UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.437173 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b61cf66-0088-4e61-b1a7-54816b9db5c8-config-data\") pod \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\" (UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.437256 
5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b61cf66-0088-4e61-b1a7-54816b9db5c8-nova-metadata-tls-certs\") pod \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\" (UID: \"5b61cf66-0088-4e61-b1a7-54816b9db5c8\") " Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.437782 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22976fae-93cc-4036-8595-8d793ff0d00a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.437805 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22976fae-93cc-4036-8595-8d793ff0d00a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.437818 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22976fae-93cc-4036-8595-8d793ff0d00a-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.437830 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z6cp\" (UniqueName: \"kubernetes.io/projected/22976fae-93cc-4036-8595-8d793ff0d00a-kube-api-access-7z6cp\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.439313 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b61cf66-0088-4e61-b1a7-54816b9db5c8-logs" (OuterVolumeSpecName: "logs") pod "5b61cf66-0088-4e61-b1a7-54816b9db5c8" (UID: "5b61cf66-0088-4e61-b1a7-54816b9db5c8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.441274 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b61cf66-0088-4e61-b1a7-54816b9db5c8-kube-api-access-pf4pd" (OuterVolumeSpecName: "kube-api-access-pf4pd") pod "5b61cf66-0088-4e61-b1a7-54816b9db5c8" (UID: "5b61cf66-0088-4e61-b1a7-54816b9db5c8"). InnerVolumeSpecName "kube-api-access-pf4pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.461534 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b61cf66-0088-4e61-b1a7-54816b9db5c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b61cf66-0088-4e61-b1a7-54816b9db5c8" (UID: "5b61cf66-0088-4e61-b1a7-54816b9db5c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.463642 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b61cf66-0088-4e61-b1a7-54816b9db5c8-config-data" (OuterVolumeSpecName: "config-data") pod "5b61cf66-0088-4e61-b1a7-54816b9db5c8" (UID: "5b61cf66-0088-4e61-b1a7-54816b9db5c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.482496 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b61cf66-0088-4e61-b1a7-54816b9db5c8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5b61cf66-0088-4e61-b1a7-54816b9db5c8" (UID: "5b61cf66-0088-4e61-b1a7-54816b9db5c8"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.538956 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf4pd\" (UniqueName: \"kubernetes.io/projected/5b61cf66-0088-4e61-b1a7-54816b9db5c8-kube-api-access-pf4pd\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.538995 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b61cf66-0088-4e61-b1a7-54816b9db5c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.539007 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b61cf66-0088-4e61-b1a7-54816b9db5c8-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.539016 5129 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b61cf66-0088-4e61-b1a7-54816b9db5c8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.539024 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b61cf66-0088-4e61-b1a7-54816b9db5c8-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.621269 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.630319 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.649570 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 14 09:04:50 crc kubenswrapper[5129]: E0314 09:04:50.650063 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" containerName="nova-metadata-log" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.650090 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" containerName="nova-metadata-log" Mar 14 09:04:50 crc kubenswrapper[5129]: E0314 09:04:50.650118 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" containerName="nova-metadata-metadata" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.650127 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" containerName="nova-metadata-metadata" Mar 14 09:04:50 crc kubenswrapper[5129]: E0314 09:04:50.650144 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5ebad0-4050-4467-918f-18b373bd269a" containerName="nova-manage" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.650154 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5ebad0-4050-4467-918f-18b373bd269a" containerName="nova-manage" Mar 14 09:04:50 crc kubenswrapper[5129]: E0314 09:04:50.650175 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f98c195-d11b-4583-bdc6-5708e9e72c03" containerName="swift-ring-rebalance" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.650184 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f98c195-d11b-4583-bdc6-5708e9e72c03" containerName="swift-ring-rebalance" Mar 14 09:04:50 crc kubenswrapper[5129]: E0314 09:04:50.650194 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22976fae-93cc-4036-8595-8d793ff0d00a" containerName="nova-api-api" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.650203 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="22976fae-93cc-4036-8595-8d793ff0d00a" containerName="nova-api-api" Mar 14 09:04:50 crc kubenswrapper[5129]: E0314 09:04:50.650218 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="22976fae-93cc-4036-8595-8d793ff0d00a" containerName="nova-api-log" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.650227 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="22976fae-93cc-4036-8595-8d793ff0d00a" containerName="nova-api-log" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.650642 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf5ebad0-4050-4467-918f-18b373bd269a" containerName="nova-manage" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.650661 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" containerName="nova-metadata-log" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.650684 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" containerName="nova-metadata-metadata" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.650701 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="22976fae-93cc-4036-8595-8d793ff0d00a" containerName="nova-api-log" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.650715 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f98c195-d11b-4583-bdc6-5708e9e72c03" containerName="swift-ring-rebalance" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.650724 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="22976fae-93cc-4036-8595-8d793ff0d00a" containerName="nova-api-api" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.652062 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.657322 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.675594 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.741916 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d483bc2c-233b-4127-bd2e-6c69a25977db-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d483bc2c-233b-4127-bd2e-6c69a25977db\") " pod="openstack/nova-api-0" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.741958 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l6bl\" (UniqueName: \"kubernetes.io/projected/d483bc2c-233b-4127-bd2e-6c69a25977db-kube-api-access-8l6bl\") pod \"nova-api-0\" (UID: \"d483bc2c-233b-4127-bd2e-6c69a25977db\") " pod="openstack/nova-api-0" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.742000 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d483bc2c-233b-4127-bd2e-6c69a25977db-config-data\") pod \"nova-api-0\" (UID: \"d483bc2c-233b-4127-bd2e-6c69a25977db\") " pod="openstack/nova-api-0" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.742016 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d483bc2c-233b-4127-bd2e-6c69a25977db-logs\") pod \"nova-api-0\" (UID: \"d483bc2c-233b-4127-bd2e-6c69a25977db\") " pod="openstack/nova-api-0" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.843357 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d483bc2c-233b-4127-bd2e-6c69a25977db-config-data\") pod \"nova-api-0\" (UID: \"d483bc2c-233b-4127-bd2e-6c69a25977db\") " pod="openstack/nova-api-0" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.843399 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d483bc2c-233b-4127-bd2e-6c69a25977db-logs\") pod \"nova-api-0\" (UID: \"d483bc2c-233b-4127-bd2e-6c69a25977db\") " pod="openstack/nova-api-0" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.843525 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d483bc2c-233b-4127-bd2e-6c69a25977db-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d483bc2c-233b-4127-bd2e-6c69a25977db\") " pod="openstack/nova-api-0" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.843548 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l6bl\" (UniqueName: \"kubernetes.io/projected/d483bc2c-233b-4127-bd2e-6c69a25977db-kube-api-access-8l6bl\") pod \"nova-api-0\" (UID: \"d483bc2c-233b-4127-bd2e-6c69a25977db\") " pod="openstack/nova-api-0" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.844205 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d483bc2c-233b-4127-bd2e-6c69a25977db-logs\") pod \"nova-api-0\" (UID: \"d483bc2c-233b-4127-bd2e-6c69a25977db\") " pod="openstack/nova-api-0" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.848329 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d483bc2c-233b-4127-bd2e-6c69a25977db-config-data\") pod \"nova-api-0\" (UID: \"d483bc2c-233b-4127-bd2e-6c69a25977db\") " pod="openstack/nova-api-0" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.848864 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d483bc2c-233b-4127-bd2e-6c69a25977db-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d483bc2c-233b-4127-bd2e-6c69a25977db\") " pod="openstack/nova-api-0" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.862059 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l6bl\" (UniqueName: \"kubernetes.io/projected/d483bc2c-233b-4127-bd2e-6c69a25977db-kube-api-access-8l6bl\") pod \"nova-api-0\" (UID: \"d483bc2c-233b-4127-bd2e-6c69a25977db\") " pod="openstack/nova-api-0" Mar 14 09:04:50 crc kubenswrapper[5129]: I0314 09:04:50.972496 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.300115 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b61cf66-0088-4e61-b1a7-54816b9db5c8","Type":"ContainerDied","Data":"0a2893d314ef2a12c9f1517d632de4def7092e1b4dff0ab70e9e58c210616a9a"} Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.300539 5129 scope.go:117] "RemoveContainer" containerID="5953e78670d22d327077632a3d8f1a6ff552120883c447e1f403ab185d4c5f94" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.300443 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.328125 5129 scope.go:117] "RemoveContainer" containerID="7c6a3836a45c8831f6e0a63a19a0beba8404ea9e9b282a30a3a39fd7feb8fa98" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.349424 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.365866 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.379275 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.381095 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.383103 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.384204 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.395454 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.428011 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.562348 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc194f90-df3e-4808-a5a6-8cbff23f04a0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " pod="openstack/nova-metadata-0" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.562400 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc194f90-df3e-4808-a5a6-8cbff23f04a0-logs\") pod \"nova-metadata-0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " pod="openstack/nova-metadata-0" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.562426 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc194f90-df3e-4808-a5a6-8cbff23f04a0-config-data\") pod \"nova-metadata-0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " pod="openstack/nova-metadata-0" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.562486 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc194f90-df3e-4808-a5a6-8cbff23f04a0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " pod="openstack/nova-metadata-0" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.562536 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dtbt\" (UniqueName: \"kubernetes.io/projected/fc194f90-df3e-4808-a5a6-8cbff23f04a0-kube-api-access-6dtbt\") pod \"nova-metadata-0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " pod="openstack/nova-metadata-0" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.663761 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc194f90-df3e-4808-a5a6-8cbff23f04a0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " pod="openstack/nova-metadata-0" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.664032 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fc194f90-df3e-4808-a5a6-8cbff23f04a0-logs\") pod \"nova-metadata-0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " pod="openstack/nova-metadata-0" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.664056 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc194f90-df3e-4808-a5a6-8cbff23f04a0-config-data\") pod \"nova-metadata-0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " pod="openstack/nova-metadata-0" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.664218 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc194f90-df3e-4808-a5a6-8cbff23f04a0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " pod="openstack/nova-metadata-0" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.664381 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dtbt\" (UniqueName: \"kubernetes.io/projected/fc194f90-df3e-4808-a5a6-8cbff23f04a0-kube-api-access-6dtbt\") pod \"nova-metadata-0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " pod="openstack/nova-metadata-0" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.665148 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc194f90-df3e-4808-a5a6-8cbff23f04a0-logs\") pod \"nova-metadata-0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " pod="openstack/nova-metadata-0" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.667936 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc194f90-df3e-4808-a5a6-8cbff23f04a0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " pod="openstack/nova-metadata-0" Mar 14 
09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.668066 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc194f90-df3e-4808-a5a6-8cbff23f04a0-config-data\") pod \"nova-metadata-0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " pod="openstack/nova-metadata-0" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.669515 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc194f90-df3e-4808-a5a6-8cbff23f04a0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " pod="openstack/nova-metadata-0" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.682899 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dtbt\" (UniqueName: \"kubernetes.io/projected/fc194f90-df3e-4808-a5a6-8cbff23f04a0-kube-api-access-6dtbt\") pod \"nova-metadata-0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " pod="openstack/nova-metadata-0" Mar 14 09:04:51 crc kubenswrapper[5129]: I0314 09:04:51.698228 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 09:04:52 crc kubenswrapper[5129]: I0314 09:04:52.057835 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22976fae-93cc-4036-8595-8d793ff0d00a" path="/var/lib/kubelet/pods/22976fae-93cc-4036-8595-8d793ff0d00a/volumes" Mar 14 09:04:52 crc kubenswrapper[5129]: I0314 09:04:52.059320 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b61cf66-0088-4e61-b1a7-54816b9db5c8" path="/var/lib/kubelet/pods/5b61cf66-0088-4e61-b1a7-54816b9db5c8/volumes" Mar 14 09:04:52 crc kubenswrapper[5129]: I0314 09:04:52.162129 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:04:52 crc kubenswrapper[5129]: W0314 09:04:52.173939 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc194f90_df3e_4808_a5a6_8cbff23f04a0.slice/crio-f8feb0337bea051e42aafdc8ef7645d42f7f95b44c47e1b553956263e88a3a80 WatchSource:0}: Error finding container f8feb0337bea051e42aafdc8ef7645d42f7f95b44c47e1b553956263e88a3a80: Status 404 returned error can't find the container with id f8feb0337bea051e42aafdc8ef7645d42f7f95b44c47e1b553956263e88a3a80 Mar 14 09:04:52 crc kubenswrapper[5129]: I0314 09:04:52.324832 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc194f90-df3e-4808-a5a6-8cbff23f04a0","Type":"ContainerStarted","Data":"f8feb0337bea051e42aafdc8ef7645d42f7f95b44c47e1b553956263e88a3a80"} Mar 14 09:04:52 crc kubenswrapper[5129]: I0314 09:04:52.328631 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d483bc2c-233b-4127-bd2e-6c69a25977db","Type":"ContainerStarted","Data":"06bfcbb1b21c0e7cc2cb4f726c9885f848ad40fb91738027c37dfc68aea2c247"} Mar 14 09:04:52 crc kubenswrapper[5129]: I0314 09:04:52.328729 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d483bc2c-233b-4127-bd2e-6c69a25977db","Type":"ContainerStarted","Data":"14547c674cf5c7ffd7675e3d17a20c1a79af76ad10e64c23bad523bf78adaeea"} Mar 14 09:04:52 crc kubenswrapper[5129]: I0314 09:04:52.328745 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d483bc2c-233b-4127-bd2e-6c69a25977db","Type":"ContainerStarted","Data":"9a4b1f2194ced21585467135455613cd00a9a94953f9bce58979fe19177ff593"} Mar 14 09:04:52 crc kubenswrapper[5129]: I0314 09:04:52.380692 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.380672186 podStartE2EDuration="2.380672186s" podCreationTimestamp="2026-03-14 09:04:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:04:52.368098144 +0000 UTC m=+7555.120013328" watchObservedRunningTime="2026-03-14 09:04:52.380672186 +0000 UTC m=+7555.132587390" Mar 14 09:04:53 crc kubenswrapper[5129]: I0314 09:04:53.345517 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc194f90-df3e-4808-a5a6-8cbff23f04a0","Type":"ContainerStarted","Data":"180360d91d5ab64db48f11db1ed5a475e0f0bcbb71ddadbcad80fb051c73ecb0"} Mar 14 09:04:53 crc kubenswrapper[5129]: I0314 09:04:53.345810 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc194f90-df3e-4808-a5a6-8cbff23f04a0","Type":"ContainerStarted","Data":"4bf71143b0559213036068f1f3d659044bcc0bf54ddd158b895ec54f9bae65f0"} Mar 14 09:04:53 crc kubenswrapper[5129]: I0314 09:04:53.370383 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.370362363 podStartE2EDuration="2.370362363s" podCreationTimestamp="2026-03-14 09:04:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-14 09:04:53.367043523 +0000 UTC m=+7556.118958707" watchObservedRunningTime="2026-03-14 09:04:53.370362363 +0000 UTC m=+7556.122277547" Mar 14 09:04:54 crc kubenswrapper[5129]: E0314 09:04:54.273053 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 09:04:54 crc kubenswrapper[5129]: E0314 09:04:54.275276 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 09:04:54 crc kubenswrapper[5129]: E0314 09:04:54.276436 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 09:04:54 crc kubenswrapper[5129]: E0314 09:04:54.276478 5129 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0becf0c4-d0bc-40b1-9677-1024fcb4a525" containerName="nova-scheduler-scheduler" Mar 14 09:04:59 crc kubenswrapper[5129]: E0314 09:04:59.273728 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 09:04:59 crc kubenswrapper[5129]: E0314 09:04:59.277034 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 09:04:59 crc kubenswrapper[5129]: E0314 09:04:59.278686 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 09:04:59 crc kubenswrapper[5129]: E0314 09:04:59.278728 5129 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0becf0c4-d0bc-40b1-9677-1024fcb4a525" containerName="nova-scheduler-scheduler" Mar 14 09:05:00 crc kubenswrapper[5129]: I0314 09:05:00.973069 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 09:05:00 crc kubenswrapper[5129]: I0314 09:05:00.973130 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 09:05:01 crc kubenswrapper[5129]: I0314 09:05:01.699170 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 14 09:05:01 crc kubenswrapper[5129]: I0314 09:05:01.699842 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 14 09:05:02 crc 
kubenswrapper[5129]: I0314 09:05:02.015079 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d483bc2c-233b-4127-bd2e-6c69a25977db" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.151:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 09:05:02 crc kubenswrapper[5129]: I0314 09:05:02.057105 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d483bc2c-233b-4127-bd2e-6c69a25977db" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.151:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 09:05:02 crc kubenswrapper[5129]: I0314 09:05:02.719112 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fc194f90-df3e-4808-a5a6-8cbff23f04a0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.152:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 09:05:02 crc kubenswrapper[5129]: I0314 09:05:02.719572 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fc194f90-df3e-4808-a5a6-8cbff23f04a0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.152:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 09:05:04 crc kubenswrapper[5129]: E0314 09:05:04.273974 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 09:05:04 crc kubenswrapper[5129]: E0314 09:05:04.276403 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown 
desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 09:05:04 crc kubenswrapper[5129]: E0314 09:05:04.278649 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 09:05:04 crc kubenswrapper[5129]: E0314 09:05:04.278773 5129 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0becf0c4-d0bc-40b1-9677-1024fcb4a525" containerName="nova-scheduler-scheduler" Mar 14 09:05:06 crc kubenswrapper[5129]: I0314 09:05:06.465451 5129 generic.go:334] "Generic (PLEG): container finished" podID="0becf0c4-d0bc-40b1-9677-1024fcb4a525" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" exitCode=137 Mar 14 09:05:06 crc kubenswrapper[5129]: I0314 09:05:06.465531 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0becf0c4-d0bc-40b1-9677-1024fcb4a525","Type":"ContainerDied","Data":"47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef"} Mar 14 09:05:06 crc kubenswrapper[5129]: I0314 09:05:06.962437 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.120353 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0becf0c4-d0bc-40b1-9677-1024fcb4a525-config-data\") pod \"0becf0c4-d0bc-40b1-9677-1024fcb4a525\" (UID: \"0becf0c4-d0bc-40b1-9677-1024fcb4a525\") " Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.120487 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4qrt\" (UniqueName: \"kubernetes.io/projected/0becf0c4-d0bc-40b1-9677-1024fcb4a525-kube-api-access-n4qrt\") pod \"0becf0c4-d0bc-40b1-9677-1024fcb4a525\" (UID: \"0becf0c4-d0bc-40b1-9677-1024fcb4a525\") " Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.120545 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0becf0c4-d0bc-40b1-9677-1024fcb4a525-combined-ca-bundle\") pod \"0becf0c4-d0bc-40b1-9677-1024fcb4a525\" (UID: \"0becf0c4-d0bc-40b1-9677-1024fcb4a525\") " Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.127731 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0becf0c4-d0bc-40b1-9677-1024fcb4a525-kube-api-access-n4qrt" (OuterVolumeSpecName: "kube-api-access-n4qrt") pod "0becf0c4-d0bc-40b1-9677-1024fcb4a525" (UID: "0becf0c4-d0bc-40b1-9677-1024fcb4a525"). InnerVolumeSpecName "kube-api-access-n4qrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.149431 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0becf0c4-d0bc-40b1-9677-1024fcb4a525-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0becf0c4-d0bc-40b1-9677-1024fcb4a525" (UID: "0becf0c4-d0bc-40b1-9677-1024fcb4a525"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.151833 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0becf0c4-d0bc-40b1-9677-1024fcb4a525-config-data" (OuterVolumeSpecName: "config-data") pod "0becf0c4-d0bc-40b1-9677-1024fcb4a525" (UID: "0becf0c4-d0bc-40b1-9677-1024fcb4a525"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.222721 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0becf0c4-d0bc-40b1-9677-1024fcb4a525-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.222751 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4qrt\" (UniqueName: \"kubernetes.io/projected/0becf0c4-d0bc-40b1-9677-1024fcb4a525-kube-api-access-n4qrt\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.222762 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0becf0c4-d0bc-40b1-9677-1024fcb4a525-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.475649 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0becf0c4-d0bc-40b1-9677-1024fcb4a525","Type":"ContainerDied","Data":"66bfc1dbd08db6eeee0a4d795bbda570b6f19d3d641ff14a9b183175025bf3d6"} Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.475716 5129 scope.go:117] "RemoveContainer" containerID="47872ff8520ce6defa5f1a6a87284c44e7388ddb5cc3a0374c0094cb65696cef" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.475747 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.511047 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.520241 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.535580 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 09:05:07 crc kubenswrapper[5129]: E0314 09:05:07.536111 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0becf0c4-d0bc-40b1-9677-1024fcb4a525" containerName="nova-scheduler-scheduler" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.536138 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="0becf0c4-d0bc-40b1-9677-1024fcb4a525" containerName="nova-scheduler-scheduler" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.536431 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="0becf0c4-d0bc-40b1-9677-1024fcb4a525" containerName="nova-scheduler-scheduler" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.537235 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.539505 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.566948 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.630231 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5\") " pod="openstack/nova-scheduler-0" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.630378 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc489\" (UniqueName: \"kubernetes.io/projected/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5-kube-api-access-mc489\") pod \"nova-scheduler-0\" (UID: \"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5\") " pod="openstack/nova-scheduler-0" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.630408 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5-config-data\") pod \"nova-scheduler-0\" (UID: \"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5\") " pod="openstack/nova-scheduler-0" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.732297 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5\") " pod="openstack/nova-scheduler-0" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.732443 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc489\" (UniqueName: \"kubernetes.io/projected/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5-kube-api-access-mc489\") pod \"nova-scheduler-0\" (UID: \"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5\") " pod="openstack/nova-scheduler-0" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.732473 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5-config-data\") pod \"nova-scheduler-0\" (UID: \"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5\") " pod="openstack/nova-scheduler-0" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.735917 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5\") " pod="openstack/nova-scheduler-0" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.736021 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5-config-data\") pod \"nova-scheduler-0\" (UID: \"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5\") " pod="openstack/nova-scheduler-0" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.749663 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc489\" (UniqueName: \"kubernetes.io/projected/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5-kube-api-access-mc489\") pod \"nova-scheduler-0\" (UID: \"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5\") " pod="openstack/nova-scheduler-0" Mar 14 09:05:07 crc kubenswrapper[5129]: I0314 09:05:07.857622 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 09:05:08 crc kubenswrapper[5129]: I0314 09:05:08.061129 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0becf0c4-d0bc-40b1-9677-1024fcb4a525" path="/var/lib/kubelet/pods/0becf0c4-d0bc-40b1-9677-1024fcb4a525/volumes" Mar 14 09:05:08 crc kubenswrapper[5129]: I0314 09:05:08.311966 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 09:05:08 crc kubenswrapper[5129]: W0314 09:05:08.325638 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03ba9afa_3daa_45ae_849c_ebfa4cbc08a5.slice/crio-6212e83dc95f430abcb73e89c4a4c82a0dd7dc29612f3c0a1ed0d1008b793d1f WatchSource:0}: Error finding container 6212e83dc95f430abcb73e89c4a4c82a0dd7dc29612f3c0a1ed0d1008b793d1f: Status 404 returned error can't find the container with id 6212e83dc95f430abcb73e89c4a4c82a0dd7dc29612f3c0a1ed0d1008b793d1f Mar 14 09:05:08 crc kubenswrapper[5129]: I0314 09:05:08.492400 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5","Type":"ContainerStarted","Data":"6212e83dc95f430abcb73e89c4a4c82a0dd7dc29612f3c0a1ed0d1008b793d1f"} Mar 14 09:05:08 crc kubenswrapper[5129]: I0314 09:05:08.973279 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 09:05:08 crc kubenswrapper[5129]: I0314 09:05:08.973867 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 09:05:09 crc kubenswrapper[5129]: I0314 09:05:09.501923 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5","Type":"ContainerStarted","Data":"472634747f52059da3d761b3a3bdd41094664ee0ace37406a9752c847b14d94e"} Mar 14 09:05:09 crc kubenswrapper[5129]: I0314 09:05:09.523994 5129 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.523978651 podStartE2EDuration="2.523978651s" podCreationTimestamp="2026-03-14 09:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:05:09.522037639 +0000 UTC m=+7572.273952823" watchObservedRunningTime="2026-03-14 09:05:09.523978651 +0000 UTC m=+7572.275893825" Mar 14 09:05:09 crc kubenswrapper[5129]: I0314 09:05:09.699417 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 09:05:09 crc kubenswrapper[5129]: I0314 09:05:09.699479 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 09:05:10 crc kubenswrapper[5129]: I0314 09:05:10.977974 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 09:05:10 crc kubenswrapper[5129]: I0314 09:05:10.978701 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 09:05:10 crc kubenswrapper[5129]: I0314 09:05:10.982921 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 09:05:11 crc kubenswrapper[5129]: I0314 09:05:11.525441 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 09:05:11 crc kubenswrapper[5129]: I0314 09:05:11.716086 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 14 09:05:11 crc kubenswrapper[5129]: I0314 09:05:11.717091 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 14 09:05:11 crc kubenswrapper[5129]: I0314 09:05:11.723863 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bdb48dc75-jmswb"] Mar 14 09:05:11 crc 
kubenswrapper[5129]: I0314 09:05:11.739359 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 14 09:05:11 crc kubenswrapper[5129]: I0314 09:05:11.739486 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:05:11 crc kubenswrapper[5129]: I0314 09:05:11.753764 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bdb48dc75-jmswb"] Mar 14 09:05:11 crc kubenswrapper[5129]: I0314 09:05:11.920667 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-config\") pod \"dnsmasq-dns-6bdb48dc75-jmswb\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:05:11 crc kubenswrapper[5129]: I0314 09:05:11.921095 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-ovsdbserver-nb\") pod \"dnsmasq-dns-6bdb48dc75-jmswb\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:05:11 crc kubenswrapper[5129]: I0314 09:05:11.921138 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-ovsdbserver-sb\") pod \"dnsmasq-dns-6bdb48dc75-jmswb\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:05:11 crc kubenswrapper[5129]: I0314 09:05:11.921186 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-dns-svc\") pod 
\"dnsmasq-dns-6bdb48dc75-jmswb\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:05:11 crc kubenswrapper[5129]: I0314 09:05:11.921255 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx5xr\" (UniqueName: \"kubernetes.io/projected/40bec282-5c17-4ec5-ac05-5608aedeae3e-kube-api-access-hx5xr\") pod \"dnsmasq-dns-6bdb48dc75-jmswb\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:05:12 crc kubenswrapper[5129]: I0314 09:05:12.023172 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx5xr\" (UniqueName: \"kubernetes.io/projected/40bec282-5c17-4ec5-ac05-5608aedeae3e-kube-api-access-hx5xr\") pod \"dnsmasq-dns-6bdb48dc75-jmswb\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:05:12 crc kubenswrapper[5129]: I0314 09:05:12.023259 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-config\") pod \"dnsmasq-dns-6bdb48dc75-jmswb\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:05:12 crc kubenswrapper[5129]: I0314 09:05:12.023322 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-ovsdbserver-nb\") pod \"dnsmasq-dns-6bdb48dc75-jmswb\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:05:12 crc kubenswrapper[5129]: I0314 09:05:12.023352 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6bdb48dc75-jmswb\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:05:12 crc kubenswrapper[5129]: I0314 09:05:12.023427 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-dns-svc\") pod \"dnsmasq-dns-6bdb48dc75-jmswb\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:05:12 crc kubenswrapper[5129]: I0314 09:05:12.024290 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-dns-svc\") pod \"dnsmasq-dns-6bdb48dc75-jmswb\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:05:12 crc kubenswrapper[5129]: I0314 09:05:12.024434 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-ovsdbserver-nb\") pod \"dnsmasq-dns-6bdb48dc75-jmswb\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:05:12 crc kubenswrapper[5129]: I0314 09:05:12.024682 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-ovsdbserver-sb\") pod \"dnsmasq-dns-6bdb48dc75-jmswb\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:05:12 crc kubenswrapper[5129]: I0314 09:05:12.025510 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-config\") pod \"dnsmasq-dns-6bdb48dc75-jmswb\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" 
Mar 14 09:05:12 crc kubenswrapper[5129]: I0314 09:05:12.047594 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx5xr\" (UniqueName: \"kubernetes.io/projected/40bec282-5c17-4ec5-ac05-5608aedeae3e-kube-api-access-hx5xr\") pod \"dnsmasq-dns-6bdb48dc75-jmswb\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:05:12 crc kubenswrapper[5129]: I0314 09:05:12.095744 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:05:12 crc kubenswrapper[5129]: I0314 09:05:12.612010 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 14 09:05:12 crc kubenswrapper[5129]: W0314 09:05:12.650629 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40bec282_5c17_4ec5_ac05_5608aedeae3e.slice/crio-de219b65384cc862895ba55ddaca50a24b1ccda11bef1390261ac25cb0729d5e WatchSource:0}: Error finding container de219b65384cc862895ba55ddaca50a24b1ccda11bef1390261ac25cb0729d5e: Status 404 returned error can't find the container with id de219b65384cc862895ba55ddaca50a24b1ccda11bef1390261ac25cb0729d5e Mar 14 09:05:12 crc kubenswrapper[5129]: I0314 09:05:12.655848 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bdb48dc75-jmswb"] Mar 14 09:05:12 crc kubenswrapper[5129]: I0314 09:05:12.859771 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 14 09:05:13 crc kubenswrapper[5129]: I0314 09:05:13.555048 5129 generic.go:334] "Generic (PLEG): container finished" podID="40bec282-5c17-4ec5-ac05-5608aedeae3e" containerID="956a74101a590321be54cab6284ccecb66ec4a415bdcaa9db841d48e94fe85af" exitCode=0 Mar 14 09:05:13 crc kubenswrapper[5129]: I0314 09:05:13.557158 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" event={"ID":"40bec282-5c17-4ec5-ac05-5608aedeae3e","Type":"ContainerDied","Data":"956a74101a590321be54cab6284ccecb66ec4a415bdcaa9db841d48e94fe85af"} Mar 14 09:05:13 crc kubenswrapper[5129]: I0314 09:05:13.557275 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" event={"ID":"40bec282-5c17-4ec5-ac05-5608aedeae3e","Type":"ContainerStarted","Data":"de219b65384cc862895ba55ddaca50a24b1ccda11bef1390261ac25cb0729d5e"} Mar 14 09:05:14 crc kubenswrapper[5129]: I0314 09:05:14.590479 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" event={"ID":"40bec282-5c17-4ec5-ac05-5608aedeae3e","Type":"ContainerStarted","Data":"aa7fe9b1cc9b7709bd9a0a3f1bbacc855c5fb097b3a050f4f193432b2e67111a"} Mar 14 09:05:14 crc kubenswrapper[5129]: I0314 09:05:14.590816 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:05:14 crc kubenswrapper[5129]: I0314 09:05:14.633041 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" podStartSLOduration=3.6330243749999998 podStartE2EDuration="3.633024375s" podCreationTimestamp="2026-03-14 09:05:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:05:14.625063649 +0000 UTC m=+7577.376978833" watchObservedRunningTime="2026-03-14 09:05:14.633024375 +0000 UTC m=+7577.384939559" Mar 14 09:05:17 crc kubenswrapper[5129]: I0314 09:05:17.585447 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 09:05:17 crc kubenswrapper[5129]: I0314 09:05:17.586430 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d483bc2c-233b-4127-bd2e-6c69a25977db" containerName="nova-api-log" 
containerID="cri-o://14547c674cf5c7ffd7675e3d17a20c1a79af76ad10e64c23bad523bf78adaeea" gracePeriod=30 Mar 14 09:05:17 crc kubenswrapper[5129]: I0314 09:05:17.586513 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d483bc2c-233b-4127-bd2e-6c69a25977db" containerName="nova-api-api" containerID="cri-o://06bfcbb1b21c0e7cc2cb4f726c9885f848ad40fb91738027c37dfc68aea2c247" gracePeriod=30 Mar 14 09:05:17 crc kubenswrapper[5129]: I0314 09:05:17.858505 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 14 09:05:17 crc kubenswrapper[5129]: I0314 09:05:17.897894 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 14 09:05:18 crc kubenswrapper[5129]: I0314 09:05:18.633026 5129 generic.go:334] "Generic (PLEG): container finished" podID="d483bc2c-233b-4127-bd2e-6c69a25977db" containerID="14547c674cf5c7ffd7675e3d17a20c1a79af76ad10e64c23bad523bf78adaeea" exitCode=143 Mar 14 09:05:18 crc kubenswrapper[5129]: I0314 09:05:18.633150 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d483bc2c-233b-4127-bd2e-6c69a25977db","Type":"ContainerDied","Data":"14547c674cf5c7ffd7675e3d17a20c1a79af76ad10e64c23bad523bf78adaeea"} Mar 14 09:05:18 crc kubenswrapper[5129]: I0314 09:05:18.668405 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 14 09:05:19 crc kubenswrapper[5129]: I0314 09:05:19.574516 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:05:19 crc kubenswrapper[5129]: I0314 09:05:19.574580 5129 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.256336 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.307887 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d483bc2c-233b-4127-bd2e-6c69a25977db-config-data\") pod \"d483bc2c-233b-4127-bd2e-6c69a25977db\" (UID: \"d483bc2c-233b-4127-bd2e-6c69a25977db\") " Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.307935 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l6bl\" (UniqueName: \"kubernetes.io/projected/d483bc2c-233b-4127-bd2e-6c69a25977db-kube-api-access-8l6bl\") pod \"d483bc2c-233b-4127-bd2e-6c69a25977db\" (UID: \"d483bc2c-233b-4127-bd2e-6c69a25977db\") " Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.308073 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d483bc2c-233b-4127-bd2e-6c69a25977db-logs\") pod \"d483bc2c-233b-4127-bd2e-6c69a25977db\" (UID: \"d483bc2c-233b-4127-bd2e-6c69a25977db\") " Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.308196 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d483bc2c-233b-4127-bd2e-6c69a25977db-combined-ca-bundle\") pod \"d483bc2c-233b-4127-bd2e-6c69a25977db\" (UID: \"d483bc2c-233b-4127-bd2e-6c69a25977db\") " Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.311047 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d483bc2c-233b-4127-bd2e-6c69a25977db-logs" (OuterVolumeSpecName: "logs") pod "d483bc2c-233b-4127-bd2e-6c69a25977db" (UID: "d483bc2c-233b-4127-bd2e-6c69a25977db"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.316330 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d483bc2c-233b-4127-bd2e-6c69a25977db-kube-api-access-8l6bl" (OuterVolumeSpecName: "kube-api-access-8l6bl") pod "d483bc2c-233b-4127-bd2e-6c69a25977db" (UID: "d483bc2c-233b-4127-bd2e-6c69a25977db"). InnerVolumeSpecName "kube-api-access-8l6bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.340311 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d483bc2c-233b-4127-bd2e-6c69a25977db-config-data" (OuterVolumeSpecName: "config-data") pod "d483bc2c-233b-4127-bd2e-6c69a25977db" (UID: "d483bc2c-233b-4127-bd2e-6c69a25977db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.340319 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d483bc2c-233b-4127-bd2e-6c69a25977db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d483bc2c-233b-4127-bd2e-6c69a25977db" (UID: "d483bc2c-233b-4127-bd2e-6c69a25977db"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.410458 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d483bc2c-233b-4127-bd2e-6c69a25977db-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.410506 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d483bc2c-233b-4127-bd2e-6c69a25977db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.410520 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d483bc2c-233b-4127-bd2e-6c69a25977db-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.410534 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l6bl\" (UniqueName: \"kubernetes.io/projected/d483bc2c-233b-4127-bd2e-6c69a25977db-kube-api-access-8l6bl\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.665690 5129 generic.go:334] "Generic (PLEG): container finished" podID="d483bc2c-233b-4127-bd2e-6c69a25977db" containerID="06bfcbb1b21c0e7cc2cb4f726c9885f848ad40fb91738027c37dfc68aea2c247" exitCode=0 Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.665993 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d483bc2c-233b-4127-bd2e-6c69a25977db","Type":"ContainerDied","Data":"06bfcbb1b21c0e7cc2cb4f726c9885f848ad40fb91738027c37dfc68aea2c247"} Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.666102 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d483bc2c-233b-4127-bd2e-6c69a25977db","Type":"ContainerDied","Data":"9a4b1f2194ced21585467135455613cd00a9a94953f9bce58979fe19177ff593"} Mar 14 09:05:21 crc kubenswrapper[5129]: 
I0314 09:05:21.666190 5129 scope.go:117] "RemoveContainer" containerID="06bfcbb1b21c0e7cc2cb4f726c9885f848ad40fb91738027c37dfc68aea2c247" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.666391 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.748279 5129 scope.go:117] "RemoveContainer" containerID="14547c674cf5c7ffd7675e3d17a20c1a79af76ad10e64c23bad523bf78adaeea" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.774124 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.787778 5129 scope.go:117] "RemoveContainer" containerID="06bfcbb1b21c0e7cc2cb4f726c9885f848ad40fb91738027c37dfc68aea2c247" Mar 14 09:05:21 crc kubenswrapper[5129]: E0314 09:05:21.788325 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06bfcbb1b21c0e7cc2cb4f726c9885f848ad40fb91738027c37dfc68aea2c247\": container with ID starting with 06bfcbb1b21c0e7cc2cb4f726c9885f848ad40fb91738027c37dfc68aea2c247 not found: ID does not exist" containerID="06bfcbb1b21c0e7cc2cb4f726c9885f848ad40fb91738027c37dfc68aea2c247" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.788362 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06bfcbb1b21c0e7cc2cb4f726c9885f848ad40fb91738027c37dfc68aea2c247"} err="failed to get container status \"06bfcbb1b21c0e7cc2cb4f726c9885f848ad40fb91738027c37dfc68aea2c247\": rpc error: code = NotFound desc = could not find container \"06bfcbb1b21c0e7cc2cb4f726c9885f848ad40fb91738027c37dfc68aea2c247\": container with ID starting with 06bfcbb1b21c0e7cc2cb4f726c9885f848ad40fb91738027c37dfc68aea2c247 not found: ID does not exist" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.788392 5129 scope.go:117] "RemoveContainer" 
containerID="14547c674cf5c7ffd7675e3d17a20c1a79af76ad10e64c23bad523bf78adaeea" Mar 14 09:05:21 crc kubenswrapper[5129]: E0314 09:05:21.788870 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14547c674cf5c7ffd7675e3d17a20c1a79af76ad10e64c23bad523bf78adaeea\": container with ID starting with 14547c674cf5c7ffd7675e3d17a20c1a79af76ad10e64c23bad523bf78adaeea not found: ID does not exist" containerID="14547c674cf5c7ffd7675e3d17a20c1a79af76ad10e64c23bad523bf78adaeea" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.788895 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14547c674cf5c7ffd7675e3d17a20c1a79af76ad10e64c23bad523bf78adaeea"} err="failed to get container status \"14547c674cf5c7ffd7675e3d17a20c1a79af76ad10e64c23bad523bf78adaeea\": rpc error: code = NotFound desc = could not find container \"14547c674cf5c7ffd7675e3d17a20c1a79af76ad10e64c23bad523bf78adaeea\": container with ID starting with 14547c674cf5c7ffd7675e3d17a20c1a79af76ad10e64c23bad523bf78adaeea not found: ID does not exist" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.792863 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.809697 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 14 09:05:21 crc kubenswrapper[5129]: E0314 09:05:21.810572 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d483bc2c-233b-4127-bd2e-6c69a25977db" containerName="nova-api-log" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.810594 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d483bc2c-233b-4127-bd2e-6c69a25977db" containerName="nova-api-log" Mar 14 09:05:21 crc kubenswrapper[5129]: E0314 09:05:21.810638 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d483bc2c-233b-4127-bd2e-6c69a25977db" 
containerName="nova-api-api" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.810646 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d483bc2c-233b-4127-bd2e-6c69a25977db" containerName="nova-api-api" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.810976 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d483bc2c-233b-4127-bd2e-6c69a25977db" containerName="nova-api-api" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.811028 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d483bc2c-233b-4127-bd2e-6c69a25977db" containerName="nova-api-log" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.812710 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.820070 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.820302 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.820942 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.831110 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.952774 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-internal-tls-certs\") pod \"nova-api-0\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " pod="openstack/nova-api-0" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.952834 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " pod="openstack/nova-api-0" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.952901 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-public-tls-certs\") pod \"nova-api-0\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " pod="openstack/nova-api-0" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.952966 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-config-data\") pod \"nova-api-0\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " pod="openstack/nova-api-0" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.952986 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-logs\") pod \"nova-api-0\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " pod="openstack/nova-api-0" Mar 14 09:05:21 crc kubenswrapper[5129]: I0314 09:05:21.953125 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djc67\" (UniqueName: \"kubernetes.io/projected/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-kube-api-access-djc67\") pod \"nova-api-0\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " pod="openstack/nova-api-0" Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.046838 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d483bc2c-233b-4127-bd2e-6c69a25977db" path="/var/lib/kubelet/pods/d483bc2c-233b-4127-bd2e-6c69a25977db/volumes" Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.056125 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djc67\" (UniqueName: \"kubernetes.io/projected/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-kube-api-access-djc67\") pod \"nova-api-0\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " pod="openstack/nova-api-0" Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.056310 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-internal-tls-certs\") pod \"nova-api-0\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " pod="openstack/nova-api-0" Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.056364 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " pod="openstack/nova-api-0" Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.056520 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-public-tls-certs\") pod \"nova-api-0\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " pod="openstack/nova-api-0" Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.056731 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-logs\") pod \"nova-api-0\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " pod="openstack/nova-api-0" Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.056772 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-config-data\") pod \"nova-api-0\" (UID: 
\"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " pod="openstack/nova-api-0" Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.057330 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-logs\") pod \"nova-api-0\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " pod="openstack/nova-api-0" Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.061918 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-internal-tls-certs\") pod \"nova-api-0\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " pod="openstack/nova-api-0" Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.064868 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-public-tls-certs\") pod \"nova-api-0\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " pod="openstack/nova-api-0" Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.064876 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " pod="openstack/nova-api-0" Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.068330 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-config-data\") pod \"nova-api-0\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " pod="openstack/nova-api-0" Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.076262 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djc67\" (UniqueName: 
\"kubernetes.io/projected/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-kube-api-access-djc67\") pod \"nova-api-0\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " pod="openstack/nova-api-0" Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.097738 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.154290 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.171237 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64c7b9766c-5pbn9"] Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.171484 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" podUID="0ecedc81-c8c1-426c-8d77-5281d664ab2a" containerName="dnsmasq-dns" containerID="cri-o://1fbaf02f0b38939f54dde95de534c0e790056fe23ec0f18a57c79d1bfbec3b72" gracePeriod=10 Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.690694 5129 generic.go:334] "Generic (PLEG): container finished" podID="0ecedc81-c8c1-426c-8d77-5281d664ab2a" containerID="1fbaf02f0b38939f54dde95de534c0e790056fe23ec0f18a57c79d1bfbec3b72" exitCode=0 Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.690761 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" event={"ID":"0ecedc81-c8c1-426c-8d77-5281d664ab2a","Type":"ContainerDied","Data":"1fbaf02f0b38939f54dde95de534c0e790056fe23ec0f18a57c79d1bfbec3b72"} Mar 14 09:05:22 crc kubenswrapper[5129]: I0314 09:05:22.998490 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.096759 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-config\") pod \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.097254 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-dns-svc\") pod \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.097709 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47qvs\" (UniqueName: \"kubernetes.io/projected/0ecedc81-c8c1-426c-8d77-5281d664ab2a-kube-api-access-47qvs\") pod \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.097956 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-ovsdbserver-nb\") pod \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.098111 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-ovsdbserver-sb\") pod \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\" (UID: \"0ecedc81-c8c1-426c-8d77-5281d664ab2a\") " Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.109941 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0ecedc81-c8c1-426c-8d77-5281d664ab2a-kube-api-access-47qvs" (OuterVolumeSpecName: "kube-api-access-47qvs") pod "0ecedc81-c8c1-426c-8d77-5281d664ab2a" (UID: "0ecedc81-c8c1-426c-8d77-5281d664ab2a"). InnerVolumeSpecName "kube-api-access-47qvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.155630 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ecedc81-c8c1-426c-8d77-5281d664ab2a" (UID: "0ecedc81-c8c1-426c-8d77-5281d664ab2a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.175121 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0ecedc81-c8c1-426c-8d77-5281d664ab2a" (UID: "0ecedc81-c8c1-426c-8d77-5281d664ab2a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.186156 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-config" (OuterVolumeSpecName: "config") pod "0ecedc81-c8c1-426c-8d77-5281d664ab2a" (UID: "0ecedc81-c8c1-426c-8d77-5281d664ab2a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.186416 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.197421 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0ecedc81-c8c1-426c-8d77-5281d664ab2a" (UID: "0ecedc81-c8c1-426c-8d77-5281d664ab2a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.202313 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.202336 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.202347 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.202357 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ecedc81-c8c1-426c-8d77-5281d664ab2a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.202365 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47qvs\" (UniqueName: \"kubernetes.io/projected/0ecedc81-c8c1-426c-8d77-5281d664ab2a-kube-api-access-47qvs\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:23 crc 
kubenswrapper[5129]: I0314 09:05:23.706296 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" event={"ID":"0ecedc81-c8c1-426c-8d77-5281d664ab2a","Type":"ContainerDied","Data":"c621b34e6db3430ecfd55a8a4b4a1f5c704c7cca71d90838d67d980153384db9"} Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.708486 5129 scope.go:117] "RemoveContainer" containerID="1fbaf02f0b38939f54dde95de534c0e790056fe23ec0f18a57c79d1bfbec3b72" Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.708902 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64c7b9766c-5pbn9" Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.720845 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af1b5d6b-7c27-46cc-932c-cccec1fd8eca","Type":"ContainerStarted","Data":"cbdfb8b0ab3287febbf034c3a49f809061aa20d007aa0cba7d27620d21d4a9b7"} Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.721093 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af1b5d6b-7c27-46cc-932c-cccec1fd8eca","Type":"ContainerStarted","Data":"9e8a4770203afb39f79326205afcb095edbeea6008e03a7fe92d9458e7d894a6"} Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.721193 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af1b5d6b-7c27-46cc-932c-cccec1fd8eca","Type":"ContainerStarted","Data":"c433ca61254fc6f2e629a9a400f35f007ceab10a56bc9983afe3850841ddcdf0"} Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.763860 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.7638405969999997 podStartE2EDuration="2.763840597s" podCreationTimestamp="2026-03-14 09:05:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:05:23.75256637 
+0000 UTC m=+7586.504481564" watchObservedRunningTime="2026-03-14 09:05:23.763840597 +0000 UTC m=+7586.515755781" Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.771416 5129 scope.go:117] "RemoveContainer" containerID="99dc6367703ae6c03abd45aa852b42eb76ee553d29759d6a227a0f9dca0238bb" Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.791388 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64c7b9766c-5pbn9"] Mar 14 09:05:23 crc kubenswrapper[5129]: I0314 09:05:23.798865 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64c7b9766c-5pbn9"] Mar 14 09:05:24 crc kubenswrapper[5129]: I0314 09:05:24.046288 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ecedc81-c8c1-426c-8d77-5281d664ab2a" path="/var/lib/kubelet/pods/0ecedc81-c8c1-426c-8d77-5281d664ab2a/volumes" Mar 14 09:05:29 crc kubenswrapper[5129]: I0314 09:05:29.186981 5129 scope.go:117] "RemoveContainer" containerID="9aaa0f15556aaa444aa66ef937a46622e6b85e36772b38a226cbe5609d296c72" Mar 14 09:05:32 crc kubenswrapper[5129]: I0314 09:05:32.155557 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 09:05:32 crc kubenswrapper[5129]: I0314 09:05:32.159409 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 09:05:33 crc kubenswrapper[5129]: I0314 09:05:33.163899 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="af1b5d6b-7c27-46cc-932c-cccec1fd8eca" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.155:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 09:05:33 crc kubenswrapper[5129]: I0314 09:05:33.168794 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="af1b5d6b-7c27-46cc-932c-cccec1fd8eca" containerName="nova-api-log" probeResult="failure" output="Get 
\"https://10.217.1.155:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 09:05:40 crc kubenswrapper[5129]: I0314 09:05:40.155158 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 09:05:40 crc kubenswrapper[5129]: I0314 09:05:40.155760 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 09:05:42 crc kubenswrapper[5129]: I0314 09:05:42.162954 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 09:05:42 crc kubenswrapper[5129]: I0314 09:05:42.170009 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 09:05:42 crc kubenswrapper[5129]: I0314 09:05:42.180458 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 09:05:42 crc kubenswrapper[5129]: I0314 09:05:42.920425 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 09:05:49 crc kubenswrapper[5129]: I0314 09:05:49.574699 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:05:49 crc kubenswrapper[5129]: I0314 09:05:49.575948 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.122206 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7955df7789-47mhr"] Mar 14 
09:05:53 crc kubenswrapper[5129]: E0314 09:05:53.122970 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ecedc81-c8c1-426c-8d77-5281d664ab2a" containerName="init" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.122984 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ecedc81-c8c1-426c-8d77-5281d664ab2a" containerName="init" Mar 14 09:05:53 crc kubenswrapper[5129]: E0314 09:05:53.123007 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ecedc81-c8c1-426c-8d77-5281d664ab2a" containerName="dnsmasq-dns" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.123013 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ecedc81-c8c1-426c-8d77-5281d664ab2a" containerName="dnsmasq-dns" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.123205 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ecedc81-c8c1-426c-8d77-5281d664ab2a" containerName="dnsmasq-dns" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.124150 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.131110 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.131365 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-zkrnd" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.133508 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.134085 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.139293 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7955df7789-47mhr"] Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.209422 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.209684 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ab9f7b05-3883-45e6-a278-9986e3047ccb" containerName="glance-log" containerID="cri-o://50a4835c6be65acd958bcc98db8f78b72b28015bc8412e95c1959cc664dea41e" gracePeriod=30 Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.210072 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ab9f7b05-3883-45e6-a278-9986e3047ccb" containerName="glance-httpd" containerID="cri-o://2d5d763524ba3a9771e52ad844c84d1f978247e8f3b9ab2e88bfe3140a05d8be" gracePeriod=30 Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.237825 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c75df6b9-7gvq5"] Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.239274 5129 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.264258 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c75df6b9-7gvq5"] Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.285477 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d57e4649-42d3-4992-b04e-8697a98f4148-logs\") pod \"horizon-7955df7789-47mhr\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.285650 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d57e4649-42d3-4992-b04e-8697a98f4148-scripts\") pod \"horizon-7955df7789-47mhr\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.285688 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48rn8\" (UniqueName: \"kubernetes.io/projected/d57e4649-42d3-4992-b04e-8697a98f4148-kube-api-access-48rn8\") pod \"horizon-7955df7789-47mhr\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.285722 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d57e4649-42d3-4992-b04e-8697a98f4148-horizon-secret-key\") pod \"horizon-7955df7789-47mhr\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.285753 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d57e4649-42d3-4992-b04e-8697a98f4148-config-data\") pod \"horizon-7955df7789-47mhr\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.329532 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.330292 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3243555c-242c-4b68-b367-4dc4e3237487" containerName="glance-log" containerID="cri-o://263da2fa39ed26cef4853bb7060d968dda8405bb3146ddb942c8698852b983cb" gracePeriod=30 Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.330516 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3243555c-242c-4b68-b367-4dc4e3237487" containerName="glance-httpd" containerID="cri-o://0ed26f7b2dfedecfdbf3b3dade751cc41f7c1b770f6c1525d44d42c663df6e46" gracePeriod=30 Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.387770 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d57e4649-42d3-4992-b04e-8697a98f4148-logs\") pod \"horizon-7955df7789-47mhr\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.387836 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4km8\" (UniqueName: \"kubernetes.io/projected/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-kube-api-access-f4km8\") pod \"horizon-7c75df6b9-7gvq5\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.387860 5129 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-horizon-secret-key\") pod \"horizon-7c75df6b9-7gvq5\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.387992 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d57e4649-42d3-4992-b04e-8697a98f4148-scripts\") pod \"horizon-7955df7789-47mhr\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.388015 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-logs\") pod \"horizon-7c75df6b9-7gvq5\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.388293 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d57e4649-42d3-4992-b04e-8697a98f4148-logs\") pod \"horizon-7955df7789-47mhr\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.388826 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d57e4649-42d3-4992-b04e-8697a98f4148-scripts\") pod \"horizon-7955df7789-47mhr\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.388980 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48rn8\" (UniqueName: 
\"kubernetes.io/projected/d57e4649-42d3-4992-b04e-8697a98f4148-kube-api-access-48rn8\") pod \"horizon-7955df7789-47mhr\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.389315 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-scripts\") pod \"horizon-7c75df6b9-7gvq5\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.389392 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d57e4649-42d3-4992-b04e-8697a98f4148-horizon-secret-key\") pod \"horizon-7955df7789-47mhr\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.389423 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d57e4649-42d3-4992-b04e-8697a98f4148-config-data\") pod \"horizon-7955df7789-47mhr\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.390659 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d57e4649-42d3-4992-b04e-8697a98f4148-config-data\") pod \"horizon-7955df7789-47mhr\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.390683 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-config-data\") pod 
\"horizon-7c75df6b9-7gvq5\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.399554 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d57e4649-42d3-4992-b04e-8697a98f4148-horizon-secret-key\") pod \"horizon-7955df7789-47mhr\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.408428 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48rn8\" (UniqueName: \"kubernetes.io/projected/d57e4649-42d3-4992-b04e-8697a98f4148-kube-api-access-48rn8\") pod \"horizon-7955df7789-47mhr\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.459344 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.493234 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-logs\") pod \"horizon-7c75df6b9-7gvq5\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.493638 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-scripts\") pod \"horizon-7c75df6b9-7gvq5\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.493690 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-config-data\") pod \"horizon-7c75df6b9-7gvq5\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.494183 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-logs\") pod \"horizon-7c75df6b9-7gvq5\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.494364 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4km8\" (UniqueName: \"kubernetes.io/projected/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-kube-api-access-f4km8\") pod \"horizon-7c75df6b9-7gvq5\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.494402 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-horizon-secret-key\") pod \"horizon-7c75df6b9-7gvq5\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.494403 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-scripts\") pod \"horizon-7c75df6b9-7gvq5\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.495385 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-config-data\") pod \"horizon-7c75df6b9-7gvq5\" (UID: 
\"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.498467 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-horizon-secret-key\") pod \"horizon-7c75df6b9-7gvq5\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.520200 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4km8\" (UniqueName: \"kubernetes.io/projected/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-kube-api-access-f4km8\") pod \"horizon-7c75df6b9-7gvq5\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.566257 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.966917 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7955df7789-47mhr"] Mar 14 09:05:53 crc kubenswrapper[5129]: I0314 09:05:53.979041 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:05:54 crc kubenswrapper[5129]: I0314 09:05:54.021044 5129 generic.go:334] "Generic (PLEG): container finished" podID="ab9f7b05-3883-45e6-a278-9986e3047ccb" containerID="50a4835c6be65acd958bcc98db8f78b72b28015bc8412e95c1959cc664dea41e" exitCode=143 Mar 14 09:05:54 crc kubenswrapper[5129]: I0314 09:05:54.021140 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ab9f7b05-3883-45e6-a278-9986e3047ccb","Type":"ContainerDied","Data":"50a4835c6be65acd958bcc98db8f78b72b28015bc8412e95c1959cc664dea41e"} Mar 14 09:05:54 crc kubenswrapper[5129]: I0314 09:05:54.022709 5129 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7955df7789-47mhr" event={"ID":"d57e4649-42d3-4992-b04e-8697a98f4148","Type":"ContainerStarted","Data":"01aba64e73dd914607a7ac4eddc760caa963d17bde470a2a057d2477eec4d72d"} Mar 14 09:05:54 crc kubenswrapper[5129]: I0314 09:05:54.025199 5129 generic.go:334] "Generic (PLEG): container finished" podID="3243555c-242c-4b68-b367-4dc4e3237487" containerID="263da2fa39ed26cef4853bb7060d968dda8405bb3146ddb942c8698852b983cb" exitCode=143 Mar 14 09:05:54 crc kubenswrapper[5129]: I0314 09:05:54.025299 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3243555c-242c-4b68-b367-4dc4e3237487","Type":"ContainerDied","Data":"263da2fa39ed26cef4853bb7060d968dda8405bb3146ddb942c8698852b983cb"} Mar 14 09:05:54 crc kubenswrapper[5129]: W0314 09:05:54.082151 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33b20ab0_6f98_4a46_9abf_95c2e9755b6f.slice/crio-f20fb35e170af7e7deb2ebc56546b076591624028a374c787a8732c6f85af3e5 WatchSource:0}: Error finding container f20fb35e170af7e7deb2ebc56546b076591624028a374c787a8732c6f85af3e5: Status 404 returned error can't find the container with id f20fb35e170af7e7deb2ebc56546b076591624028a374c787a8732c6f85af3e5 Mar 14 09:05:54 crc kubenswrapper[5129]: I0314 09:05:54.096943 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c75df6b9-7gvq5"] Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.072156 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c75df6b9-7gvq5" event={"ID":"33b20ab0-6f98-4a46-9abf-95c2e9755b6f","Type":"ContainerStarted","Data":"f20fb35e170af7e7deb2ebc56546b076591624028a374c787a8732c6f85af3e5"} Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.102624 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7955df7789-47mhr"] Mar 14 09:05:55 
crc kubenswrapper[5129]: I0314 09:05:55.136830 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-665f878d64-829mz"] Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.138426 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.148336 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.154501 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-665f878d64-829mz"] Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.251920 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c75df6b9-7gvq5"] Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.257952 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c90e758-caeb-4efd-a15b-cb78f7a803dc-logs\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.258038 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zllt5\" (UniqueName: \"kubernetes.io/projected/8c90e758-caeb-4efd-a15b-cb78f7a803dc-kube-api-access-zllt5\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.258075 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c90e758-caeb-4efd-a15b-cb78f7a803dc-scripts\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 
crc kubenswrapper[5129]: I0314 09:05:55.258100 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c90e758-caeb-4efd-a15b-cb78f7a803dc-horizon-tls-certs\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.258137 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c90e758-caeb-4efd-a15b-cb78f7a803dc-combined-ca-bundle\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.258163 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8c90e758-caeb-4efd-a15b-cb78f7a803dc-horizon-secret-key\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.258200 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c90e758-caeb-4efd-a15b-cb78f7a803dc-config-data\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.300282 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-659c856df6-q8n7k"] Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.302635 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.312354 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-659c856df6-q8n7k"] Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.360867 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/444cdcb0-f68c-4943-aa88-dc3710848a7d-config-data\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.360925 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/444cdcb0-f68c-4943-aa88-dc3710848a7d-horizon-secret-key\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.360970 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zllt5\" (UniqueName: \"kubernetes.io/projected/8c90e758-caeb-4efd-a15b-cb78f7a803dc-kube-api-access-zllt5\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.360987 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/444cdcb0-f68c-4943-aa88-dc3710848a7d-combined-ca-bundle\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.361028 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/8c90e758-caeb-4efd-a15b-cb78f7a803dc-scripts\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.361059 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/444cdcb0-f68c-4943-aa88-dc3710848a7d-scripts\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.361074 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c90e758-caeb-4efd-a15b-cb78f7a803dc-horizon-tls-certs\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.361119 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c90e758-caeb-4efd-a15b-cb78f7a803dc-combined-ca-bundle\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.361148 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8c90e758-caeb-4efd-a15b-cb78f7a803dc-horizon-secret-key\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.361175 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pccq5\" (UniqueName: 
\"kubernetes.io/projected/444cdcb0-f68c-4943-aa88-dc3710848a7d-kube-api-access-pccq5\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.361210 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c90e758-caeb-4efd-a15b-cb78f7a803dc-config-data\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.361228 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/444cdcb0-f68c-4943-aa88-dc3710848a7d-horizon-tls-certs\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.361270 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/444cdcb0-f68c-4943-aa88-dc3710848a7d-logs\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.361294 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c90e758-caeb-4efd-a15b-cb78f7a803dc-logs\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.361727 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c90e758-caeb-4efd-a15b-cb78f7a803dc-logs\") pod \"horizon-665f878d64-829mz\" (UID: 
\"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.362873 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c90e758-caeb-4efd-a15b-cb78f7a803dc-scripts\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.363911 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c90e758-caeb-4efd-a15b-cb78f7a803dc-config-data\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.369365 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c90e758-caeb-4efd-a15b-cb78f7a803dc-combined-ca-bundle\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.370306 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c90e758-caeb-4efd-a15b-cb78f7a803dc-horizon-tls-certs\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.375558 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8c90e758-caeb-4efd-a15b-cb78f7a803dc-horizon-secret-key\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 
09:05:55.381286 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zllt5\" (UniqueName: \"kubernetes.io/projected/8c90e758-caeb-4efd-a15b-cb78f7a803dc-kube-api-access-zllt5\") pod \"horizon-665f878d64-829mz\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.462885 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pccq5\" (UniqueName: \"kubernetes.io/projected/444cdcb0-f68c-4943-aa88-dc3710848a7d-kube-api-access-pccq5\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.462966 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/444cdcb0-f68c-4943-aa88-dc3710848a7d-horizon-tls-certs\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.463029 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/444cdcb0-f68c-4943-aa88-dc3710848a7d-logs\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.463099 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/444cdcb0-f68c-4943-aa88-dc3710848a7d-config-data\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.463125 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/444cdcb0-f68c-4943-aa88-dc3710848a7d-horizon-secret-key\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.463174 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/444cdcb0-f68c-4943-aa88-dc3710848a7d-combined-ca-bundle\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.463240 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/444cdcb0-f68c-4943-aa88-dc3710848a7d-scripts\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.464309 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/444cdcb0-f68c-4943-aa88-dc3710848a7d-scripts\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.464693 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/444cdcb0-f68c-4943-aa88-dc3710848a7d-logs\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.464916 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/444cdcb0-f68c-4943-aa88-dc3710848a7d-config-data\") pod \"horizon-659c856df6-q8n7k\" (UID: 
\"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.468034 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/444cdcb0-f68c-4943-aa88-dc3710848a7d-horizon-secret-key\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.468102 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/444cdcb0-f68c-4943-aa88-dc3710848a7d-combined-ca-bundle\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.469320 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/444cdcb0-f68c-4943-aa88-dc3710848a7d-horizon-tls-certs\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.481520 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-665f878d64-829mz" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.481918 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pccq5\" (UniqueName: \"kubernetes.io/projected/444cdcb0-f68c-4943-aa88-dc3710848a7d-kube-api-access-pccq5\") pod \"horizon-659c856df6-q8n7k\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.630583 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:05:55 crc kubenswrapper[5129]: I0314 09:05:55.925616 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-665f878d64-829mz"] Mar 14 09:05:55 crc kubenswrapper[5129]: W0314 09:05:55.932232 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c90e758_caeb_4efd_a15b_cb78f7a803dc.slice/crio-fef262d71cae2801b3eed734064c53504316947c7dc28f1bc73fc76c16de1c40 WatchSource:0}: Error finding container fef262d71cae2801b3eed734064c53504316947c7dc28f1bc73fc76c16de1c40: Status 404 returned error can't find the container with id fef262d71cae2801b3eed734064c53504316947c7dc28f1bc73fc76c16de1c40 Mar 14 09:05:56 crc kubenswrapper[5129]: I0314 09:05:56.094480 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-665f878d64-829mz" event={"ID":"8c90e758-caeb-4efd-a15b-cb78f7a803dc","Type":"ContainerStarted","Data":"fef262d71cae2801b3eed734064c53504316947c7dc28f1bc73fc76c16de1c40"} Mar 14 09:05:56 crc kubenswrapper[5129]: I0314 09:05:56.151201 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-659c856df6-q8n7k"] Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.050328 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.059072 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.135379 5129 generic.go:334] "Generic (PLEG): container finished" podID="ab9f7b05-3883-45e6-a278-9986e3047ccb" containerID="2d5d763524ba3a9771e52ad844c84d1f978247e8f3b9ab2e88bfe3140a05d8be" exitCode=0 Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.135450 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ab9f7b05-3883-45e6-a278-9986e3047ccb","Type":"ContainerDied","Data":"2d5d763524ba3a9771e52ad844c84d1f978247e8f3b9ab2e88bfe3140a05d8be"} Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.135476 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ab9f7b05-3883-45e6-a278-9986e3047ccb","Type":"ContainerDied","Data":"8422976023f65cdaaf1a80e6fbe8ba97248d55abc8f6eed9f7e00625971e57d7"} Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.135494 5129 scope.go:117] "RemoveContainer" containerID="2d5d763524ba3a9771e52ad844c84d1f978247e8f3b9ab2e88bfe3140a05d8be" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.135634 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.142128 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-659c856df6-q8n7k" event={"ID":"444cdcb0-f68c-4943-aa88-dc3710848a7d","Type":"ContainerStarted","Data":"a933796dd05a4bfa5c1ff82d3425e434da909b082607d96906d73b42c5be8b7a"} Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.145830 5129 generic.go:334] "Generic (PLEG): container finished" podID="3243555c-242c-4b68-b367-4dc4e3237487" containerID="0ed26f7b2dfedecfdbf3b3dade751cc41f7c1b770f6c1525d44d42c663df6e46" exitCode=0 Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.145859 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3243555c-242c-4b68-b367-4dc4e3237487","Type":"ContainerDied","Data":"0ed26f7b2dfedecfdbf3b3dade751cc41f7c1b770f6c1525d44d42c663df6e46"} Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.145896 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3243555c-242c-4b68-b367-4dc4e3237487","Type":"ContainerDied","Data":"d117f87b84a5ad305a2330bcd4499646e0cc051a5d25a4a0cb211c31f3996ccc"} Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.145951 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.207337 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab9f7b05-3883-45e6-a278-9986e3047ccb-logs\") pod \"ab9f7b05-3883-45e6-a278-9986e3047ccb\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.207433 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-scripts\") pod \"3243555c-242c-4b68-b367-4dc4e3237487\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.207476 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-scripts\") pod \"ab9f7b05-3883-45e6-a278-9986e3047ccb\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.207516 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-combined-ca-bundle\") pod \"ab9f7b05-3883-45e6-a278-9986e3047ccb\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.207562 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3243555c-242c-4b68-b367-4dc4e3237487-httpd-run\") pod \"3243555c-242c-4b68-b367-4dc4e3237487\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.207583 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3243555c-242c-4b68-b367-4dc4e3237487-logs\") pod \"3243555c-242c-4b68-b367-4dc4e3237487\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.207621 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab9f7b05-3883-45e6-a278-9986e3047ccb-httpd-run\") pod \"ab9f7b05-3883-45e6-a278-9986e3047ccb\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.207668 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-public-tls-certs\") pod \"ab9f7b05-3883-45e6-a278-9986e3047ccb\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.207721 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-config-data\") pod \"ab9f7b05-3883-45e6-a278-9986e3047ccb\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.207762 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-internal-tls-certs\") pod \"3243555c-242c-4b68-b367-4dc4e3237487\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.207814 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-config-data\") pod \"3243555c-242c-4b68-b367-4dc4e3237487\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.207870 5129 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kxdn\" (UniqueName: \"kubernetes.io/projected/ab9f7b05-3883-45e6-a278-9986e3047ccb-kube-api-access-8kxdn\") pod \"ab9f7b05-3883-45e6-a278-9986e3047ccb\" (UID: \"ab9f7b05-3883-45e6-a278-9986e3047ccb\") " Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.207921 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkm6l\" (UniqueName: \"kubernetes.io/projected/3243555c-242c-4b68-b367-4dc4e3237487-kube-api-access-pkm6l\") pod \"3243555c-242c-4b68-b367-4dc4e3237487\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.207950 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-combined-ca-bundle\") pod \"3243555c-242c-4b68-b367-4dc4e3237487\" (UID: \"3243555c-242c-4b68-b367-4dc4e3237487\") " Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.209119 5129 scope.go:117] "RemoveContainer" containerID="50a4835c6be65acd958bcc98db8f78b72b28015bc8412e95c1959cc664dea41e" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.209554 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3243555c-242c-4b68-b367-4dc4e3237487-logs" (OuterVolumeSpecName: "logs") pod "3243555c-242c-4b68-b367-4dc4e3237487" (UID: "3243555c-242c-4b68-b367-4dc4e3237487"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.217078 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-scripts" (OuterVolumeSpecName: "scripts") pod "ab9f7b05-3883-45e6-a278-9986e3047ccb" (UID: "ab9f7b05-3883-45e6-a278-9986e3047ccb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.218567 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3243555c-242c-4b68-b367-4dc4e3237487-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3243555c-242c-4b68-b367-4dc4e3237487" (UID: "3243555c-242c-4b68-b367-4dc4e3237487"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.218862 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab9f7b05-3883-45e6-a278-9986e3047ccb-logs" (OuterVolumeSpecName: "logs") pod "ab9f7b05-3883-45e6-a278-9986e3047ccb" (UID: "ab9f7b05-3883-45e6-a278-9986e3047ccb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.222631 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3243555c-242c-4b68-b367-4dc4e3237487-kube-api-access-pkm6l" (OuterVolumeSpecName: "kube-api-access-pkm6l") pod "3243555c-242c-4b68-b367-4dc4e3237487" (UID: "3243555c-242c-4b68-b367-4dc4e3237487"). InnerVolumeSpecName "kube-api-access-pkm6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.224646 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab9f7b05-3883-45e6-a278-9986e3047ccb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ab9f7b05-3883-45e6-a278-9986e3047ccb" (UID: "ab9f7b05-3883-45e6-a278-9986e3047ccb"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.234078 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-scripts" (OuterVolumeSpecName: "scripts") pod "3243555c-242c-4b68-b367-4dc4e3237487" (UID: "3243555c-242c-4b68-b367-4dc4e3237487"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.234153 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab9f7b05-3883-45e6-a278-9986e3047ccb-kube-api-access-8kxdn" (OuterVolumeSpecName: "kube-api-access-8kxdn") pod "ab9f7b05-3883-45e6-a278-9986e3047ccb" (UID: "ab9f7b05-3883-45e6-a278-9986e3047ccb"). InnerVolumeSpecName "kube-api-access-8kxdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.271371 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab9f7b05-3883-45e6-a278-9986e3047ccb" (UID: "ab9f7b05-3883-45e6-a278-9986e3047ccb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.284691 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3243555c-242c-4b68-b367-4dc4e3237487" (UID: "3243555c-242c-4b68-b367-4dc4e3237487"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.294689 5129 scope.go:117] "RemoveContainer" containerID="2d5d763524ba3a9771e52ad844c84d1f978247e8f3b9ab2e88bfe3140a05d8be" Mar 14 09:05:57 crc kubenswrapper[5129]: E0314 09:05:57.295124 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d5d763524ba3a9771e52ad844c84d1f978247e8f3b9ab2e88bfe3140a05d8be\": container with ID starting with 2d5d763524ba3a9771e52ad844c84d1f978247e8f3b9ab2e88bfe3140a05d8be not found: ID does not exist" containerID="2d5d763524ba3a9771e52ad844c84d1f978247e8f3b9ab2e88bfe3140a05d8be" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.295191 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5d763524ba3a9771e52ad844c84d1f978247e8f3b9ab2e88bfe3140a05d8be"} err="failed to get container status \"2d5d763524ba3a9771e52ad844c84d1f978247e8f3b9ab2e88bfe3140a05d8be\": rpc error: code = NotFound desc = could not find container \"2d5d763524ba3a9771e52ad844c84d1f978247e8f3b9ab2e88bfe3140a05d8be\": container with ID starting with 2d5d763524ba3a9771e52ad844c84d1f978247e8f3b9ab2e88bfe3140a05d8be not found: ID does not exist" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.295237 5129 scope.go:117] "RemoveContainer" containerID="50a4835c6be65acd958bcc98db8f78b72b28015bc8412e95c1959cc664dea41e" Mar 14 09:05:57 crc kubenswrapper[5129]: E0314 09:05:57.298276 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50a4835c6be65acd958bcc98db8f78b72b28015bc8412e95c1959cc664dea41e\": container with ID starting with 50a4835c6be65acd958bcc98db8f78b72b28015bc8412e95c1959cc664dea41e not found: ID does not exist" containerID="50a4835c6be65acd958bcc98db8f78b72b28015bc8412e95c1959cc664dea41e" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.298315 
5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50a4835c6be65acd958bcc98db8f78b72b28015bc8412e95c1959cc664dea41e"} err="failed to get container status \"50a4835c6be65acd958bcc98db8f78b72b28015bc8412e95c1959cc664dea41e\": rpc error: code = NotFound desc = could not find container \"50a4835c6be65acd958bcc98db8f78b72b28015bc8412e95c1959cc664dea41e\": container with ID starting with 50a4835c6be65acd958bcc98db8f78b72b28015bc8412e95c1959cc664dea41e not found: ID does not exist" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.298334 5129 scope.go:117] "RemoveContainer" containerID="0ed26f7b2dfedecfdbf3b3dade751cc41f7c1b770f6c1525d44d42c663df6e46" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.298501 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-config-data" (OuterVolumeSpecName: "config-data") pod "ab9f7b05-3883-45e6-a278-9986e3047ccb" (UID: "ab9f7b05-3883-45e6-a278-9986e3047ccb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.303107 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3243555c-242c-4b68-b367-4dc4e3237487" (UID: "3243555c-242c-4b68-b367-4dc4e3237487"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.310518 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.310558 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.310570 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.310584 5129 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3243555c-242c-4b68-b367-4dc4e3237487-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.310596 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3243555c-242c-4b68-b367-4dc4e3237487-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.310682 5129 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab9f7b05-3883-45e6-a278-9986e3047ccb-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.310693 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.310703 5129 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.310715 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kxdn\" (UniqueName: \"kubernetes.io/projected/ab9f7b05-3883-45e6-a278-9986e3047ccb-kube-api-access-8kxdn\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.310727 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkm6l\" (UniqueName: \"kubernetes.io/projected/3243555c-242c-4b68-b367-4dc4e3237487-kube-api-access-pkm6l\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.310737 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.310750 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab9f7b05-3883-45e6-a278-9986e3047ccb-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.336981 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ab9f7b05-3883-45e6-a278-9986e3047ccb" (UID: "ab9f7b05-3883-45e6-a278-9986e3047ccb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.337298 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-config-data" (OuterVolumeSpecName: "config-data") pod "3243555c-242c-4b68-b367-4dc4e3237487" (UID: "3243555c-242c-4b68-b367-4dc4e3237487"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.350368 5129 scope.go:117] "RemoveContainer" containerID="263da2fa39ed26cef4853bb7060d968dda8405bb3146ddb942c8698852b983cb" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.375370 5129 scope.go:117] "RemoveContainer" containerID="0ed26f7b2dfedecfdbf3b3dade751cc41f7c1b770f6c1525d44d42c663df6e46" Mar 14 09:05:57 crc kubenswrapper[5129]: E0314 09:05:57.375963 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed26f7b2dfedecfdbf3b3dade751cc41f7c1b770f6c1525d44d42c663df6e46\": container with ID starting with 0ed26f7b2dfedecfdbf3b3dade751cc41f7c1b770f6c1525d44d42c663df6e46 not found: ID does not exist" containerID="0ed26f7b2dfedecfdbf3b3dade751cc41f7c1b770f6c1525d44d42c663df6e46" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.375997 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ed26f7b2dfedecfdbf3b3dade751cc41f7c1b770f6c1525d44d42c663df6e46"} err="failed to get container status \"0ed26f7b2dfedecfdbf3b3dade751cc41f7c1b770f6c1525d44d42c663df6e46\": rpc error: code = NotFound desc = could not find container \"0ed26f7b2dfedecfdbf3b3dade751cc41f7c1b770f6c1525d44d42c663df6e46\": container with ID starting with 0ed26f7b2dfedecfdbf3b3dade751cc41f7c1b770f6c1525d44d42c663df6e46 not found: ID does not exist" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.376016 5129 scope.go:117] "RemoveContainer" containerID="263da2fa39ed26cef4853bb7060d968dda8405bb3146ddb942c8698852b983cb" Mar 14 09:05:57 crc kubenswrapper[5129]: E0314 09:05:57.376431 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"263da2fa39ed26cef4853bb7060d968dda8405bb3146ddb942c8698852b983cb\": container with ID starting with 
263da2fa39ed26cef4853bb7060d968dda8405bb3146ddb942c8698852b983cb not found: ID does not exist" containerID="263da2fa39ed26cef4853bb7060d968dda8405bb3146ddb942c8698852b983cb" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.376455 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"263da2fa39ed26cef4853bb7060d968dda8405bb3146ddb942c8698852b983cb"} err="failed to get container status \"263da2fa39ed26cef4853bb7060d968dda8405bb3146ddb942c8698852b983cb\": rpc error: code = NotFound desc = could not find container \"263da2fa39ed26cef4853bb7060d968dda8405bb3146ddb942c8698852b983cb\": container with ID starting with 263da2fa39ed26cef4853bb7060d968dda8405bb3146ddb942c8698852b983cb not found: ID does not exist" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.425089 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3243555c-242c-4b68-b367-4dc4e3237487-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.425118 5129 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9f7b05-3883-45e6-a278-9986e3047ccb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.509793 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.545547 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.567827 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 09:05:57 crc kubenswrapper[5129]: E0314 09:05:57.568362 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3243555c-242c-4b68-b367-4dc4e3237487" containerName="glance-log" Mar 14 
09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.568384 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3243555c-242c-4b68-b367-4dc4e3237487" containerName="glance-log" Mar 14 09:05:57 crc kubenswrapper[5129]: E0314 09:05:57.568397 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9f7b05-3883-45e6-a278-9986e3047ccb" containerName="glance-httpd" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.568404 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9f7b05-3883-45e6-a278-9986e3047ccb" containerName="glance-httpd" Mar 14 09:05:57 crc kubenswrapper[5129]: E0314 09:05:57.568431 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3243555c-242c-4b68-b367-4dc4e3237487" containerName="glance-httpd" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.568439 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3243555c-242c-4b68-b367-4dc4e3237487" containerName="glance-httpd" Mar 14 09:05:57 crc kubenswrapper[5129]: E0314 09:05:57.568464 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9f7b05-3883-45e6-a278-9986e3047ccb" containerName="glance-log" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.568472 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9f7b05-3883-45e6-a278-9986e3047ccb" containerName="glance-log" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.568711 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3243555c-242c-4b68-b367-4dc4e3237487" containerName="glance-log" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.568729 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab9f7b05-3883-45e6-a278-9986e3047ccb" containerName="glance-httpd" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.568745 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3243555c-242c-4b68-b367-4dc4e3237487" containerName="glance-httpd" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 
09:05:57.568758 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab9f7b05-3883-45e6-a278-9986e3047ccb" containerName="glance-log" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.570023 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.575109 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.575210 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-p4sfk" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.575644 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.575961 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.581998 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.589159 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.600860 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.609684 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.611802 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.616633 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.616898 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.617914 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.733697 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12904d5e-9819-4d47-b8bd-94b42f8b12d2-config-data\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.733927 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkvhp\" (UniqueName: \"kubernetes.io/projected/9958e447-a9f1-4513-bd78-f13804f89650-kube-api-access-jkvhp\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.733975 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12904d5e-9819-4d47-b8bd-94b42f8b12d2-logs\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.734036 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/12904d5e-9819-4d47-b8bd-94b42f8b12d2-scripts\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.734093 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12904d5e-9819-4d47-b8bd-94b42f8b12d2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.734133 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9958e447-a9f1-4513-bd78-f13804f89650-logs\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.734166 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9958e447-a9f1-4513-bd78-f13804f89650-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.734195 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pljsb\" (UniqueName: \"kubernetes.io/projected/12904d5e-9819-4d47-b8bd-94b42f8b12d2-kube-api-access-pljsb\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.734216 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9958e447-a9f1-4513-bd78-f13804f89650-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.734236 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9958e447-a9f1-4513-bd78-f13804f89650-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.734278 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12904d5e-9819-4d47-b8bd-94b42f8b12d2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.734304 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9958e447-a9f1-4513-bd78-f13804f89650-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.734356 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9958e447-a9f1-4513-bd78-f13804f89650-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.734376 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12904d5e-9819-4d47-b8bd-94b42f8b12d2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.836046 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12904d5e-9819-4d47-b8bd-94b42f8b12d2-scripts\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.836145 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12904d5e-9819-4d47-b8bd-94b42f8b12d2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.836195 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9958e447-a9f1-4513-bd78-f13804f89650-logs\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.836230 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9958e447-a9f1-4513-bd78-f13804f89650-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.836261 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pljsb\" 
(UniqueName: \"kubernetes.io/projected/12904d5e-9819-4d47-b8bd-94b42f8b12d2-kube-api-access-pljsb\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.836277 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9958e447-a9f1-4513-bd78-f13804f89650-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.836299 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9958e447-a9f1-4513-bd78-f13804f89650-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.836346 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12904d5e-9819-4d47-b8bd-94b42f8b12d2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.836379 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9958e447-a9f1-4513-bd78-f13804f89650-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.836451 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9958e447-a9f1-4513-bd78-f13804f89650-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.836471 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12904d5e-9819-4d47-b8bd-94b42f8b12d2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.836509 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12904d5e-9819-4d47-b8bd-94b42f8b12d2-config-data\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.836527 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkvhp\" (UniqueName: \"kubernetes.io/projected/9958e447-a9f1-4513-bd78-f13804f89650-kube-api-access-jkvhp\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.836568 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12904d5e-9819-4d47-b8bd-94b42f8b12d2-logs\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.836866 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9958e447-a9f1-4513-bd78-f13804f89650-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.837002 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12904d5e-9819-4d47-b8bd-94b42f8b12d2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.837024 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12904d5e-9819-4d47-b8bd-94b42f8b12d2-logs\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.837634 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9958e447-a9f1-4513-bd78-f13804f89650-logs\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.840858 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9958e447-a9f1-4513-bd78-f13804f89650-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.842963 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9958e447-a9f1-4513-bd78-f13804f89650-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 
09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.843359 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9958e447-a9f1-4513-bd78-f13804f89650-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.843863 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12904d5e-9819-4d47-b8bd-94b42f8b12d2-config-data\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.846634 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12904d5e-9819-4d47-b8bd-94b42f8b12d2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.850382 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12904d5e-9819-4d47-b8bd-94b42f8b12d2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.856222 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12904d5e-9819-4d47-b8bd-94b42f8b12d2-scripts\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.860008 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jkvhp\" (UniqueName: \"kubernetes.io/projected/9958e447-a9f1-4513-bd78-f13804f89650-kube-api-access-jkvhp\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.860058 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9958e447-a9f1-4513-bd78-f13804f89650-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9958e447-a9f1-4513-bd78-f13804f89650\") " pod="openstack/glance-default-internal-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.862237 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pljsb\" (UniqueName: \"kubernetes.io/projected/12904d5e-9819-4d47-b8bd-94b42f8b12d2-kube-api-access-pljsb\") pod \"glance-default-external-api-0\" (UID: \"12904d5e-9819-4d47-b8bd-94b42f8b12d2\") " pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.897544 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 09:05:57 crc kubenswrapper[5129]: I0314 09:05:57.976591 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 09:05:58 crc kubenswrapper[5129]: I0314 09:05:58.062369 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3243555c-242c-4b68-b367-4dc4e3237487" path="/var/lib/kubelet/pods/3243555c-242c-4b68-b367-4dc4e3237487/volumes" Mar 14 09:05:58 crc kubenswrapper[5129]: I0314 09:05:58.063074 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab9f7b05-3883-45e6-a278-9986e3047ccb" path="/var/lib/kubelet/pods/ab9f7b05-3883-45e6-a278-9986e3047ccb/volumes" Mar 14 09:05:58 crc kubenswrapper[5129]: I0314 09:05:58.683947 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 09:05:58 crc kubenswrapper[5129]: I0314 09:05:58.807510 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 09:06:00 crc kubenswrapper[5129]: I0314 09:06:00.139381 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557986-cmf8d"] Mar 14 09:06:00 crc kubenswrapper[5129]: I0314 09:06:00.140972 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557986-cmf8d" Mar 14 09:06:00 crc kubenswrapper[5129]: I0314 09:06:00.144629 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:06:00 crc kubenswrapper[5129]: I0314 09:06:00.144850 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:06:00 crc kubenswrapper[5129]: I0314 09:06:00.147469 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:06:00 crc kubenswrapper[5129]: I0314 09:06:00.153850 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557986-cmf8d"] Mar 14 09:06:00 crc kubenswrapper[5129]: I0314 09:06:00.196252 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snf7b\" (UniqueName: \"kubernetes.io/projected/8b8dcd92-d201-4e88-8c81-8b3083bc1a76-kube-api-access-snf7b\") pod \"auto-csr-approver-29557986-cmf8d\" (UID: \"8b8dcd92-d201-4e88-8c81-8b3083bc1a76\") " pod="openshift-infra/auto-csr-approver-29557986-cmf8d" Mar 14 09:06:00 crc kubenswrapper[5129]: I0314 09:06:00.298481 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snf7b\" (UniqueName: \"kubernetes.io/projected/8b8dcd92-d201-4e88-8c81-8b3083bc1a76-kube-api-access-snf7b\") pod \"auto-csr-approver-29557986-cmf8d\" (UID: \"8b8dcd92-d201-4e88-8c81-8b3083bc1a76\") " pod="openshift-infra/auto-csr-approver-29557986-cmf8d" Mar 14 09:06:00 crc kubenswrapper[5129]: I0314 09:06:00.323025 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snf7b\" (UniqueName: \"kubernetes.io/projected/8b8dcd92-d201-4e88-8c81-8b3083bc1a76-kube-api-access-snf7b\") pod \"auto-csr-approver-29557986-cmf8d\" (UID: \"8b8dcd92-d201-4e88-8c81-8b3083bc1a76\") " 
pod="openshift-infra/auto-csr-approver-29557986-cmf8d" Mar 14 09:06:00 crc kubenswrapper[5129]: I0314 09:06:00.482093 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557986-cmf8d" Mar 14 09:06:05 crc kubenswrapper[5129]: W0314 09:06:05.010457 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9958e447_a9f1_4513_bd78_f13804f89650.slice/crio-8e2a39d9defe696219e81cd1b0b2405d8f15f64cf8ee8e6bbd8b2d9a4f834b27 WatchSource:0}: Error finding container 8e2a39d9defe696219e81cd1b0b2405d8f15f64cf8ee8e6bbd8b2d9a4f834b27: Status 404 returned error can't find the container with id 8e2a39d9defe696219e81cd1b0b2405d8f15f64cf8ee8e6bbd8b2d9a4f834b27 Mar 14 09:06:05 crc kubenswrapper[5129]: I0314 09:06:05.321812 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12904d5e-9819-4d47-b8bd-94b42f8b12d2","Type":"ContainerStarted","Data":"fc145ef29a7c32f96eeab1d6d78a848771fb2c2acd9cc417d0115c45b26111f5"} Mar 14 09:06:05 crc kubenswrapper[5129]: I0314 09:06:05.333825 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9958e447-a9f1-4513-bd78-f13804f89650","Type":"ContainerStarted","Data":"8e2a39d9defe696219e81cd1b0b2405d8f15f64cf8ee8e6bbd8b2d9a4f834b27"} Mar 14 09:06:05 crc kubenswrapper[5129]: I0314 09:06:05.563772 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557986-cmf8d"] Mar 14 09:06:05 crc kubenswrapper[5129]: W0314 09:06:05.582812 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b8dcd92_d201_4e88_8c81_8b3083bc1a76.slice/crio-286f4900ccb01e12f5b75423e1d6bcc68c758a3dfc08c140d37758d489ef9eb3 WatchSource:0}: Error finding container 
286f4900ccb01e12f5b75423e1d6bcc68c758a3dfc08c140d37758d489ef9eb3: Status 404 returned error can't find the container with id 286f4900ccb01e12f5b75423e1d6bcc68c758a3dfc08c140d37758d489ef9eb3 Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.360173 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c75df6b9-7gvq5" event={"ID":"33b20ab0-6f98-4a46-9abf-95c2e9755b6f","Type":"ContainerStarted","Data":"73c2e0ff3e1668f4316dda4b93f6b450f0f727db76e739665424d348d6cd57d2"} Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.360678 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c75df6b9-7gvq5" event={"ID":"33b20ab0-6f98-4a46-9abf-95c2e9755b6f","Type":"ContainerStarted","Data":"bc577ac3c48186e140ab6617baedda3a4a9b42e4a655f3600593b06abccae776"} Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.360315 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c75df6b9-7gvq5" podUID="33b20ab0-6f98-4a46-9abf-95c2e9755b6f" containerName="horizon-log" containerID="cri-o://bc577ac3c48186e140ab6617baedda3a4a9b42e4a655f3600593b06abccae776" gracePeriod=30 Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.360873 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c75df6b9-7gvq5" podUID="33b20ab0-6f98-4a46-9abf-95c2e9755b6f" containerName="horizon" containerID="cri-o://73c2e0ff3e1668f4316dda4b93f6b450f0f727db76e739665424d348d6cd57d2" gracePeriod=30 Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.373280 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-665f878d64-829mz" event={"ID":"8c90e758-caeb-4efd-a15b-cb78f7a803dc","Type":"ContainerStarted","Data":"6818e97f51f258a4900f86e9a16b0ef5f45e525e67a9858123076236d0554eb7"} Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.373345 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-665f878d64-829mz" 
event={"ID":"8c90e758-caeb-4efd-a15b-cb78f7a803dc","Type":"ContainerStarted","Data":"ff772be727c9128267a90a5df562a9968b97b9754798abcabf34ba72866700d4"} Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.376549 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12904d5e-9819-4d47-b8bd-94b42f8b12d2","Type":"ContainerStarted","Data":"4bba38e06e8168d023d985e9e66dd211925f38105d38c870d7edf3219eaa2fac"} Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.379821 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-659c856df6-q8n7k" event={"ID":"444cdcb0-f68c-4943-aa88-dc3710848a7d","Type":"ContainerStarted","Data":"891fc688af2ec8e829eea7544e8bc2e52924fd1da1d453a8aba3f1e4a3b9b2c6"} Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.379860 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-659c856df6-q8n7k" event={"ID":"444cdcb0-f68c-4943-aa88-dc3710848a7d","Type":"ContainerStarted","Data":"c20e6877f1ebf04753afceed6552cbfb2b38c48881a1d842ce09335d1fd9d682"} Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.392644 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557986-cmf8d" event={"ID":"8b8dcd92-d201-4e88-8c81-8b3083bc1a76","Type":"ContainerStarted","Data":"286f4900ccb01e12f5b75423e1d6bcc68c758a3dfc08c140d37758d489ef9eb3"} Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.399188 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9958e447-a9f1-4513-bd78-f13804f89650","Type":"ContainerStarted","Data":"0f17c56382d01f4e63cc9999c7f94dd72c9f96615453a54d8104616894e50d3e"} Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.407757 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c75df6b9-7gvq5" podStartSLOduration=2.180543608 podStartE2EDuration="13.407718708s" 
podCreationTimestamp="2026-03-14 09:05:53 +0000 UTC" firstStartedPulling="2026-03-14 09:05:54.084890285 +0000 UTC m=+7616.836805489" lastFinishedPulling="2026-03-14 09:06:05.312065405 +0000 UTC m=+7628.063980589" observedRunningTime="2026-03-14 09:06:06.393538243 +0000 UTC m=+7629.145453437" watchObservedRunningTime="2026-03-14 09:06:06.407718708 +0000 UTC m=+7629.159633892" Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.412750 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7955df7789-47mhr" event={"ID":"d57e4649-42d3-4992-b04e-8697a98f4148","Type":"ContainerStarted","Data":"cf2353b90ab105449c577439f3ea834d5569a84a9ae2186b5e8961deb848dd19"} Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.412836 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7955df7789-47mhr" event={"ID":"d57e4649-42d3-4992-b04e-8697a98f4148","Type":"ContainerStarted","Data":"6d9b716da5177606e161bf97663b84849174c1afdc1d0277494fdc1482d1a070"} Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.413013 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7955df7789-47mhr" podUID="d57e4649-42d3-4992-b04e-8697a98f4148" containerName="horizon-log" containerID="cri-o://6d9b716da5177606e161bf97663b84849174c1afdc1d0277494fdc1482d1a070" gracePeriod=30 Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.413192 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7955df7789-47mhr" podUID="d57e4649-42d3-4992-b04e-8697a98f4148" containerName="horizon" containerID="cri-o://cf2353b90ab105449c577439f3ea834d5569a84a9ae2186b5e8961deb848dd19" gracePeriod=30 Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.445481 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-665f878d64-829mz" podStartSLOduration=2.173998114 podStartE2EDuration="11.445457372s" podCreationTimestamp="2026-03-14 09:05:55 +0000 
UTC" firstStartedPulling="2026-03-14 09:05:55.934284822 +0000 UTC m=+7618.686200006" lastFinishedPulling="2026-03-14 09:06:05.20574408 +0000 UTC m=+7627.957659264" observedRunningTime="2026-03-14 09:06:06.423019663 +0000 UTC m=+7629.174934867" watchObservedRunningTime="2026-03-14 09:06:06.445457372 +0000 UTC m=+7629.197372556" Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.468457 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-659c856df6-q8n7k" podStartSLOduration=2.444699169 podStartE2EDuration="11.468417695s" podCreationTimestamp="2026-03-14 09:05:55 +0000 UTC" firstStartedPulling="2026-03-14 09:05:56.169953817 +0000 UTC m=+7618.921869001" lastFinishedPulling="2026-03-14 09:06:05.193672343 +0000 UTC m=+7627.945587527" observedRunningTime="2026-03-14 09:06:06.45717067 +0000 UTC m=+7629.209085864" watchObservedRunningTime="2026-03-14 09:06:06.468417695 +0000 UTC m=+7629.220332879" Mar 14 09:06:06 crc kubenswrapper[5129]: I0314 09:06:06.495878 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7955df7789-47mhr" podStartSLOduration=2.269303406 podStartE2EDuration="13.495860519s" podCreationTimestamp="2026-03-14 09:05:53 +0000 UTC" firstStartedPulling="2026-03-14 09:05:53.978669403 +0000 UTC m=+7616.730584597" lastFinishedPulling="2026-03-14 09:06:05.205226526 +0000 UTC m=+7627.957141710" observedRunningTime="2026-03-14 09:06:06.492147739 +0000 UTC m=+7629.244062923" watchObservedRunningTime="2026-03-14 09:06:06.495860519 +0000 UTC m=+7629.247775703" Mar 14 09:06:07 crc kubenswrapper[5129]: I0314 09:06:07.427944 5129 generic.go:334] "Generic (PLEG): container finished" podID="8b8dcd92-d201-4e88-8c81-8b3083bc1a76" containerID="e11e6a139383e68a6e3b326589457eb23662847585bee3324637679fe054d7e5" exitCode=0 Mar 14 09:06:07 crc kubenswrapper[5129]: I0314 09:06:07.428089 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29557986-cmf8d" event={"ID":"8b8dcd92-d201-4e88-8c81-8b3083bc1a76","Type":"ContainerDied","Data":"e11e6a139383e68a6e3b326589457eb23662847585bee3324637679fe054d7e5"} Mar 14 09:06:07 crc kubenswrapper[5129]: I0314 09:06:07.431252 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9958e447-a9f1-4513-bd78-f13804f89650","Type":"ContainerStarted","Data":"e2e6427cfacb335bb488ad7eb27ed16828661393e7464e9f2bb7c6d4671bed0d"} Mar 14 09:06:07 crc kubenswrapper[5129]: I0314 09:06:07.435282 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12904d5e-9819-4d47-b8bd-94b42f8b12d2","Type":"ContainerStarted","Data":"ce6263d394e3aadc70814c5399aec054e8e96a9091a873c42b01f34c5043a027"} Mar 14 09:06:07 crc kubenswrapper[5129]: I0314 09:06:07.489373 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=10.48934205 podStartE2EDuration="10.48934205s" podCreationTimestamp="2026-03-14 09:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:06:07.477454257 +0000 UTC m=+7630.229369451" watchObservedRunningTime="2026-03-14 09:06:07.48934205 +0000 UTC m=+7630.241257234" Mar 14 09:06:07 crc kubenswrapper[5129]: I0314 09:06:07.534960 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.534902116 podStartE2EDuration="10.534902116s" podCreationTimestamp="2026-03-14 09:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:06:07.507578105 +0000 UTC m=+7630.259493299" watchObservedRunningTime="2026-03-14 09:06:07.534902116 +0000 UTC m=+7630.286817300" Mar 14 09:06:07 crc 
kubenswrapper[5129]: I0314 09:06:07.898707 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 14 09:06:07 crc kubenswrapper[5129]: I0314 09:06:07.898816 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 14 09:06:07 crc kubenswrapper[5129]: I0314 09:06:07.941637 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 14 09:06:07 crc kubenswrapper[5129]: I0314 09:06:07.942475 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 14 09:06:07 crc kubenswrapper[5129]: I0314 09:06:07.977308 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 14 09:06:07 crc kubenswrapper[5129]: I0314 09:06:07.977372 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 14 09:06:08 crc kubenswrapper[5129]: I0314 09:06:08.035257 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 14 09:06:08 crc kubenswrapper[5129]: I0314 09:06:08.054990 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 14 09:06:08 crc kubenswrapper[5129]: I0314 09:06:08.459501 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 14 09:06:08 crc kubenswrapper[5129]: I0314 09:06:08.460164 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 14 09:06:08 crc kubenswrapper[5129]: I0314 09:06:08.460198 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 14 
09:06:08 crc kubenswrapper[5129]: I0314 09:06:08.460216 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 14 09:06:08 crc kubenswrapper[5129]: I0314 09:06:08.930572 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557986-cmf8d" Mar 14 09:06:09 crc kubenswrapper[5129]: I0314 09:06:09.130245 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snf7b\" (UniqueName: \"kubernetes.io/projected/8b8dcd92-d201-4e88-8c81-8b3083bc1a76-kube-api-access-snf7b\") pod \"8b8dcd92-d201-4e88-8c81-8b3083bc1a76\" (UID: \"8b8dcd92-d201-4e88-8c81-8b3083bc1a76\") " Mar 14 09:06:09 crc kubenswrapper[5129]: I0314 09:06:09.138754 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b8dcd92-d201-4e88-8c81-8b3083bc1a76-kube-api-access-snf7b" (OuterVolumeSpecName: "kube-api-access-snf7b") pod "8b8dcd92-d201-4e88-8c81-8b3083bc1a76" (UID: "8b8dcd92-d201-4e88-8c81-8b3083bc1a76"). InnerVolumeSpecName "kube-api-access-snf7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:06:09 crc kubenswrapper[5129]: I0314 09:06:09.236250 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snf7b\" (UniqueName: \"kubernetes.io/projected/8b8dcd92-d201-4e88-8c81-8b3083bc1a76-kube-api-access-snf7b\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:09 crc kubenswrapper[5129]: I0314 09:06:09.471374 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557986-cmf8d" event={"ID":"8b8dcd92-d201-4e88-8c81-8b3083bc1a76","Type":"ContainerDied","Data":"286f4900ccb01e12f5b75423e1d6bcc68c758a3dfc08c140d37758d489ef9eb3"} Mar 14 09:06:09 crc kubenswrapper[5129]: I0314 09:06:09.471420 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557986-cmf8d" Mar 14 09:06:09 crc kubenswrapper[5129]: I0314 09:06:09.471441 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="286f4900ccb01e12f5b75423e1d6bcc68c758a3dfc08c140d37758d489ef9eb3" Mar 14 09:06:10 crc kubenswrapper[5129]: I0314 09:06:10.028066 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557980-hc7f4"] Mar 14 09:06:10 crc kubenswrapper[5129]: I0314 09:06:10.051036 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557980-hc7f4"] Mar 14 09:06:12 crc kubenswrapper[5129]: I0314 09:06:12.053917 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8fc6210-3f18-4ea6-9c7e-040906265e9b" path="/var/lib/kubelet/pods/b8fc6210-3f18-4ea6-9c7e-040906265e9b/volumes" Mar 14 09:06:13 crc kubenswrapper[5129]: I0314 09:06:13.460325 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:06:13 crc kubenswrapper[5129]: I0314 09:06:13.567800 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:06:15 crc kubenswrapper[5129]: I0314 09:06:15.481795 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-665f878d64-829mz" Mar 14 09:06:15 crc kubenswrapper[5129]: I0314 09:06:15.481875 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-665f878d64-829mz" Mar 14 09:06:15 crc kubenswrapper[5129]: I0314 09:06:15.486468 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-665f878d64-829mz" podUID="8c90e758-caeb-4efd-a15b-cb78f7a803dc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.158:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.158:8443: connect: connection refused" Mar 14 09:06:15 
crc kubenswrapper[5129]: I0314 09:06:15.631577 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:06:15 crc kubenswrapper[5129]: I0314 09:06:15.631798 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:06:15 crc kubenswrapper[5129]: I0314 09:06:15.633974 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-659c856df6-q8n7k" podUID="444cdcb0-f68c-4943-aa88-dc3710848a7d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.159:8443: connect: connection refused" Mar 14 09:06:19 crc kubenswrapper[5129]: I0314 09:06:19.574106 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:06:19 crc kubenswrapper[5129]: I0314 09:06:19.574672 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:06:19 crc kubenswrapper[5129]: I0314 09:06:19.574730 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 09:06:19 crc kubenswrapper[5129]: I0314 09:06:19.576882 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3d4cd803a025626b83183f6bb1c4666176a5fbed36f07324bbfa359be7b98a8d"} 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:06:19 crc kubenswrapper[5129]: I0314 09:06:19.577009 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://3d4cd803a025626b83183f6bb1c4666176a5fbed36f07324bbfa359be7b98a8d" gracePeriod=600 Mar 14 09:06:20 crc kubenswrapper[5129]: I0314 09:06:20.591562 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="3d4cd803a025626b83183f6bb1c4666176a5fbed36f07324bbfa359be7b98a8d" exitCode=0 Mar 14 09:06:20 crc kubenswrapper[5129]: I0314 09:06:20.592494 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"3d4cd803a025626b83183f6bb1c4666176a5fbed36f07324bbfa359be7b98a8d"} Mar 14 09:06:20 crc kubenswrapper[5129]: I0314 09:06:20.592538 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e"} Mar 14 09:06:20 crc kubenswrapper[5129]: I0314 09:06:20.592561 5129 scope.go:117] "RemoveContainer" containerID="17a2f8b54ae6c3f5541f129b9cb7a81adeb1787b8be40bad530d979d1ac5389d" Mar 14 09:06:25 crc kubenswrapper[5129]: I0314 09:06:25.482989 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-665f878d64-829mz" podUID="8c90e758-caeb-4efd-a15b-cb78f7a803dc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.158:8443/dashboard/auth/login/?next=/dashboard/\": dial 
tcp 10.217.1.158:8443: connect: connection refused" Mar 14 09:06:25 crc kubenswrapper[5129]: I0314 09:06:25.632539 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-659c856df6-q8n7k" podUID="444cdcb0-f68c-4943-aa88-dc3710848a7d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.159:8443: connect: connection refused" Mar 14 09:06:29 crc kubenswrapper[5129]: I0314 09:06:29.335976 5129 scope.go:117] "RemoveContainer" containerID="3b1ac07c3a71fa0fae21fa07a816f3032d78e1a6056566d8b59b9a7ababb3b2f" Mar 14 09:06:29 crc kubenswrapper[5129]: I0314 09:06:29.977032 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 14 09:06:29 crc kubenswrapper[5129]: I0314 09:06:29.983024 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 14 09:06:30 crc kubenswrapper[5129]: I0314 09:06:30.061795 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 14 09:06:30 crc kubenswrapper[5129]: I0314 09:06:30.472786 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 14 09:06:36 crc kubenswrapper[5129]: I0314 09:06:36.823967 5129 generic.go:334] "Generic (PLEG): container finished" podID="d57e4649-42d3-4992-b04e-8697a98f4148" containerID="cf2353b90ab105449c577439f3ea834d5569a84a9ae2186b5e8961deb848dd19" exitCode=137 Mar 14 09:06:36 crc kubenswrapper[5129]: I0314 09:06:36.825144 5129 generic.go:334] "Generic (PLEG): container finished" podID="d57e4649-42d3-4992-b04e-8697a98f4148" containerID="6d9b716da5177606e161bf97663b84849174c1afdc1d0277494fdc1482d1a070" exitCode=137 Mar 14 09:06:36 crc kubenswrapper[5129]: I0314 09:06:36.824154 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-7955df7789-47mhr" event={"ID":"d57e4649-42d3-4992-b04e-8697a98f4148","Type":"ContainerDied","Data":"cf2353b90ab105449c577439f3ea834d5569a84a9ae2186b5e8961deb848dd19"} Mar 14 09:06:36 crc kubenswrapper[5129]: I0314 09:06:36.825229 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7955df7789-47mhr" event={"ID":"d57e4649-42d3-4992-b04e-8697a98f4148","Type":"ContainerDied","Data":"6d9b716da5177606e161bf97663b84849174c1afdc1d0277494fdc1482d1a070"} Mar 14 09:06:36 crc kubenswrapper[5129]: I0314 09:06:36.828032 5129 generic.go:334] "Generic (PLEG): container finished" podID="33b20ab0-6f98-4a46-9abf-95c2e9755b6f" containerID="73c2e0ff3e1668f4316dda4b93f6b450f0f727db76e739665424d348d6cd57d2" exitCode=137 Mar 14 09:06:36 crc kubenswrapper[5129]: I0314 09:06:36.828061 5129 generic.go:334] "Generic (PLEG): container finished" podID="33b20ab0-6f98-4a46-9abf-95c2e9755b6f" containerID="bc577ac3c48186e140ab6617baedda3a4a9b42e4a655f3600593b06abccae776" exitCode=137 Mar 14 09:06:36 crc kubenswrapper[5129]: I0314 09:06:36.828086 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c75df6b9-7gvq5" event={"ID":"33b20ab0-6f98-4a46-9abf-95c2e9755b6f","Type":"ContainerDied","Data":"73c2e0ff3e1668f4316dda4b93f6b450f0f727db76e739665424d348d6cd57d2"} Mar 14 09:06:36 crc kubenswrapper[5129]: I0314 09:06:36.828115 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c75df6b9-7gvq5" event={"ID":"33b20ab0-6f98-4a46-9abf-95c2e9755b6f","Type":"ContainerDied","Data":"bc577ac3c48186e140ab6617baedda3a4a9b42e4a655f3600593b06abccae776"} Mar 14 09:06:36 crc kubenswrapper[5129]: I0314 09:06:36.943082 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:06:36 crc kubenswrapper[5129]: I0314 09:06:36.976141 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.090213 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-logs\") pod \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.090700 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d57e4649-42d3-4992-b04e-8697a98f4148-horizon-secret-key\") pod \"d57e4649-42d3-4992-b04e-8697a98f4148\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.090735 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-horizon-secret-key\") pod \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.090777 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48rn8\" (UniqueName: \"kubernetes.io/projected/d57e4649-42d3-4992-b04e-8697a98f4148-kube-api-access-48rn8\") pod \"d57e4649-42d3-4992-b04e-8697a98f4148\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.090826 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-config-data\") pod \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.090857 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-scripts\") pod \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.090888 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d57e4649-42d3-4992-b04e-8697a98f4148-logs\") pod \"d57e4649-42d3-4992-b04e-8697a98f4148\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.090933 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d57e4649-42d3-4992-b04e-8697a98f4148-scripts\") pod \"d57e4649-42d3-4992-b04e-8697a98f4148\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.090992 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4km8\" (UniqueName: \"kubernetes.io/projected/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-kube-api-access-f4km8\") pod \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\" (UID: \"33b20ab0-6f98-4a46-9abf-95c2e9755b6f\") " Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.091124 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-logs" (OuterVolumeSpecName: "logs") pod "33b20ab0-6f98-4a46-9abf-95c2e9755b6f" (UID: "33b20ab0-6f98-4a46-9abf-95c2e9755b6f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.091134 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d57e4649-42d3-4992-b04e-8697a98f4148-config-data\") pod \"d57e4649-42d3-4992-b04e-8697a98f4148\" (UID: \"d57e4649-42d3-4992-b04e-8697a98f4148\") " Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.092113 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.093429 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d57e4649-42d3-4992-b04e-8697a98f4148-logs" (OuterVolumeSpecName: "logs") pod "d57e4649-42d3-4992-b04e-8697a98f4148" (UID: "d57e4649-42d3-4992-b04e-8697a98f4148"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.097844 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "33b20ab0-6f98-4a46-9abf-95c2e9755b6f" (UID: "33b20ab0-6f98-4a46-9abf-95c2e9755b6f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.098424 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d57e4649-42d3-4992-b04e-8697a98f4148-kube-api-access-48rn8" (OuterVolumeSpecName: "kube-api-access-48rn8") pod "d57e4649-42d3-4992-b04e-8697a98f4148" (UID: "d57e4649-42d3-4992-b04e-8697a98f4148"). InnerVolumeSpecName "kube-api-access-48rn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.099488 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-kube-api-access-f4km8" (OuterVolumeSpecName: "kube-api-access-f4km8") pod "33b20ab0-6f98-4a46-9abf-95c2e9755b6f" (UID: "33b20ab0-6f98-4a46-9abf-95c2e9755b6f"). InnerVolumeSpecName "kube-api-access-f4km8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.100242 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d57e4649-42d3-4992-b04e-8697a98f4148-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d57e4649-42d3-4992-b04e-8697a98f4148" (UID: "d57e4649-42d3-4992-b04e-8697a98f4148"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.119994 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d57e4649-42d3-4992-b04e-8697a98f4148-scripts" (OuterVolumeSpecName: "scripts") pod "d57e4649-42d3-4992-b04e-8697a98f4148" (UID: "d57e4649-42d3-4992-b04e-8697a98f4148"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.123752 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d57e4649-42d3-4992-b04e-8697a98f4148-config-data" (OuterVolumeSpecName: "config-data") pod "d57e4649-42d3-4992-b04e-8697a98f4148" (UID: "d57e4649-42d3-4992-b04e-8697a98f4148"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.127260 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-config-data" (OuterVolumeSpecName: "config-data") pod "33b20ab0-6f98-4a46-9abf-95c2e9755b6f" (UID: "33b20ab0-6f98-4a46-9abf-95c2e9755b6f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.133758 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-scripts" (OuterVolumeSpecName: "scripts") pod "33b20ab0-6f98-4a46-9abf-95c2e9755b6f" (UID: "33b20ab0-6f98-4a46-9abf-95c2e9755b6f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.194374 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d57e4649-42d3-4992-b04e-8697a98f4148-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.194457 5129 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d57e4649-42d3-4992-b04e-8697a98f4148-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.194483 5129 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.194503 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48rn8\" (UniqueName: \"kubernetes.io/projected/d57e4649-42d3-4992-b04e-8697a98f4148-kube-api-access-48rn8\") on node \"crc\" DevicePath \"\"" 
Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.194523 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.194544 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.194561 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d57e4649-42d3-4992-b04e-8697a98f4148-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.194582 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d57e4649-42d3-4992-b04e-8697a98f4148-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.194674 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4km8\" (UniqueName: \"kubernetes.io/projected/33b20ab0-6f98-4a46-9abf-95c2e9755b6f-kube-api-access-f4km8\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.613802 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-665f878d64-829mz" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.814877 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.838756 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7955df7789-47mhr" event={"ID":"d57e4649-42d3-4992-b04e-8697a98f4148","Type":"ContainerDied","Data":"01aba64e73dd914607a7ac4eddc760caa963d17bde470a2a057d2477eec4d72d"} Mar 14 09:06:37 crc 
kubenswrapper[5129]: I0314 09:06:37.838820 5129 scope.go:117] "RemoveContainer" containerID="cf2353b90ab105449c577439f3ea834d5569a84a9ae2186b5e8961deb848dd19" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.839008 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7955df7789-47mhr" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.846463 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c75df6b9-7gvq5" event={"ID":"33b20ab0-6f98-4a46-9abf-95c2e9755b6f","Type":"ContainerDied","Data":"f20fb35e170af7e7deb2ebc56546b076591624028a374c787a8732c6f85af3e5"} Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.846547 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c75df6b9-7gvq5" Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.882977 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7955df7789-47mhr"] Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.891997 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7955df7789-47mhr"] Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.901181 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c75df6b9-7gvq5"] Mar 14 09:06:37 crc kubenswrapper[5129]: I0314 09:06:37.910872 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c75df6b9-7gvq5"] Mar 14 09:06:38 crc kubenswrapper[5129]: I0314 09:06:38.020079 5129 scope.go:117] "RemoveContainer" containerID="6d9b716da5177606e161bf97663b84849174c1afdc1d0277494fdc1482d1a070" Mar 14 09:06:38 crc kubenswrapper[5129]: I0314 09:06:38.055937 5129 scope.go:117] "RemoveContainer" containerID="73c2e0ff3e1668f4316dda4b93f6b450f0f727db76e739665424d348d6cd57d2" Mar 14 09:06:38 crc kubenswrapper[5129]: I0314 09:06:38.061363 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="33b20ab0-6f98-4a46-9abf-95c2e9755b6f" path="/var/lib/kubelet/pods/33b20ab0-6f98-4a46-9abf-95c2e9755b6f/volumes" Mar 14 09:06:38 crc kubenswrapper[5129]: I0314 09:06:38.062332 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d57e4649-42d3-4992-b04e-8697a98f4148" path="/var/lib/kubelet/pods/d57e4649-42d3-4992-b04e-8697a98f4148/volumes" Mar 14 09:06:38 crc kubenswrapper[5129]: I0314 09:06:38.228331 5129 scope.go:117] "RemoveContainer" containerID="bc577ac3c48186e140ab6617baedda3a4a9b42e4a655f3600593b06abccae776" Mar 14 09:06:39 crc kubenswrapper[5129]: I0314 09:06:39.322170 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-665f878d64-829mz" Mar 14 09:06:39 crc kubenswrapper[5129]: I0314 09:06:39.751622 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:06:39 crc kubenswrapper[5129]: I0314 09:06:39.820050 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-665f878d64-829mz"] Mar 14 09:06:39 crc kubenswrapper[5129]: I0314 09:06:39.885373 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-665f878d64-829mz" podUID="8c90e758-caeb-4efd-a15b-cb78f7a803dc" containerName="horizon-log" containerID="cri-o://ff772be727c9128267a90a5df562a9968b97b9754798abcabf34ba72866700d4" gracePeriod=30 Mar 14 09:06:39 crc kubenswrapper[5129]: I0314 09:06:39.885530 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-665f878d64-829mz" podUID="8c90e758-caeb-4efd-a15b-cb78f7a803dc" containerName="horizon" containerID="cri-o://6818e97f51f258a4900f86e9a16b0ef5f45e525e67a9858123076236d0554eb7" gracePeriod=30 Mar 14 09:06:43 crc kubenswrapper[5129]: I0314 09:06:43.938789 5129 generic.go:334] "Generic (PLEG): container finished" podID="8c90e758-caeb-4efd-a15b-cb78f7a803dc" 
containerID="6818e97f51f258a4900f86e9a16b0ef5f45e525e67a9858123076236d0554eb7" exitCode=0 Mar 14 09:06:43 crc kubenswrapper[5129]: I0314 09:06:43.938894 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-665f878d64-829mz" event={"ID":"8c90e758-caeb-4efd-a15b-cb78f7a803dc","Type":"ContainerDied","Data":"6818e97f51f258a4900f86e9a16b0ef5f45e525e67a9858123076236d0554eb7"} Mar 14 09:06:45 crc kubenswrapper[5129]: I0314 09:06:45.483165 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-665f878d64-829mz" podUID="8c90e758-caeb-4efd-a15b-cb78f7a803dc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.158:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.158:8443: connect: connection refused" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.033414 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-589757bd64-ph527"] Mar 14 09:06:47 crc kubenswrapper[5129]: E0314 09:06:47.038574 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b8dcd92-d201-4e88-8c81-8b3083bc1a76" containerName="oc" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.038625 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b8dcd92-d201-4e88-8c81-8b3083bc1a76" containerName="oc" Mar 14 09:06:47 crc kubenswrapper[5129]: E0314 09:06:47.038642 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33b20ab0-6f98-4a46-9abf-95c2e9755b6f" containerName="horizon" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.038651 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b20ab0-6f98-4a46-9abf-95c2e9755b6f" containerName="horizon" Mar 14 09:06:47 crc kubenswrapper[5129]: E0314 09:06:47.038665 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d57e4649-42d3-4992-b04e-8697a98f4148" containerName="horizon" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.038671 5129 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d57e4649-42d3-4992-b04e-8697a98f4148" containerName="horizon" Mar 14 09:06:47 crc kubenswrapper[5129]: E0314 09:06:47.038684 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33b20ab0-6f98-4a46-9abf-95c2e9755b6f" containerName="horizon-log" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.038690 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b20ab0-6f98-4a46-9abf-95c2e9755b6f" containerName="horizon-log" Mar 14 09:06:47 crc kubenswrapper[5129]: E0314 09:06:47.038702 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d57e4649-42d3-4992-b04e-8697a98f4148" containerName="horizon-log" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.038708 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d57e4649-42d3-4992-b04e-8697a98f4148" containerName="horizon-log" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.038915 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="33b20ab0-6f98-4a46-9abf-95c2e9755b6f" containerName="horizon-log" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.038934 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b8dcd92-d201-4e88-8c81-8b3083bc1a76" containerName="oc" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.038955 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d57e4649-42d3-4992-b04e-8697a98f4148" containerName="horizon-log" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.038966 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="33b20ab0-6f98-4a46-9abf-95c2e9755b6f" containerName="horizon" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.038975 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d57e4649-42d3-4992-b04e-8697a98f4148" containerName="horizon" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.040875 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.046892 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-589757bd64-ph527"] Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.131302 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fblcz\" (UniqueName: \"kubernetes.io/projected/ccf2191a-b66b-4e06-a811-d38b6b465662-kube-api-access-fblcz\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.131367 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccf2191a-b66b-4e06-a811-d38b6b465662-scripts\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.131446 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccf2191a-b66b-4e06-a811-d38b6b465662-combined-ca-bundle\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.131473 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccf2191a-b66b-4e06-a811-d38b6b465662-logs\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.131508 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ccf2191a-b66b-4e06-a811-d38b6b465662-horizon-tls-certs\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.131535 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccf2191a-b66b-4e06-a811-d38b6b465662-config-data\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.131560 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ccf2191a-b66b-4e06-a811-d38b6b465662-horizon-secret-key\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.234782 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccf2191a-b66b-4e06-a811-d38b6b465662-combined-ca-bundle\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.234971 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccf2191a-b66b-4e06-a811-d38b6b465662-logs\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.235113 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ccf2191a-b66b-4e06-a811-d38b6b465662-horizon-tls-certs\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.235219 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccf2191a-b66b-4e06-a811-d38b6b465662-config-data\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.235326 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ccf2191a-b66b-4e06-a811-d38b6b465662-horizon-secret-key\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.235451 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccf2191a-b66b-4e06-a811-d38b6b465662-logs\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.235732 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fblcz\" (UniqueName: \"kubernetes.io/projected/ccf2191a-b66b-4e06-a811-d38b6b465662-kube-api-access-fblcz\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.235821 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccf2191a-b66b-4e06-a811-d38b6b465662-scripts\") pod \"horizon-589757bd64-ph527\" (UID: 
\"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.237314 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccf2191a-b66b-4e06-a811-d38b6b465662-scripts\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.238284 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccf2191a-b66b-4e06-a811-d38b6b465662-config-data\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.246286 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccf2191a-b66b-4e06-a811-d38b6b465662-combined-ca-bundle\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.251411 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ccf2191a-b66b-4e06-a811-d38b6b465662-horizon-secret-key\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.251669 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccf2191a-b66b-4e06-a811-d38b6b465662-horizon-tls-certs\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 
09:06:47.257122 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fblcz\" (UniqueName: \"kubernetes.io/projected/ccf2191a-b66b-4e06-a811-d38b6b465662-kube-api-access-fblcz\") pod \"horizon-589757bd64-ph527\" (UID: \"ccf2191a-b66b-4e06-a811-d38b6b465662\") " pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.385071 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.673242 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-589757bd64-ph527"] Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.982446 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-589757bd64-ph527" event={"ID":"ccf2191a-b66b-4e06-a811-d38b6b465662","Type":"ContainerStarted","Data":"0a8dfc46f72af90f05c631e999cd4a743c98c53eaf2f4da3ea0a0cb6e2145670"} Mar 14 09:06:47 crc kubenswrapper[5129]: I0314 09:06:47.982959 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-589757bd64-ph527" event={"ID":"ccf2191a-b66b-4e06-a811-d38b6b465662","Type":"ContainerStarted","Data":"bb9c99087ee99471b1fcf45c9be8fd72d59da706256fec25f7f799d81516d4ef"} Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.475527 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-5tt5v"] Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.476748 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-5tt5v" Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.499029 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-5tt5v"] Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.573180 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j5rg\" (UniqueName: \"kubernetes.io/projected/4e78b5da-50f2-4439-946c-a2abca564dd4-kube-api-access-6j5rg\") pod \"heat-db-create-5tt5v\" (UID: \"4e78b5da-50f2-4439-946c-a2abca564dd4\") " pod="openstack/heat-db-create-5tt5v" Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.573268 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e78b5da-50f2-4439-946c-a2abca564dd4-operator-scripts\") pod \"heat-db-create-5tt5v\" (UID: \"4e78b5da-50f2-4439-946c-a2abca564dd4\") " pod="openstack/heat-db-create-5tt5v" Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.579315 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-ca6a-account-create-update-9w5n4"] Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.580628 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-ca6a-account-create-update-9w5n4" Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.582588 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.590757 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-ca6a-account-create-update-9w5n4"] Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.675636 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5msc7\" (UniqueName: \"kubernetes.io/projected/dcd776ac-2907-401a-a4b6-2f89df804f66-kube-api-access-5msc7\") pod \"heat-ca6a-account-create-update-9w5n4\" (UID: \"dcd776ac-2907-401a-a4b6-2f89df804f66\") " pod="openstack/heat-ca6a-account-create-update-9w5n4" Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.675755 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j5rg\" (UniqueName: \"kubernetes.io/projected/4e78b5da-50f2-4439-946c-a2abca564dd4-kube-api-access-6j5rg\") pod \"heat-db-create-5tt5v\" (UID: \"4e78b5da-50f2-4439-946c-a2abca564dd4\") " pod="openstack/heat-db-create-5tt5v" Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.675816 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e78b5da-50f2-4439-946c-a2abca564dd4-operator-scripts\") pod \"heat-db-create-5tt5v\" (UID: \"4e78b5da-50f2-4439-946c-a2abca564dd4\") " pod="openstack/heat-db-create-5tt5v" Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.675903 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd776ac-2907-401a-a4b6-2f89df804f66-operator-scripts\") pod \"heat-ca6a-account-create-update-9w5n4\" (UID: \"dcd776ac-2907-401a-a4b6-2f89df804f66\") " 
pod="openstack/heat-ca6a-account-create-update-9w5n4" Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.676927 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e78b5da-50f2-4439-946c-a2abca564dd4-operator-scripts\") pod \"heat-db-create-5tt5v\" (UID: \"4e78b5da-50f2-4439-946c-a2abca564dd4\") " pod="openstack/heat-db-create-5tt5v" Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.701542 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j5rg\" (UniqueName: \"kubernetes.io/projected/4e78b5da-50f2-4439-946c-a2abca564dd4-kube-api-access-6j5rg\") pod \"heat-db-create-5tt5v\" (UID: \"4e78b5da-50f2-4439-946c-a2abca564dd4\") " pod="openstack/heat-db-create-5tt5v" Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.778108 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd776ac-2907-401a-a4b6-2f89df804f66-operator-scripts\") pod \"heat-ca6a-account-create-update-9w5n4\" (UID: \"dcd776ac-2907-401a-a4b6-2f89df804f66\") " pod="openstack/heat-ca6a-account-create-update-9w5n4" Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.778187 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5msc7\" (UniqueName: \"kubernetes.io/projected/dcd776ac-2907-401a-a4b6-2f89df804f66-kube-api-access-5msc7\") pod \"heat-ca6a-account-create-update-9w5n4\" (UID: \"dcd776ac-2907-401a-a4b6-2f89df804f66\") " pod="openstack/heat-ca6a-account-create-update-9w5n4" Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.778925 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd776ac-2907-401a-a4b6-2f89df804f66-operator-scripts\") pod \"heat-ca6a-account-create-update-9w5n4\" (UID: \"dcd776ac-2907-401a-a4b6-2f89df804f66\") " 
pod="openstack/heat-ca6a-account-create-update-9w5n4" Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.799299 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5msc7\" (UniqueName: \"kubernetes.io/projected/dcd776ac-2907-401a-a4b6-2f89df804f66-kube-api-access-5msc7\") pod \"heat-ca6a-account-create-update-9w5n4\" (UID: \"dcd776ac-2907-401a-a4b6-2f89df804f66\") " pod="openstack/heat-ca6a-account-create-update-9w5n4" Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.817104 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-5tt5v" Mar 14 09:06:48 crc kubenswrapper[5129]: I0314 09:06:48.903713 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-ca6a-account-create-update-9w5n4" Mar 14 09:06:49 crc kubenswrapper[5129]: I0314 09:06:49.000692 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-589757bd64-ph527" event={"ID":"ccf2191a-b66b-4e06-a811-d38b6b465662","Type":"ContainerStarted","Data":"0d4cd32aa6c67ac510e00b7877bf462631cc3de791860755baf5a06dfbec2143"} Mar 14 09:06:49 crc kubenswrapper[5129]: I0314 09:06:49.041922 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-589757bd64-ph527" podStartSLOduration=2.041891676 podStartE2EDuration="2.041891676s" podCreationTimestamp="2026-03-14 09:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:06:49.033133198 +0000 UTC m=+7671.785048382" watchObservedRunningTime="2026-03-14 09:06:49.041891676 +0000 UTC m=+7671.793806870" Mar 14 09:06:49 crc kubenswrapper[5129]: I0314 09:06:49.307440 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-5tt5v"] Mar 14 09:06:49 crc kubenswrapper[5129]: I0314 09:06:49.402378 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/heat-ca6a-account-create-update-9w5n4"] Mar 14 09:06:49 crc kubenswrapper[5129]: W0314 09:06:49.403936 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcd776ac_2907_401a_a4b6_2f89df804f66.slice/crio-8cfad6bac222f4ac3d42cbf1296092e7c585d7b2f4acd5a3faeaf62ab4aa5585 WatchSource:0}: Error finding container 8cfad6bac222f4ac3d42cbf1296092e7c585d7b2f4acd5a3faeaf62ab4aa5585: Status 404 returned error can't find the container with id 8cfad6bac222f4ac3d42cbf1296092e7c585d7b2f4acd5a3faeaf62ab4aa5585 Mar 14 09:06:50 crc kubenswrapper[5129]: I0314 09:06:50.013263 5129 generic.go:334] "Generic (PLEG): container finished" podID="dcd776ac-2907-401a-a4b6-2f89df804f66" containerID="650d3b9af905853e36fcbab021532e7cb5ecef275faf34050170fc42e4af8787" exitCode=0 Mar 14 09:06:50 crc kubenswrapper[5129]: I0314 09:06:50.013330 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-ca6a-account-create-update-9w5n4" event={"ID":"dcd776ac-2907-401a-a4b6-2f89df804f66","Type":"ContainerDied","Data":"650d3b9af905853e36fcbab021532e7cb5ecef275faf34050170fc42e4af8787"} Mar 14 09:06:50 crc kubenswrapper[5129]: I0314 09:06:50.013357 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-ca6a-account-create-update-9w5n4" event={"ID":"dcd776ac-2907-401a-a4b6-2f89df804f66","Type":"ContainerStarted","Data":"8cfad6bac222f4ac3d42cbf1296092e7c585d7b2f4acd5a3faeaf62ab4aa5585"} Mar 14 09:06:50 crc kubenswrapper[5129]: I0314 09:06:50.016498 5129 generic.go:334] "Generic (PLEG): container finished" podID="4e78b5da-50f2-4439-946c-a2abca564dd4" containerID="72b068afd296394873ca0de2278f5daa53f60c53914a04d88f197d77078254c7" exitCode=0 Mar 14 09:06:50 crc kubenswrapper[5129]: I0314 09:06:50.016569 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-5tt5v" 
event={"ID":"4e78b5da-50f2-4439-946c-a2abca564dd4","Type":"ContainerDied","Data":"72b068afd296394873ca0de2278f5daa53f60c53914a04d88f197d77078254c7"} Mar 14 09:06:50 crc kubenswrapper[5129]: I0314 09:06:50.016620 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-5tt5v" event={"ID":"4e78b5da-50f2-4439-946c-a2abca564dd4","Type":"ContainerStarted","Data":"39060d60e9e3bbcd401101b3edc0c1fc66daad48f5898d5da1aba81c30d31cad"} Mar 14 09:06:51 crc kubenswrapper[5129]: I0314 09:06:51.397928 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-5tt5v" Mar 14 09:06:51 crc kubenswrapper[5129]: I0314 09:06:51.404717 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-ca6a-account-create-update-9w5n4" Mar 14 09:06:51 crc kubenswrapper[5129]: I0314 09:06:51.557851 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5msc7\" (UniqueName: \"kubernetes.io/projected/dcd776ac-2907-401a-a4b6-2f89df804f66-kube-api-access-5msc7\") pod \"dcd776ac-2907-401a-a4b6-2f89df804f66\" (UID: \"dcd776ac-2907-401a-a4b6-2f89df804f66\") " Mar 14 09:06:51 crc kubenswrapper[5129]: I0314 09:06:51.557969 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j5rg\" (UniqueName: \"kubernetes.io/projected/4e78b5da-50f2-4439-946c-a2abca564dd4-kube-api-access-6j5rg\") pod \"4e78b5da-50f2-4439-946c-a2abca564dd4\" (UID: \"4e78b5da-50f2-4439-946c-a2abca564dd4\") " Mar 14 09:06:51 crc kubenswrapper[5129]: I0314 09:06:51.558078 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e78b5da-50f2-4439-946c-a2abca564dd4-operator-scripts\") pod \"4e78b5da-50f2-4439-946c-a2abca564dd4\" (UID: \"4e78b5da-50f2-4439-946c-a2abca564dd4\") " Mar 14 09:06:51 crc kubenswrapper[5129]: I0314 09:06:51.558136 5129 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd776ac-2907-401a-a4b6-2f89df804f66-operator-scripts\") pod \"dcd776ac-2907-401a-a4b6-2f89df804f66\" (UID: \"dcd776ac-2907-401a-a4b6-2f89df804f66\") " Mar 14 09:06:51 crc kubenswrapper[5129]: I0314 09:06:51.558902 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e78b5da-50f2-4439-946c-a2abca564dd4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e78b5da-50f2-4439-946c-a2abca564dd4" (UID: "4e78b5da-50f2-4439-946c-a2abca564dd4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:06:51 crc kubenswrapper[5129]: I0314 09:06:51.558980 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd776ac-2907-401a-a4b6-2f89df804f66-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dcd776ac-2907-401a-a4b6-2f89df804f66" (UID: "dcd776ac-2907-401a-a4b6-2f89df804f66"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:06:51 crc kubenswrapper[5129]: I0314 09:06:51.567495 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e78b5da-50f2-4439-946c-a2abca564dd4-kube-api-access-6j5rg" (OuterVolumeSpecName: "kube-api-access-6j5rg") pod "4e78b5da-50f2-4439-946c-a2abca564dd4" (UID: "4e78b5da-50f2-4439-946c-a2abca564dd4"). InnerVolumeSpecName "kube-api-access-6j5rg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:06:51 crc kubenswrapper[5129]: I0314 09:06:51.568759 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd776ac-2907-401a-a4b6-2f89df804f66-kube-api-access-5msc7" (OuterVolumeSpecName: "kube-api-access-5msc7") pod "dcd776ac-2907-401a-a4b6-2f89df804f66" (UID: "dcd776ac-2907-401a-a4b6-2f89df804f66"). InnerVolumeSpecName "kube-api-access-5msc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:06:51 crc kubenswrapper[5129]: I0314 09:06:51.660945 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd776ac-2907-401a-a4b6-2f89df804f66-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:51 crc kubenswrapper[5129]: I0314 09:06:51.661002 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5msc7\" (UniqueName: \"kubernetes.io/projected/dcd776ac-2907-401a-a4b6-2f89df804f66-kube-api-access-5msc7\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:51 crc kubenswrapper[5129]: I0314 09:06:51.661019 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j5rg\" (UniqueName: \"kubernetes.io/projected/4e78b5da-50f2-4439-946c-a2abca564dd4-kube-api-access-6j5rg\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:51 crc kubenswrapper[5129]: I0314 09:06:51.661033 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e78b5da-50f2-4439-946c-a2abca564dd4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:52 crc kubenswrapper[5129]: I0314 09:06:52.035101 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-ca6a-account-create-update-9w5n4" event={"ID":"dcd776ac-2907-401a-a4b6-2f89df804f66","Type":"ContainerDied","Data":"8cfad6bac222f4ac3d42cbf1296092e7c585d7b2f4acd5a3faeaf62ab4aa5585"} Mar 14 09:06:52 crc kubenswrapper[5129]: I0314 
09:06:52.035591 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cfad6bac222f4ac3d42cbf1296092e7c585d7b2f4acd5a3faeaf62ab4aa5585" Mar 14 09:06:52 crc kubenswrapper[5129]: I0314 09:06:52.036993 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-5tt5v" Mar 14 09:06:52 crc kubenswrapper[5129]: I0314 09:06:52.035177 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-ca6a-account-create-update-9w5n4" Mar 14 09:06:52 crc kubenswrapper[5129]: I0314 09:06:52.050266 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-5tt5v" event={"ID":"4e78b5da-50f2-4439-946c-a2abca564dd4","Type":"ContainerDied","Data":"39060d60e9e3bbcd401101b3edc0c1fc66daad48f5898d5da1aba81c30d31cad"} Mar 14 09:06:52 crc kubenswrapper[5129]: I0314 09:06:52.050315 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39060d60e9e3bbcd401101b3edc0c1fc66daad48f5898d5da1aba81c30d31cad" Mar 14 09:06:53 crc kubenswrapper[5129]: I0314 09:06:53.720654 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-qprmq"] Mar 14 09:06:53 crc kubenswrapper[5129]: E0314 09:06:53.721477 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e78b5da-50f2-4439-946c-a2abca564dd4" containerName="mariadb-database-create" Mar 14 09:06:53 crc kubenswrapper[5129]: I0314 09:06:53.721492 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e78b5da-50f2-4439-946c-a2abca564dd4" containerName="mariadb-database-create" Mar 14 09:06:53 crc kubenswrapper[5129]: E0314 09:06:53.721518 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd776ac-2907-401a-a4b6-2f89df804f66" containerName="mariadb-account-create-update" Mar 14 09:06:53 crc kubenswrapper[5129]: I0314 09:06:53.721525 5129 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dcd776ac-2907-401a-a4b6-2f89df804f66" containerName="mariadb-account-create-update" Mar 14 09:06:53 crc kubenswrapper[5129]: I0314 09:06:53.721735 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e78b5da-50f2-4439-946c-a2abca564dd4" containerName="mariadb-database-create" Mar 14 09:06:53 crc kubenswrapper[5129]: I0314 09:06:53.721750 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd776ac-2907-401a-a4b6-2f89df804f66" containerName="mariadb-account-create-update" Mar 14 09:06:53 crc kubenswrapper[5129]: I0314 09:06:53.722449 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-qprmq" Mar 14 09:06:53 crc kubenswrapper[5129]: I0314 09:06:53.724949 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-dfzhn" Mar 14 09:06:53 crc kubenswrapper[5129]: I0314 09:06:53.728938 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 14 09:06:53 crc kubenswrapper[5129]: I0314 09:06:53.732633 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-qprmq"] Mar 14 09:06:53 crc kubenswrapper[5129]: I0314 09:06:53.812617 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db9df730-0f02-4892-b3ed-2273cb131298-combined-ca-bundle\") pod \"heat-db-sync-qprmq\" (UID: \"db9df730-0f02-4892-b3ed-2273cb131298\") " pod="openstack/heat-db-sync-qprmq" Mar 14 09:06:53 crc kubenswrapper[5129]: I0314 09:06:53.812744 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44q7q\" (UniqueName: \"kubernetes.io/projected/db9df730-0f02-4892-b3ed-2273cb131298-kube-api-access-44q7q\") pod \"heat-db-sync-qprmq\" (UID: \"db9df730-0f02-4892-b3ed-2273cb131298\") " pod="openstack/heat-db-sync-qprmq" Mar 14 09:06:53 crc 
kubenswrapper[5129]: I0314 09:06:53.813696 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db9df730-0f02-4892-b3ed-2273cb131298-config-data\") pod \"heat-db-sync-qprmq\" (UID: \"db9df730-0f02-4892-b3ed-2273cb131298\") " pod="openstack/heat-db-sync-qprmq" Mar 14 09:06:53 crc kubenswrapper[5129]: I0314 09:06:53.915774 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db9df730-0f02-4892-b3ed-2273cb131298-combined-ca-bundle\") pod \"heat-db-sync-qprmq\" (UID: \"db9df730-0f02-4892-b3ed-2273cb131298\") " pod="openstack/heat-db-sync-qprmq" Mar 14 09:06:53 crc kubenswrapper[5129]: I0314 09:06:53.915840 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44q7q\" (UniqueName: \"kubernetes.io/projected/db9df730-0f02-4892-b3ed-2273cb131298-kube-api-access-44q7q\") pod \"heat-db-sync-qprmq\" (UID: \"db9df730-0f02-4892-b3ed-2273cb131298\") " pod="openstack/heat-db-sync-qprmq" Mar 14 09:06:53 crc kubenswrapper[5129]: I0314 09:06:53.915967 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db9df730-0f02-4892-b3ed-2273cb131298-config-data\") pod \"heat-db-sync-qprmq\" (UID: \"db9df730-0f02-4892-b3ed-2273cb131298\") " pod="openstack/heat-db-sync-qprmq" Mar 14 09:06:53 crc kubenswrapper[5129]: I0314 09:06:53.924759 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db9df730-0f02-4892-b3ed-2273cb131298-combined-ca-bundle\") pod \"heat-db-sync-qprmq\" (UID: \"db9df730-0f02-4892-b3ed-2273cb131298\") " pod="openstack/heat-db-sync-qprmq" Mar 14 09:06:53 crc kubenswrapper[5129]: I0314 09:06:53.925505 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/db9df730-0f02-4892-b3ed-2273cb131298-config-data\") pod \"heat-db-sync-qprmq\" (UID: \"db9df730-0f02-4892-b3ed-2273cb131298\") " pod="openstack/heat-db-sync-qprmq" Mar 14 09:06:53 crc kubenswrapper[5129]: I0314 09:06:53.943560 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44q7q\" (UniqueName: \"kubernetes.io/projected/db9df730-0f02-4892-b3ed-2273cb131298-kube-api-access-44q7q\") pod \"heat-db-sync-qprmq\" (UID: \"db9df730-0f02-4892-b3ed-2273cb131298\") " pod="openstack/heat-db-sync-qprmq" Mar 14 09:06:54 crc kubenswrapper[5129]: I0314 09:06:54.052809 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-qprmq" Mar 14 09:06:54 crc kubenswrapper[5129]: I0314 09:06:54.642448 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-qprmq"] Mar 14 09:06:55 crc kubenswrapper[5129]: I0314 09:06:55.067974 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-qprmq" event={"ID":"db9df730-0f02-4892-b3ed-2273cb131298","Type":"ContainerStarted","Data":"fda6c8c99026d378518c6c468725ae9f88e9c89f416cf859738a00065b9e5497"} Mar 14 09:06:55 crc kubenswrapper[5129]: I0314 09:06:55.482884 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-665f878d64-829mz" podUID="8c90e758-caeb-4efd-a15b-cb78f7a803dc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.158:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.158:8443: connect: connection refused" Mar 14 09:06:56 crc kubenswrapper[5129]: I0314 09:06:56.062237 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-xb87m"] Mar 14 09:06:56 crc kubenswrapper[5129]: I0314 09:06:56.072207 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8475-account-create-update-nq4b8"] Mar 14 09:06:56 crc kubenswrapper[5129]: I0314 09:06:56.079989 
5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-xb87m"] Mar 14 09:06:56 crc kubenswrapper[5129]: I0314 09:06:56.087299 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8475-account-create-update-nq4b8"] Mar 14 09:06:57 crc kubenswrapper[5129]: I0314 09:06:57.385486 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:57 crc kubenswrapper[5129]: I0314 09:06:57.386978 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-589757bd64-ph527" Mar 14 09:06:58 crc kubenswrapper[5129]: I0314 09:06:58.051282 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="117258eb-cadd-4d21-b2fe-6b2901131e5f" path="/var/lib/kubelet/pods/117258eb-cadd-4d21-b2fe-6b2901131e5f/volumes" Mar 14 09:06:58 crc kubenswrapper[5129]: I0314 09:06:58.052037 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da2020c4-b598-4f6d-9867-337f481bab41" path="/var/lib/kubelet/pods/da2020c4-b598-4f6d-9867-337f481bab41/volumes" Mar 14 09:07:03 crc kubenswrapper[5129]: I0314 09:07:03.143119 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-qprmq" event={"ID":"db9df730-0f02-4892-b3ed-2273cb131298","Type":"ContainerStarted","Data":"c525e0d94418798d9243a949c6127a27e15581ee18c2628c0270ff8236875e14"} Mar 14 09:07:03 crc kubenswrapper[5129]: I0314 09:07:03.172244 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-qprmq" podStartSLOduration=2.578965793 podStartE2EDuration="10.172225s" podCreationTimestamp="2026-03-14 09:06:53 +0000 UTC" firstStartedPulling="2026-03-14 09:06:54.643882307 +0000 UTC m=+7677.395797491" lastFinishedPulling="2026-03-14 09:07:02.237141524 +0000 UTC m=+7684.989056698" observedRunningTime="2026-03-14 09:07:03.155263009 +0000 UTC m=+7685.907178193" watchObservedRunningTime="2026-03-14 
09:07:03.172225 +0000 UTC m=+7685.924140184" Mar 14 09:07:05 crc kubenswrapper[5129]: I0314 09:07:05.169175 5129 generic.go:334] "Generic (PLEG): container finished" podID="db9df730-0f02-4892-b3ed-2273cb131298" containerID="c525e0d94418798d9243a949c6127a27e15581ee18c2628c0270ff8236875e14" exitCode=0 Mar 14 09:07:05 crc kubenswrapper[5129]: I0314 09:07:05.169273 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-qprmq" event={"ID":"db9df730-0f02-4892-b3ed-2273cb131298","Type":"ContainerDied","Data":"c525e0d94418798d9243a949c6127a27e15581ee18c2628c0270ff8236875e14"} Mar 14 09:07:05 crc kubenswrapper[5129]: I0314 09:07:05.483854 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-665f878d64-829mz" podUID="8c90e758-caeb-4efd-a15b-cb78f7a803dc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.158:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.158:8443: connect: connection refused" Mar 14 09:07:05 crc kubenswrapper[5129]: I0314 09:07:05.483983 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-665f878d64-829mz" Mar 14 09:07:06 crc kubenswrapper[5129]: I0314 09:07:06.067241 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-prd8k"] Mar 14 09:07:06 crc kubenswrapper[5129]: I0314 09:07:06.079798 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-prd8k"] Mar 14 09:07:06 crc kubenswrapper[5129]: I0314 09:07:06.567911 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-qprmq" Mar 14 09:07:06 crc kubenswrapper[5129]: I0314 09:07:06.696613 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db9df730-0f02-4892-b3ed-2273cb131298-combined-ca-bundle\") pod \"db9df730-0f02-4892-b3ed-2273cb131298\" (UID: \"db9df730-0f02-4892-b3ed-2273cb131298\") " Mar 14 09:07:06 crc kubenswrapper[5129]: I0314 09:07:06.696863 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db9df730-0f02-4892-b3ed-2273cb131298-config-data\") pod \"db9df730-0f02-4892-b3ed-2273cb131298\" (UID: \"db9df730-0f02-4892-b3ed-2273cb131298\") " Mar 14 09:07:06 crc kubenswrapper[5129]: I0314 09:07:06.697099 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44q7q\" (UniqueName: \"kubernetes.io/projected/db9df730-0f02-4892-b3ed-2273cb131298-kube-api-access-44q7q\") pod \"db9df730-0f02-4892-b3ed-2273cb131298\" (UID: \"db9df730-0f02-4892-b3ed-2273cb131298\") " Mar 14 09:07:06 crc kubenswrapper[5129]: I0314 09:07:06.704095 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db9df730-0f02-4892-b3ed-2273cb131298-kube-api-access-44q7q" (OuterVolumeSpecName: "kube-api-access-44q7q") pod "db9df730-0f02-4892-b3ed-2273cb131298" (UID: "db9df730-0f02-4892-b3ed-2273cb131298"). InnerVolumeSpecName "kube-api-access-44q7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:07:06 crc kubenswrapper[5129]: I0314 09:07:06.765421 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db9df730-0f02-4892-b3ed-2273cb131298-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db9df730-0f02-4892-b3ed-2273cb131298" (UID: "db9df730-0f02-4892-b3ed-2273cb131298"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:06 crc kubenswrapper[5129]: I0314 09:07:06.801237 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db9df730-0f02-4892-b3ed-2273cb131298-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:06 crc kubenswrapper[5129]: I0314 09:07:06.801303 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44q7q\" (UniqueName: \"kubernetes.io/projected/db9df730-0f02-4892-b3ed-2273cb131298-kube-api-access-44q7q\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:06 crc kubenswrapper[5129]: I0314 09:07:06.813385 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db9df730-0f02-4892-b3ed-2273cb131298-config-data" (OuterVolumeSpecName: "config-data") pod "db9df730-0f02-4892-b3ed-2273cb131298" (UID: "db9df730-0f02-4892-b3ed-2273cb131298"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:06 crc kubenswrapper[5129]: I0314 09:07:06.904256 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db9df730-0f02-4892-b3ed-2273cb131298-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:07 crc kubenswrapper[5129]: I0314 09:07:07.188243 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-qprmq" event={"ID":"db9df730-0f02-4892-b3ed-2273cb131298","Type":"ContainerDied","Data":"fda6c8c99026d378518c6c468725ae9f88e9c89f416cf859738a00065b9e5497"} Mar 14 09:07:07 crc kubenswrapper[5129]: I0314 09:07:07.188446 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fda6c8c99026d378518c6c468725ae9f88e9c89f416cf859738a00065b9e5497" Mar 14 09:07:07 crc kubenswrapper[5129]: I0314 09:07:07.188332 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-qprmq" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.048832 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fa745c2-ed72-4a7a-8a5a-ee1733246f11" path="/var/lib/kubelet/pods/7fa745c2-ed72-4a7a-8a5a-ee1733246f11/volumes" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.392670 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6bd79d47cd-8jc86"] Mar 14 09:07:08 crc kubenswrapper[5129]: E0314 09:07:08.393143 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db9df730-0f02-4892-b3ed-2273cb131298" containerName="heat-db-sync" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.393163 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="db9df730-0f02-4892-b3ed-2273cb131298" containerName="heat-db-sync" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.393322 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="db9df730-0f02-4892-b3ed-2273cb131298" containerName="heat-db-sync" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.393996 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6bd79d47cd-8jc86" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.397502 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.397817 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.398097 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-dfzhn" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.411439 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6bd79d47cd-8jc86"] Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.510922 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6d476769b-mw4rk"] Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.513153 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6d476769b-mw4rk" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.517415 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.526342 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7cb7bfc8bd-fnxjv"] Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.527473 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.530984 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.539507 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d710d033-8d28-4007-9088-1b98afe917da-config-data-custom\") pod \"heat-engine-6bd79d47cd-8jc86\" (UID: \"d710d033-8d28-4007-9088-1b98afe917da\") " pod="openstack/heat-engine-6bd79d47cd-8jc86" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.539653 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jdq7\" (UniqueName: \"kubernetes.io/projected/d710d033-8d28-4007-9088-1b98afe917da-kube-api-access-7jdq7\") pod \"heat-engine-6bd79d47cd-8jc86\" (UID: \"d710d033-8d28-4007-9088-1b98afe917da\") " pod="openstack/heat-engine-6bd79d47cd-8jc86" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.539740 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d710d033-8d28-4007-9088-1b98afe917da-config-data\") pod \"heat-engine-6bd79d47cd-8jc86\" (UID: \"d710d033-8d28-4007-9088-1b98afe917da\") " pod="openstack/heat-engine-6bd79d47cd-8jc86" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.539835 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d710d033-8d28-4007-9088-1b98afe917da-combined-ca-bundle\") pod \"heat-engine-6bd79d47cd-8jc86\" (UID: \"d710d033-8d28-4007-9088-1b98afe917da\") " pod="openstack/heat-engine-6bd79d47cd-8jc86" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.544720 5129 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/heat-api-6d476769b-mw4rk"] Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.561057 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7cb7bfc8bd-fnxjv"] Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.642213 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c89eb1-1971-49aa-9967-2fde67ead88a-combined-ca-bundle\") pod \"heat-api-6d476769b-mw4rk\" (UID: \"e4c89eb1-1971-49aa-9967-2fde67ead88a\") " pod="openstack/heat-api-6d476769b-mw4rk" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.642320 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52pw9\" (UniqueName: \"kubernetes.io/projected/e4c89eb1-1971-49aa-9967-2fde67ead88a-kube-api-access-52pw9\") pod \"heat-api-6d476769b-mw4rk\" (UID: \"e4c89eb1-1971-49aa-9967-2fde67ead88a\") " pod="openstack/heat-api-6d476769b-mw4rk" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.642384 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c89eb1-1971-49aa-9967-2fde67ead88a-config-data\") pod \"heat-api-6d476769b-mw4rk\" (UID: \"e4c89eb1-1971-49aa-9967-2fde67ead88a\") " pod="openstack/heat-api-6d476769b-mw4rk" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.642482 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86qgz\" (UniqueName: \"kubernetes.io/projected/9acd8d94-20a0-4529-be0f-ddcac0466c8f-kube-api-access-86qgz\") pod \"heat-cfnapi-7cb7bfc8bd-fnxjv\" (UID: \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\") " pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.642720 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d710d033-8d28-4007-9088-1b98afe917da-config-data-custom\") pod \"heat-engine-6bd79d47cd-8jc86\" (UID: \"d710d033-8d28-4007-9088-1b98afe917da\") " pod="openstack/heat-engine-6bd79d47cd-8jc86" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.642754 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9acd8d94-20a0-4529-be0f-ddcac0466c8f-config-data\") pod \"heat-cfnapi-7cb7bfc8bd-fnxjv\" (UID: \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\") " pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.642803 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jdq7\" (UniqueName: \"kubernetes.io/projected/d710d033-8d28-4007-9088-1b98afe917da-kube-api-access-7jdq7\") pod \"heat-engine-6bd79d47cd-8jc86\" (UID: \"d710d033-8d28-4007-9088-1b98afe917da\") " pod="openstack/heat-engine-6bd79d47cd-8jc86" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.642860 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4c89eb1-1971-49aa-9967-2fde67ead88a-config-data-custom\") pod \"heat-api-6d476769b-mw4rk\" (UID: \"e4c89eb1-1971-49aa-9967-2fde67ead88a\") " pod="openstack/heat-api-6d476769b-mw4rk" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.642892 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9acd8d94-20a0-4529-be0f-ddcac0466c8f-config-data-custom\") pod \"heat-cfnapi-7cb7bfc8bd-fnxjv\" (UID: \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\") " pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.642921 5129 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d710d033-8d28-4007-9088-1b98afe917da-config-data\") pod \"heat-engine-6bd79d47cd-8jc86\" (UID: \"d710d033-8d28-4007-9088-1b98afe917da\") " pod="openstack/heat-engine-6bd79d47cd-8jc86" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.642953 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9acd8d94-20a0-4529-be0f-ddcac0466c8f-combined-ca-bundle\") pod \"heat-cfnapi-7cb7bfc8bd-fnxjv\" (UID: \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\") " pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.643034 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d710d033-8d28-4007-9088-1b98afe917da-combined-ca-bundle\") pod \"heat-engine-6bd79d47cd-8jc86\" (UID: \"d710d033-8d28-4007-9088-1b98afe917da\") " pod="openstack/heat-engine-6bd79d47cd-8jc86" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.649865 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d710d033-8d28-4007-9088-1b98afe917da-config-data\") pod \"heat-engine-6bd79d47cd-8jc86\" (UID: \"d710d033-8d28-4007-9088-1b98afe917da\") " pod="openstack/heat-engine-6bd79d47cd-8jc86" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.651453 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d710d033-8d28-4007-9088-1b98afe917da-combined-ca-bundle\") pod \"heat-engine-6bd79d47cd-8jc86\" (UID: \"d710d033-8d28-4007-9088-1b98afe917da\") " pod="openstack/heat-engine-6bd79d47cd-8jc86" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.660578 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d710d033-8d28-4007-9088-1b98afe917da-config-data-custom\") pod \"heat-engine-6bd79d47cd-8jc86\" (UID: \"d710d033-8d28-4007-9088-1b98afe917da\") " pod="openstack/heat-engine-6bd79d47cd-8jc86" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.664234 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jdq7\" (UniqueName: \"kubernetes.io/projected/d710d033-8d28-4007-9088-1b98afe917da-kube-api-access-7jdq7\") pod \"heat-engine-6bd79d47cd-8jc86\" (UID: \"d710d033-8d28-4007-9088-1b98afe917da\") " pod="openstack/heat-engine-6bd79d47cd-8jc86" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.738043 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6bd79d47cd-8jc86" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.744867 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c89eb1-1971-49aa-9967-2fde67ead88a-combined-ca-bundle\") pod \"heat-api-6d476769b-mw4rk\" (UID: \"e4c89eb1-1971-49aa-9967-2fde67ead88a\") " pod="openstack/heat-api-6d476769b-mw4rk" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.744932 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52pw9\" (UniqueName: \"kubernetes.io/projected/e4c89eb1-1971-49aa-9967-2fde67ead88a-kube-api-access-52pw9\") pod \"heat-api-6d476769b-mw4rk\" (UID: \"e4c89eb1-1971-49aa-9967-2fde67ead88a\") " pod="openstack/heat-api-6d476769b-mw4rk" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.744960 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c89eb1-1971-49aa-9967-2fde67ead88a-config-data\") pod \"heat-api-6d476769b-mw4rk\" (UID: \"e4c89eb1-1971-49aa-9967-2fde67ead88a\") " pod="openstack/heat-api-6d476769b-mw4rk" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 
09:07:08.744997 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86qgz\" (UniqueName: \"kubernetes.io/projected/9acd8d94-20a0-4529-be0f-ddcac0466c8f-kube-api-access-86qgz\") pod \"heat-cfnapi-7cb7bfc8bd-fnxjv\" (UID: \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\") " pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.745065 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9acd8d94-20a0-4529-be0f-ddcac0466c8f-config-data\") pod \"heat-cfnapi-7cb7bfc8bd-fnxjv\" (UID: \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\") " pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.745104 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4c89eb1-1971-49aa-9967-2fde67ead88a-config-data-custom\") pod \"heat-api-6d476769b-mw4rk\" (UID: \"e4c89eb1-1971-49aa-9967-2fde67ead88a\") " pod="openstack/heat-api-6d476769b-mw4rk" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.745132 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9acd8d94-20a0-4529-be0f-ddcac0466c8f-config-data-custom\") pod \"heat-cfnapi-7cb7bfc8bd-fnxjv\" (UID: \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\") " pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.745155 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9acd8d94-20a0-4529-be0f-ddcac0466c8f-combined-ca-bundle\") pod \"heat-cfnapi-7cb7bfc8bd-fnxjv\" (UID: \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\") " pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.755597 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c89eb1-1971-49aa-9967-2fde67ead88a-config-data\") pod \"heat-api-6d476769b-mw4rk\" (UID: \"e4c89eb1-1971-49aa-9967-2fde67ead88a\") " pod="openstack/heat-api-6d476769b-mw4rk" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.757908 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9acd8d94-20a0-4529-be0f-ddcac0466c8f-combined-ca-bundle\") pod \"heat-cfnapi-7cb7bfc8bd-fnxjv\" (UID: \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\") " pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.759724 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4c89eb1-1971-49aa-9967-2fde67ead88a-config-data-custom\") pod \"heat-api-6d476769b-mw4rk\" (UID: \"e4c89eb1-1971-49aa-9967-2fde67ead88a\") " pod="openstack/heat-api-6d476769b-mw4rk" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.761208 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c89eb1-1971-49aa-9967-2fde67ead88a-combined-ca-bundle\") pod \"heat-api-6d476769b-mw4rk\" (UID: \"e4c89eb1-1971-49aa-9967-2fde67ead88a\") " pod="openstack/heat-api-6d476769b-mw4rk" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.762225 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9acd8d94-20a0-4529-be0f-ddcac0466c8f-config-data-custom\") pod \"heat-cfnapi-7cb7bfc8bd-fnxjv\" (UID: \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\") " pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.768335 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52pw9\" (UniqueName: 
\"kubernetes.io/projected/e4c89eb1-1971-49aa-9967-2fde67ead88a-kube-api-access-52pw9\") pod \"heat-api-6d476769b-mw4rk\" (UID: \"e4c89eb1-1971-49aa-9967-2fde67ead88a\") " pod="openstack/heat-api-6d476769b-mw4rk" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.775525 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86qgz\" (UniqueName: \"kubernetes.io/projected/9acd8d94-20a0-4529-be0f-ddcac0466c8f-kube-api-access-86qgz\") pod \"heat-cfnapi-7cb7bfc8bd-fnxjv\" (UID: \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\") " pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.782440 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9acd8d94-20a0-4529-be0f-ddcac0466c8f-config-data\") pod \"heat-cfnapi-7cb7bfc8bd-fnxjv\" (UID: \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\") " pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.839889 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6d476769b-mw4rk" Mar 14 09:07:08 crc kubenswrapper[5129]: I0314 09:07:08.857738 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" Mar 14 09:07:09 crc kubenswrapper[5129]: W0314 09:07:09.301579 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd710d033_8d28_4007_9088_1b98afe917da.slice/crio-b1d10b9c64d2f64e156801bc2c7812d8f324405bdafdaf4a43ad950b246d25a7 WatchSource:0}: Error finding container b1d10b9c64d2f64e156801bc2c7812d8f324405bdafdaf4a43ad950b246d25a7: Status 404 returned error can't find the container with id b1d10b9c64d2f64e156801bc2c7812d8f324405bdafdaf4a43ad950b246d25a7 Mar 14 09:07:09 crc kubenswrapper[5129]: I0314 09:07:09.307942 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6bd79d47cd-8jc86"] Mar 14 09:07:09 crc kubenswrapper[5129]: I0314 09:07:09.402875 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6d476769b-mw4rk"] Mar 14 09:07:09 crc kubenswrapper[5129]: I0314 09:07:09.420130 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7cb7bfc8bd-fnxjv"] Mar 14 09:07:09 crc kubenswrapper[5129]: W0314 09:07:09.445654 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9acd8d94_20a0_4529_be0f_ddcac0466c8f.slice/crio-b8e15abc4b30ec14260b47993b5a6503160fd54179da4520e2b64504fd064eb8 WatchSource:0}: Error finding container b8e15abc4b30ec14260b47993b5a6503160fd54179da4520e2b64504fd064eb8: Status 404 returned error can't find the container with id b8e15abc4b30ec14260b47993b5a6503160fd54179da4520e2b64504fd064eb8 Mar 14 09:07:09 crc kubenswrapper[5129]: I0314 09:07:09.949125 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-589757bd64-ph527" Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.226365 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" 
event={"ID":"9acd8d94-20a0-4529-be0f-ddcac0466c8f","Type":"ContainerStarted","Data":"b8e15abc4b30ec14260b47993b5a6503160fd54179da4520e2b64504fd064eb8"} Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.229415 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6d476769b-mw4rk" event={"ID":"e4c89eb1-1971-49aa-9967-2fde67ead88a","Type":"ContainerStarted","Data":"3b6eb43425a3b3c998ffc41fe71e489f610b40c663f8cd84944e5608e22791c7"} Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.236040 5129 generic.go:334] "Generic (PLEG): container finished" podID="8c90e758-caeb-4efd-a15b-cb78f7a803dc" containerID="ff772be727c9128267a90a5df562a9968b97b9754798abcabf34ba72866700d4" exitCode=137 Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.236163 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-665f878d64-829mz" event={"ID":"8c90e758-caeb-4efd-a15b-cb78f7a803dc","Type":"ContainerDied","Data":"ff772be727c9128267a90a5df562a9968b97b9754798abcabf34ba72866700d4"} Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.241072 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6bd79d47cd-8jc86" event={"ID":"d710d033-8d28-4007-9088-1b98afe917da","Type":"ContainerStarted","Data":"5d7a56745fa6261133cb60302e4a06eaf958b1e5ba92166579b727dca336b026"} Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.241133 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6bd79d47cd-8jc86" event={"ID":"d710d033-8d28-4007-9088-1b98afe917da","Type":"ContainerStarted","Data":"b1d10b9c64d2f64e156801bc2c7812d8f324405bdafdaf4a43ad950b246d25a7"} Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.241337 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6bd79d47cd-8jc86" Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.274544 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/heat-engine-6bd79d47cd-8jc86" podStartSLOduration=2.274524884 podStartE2EDuration="2.274524884s" podCreationTimestamp="2026-03-14 09:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:07:10.266364302 +0000 UTC m=+7693.018279486" watchObservedRunningTime="2026-03-14 09:07:10.274524884 +0000 UTC m=+7693.026440058" Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.350338 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-665f878d64-829mz" Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.493207 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c90e758-caeb-4efd-a15b-cb78f7a803dc-horizon-tls-certs\") pod \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.493288 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c90e758-caeb-4efd-a15b-cb78f7a803dc-logs\") pod \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.493370 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zllt5\" (UniqueName: \"kubernetes.io/projected/8c90e758-caeb-4efd-a15b-cb78f7a803dc-kube-api-access-zllt5\") pod \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.493448 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c90e758-caeb-4efd-a15b-cb78f7a803dc-scripts\") pod \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\" (UID: 
\"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.493551 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c90e758-caeb-4efd-a15b-cb78f7a803dc-combined-ca-bundle\") pod \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.493633 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8c90e758-caeb-4efd-a15b-cb78f7a803dc-horizon-secret-key\") pod \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.493670 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c90e758-caeb-4efd-a15b-cb78f7a803dc-config-data\") pod \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\" (UID: \"8c90e758-caeb-4efd-a15b-cb78f7a803dc\") " Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.493972 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c90e758-caeb-4efd-a15b-cb78f7a803dc-logs" (OuterVolumeSpecName: "logs") pod "8c90e758-caeb-4efd-a15b-cb78f7a803dc" (UID: "8c90e758-caeb-4efd-a15b-cb78f7a803dc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.494136 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c90e758-caeb-4efd-a15b-cb78f7a803dc-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.504927 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c90e758-caeb-4efd-a15b-cb78f7a803dc-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8c90e758-caeb-4efd-a15b-cb78f7a803dc" (UID: "8c90e758-caeb-4efd-a15b-cb78f7a803dc"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.518869 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c90e758-caeb-4efd-a15b-cb78f7a803dc-kube-api-access-zllt5" (OuterVolumeSpecName: "kube-api-access-zllt5") pod "8c90e758-caeb-4efd-a15b-cb78f7a803dc" (UID: "8c90e758-caeb-4efd-a15b-cb78f7a803dc"). InnerVolumeSpecName "kube-api-access-zllt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.530475 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c90e758-caeb-4efd-a15b-cb78f7a803dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c90e758-caeb-4efd-a15b-cb78f7a803dc" (UID: "8c90e758-caeb-4efd-a15b-cb78f7a803dc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.565971 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c90e758-caeb-4efd-a15b-cb78f7a803dc-config-data" (OuterVolumeSpecName: "config-data") pod "8c90e758-caeb-4efd-a15b-cb78f7a803dc" (UID: "8c90e758-caeb-4efd-a15b-cb78f7a803dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.599979 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c90e758-caeb-4efd-a15b-cb78f7a803dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.600109 5129 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8c90e758-caeb-4efd-a15b-cb78f7a803dc-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.600125 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c90e758-caeb-4efd-a15b-cb78f7a803dc-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.600145 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zllt5\" (UniqueName: \"kubernetes.io/projected/8c90e758-caeb-4efd-a15b-cb78f7a803dc-kube-api-access-zllt5\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.608771 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c90e758-caeb-4efd-a15b-cb78f7a803dc-scripts" (OuterVolumeSpecName: "scripts") pod "8c90e758-caeb-4efd-a15b-cb78f7a803dc" (UID: "8c90e758-caeb-4efd-a15b-cb78f7a803dc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.638573 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c90e758-caeb-4efd-a15b-cb78f7a803dc-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "8c90e758-caeb-4efd-a15b-cb78f7a803dc" (UID: "8c90e758-caeb-4efd-a15b-cb78f7a803dc"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.701638 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c90e758-caeb-4efd-a15b-cb78f7a803dc-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:10 crc kubenswrapper[5129]: I0314 09:07:10.701675 5129 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c90e758-caeb-4efd-a15b-cb78f7a803dc-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:11 crc kubenswrapper[5129]: I0314 09:07:11.255347 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-665f878d64-829mz" event={"ID":"8c90e758-caeb-4efd-a15b-cb78f7a803dc","Type":"ContainerDied","Data":"fef262d71cae2801b3eed734064c53504316947c7dc28f1bc73fc76c16de1c40"} Mar 14 09:07:11 crc kubenswrapper[5129]: I0314 09:07:11.255425 5129 scope.go:117] "RemoveContainer" containerID="6818e97f51f258a4900f86e9a16b0ef5f45e525e67a9858123076236d0554eb7" Mar 14 09:07:11 crc kubenswrapper[5129]: I0314 09:07:11.255501 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-665f878d64-829mz" Mar 14 09:07:11 crc kubenswrapper[5129]: I0314 09:07:11.307979 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-665f878d64-829mz"] Mar 14 09:07:11 crc kubenswrapper[5129]: I0314 09:07:11.318053 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-665f878d64-829mz"] Mar 14 09:07:11 crc kubenswrapper[5129]: I0314 09:07:11.839621 5129 scope.go:117] "RemoveContainer" containerID="ff772be727c9128267a90a5df562a9968b97b9754798abcabf34ba72866700d4" Mar 14 09:07:11 crc kubenswrapper[5129]: I0314 09:07:11.840313 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-589757bd64-ph527" Mar 14 09:07:11 crc kubenswrapper[5129]: I0314 09:07:11.910566 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-659c856df6-q8n7k"] Mar 14 09:07:11 crc kubenswrapper[5129]: I0314 09:07:11.910836 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-659c856df6-q8n7k" podUID="444cdcb0-f68c-4943-aa88-dc3710848a7d" containerName="horizon-log" containerID="cri-o://c20e6877f1ebf04753afceed6552cbfb2b38c48881a1d842ce09335d1fd9d682" gracePeriod=30 Mar 14 09:07:11 crc kubenswrapper[5129]: I0314 09:07:11.910989 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-659c856df6-q8n7k" podUID="444cdcb0-f68c-4943-aa88-dc3710848a7d" containerName="horizon" containerID="cri-o://891fc688af2ec8e829eea7544e8bc2e52924fd1da1d453a8aba3f1e4a3b9b2c6" gracePeriod=30 Mar 14 09:07:12 crc kubenswrapper[5129]: I0314 09:07:12.058176 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c90e758-caeb-4efd-a15b-cb78f7a803dc" path="/var/lib/kubelet/pods/8c90e758-caeb-4efd-a15b-cb78f7a803dc/volumes" Mar 14 09:07:12 crc kubenswrapper[5129]: I0314 09:07:12.271646 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" event={"ID":"9acd8d94-20a0-4529-be0f-ddcac0466c8f","Type":"ContainerStarted","Data":"211e96e9e63cedbf9177d25a4a8f01573aa9a3f9ce84f919086bca1627897abf"} Mar 14 09:07:12 crc kubenswrapper[5129]: I0314 09:07:12.272935 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" Mar 14 09:07:12 crc kubenswrapper[5129]: I0314 09:07:12.278182 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6d476769b-mw4rk" event={"ID":"e4c89eb1-1971-49aa-9967-2fde67ead88a","Type":"ContainerStarted","Data":"7c6b67416022f7d85d238edddc864bdff41d7d4c8b5fa0e29ad294ffbe47eb8d"} Mar 14 09:07:12 crc kubenswrapper[5129]: I0314 09:07:12.278386 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6d476769b-mw4rk" Mar 14 09:07:12 crc kubenswrapper[5129]: I0314 09:07:12.294931 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" podStartSLOduration=1.8281456999999999 podStartE2EDuration="4.294911511s" podCreationTimestamp="2026-03-14 09:07:08 +0000 UTC" firstStartedPulling="2026-03-14 09:07:09.448544789 +0000 UTC m=+7692.200459973" lastFinishedPulling="2026-03-14 09:07:11.9153106 +0000 UTC m=+7694.667225784" observedRunningTime="2026-03-14 09:07:12.29117839 +0000 UTC m=+7695.043093594" watchObservedRunningTime="2026-03-14 09:07:12.294911511 +0000 UTC m=+7695.046826695" Mar 14 09:07:12 crc kubenswrapper[5129]: I0314 09:07:12.333627 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6d476769b-mw4rk" podStartSLOduration=1.840152446 podStartE2EDuration="4.333584s" podCreationTimestamp="2026-03-14 09:07:08 +0000 UTC" firstStartedPulling="2026-03-14 09:07:09.425741101 +0000 UTC m=+7692.177656285" lastFinishedPulling="2026-03-14 09:07:11.919172665 +0000 UTC m=+7694.671087839" observedRunningTime="2026-03-14 09:07:12.319065007 +0000 
UTC m=+7695.070980201" watchObservedRunningTime="2026-03-14 09:07:12.333584 +0000 UTC m=+7695.085499184" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.312272 5129 generic.go:334] "Generic (PLEG): container finished" podID="444cdcb0-f68c-4943-aa88-dc3710848a7d" containerID="891fc688af2ec8e829eea7544e8bc2e52924fd1da1d453a8aba3f1e4a3b9b2c6" exitCode=0 Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.313070 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-659c856df6-q8n7k" event={"ID":"444cdcb0-f68c-4943-aa88-dc3710848a7d","Type":"ContainerDied","Data":"891fc688af2ec8e829eea7544e8bc2e52924fd1da1d453a8aba3f1e4a3b9b2c6"} Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.485863 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-85c4d5f8c-p2pkn"] Mar 14 09:07:15 crc kubenswrapper[5129]: E0314 09:07:15.486576 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c90e758-caeb-4efd-a15b-cb78f7a803dc" containerName="horizon-log" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.486613 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c90e758-caeb-4efd-a15b-cb78f7a803dc" containerName="horizon-log" Mar 14 09:07:15 crc kubenswrapper[5129]: E0314 09:07:15.486658 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c90e758-caeb-4efd-a15b-cb78f7a803dc" containerName="horizon" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.486669 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c90e758-caeb-4efd-a15b-cb78f7a803dc" containerName="horizon" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.486937 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c90e758-caeb-4efd-a15b-cb78f7a803dc" containerName="horizon" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.486965 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c90e758-caeb-4efd-a15b-cb78f7a803dc" containerName="horizon-log" Mar 14 
09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.487961 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-85c4d5f8c-p2pkn" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.496669 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-85c4d5f8c-p2pkn"] Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.567876 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5ffb7f87f4-dr5qd"] Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.569869 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.593879 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7cc9c6d9db-l2g7q"] Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.595825 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7cc9c6d9db-l2g7q" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.605093 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5ffb7f87f4-dr5qd"] Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.617242 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc-config-data\") pod \"heat-engine-85c4d5f8c-p2pkn\" (UID: \"6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc\") " pod="openstack/heat-engine-85c4d5f8c-p2pkn" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.617324 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fwzr\" (UniqueName: \"kubernetes.io/projected/e416d899-7d11-4e29-a4d6-03774e65691a-kube-api-access-5fwzr\") pod \"heat-cfnapi-5ffb7f87f4-dr5qd\" (UID: \"e416d899-7d11-4e29-a4d6-03774e65691a\") " 
pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.617502 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc-config-data-custom\") pod \"heat-engine-85c4d5f8c-p2pkn\" (UID: \"6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc\") " pod="openstack/heat-engine-85c4d5f8c-p2pkn" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.619338 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7cc9c6d9db-l2g7q"] Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.620365 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ws4z\" (UniqueName: \"kubernetes.io/projected/6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc-kube-api-access-7ws4z\") pod \"heat-engine-85c4d5f8c-p2pkn\" (UID: \"6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc\") " pod="openstack/heat-engine-85c4d5f8c-p2pkn" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.620468 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e416d899-7d11-4e29-a4d6-03774e65691a-config-data-custom\") pod \"heat-cfnapi-5ffb7f87f4-dr5qd\" (UID: \"e416d899-7d11-4e29-a4d6-03774e65691a\") " pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.620712 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e416d899-7d11-4e29-a4d6-03774e65691a-config-data\") pod \"heat-cfnapi-5ffb7f87f4-dr5qd\" (UID: \"e416d899-7d11-4e29-a4d6-03774e65691a\") " pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.620800 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc-combined-ca-bundle\") pod \"heat-engine-85c4d5f8c-p2pkn\" (UID: \"6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc\") " pod="openstack/heat-engine-85c4d5f8c-p2pkn" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.620986 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e416d899-7d11-4e29-a4d6-03774e65691a-combined-ca-bundle\") pod \"heat-cfnapi-5ffb7f87f4-dr5qd\" (UID: \"e416d899-7d11-4e29-a4d6-03774e65691a\") " pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.637527 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-659c856df6-q8n7k" podUID="444cdcb0-f68c-4943-aa88-dc3710848a7d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.159:8443: connect: connection refused" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.723326 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e416d899-7d11-4e29-a4d6-03774e65691a-combined-ca-bundle\") pod \"heat-cfnapi-5ffb7f87f4-dr5qd\" (UID: \"e416d899-7d11-4e29-a4d6-03774e65691a\") " pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.723423 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3625535f-12fd-4bcf-93ca-bee191372744-combined-ca-bundle\") pod \"heat-api-7cc9c6d9db-l2g7q\" (UID: \"3625535f-12fd-4bcf-93ca-bee191372744\") " pod="openstack/heat-api-7cc9c6d9db-l2g7q" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.723527 5129 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc-config-data\") pod \"heat-engine-85c4d5f8c-p2pkn\" (UID: \"6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc\") " pod="openstack/heat-engine-85c4d5f8c-p2pkn" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.723560 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fwzr\" (UniqueName: \"kubernetes.io/projected/e416d899-7d11-4e29-a4d6-03774e65691a-kube-api-access-5fwzr\") pod \"heat-cfnapi-5ffb7f87f4-dr5qd\" (UID: \"e416d899-7d11-4e29-a4d6-03774e65691a\") " pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.724403 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc-config-data-custom\") pod \"heat-engine-85c4d5f8c-p2pkn\" (UID: \"6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc\") " pod="openstack/heat-engine-85c4d5f8c-p2pkn" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.724932 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3625535f-12fd-4bcf-93ca-bee191372744-config-data-custom\") pod \"heat-api-7cc9c6d9db-l2g7q\" (UID: \"3625535f-12fd-4bcf-93ca-bee191372744\") " pod="openstack/heat-api-7cc9c6d9db-l2g7q" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.724996 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ws4z\" (UniqueName: \"kubernetes.io/projected/6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc-kube-api-access-7ws4z\") pod \"heat-engine-85c4d5f8c-p2pkn\" (UID: \"6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc\") " pod="openstack/heat-engine-85c4d5f8c-p2pkn" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.725153 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e416d899-7d11-4e29-a4d6-03774e65691a-config-data-custom\") pod \"heat-cfnapi-5ffb7f87f4-dr5qd\" (UID: \"e416d899-7d11-4e29-a4d6-03774e65691a\") " pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.725338 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3625535f-12fd-4bcf-93ca-bee191372744-config-data\") pod \"heat-api-7cc9c6d9db-l2g7q\" (UID: \"3625535f-12fd-4bcf-93ca-bee191372744\") " pod="openstack/heat-api-7cc9c6d9db-l2g7q" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.725469 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g77rr\" (UniqueName: \"kubernetes.io/projected/3625535f-12fd-4bcf-93ca-bee191372744-kube-api-access-g77rr\") pod \"heat-api-7cc9c6d9db-l2g7q\" (UID: \"3625535f-12fd-4bcf-93ca-bee191372744\") " pod="openstack/heat-api-7cc9c6d9db-l2g7q" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.725522 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e416d899-7d11-4e29-a4d6-03774e65691a-config-data\") pod \"heat-cfnapi-5ffb7f87f4-dr5qd\" (UID: \"e416d899-7d11-4e29-a4d6-03774e65691a\") " pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.725617 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc-combined-ca-bundle\") pod \"heat-engine-85c4d5f8c-p2pkn\" (UID: \"6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc\") " pod="openstack/heat-engine-85c4d5f8c-p2pkn" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.731056 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc-combined-ca-bundle\") pod \"heat-engine-85c4d5f8c-p2pkn\" (UID: \"6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc\") " pod="openstack/heat-engine-85c4d5f8c-p2pkn" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.735909 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e416d899-7d11-4e29-a4d6-03774e65691a-config-data\") pod \"heat-cfnapi-5ffb7f87f4-dr5qd\" (UID: \"e416d899-7d11-4e29-a4d6-03774e65691a\") " pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.737521 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e416d899-7d11-4e29-a4d6-03774e65691a-config-data-custom\") pod \"heat-cfnapi-5ffb7f87f4-dr5qd\" (UID: \"e416d899-7d11-4e29-a4d6-03774e65691a\") " pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.739052 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc-config-data\") pod \"heat-engine-85c4d5f8c-p2pkn\" (UID: \"6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc\") " pod="openstack/heat-engine-85c4d5f8c-p2pkn" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.741811 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e416d899-7d11-4e29-a4d6-03774e65691a-combined-ca-bundle\") pod \"heat-cfnapi-5ffb7f87f4-dr5qd\" (UID: \"e416d899-7d11-4e29-a4d6-03774e65691a\") " pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.743402 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fwzr\" (UniqueName: 
\"kubernetes.io/projected/e416d899-7d11-4e29-a4d6-03774e65691a-kube-api-access-5fwzr\") pod \"heat-cfnapi-5ffb7f87f4-dr5qd\" (UID: \"e416d899-7d11-4e29-a4d6-03774e65691a\") " pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.743725 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ws4z\" (UniqueName: \"kubernetes.io/projected/6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc-kube-api-access-7ws4z\") pod \"heat-engine-85c4d5f8c-p2pkn\" (UID: \"6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc\") " pod="openstack/heat-engine-85c4d5f8c-p2pkn" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.744309 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc-config-data-custom\") pod \"heat-engine-85c4d5f8c-p2pkn\" (UID: \"6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc\") " pod="openstack/heat-engine-85c4d5f8c-p2pkn" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.814339 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-85c4d5f8c-p2pkn" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.828265 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3625535f-12fd-4bcf-93ca-bee191372744-config-data-custom\") pod \"heat-api-7cc9c6d9db-l2g7q\" (UID: \"3625535f-12fd-4bcf-93ca-bee191372744\") " pod="openstack/heat-api-7cc9c6d9db-l2g7q" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.828361 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3625535f-12fd-4bcf-93ca-bee191372744-config-data\") pod \"heat-api-7cc9c6d9db-l2g7q\" (UID: \"3625535f-12fd-4bcf-93ca-bee191372744\") " pod="openstack/heat-api-7cc9c6d9db-l2g7q" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.828411 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g77rr\" (UniqueName: \"kubernetes.io/projected/3625535f-12fd-4bcf-93ca-bee191372744-kube-api-access-g77rr\") pod \"heat-api-7cc9c6d9db-l2g7q\" (UID: \"3625535f-12fd-4bcf-93ca-bee191372744\") " pod="openstack/heat-api-7cc9c6d9db-l2g7q" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.828500 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3625535f-12fd-4bcf-93ca-bee191372744-combined-ca-bundle\") pod \"heat-api-7cc9c6d9db-l2g7q\" (UID: \"3625535f-12fd-4bcf-93ca-bee191372744\") " pod="openstack/heat-api-7cc9c6d9db-l2g7q" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.832485 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3625535f-12fd-4bcf-93ca-bee191372744-combined-ca-bundle\") pod \"heat-api-7cc9c6d9db-l2g7q\" (UID: \"3625535f-12fd-4bcf-93ca-bee191372744\") " pod="openstack/heat-api-7cc9c6d9db-l2g7q" Mar 
14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.833340 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3625535f-12fd-4bcf-93ca-bee191372744-config-data-custom\") pod \"heat-api-7cc9c6d9db-l2g7q\" (UID: \"3625535f-12fd-4bcf-93ca-bee191372744\") " pod="openstack/heat-api-7cc9c6d9db-l2g7q" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.834007 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3625535f-12fd-4bcf-93ca-bee191372744-config-data\") pod \"heat-api-7cc9c6d9db-l2g7q\" (UID: \"3625535f-12fd-4bcf-93ca-bee191372744\") " pod="openstack/heat-api-7cc9c6d9db-l2g7q" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.847915 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g77rr\" (UniqueName: \"kubernetes.io/projected/3625535f-12fd-4bcf-93ca-bee191372744-kube-api-access-g77rr\") pod \"heat-api-7cc9c6d9db-l2g7q\" (UID: \"3625535f-12fd-4bcf-93ca-bee191372744\") " pod="openstack/heat-api-7cc9c6d9db-l2g7q" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.897618 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" Mar 14 09:07:15 crc kubenswrapper[5129]: I0314 09:07:15.923870 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7cc9c6d9db-l2g7q" Mar 14 09:07:16 crc kubenswrapper[5129]: I0314 09:07:16.175144 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-85c4d5f8c-p2pkn"] Mar 14 09:07:16 crc kubenswrapper[5129]: I0314 09:07:16.331327 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-85c4d5f8c-p2pkn" event={"ID":"6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc","Type":"ContainerStarted","Data":"94586678d91abc94aec1f2ebaba5f4d0bcb0a19fc38bdcaffd5a2eb6ccfbf1b2"} Mar 14 09:07:16 crc kubenswrapper[5129]: W0314 09:07:16.456055 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode416d899_7d11_4e29_a4d6_03774e65691a.slice/crio-02c8c32f08d798bc820d410b351e49bae9ae5ff14875a61ad622f5ced99e21ac WatchSource:0}: Error finding container 02c8c32f08d798bc820d410b351e49bae9ae5ff14875a61ad622f5ced99e21ac: Status 404 returned error can't find the container with id 02c8c32f08d798bc820d410b351e49bae9ae5ff14875a61ad622f5ced99e21ac Mar 14 09:07:16 crc kubenswrapper[5129]: I0314 09:07:16.459982 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5ffb7f87f4-dr5qd"] Mar 14 09:07:16 crc kubenswrapper[5129]: I0314 09:07:16.621539 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7cc9c6d9db-l2g7q"] Mar 14 09:07:16 crc kubenswrapper[5129]: W0314 09:07:16.636579 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3625535f_12fd_4bcf_93ca_bee191372744.slice/crio-9c8abc60633055b87869bf0e42270a793c5c1d420569f6f3cf628db2c97d5608 WatchSource:0}: Error finding container 9c8abc60633055b87869bf0e42270a793c5c1d420569f6f3cf628db2c97d5608: Status 404 returned error can't find the container with id 9c8abc60633055b87869bf0e42270a793c5c1d420569f6f3cf628db2c97d5608 Mar 14 09:07:16 crc kubenswrapper[5129]: I0314 
09:07:16.972487 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6d476769b-mw4rk"] Mar 14 09:07:16 crc kubenswrapper[5129]: I0314 09:07:16.972917 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-6d476769b-mw4rk" podUID="e4c89eb1-1971-49aa-9967-2fde67ead88a" containerName="heat-api" containerID="cri-o://7c6b67416022f7d85d238edddc864bdff41d7d4c8b5fa0e29ad294ffbe47eb8d" gracePeriod=60 Mar 14 09:07:16 crc kubenswrapper[5129]: I0314 09:07:16.993285 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7cb7bfc8bd-fnxjv"] Mar 14 09:07:16 crc kubenswrapper[5129]: I0314 09:07:16.993491 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" podUID="9acd8d94-20a0-4529-be0f-ddcac0466c8f" containerName="heat-cfnapi" containerID="cri-o://211e96e9e63cedbf9177d25a4a8f01573aa9a3f9ce84f919086bca1627897abf" gracePeriod=60 Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.021294 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-76b5f79468-xdxdp"] Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.022625 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.025583 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.026417 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.038663 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-66dc9496c6-nvsvn"] Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.039966 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.045369 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.045526 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.062881 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-76b5f79468-xdxdp"] Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.163481 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mvw5\" (UniqueName: \"kubernetes.io/projected/66f931d0-1eee-42cf-8ac3-998559b831ae-kube-api-access-9mvw5\") pod \"heat-api-76b5f79468-xdxdp\" (UID: \"66f931d0-1eee-42cf-8ac3-998559b831ae\") " pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.163568 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b4b6a8a-c1a0-4104-aa02-b24f51abec71-config-data-custom\") pod \"heat-cfnapi-66dc9496c6-nvsvn\" (UID: \"1b4b6a8a-c1a0-4104-aa02-b24f51abec71\") " pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.163632 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66f931d0-1eee-42cf-8ac3-998559b831ae-public-tls-certs\") pod \"heat-api-76b5f79468-xdxdp\" (UID: \"66f931d0-1eee-42cf-8ac3-998559b831ae\") " pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.174991 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1b4b6a8a-c1a0-4104-aa02-b24f51abec71-internal-tls-certs\") pod \"heat-cfnapi-66dc9496c6-nvsvn\" (UID: \"1b4b6a8a-c1a0-4104-aa02-b24f51abec71\") " pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.175116 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btxw5\" (UniqueName: \"kubernetes.io/projected/1b4b6a8a-c1a0-4104-aa02-b24f51abec71-kube-api-access-btxw5\") pod \"heat-cfnapi-66dc9496c6-nvsvn\" (UID: \"1b4b6a8a-c1a0-4104-aa02-b24f51abec71\") " pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.175160 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66f931d0-1eee-42cf-8ac3-998559b831ae-combined-ca-bundle\") pod \"heat-api-76b5f79468-xdxdp\" (UID: \"66f931d0-1eee-42cf-8ac3-998559b831ae\") " pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.175180 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66f931d0-1eee-42cf-8ac3-998559b831ae-internal-tls-certs\") pod \"heat-api-76b5f79468-xdxdp\" (UID: \"66f931d0-1eee-42cf-8ac3-998559b831ae\") " pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.175216 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b4b6a8a-c1a0-4104-aa02-b24f51abec71-public-tls-certs\") pod \"heat-cfnapi-66dc9496c6-nvsvn\" (UID: \"1b4b6a8a-c1a0-4104-aa02-b24f51abec71\") " pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.175249 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66f931d0-1eee-42cf-8ac3-998559b831ae-config-data\") pod \"heat-api-76b5f79468-xdxdp\" (UID: \"66f931d0-1eee-42cf-8ac3-998559b831ae\") " pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.175322 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4b6a8a-c1a0-4104-aa02-b24f51abec71-combined-ca-bundle\") pod \"heat-cfnapi-66dc9496c6-nvsvn\" (UID: \"1b4b6a8a-c1a0-4104-aa02-b24f51abec71\") " pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.175371 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66f931d0-1eee-42cf-8ac3-998559b831ae-config-data-custom\") pod \"heat-api-76b5f79468-xdxdp\" (UID: \"66f931d0-1eee-42cf-8ac3-998559b831ae\") " pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.175437 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4b6a8a-c1a0-4104-aa02-b24f51abec71-config-data\") pod \"heat-cfnapi-66dc9496c6-nvsvn\" (UID: \"1b4b6a8a-c1a0-4104-aa02-b24f51abec71\") " pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.175851 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-66dc9496c6-nvsvn"] Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.277482 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66f931d0-1eee-42cf-8ac3-998559b831ae-public-tls-certs\") pod \"heat-api-76b5f79468-xdxdp\" (UID: 
\"66f931d0-1eee-42cf-8ac3-998559b831ae\") " pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.277585 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b4b6a8a-c1a0-4104-aa02-b24f51abec71-internal-tls-certs\") pod \"heat-cfnapi-66dc9496c6-nvsvn\" (UID: \"1b4b6a8a-c1a0-4104-aa02-b24f51abec71\") " pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.277653 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btxw5\" (UniqueName: \"kubernetes.io/projected/1b4b6a8a-c1a0-4104-aa02-b24f51abec71-kube-api-access-btxw5\") pod \"heat-cfnapi-66dc9496c6-nvsvn\" (UID: \"1b4b6a8a-c1a0-4104-aa02-b24f51abec71\") " pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.277675 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66f931d0-1eee-42cf-8ac3-998559b831ae-combined-ca-bundle\") pod \"heat-api-76b5f79468-xdxdp\" (UID: \"66f931d0-1eee-42cf-8ac3-998559b831ae\") " pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.277693 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66f931d0-1eee-42cf-8ac3-998559b831ae-internal-tls-certs\") pod \"heat-api-76b5f79468-xdxdp\" (UID: \"66f931d0-1eee-42cf-8ac3-998559b831ae\") " pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.277711 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b4b6a8a-c1a0-4104-aa02-b24f51abec71-public-tls-certs\") pod \"heat-cfnapi-66dc9496c6-nvsvn\" (UID: 
\"1b4b6a8a-c1a0-4104-aa02-b24f51abec71\") " pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.277726 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66f931d0-1eee-42cf-8ac3-998559b831ae-config-data\") pod \"heat-api-76b5f79468-xdxdp\" (UID: \"66f931d0-1eee-42cf-8ac3-998559b831ae\") " pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.277757 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4b6a8a-c1a0-4104-aa02-b24f51abec71-combined-ca-bundle\") pod \"heat-cfnapi-66dc9496c6-nvsvn\" (UID: \"1b4b6a8a-c1a0-4104-aa02-b24f51abec71\") " pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.277780 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66f931d0-1eee-42cf-8ac3-998559b831ae-config-data-custom\") pod \"heat-api-76b5f79468-xdxdp\" (UID: \"66f931d0-1eee-42cf-8ac3-998559b831ae\") " pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.277805 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4b6a8a-c1a0-4104-aa02-b24f51abec71-config-data\") pod \"heat-cfnapi-66dc9496c6-nvsvn\" (UID: \"1b4b6a8a-c1a0-4104-aa02-b24f51abec71\") " pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.277846 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mvw5\" (UniqueName: \"kubernetes.io/projected/66f931d0-1eee-42cf-8ac3-998559b831ae-kube-api-access-9mvw5\") pod \"heat-api-76b5f79468-xdxdp\" (UID: \"66f931d0-1eee-42cf-8ac3-998559b831ae\") " 
pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.277877 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b4b6a8a-c1a0-4104-aa02-b24f51abec71-config-data-custom\") pod \"heat-cfnapi-66dc9496c6-nvsvn\" (UID: \"1b4b6a8a-c1a0-4104-aa02-b24f51abec71\") " pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.283627 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b4b6a8a-c1a0-4104-aa02-b24f51abec71-config-data-custom\") pod \"heat-cfnapi-66dc9496c6-nvsvn\" (UID: \"1b4b6a8a-c1a0-4104-aa02-b24f51abec71\") " pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.283863 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b4b6a8a-c1a0-4104-aa02-b24f51abec71-public-tls-certs\") pod \"heat-cfnapi-66dc9496c6-nvsvn\" (UID: \"1b4b6a8a-c1a0-4104-aa02-b24f51abec71\") " pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.284116 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66f931d0-1eee-42cf-8ac3-998559b831ae-public-tls-certs\") pod \"heat-api-76b5f79468-xdxdp\" (UID: \"66f931d0-1eee-42cf-8ac3-998559b831ae\") " pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.284936 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66f931d0-1eee-42cf-8ac3-998559b831ae-config-data-custom\") pod \"heat-api-76b5f79468-xdxdp\" (UID: \"66f931d0-1eee-42cf-8ac3-998559b831ae\") " pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc 
kubenswrapper[5129]: I0314 09:07:17.285389 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66f931d0-1eee-42cf-8ac3-998559b831ae-combined-ca-bundle\") pod \"heat-api-76b5f79468-xdxdp\" (UID: \"66f931d0-1eee-42cf-8ac3-998559b831ae\") " pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.285763 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66f931d0-1eee-42cf-8ac3-998559b831ae-internal-tls-certs\") pod \"heat-api-76b5f79468-xdxdp\" (UID: \"66f931d0-1eee-42cf-8ac3-998559b831ae\") " pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.285954 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66f931d0-1eee-42cf-8ac3-998559b831ae-config-data\") pod \"heat-api-76b5f79468-xdxdp\" (UID: \"66f931d0-1eee-42cf-8ac3-998559b831ae\") " pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.286302 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b4b6a8a-c1a0-4104-aa02-b24f51abec71-internal-tls-certs\") pod \"heat-cfnapi-66dc9496c6-nvsvn\" (UID: \"1b4b6a8a-c1a0-4104-aa02-b24f51abec71\") " pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.287431 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4b6a8a-c1a0-4104-aa02-b24f51abec71-config-data\") pod \"heat-cfnapi-66dc9496c6-nvsvn\" (UID: \"1b4b6a8a-c1a0-4104-aa02-b24f51abec71\") " pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.287850 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4b6a8a-c1a0-4104-aa02-b24f51abec71-combined-ca-bundle\") pod \"heat-cfnapi-66dc9496c6-nvsvn\" (UID: \"1b4b6a8a-c1a0-4104-aa02-b24f51abec71\") " pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.297703 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btxw5\" (UniqueName: \"kubernetes.io/projected/1b4b6a8a-c1a0-4104-aa02-b24f51abec71-kube-api-access-btxw5\") pod \"heat-cfnapi-66dc9496c6-nvsvn\" (UID: \"1b4b6a8a-c1a0-4104-aa02-b24f51abec71\") " pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.300825 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mvw5\" (UniqueName: \"kubernetes.io/projected/66f931d0-1eee-42cf-8ac3-998559b831ae-kube-api-access-9mvw5\") pod \"heat-api-76b5f79468-xdxdp\" (UID: \"66f931d0-1eee-42cf-8ac3-998559b831ae\") " pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.348443 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-85c4d5f8c-p2pkn" event={"ID":"6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc","Type":"ContainerStarted","Data":"5543dc536d83e3b4b18dde49f5f182d4f846f88de97ce5c5f1c17215aa12871b"} Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.349482 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-85c4d5f8c-p2pkn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.351550 5129 generic.go:334] "Generic (PLEG): container finished" podID="3625535f-12fd-4bcf-93ca-bee191372744" containerID="f49d09335ea1b3e4ec83484c08f3cda43091a9945bbe1e49b97ed25e81f4b1ff" exitCode=1 Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.351659 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7cc9c6d9db-l2g7q" 
event={"ID":"3625535f-12fd-4bcf-93ca-bee191372744","Type":"ContainerDied","Data":"f49d09335ea1b3e4ec83484c08f3cda43091a9945bbe1e49b97ed25e81f4b1ff"} Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.351719 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7cc9c6d9db-l2g7q" event={"ID":"3625535f-12fd-4bcf-93ca-bee191372744","Type":"ContainerStarted","Data":"9c8abc60633055b87869bf0e42270a793c5c1d420569f6f3cf628db2c97d5608"} Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.352616 5129 scope.go:117] "RemoveContainer" containerID="f49d09335ea1b3e4ec83484c08f3cda43091a9945bbe1e49b97ed25e81f4b1ff" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.357230 5129 generic.go:334] "Generic (PLEG): container finished" podID="e416d899-7d11-4e29-a4d6-03774e65691a" containerID="fe57dab5967f7aecd0e548c13583e9e4c3b9bcefa110e6fdc9544b996bfccf2e" exitCode=1 Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.357277 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" event={"ID":"e416d899-7d11-4e29-a4d6-03774e65691a","Type":"ContainerDied","Data":"fe57dab5967f7aecd0e548c13583e9e4c3b9bcefa110e6fdc9544b996bfccf2e"} Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.357303 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" event={"ID":"e416d899-7d11-4e29-a4d6-03774e65691a","Type":"ContainerStarted","Data":"02c8c32f08d798bc820d410b351e49bae9ae5ff14875a61ad622f5ced99e21ac"} Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.357996 5129 scope.go:117] "RemoveContainer" containerID="fe57dab5967f7aecd0e548c13583e9e4c3b9bcefa110e6fdc9544b996bfccf2e" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.362567 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.391188 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.393305 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-85c4d5f8c-p2pkn" podStartSLOduration=2.393260393 podStartE2EDuration="2.393260393s" podCreationTimestamp="2026-03-14 09:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:07:17.369785156 +0000 UTC m=+7700.121700340" watchObservedRunningTime="2026-03-14 09:07:17.393260393 +0000 UTC m=+7700.145175577" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.792531 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6d476769b-mw4rk" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.873499 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.897401 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9acd8d94-20a0-4529-be0f-ddcac0466c8f-config-data\") pod \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\" (UID: \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\") " Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.897473 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86qgz\" (UniqueName: \"kubernetes.io/projected/9acd8d94-20a0-4529-be0f-ddcac0466c8f-kube-api-access-86qgz\") pod \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\" (UID: \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\") " Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.897544 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52pw9\" (UniqueName: 
\"kubernetes.io/projected/e4c89eb1-1971-49aa-9967-2fde67ead88a-kube-api-access-52pw9\") pod \"e4c89eb1-1971-49aa-9967-2fde67ead88a\" (UID: \"e4c89eb1-1971-49aa-9967-2fde67ead88a\") " Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.897572 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c89eb1-1971-49aa-9967-2fde67ead88a-combined-ca-bundle\") pod \"e4c89eb1-1971-49aa-9967-2fde67ead88a\" (UID: \"e4c89eb1-1971-49aa-9967-2fde67ead88a\") " Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.897619 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9acd8d94-20a0-4529-be0f-ddcac0466c8f-config-data-custom\") pod \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\" (UID: \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\") " Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.897682 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9acd8d94-20a0-4529-be0f-ddcac0466c8f-combined-ca-bundle\") pod \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\" (UID: \"9acd8d94-20a0-4529-be0f-ddcac0466c8f\") " Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.897709 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4c89eb1-1971-49aa-9967-2fde67ead88a-config-data-custom\") pod \"e4c89eb1-1971-49aa-9967-2fde67ead88a\" (UID: \"e4c89eb1-1971-49aa-9967-2fde67ead88a\") " Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.897734 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c89eb1-1971-49aa-9967-2fde67ead88a-config-data\") pod \"e4c89eb1-1971-49aa-9967-2fde67ead88a\" (UID: \"e4c89eb1-1971-49aa-9967-2fde67ead88a\") " Mar 14 09:07:17 crc 
kubenswrapper[5129]: I0314 09:07:17.921838 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4c89eb1-1971-49aa-9967-2fde67ead88a-kube-api-access-52pw9" (OuterVolumeSpecName: "kube-api-access-52pw9") pod "e4c89eb1-1971-49aa-9967-2fde67ead88a" (UID: "e4c89eb1-1971-49aa-9967-2fde67ead88a"). InnerVolumeSpecName "kube-api-access-52pw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.932229 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9acd8d94-20a0-4529-be0f-ddcac0466c8f-kube-api-access-86qgz" (OuterVolumeSpecName: "kube-api-access-86qgz") pod "9acd8d94-20a0-4529-be0f-ddcac0466c8f" (UID: "9acd8d94-20a0-4529-be0f-ddcac0466c8f"). InnerVolumeSpecName "kube-api-access-86qgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.938219 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c89eb1-1971-49aa-9967-2fde67ead88a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e4c89eb1-1971-49aa-9967-2fde67ead88a" (UID: "e4c89eb1-1971-49aa-9967-2fde67ead88a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.938930 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9acd8d94-20a0-4529-be0f-ddcac0466c8f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9acd8d94-20a0-4529-be0f-ddcac0466c8f" (UID: "9acd8d94-20a0-4529-be0f-ddcac0466c8f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.968062 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9acd8d94-20a0-4529-be0f-ddcac0466c8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9acd8d94-20a0-4529-be0f-ddcac0466c8f" (UID: "9acd8d94-20a0-4529-be0f-ddcac0466c8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:17 crc kubenswrapper[5129]: I0314 09:07:17.980012 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c89eb1-1971-49aa-9967-2fde67ead88a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4c89eb1-1971-49aa-9967-2fde67ead88a" (UID: "e4c89eb1-1971-49aa-9967-2fde67ead88a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.002152 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86qgz\" (UniqueName: \"kubernetes.io/projected/9acd8d94-20a0-4529-be0f-ddcac0466c8f-kube-api-access-86qgz\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.002646 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52pw9\" (UniqueName: \"kubernetes.io/projected/e4c89eb1-1971-49aa-9967-2fde67ead88a-kube-api-access-52pw9\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.002665 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c89eb1-1971-49aa-9967-2fde67ead88a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.002674 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9acd8d94-20a0-4529-be0f-ddcac0466c8f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.002684 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9acd8d94-20a0-4529-be0f-ddcac0466c8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.002694 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4c89eb1-1971-49aa-9967-2fde67ead88a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.052967 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c89eb1-1971-49aa-9967-2fde67ead88a-config-data" (OuterVolumeSpecName: "config-data") pod "e4c89eb1-1971-49aa-9967-2fde67ead88a" (UID: "e4c89eb1-1971-49aa-9967-2fde67ead88a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.057713 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9acd8d94-20a0-4529-be0f-ddcac0466c8f-config-data" (OuterVolumeSpecName: "config-data") pod "9acd8d94-20a0-4529-be0f-ddcac0466c8f" (UID: "9acd8d94-20a0-4529-be0f-ddcac0466c8f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.105156 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9acd8d94-20a0-4529-be0f-ddcac0466c8f-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.105191 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c89eb1-1971-49aa-9967-2fde67ead88a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.206751 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-76b5f79468-xdxdp"] Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.370650 5129 generic.go:334] "Generic (PLEG): container finished" podID="9acd8d94-20a0-4529-be0f-ddcac0466c8f" containerID="211e96e9e63cedbf9177d25a4a8f01573aa9a3f9ce84f919086bca1627897abf" exitCode=0 Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.372045 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" event={"ID":"9acd8d94-20a0-4529-be0f-ddcac0466c8f","Type":"ContainerDied","Data":"211e96e9e63cedbf9177d25a4a8f01573aa9a3f9ce84f919086bca1627897abf"} Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.372188 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" event={"ID":"9acd8d94-20a0-4529-be0f-ddcac0466c8f","Type":"ContainerDied","Data":"b8e15abc4b30ec14260b47993b5a6503160fd54179da4520e2b64504fd064eb8"} Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.372281 5129 scope.go:117] "RemoveContainer" containerID="211e96e9e63cedbf9177d25a4a8f01573aa9a3f9ce84f919086bca1627897abf" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.372533 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7cb7bfc8bd-fnxjv" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.380464 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7cc9c6d9db-l2g7q" event={"ID":"3625535f-12fd-4bcf-93ca-bee191372744","Type":"ContainerStarted","Data":"ebf9eca05ab28c8ab09e58f6b57bfd29db6088ea40fdcfc2abc0fd99f705ac57"} Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.380767 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7cc9c6d9db-l2g7q" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.385868 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" event={"ID":"e416d899-7d11-4e29-a4d6-03774e65691a","Type":"ContainerStarted","Data":"a6924bd25c9713c1d29ad974573da527ef3360b10ae1bbaca0c2eb7adc59aee2"} Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.386446 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.390545 5129 generic.go:334] "Generic (PLEG): container finished" podID="e4c89eb1-1971-49aa-9967-2fde67ead88a" containerID="7c6b67416022f7d85d238edddc864bdff41d7d4c8b5fa0e29ad294ffbe47eb8d" exitCode=0 Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.391834 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6d476769b-mw4rk" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.396135 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6d476769b-mw4rk" event={"ID":"e4c89eb1-1971-49aa-9967-2fde67ead88a","Type":"ContainerDied","Data":"7c6b67416022f7d85d238edddc864bdff41d7d4c8b5fa0e29ad294ffbe47eb8d"} Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.396286 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6d476769b-mw4rk" event={"ID":"e4c89eb1-1971-49aa-9967-2fde67ead88a","Type":"ContainerDied","Data":"3b6eb43425a3b3c998ffc41fe71e489f610b40c663f8cd84944e5608e22791c7"} Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.405785 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7cb7bfc8bd-fnxjv"] Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.412980 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-7cb7bfc8bd-fnxjv"] Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.425771 5129 scope.go:117] "RemoveContainer" containerID="211e96e9e63cedbf9177d25a4a8f01573aa9a3f9ce84f919086bca1627897abf" Mar 14 09:07:18 crc kubenswrapper[5129]: E0314 09:07:18.426797 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"211e96e9e63cedbf9177d25a4a8f01573aa9a3f9ce84f919086bca1627897abf\": container with ID starting with 211e96e9e63cedbf9177d25a4a8f01573aa9a3f9ce84f919086bca1627897abf not found: ID does not exist" containerID="211e96e9e63cedbf9177d25a4a8f01573aa9a3f9ce84f919086bca1627897abf" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.426852 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211e96e9e63cedbf9177d25a4a8f01573aa9a3f9ce84f919086bca1627897abf"} err="failed to get container status \"211e96e9e63cedbf9177d25a4a8f01573aa9a3f9ce84f919086bca1627897abf\": rpc 
error: code = NotFound desc = could not find container \"211e96e9e63cedbf9177d25a4a8f01573aa9a3f9ce84f919086bca1627897abf\": container with ID starting with 211e96e9e63cedbf9177d25a4a8f01573aa9a3f9ce84f919086bca1627897abf not found: ID does not exist" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.426884 5129 scope.go:117] "RemoveContainer" containerID="7c6b67416022f7d85d238edddc864bdff41d7d4c8b5fa0e29ad294ffbe47eb8d" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.434553 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6d476769b-mw4rk"] Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.448180 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-6d476769b-mw4rk"] Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.454842 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" podStartSLOduration=3.45482844 podStartE2EDuration="3.45482844s" podCreationTimestamp="2026-03-14 09:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:07:18.441299013 +0000 UTC m=+7701.193214197" watchObservedRunningTime="2026-03-14 09:07:18.45482844 +0000 UTC m=+7701.206743624" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.463204 5129 scope.go:117] "RemoveContainer" containerID="7c6b67416022f7d85d238edddc864bdff41d7d4c8b5fa0e29ad294ffbe47eb8d" Mar 14 09:07:18 crc kubenswrapper[5129]: E0314 09:07:18.463684 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c6b67416022f7d85d238edddc864bdff41d7d4c8b5fa0e29ad294ffbe47eb8d\": container with ID starting with 7c6b67416022f7d85d238edddc864bdff41d7d4c8b5fa0e29ad294ffbe47eb8d not found: ID does not exist" containerID="7c6b67416022f7d85d238edddc864bdff41d7d4c8b5fa0e29ad294ffbe47eb8d" Mar 14 09:07:18 crc 
kubenswrapper[5129]: I0314 09:07:18.464477 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c6b67416022f7d85d238edddc864bdff41d7d4c8b5fa0e29ad294ffbe47eb8d"} err="failed to get container status \"7c6b67416022f7d85d238edddc864bdff41d7d4c8b5fa0e29ad294ffbe47eb8d\": rpc error: code = NotFound desc = could not find container \"7c6b67416022f7d85d238edddc864bdff41d7d4c8b5fa0e29ad294ffbe47eb8d\": container with ID starting with 7c6b67416022f7d85d238edddc864bdff41d7d4c8b5fa0e29ad294ffbe47eb8d not found: ID does not exist" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.484063 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7cc9c6d9db-l2g7q" podStartSLOduration=3.484029073 podStartE2EDuration="3.484029073s" podCreationTimestamp="2026-03-14 09:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:07:18.457847902 +0000 UTC m=+7701.209763086" watchObservedRunningTime="2026-03-14 09:07:18.484029073 +0000 UTC m=+7701.235944257" Mar 14 09:07:18 crc kubenswrapper[5129]: I0314 09:07:18.509922 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-66dc9496c6-nvsvn"] Mar 14 09:07:19 crc kubenswrapper[5129]: I0314 09:07:19.402853 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-76b5f79468-xdxdp" event={"ID":"66f931d0-1eee-42cf-8ac3-998559b831ae","Type":"ContainerStarted","Data":"054c7214bdfeda9f7fbe165754a679843894482f81b260e1ee424d66fa104aac"} Mar 14 09:07:19 crc kubenswrapper[5129]: I0314 09:07:19.403305 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-76b5f79468-xdxdp" event={"ID":"66f931d0-1eee-42cf-8ac3-998559b831ae","Type":"ContainerStarted","Data":"fe0399ff29ccda8006911c84747e93865f388bb9d5fcef4e64870fa5c63a8fa1"} Mar 14 09:07:19 crc kubenswrapper[5129]: I0314 09:07:19.403353 5129 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:19 crc kubenswrapper[5129]: I0314 09:07:19.405889 5129 generic.go:334] "Generic (PLEG): container finished" podID="e416d899-7d11-4e29-a4d6-03774e65691a" containerID="a6924bd25c9713c1d29ad974573da527ef3360b10ae1bbaca0c2eb7adc59aee2" exitCode=1 Mar 14 09:07:19 crc kubenswrapper[5129]: I0314 09:07:19.405955 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" event={"ID":"e416d899-7d11-4e29-a4d6-03774e65691a","Type":"ContainerDied","Data":"a6924bd25c9713c1d29ad974573da527ef3360b10ae1bbaca0c2eb7adc59aee2"} Mar 14 09:07:19 crc kubenswrapper[5129]: I0314 09:07:19.405991 5129 scope.go:117] "RemoveContainer" containerID="fe57dab5967f7aecd0e548c13583e9e4c3b9bcefa110e6fdc9544b996bfccf2e" Mar 14 09:07:19 crc kubenswrapper[5129]: I0314 09:07:19.406362 5129 scope.go:117] "RemoveContainer" containerID="a6924bd25c9713c1d29ad974573da527ef3360b10ae1bbaca0c2eb7adc59aee2" Mar 14 09:07:19 crc kubenswrapper[5129]: E0314 09:07:19.406627 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5ffb7f87f4-dr5qd_openstack(e416d899-7d11-4e29-a4d6-03774e65691a)\"" pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" podUID="e416d899-7d11-4e29-a4d6-03774e65691a" Mar 14 09:07:19 crc kubenswrapper[5129]: I0314 09:07:19.422648 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-76b5f79468-xdxdp" podStartSLOduration=3.422616614 podStartE2EDuration="3.422616614s" podCreationTimestamp="2026-03-14 09:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:07:19.418160652 +0000 UTC m=+7702.170075846" watchObservedRunningTime="2026-03-14 09:07:19.422616614 
+0000 UTC m=+7702.174531798" Mar 14 09:07:19 crc kubenswrapper[5129]: I0314 09:07:19.446152 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" event={"ID":"1b4b6a8a-c1a0-4104-aa02-b24f51abec71","Type":"ContainerStarted","Data":"c6f94f103cb76c2918c632615197ea9248195b8a6c2d4d85757e7db7e63cb501"} Mar 14 09:07:19 crc kubenswrapper[5129]: I0314 09:07:19.446224 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" event={"ID":"1b4b6a8a-c1a0-4104-aa02-b24f51abec71","Type":"ContainerStarted","Data":"0a4163a2999b950a4569ba80a5fb1def0e2469e315c75eaf8b65cc28cd0d83ae"} Mar 14 09:07:19 crc kubenswrapper[5129]: I0314 09:07:19.448805 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:19 crc kubenswrapper[5129]: I0314 09:07:19.453178 5129 generic.go:334] "Generic (PLEG): container finished" podID="3625535f-12fd-4bcf-93ca-bee191372744" containerID="ebf9eca05ab28c8ab09e58f6b57bfd29db6088ea40fdcfc2abc0fd99f705ac57" exitCode=1 Mar 14 09:07:19 crc kubenswrapper[5129]: I0314 09:07:19.453246 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7cc9c6d9db-l2g7q" event={"ID":"3625535f-12fd-4bcf-93ca-bee191372744","Type":"ContainerDied","Data":"ebf9eca05ab28c8ab09e58f6b57bfd29db6088ea40fdcfc2abc0fd99f705ac57"} Mar 14 09:07:19 crc kubenswrapper[5129]: I0314 09:07:19.476679 5129 scope.go:117] "RemoveContainer" containerID="ebf9eca05ab28c8ab09e58f6b57bfd29db6088ea40fdcfc2abc0fd99f705ac57" Mar 14 09:07:19 crc kubenswrapper[5129]: E0314 09:07:19.477273 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7cc9c6d9db-l2g7q_openstack(3625535f-12fd-4bcf-93ca-bee191372744)\"" pod="openstack/heat-api-7cc9c6d9db-l2g7q" podUID="3625535f-12fd-4bcf-93ca-bee191372744" Mar 
14 09:07:19 crc kubenswrapper[5129]: I0314 09:07:19.478874 5129 scope.go:117] "RemoveContainer" containerID="f49d09335ea1b3e4ec83484c08f3cda43091a9945bbe1e49b97ed25e81f4b1ff" Mar 14 09:07:19 crc kubenswrapper[5129]: I0314 09:07:19.502531 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" podStartSLOduration=2.502505921 podStartE2EDuration="2.502505921s" podCreationTimestamp="2026-03-14 09:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:07:19.4678076 +0000 UTC m=+7702.219722784" watchObservedRunningTime="2026-03-14 09:07:19.502505921 +0000 UTC m=+7702.254421105" Mar 14 09:07:20 crc kubenswrapper[5129]: I0314 09:07:20.049091 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9acd8d94-20a0-4529-be0f-ddcac0466c8f" path="/var/lib/kubelet/pods/9acd8d94-20a0-4529-be0f-ddcac0466c8f/volumes" Mar 14 09:07:20 crc kubenswrapper[5129]: I0314 09:07:20.051098 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4c89eb1-1971-49aa-9967-2fde67ead88a" path="/var/lib/kubelet/pods/e4c89eb1-1971-49aa-9967-2fde67ead88a/volumes" Mar 14 09:07:20 crc kubenswrapper[5129]: I0314 09:07:20.463392 5129 scope.go:117] "RemoveContainer" containerID="ebf9eca05ab28c8ab09e58f6b57bfd29db6088ea40fdcfc2abc0fd99f705ac57" Mar 14 09:07:20 crc kubenswrapper[5129]: E0314 09:07:20.463622 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7cc9c6d9db-l2g7q_openstack(3625535f-12fd-4bcf-93ca-bee191372744)\"" pod="openstack/heat-api-7cc9c6d9db-l2g7q" podUID="3625535f-12fd-4bcf-93ca-bee191372744" Mar 14 09:07:20 crc kubenswrapper[5129]: I0314 09:07:20.466471 5129 scope.go:117] "RemoveContainer" 
containerID="a6924bd25c9713c1d29ad974573da527ef3360b10ae1bbaca0c2eb7adc59aee2" Mar 14 09:07:20 crc kubenswrapper[5129]: E0314 09:07:20.466803 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5ffb7f87f4-dr5qd_openstack(e416d899-7d11-4e29-a4d6-03774e65691a)\"" pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" podUID="e416d899-7d11-4e29-a4d6-03774e65691a" Mar 14 09:07:20 crc kubenswrapper[5129]: I0314 09:07:20.898525 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" Mar 14 09:07:20 crc kubenswrapper[5129]: I0314 09:07:20.924710 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-7cc9c6d9db-l2g7q" Mar 14 09:07:21 crc kubenswrapper[5129]: I0314 09:07:21.480077 5129 scope.go:117] "RemoveContainer" containerID="a6924bd25c9713c1d29ad974573da527ef3360b10ae1bbaca0c2eb7adc59aee2" Mar 14 09:07:21 crc kubenswrapper[5129]: I0314 09:07:21.480623 5129 scope.go:117] "RemoveContainer" containerID="ebf9eca05ab28c8ab09e58f6b57bfd29db6088ea40fdcfc2abc0fd99f705ac57" Mar 14 09:07:21 crc kubenswrapper[5129]: E0314 09:07:21.480759 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5ffb7f87f4-dr5qd_openstack(e416d899-7d11-4e29-a4d6-03774e65691a)\"" pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" podUID="e416d899-7d11-4e29-a4d6-03774e65691a" Mar 14 09:07:21 crc kubenswrapper[5129]: E0314 09:07:21.480895 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7cc9c6d9db-l2g7q_openstack(3625535f-12fd-4bcf-93ca-bee191372744)\"" 
pod="openstack/heat-api-7cc9c6d9db-l2g7q" podUID="3625535f-12fd-4bcf-93ca-bee191372744" Mar 14 09:07:25 crc kubenswrapper[5129]: I0314 09:07:25.632675 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-659c856df6-q8n7k" podUID="444cdcb0-f68c-4943-aa88-dc3710848a7d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.159:8443: connect: connection refused" Mar 14 09:07:28 crc kubenswrapper[5129]: I0314 09:07:28.771656 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6bd79d47cd-8jc86" Mar 14 09:07:28 crc kubenswrapper[5129]: I0314 09:07:28.805472 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-66dc9496c6-nvsvn" Mar 14 09:07:28 crc kubenswrapper[5129]: I0314 09:07:28.817841 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-76b5f79468-xdxdp" Mar 14 09:07:28 crc kubenswrapper[5129]: I0314 09:07:28.875772 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5ffb7f87f4-dr5qd"] Mar 14 09:07:28 crc kubenswrapper[5129]: I0314 09:07:28.923561 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7cc9c6d9db-l2g7q"] Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.304803 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.309870 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fwzr\" (UniqueName: \"kubernetes.io/projected/e416d899-7d11-4e29-a4d6-03774e65691a-kube-api-access-5fwzr\") pod \"e416d899-7d11-4e29-a4d6-03774e65691a\" (UID: \"e416d899-7d11-4e29-a4d6-03774e65691a\") " Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.318584 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e416d899-7d11-4e29-a4d6-03774e65691a-kube-api-access-5fwzr" (OuterVolumeSpecName: "kube-api-access-5fwzr") pod "e416d899-7d11-4e29-a4d6-03774e65691a" (UID: "e416d899-7d11-4e29-a4d6-03774e65691a"). InnerVolumeSpecName "kube-api-access-5fwzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.411780 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e416d899-7d11-4e29-a4d6-03774e65691a-config-data\") pod \"e416d899-7d11-4e29-a4d6-03774e65691a\" (UID: \"e416d899-7d11-4e29-a4d6-03774e65691a\") " Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.411867 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e416d899-7d11-4e29-a4d6-03774e65691a-combined-ca-bundle\") pod \"e416d899-7d11-4e29-a4d6-03774e65691a\" (UID: \"e416d899-7d11-4e29-a4d6-03774e65691a\") " Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.412005 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e416d899-7d11-4e29-a4d6-03774e65691a-config-data-custom\") pod \"e416d899-7d11-4e29-a4d6-03774e65691a\" (UID: \"e416d899-7d11-4e29-a4d6-03774e65691a\") " Mar 14 09:07:29 crc 
kubenswrapper[5129]: I0314 09:07:29.412678 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fwzr\" (UniqueName: \"kubernetes.io/projected/e416d899-7d11-4e29-a4d6-03774e65691a-kube-api-access-5fwzr\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.416863 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e416d899-7d11-4e29-a4d6-03774e65691a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e416d899-7d11-4e29-a4d6-03774e65691a" (UID: "e416d899-7d11-4e29-a4d6-03774e65691a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.439307 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e416d899-7d11-4e29-a4d6-03774e65691a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e416d899-7d11-4e29-a4d6-03774e65691a" (UID: "e416d899-7d11-4e29-a4d6-03774e65691a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.460506 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e416d899-7d11-4e29-a4d6-03774e65691a-config-data" (OuterVolumeSpecName: "config-data") pod "e416d899-7d11-4e29-a4d6-03774e65691a" (UID: "e416d899-7d11-4e29-a4d6-03774e65691a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.478014 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7cc9c6d9db-l2g7q" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.491000 5129 scope.go:117] "RemoveContainer" containerID="d084c43162e9879770e569b167ca69ab6a90086fc13f85f53cac8b93f0e4bc80" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.515102 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e416d899-7d11-4e29-a4d6-03774e65691a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.515141 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e416d899-7d11-4e29-a4d6-03774e65691a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.515153 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e416d899-7d11-4e29-a4d6-03774e65691a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.527295 5129 scope.go:117] "RemoveContainer" containerID="1e0d55ae3db73d445f78dead6c5674087358766ca6767a11c9401d3213a475a7" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.569498 5129 scope.go:117] "RemoveContainer" containerID="3654335946cffa3fff6a4ba9084ce2f1a20dafab44a3e605e0cb811b09e860cd" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.573906 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7cc9c6d9db-l2g7q" event={"ID":"3625535f-12fd-4bcf-93ca-bee191372744","Type":"ContainerDied","Data":"9c8abc60633055b87869bf0e42270a793c5c1d420569f6f3cf628db2c97d5608"} Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.573945 5129 scope.go:117] "RemoveContainer" containerID="ebf9eca05ab28c8ab09e58f6b57bfd29db6088ea40fdcfc2abc0fd99f705ac57" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.574181 5129 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/heat-api-7cc9c6d9db-l2g7q" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.576111 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" event={"ID":"e416d899-7d11-4e29-a4d6-03774e65691a","Type":"ContainerDied","Data":"02c8c32f08d798bc820d410b351e49bae9ae5ff14875a61ad622f5ced99e21ac"} Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.576186 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5ffb7f87f4-dr5qd" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.616098 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3625535f-12fd-4bcf-93ca-bee191372744-config-data-custom\") pod \"3625535f-12fd-4bcf-93ca-bee191372744\" (UID: \"3625535f-12fd-4bcf-93ca-bee191372744\") " Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.616172 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3625535f-12fd-4bcf-93ca-bee191372744-config-data\") pod \"3625535f-12fd-4bcf-93ca-bee191372744\" (UID: \"3625535f-12fd-4bcf-93ca-bee191372744\") " Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.616331 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g77rr\" (UniqueName: \"kubernetes.io/projected/3625535f-12fd-4bcf-93ca-bee191372744-kube-api-access-g77rr\") pod \"3625535f-12fd-4bcf-93ca-bee191372744\" (UID: \"3625535f-12fd-4bcf-93ca-bee191372744\") " Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.616384 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3625535f-12fd-4bcf-93ca-bee191372744-combined-ca-bundle\") pod \"3625535f-12fd-4bcf-93ca-bee191372744\" (UID: 
\"3625535f-12fd-4bcf-93ca-bee191372744\") " Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.619978 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5ffb7f87f4-dr5qd"] Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.620656 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3625535f-12fd-4bcf-93ca-bee191372744-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3625535f-12fd-4bcf-93ca-bee191372744" (UID: "3625535f-12fd-4bcf-93ca-bee191372744"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.623502 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3625535f-12fd-4bcf-93ca-bee191372744-kube-api-access-g77rr" (OuterVolumeSpecName: "kube-api-access-g77rr") pod "3625535f-12fd-4bcf-93ca-bee191372744" (UID: "3625535f-12fd-4bcf-93ca-bee191372744"). InnerVolumeSpecName "kube-api-access-g77rr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.633235 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5ffb7f87f4-dr5qd"] Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.636170 5129 scope.go:117] "RemoveContainer" containerID="a6924bd25c9713c1d29ad974573da527ef3360b10ae1bbaca0c2eb7adc59aee2" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.646237 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3625535f-12fd-4bcf-93ca-bee191372744-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3625535f-12fd-4bcf-93ca-bee191372744" (UID: "3625535f-12fd-4bcf-93ca-bee191372744"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.672675 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3625535f-12fd-4bcf-93ca-bee191372744-config-data" (OuterVolumeSpecName: "config-data") pod "3625535f-12fd-4bcf-93ca-bee191372744" (UID: "3625535f-12fd-4bcf-93ca-bee191372744"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.718803 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g77rr\" (UniqueName: \"kubernetes.io/projected/3625535f-12fd-4bcf-93ca-bee191372744-kube-api-access-g77rr\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.718843 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3625535f-12fd-4bcf-93ca-bee191372744-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.718855 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3625535f-12fd-4bcf-93ca-bee191372744-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.718866 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3625535f-12fd-4bcf-93ca-bee191372744-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.910738 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7cc9c6d9db-l2g7q"] Mar 14 09:07:29 crc kubenswrapper[5129]: I0314 09:07:29.918898 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7cc9c6d9db-l2g7q"] Mar 14 09:07:30 crc kubenswrapper[5129]: I0314 09:07:30.057264 5129 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="3625535f-12fd-4bcf-93ca-bee191372744" path="/var/lib/kubelet/pods/3625535f-12fd-4bcf-93ca-bee191372744/volumes" Mar 14 09:07:30 crc kubenswrapper[5129]: I0314 09:07:30.058090 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e416d899-7d11-4e29-a4d6-03774e65691a" path="/var/lib/kubelet/pods/e416d899-7d11-4e29-a4d6-03774e65691a/volumes" Mar 14 09:07:35 crc kubenswrapper[5129]: I0314 09:07:35.633206 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-659c856df6-q8n7k" podUID="444cdcb0-f68c-4943-aa88-dc3710848a7d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.159:8443: connect: connection refused" Mar 14 09:07:35 crc kubenswrapper[5129]: I0314 09:07:35.633952 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:07:35 crc kubenswrapper[5129]: I0314 09:07:35.851663 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-85c4d5f8c-p2pkn" Mar 14 09:07:35 crc kubenswrapper[5129]: I0314 09:07:35.921754 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6bd79d47cd-8jc86"] Mar 14 09:07:35 crc kubenswrapper[5129]: I0314 09:07:35.922175 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-6bd79d47cd-8jc86" podUID="d710d033-8d28-4007-9088-1b98afe917da" containerName="heat-engine" containerID="cri-o://5d7a56745fa6261133cb60302e4a06eaf958b1e5ba92166579b727dca336b026" gracePeriod=60 Mar 14 09:07:38 crc kubenswrapper[5129]: E0314 09:07:38.740816 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="5d7a56745fa6261133cb60302e4a06eaf958b1e5ba92166579b727dca336b026" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 14 09:07:38 crc kubenswrapper[5129]: E0314 09:07:38.747521 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5d7a56745fa6261133cb60302e4a06eaf958b1e5ba92166579b727dca336b026" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 14 09:07:38 crc kubenswrapper[5129]: E0314 09:07:38.750684 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5d7a56745fa6261133cb60302e4a06eaf958b1e5ba92166579b727dca336b026" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 14 09:07:38 crc kubenswrapper[5129]: E0314 09:07:38.750719 5129 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-6bd79d47cd-8jc86" podUID="d710d033-8d28-4007-9088-1b98afe917da" containerName="heat-engine" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.333343 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6bd79d47cd-8jc86" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.340313 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.499120 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d710d033-8d28-4007-9088-1b98afe917da-combined-ca-bundle\") pod \"d710d033-8d28-4007-9088-1b98afe917da\" (UID: \"d710d033-8d28-4007-9088-1b98afe917da\") " Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.499249 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/444cdcb0-f68c-4943-aa88-dc3710848a7d-config-data\") pod \"444cdcb0-f68c-4943-aa88-dc3710848a7d\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.499294 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d710d033-8d28-4007-9088-1b98afe917da-config-data\") pod \"d710d033-8d28-4007-9088-1b98afe917da\" (UID: \"d710d033-8d28-4007-9088-1b98afe917da\") " Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.499328 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/444cdcb0-f68c-4943-aa88-dc3710848a7d-scripts\") pod \"444cdcb0-f68c-4943-aa88-dc3710848a7d\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.499401 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/444cdcb0-f68c-4943-aa88-dc3710848a7d-horizon-tls-certs\") pod \"444cdcb0-f68c-4943-aa88-dc3710848a7d\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.499462 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/444cdcb0-f68c-4943-aa88-dc3710848a7d-logs\") pod \"444cdcb0-f68c-4943-aa88-dc3710848a7d\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.500041 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/444cdcb0-f68c-4943-aa88-dc3710848a7d-horizon-secret-key\") pod \"444cdcb0-f68c-4943-aa88-dc3710848a7d\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.500105 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d710d033-8d28-4007-9088-1b98afe917da-config-data-custom\") pod \"d710d033-8d28-4007-9088-1b98afe917da\" (UID: \"d710d033-8d28-4007-9088-1b98afe917da\") " Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.500165 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jdq7\" (UniqueName: \"kubernetes.io/projected/d710d033-8d28-4007-9088-1b98afe917da-kube-api-access-7jdq7\") pod \"d710d033-8d28-4007-9088-1b98afe917da\" (UID: \"d710d033-8d28-4007-9088-1b98afe917da\") " Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.500205 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pccq5\" (UniqueName: \"kubernetes.io/projected/444cdcb0-f68c-4943-aa88-dc3710848a7d-kube-api-access-pccq5\") pod \"444cdcb0-f68c-4943-aa88-dc3710848a7d\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.500238 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/444cdcb0-f68c-4943-aa88-dc3710848a7d-combined-ca-bundle\") pod \"444cdcb0-f68c-4943-aa88-dc3710848a7d\" (UID: \"444cdcb0-f68c-4943-aa88-dc3710848a7d\") " Mar 14 09:07:42 crc 
kubenswrapper[5129]: I0314 09:07:42.500299 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/444cdcb0-f68c-4943-aa88-dc3710848a7d-logs" (OuterVolumeSpecName: "logs") pod "444cdcb0-f68c-4943-aa88-dc3710848a7d" (UID: "444cdcb0-f68c-4943-aa88-dc3710848a7d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.500781 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/444cdcb0-f68c-4943-aa88-dc3710848a7d-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.505628 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d710d033-8d28-4007-9088-1b98afe917da-kube-api-access-7jdq7" (OuterVolumeSpecName: "kube-api-access-7jdq7") pod "d710d033-8d28-4007-9088-1b98afe917da" (UID: "d710d033-8d28-4007-9088-1b98afe917da"). InnerVolumeSpecName "kube-api-access-7jdq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.505709 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d710d033-8d28-4007-9088-1b98afe917da-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d710d033-8d28-4007-9088-1b98afe917da" (UID: "d710d033-8d28-4007-9088-1b98afe917da"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.507919 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/444cdcb0-f68c-4943-aa88-dc3710848a7d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "444cdcb0-f68c-4943-aa88-dc3710848a7d" (UID: "444cdcb0-f68c-4943-aa88-dc3710848a7d"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.519812 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/444cdcb0-f68c-4943-aa88-dc3710848a7d-kube-api-access-pccq5" (OuterVolumeSpecName: "kube-api-access-pccq5") pod "444cdcb0-f68c-4943-aa88-dc3710848a7d" (UID: "444cdcb0-f68c-4943-aa88-dc3710848a7d"). InnerVolumeSpecName "kube-api-access-pccq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.533634 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/444cdcb0-f68c-4943-aa88-dc3710848a7d-scripts" (OuterVolumeSpecName: "scripts") pod "444cdcb0-f68c-4943-aa88-dc3710848a7d" (UID: "444cdcb0-f68c-4943-aa88-dc3710848a7d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.546662 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/444cdcb0-f68c-4943-aa88-dc3710848a7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "444cdcb0-f68c-4943-aa88-dc3710848a7d" (UID: "444cdcb0-f68c-4943-aa88-dc3710848a7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.559660 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/444cdcb0-f68c-4943-aa88-dc3710848a7d-config-data" (OuterVolumeSpecName: "config-data") pod "444cdcb0-f68c-4943-aa88-dc3710848a7d" (UID: "444cdcb0-f68c-4943-aa88-dc3710848a7d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.563312 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d710d033-8d28-4007-9088-1b98afe917da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d710d033-8d28-4007-9088-1b98afe917da" (UID: "d710d033-8d28-4007-9088-1b98afe917da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.563754 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d710d033-8d28-4007-9088-1b98afe917da-config-data" (OuterVolumeSpecName: "config-data") pod "d710d033-8d28-4007-9088-1b98afe917da" (UID: "d710d033-8d28-4007-9088-1b98afe917da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.567469 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/444cdcb0-f68c-4943-aa88-dc3710848a7d-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "444cdcb0-f68c-4943-aa88-dc3710848a7d" (UID: "444cdcb0-f68c-4943-aa88-dc3710848a7d"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.603363 5129 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/444cdcb0-f68c-4943-aa88-dc3710848a7d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.603697 5129 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d710d033-8d28-4007-9088-1b98afe917da-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.603808 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jdq7\" (UniqueName: \"kubernetes.io/projected/d710d033-8d28-4007-9088-1b98afe917da-kube-api-access-7jdq7\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.603906 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pccq5\" (UniqueName: \"kubernetes.io/projected/444cdcb0-f68c-4943-aa88-dc3710848a7d-kube-api-access-pccq5\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.603990 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/444cdcb0-f68c-4943-aa88-dc3710848a7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.604075 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d710d033-8d28-4007-9088-1b98afe917da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.604166 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/444cdcb0-f68c-4943-aa88-dc3710848a7d-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:42 crc 
kubenswrapper[5129]: I0314 09:07:42.604257 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d710d033-8d28-4007-9088-1b98afe917da-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.604350 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/444cdcb0-f68c-4943-aa88-dc3710848a7d-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.604450 5129 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/444cdcb0-f68c-4943-aa88-dc3710848a7d-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.703947 5129 generic.go:334] "Generic (PLEG): container finished" podID="d710d033-8d28-4007-9088-1b98afe917da" containerID="5d7a56745fa6261133cb60302e4a06eaf958b1e5ba92166579b727dca336b026" exitCode=0 Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.704302 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6bd79d47cd-8jc86" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.704218 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6bd79d47cd-8jc86" event={"ID":"d710d033-8d28-4007-9088-1b98afe917da","Type":"ContainerDied","Data":"5d7a56745fa6261133cb60302e4a06eaf958b1e5ba92166579b727dca336b026"} Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.704449 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6bd79d47cd-8jc86" event={"ID":"d710d033-8d28-4007-9088-1b98afe917da","Type":"ContainerDied","Data":"b1d10b9c64d2f64e156801bc2c7812d8f324405bdafdaf4a43ad950b246d25a7"} Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.704479 5129 scope.go:117] "RemoveContainer" containerID="5d7a56745fa6261133cb60302e4a06eaf958b1e5ba92166579b727dca336b026" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.708073 5129 generic.go:334] "Generic (PLEG): container finished" podID="444cdcb0-f68c-4943-aa88-dc3710848a7d" containerID="c20e6877f1ebf04753afceed6552cbfb2b38c48881a1d842ce09335d1fd9d682" exitCode=137 Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.708180 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-659c856df6-q8n7k" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.708201 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-659c856df6-q8n7k" event={"ID":"444cdcb0-f68c-4943-aa88-dc3710848a7d","Type":"ContainerDied","Data":"c20e6877f1ebf04753afceed6552cbfb2b38c48881a1d842ce09335d1fd9d682"} Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.708428 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-659c856df6-q8n7k" event={"ID":"444cdcb0-f68c-4943-aa88-dc3710848a7d","Type":"ContainerDied","Data":"a933796dd05a4bfa5c1ff82d3425e434da909b082607d96906d73b42c5be8b7a"} Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.744460 5129 scope.go:117] "RemoveContainer" containerID="5d7a56745fa6261133cb60302e4a06eaf958b1e5ba92166579b727dca336b026" Mar 14 09:07:42 crc kubenswrapper[5129]: E0314 09:07:42.746209 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d7a56745fa6261133cb60302e4a06eaf958b1e5ba92166579b727dca336b026\": container with ID starting with 5d7a56745fa6261133cb60302e4a06eaf958b1e5ba92166579b727dca336b026 not found: ID does not exist" containerID="5d7a56745fa6261133cb60302e4a06eaf958b1e5ba92166579b727dca336b026" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.746253 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d7a56745fa6261133cb60302e4a06eaf958b1e5ba92166579b727dca336b026"} err="failed to get container status \"5d7a56745fa6261133cb60302e4a06eaf958b1e5ba92166579b727dca336b026\": rpc error: code = NotFound desc = could not find container \"5d7a56745fa6261133cb60302e4a06eaf958b1e5ba92166579b727dca336b026\": container with ID starting with 5d7a56745fa6261133cb60302e4a06eaf958b1e5ba92166579b727dca336b026 not found: ID does not exist" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.746278 5129 
scope.go:117] "RemoveContainer" containerID="891fc688af2ec8e829eea7544e8bc2e52924fd1da1d453a8aba3f1e4a3b9b2c6" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.750292 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6bd79d47cd-8jc86"] Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.763759 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-6bd79d47cd-8jc86"] Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.772512 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-659c856df6-q8n7k"] Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.783287 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-659c856df6-q8n7k"] Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.918106 5129 scope.go:117] "RemoveContainer" containerID="c20e6877f1ebf04753afceed6552cbfb2b38c48881a1d842ce09335d1fd9d682" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.983902 5129 scope.go:117] "RemoveContainer" containerID="891fc688af2ec8e829eea7544e8bc2e52924fd1da1d453a8aba3f1e4a3b9b2c6" Mar 14 09:07:42 crc kubenswrapper[5129]: E0314 09:07:42.989750 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"891fc688af2ec8e829eea7544e8bc2e52924fd1da1d453a8aba3f1e4a3b9b2c6\": container with ID starting with 891fc688af2ec8e829eea7544e8bc2e52924fd1da1d453a8aba3f1e4a3b9b2c6 not found: ID does not exist" containerID="891fc688af2ec8e829eea7544e8bc2e52924fd1da1d453a8aba3f1e4a3b9b2c6" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.990047 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"891fc688af2ec8e829eea7544e8bc2e52924fd1da1d453a8aba3f1e4a3b9b2c6"} err="failed to get container status \"891fc688af2ec8e829eea7544e8bc2e52924fd1da1d453a8aba3f1e4a3b9b2c6\": rpc error: code = NotFound desc = could not find container 
\"891fc688af2ec8e829eea7544e8bc2e52924fd1da1d453a8aba3f1e4a3b9b2c6\": container with ID starting with 891fc688af2ec8e829eea7544e8bc2e52924fd1da1d453a8aba3f1e4a3b9b2c6 not found: ID does not exist" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.990086 5129 scope.go:117] "RemoveContainer" containerID="c20e6877f1ebf04753afceed6552cbfb2b38c48881a1d842ce09335d1fd9d682" Mar 14 09:07:42 crc kubenswrapper[5129]: E0314 09:07:42.991451 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c20e6877f1ebf04753afceed6552cbfb2b38c48881a1d842ce09335d1fd9d682\": container with ID starting with c20e6877f1ebf04753afceed6552cbfb2b38c48881a1d842ce09335d1fd9d682 not found: ID does not exist" containerID="c20e6877f1ebf04753afceed6552cbfb2b38c48881a1d842ce09335d1fd9d682" Mar 14 09:07:42 crc kubenswrapper[5129]: I0314 09:07:42.991488 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c20e6877f1ebf04753afceed6552cbfb2b38c48881a1d842ce09335d1fd9d682"} err="failed to get container status \"c20e6877f1ebf04753afceed6552cbfb2b38c48881a1d842ce09335d1fd9d682\": rpc error: code = NotFound desc = could not find container \"c20e6877f1ebf04753afceed6552cbfb2b38c48881a1d842ce09335d1fd9d682\": container with ID starting with c20e6877f1ebf04753afceed6552cbfb2b38c48881a1d842ce09335d1fd9d682 not found: ID does not exist" Mar 14 09:07:44 crc kubenswrapper[5129]: I0314 09:07:44.055360 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="444cdcb0-f68c-4943-aa88-dc3710848a7d" path="/var/lib/kubelet/pods/444cdcb0-f68c-4943-aa88-dc3710848a7d/volumes" Mar 14 09:07:44 crc kubenswrapper[5129]: I0314 09:07:44.056975 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d710d033-8d28-4007-9088-1b98afe917da" path="/var/lib/kubelet/pods/d710d033-8d28-4007-9088-1b98afe917da/volumes" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.126444 
5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs"] Mar 14 09:07:57 crc kubenswrapper[5129]: E0314 09:07:57.127589 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9acd8d94-20a0-4529-be0f-ddcac0466c8f" containerName="heat-cfnapi" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.127628 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="9acd8d94-20a0-4529-be0f-ddcac0466c8f" containerName="heat-cfnapi" Mar 14 09:07:57 crc kubenswrapper[5129]: E0314 09:07:57.127649 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d710d033-8d28-4007-9088-1b98afe917da" containerName="heat-engine" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.127658 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d710d033-8d28-4007-9088-1b98afe917da" containerName="heat-engine" Mar 14 09:07:57 crc kubenswrapper[5129]: E0314 09:07:57.127677 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c89eb1-1971-49aa-9967-2fde67ead88a" containerName="heat-api" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.127687 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c89eb1-1971-49aa-9967-2fde67ead88a" containerName="heat-api" Mar 14 09:07:57 crc kubenswrapper[5129]: E0314 09:07:57.127704 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="444cdcb0-f68c-4943-aa88-dc3710848a7d" containerName="horizon" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.127712 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="444cdcb0-f68c-4943-aa88-dc3710848a7d" containerName="horizon" Mar 14 09:07:57 crc kubenswrapper[5129]: E0314 09:07:57.127728 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e416d899-7d11-4e29-a4d6-03774e65691a" containerName="heat-cfnapi" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.127736 5129 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e416d899-7d11-4e29-a4d6-03774e65691a" containerName="heat-cfnapi" Mar 14 09:07:57 crc kubenswrapper[5129]: E0314 09:07:57.127762 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="444cdcb0-f68c-4943-aa88-dc3710848a7d" containerName="horizon-log" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.127772 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="444cdcb0-f68c-4943-aa88-dc3710848a7d" containerName="horizon-log" Mar 14 09:07:57 crc kubenswrapper[5129]: E0314 09:07:57.127789 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e416d899-7d11-4e29-a4d6-03774e65691a" containerName="heat-cfnapi" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.127797 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e416d899-7d11-4e29-a4d6-03774e65691a" containerName="heat-cfnapi" Mar 14 09:07:57 crc kubenswrapper[5129]: E0314 09:07:57.127816 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3625535f-12fd-4bcf-93ca-bee191372744" containerName="heat-api" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.127824 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3625535f-12fd-4bcf-93ca-bee191372744" containerName="heat-api" Mar 14 09:07:57 crc kubenswrapper[5129]: E0314 09:07:57.127839 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3625535f-12fd-4bcf-93ca-bee191372744" containerName="heat-api" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.127847 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3625535f-12fd-4bcf-93ca-bee191372744" containerName="heat-api" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.128082 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="9acd8d94-20a0-4529-be0f-ddcac0466c8f" containerName="heat-cfnapi" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.128100 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e416d899-7d11-4e29-a4d6-03774e65691a" 
containerName="heat-cfnapi" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.128111 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="444cdcb0-f68c-4943-aa88-dc3710848a7d" containerName="horizon" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.128128 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3625535f-12fd-4bcf-93ca-bee191372744" containerName="heat-api" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.128144 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3625535f-12fd-4bcf-93ca-bee191372744" containerName="heat-api" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.128160 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4c89eb1-1971-49aa-9967-2fde67ead88a" containerName="heat-api" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.128170 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d710d033-8d28-4007-9088-1b98afe917da" containerName="heat-engine" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.128191 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="444cdcb0-f68c-4943-aa88-dc3710848a7d" containerName="horizon-log" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.128751 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e416d899-7d11-4e29-a4d6-03774e65691a" containerName="heat-cfnapi" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.130233 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.133193 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.139912 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs"] Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.242928 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23d1faf4-a154-444b-93c5-9ea1eb9e43ea-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs\" (UID: \"23d1faf4-a154-444b-93c5-9ea1eb9e43ea\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.243631 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dvks\" (UniqueName: \"kubernetes.io/projected/23d1faf4-a154-444b-93c5-9ea1eb9e43ea-kube-api-access-6dvks\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs\" (UID: \"23d1faf4-a154-444b-93c5-9ea1eb9e43ea\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.243866 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23d1faf4-a154-444b-93c5-9ea1eb9e43ea-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs\" (UID: \"23d1faf4-a154-444b-93c5-9ea1eb9e43ea\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs" Mar 14 09:07:57 crc kubenswrapper[5129]: 
I0314 09:07:57.346069 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23d1faf4-a154-444b-93c5-9ea1eb9e43ea-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs\" (UID: \"23d1faf4-a154-444b-93c5-9ea1eb9e43ea\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.346257 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dvks\" (UniqueName: \"kubernetes.io/projected/23d1faf4-a154-444b-93c5-9ea1eb9e43ea-kube-api-access-6dvks\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs\" (UID: \"23d1faf4-a154-444b-93c5-9ea1eb9e43ea\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.346365 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23d1faf4-a154-444b-93c5-9ea1eb9e43ea-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs\" (UID: \"23d1faf4-a154-444b-93c5-9ea1eb9e43ea\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.346598 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23d1faf4-a154-444b-93c5-9ea1eb9e43ea-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs\" (UID: \"23d1faf4-a154-444b-93c5-9ea1eb9e43ea\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.349899 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/23d1faf4-a154-444b-93c5-9ea1eb9e43ea-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs\" (UID: \"23d1faf4-a154-444b-93c5-9ea1eb9e43ea\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.382799 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dvks\" (UniqueName: \"kubernetes.io/projected/23d1faf4-a154-444b-93c5-9ea1eb9e43ea-kube-api-access-6dvks\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs\" (UID: \"23d1faf4-a154-444b-93c5-9ea1eb9e43ea\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.468363 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs" Mar 14 09:07:57 crc kubenswrapper[5129]: I0314 09:07:57.968025 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs"] Mar 14 09:07:58 crc kubenswrapper[5129]: I0314 09:07:58.873640 5129 generic.go:334] "Generic (PLEG): container finished" podID="23d1faf4-a154-444b-93c5-9ea1eb9e43ea" containerID="2408c01622dd172950c980c3db71245690603ee64b9d1840bfe4c99b47d0a9bc" exitCode=0 Mar 14 09:07:58 crc kubenswrapper[5129]: I0314 09:07:58.873689 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs" event={"ID":"23d1faf4-a154-444b-93c5-9ea1eb9e43ea","Type":"ContainerDied","Data":"2408c01622dd172950c980c3db71245690603ee64b9d1840bfe4c99b47d0a9bc"} Mar 14 09:07:58 crc kubenswrapper[5129]: I0314 09:07:58.873977 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs" event={"ID":"23d1faf4-a154-444b-93c5-9ea1eb9e43ea","Type":"ContainerStarted","Data":"e77a4a78e2223a36d15d1a20e1f2ff645977bd8262e11b28a47dac8bac3d3101"} Mar 14 09:08:00 crc kubenswrapper[5129]: I0314 09:08:00.145673 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557988-84zvt"] Mar 14 09:08:00 crc kubenswrapper[5129]: I0314 09:08:00.147495 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557988-84zvt" Mar 14 09:08:00 crc kubenswrapper[5129]: I0314 09:08:00.150057 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:08:00 crc kubenswrapper[5129]: I0314 09:08:00.150514 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:08:00 crc kubenswrapper[5129]: I0314 09:08:00.150760 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:08:00 crc kubenswrapper[5129]: I0314 09:08:00.156146 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557988-84zvt"] Mar 14 09:08:00 crc kubenswrapper[5129]: I0314 09:08:00.305473 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6p7s\" (UniqueName: \"kubernetes.io/projected/be1f1ae2-41a9-4559-a79d-3dd00f407d52-kube-api-access-z6p7s\") pod \"auto-csr-approver-29557988-84zvt\" (UID: \"be1f1ae2-41a9-4559-a79d-3dd00f407d52\") " pod="openshift-infra/auto-csr-approver-29557988-84zvt" Mar 14 09:08:00 crc kubenswrapper[5129]: I0314 09:08:00.407932 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6p7s\" (UniqueName: 
\"kubernetes.io/projected/be1f1ae2-41a9-4559-a79d-3dd00f407d52-kube-api-access-z6p7s\") pod \"auto-csr-approver-29557988-84zvt\" (UID: \"be1f1ae2-41a9-4559-a79d-3dd00f407d52\") " pod="openshift-infra/auto-csr-approver-29557988-84zvt" Mar 14 09:08:00 crc kubenswrapper[5129]: I0314 09:08:00.426792 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6p7s\" (UniqueName: \"kubernetes.io/projected/be1f1ae2-41a9-4559-a79d-3dd00f407d52-kube-api-access-z6p7s\") pod \"auto-csr-approver-29557988-84zvt\" (UID: \"be1f1ae2-41a9-4559-a79d-3dd00f407d52\") " pod="openshift-infra/auto-csr-approver-29557988-84zvt" Mar 14 09:08:00 crc kubenswrapper[5129]: I0314 09:08:00.543959 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557988-84zvt" Mar 14 09:08:00 crc kubenswrapper[5129]: I0314 09:08:00.782958 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557988-84zvt"] Mar 14 09:08:00 crc kubenswrapper[5129]: W0314 09:08:00.789220 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe1f1ae2_41a9_4559_a79d_3dd00f407d52.slice/crio-fbb666dcc9eeada88a8093808da862457b75ba65c4591ea5ca59a3ada4c2fc23 WatchSource:0}: Error finding container fbb666dcc9eeada88a8093808da862457b75ba65c4591ea5ca59a3ada4c2fc23: Status 404 returned error can't find the container with id fbb666dcc9eeada88a8093808da862457b75ba65c4591ea5ca59a3ada4c2fc23 Mar 14 09:08:00 crc kubenswrapper[5129]: I0314 09:08:00.894384 5129 generic.go:334] "Generic (PLEG): container finished" podID="23d1faf4-a154-444b-93c5-9ea1eb9e43ea" containerID="c8e69d0d93da1087abc721c736736a7d5b932cd589f28b2fd63d41bb57e45ada" exitCode=0 Mar 14 09:08:00 crc kubenswrapper[5129]: I0314 09:08:00.894635 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs" event={"ID":"23d1faf4-a154-444b-93c5-9ea1eb9e43ea","Type":"ContainerDied","Data":"c8e69d0d93da1087abc721c736736a7d5b932cd589f28b2fd63d41bb57e45ada"} Mar 14 09:08:00 crc kubenswrapper[5129]: I0314 09:08:00.895854 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557988-84zvt" event={"ID":"be1f1ae2-41a9-4559-a79d-3dd00f407d52","Type":"ContainerStarted","Data":"fbb666dcc9eeada88a8093808da862457b75ba65c4591ea5ca59a3ada4c2fc23"} Mar 14 09:08:01 crc kubenswrapper[5129]: I0314 09:08:01.910026 5129 generic.go:334] "Generic (PLEG): container finished" podID="23d1faf4-a154-444b-93c5-9ea1eb9e43ea" containerID="e789195518706cb712f67d6c8c907ddd9f385e9e482f40474716bd2f79215c38" exitCode=0 Mar 14 09:08:01 crc kubenswrapper[5129]: I0314 09:08:01.910105 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs" event={"ID":"23d1faf4-a154-444b-93c5-9ea1eb9e43ea","Type":"ContainerDied","Data":"e789195518706cb712f67d6c8c907ddd9f385e9e482f40474716bd2f79215c38"} Mar 14 09:08:02 crc kubenswrapper[5129]: I0314 09:08:02.924148 5129 generic.go:334] "Generic (PLEG): container finished" podID="be1f1ae2-41a9-4559-a79d-3dd00f407d52" containerID="9341c9da77ed187d1d35be3fc1af0b5c5de34df01fdca6bcf9392e148cd4d364" exitCode=0 Mar 14 09:08:02 crc kubenswrapper[5129]: I0314 09:08:02.924228 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557988-84zvt" event={"ID":"be1f1ae2-41a9-4559-a79d-3dd00f407d52","Type":"ContainerDied","Data":"9341c9da77ed187d1d35be3fc1af0b5c5de34df01fdca6bcf9392e148cd4d364"} Mar 14 09:08:03 crc kubenswrapper[5129]: I0314 09:08:03.253578 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs" Mar 14 09:08:03 crc kubenswrapper[5129]: I0314 09:08:03.368669 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23d1faf4-a154-444b-93c5-9ea1eb9e43ea-bundle\") pod \"23d1faf4-a154-444b-93c5-9ea1eb9e43ea\" (UID: \"23d1faf4-a154-444b-93c5-9ea1eb9e43ea\") " Mar 14 09:08:03 crc kubenswrapper[5129]: I0314 09:08:03.368796 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23d1faf4-a154-444b-93c5-9ea1eb9e43ea-util\") pod \"23d1faf4-a154-444b-93c5-9ea1eb9e43ea\" (UID: \"23d1faf4-a154-444b-93c5-9ea1eb9e43ea\") " Mar 14 09:08:03 crc kubenswrapper[5129]: I0314 09:08:03.369118 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dvks\" (UniqueName: \"kubernetes.io/projected/23d1faf4-a154-444b-93c5-9ea1eb9e43ea-kube-api-access-6dvks\") pod \"23d1faf4-a154-444b-93c5-9ea1eb9e43ea\" (UID: \"23d1faf4-a154-444b-93c5-9ea1eb9e43ea\") " Mar 14 09:08:03 crc kubenswrapper[5129]: I0314 09:08:03.371291 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23d1faf4-a154-444b-93c5-9ea1eb9e43ea-bundle" (OuterVolumeSpecName: "bundle") pod "23d1faf4-a154-444b-93c5-9ea1eb9e43ea" (UID: "23d1faf4-a154-444b-93c5-9ea1eb9e43ea"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:08:03 crc kubenswrapper[5129]: I0314 09:08:03.372968 5129 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23d1faf4-a154-444b-93c5-9ea1eb9e43ea-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:08:03 crc kubenswrapper[5129]: I0314 09:08:03.380254 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23d1faf4-a154-444b-93c5-9ea1eb9e43ea-util" (OuterVolumeSpecName: "util") pod "23d1faf4-a154-444b-93c5-9ea1eb9e43ea" (UID: "23d1faf4-a154-444b-93c5-9ea1eb9e43ea"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:08:03 crc kubenswrapper[5129]: I0314 09:08:03.383030 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23d1faf4-a154-444b-93c5-9ea1eb9e43ea-kube-api-access-6dvks" (OuterVolumeSpecName: "kube-api-access-6dvks") pod "23d1faf4-a154-444b-93c5-9ea1eb9e43ea" (UID: "23d1faf4-a154-444b-93c5-9ea1eb9e43ea"). InnerVolumeSpecName "kube-api-access-6dvks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:08:03 crc kubenswrapper[5129]: I0314 09:08:03.474305 5129 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23d1faf4-a154-444b-93c5-9ea1eb9e43ea-util\") on node \"crc\" DevicePath \"\"" Mar 14 09:08:03 crc kubenswrapper[5129]: I0314 09:08:03.474346 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dvks\" (UniqueName: \"kubernetes.io/projected/23d1faf4-a154-444b-93c5-9ea1eb9e43ea-kube-api-access-6dvks\") on node \"crc\" DevicePath \"\"" Mar 14 09:08:03 crc kubenswrapper[5129]: I0314 09:08:03.941955 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs" event={"ID":"23d1faf4-a154-444b-93c5-9ea1eb9e43ea","Type":"ContainerDied","Data":"e77a4a78e2223a36d15d1a20e1f2ff645977bd8262e11b28a47dac8bac3d3101"} Mar 14 09:08:03 crc kubenswrapper[5129]: I0314 09:08:03.942408 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e77a4a78e2223a36d15d1a20e1f2ff645977bd8262e11b28a47dac8bac3d3101" Mar 14 09:08:03 crc kubenswrapper[5129]: I0314 09:08:03.942014 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs" Mar 14 09:08:04 crc kubenswrapper[5129]: I0314 09:08:04.275415 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557988-84zvt" Mar 14 09:08:04 crc kubenswrapper[5129]: I0314 09:08:04.401397 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6p7s\" (UniqueName: \"kubernetes.io/projected/be1f1ae2-41a9-4559-a79d-3dd00f407d52-kube-api-access-z6p7s\") pod \"be1f1ae2-41a9-4559-a79d-3dd00f407d52\" (UID: \"be1f1ae2-41a9-4559-a79d-3dd00f407d52\") " Mar 14 09:08:04 crc kubenswrapper[5129]: I0314 09:08:04.408589 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be1f1ae2-41a9-4559-a79d-3dd00f407d52-kube-api-access-z6p7s" (OuterVolumeSpecName: "kube-api-access-z6p7s") pod "be1f1ae2-41a9-4559-a79d-3dd00f407d52" (UID: "be1f1ae2-41a9-4559-a79d-3dd00f407d52"). InnerVolumeSpecName "kube-api-access-z6p7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:08:04 crc kubenswrapper[5129]: I0314 09:08:04.504160 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6p7s\" (UniqueName: \"kubernetes.io/projected/be1f1ae2-41a9-4559-a79d-3dd00f407d52-kube-api-access-z6p7s\") on node \"crc\" DevicePath \"\"" Mar 14 09:08:04 crc kubenswrapper[5129]: I0314 09:08:04.954816 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557988-84zvt" event={"ID":"be1f1ae2-41a9-4559-a79d-3dd00f407d52","Type":"ContainerDied","Data":"fbb666dcc9eeada88a8093808da862457b75ba65c4591ea5ca59a3ada4c2fc23"} Mar 14 09:08:04 crc kubenswrapper[5129]: I0314 09:08:04.954868 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbb666dcc9eeada88a8093808da862457b75ba65c4591ea5ca59a3ada4c2fc23" Mar 14 09:08:04 crc kubenswrapper[5129]: I0314 09:08:04.954879 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557988-84zvt" Mar 14 09:08:05 crc kubenswrapper[5129]: I0314 09:08:05.374182 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557982-p9lb8"] Mar 14 09:08:05 crc kubenswrapper[5129]: I0314 09:08:05.388457 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557982-p9lb8"] Mar 14 09:08:06 crc kubenswrapper[5129]: I0314 09:08:06.049413 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55864dec-a088-4577-a4ed-9588a43de404" path="/var/lib/kubelet/pods/55864dec-a088-4577-a4ed-9588a43de404/volumes" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.675714 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rst7l"] Mar 14 09:08:15 crc kubenswrapper[5129]: E0314 09:08:15.676765 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be1f1ae2-41a9-4559-a79d-3dd00f407d52" containerName="oc" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.676784 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="be1f1ae2-41a9-4559-a79d-3dd00f407d52" containerName="oc" Mar 14 09:08:15 crc kubenswrapper[5129]: E0314 09:08:15.676816 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23d1faf4-a154-444b-93c5-9ea1eb9e43ea" containerName="util" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.676823 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d1faf4-a154-444b-93c5-9ea1eb9e43ea" containerName="util" Mar 14 09:08:15 crc kubenswrapper[5129]: E0314 09:08:15.676839 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23d1faf4-a154-444b-93c5-9ea1eb9e43ea" containerName="pull" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.676845 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d1faf4-a154-444b-93c5-9ea1eb9e43ea" containerName="pull" Mar 14 09:08:15 
crc kubenswrapper[5129]: E0314 09:08:15.676857 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23d1faf4-a154-444b-93c5-9ea1eb9e43ea" containerName="extract" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.676863 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d1faf4-a154-444b-93c5-9ea1eb9e43ea" containerName="extract" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.677044 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="23d1faf4-a154-444b-93c5-9ea1eb9e43ea" containerName="extract" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.677058 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="be1f1ae2-41a9-4559-a79d-3dd00f407d52" containerName="oc" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.677698 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rst7l" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.680616 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.680617 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-hqlqf" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.681169 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.692490 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rst7l"] Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.743106 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4jh9\" (UniqueName: \"kubernetes.io/projected/cc3d7fc0-4c50-42d1-984a-822d52e9ce6f-kube-api-access-q4jh9\") pod 
\"obo-prometheus-operator-68bc856cb9-rst7l\" (UID: \"cc3d7fc0-4c50-42d1-984a-822d52e9ce6f\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rst7l" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.812832 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-dscrs"] Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.814303 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-dscrs" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.820997 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-wsbpp" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.839372 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.841616 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-dscrs"] Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.856423 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4jh9\" (UniqueName: \"kubernetes.io/projected/cc3d7fc0-4c50-42d1-984a-822d52e9ce6f-kube-api-access-q4jh9\") pod \"obo-prometheus-operator-68bc856cb9-rst7l\" (UID: \"cc3d7fc0-4c50-42d1-984a-822d52e9ce6f\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rst7l" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.917685 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-zx67t"] Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.920319 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-zx67t" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.937325 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4jh9\" (UniqueName: \"kubernetes.io/projected/cc3d7fc0-4c50-42d1-984a-822d52e9ce6f-kube-api-access-q4jh9\") pod \"obo-prometheus-operator-68bc856cb9-rst7l\" (UID: \"cc3d7fc0-4c50-42d1-984a-822d52e9ce6f\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rst7l" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.961345 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-zx67t"] Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.969105 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b056df4d-914e-45a6-8b07-fb2565d30c6a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-75c9666696-dscrs\" (UID: \"b056df4d-914e-45a6-8b07-fb2565d30c6a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-dscrs" Mar 14 09:08:15 crc kubenswrapper[5129]: I0314 09:08:15.969532 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b056df4d-914e-45a6-8b07-fb2565d30c6a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-75c9666696-dscrs\" (UID: \"b056df4d-914e-45a6-8b07-fb2565d30c6a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-dscrs" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.015555 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rst7l" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.030922 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-xp2x8"] Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.035769 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-xp2x8" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.041531 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-5ffwg" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.041808 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.077892 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e92945a5-6bd7-41ad-a62e-97f681d79bef-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-75c9666696-zx67t\" (UID: \"e92945a5-6bd7-41ad-a62e-97f681d79bef\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-zx67t" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.078566 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e92945a5-6bd7-41ad-a62e-97f681d79bef-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-75c9666696-zx67t\" (UID: \"e92945a5-6bd7-41ad-a62e-97f681d79bef\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-zx67t" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.078831 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/b056df4d-914e-45a6-8b07-fb2565d30c6a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-75c9666696-dscrs\" (UID: \"b056df4d-914e-45a6-8b07-fb2565d30c6a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-dscrs" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.079108 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b056df4d-914e-45a6-8b07-fb2565d30c6a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-75c9666696-dscrs\" (UID: \"b056df4d-914e-45a6-8b07-fb2565d30c6a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-dscrs" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.083154 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-xp2x8"] Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.084906 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b056df4d-914e-45a6-8b07-fb2565d30c6a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-75c9666696-dscrs\" (UID: \"b056df4d-914e-45a6-8b07-fb2565d30c6a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-dscrs" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.104112 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b056df4d-914e-45a6-8b07-fb2565d30c6a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-75c9666696-dscrs\" (UID: \"b056df4d-914e-45a6-8b07-fb2565d30c6a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-dscrs" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.142374 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-dscrs" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.181719 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e92945a5-6bd7-41ad-a62e-97f681d79bef-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-75c9666696-zx67t\" (UID: \"e92945a5-6bd7-41ad-a62e-97f681d79bef\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-zx67t" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.181800 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m84b\" (UniqueName: \"kubernetes.io/projected/bc6e5aa8-2e36-4b86-b83c-b68182ffe0f7-kube-api-access-7m84b\") pod \"observability-operator-59bdc8b94-xp2x8\" (UID: \"bc6e5aa8-2e36-4b86-b83c-b68182ffe0f7\") " pod="openshift-operators/observability-operator-59bdc8b94-xp2x8" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.181864 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc6e5aa8-2e36-4b86-b83c-b68182ffe0f7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-xp2x8\" (UID: \"bc6e5aa8-2e36-4b86-b83c-b68182ffe0f7\") " pod="openshift-operators/observability-operator-59bdc8b94-xp2x8" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.181949 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e92945a5-6bd7-41ad-a62e-97f681d79bef-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-75c9666696-zx67t\" (UID: \"e92945a5-6bd7-41ad-a62e-97f681d79bef\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-zx67t" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.192107 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e92945a5-6bd7-41ad-a62e-97f681d79bef-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-75c9666696-zx67t\" (UID: \"e92945a5-6bd7-41ad-a62e-97f681d79bef\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-zx67t" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.228963 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e92945a5-6bd7-41ad-a62e-97f681d79bef-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-75c9666696-zx67t\" (UID: \"e92945a5-6bd7-41ad-a62e-97f681d79bef\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-zx67t" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.244580 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-v6wgl"] Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.246265 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-v6wgl" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.276288 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-7x5c7" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.284333 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m84b\" (UniqueName: \"kubernetes.io/projected/bc6e5aa8-2e36-4b86-b83c-b68182ffe0f7-kube-api-access-7m84b\") pod \"observability-operator-59bdc8b94-xp2x8\" (UID: \"bc6e5aa8-2e36-4b86-b83c-b68182ffe0f7\") " pod="openshift-operators/observability-operator-59bdc8b94-xp2x8" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.284387 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc6e5aa8-2e36-4b86-b83c-b68182ffe0f7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-xp2x8\" (UID: \"bc6e5aa8-2e36-4b86-b83c-b68182ffe0f7\") " pod="openshift-operators/observability-operator-59bdc8b94-xp2x8" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.290118 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-v6wgl"] Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.306725 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc6e5aa8-2e36-4b86-b83c-b68182ffe0f7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-xp2x8\" (UID: \"bc6e5aa8-2e36-4b86-b83c-b68182ffe0f7\") " pod="openshift-operators/observability-operator-59bdc8b94-xp2x8" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.323981 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m84b\" (UniqueName: 
\"kubernetes.io/projected/bc6e5aa8-2e36-4b86-b83c-b68182ffe0f7-kube-api-access-7m84b\") pod \"observability-operator-59bdc8b94-xp2x8\" (UID: \"bc6e5aa8-2e36-4b86-b83c-b68182ffe0f7\") " pod="openshift-operators/observability-operator-59bdc8b94-xp2x8" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.326269 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-xp2x8" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.332373 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-zx67t" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.385653 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rntk\" (UniqueName: \"kubernetes.io/projected/22479fe6-fd03-45b0-8cab-a7b641134b30-kube-api-access-4rntk\") pod \"perses-operator-5bf474d74f-v6wgl\" (UID: \"22479fe6-fd03-45b0-8cab-a7b641134b30\") " pod="openshift-operators/perses-operator-5bf474d74f-v6wgl" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.385715 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/22479fe6-fd03-45b0-8cab-a7b641134b30-openshift-service-ca\") pod \"perses-operator-5bf474d74f-v6wgl\" (UID: \"22479fe6-fd03-45b0-8cab-a7b641134b30\") " pod="openshift-operators/perses-operator-5bf474d74f-v6wgl" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.490684 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rntk\" (UniqueName: \"kubernetes.io/projected/22479fe6-fd03-45b0-8cab-a7b641134b30-kube-api-access-4rntk\") pod \"perses-operator-5bf474d74f-v6wgl\" (UID: \"22479fe6-fd03-45b0-8cab-a7b641134b30\") " pod="openshift-operators/perses-operator-5bf474d74f-v6wgl" Mar 14 09:08:16 crc 
kubenswrapper[5129]: I0314 09:08:16.490804 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/22479fe6-fd03-45b0-8cab-a7b641134b30-openshift-service-ca\") pod \"perses-operator-5bf474d74f-v6wgl\" (UID: \"22479fe6-fd03-45b0-8cab-a7b641134b30\") " pod="openshift-operators/perses-operator-5bf474d74f-v6wgl" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.493313 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/22479fe6-fd03-45b0-8cab-a7b641134b30-openshift-service-ca\") pod \"perses-operator-5bf474d74f-v6wgl\" (UID: \"22479fe6-fd03-45b0-8cab-a7b641134b30\") " pod="openshift-operators/perses-operator-5bf474d74f-v6wgl" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.519170 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rntk\" (UniqueName: \"kubernetes.io/projected/22479fe6-fd03-45b0-8cab-a7b641134b30-kube-api-access-4rntk\") pod \"perses-operator-5bf474d74f-v6wgl\" (UID: \"22479fe6-fd03-45b0-8cab-a7b641134b30\") " pod="openshift-operators/perses-operator-5bf474d74f-v6wgl" Mar 14 09:08:16 crc kubenswrapper[5129]: I0314 09:08:16.728197 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-v6wgl" Mar 14 09:08:18 crc kubenswrapper[5129]: I0314 09:08:17.365393 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rst7l"] Mar 14 09:08:18 crc kubenswrapper[5129]: I0314 09:08:17.376970 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-dscrs"] Mar 14 09:08:18 crc kubenswrapper[5129]: I0314 09:08:18.139968 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-dscrs" event={"ID":"b056df4d-914e-45a6-8b07-fb2565d30c6a","Type":"ContainerStarted","Data":"58b04141cf1888032d36febf5458bb23554960a9175f969d123431d26637fa82"} Mar 14 09:08:18 crc kubenswrapper[5129]: I0314 09:08:18.146967 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rst7l" event={"ID":"cc3d7fc0-4c50-42d1-984a-822d52e9ce6f","Type":"ContainerStarted","Data":"b0504e4a5ac7c0c112d35c9b8c4417be2685db98c3c904516220bb50abab5309"} Mar 14 09:08:18 crc kubenswrapper[5129]: I0314 09:08:18.316618 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-xp2x8"] Mar 14 09:08:18 crc kubenswrapper[5129]: I0314 09:08:18.334748 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-zx67t"] Mar 14 09:08:18 crc kubenswrapper[5129]: I0314 09:08:18.520346 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-v6wgl"] Mar 14 09:08:18 crc kubenswrapper[5129]: W0314 09:08:18.595086 5129 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22479fe6_fd03_45b0_8cab_a7b641134b30.slice/crio-03b0726b550c41a0cb862f770509d4ca90dcf6a987efd297cc84ae3ed229b2c0 WatchSource:0}: Error finding container 03b0726b550c41a0cb862f770509d4ca90dcf6a987efd297cc84ae3ed229b2c0: Status 404 returned error can't find the container with id 03b0726b550c41a0cb862f770509d4ca90dcf6a987efd297cc84ae3ed229b2c0 Mar 14 09:08:19 crc kubenswrapper[5129]: I0314 09:08:19.161807 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-xp2x8" event={"ID":"bc6e5aa8-2e36-4b86-b83c-b68182ffe0f7","Type":"ContainerStarted","Data":"ee44af8b70577b8ed2ed0746bf0f87e237263803962d06073400c85fede58034"} Mar 14 09:08:19 crc kubenswrapper[5129]: I0314 09:08:19.163707 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-v6wgl" event={"ID":"22479fe6-fd03-45b0-8cab-a7b641134b30","Type":"ContainerStarted","Data":"03b0726b550c41a0cb862f770509d4ca90dcf6a987efd297cc84ae3ed229b2c0"} Mar 14 09:08:19 crc kubenswrapper[5129]: I0314 09:08:19.165505 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-zx67t" event={"ID":"e92945a5-6bd7-41ad-a62e-97f681d79bef","Type":"ContainerStarted","Data":"2b86afcc45df78257f4e32e4eada6591ec01158da62cebd1d520e0298f086d87"} Mar 14 09:08:19 crc kubenswrapper[5129]: I0314 09:08:19.574515 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:08:19 crc kubenswrapper[5129]: I0314 09:08:19.574623 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:08:29 crc kubenswrapper[5129]: I0314 09:08:29.771182 5129 scope.go:117] "RemoveContainer" containerID="c75ba01acb4672caba99ec3711556674795f55f48bb665875d0d94d48f1f3c9e" Mar 14 09:08:31 crc kubenswrapper[5129]: I0314 09:08:31.769684 5129 scope.go:117] "RemoveContainer" containerID="0e0608d16261557b94aa308a8804718cfe2aa72a71bb674e79a5a9e20e88b097" Mar 14 09:08:31 crc kubenswrapper[5129]: I0314 09:08:31.911373 5129 scope.go:117] "RemoveContainer" containerID="4727a4f876c81f431758412f374975c2b24c9697056e67a73dfbbaa41aca8019" Mar 14 09:08:32 crc kubenswrapper[5129]: I0314 09:08:32.332638 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-v6wgl" event={"ID":"22479fe6-fd03-45b0-8cab-a7b641134b30","Type":"ContainerStarted","Data":"937352ff0a76efe32f08fd4df00719de01f727861b4d3473aafc733100ada90e"} Mar 14 09:08:32 crc kubenswrapper[5129]: I0314 09:08:32.333045 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-v6wgl" Mar 14 09:08:32 crc kubenswrapper[5129]: I0314 09:08:32.358060 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-v6wgl" podStartSLOduration=3.015243921 podStartE2EDuration="16.358035841s" podCreationTimestamp="2026-03-14 09:08:16 +0000 UTC" firstStartedPulling="2026-03-14 09:08:18.59732543 +0000 UTC m=+7761.349240614" lastFinishedPulling="2026-03-14 09:08:31.94011735 +0000 UTC m=+7774.692032534" observedRunningTime="2026-03-14 09:08:32.352405288 +0000 UTC m=+7775.104320492" watchObservedRunningTime="2026-03-14 09:08:32.358035841 +0000 UTC m=+7775.109951025" Mar 14 09:08:33 crc kubenswrapper[5129]: I0314 09:08:33.346894 5129 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-zx67t" event={"ID":"e92945a5-6bd7-41ad-a62e-97f681d79bef","Type":"ContainerStarted","Data":"2b005087e1b31b49a10dd8b0b04d86feabd2605ac4e554caff8935d825972224"} Mar 14 09:08:33 crc kubenswrapper[5129]: I0314 09:08:33.349797 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rst7l" event={"ID":"cc3d7fc0-4c50-42d1-984a-822d52e9ce6f","Type":"ContainerStarted","Data":"5bd300744d909610030cf94ad2ef7a9002528b4c283441db57fa4e43801afa92"} Mar 14 09:08:33 crc kubenswrapper[5129]: I0314 09:08:33.352698 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-dscrs" event={"ID":"b056df4d-914e-45a6-8b07-fb2565d30c6a","Type":"ContainerStarted","Data":"2e6bfe60e5010a3237b463893260f8b6c7c6d8bc96febf3259e00920b1ee50dd"} Mar 14 09:08:33 crc kubenswrapper[5129]: I0314 09:08:33.360632 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-xp2x8" event={"ID":"bc6e5aa8-2e36-4b86-b83c-b68182ffe0f7","Type":"ContainerStarted","Data":"0f4d7e67bfdea1bc6964ec548f3ab82b68da5e00dbbc6c7ee01cd07dae106571"} Mar 14 09:08:33 crc kubenswrapper[5129]: I0314 09:08:33.360681 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-xp2x8" Mar 14 09:08:33 crc kubenswrapper[5129]: I0314 09:08:33.366239 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-xp2x8" Mar 14 09:08:33 crc kubenswrapper[5129]: I0314 09:08:33.461924 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rst7l" podStartSLOduration=3.9037809660000002 podStartE2EDuration="18.461893056s" podCreationTimestamp="2026-03-14 
09:08:15 +0000 UTC" firstStartedPulling="2026-03-14 09:08:17.38200817 +0000 UTC m=+7760.133923354" lastFinishedPulling="2026-03-14 09:08:31.94012026 +0000 UTC m=+7774.692035444" observedRunningTime="2026-03-14 09:08:33.439896059 +0000 UTC m=+7776.191811243" watchObservedRunningTime="2026-03-14 09:08:33.461893056 +0000 UTC m=+7776.213808260" Mar 14 09:08:33 crc kubenswrapper[5129]: I0314 09:08:33.475420 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-zx67t" podStartSLOduration=4.889302509 podStartE2EDuration="18.475377092s" podCreationTimestamp="2026-03-14 09:08:15 +0000 UTC" firstStartedPulling="2026-03-14 09:08:18.354026427 +0000 UTC m=+7761.105941611" lastFinishedPulling="2026-03-14 09:08:31.94010101 +0000 UTC m=+7774.692016194" observedRunningTime="2026-03-14 09:08:33.405113946 +0000 UTC m=+7776.157029130" watchObservedRunningTime="2026-03-14 09:08:33.475377092 +0000 UTC m=+7776.227292286" Mar 14 09:08:33 crc kubenswrapper[5129]: I0314 09:08:33.512779 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-xp2x8" podStartSLOduration=4.89888157 podStartE2EDuration="18.512766757s" podCreationTimestamp="2026-03-14 09:08:15 +0000 UTC" firstStartedPulling="2026-03-14 09:08:18.345306081 +0000 UTC m=+7761.097221265" lastFinishedPulling="2026-03-14 09:08:31.959191268 +0000 UTC m=+7774.711106452" observedRunningTime="2026-03-14 09:08:33.512341635 +0000 UTC m=+7776.264256819" watchObservedRunningTime="2026-03-14 09:08:33.512766757 +0000 UTC m=+7776.264681941" Mar 14 09:08:33 crc kubenswrapper[5129]: I0314 09:08:33.565554 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-75c9666696-dscrs" podStartSLOduration=3.979546961 podStartE2EDuration="18.565522918s" podCreationTimestamp="2026-03-14 09:08:15 +0000 UTC" 
firstStartedPulling="2026-03-14 09:08:17.373970041 +0000 UTC m=+7760.125885225" lastFinishedPulling="2026-03-14 09:08:31.959945998 +0000 UTC m=+7774.711861182" observedRunningTime="2026-03-14 09:08:33.550180853 +0000 UTC m=+7776.302096027" watchObservedRunningTime="2026-03-14 09:08:33.565522918 +0000 UTC m=+7776.317438102" Mar 14 09:08:37 crc kubenswrapper[5129]: I0314 09:08:37.087558 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-670c-account-create-update-7bmhp"] Mar 14 09:08:37 crc kubenswrapper[5129]: I0314 09:08:37.096238 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-qss72"] Mar 14 09:08:37 crc kubenswrapper[5129]: I0314 09:08:37.105290 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-670c-account-create-update-7bmhp"] Mar 14 09:08:37 crc kubenswrapper[5129]: I0314 09:08:37.113478 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qss72"] Mar 14 09:08:38 crc kubenswrapper[5129]: I0314 09:08:38.047650 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e0ce82f-aea7-41ff-abe9-fae70dc076ed" path="/var/lib/kubelet/pods/0e0ce82f-aea7-41ff-abe9-fae70dc076ed/volumes" Mar 14 09:08:38 crc kubenswrapper[5129]: I0314 09:08:38.050104 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0" path="/var/lib/kubelet/pods/ad84f3cc-fe3f-4834-8d4b-085e3e46e3b0/volumes" Mar 14 09:08:46 crc kubenswrapper[5129]: I0314 09:08:46.733118 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-v6wgl" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.575067 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.575746 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.646779 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.647067 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="092fbd6e-d672-4e5b-8513-ac36f7b7615d" containerName="openstackclient" containerID="cri-o://8d34aebd949e9603448c825fba99cca36d7eeee59c17c94c872162391000965e" gracePeriod=2 Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.660153 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.695572 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 14 09:08:49 crc kubenswrapper[5129]: E0314 09:08:49.696179 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="092fbd6e-d672-4e5b-8513-ac36f7b7615d" containerName="openstackclient" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.696203 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="092fbd6e-d672-4e5b-8513-ac36f7b7615d" containerName="openstackclient" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.696416 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="092fbd6e-d672-4e5b-8513-ac36f7b7615d" containerName="openstackclient" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.701551 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.706899 5129 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="092fbd6e-d672-4e5b-8513-ac36f7b7615d" podUID="8036b698-3fea-4f7d-8ebc-9959d0d8757c" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.711986 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.808489 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8036b698-3fea-4f7d-8ebc-9959d0d8757c-openstack-config\") pod \"openstackclient\" (UID: \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\") " pod="openstack/openstackclient" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.808854 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8036b698-3fea-4f7d-8ebc-9959d0d8757c-openstack-config-secret\") pod \"openstackclient\" (UID: \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\") " pod="openstack/openstackclient" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.808893 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8036b698-3fea-4f7d-8ebc-9959d0d8757c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\") " pod="openstack/openstackclient" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.809140 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chstl\" (UniqueName: \"kubernetes.io/projected/8036b698-3fea-4f7d-8ebc-9959d0d8757c-kube-api-access-chstl\") pod \"openstackclient\" (UID: 
\"8036b698-3fea-4f7d-8ebc-9959d0d8757c\") " pod="openstack/openstackclient" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.909509 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.911249 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8036b698-3fea-4f7d-8ebc-9959d0d8757c-openstack-config\") pod \"openstackclient\" (UID: \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\") " pod="openstack/openstackclient" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.911322 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8036b698-3fea-4f7d-8ebc-9959d0d8757c-openstack-config-secret\") pod \"openstackclient\" (UID: \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\") " pod="openstack/openstackclient" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.911349 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8036b698-3fea-4f7d-8ebc-9959d0d8757c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\") " pod="openstack/openstackclient" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.911446 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chstl\" (UniqueName: \"kubernetes.io/projected/8036b698-3fea-4f7d-8ebc-9959d0d8757c-kube-api-access-chstl\") pod \"openstackclient\" (UID: \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\") " pod="openstack/openstackclient" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.912882 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8036b698-3fea-4f7d-8ebc-9959d0d8757c-openstack-config\") pod \"openstackclient\" 
(UID: \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\") " pod="openstack/openstackclient" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.911292 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.922853 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-74hlz" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.923335 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8036b698-3fea-4f7d-8ebc-9959d0d8757c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\") " pod="openstack/openstackclient" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.944473 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.953208 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8036b698-3fea-4f7d-8ebc-9959d0d8757c-openstack-config-secret\") pod \"openstackclient\" (UID: \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\") " pod="openstack/openstackclient" Mar 14 09:08:49 crc kubenswrapper[5129]: I0314 09:08:49.966061 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chstl\" (UniqueName: \"kubernetes.io/projected/8036b698-3fea-4f7d-8ebc-9959d0d8757c-kube-api-access-chstl\") pod \"openstackclient\" (UID: \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\") " pod="openstack/openstackclient" Mar 14 09:08:50 crc kubenswrapper[5129]: I0314 09:08:50.046856 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 14 09:08:50 crc kubenswrapper[5129]: I0314 09:08:50.131946 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvtxv\" (UniqueName: \"kubernetes.io/projected/318cba12-018f-4218-9fb6-6e3f1c5e2970-kube-api-access-hvtxv\") pod \"kube-state-metrics-0\" (UID: \"318cba12-018f-4218-9fb6-6e3f1c5e2970\") " pod="openstack/kube-state-metrics-0" Mar 14 09:08:50 crc kubenswrapper[5129]: I0314 09:08:50.234835 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvtxv\" (UniqueName: \"kubernetes.io/projected/318cba12-018f-4218-9fb6-6e3f1c5e2970-kube-api-access-hvtxv\") pod \"kube-state-metrics-0\" (UID: \"318cba12-018f-4218-9fb6-6e3f1c5e2970\") " pod="openstack/kube-state-metrics-0" Mar 14 09:08:50 crc kubenswrapper[5129]: I0314 09:08:50.321135 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvtxv\" (UniqueName: \"kubernetes.io/projected/318cba12-018f-4218-9fb6-6e3f1c5e2970-kube-api-access-hvtxv\") pod \"kube-state-metrics-0\" (UID: \"318cba12-018f-4218-9fb6-6e3f1c5e2970\") " pod="openstack/kube-state-metrics-0" Mar 14 09:08:50 crc kubenswrapper[5129]: I0314 09:08:50.388934 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.007050 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.014361 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.020159 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.020443 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-h8xws" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.020917 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.030064 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.030261 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.074201 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.171747 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f08af818-92ad-48e1-b7f4-7e02562a816a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.171896 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f08af818-92ad-48e1-b7f4-7e02562a816a-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 
crc kubenswrapper[5129]: I0314 09:08:51.171983 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f08af818-92ad-48e1-b7f4-7e02562a816a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.172034 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f08af818-92ad-48e1-b7f4-7e02562a816a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.172065 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wxfr\" (UniqueName: \"kubernetes.io/projected/f08af818-92ad-48e1-b7f4-7e02562a816a-kube-api-access-4wxfr\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.172150 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f08af818-92ad-48e1-b7f4-7e02562a816a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.172289 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f08af818-92ad-48e1-b7f4-7e02562a816a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: 
\"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.278574 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f08af818-92ad-48e1-b7f4-7e02562a816a-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.278849 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f08af818-92ad-48e1-b7f4-7e02562a816a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.278892 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f08af818-92ad-48e1-b7f4-7e02562a816a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.278922 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wxfr\" (UniqueName: \"kubernetes.io/projected/f08af818-92ad-48e1-b7f4-7e02562a816a-kube-api-access-4wxfr\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.278971 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f08af818-92ad-48e1-b7f4-7e02562a816a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: 
\"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.279028 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f08af818-92ad-48e1-b7f4-7e02562a816a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.279085 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f08af818-92ad-48e1-b7f4-7e02562a816a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.280105 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f08af818-92ad-48e1-b7f4-7e02562a816a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.303457 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f08af818-92ad-48e1-b7f4-7e02562a816a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.304070 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f08af818-92ad-48e1-b7f4-7e02562a816a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " 
pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.314091 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f08af818-92ad-48e1-b7f4-7e02562a816a-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.319429 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wxfr\" (UniqueName: \"kubernetes.io/projected/f08af818-92ad-48e1-b7f4-7e02562a816a-kube-api-access-4wxfr\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.326174 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f08af818-92ad-48e1-b7f4-7e02562a816a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.332022 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.334638 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.351704 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f08af818-92ad-48e1-b7f4-7e02562a816a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f08af818-92ad-48e1-b7f4-7e02562a816a\") " pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.354195 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-566s7" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.354436 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.354563 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.354683 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.354787 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.354941 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.355086 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.355260 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.378743 5129 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.415392 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.466366 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.488829 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f93209f3-9bdb-4c13-973a-2c3960f3cf92-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.490932 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr5ql\" (UniqueName: \"kubernetes.io/projected/f93209f3-9bdb-4c13-973a-2c3960f3cf92-kube-api-access-sr5ql\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.491092 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.491169 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f93209f3-9bdb-4c13-973a-2c3960f3cf92-web-config\") pod \"prometheus-metric-storage-0\" 
(UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.491239 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f93209f3-9bdb-4c13-973a-2c3960f3cf92-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.491351 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f93209f3-9bdb-4c13-973a-2c3960f3cf92-config\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.491482 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f93209f3-9bdb-4c13-973a-2c3960f3cf92-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.491748 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f93209f3-9bdb-4c13-973a-2c3960f3cf92-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.491999 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f93209f3-9bdb-4c13-973a-2c3960f3cf92-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.492175 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f93209f3-9bdb-4c13-973a-2c3960f3cf92-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.568413 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.584059 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8036b698-3fea-4f7d-8ebc-9959d0d8757c","Type":"ContainerStarted","Data":"e6fc6128ccd0092b3dbeb6679e15f162f60c520ab62b8e2a7c2badc7ebc5c54d"} Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.594188 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f93209f3-9bdb-4c13-973a-2c3960f3cf92-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.595356 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f93209f3-9bdb-4c13-973a-2c3960f3cf92-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " 
pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.595476 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f93209f3-9bdb-4c13-973a-2c3960f3cf92-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.595572 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f93209f3-9bdb-4c13-973a-2c3960f3cf92-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.596089 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f93209f3-9bdb-4c13-973a-2c3960f3cf92-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.596205 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr5ql\" (UniqueName: \"kubernetes.io/projected/f93209f3-9bdb-4c13-973a-2c3960f3cf92-kube-api-access-sr5ql\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.596285 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\") pod \"prometheus-metric-storage-0\" (UID: 
\"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.596359 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f93209f3-9bdb-4c13-973a-2c3960f3cf92-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.602787 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f93209f3-9bdb-4c13-973a-2c3960f3cf92-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.604220 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f93209f3-9bdb-4c13-973a-2c3960f3cf92-config\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.599525 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f93209f3-9bdb-4c13-973a-2c3960f3cf92-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.602849 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f93209f3-9bdb-4c13-973a-2c3960f3cf92-config-out\") pod \"prometheus-metric-storage-0\" (UID: 
\"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.599111 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f93209f3-9bdb-4c13-973a-2c3960f3cf92-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.603468 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f93209f3-9bdb-4c13-973a-2c3960f3cf92-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.601810 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f93209f3-9bdb-4c13-973a-2c3960f3cf92-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.609277 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f93209f3-9bdb-4c13-973a-2c3960f3cf92-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.613364 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f93209f3-9bdb-4c13-973a-2c3960f3cf92-web-config\") pod \"prometheus-metric-storage-0\" 
(UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.618456 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f93209f3-9bdb-4c13-973a-2c3960f3cf92-config\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.650124 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr5ql\" (UniqueName: \"kubernetes.io/projected/f93209f3-9bdb-4c13-973a-2c3960f3cf92-kube-api-access-sr5ql\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.651687 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.651727 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/201f60d526c983c528086870f6a56125f3223dbe138095610707f9d15227e9fd/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.844669 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\") pod \"prometheus-metric-storage-0\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:51 crc kubenswrapper[5129]: I0314 09:08:51.897976 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.169479 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.189393 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.231791 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092fbd6e-d672-4e5b-8513-ac36f7b7615d-combined-ca-bundle\") pod \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\" (UID: \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\") " Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.231884 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/092fbd6e-d672-4e5b-8513-ac36f7b7615d-openstack-config\") pod \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\" (UID: \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\") " Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.232131 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzghg\" (UniqueName: \"kubernetes.io/projected/092fbd6e-d672-4e5b-8513-ac36f7b7615d-kube-api-access-lzghg\") pod \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\" (UID: \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\") " Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.232275 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/092fbd6e-d672-4e5b-8513-ac36f7b7615d-openstack-config-secret\") pod \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\" (UID: \"092fbd6e-d672-4e5b-8513-ac36f7b7615d\") " Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.241964 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/092fbd6e-d672-4e5b-8513-ac36f7b7615d-kube-api-access-lzghg" (OuterVolumeSpecName: "kube-api-access-lzghg") pod "092fbd6e-d672-4e5b-8513-ac36f7b7615d" (UID: 
"092fbd6e-d672-4e5b-8513-ac36f7b7615d"). InnerVolumeSpecName "kube-api-access-lzghg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.278181 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/092fbd6e-d672-4e5b-8513-ac36f7b7615d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "092fbd6e-d672-4e5b-8513-ac36f7b7615d" (UID: "092fbd6e-d672-4e5b-8513-ac36f7b7615d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.295723 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092fbd6e-d672-4e5b-8513-ac36f7b7615d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "092fbd6e-d672-4e5b-8513-ac36f7b7615d" (UID: "092fbd6e-d672-4e5b-8513-ac36f7b7615d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.341172 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzghg\" (UniqueName: \"kubernetes.io/projected/092fbd6e-d672-4e5b-8513-ac36f7b7615d-kube-api-access-lzghg\") on node \"crc\" DevicePath \"\"" Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.341205 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092fbd6e-d672-4e5b-8513-ac36f7b7615d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.341220 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/092fbd6e-d672-4e5b-8513-ac36f7b7615d-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.352878 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/092fbd6e-d672-4e5b-8513-ac36f7b7615d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "092fbd6e-d672-4e5b-8513-ac36f7b7615d" (UID: "092fbd6e-d672-4e5b-8513-ac36f7b7615d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.447074 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/092fbd6e-d672-4e5b-8513-ac36f7b7615d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 14 09:08:52 crc kubenswrapper[5129]: W0314 09:08:52.542188 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf93209f3_9bdb_4c13_973a_2c3960f3cf92.slice/crio-2e2b27857d6b0bd05be47a71b7b7c94851f1468fc8e4d35c3bdd6752a5e3b010 WatchSource:0}: Error finding container 2e2b27857d6b0bd05be47a71b7b7c94851f1468fc8e4d35c3bdd6752a5e3b010: Status 404 returned error can't find the container with id 2e2b27857d6b0bd05be47a71b7b7c94851f1468fc8e4d35c3bdd6752a5e3b010 Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.544517 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.592990 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8036b698-3fea-4f7d-8ebc-9959d0d8757c","Type":"ContainerStarted","Data":"fb03c3cec9764a12b191577be99f00ba49201ba759538060a694ede08cf51064"} Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.594616 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"f08af818-92ad-48e1-b7f4-7e02562a816a","Type":"ContainerStarted","Data":"3114ecf67e4fe366ec27363e0f779db6d7e428ebadc3dd86366b208701a1f74e"} Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.596096 
5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"318cba12-018f-4218-9fb6-6e3f1c5e2970","Type":"ContainerStarted","Data":"0861d3de50b790da5f4a9a807f4108900a0f99e60667fb8a6f3bb3ca2babb3f4"} Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.596145 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"318cba12-018f-4218-9fb6-6e3f1c5e2970","Type":"ContainerStarted","Data":"e307039545896866f50faa1a01f6cffa3348bc3cf1e13c1729d2d42b2a6497c9"} Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.596206 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.597313 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f93209f3-9bdb-4c13-973a-2c3960f3cf92","Type":"ContainerStarted","Data":"2e2b27857d6b0bd05be47a71b7b7c94851f1468fc8e4d35c3bdd6752a5e3b010"} Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.599018 5129 generic.go:334] "Generic (PLEG): container finished" podID="092fbd6e-d672-4e5b-8513-ac36f7b7615d" containerID="8d34aebd949e9603448c825fba99cca36d7eeee59c17c94c872162391000965e" exitCode=137 Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.599068 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.599084 5129 scope.go:117] "RemoveContainer" containerID="8d34aebd949e9603448c825fba99cca36d7eeee59c17c94c872162391000965e" Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.621668 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.621586861 podStartE2EDuration="3.621586861s" podCreationTimestamp="2026-03-14 09:08:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:08:52.611182588 +0000 UTC m=+7795.363097782" watchObservedRunningTime="2026-03-14 09:08:52.621586861 +0000 UTC m=+7795.373502035" Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.627093 5129 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="092fbd6e-d672-4e5b-8513-ac36f7b7615d" podUID="8036b698-3fea-4f7d-8ebc-9959d0d8757c" Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.638358 5129 scope.go:117] "RemoveContainer" containerID="8d34aebd949e9603448c825fba99cca36d7eeee59c17c94c872162391000965e" Mar 14 09:08:52 crc kubenswrapper[5129]: E0314 09:08:52.638865 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d34aebd949e9603448c825fba99cca36d7eeee59c17c94c872162391000965e\": container with ID starting with 8d34aebd949e9603448c825fba99cca36d7eeee59c17c94c872162391000965e not found: ID does not exist" containerID="8d34aebd949e9603448c825fba99cca36d7eeee59c17c94c872162391000965e" Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.638898 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d34aebd949e9603448c825fba99cca36d7eeee59c17c94c872162391000965e"} err="failed to get container status 
\"8d34aebd949e9603448c825fba99cca36d7eeee59c17c94c872162391000965e\": rpc error: code = NotFound desc = could not find container \"8d34aebd949e9603448c825fba99cca36d7eeee59c17c94c872162391000965e\": container with ID starting with 8d34aebd949e9603448c825fba99cca36d7eeee59c17c94c872162391000965e not found: ID does not exist" Mar 14 09:08:52 crc kubenswrapper[5129]: I0314 09:08:52.653847 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.148532913 podStartE2EDuration="3.653823215s" podCreationTimestamp="2026-03-14 09:08:49 +0000 UTC" firstStartedPulling="2026-03-14 09:08:51.627905985 +0000 UTC m=+7794.379821159" lastFinishedPulling="2026-03-14 09:08:52.133196277 +0000 UTC m=+7794.885111461" observedRunningTime="2026-03-14 09:08:52.623915313 +0000 UTC m=+7795.375830487" watchObservedRunningTime="2026-03-14 09:08:52.653823215 +0000 UTC m=+7795.405738399" Mar 14 09:08:54 crc kubenswrapper[5129]: I0314 09:08:54.064109 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="092fbd6e-d672-4e5b-8513-ac36f7b7615d" path="/var/lib/kubelet/pods/092fbd6e-d672-4e5b-8513-ac36f7b7615d/volumes" Mar 14 09:08:58 crc kubenswrapper[5129]: I0314 09:08:58.668994 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f93209f3-9bdb-4c13-973a-2c3960f3cf92","Type":"ContainerStarted","Data":"24a131c394ce5ea99cb689c7ce13e93a139d28f11f18f547ddd9d64bc7c10fb8"} Mar 14 09:08:59 crc kubenswrapper[5129]: I0314 09:08:59.678174 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"f08af818-92ad-48e1-b7f4-7e02562a816a","Type":"ContainerStarted","Data":"8206dd1b51a6e64f60ba62838c86c6469bd29a81ac61334a776aefd2c8008358"} Mar 14 09:09:00 crc kubenswrapper[5129]: I0314 09:09:00.393402 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" 
Mar 14 09:09:05 crc kubenswrapper[5129]: I0314 09:09:05.749513 5129 generic.go:334] "Generic (PLEG): container finished" podID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" containerID="24a131c394ce5ea99cb689c7ce13e93a139d28f11f18f547ddd9d64bc7c10fb8" exitCode=0 Mar 14 09:09:05 crc kubenswrapper[5129]: I0314 09:09:05.749628 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f93209f3-9bdb-4c13-973a-2c3960f3cf92","Type":"ContainerDied","Data":"24a131c394ce5ea99cb689c7ce13e93a139d28f11f18f547ddd9d64bc7c10fb8"} Mar 14 09:09:05 crc kubenswrapper[5129]: I0314 09:09:05.753869 5129 generic.go:334] "Generic (PLEG): container finished" podID="f08af818-92ad-48e1-b7f4-7e02562a816a" containerID="8206dd1b51a6e64f60ba62838c86c6469bd29a81ac61334a776aefd2c8008358" exitCode=0 Mar 14 09:09:05 crc kubenswrapper[5129]: I0314 09:09:05.753968 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"f08af818-92ad-48e1-b7f4-7e02562a816a","Type":"ContainerDied","Data":"8206dd1b51a6e64f60ba62838c86c6469bd29a81ac61334a776aefd2c8008358"} Mar 14 09:09:09 crc kubenswrapper[5129]: I0314 09:09:09.798269 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"f08af818-92ad-48e1-b7f4-7e02562a816a","Type":"ContainerStarted","Data":"475f89885ca0567789342eb8f27debb04ae73ef7e08273c4f8d338b3597693a8"} Mar 14 09:09:14 crc kubenswrapper[5129]: I0314 09:09:14.878301 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"f08af818-92ad-48e1-b7f4-7e02562a816a","Type":"ContainerStarted","Data":"a765c669b013585e57db58225f1d5fb2c24eb5b7983b02a0b81722d17ceb007e"} Mar 14 09:09:14 crc kubenswrapper[5129]: I0314 09:09:14.879041 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Mar 14 09:09:14 crc kubenswrapper[5129]: I0314 
09:09:14.883727 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Mar 14 09:09:14 crc kubenswrapper[5129]: I0314 09:09:14.911026 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=8.606809771 podStartE2EDuration="24.910997875s" podCreationTimestamp="2026-03-14 09:08:50 +0000 UTC" firstStartedPulling="2026-03-14 09:08:52.215573923 +0000 UTC m=+7794.967489107" lastFinishedPulling="2026-03-14 09:09:08.519762027 +0000 UTC m=+7811.271677211" observedRunningTime="2026-03-14 09:09:14.898669191 +0000 UTC m=+7817.650584425" watchObservedRunningTime="2026-03-14 09:09:14.910997875 +0000 UTC m=+7817.662913079" Mar 14 09:09:15 crc kubenswrapper[5129]: I0314 09:09:15.887897 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f93209f3-9bdb-4c13-973a-2c3960f3cf92","Type":"ContainerStarted","Data":"adeaabe649c5f9fc6e74965c100d8dc7d4e8522b464b16abb8f2e08ec6f21599"} Mar 14 09:09:18 crc kubenswrapper[5129]: I0314 09:09:18.922782 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f93209f3-9bdb-4c13-973a-2c3960f3cf92","Type":"ContainerStarted","Data":"a006a2b463c1141270b951f0c6e1d4f8b2d390a4321cfdc8dd255597c6c4f482"} Mar 14 09:09:19 crc kubenswrapper[5129]: I0314 09:09:19.574842 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:09:19 crc kubenswrapper[5129]: I0314 09:09:19.574981 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:09:19 crc kubenswrapper[5129]: I0314 09:09:19.575093 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 09:09:19 crc kubenswrapper[5129]: I0314 09:09:19.577018 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:09:19 crc kubenswrapper[5129]: I0314 09:09:19.577157 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" gracePeriod=600 Mar 14 09:09:19 crc kubenswrapper[5129]: E0314 09:09:19.704673 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:09:19 crc kubenswrapper[5129]: I0314 09:09:19.943699 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" exitCode=0 Mar 14 09:09:19 crc kubenswrapper[5129]: I0314 
09:09:19.943771 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e"} Mar 14 09:09:19 crc kubenswrapper[5129]: I0314 09:09:19.943859 5129 scope.go:117] "RemoveContainer" containerID="3d4cd803a025626b83183f6bb1c4666176a5fbed36f07324bbfa359be7b98a8d" Mar 14 09:09:19 crc kubenswrapper[5129]: I0314 09:09:19.945395 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:09:19 crc kubenswrapper[5129]: E0314 09:09:19.946088 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:09:21 crc kubenswrapper[5129]: I0314 09:09:21.969467 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f93209f3-9bdb-4c13-973a-2c3960f3cf92","Type":"ContainerStarted","Data":"cf4e6c2136d46fa122905e8c055ef8a52b31c5ee8159cd2c9f9dfb8e9e781d19"} Mar 14 09:09:22 crc kubenswrapper[5129]: I0314 09:09:22.011579 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=2.919845884 podStartE2EDuration="32.011538522s" podCreationTimestamp="2026-03-14 09:08:50 +0000 UTC" firstStartedPulling="2026-03-14 09:08:52.544839407 +0000 UTC m=+7795.296754591" lastFinishedPulling="2026-03-14 09:09:21.636532045 +0000 UTC m=+7824.388447229" observedRunningTime="2026-03-14 09:09:22.003078712 +0000 UTC m=+7824.754993916" 
watchObservedRunningTime="2026-03-14 09:09:22.011538522 +0000 UTC m=+7824.763453716" Mar 14 09:09:23 crc kubenswrapper[5129]: I0314 09:09:23.058697 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-2zxg6"] Mar 14 09:09:23 crc kubenswrapper[5129]: I0314 09:09:23.102266 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-2zxg6"] Mar 14 09:09:24 crc kubenswrapper[5129]: I0314 09:09:24.050320 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bebf2c98-990d-442f-9e7e-a0ee1bf73280" path="/var/lib/kubelet/pods/bebf2c98-990d-442f-9e7e-a0ee1bf73280/volumes" Mar 14 09:09:26 crc kubenswrapper[5129]: I0314 09:09:26.899144 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.037675 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:09:31 crc kubenswrapper[5129]: E0314 09:09:31.039050 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.439015 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.442508 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.445013 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.445042 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.451340 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.513028 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-config-data\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.513075 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.513105 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbfk8\" (UniqueName: \"kubernetes.io/projected/707bb11e-41bb-4fc5-9481-d0b29dce3753-kube-api-access-dbfk8\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.513330 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.513416 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/707bb11e-41bb-4fc5-9481-d0b29dce3753-log-httpd\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.513487 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/707bb11e-41bb-4fc5-9481-d0b29dce3753-run-httpd\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.513740 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-scripts\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.615442 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-scripts\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.615555 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-config-data\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.615572 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.615647 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbfk8\" (UniqueName: \"kubernetes.io/projected/707bb11e-41bb-4fc5-9481-d0b29dce3753-kube-api-access-dbfk8\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.615712 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.615737 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/707bb11e-41bb-4fc5-9481-d0b29dce3753-log-httpd\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.615765 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/707bb11e-41bb-4fc5-9481-d0b29dce3753-run-httpd\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.616230 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/707bb11e-41bb-4fc5-9481-d0b29dce3753-run-httpd\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 
crc kubenswrapper[5129]: I0314 09:09:31.617311 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/707bb11e-41bb-4fc5-9481-d0b29dce3753-log-httpd\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.623525 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-scripts\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.625186 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-config-data\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.627153 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.630742 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.646681 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbfk8\" (UniqueName: \"kubernetes.io/projected/707bb11e-41bb-4fc5-9481-d0b29dce3753-kube-api-access-dbfk8\") pod \"ceilometer-0\" (UID: 
\"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " pod="openstack/ceilometer-0" Mar 14 09:09:31 crc kubenswrapper[5129]: I0314 09:09:31.762275 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 09:09:32 crc kubenswrapper[5129]: I0314 09:09:32.260408 5129 scope.go:117] "RemoveContainer" containerID="1f3c731844a0a51497e284a579f1961efea26e1c4cce48a26e6d64a5fe3690bb" Mar 14 09:09:32 crc kubenswrapper[5129]: I0314 09:09:32.287284 5129 scope.go:117] "RemoveContainer" containerID="f742be5079eb882e6135b4262407da57eb4a26191b27deacddb634561c9cb989" Mar 14 09:09:32 crc kubenswrapper[5129]: I0314 09:09:32.301273 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:09:32 crc kubenswrapper[5129]: W0314 09:09:32.302348 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod707bb11e_41bb_4fc5_9481_d0b29dce3753.slice/crio-aa5b9048b48d12469d981e476e73281aef73dc37021690e85a63958e7c3ccc13 WatchSource:0}: Error finding container aa5b9048b48d12469d981e476e73281aef73dc37021690e85a63958e7c3ccc13: Status 404 returned error can't find the container with id aa5b9048b48d12469d981e476e73281aef73dc37021690e85a63958e7c3ccc13 Mar 14 09:09:32 crc kubenswrapper[5129]: I0314 09:09:32.344536 5129 scope.go:117] "RemoveContainer" containerID="b4f01e7b018f2109951d066ddc8241a08eb628110228c9223f3c0295305d9cf4" Mar 14 09:09:33 crc kubenswrapper[5129]: I0314 09:09:33.088595 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"707bb11e-41bb-4fc5-9481-d0b29dce3753","Type":"ContainerStarted","Data":"aa5b9048b48d12469d981e476e73281aef73dc37021690e85a63958e7c3ccc13"} Mar 14 09:09:36 crc kubenswrapper[5129]: I0314 09:09:36.115677 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"707bb11e-41bb-4fc5-9481-d0b29dce3753","Type":"ContainerStarted","Data":"2a687bcc0d95bdcae086f9da915c54b96a6a68555a1b8aa3f24b523d2ec09657"} Mar 14 09:09:36 crc kubenswrapper[5129]: I0314 09:09:36.899121 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:36 crc kubenswrapper[5129]: I0314 09:09:36.901671 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:37 crc kubenswrapper[5129]: I0314 09:09:37.126591 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.135933 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"707bb11e-41bb-4fc5-9481-d0b29dce3753","Type":"ContainerStarted","Data":"5fe99f618439dd8e7f7684ebb5bb4e826468a4333927629fe49685014597915d"} Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.494585 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.494864 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="8036b698-3fea-4f7d-8ebc-9959d0d8757c" containerName="openstackclient" containerID="cri-o://fb03c3cec9764a12b191577be99f00ba49201ba759538060a694ede08cf51064" gracePeriod=2 Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.506800 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.535268 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 14 09:09:38 crc kubenswrapper[5129]: E0314 09:09:38.535816 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8036b698-3fea-4f7d-8ebc-9959d0d8757c" containerName="openstackclient" 
Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.535838 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8036b698-3fea-4f7d-8ebc-9959d0d8757c" containerName="openstackclient" Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.536151 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8036b698-3fea-4f7d-8ebc-9959d0d8757c" containerName="openstackclient" Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.536774 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.549875 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.570374 5129 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="8036b698-3fea-4f7d-8ebc-9959d0d8757c" podUID="ad913420-c19c-4d99-8d9c-b854a4a605d6" Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.601716 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad913420-c19c-4d99-8d9c-b854a4a605d6-openstack-config\") pod \"openstackclient\" (UID: \"ad913420-c19c-4d99-8d9c-b854a4a605d6\") " pod="openstack/openstackclient" Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.601869 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zkwm\" (UniqueName: \"kubernetes.io/projected/ad913420-c19c-4d99-8d9c-b854a4a605d6-kube-api-access-6zkwm\") pod \"openstackclient\" (UID: \"ad913420-c19c-4d99-8d9c-b854a4a605d6\") " pod="openstack/openstackclient" Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.601922 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ad913420-c19c-4d99-8d9c-b854a4a605d6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ad913420-c19c-4d99-8d9c-b854a4a605d6\") " pod="openstack/openstackclient" Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.602004 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad913420-c19c-4d99-8d9c-b854a4a605d6-openstack-config-secret\") pod \"openstackclient\" (UID: \"ad913420-c19c-4d99-8d9c-b854a4a605d6\") " pod="openstack/openstackclient" Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.704382 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad913420-c19c-4d99-8d9c-b854a4a605d6-openstack-config-secret\") pod \"openstackclient\" (UID: \"ad913420-c19c-4d99-8d9c-b854a4a605d6\") " pod="openstack/openstackclient" Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.704911 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad913420-c19c-4d99-8d9c-b854a4a605d6-openstack-config\") pod \"openstackclient\" (UID: \"ad913420-c19c-4d99-8d9c-b854a4a605d6\") " pod="openstack/openstackclient" Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.705062 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zkwm\" (UniqueName: \"kubernetes.io/projected/ad913420-c19c-4d99-8d9c-b854a4a605d6-kube-api-access-6zkwm\") pod \"openstackclient\" (UID: \"ad913420-c19c-4d99-8d9c-b854a4a605d6\") " pod="openstack/openstackclient" Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.705124 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad913420-c19c-4d99-8d9c-b854a4a605d6-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"ad913420-c19c-4d99-8d9c-b854a4a605d6\") " pod="openstack/openstackclient" Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.705806 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad913420-c19c-4d99-8d9c-b854a4a605d6-openstack-config\") pod \"openstackclient\" (UID: \"ad913420-c19c-4d99-8d9c-b854a4a605d6\") " pod="openstack/openstackclient" Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.709500 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad913420-c19c-4d99-8d9c-b854a4a605d6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ad913420-c19c-4d99-8d9c-b854a4a605d6\") " pod="openstack/openstackclient" Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.709696 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad913420-c19c-4d99-8d9c-b854a4a605d6-openstack-config-secret\") pod \"openstackclient\" (UID: \"ad913420-c19c-4d99-8d9c-b854a4a605d6\") " pod="openstack/openstackclient" Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.721578 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zkwm\" (UniqueName: \"kubernetes.io/projected/ad913420-c19c-4d99-8d9c-b854a4a605d6-kube-api-access-6zkwm\") pod \"openstackclient\" (UID: \"ad913420-c19c-4d99-8d9c-b854a4a605d6\") " pod="openstack/openstackclient" Mar 14 09:09:38 crc kubenswrapper[5129]: I0314 09:09:38.857405 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 14 09:09:39 crc kubenswrapper[5129]: I0314 09:09:39.159128 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"707bb11e-41bb-4fc5-9481-d0b29dce3753","Type":"ContainerStarted","Data":"af0fc9a09f53cf40a8651dde95fd578743aac92b1a21656d8fe8f0c95652b9fa"} Mar 14 09:09:39 crc kubenswrapper[5129]: I0314 09:09:39.441899 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 14 09:09:39 crc kubenswrapper[5129]: W0314 09:09:39.448748 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad913420_c19c_4d99_8d9c_b854a4a605d6.slice/crio-cabc0c8b711057183abc1438a0b3cd1b6739f25f7f86e7abdc3397e0d9b3b02a WatchSource:0}: Error finding container cabc0c8b711057183abc1438a0b3cd1b6739f25f7f86e7abdc3397e0d9b3b02a: Status 404 returned error can't find the container with id cabc0c8b711057183abc1438a0b3cd1b6739f25f7f86e7abdc3397e0d9b3b02a Mar 14 09:09:39 crc kubenswrapper[5129]: I0314 09:09:39.757358 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 09:09:39 crc kubenswrapper[5129]: I0314 09:09:39.757632 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" containerName="prometheus" containerID="cri-o://adeaabe649c5f9fc6e74965c100d8dc7d4e8522b464b16abb8f2e08ec6f21599" gracePeriod=600 Mar 14 09:09:39 crc kubenswrapper[5129]: I0314 09:09:39.757722 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" containerName="config-reloader" containerID="cri-o://a006a2b463c1141270b951f0c6e1d4f8b2d390a4321cfdc8dd255597c6c4f482" gracePeriod=600 Mar 14 09:09:39 crc kubenswrapper[5129]: I0314 09:09:39.757719 5129 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" containerName="thanos-sidecar" containerID="cri-o://cf4e6c2136d46fa122905e8c055ef8a52b31c5ee8159cd2c9f9dfb8e9e781d19" gracePeriod=600 Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.174268 5129 generic.go:334] "Generic (PLEG): container finished" podID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" containerID="cf4e6c2136d46fa122905e8c055ef8a52b31c5ee8159cd2c9f9dfb8e9e781d19" exitCode=0 Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.174299 5129 generic.go:334] "Generic (PLEG): container finished" podID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" containerID="a006a2b463c1141270b951f0c6e1d4f8b2d390a4321cfdc8dd255597c6c4f482" exitCode=0 Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.174307 5129 generic.go:334] "Generic (PLEG): container finished" podID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" containerID="adeaabe649c5f9fc6e74965c100d8dc7d4e8522b464b16abb8f2e08ec6f21599" exitCode=0 Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.174346 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f93209f3-9bdb-4c13-973a-2c3960f3cf92","Type":"ContainerDied","Data":"cf4e6c2136d46fa122905e8c055ef8a52b31c5ee8159cd2c9f9dfb8e9e781d19"} Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.174372 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f93209f3-9bdb-4c13-973a-2c3960f3cf92","Type":"ContainerDied","Data":"a006a2b463c1141270b951f0c6e1d4f8b2d390a4321cfdc8dd255597c6c4f482"} Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.174382 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f93209f3-9bdb-4c13-973a-2c3960f3cf92","Type":"ContainerDied","Data":"adeaabe649c5f9fc6e74965c100d8dc7d4e8522b464b16abb8f2e08ec6f21599"} Mar 
14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.176172 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ad913420-c19c-4d99-8d9c-b854a4a605d6","Type":"ContainerStarted","Data":"7f4455b35c4fcf4f291eebaeb249ed8c48172bc8976d87f67928e85fa66e1a85"} Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.176199 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ad913420-c19c-4d99-8d9c-b854a4a605d6","Type":"ContainerStarted","Data":"cabc0c8b711057183abc1438a0b3cd1b6739f25f7f86e7abdc3397e0d9b3b02a"} Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.189667 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.18964394 podStartE2EDuration="2.18964394s" podCreationTimestamp="2026-03-14 09:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:09:40.188242791 +0000 UTC m=+7842.940157975" watchObservedRunningTime="2026-03-14 09:09:40.18964394 +0000 UTC m=+7842.941559134" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.718285 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.756556 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f93209f3-9bdb-4c13-973a-2c3960f3cf92-prometheus-metric-storage-rulefiles-2\") pod \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.756692 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f93209f3-9bdb-4c13-973a-2c3960f3cf92-thanos-prometheus-http-client-file\") pod \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.756748 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr5ql\" (UniqueName: \"kubernetes.io/projected/f93209f3-9bdb-4c13-973a-2c3960f3cf92-kube-api-access-sr5ql\") pod \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.756774 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f93209f3-9bdb-4c13-973a-2c3960f3cf92-config\") pod \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.756868 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f93209f3-9bdb-4c13-973a-2c3960f3cf92-web-config\") pod \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.757022 5129 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\") pod \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.757074 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f93209f3-9bdb-4c13-973a-2c3960f3cf92-prometheus-metric-storage-rulefiles-0\") pod \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.757136 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f93209f3-9bdb-4c13-973a-2c3960f3cf92-prometheus-metric-storage-rulefiles-1\") pod \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.757252 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f93209f3-9bdb-4c13-973a-2c3960f3cf92-tls-assets\") pod \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.757344 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f93209f3-9bdb-4c13-973a-2c3960f3cf92-config-out\") pod \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\" (UID: \"f93209f3-9bdb-4c13-973a-2c3960f3cf92\") " Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.757409 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f93209f3-9bdb-4c13-973a-2c3960f3cf92-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "f93209f3-9bdb-4c13-973a-2c3960f3cf92" (UID: "f93209f3-9bdb-4c13-973a-2c3960f3cf92"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.758002 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f93209f3-9bdb-4c13-973a-2c3960f3cf92-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "f93209f3-9bdb-4c13-973a-2c3960f3cf92" (UID: "f93209f3-9bdb-4c13-973a-2c3960f3cf92"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.758231 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f93209f3-9bdb-4c13-973a-2c3960f3cf92-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "f93209f3-9bdb-4c13-973a-2c3960f3cf92" (UID: "f93209f3-9bdb-4c13-973a-2c3960f3cf92"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.790530 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93209f3-9bdb-4c13-973a-2c3960f3cf92-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f93209f3-9bdb-4c13-973a-2c3960f3cf92" (UID: "f93209f3-9bdb-4c13-973a-2c3960f3cf92"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.790648 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f93209f3-9bdb-4c13-973a-2c3960f3cf92-config-out" (OuterVolumeSpecName: "config-out") pod "f93209f3-9bdb-4c13-973a-2c3960f3cf92" (UID: "f93209f3-9bdb-4c13-973a-2c3960f3cf92"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.791340 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93209f3-9bdb-4c13-973a-2c3960f3cf92-config" (OuterVolumeSpecName: "config") pod "f93209f3-9bdb-4c13-973a-2c3960f3cf92" (UID: "f93209f3-9bdb-4c13-973a-2c3960f3cf92"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.792859 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93209f3-9bdb-4c13-973a-2c3960f3cf92-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f93209f3-9bdb-4c13-973a-2c3960f3cf92" (UID: "f93209f3-9bdb-4c13-973a-2c3960f3cf92"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.805026 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93209f3-9bdb-4c13-973a-2c3960f3cf92-kube-api-access-sr5ql" (OuterVolumeSpecName: "kube-api-access-sr5ql") pod "f93209f3-9bdb-4c13-973a-2c3960f3cf92" (UID: "f93209f3-9bdb-4c13-973a-2c3960f3cf92"). InnerVolumeSpecName "kube-api-access-sr5ql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.818192 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "f93209f3-9bdb-4c13-973a-2c3960f3cf92" (UID: "f93209f3-9bdb-4c13-973a-2c3960f3cf92"). InnerVolumeSpecName "pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.818710 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93209f3-9bdb-4c13-973a-2c3960f3cf92-web-config" (OuterVolumeSpecName: "web-config") pod "f93209f3-9bdb-4c13-973a-2c3960f3cf92" (UID: "f93209f3-9bdb-4c13-973a-2c3960f3cf92"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.861088 5129 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f93209f3-9bdb-4c13-973a-2c3960f3cf92-config-out\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.861129 5129 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f93209f3-9bdb-4c13-973a-2c3960f3cf92-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.861141 5129 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f93209f3-9bdb-4c13-973a-2c3960f3cf92-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.861151 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/f93209f3-9bdb-4c13-973a-2c3960f3cf92-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.861162 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr5ql\" (UniqueName: \"kubernetes.io/projected/f93209f3-9bdb-4c13-973a-2c3960f3cf92-kube-api-access-sr5ql\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.861196 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\") on node \"crc\" " Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.861209 5129 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f93209f3-9bdb-4c13-973a-2c3960f3cf92-web-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.861223 5129 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f93209f3-9bdb-4c13-973a-2c3960f3cf92-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.861236 5129 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f93209f3-9bdb-4c13-973a-2c3960f3cf92-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.861250 5129 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f93209f3-9bdb-4c13-973a-2c3960f3cf92-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.904961 5129 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.905197 5129 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e") on node "crc" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.918302 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.962119 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8036b698-3fea-4f7d-8ebc-9959d0d8757c-openstack-config\") pod \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\" (UID: \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\") " Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.962256 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8036b698-3fea-4f7d-8ebc-9959d0d8757c-combined-ca-bundle\") pod \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\" (UID: \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\") " Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.962447 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chstl\" (UniqueName: \"kubernetes.io/projected/8036b698-3fea-4f7d-8ebc-9959d0d8757c-kube-api-access-chstl\") pod \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\" (UID: \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\") " Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.962516 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8036b698-3fea-4f7d-8ebc-9959d0d8757c-openstack-config-secret\") pod \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\" (UID: \"8036b698-3fea-4f7d-8ebc-9959d0d8757c\") " Mar 
14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.963155 5129 reconciler_common.go:293] "Volume detached for volume \"pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:40 crc kubenswrapper[5129]: I0314 09:09:40.970857 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8036b698-3fea-4f7d-8ebc-9959d0d8757c-kube-api-access-chstl" (OuterVolumeSpecName: "kube-api-access-chstl") pod "8036b698-3fea-4f7d-8ebc-9959d0d8757c" (UID: "8036b698-3fea-4f7d-8ebc-9959d0d8757c"). InnerVolumeSpecName "kube-api-access-chstl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:40.996571 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8036b698-3fea-4f7d-8ebc-9959d0d8757c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8036b698-3fea-4f7d-8ebc-9959d0d8757c" (UID: "8036b698-3fea-4f7d-8ebc-9959d0d8757c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.006254 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8036b698-3fea-4f7d-8ebc-9959d0d8757c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "8036b698-3fea-4f7d-8ebc-9959d0d8757c" (UID: "8036b698-3fea-4f7d-8ebc-9959d0d8757c"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.026167 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8036b698-3fea-4f7d-8ebc-9959d0d8757c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "8036b698-3fea-4f7d-8ebc-9959d0d8757c" (UID: "8036b698-3fea-4f7d-8ebc-9959d0d8757c"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.065904 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8036b698-3fea-4f7d-8ebc-9959d0d8757c-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.065941 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8036b698-3fea-4f7d-8ebc-9959d0d8757c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.065953 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chstl\" (UniqueName: \"kubernetes.io/projected/8036b698-3fea-4f7d-8ebc-9959d0d8757c-kube-api-access-chstl\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.065963 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8036b698-3fea-4f7d-8ebc-9959d0d8757c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.188861 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f93209f3-9bdb-4c13-973a-2c3960f3cf92","Type":"ContainerDied","Data":"2e2b27857d6b0bd05be47a71b7b7c94851f1468fc8e4d35c3bdd6752a5e3b010"} Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 
09:09:41.190266 5129 scope.go:117] "RemoveContainer" containerID="cf4e6c2136d46fa122905e8c055ef8a52b31c5ee8159cd2c9f9dfb8e9e781d19" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.188868 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.192159 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"707bb11e-41bb-4fc5-9481-d0b29dce3753","Type":"ContainerStarted","Data":"ad78992bda7be7da2380fee9b733eeeeedd07c4d9410d36c72f0a7e9ea2faa79"} Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.192461 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.194274 5129 generic.go:334] "Generic (PLEG): container finished" podID="8036b698-3fea-4f7d-8ebc-9959d0d8757c" containerID="fb03c3cec9764a12b191577be99f00ba49201ba759538060a694ede08cf51064" exitCode=137 Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.194350 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.226892 5129 scope.go:117] "RemoveContainer" containerID="a006a2b463c1141270b951f0c6e1d4f8b2d390a4321cfdc8dd255597c6c4f482" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.242036 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.144982939 podStartE2EDuration="10.242011718s" podCreationTimestamp="2026-03-14 09:09:31 +0000 UTC" firstStartedPulling="2026-03-14 09:09:32.307676187 +0000 UTC m=+7835.059591371" lastFinishedPulling="2026-03-14 09:09:40.404704966 +0000 UTC m=+7843.156620150" observedRunningTime="2026-03-14 09:09:41.225978463 +0000 UTC m=+7843.977893657" watchObservedRunningTime="2026-03-14 09:09:41.242011718 +0000 UTC m=+7843.993926902" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.249547 5129 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="8036b698-3fea-4f7d-8ebc-9959d0d8757c" podUID="ad913420-c19c-4d99-8d9c-b854a4a605d6" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.271679 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.279140 5129 scope.go:117] "RemoveContainer" containerID="adeaabe649c5f9fc6e74965c100d8dc7d4e8522b464b16abb8f2e08ec6f21599" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.284503 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.295713 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 09:09:41 crc kubenswrapper[5129]: E0314 09:09:41.296145 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" 
containerName="init-config-reloader" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.296164 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" containerName="init-config-reloader" Mar 14 09:09:41 crc kubenswrapper[5129]: E0314 09:09:41.296181 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" containerName="thanos-sidecar" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.296187 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" containerName="thanos-sidecar" Mar 14 09:09:41 crc kubenswrapper[5129]: E0314 09:09:41.296202 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" containerName="prometheus" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.296211 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" containerName="prometheus" Mar 14 09:09:41 crc kubenswrapper[5129]: E0314 09:09:41.296229 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" containerName="config-reloader" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.296235 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" containerName="config-reloader" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.296406 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" containerName="prometheus" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.296419 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" containerName="thanos-sidecar" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.296433 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" 
containerName="config-reloader" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.298162 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.300099 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.310597 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.310863 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.311834 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.311848 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.311870 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.312062 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-566s7" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.314943 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.318863 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.322294 5129 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.346087 5129 scope.go:117] "RemoveContainer" containerID="24a131c394ce5ea99cb689c7ce13e93a139d28f11f18f547ddd9d64bc7c10fb8" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.383719 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/58775514-86d8-43ef-8b77-d25a3a2e0380-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.383769 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/58775514-86d8-43ef-8b77-d25a3a2e0380-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.383865 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/58775514-86d8-43ef-8b77-d25a3a2e0380-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.383896 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vdwb\" (UniqueName: \"kubernetes.io/projected/58775514-86d8-43ef-8b77-d25a3a2e0380-kube-api-access-6vdwb\") pod \"prometheus-metric-storage-0\" (UID: 
\"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.383924 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/58775514-86d8-43ef-8b77-d25a3a2e0380-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.383946 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/58775514-86d8-43ef-8b77-d25a3a2e0380-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.383966 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/58775514-86d8-43ef-8b77-d25a3a2e0380-config\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.383984 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/58775514-86d8-43ef-8b77-d25a3a2e0380-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.384003 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/58775514-86d8-43ef-8b77-d25a3a2e0380-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.384028 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58775514-86d8-43ef-8b77-d25a3a2e0380-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.384060 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/58775514-86d8-43ef-8b77-d25a3a2e0380-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.384100 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.384122 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/58775514-86d8-43ef-8b77-d25a3a2e0380-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 
09:09:41.388899 5129 scope.go:117] "RemoveContainer" containerID="fb03c3cec9764a12b191577be99f00ba49201ba759538060a694ede08cf51064" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.441893 5129 scope.go:117] "RemoveContainer" containerID="fb03c3cec9764a12b191577be99f00ba49201ba759538060a694ede08cf51064" Mar 14 09:09:41 crc kubenswrapper[5129]: E0314 09:09:41.442473 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb03c3cec9764a12b191577be99f00ba49201ba759538060a694ede08cf51064\": container with ID starting with fb03c3cec9764a12b191577be99f00ba49201ba759538060a694ede08cf51064 not found: ID does not exist" containerID="fb03c3cec9764a12b191577be99f00ba49201ba759538060a694ede08cf51064" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.442512 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb03c3cec9764a12b191577be99f00ba49201ba759538060a694ede08cf51064"} err="failed to get container status \"fb03c3cec9764a12b191577be99f00ba49201ba759538060a694ede08cf51064\": rpc error: code = NotFound desc = could not find container \"fb03c3cec9764a12b191577be99f00ba49201ba759538060a694ede08cf51064\": container with ID starting with fb03c3cec9764a12b191577be99f00ba49201ba759538060a694ede08cf51064 not found: ID does not exist" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.485650 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/58775514-86d8-43ef-8b77-d25a3a2e0380-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.485759 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/58775514-86d8-43ef-8b77-d25a3a2e0380-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.485789 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vdwb\" (UniqueName: \"kubernetes.io/projected/58775514-86d8-43ef-8b77-d25a3a2e0380-kube-api-access-6vdwb\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.485830 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/58775514-86d8-43ef-8b77-d25a3a2e0380-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.485851 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/58775514-86d8-43ef-8b77-d25a3a2e0380-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.485870 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/58775514-86d8-43ef-8b77-d25a3a2e0380-config\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.485889 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/58775514-86d8-43ef-8b77-d25a3a2e0380-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.485907 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/58775514-86d8-43ef-8b77-d25a3a2e0380-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.485928 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58775514-86d8-43ef-8b77-d25a3a2e0380-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.485956 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/58775514-86d8-43ef-8b77-d25a3a2e0380-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.485996 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.486020 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/58775514-86d8-43ef-8b77-d25a3a2e0380-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.486051 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/58775514-86d8-43ef-8b77-d25a3a2e0380-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.486979 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/58775514-86d8-43ef-8b77-d25a3a2e0380-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.487399 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/58775514-86d8-43ef-8b77-d25a3a2e0380-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.490137 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/58775514-86d8-43ef-8b77-d25a3a2e0380-prometheus-metric-storage-rulefiles-1\") pod 
\"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.494284 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/58775514-86d8-43ef-8b77-d25a3a2e0380-config\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.494988 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58775514-86d8-43ef-8b77-d25a3a2e0380-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.495538 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/58775514-86d8-43ef-8b77-d25a3a2e0380-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.496263 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/58775514-86d8-43ef-8b77-d25a3a2e0380-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.496995 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/58775514-86d8-43ef-8b77-d25a3a2e0380-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.498111 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/58775514-86d8-43ef-8b77-d25a3a2e0380-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.498312 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/58775514-86d8-43ef-8b77-d25a3a2e0380-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.502326 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/58775514-86d8-43ef-8b77-d25a3a2e0380-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.512441 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vdwb\" (UniqueName: \"kubernetes.io/projected/58775514-86d8-43ef-8b77-d25a3a2e0380-kube-api-access-6vdwb\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.519397 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not 
set. Skipping MountDevice... Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.519441 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/201f60d526c983c528086870f6a56125f3223dbe138095610707f9d15227e9fd/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.629613 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7e702c6-1c6b-409c-8d9e-256bb8092a4e\") pod \"prometheus-metric-storage-0\" (UID: \"58775514-86d8-43ef-8b77-d25a3a2e0380\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:41 crc kubenswrapper[5129]: I0314 09:09:41.919313 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 09:09:42 crc kubenswrapper[5129]: I0314 09:09:42.130677 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8036b698-3fea-4f7d-8ebc-9959d0d8757c" path="/var/lib/kubelet/pods/8036b698-3fea-4f7d-8ebc-9959d0d8757c/volumes" Mar 14 09:09:42 crc kubenswrapper[5129]: I0314 09:09:42.131636 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f93209f3-9bdb-4c13-973a-2c3960f3cf92" path="/var/lib/kubelet/pods/f93209f3-9bdb-4c13-973a-2c3960f3cf92/volumes" Mar 14 09:09:42 crc kubenswrapper[5129]: I0314 09:09:42.490334 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 09:09:43 crc kubenswrapper[5129]: I0314 09:09:43.222968 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"58775514-86d8-43ef-8b77-d25a3a2e0380","Type":"ContainerStarted","Data":"8db6b1532f9a760561713c602af6ab3ee52cebcdd1ffc8ac41e17462312d09de"} Mar 14 09:09:44 crc kubenswrapper[5129]: I0314 09:09:44.037230 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:09:44 crc kubenswrapper[5129]: E0314 09:09:44.038417 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.309166 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-72j5h"] Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.310564 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-72j5h" Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.332097 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-72j5h"] Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.376487 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r59kf\" (UniqueName: \"kubernetes.io/projected/d62447c8-ed83-43dd-9b97-a2ca49dde6ff-kube-api-access-r59kf\") pod \"aodh-db-create-72j5h\" (UID: \"d62447c8-ed83-43dd-9b97-a2ca49dde6ff\") " pod="openstack/aodh-db-create-72j5h" Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.376902 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d62447c8-ed83-43dd-9b97-a2ca49dde6ff-operator-scripts\") pod \"aodh-db-create-72j5h\" (UID: \"d62447c8-ed83-43dd-9b97-a2ca49dde6ff\") " pod="openstack/aodh-db-create-72j5h" Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.415862 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-a95b-account-create-update-kdg2n"] Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.417507 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-a95b-account-create-update-kdg2n" Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.423096 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.439564 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-a95b-account-create-update-kdg2n"] Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.479776 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8l69\" (UniqueName: \"kubernetes.io/projected/cb1b092b-8bbc-4ed2-8952-7c5a84167130-kube-api-access-q8l69\") pod \"aodh-a95b-account-create-update-kdg2n\" (UID: \"cb1b092b-8bbc-4ed2-8952-7c5a84167130\") " pod="openstack/aodh-a95b-account-create-update-kdg2n" Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.480507 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r59kf\" (UniqueName: \"kubernetes.io/projected/d62447c8-ed83-43dd-9b97-a2ca49dde6ff-kube-api-access-r59kf\") pod \"aodh-db-create-72j5h\" (UID: \"d62447c8-ed83-43dd-9b97-a2ca49dde6ff\") " pod="openstack/aodh-db-create-72j5h" Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.480729 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb1b092b-8bbc-4ed2-8952-7c5a84167130-operator-scripts\") pod \"aodh-a95b-account-create-update-kdg2n\" (UID: \"cb1b092b-8bbc-4ed2-8952-7c5a84167130\") " pod="openstack/aodh-a95b-account-create-update-kdg2n" Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.480780 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d62447c8-ed83-43dd-9b97-a2ca49dde6ff-operator-scripts\") pod \"aodh-db-create-72j5h\" (UID: 
\"d62447c8-ed83-43dd-9b97-a2ca49dde6ff\") " pod="openstack/aodh-db-create-72j5h" Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.481755 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d62447c8-ed83-43dd-9b97-a2ca49dde6ff-operator-scripts\") pod \"aodh-db-create-72j5h\" (UID: \"d62447c8-ed83-43dd-9b97-a2ca49dde6ff\") " pod="openstack/aodh-db-create-72j5h" Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.516554 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r59kf\" (UniqueName: \"kubernetes.io/projected/d62447c8-ed83-43dd-9b97-a2ca49dde6ff-kube-api-access-r59kf\") pod \"aodh-db-create-72j5h\" (UID: \"d62447c8-ed83-43dd-9b97-a2ca49dde6ff\") " pod="openstack/aodh-db-create-72j5h" Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.592867 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb1b092b-8bbc-4ed2-8952-7c5a84167130-operator-scripts\") pod \"aodh-a95b-account-create-update-kdg2n\" (UID: \"cb1b092b-8bbc-4ed2-8952-7c5a84167130\") " pod="openstack/aodh-a95b-account-create-update-kdg2n" Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.593527 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8l69\" (UniqueName: \"kubernetes.io/projected/cb1b092b-8bbc-4ed2-8952-7c5a84167130-kube-api-access-q8l69\") pod \"aodh-a95b-account-create-update-kdg2n\" (UID: \"cb1b092b-8bbc-4ed2-8952-7c5a84167130\") " pod="openstack/aodh-a95b-account-create-update-kdg2n" Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.594128 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb1b092b-8bbc-4ed2-8952-7c5a84167130-operator-scripts\") pod \"aodh-a95b-account-create-update-kdg2n\" (UID: 
\"cb1b092b-8bbc-4ed2-8952-7c5a84167130\") " pod="openstack/aodh-a95b-account-create-update-kdg2n" Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.627804 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8l69\" (UniqueName: \"kubernetes.io/projected/cb1b092b-8bbc-4ed2-8952-7c5a84167130-kube-api-access-q8l69\") pod \"aodh-a95b-account-create-update-kdg2n\" (UID: \"cb1b092b-8bbc-4ed2-8952-7c5a84167130\") " pod="openstack/aodh-a95b-account-create-update-kdg2n" Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.627889 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-72j5h" Mar 14 09:09:45 crc kubenswrapper[5129]: I0314 09:09:45.748083 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-a95b-account-create-update-kdg2n" Mar 14 09:09:46 crc kubenswrapper[5129]: I0314 09:09:46.092144 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-72j5h"] Mar 14 09:09:46 crc kubenswrapper[5129]: W0314 09:09:46.111938 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd62447c8_ed83_43dd_9b97_a2ca49dde6ff.slice/crio-0f360a75d5f76d95171736f68c856dc3a714677c88460ac70b7b57314a96c40b WatchSource:0}: Error finding container 0f360a75d5f76d95171736f68c856dc3a714677c88460ac70b7b57314a96c40b: Status 404 returned error can't find the container with id 0f360a75d5f76d95171736f68c856dc3a714677c88460ac70b7b57314a96c40b Mar 14 09:09:46 crc kubenswrapper[5129]: I0314 09:09:46.281442 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"58775514-86d8-43ef-8b77-d25a3a2e0380","Type":"ContainerStarted","Data":"a18af605bcbd1b7665d813f1726b909a3326f4e3579a8a38842043974eb762c4"} Mar 14 09:09:46 crc kubenswrapper[5129]: I0314 09:09:46.287301 5129 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/aodh-db-create-72j5h" event={"ID":"d62447c8-ed83-43dd-9b97-a2ca49dde6ff","Type":"ContainerStarted","Data":"0f360a75d5f76d95171736f68c856dc3a714677c88460ac70b7b57314a96c40b"} Mar 14 09:09:46 crc kubenswrapper[5129]: W0314 09:09:46.462549 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb1b092b_8bbc_4ed2_8952_7c5a84167130.slice/crio-ec0c86c7561cff7fd668dbfcb841a0597dab19f30bc8caf6f1c5b481d95e6a88 WatchSource:0}: Error finding container ec0c86c7561cff7fd668dbfcb841a0597dab19f30bc8caf6f1c5b481d95e6a88: Status 404 returned error can't find the container with id ec0c86c7561cff7fd668dbfcb841a0597dab19f30bc8caf6f1c5b481d95e6a88 Mar 14 09:09:46 crc kubenswrapper[5129]: I0314 09:09:46.468694 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-a95b-account-create-update-kdg2n"] Mar 14 09:09:47 crc kubenswrapper[5129]: I0314 09:09:47.304700 5129 generic.go:334] "Generic (PLEG): container finished" podID="cb1b092b-8bbc-4ed2-8952-7c5a84167130" containerID="377d4cb9f048e4c344f41759c1679015c994f008dd29f7de9a5036586413ec47" exitCode=0 Mar 14 09:09:47 crc kubenswrapper[5129]: I0314 09:09:47.305554 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-a95b-account-create-update-kdg2n" event={"ID":"cb1b092b-8bbc-4ed2-8952-7c5a84167130","Type":"ContainerDied","Data":"377d4cb9f048e4c344f41759c1679015c994f008dd29f7de9a5036586413ec47"} Mar 14 09:09:47 crc kubenswrapper[5129]: I0314 09:09:47.305589 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-a95b-account-create-update-kdg2n" event={"ID":"cb1b092b-8bbc-4ed2-8952-7c5a84167130","Type":"ContainerStarted","Data":"ec0c86c7561cff7fd668dbfcb841a0597dab19f30bc8caf6f1c5b481d95e6a88"} Mar 14 09:09:47 crc kubenswrapper[5129]: I0314 09:09:47.308151 5129 generic.go:334] "Generic (PLEG): container finished" podID="d62447c8-ed83-43dd-9b97-a2ca49dde6ff" 
containerID="abaa35f0d625fb25e18b7e8b71c187ccca30803e213f9203fa92f8900b9313d7" exitCode=0 Mar 14 09:09:47 crc kubenswrapper[5129]: I0314 09:09:47.308271 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-72j5h" event={"ID":"d62447c8-ed83-43dd-9b97-a2ca49dde6ff","Type":"ContainerDied","Data":"abaa35f0d625fb25e18b7e8b71c187ccca30803e213f9203fa92f8900b9313d7"} Mar 14 09:09:48 crc kubenswrapper[5129]: I0314 09:09:48.733110 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-72j5h" Mar 14 09:09:48 crc kubenswrapper[5129]: I0314 09:09:48.868969 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-a95b-account-create-update-kdg2n" Mar 14 09:09:48 crc kubenswrapper[5129]: I0314 09:09:48.880337 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d62447c8-ed83-43dd-9b97-a2ca49dde6ff-operator-scripts\") pod \"d62447c8-ed83-43dd-9b97-a2ca49dde6ff\" (UID: \"d62447c8-ed83-43dd-9b97-a2ca49dde6ff\") " Mar 14 09:09:48 crc kubenswrapper[5129]: I0314 09:09:48.880736 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r59kf\" (UniqueName: \"kubernetes.io/projected/d62447c8-ed83-43dd-9b97-a2ca49dde6ff-kube-api-access-r59kf\") pod \"d62447c8-ed83-43dd-9b97-a2ca49dde6ff\" (UID: \"d62447c8-ed83-43dd-9b97-a2ca49dde6ff\") " Mar 14 09:09:48 crc kubenswrapper[5129]: I0314 09:09:48.881571 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d62447c8-ed83-43dd-9b97-a2ca49dde6ff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d62447c8-ed83-43dd-9b97-a2ca49dde6ff" (UID: "d62447c8-ed83-43dd-9b97-a2ca49dde6ff"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:09:48 crc kubenswrapper[5129]: I0314 09:09:48.890342 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d62447c8-ed83-43dd-9b97-a2ca49dde6ff-kube-api-access-r59kf" (OuterVolumeSpecName: "kube-api-access-r59kf") pod "d62447c8-ed83-43dd-9b97-a2ca49dde6ff" (UID: "d62447c8-ed83-43dd-9b97-a2ca49dde6ff"). InnerVolumeSpecName "kube-api-access-r59kf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:09:48 crc kubenswrapper[5129]: I0314 09:09:48.983318 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8l69\" (UniqueName: \"kubernetes.io/projected/cb1b092b-8bbc-4ed2-8952-7c5a84167130-kube-api-access-q8l69\") pod \"cb1b092b-8bbc-4ed2-8952-7c5a84167130\" (UID: \"cb1b092b-8bbc-4ed2-8952-7c5a84167130\") " Mar 14 09:09:48 crc kubenswrapper[5129]: I0314 09:09:48.983451 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb1b092b-8bbc-4ed2-8952-7c5a84167130-operator-scripts\") pod \"cb1b092b-8bbc-4ed2-8952-7c5a84167130\" (UID: \"cb1b092b-8bbc-4ed2-8952-7c5a84167130\") " Mar 14 09:09:48 crc kubenswrapper[5129]: I0314 09:09:48.984006 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r59kf\" (UniqueName: \"kubernetes.io/projected/d62447c8-ed83-43dd-9b97-a2ca49dde6ff-kube-api-access-r59kf\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:48 crc kubenswrapper[5129]: I0314 09:09:48.984030 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d62447c8-ed83-43dd-9b97-a2ca49dde6ff-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:48 crc kubenswrapper[5129]: I0314 09:09:48.984535 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/cb1b092b-8bbc-4ed2-8952-7c5a84167130-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cb1b092b-8bbc-4ed2-8952-7c5a84167130" (UID: "cb1b092b-8bbc-4ed2-8952-7c5a84167130"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:09:48 crc kubenswrapper[5129]: I0314 09:09:48.988448 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb1b092b-8bbc-4ed2-8952-7c5a84167130-kube-api-access-q8l69" (OuterVolumeSpecName: "kube-api-access-q8l69") pod "cb1b092b-8bbc-4ed2-8952-7c5a84167130" (UID: "cb1b092b-8bbc-4ed2-8952-7c5a84167130"). InnerVolumeSpecName "kube-api-access-q8l69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:09:49 crc kubenswrapper[5129]: I0314 09:09:49.086875 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8l69\" (UniqueName: \"kubernetes.io/projected/cb1b092b-8bbc-4ed2-8952-7c5a84167130-kube-api-access-q8l69\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:49 crc kubenswrapper[5129]: I0314 09:09:49.086947 5129 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb1b092b-8bbc-4ed2-8952-7c5a84167130-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:09:49 crc kubenswrapper[5129]: I0314 09:09:49.339939 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-72j5h" event={"ID":"d62447c8-ed83-43dd-9b97-a2ca49dde6ff","Type":"ContainerDied","Data":"0f360a75d5f76d95171736f68c856dc3a714677c88460ac70b7b57314a96c40b"} Mar 14 09:09:49 crc kubenswrapper[5129]: I0314 09:09:49.340026 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f360a75d5f76d95171736f68c856dc3a714677c88460ac70b7b57314a96c40b" Mar 14 09:09:49 crc kubenswrapper[5129]: I0314 09:09:49.339943 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-72j5h" Mar 14 09:09:49 crc kubenswrapper[5129]: I0314 09:09:49.342846 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-a95b-account-create-update-kdg2n" event={"ID":"cb1b092b-8bbc-4ed2-8952-7c5a84167130","Type":"ContainerDied","Data":"ec0c86c7561cff7fd668dbfcb841a0597dab19f30bc8caf6f1c5b481d95e6a88"} Mar 14 09:09:49 crc kubenswrapper[5129]: I0314 09:09:49.342915 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-a95b-account-create-update-kdg2n" Mar 14 09:09:49 crc kubenswrapper[5129]: I0314 09:09:49.342921 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec0c86c7561cff7fd668dbfcb841a0597dab19f30bc8caf6f1c5b481d95e6a88" Mar 14 09:09:50 crc kubenswrapper[5129]: I0314 09:09:50.778290 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-5kphx"] Mar 14 09:09:50 crc kubenswrapper[5129]: E0314 09:09:50.779092 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62447c8-ed83-43dd-9b97-a2ca49dde6ff" containerName="mariadb-database-create" Mar 14 09:09:50 crc kubenswrapper[5129]: I0314 09:09:50.779106 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62447c8-ed83-43dd-9b97-a2ca49dde6ff" containerName="mariadb-database-create" Mar 14 09:09:50 crc kubenswrapper[5129]: E0314 09:09:50.779127 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb1b092b-8bbc-4ed2-8952-7c5a84167130" containerName="mariadb-account-create-update" Mar 14 09:09:50 crc kubenswrapper[5129]: I0314 09:09:50.779133 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb1b092b-8bbc-4ed2-8952-7c5a84167130" containerName="mariadb-account-create-update" Mar 14 09:09:50 crc kubenswrapper[5129]: I0314 09:09:50.779304 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62447c8-ed83-43dd-9b97-a2ca49dde6ff" containerName="mariadb-database-create" 
Mar 14 09:09:50 crc kubenswrapper[5129]: I0314 09:09:50.779330 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb1b092b-8bbc-4ed2-8952-7c5a84167130" containerName="mariadb-account-create-update" Mar 14 09:09:50 crc kubenswrapper[5129]: I0314 09:09:50.779981 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-5kphx" Mar 14 09:09:50 crc kubenswrapper[5129]: I0314 09:09:50.783820 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-2rb7k" Mar 14 09:09:50 crc kubenswrapper[5129]: I0314 09:09:50.784117 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 14 09:09:50 crc kubenswrapper[5129]: I0314 09:09:50.784470 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 14 09:09:50 crc kubenswrapper[5129]: I0314 09:09:50.784581 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 14 09:09:50 crc kubenswrapper[5129]: I0314 09:09:50.804212 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-5kphx"] Mar 14 09:09:50 crc kubenswrapper[5129]: I0314 09:09:50.931451 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntvvh\" (UniqueName: \"kubernetes.io/projected/d03179c2-ff34-4407-9ebd-89f120c07123-kube-api-access-ntvvh\") pod \"aodh-db-sync-5kphx\" (UID: \"d03179c2-ff34-4407-9ebd-89f120c07123\") " pod="openstack/aodh-db-sync-5kphx" Mar 14 09:09:50 crc kubenswrapper[5129]: I0314 09:09:50.931506 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03179c2-ff34-4407-9ebd-89f120c07123-combined-ca-bundle\") pod \"aodh-db-sync-5kphx\" (UID: \"d03179c2-ff34-4407-9ebd-89f120c07123\") " pod="openstack/aodh-db-sync-5kphx" Mar 
14 09:09:50 crc kubenswrapper[5129]: I0314 09:09:50.931534 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03179c2-ff34-4407-9ebd-89f120c07123-config-data\") pod \"aodh-db-sync-5kphx\" (UID: \"d03179c2-ff34-4407-9ebd-89f120c07123\") " pod="openstack/aodh-db-sync-5kphx" Mar 14 09:09:50 crc kubenswrapper[5129]: I0314 09:09:50.931557 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d03179c2-ff34-4407-9ebd-89f120c07123-scripts\") pod \"aodh-db-sync-5kphx\" (UID: \"d03179c2-ff34-4407-9ebd-89f120c07123\") " pod="openstack/aodh-db-sync-5kphx" Mar 14 09:09:51 crc kubenswrapper[5129]: I0314 09:09:51.033103 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntvvh\" (UniqueName: \"kubernetes.io/projected/d03179c2-ff34-4407-9ebd-89f120c07123-kube-api-access-ntvvh\") pod \"aodh-db-sync-5kphx\" (UID: \"d03179c2-ff34-4407-9ebd-89f120c07123\") " pod="openstack/aodh-db-sync-5kphx" Mar 14 09:09:51 crc kubenswrapper[5129]: I0314 09:09:51.033158 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03179c2-ff34-4407-9ebd-89f120c07123-combined-ca-bundle\") pod \"aodh-db-sync-5kphx\" (UID: \"d03179c2-ff34-4407-9ebd-89f120c07123\") " pod="openstack/aodh-db-sync-5kphx" Mar 14 09:09:51 crc kubenswrapper[5129]: I0314 09:09:51.033187 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03179c2-ff34-4407-9ebd-89f120c07123-config-data\") pod \"aodh-db-sync-5kphx\" (UID: \"d03179c2-ff34-4407-9ebd-89f120c07123\") " pod="openstack/aodh-db-sync-5kphx" Mar 14 09:09:51 crc kubenswrapper[5129]: I0314 09:09:51.033212 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d03179c2-ff34-4407-9ebd-89f120c07123-scripts\") pod \"aodh-db-sync-5kphx\" (UID: \"d03179c2-ff34-4407-9ebd-89f120c07123\") " pod="openstack/aodh-db-sync-5kphx" Mar 14 09:09:51 crc kubenswrapper[5129]: I0314 09:09:51.040524 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03179c2-ff34-4407-9ebd-89f120c07123-combined-ca-bundle\") pod \"aodh-db-sync-5kphx\" (UID: \"d03179c2-ff34-4407-9ebd-89f120c07123\") " pod="openstack/aodh-db-sync-5kphx" Mar 14 09:09:51 crc kubenswrapper[5129]: I0314 09:09:51.040886 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d03179c2-ff34-4407-9ebd-89f120c07123-scripts\") pod \"aodh-db-sync-5kphx\" (UID: \"d03179c2-ff34-4407-9ebd-89f120c07123\") " pod="openstack/aodh-db-sync-5kphx" Mar 14 09:09:51 crc kubenswrapper[5129]: I0314 09:09:51.059859 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03179c2-ff34-4407-9ebd-89f120c07123-config-data\") pod \"aodh-db-sync-5kphx\" (UID: \"d03179c2-ff34-4407-9ebd-89f120c07123\") " pod="openstack/aodh-db-sync-5kphx" Mar 14 09:09:51 crc kubenswrapper[5129]: I0314 09:09:51.081152 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntvvh\" (UniqueName: \"kubernetes.io/projected/d03179c2-ff34-4407-9ebd-89f120c07123-kube-api-access-ntvvh\") pod \"aodh-db-sync-5kphx\" (UID: \"d03179c2-ff34-4407-9ebd-89f120c07123\") " pod="openstack/aodh-db-sync-5kphx" Mar 14 09:09:51 crc kubenswrapper[5129]: I0314 09:09:51.101862 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-5kphx" Mar 14 09:09:51 crc kubenswrapper[5129]: I0314 09:09:51.680894 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-5kphx"] Mar 14 09:09:51 crc kubenswrapper[5129]: W0314 09:09:51.683394 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd03179c2_ff34_4407_9ebd_89f120c07123.slice/crio-21a0b2f903544f8936934b4e547cd002d27380bc6dfa1e63584c9c6d0a8b7388 WatchSource:0}: Error finding container 21a0b2f903544f8936934b4e547cd002d27380bc6dfa1e63584c9c6d0a8b7388: Status 404 returned error can't find the container with id 21a0b2f903544f8936934b4e547cd002d27380bc6dfa1e63584c9c6d0a8b7388 Mar 14 09:09:52 crc kubenswrapper[5129]: I0314 09:09:52.382389 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-5kphx" event={"ID":"d03179c2-ff34-4407-9ebd-89f120c07123","Type":"ContainerStarted","Data":"21a0b2f903544f8936934b4e547cd002d27380bc6dfa1e63584c9c6d0a8b7388"} Mar 14 09:09:55 crc kubenswrapper[5129]: I0314 09:09:55.036886 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:09:55 crc kubenswrapper[5129]: E0314 09:09:55.038158 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:09:55 crc kubenswrapper[5129]: I0314 09:09:55.430471 5129 generic.go:334] "Generic (PLEG): container finished" podID="58775514-86d8-43ef-8b77-d25a3a2e0380" containerID="a18af605bcbd1b7665d813f1726b909a3326f4e3579a8a38842043974eb762c4" exitCode=0 Mar 14 
09:09:55 crc kubenswrapper[5129]: I0314 09:09:55.430516 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"58775514-86d8-43ef-8b77-d25a3a2e0380","Type":"ContainerDied","Data":"a18af605bcbd1b7665d813f1726b909a3326f4e3579a8a38842043974eb762c4"} Mar 14 09:09:57 crc kubenswrapper[5129]: I0314 09:09:57.467127 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-5kphx" event={"ID":"d03179c2-ff34-4407-9ebd-89f120c07123","Type":"ContainerStarted","Data":"c7ecd68941f37b693969af2ca495592910ebef03c9fefd3522844aec3e3d7816"} Mar 14 09:09:57 crc kubenswrapper[5129]: I0314 09:09:57.470593 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"58775514-86d8-43ef-8b77-d25a3a2e0380","Type":"ContainerStarted","Data":"7ed874aba928b12742246fad2277044e1b65cd80ccc5ff0fb65e41d21b87cd5a"} Mar 14 09:09:57 crc kubenswrapper[5129]: I0314 09:09:57.487059 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-5kphx" podStartSLOduration=2.730885308 podStartE2EDuration="7.487038015s" podCreationTimestamp="2026-03-14 09:09:50 +0000 UTC" firstStartedPulling="2026-03-14 09:09:51.685216042 +0000 UTC m=+7854.437131226" lastFinishedPulling="2026-03-14 09:09:56.441368749 +0000 UTC m=+7859.193283933" observedRunningTime="2026-03-14 09:09:57.48133034 +0000 UTC m=+7860.233245534" watchObservedRunningTime="2026-03-14 09:09:57.487038015 +0000 UTC m=+7860.238953199" Mar 14 09:09:59 crc kubenswrapper[5129]: I0314 09:09:59.490077 5129 generic.go:334] "Generic (PLEG): container finished" podID="d03179c2-ff34-4407-9ebd-89f120c07123" containerID="c7ecd68941f37b693969af2ca495592910ebef03c9fefd3522844aec3e3d7816" exitCode=0 Mar 14 09:09:59 crc kubenswrapper[5129]: I0314 09:09:59.490181 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-5kphx" 
event={"ID":"d03179c2-ff34-4407-9ebd-89f120c07123","Type":"ContainerDied","Data":"c7ecd68941f37b693969af2ca495592910ebef03c9fefd3522844aec3e3d7816"} Mar 14 09:10:00 crc kubenswrapper[5129]: I0314 09:10:00.138818 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557990-swcvh"] Mar 14 09:10:00 crc kubenswrapper[5129]: I0314 09:10:00.140713 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557990-swcvh" Mar 14 09:10:00 crc kubenswrapper[5129]: I0314 09:10:00.142877 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:10:00 crc kubenswrapper[5129]: I0314 09:10:00.143540 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:10:00 crc kubenswrapper[5129]: I0314 09:10:00.143584 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:10:00 crc kubenswrapper[5129]: I0314 09:10:00.153293 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557990-swcvh"] Mar 14 09:10:00 crc kubenswrapper[5129]: I0314 09:10:00.236179 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wkzf\" (UniqueName: \"kubernetes.io/projected/aa91aa1c-d620-401e-a77f-2b5071e3338f-kube-api-access-9wkzf\") pod \"auto-csr-approver-29557990-swcvh\" (UID: \"aa91aa1c-d620-401e-a77f-2b5071e3338f\") " pod="openshift-infra/auto-csr-approver-29557990-swcvh" Mar 14 09:10:00 crc kubenswrapper[5129]: I0314 09:10:00.338343 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wkzf\" (UniqueName: \"kubernetes.io/projected/aa91aa1c-d620-401e-a77f-2b5071e3338f-kube-api-access-9wkzf\") pod \"auto-csr-approver-29557990-swcvh\" (UID: 
\"aa91aa1c-d620-401e-a77f-2b5071e3338f\") " pod="openshift-infra/auto-csr-approver-29557990-swcvh" Mar 14 09:10:00 crc kubenswrapper[5129]: I0314 09:10:00.360906 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wkzf\" (UniqueName: \"kubernetes.io/projected/aa91aa1c-d620-401e-a77f-2b5071e3338f-kube-api-access-9wkzf\") pod \"auto-csr-approver-29557990-swcvh\" (UID: \"aa91aa1c-d620-401e-a77f-2b5071e3338f\") " pod="openshift-infra/auto-csr-approver-29557990-swcvh" Mar 14 09:10:00 crc kubenswrapper[5129]: I0314 09:10:00.461120 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557990-swcvh" Mar 14 09:10:00 crc kubenswrapper[5129]: I0314 09:10:00.518828 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"58775514-86d8-43ef-8b77-d25a3a2e0380","Type":"ContainerStarted","Data":"bb67ed84162549ef60bad0fda76efc5868797fcdd8ea9996d7385ff9af7b854e"} Mar 14 09:10:00 crc kubenswrapper[5129]: I0314 09:10:00.966395 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-5kphx" Mar 14 09:10:01 crc kubenswrapper[5129]: W0314 09:10:01.019193 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa91aa1c_d620_401e_a77f_2b5071e3338f.slice/crio-ba8a14b4c8bc412c0ba824afba268505d98cdb381ae4fc33bc0f0f2e05ccefa1 WatchSource:0}: Error finding container ba8a14b4c8bc412c0ba824afba268505d98cdb381ae4fc33bc0f0f2e05ccefa1: Status 404 returned error can't find the container with id ba8a14b4c8bc412c0ba824afba268505d98cdb381ae4fc33bc0f0f2e05ccefa1 Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.029281 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557990-swcvh"] Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.059430 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03179c2-ff34-4407-9ebd-89f120c07123-combined-ca-bundle\") pod \"d03179c2-ff34-4407-9ebd-89f120c07123\" (UID: \"d03179c2-ff34-4407-9ebd-89f120c07123\") " Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.059567 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03179c2-ff34-4407-9ebd-89f120c07123-config-data\") pod \"d03179c2-ff34-4407-9ebd-89f120c07123\" (UID: \"d03179c2-ff34-4407-9ebd-89f120c07123\") " Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.059644 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntvvh\" (UniqueName: \"kubernetes.io/projected/d03179c2-ff34-4407-9ebd-89f120c07123-kube-api-access-ntvvh\") pod \"d03179c2-ff34-4407-9ebd-89f120c07123\" (UID: \"d03179c2-ff34-4407-9ebd-89f120c07123\") " Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.059760 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/d03179c2-ff34-4407-9ebd-89f120c07123-scripts\") pod \"d03179c2-ff34-4407-9ebd-89f120c07123\" (UID: \"d03179c2-ff34-4407-9ebd-89f120c07123\") " Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.066400 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03179c2-ff34-4407-9ebd-89f120c07123-scripts" (OuterVolumeSpecName: "scripts") pod "d03179c2-ff34-4407-9ebd-89f120c07123" (UID: "d03179c2-ff34-4407-9ebd-89f120c07123"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.067006 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d03179c2-ff34-4407-9ebd-89f120c07123-kube-api-access-ntvvh" (OuterVolumeSpecName: "kube-api-access-ntvvh") pod "d03179c2-ff34-4407-9ebd-89f120c07123" (UID: "d03179c2-ff34-4407-9ebd-89f120c07123"). InnerVolumeSpecName "kube-api-access-ntvvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.088825 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03179c2-ff34-4407-9ebd-89f120c07123-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d03179c2-ff34-4407-9ebd-89f120c07123" (UID: "d03179c2-ff34-4407-9ebd-89f120c07123"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.113551 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03179c2-ff34-4407-9ebd-89f120c07123-config-data" (OuterVolumeSpecName: "config-data") pod "d03179c2-ff34-4407-9ebd-89f120c07123" (UID: "d03179c2-ff34-4407-9ebd-89f120c07123"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.163089 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03179c2-ff34-4407-9ebd-89f120c07123-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.163851 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03179c2-ff34-4407-9ebd-89f120c07123-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.163890 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntvvh\" (UniqueName: \"kubernetes.io/projected/d03179c2-ff34-4407-9ebd-89f120c07123-kube-api-access-ntvvh\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.163904 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d03179c2-ff34-4407-9ebd-89f120c07123-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.536290 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"58775514-86d8-43ef-8b77-d25a3a2e0380","Type":"ContainerStarted","Data":"2f8470025d246f71d6789d580379110cb172221642778cca5f0567115fcee285"} Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.537888 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557990-swcvh" event={"ID":"aa91aa1c-d620-401e-a77f-2b5071e3338f","Type":"ContainerStarted","Data":"ba8a14b4c8bc412c0ba824afba268505d98cdb381ae4fc33bc0f0f2e05ccefa1"} Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.540672 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-5kphx" 
event={"ID":"d03179c2-ff34-4407-9ebd-89f120c07123","Type":"ContainerDied","Data":"21a0b2f903544f8936934b4e547cd002d27380bc6dfa1e63584c9c6d0a8b7388"} Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.540712 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21a0b2f903544f8936934b4e547cd002d27380bc6dfa1e63584c9c6d0a8b7388" Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.540755 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-5kphx" Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.565901 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=20.565881122 podStartE2EDuration="20.565881122s" podCreationTimestamp="2026-03-14 09:09:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:10:01.563766125 +0000 UTC m=+7864.315681329" watchObservedRunningTime="2026-03-14 09:10:01.565881122 +0000 UTC m=+7864.317796306" Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.776992 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 14 09:10:01 crc kubenswrapper[5129]: I0314 09:10:01.920345 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 14 09:10:03 crc kubenswrapper[5129]: I0314 09:10:03.573241 5129 generic.go:334] "Generic (PLEG): container finished" podID="aa91aa1c-d620-401e-a77f-2b5071e3338f" containerID="3c851339df5aa5000308021aaa1e8f5140c2ac9d984afd6a6f2a1009d42cf10f" exitCode=0 Mar 14 09:10:03 crc kubenswrapper[5129]: I0314 09:10:03.573359 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557990-swcvh" 
event={"ID":"aa91aa1c-d620-401e-a77f-2b5071e3338f","Type":"ContainerDied","Data":"3c851339df5aa5000308021aaa1e8f5140c2ac9d984afd6a6f2a1009d42cf10f"} Mar 14 09:10:04 crc kubenswrapper[5129]: I0314 09:10:04.974887 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557990-swcvh" Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.050701 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wkzf\" (UniqueName: \"kubernetes.io/projected/aa91aa1c-d620-401e-a77f-2b5071e3338f-kube-api-access-9wkzf\") pod \"aa91aa1c-d620-401e-a77f-2b5071e3338f\" (UID: \"aa91aa1c-d620-401e-a77f-2b5071e3338f\") " Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.068993 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa91aa1c-d620-401e-a77f-2b5071e3338f-kube-api-access-9wkzf" (OuterVolumeSpecName: "kube-api-access-9wkzf") pod "aa91aa1c-d620-401e-a77f-2b5071e3338f" (UID: "aa91aa1c-d620-401e-a77f-2b5071e3338f"). InnerVolumeSpecName "kube-api-access-9wkzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.162110 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wkzf\" (UniqueName: \"kubernetes.io/projected/aa91aa1c-d620-401e-a77f-2b5071e3338f-kube-api-access-9wkzf\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.616584 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557990-swcvh" event={"ID":"aa91aa1c-d620-401e-a77f-2b5071e3338f","Type":"ContainerDied","Data":"ba8a14b4c8bc412c0ba824afba268505d98cdb381ae4fc33bc0f0f2e05ccefa1"} Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.616705 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba8a14b4c8bc412c0ba824afba268505d98cdb381ae4fc33bc0f0f2e05ccefa1" Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.616841 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557990-swcvh" Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.868682 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 14 09:10:05 crc kubenswrapper[5129]: E0314 09:10:05.869569 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d03179c2-ff34-4407-9ebd-89f120c07123" containerName="aodh-db-sync" Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.869592 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d03179c2-ff34-4407-9ebd-89f120c07123" containerName="aodh-db-sync" Mar 14 09:10:05 crc kubenswrapper[5129]: E0314 09:10:05.869663 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa91aa1c-d620-401e-a77f-2b5071e3338f" containerName="oc" Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.869674 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa91aa1c-d620-401e-a77f-2b5071e3338f" containerName="oc" Mar 14 09:10:05 crc 
kubenswrapper[5129]: I0314 09:10:05.870030 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d03179c2-ff34-4407-9ebd-89f120c07123" containerName="aodh-db-sync" Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.870067 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa91aa1c-d620-401e-a77f-2b5071e3338f" containerName="oc" Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.873301 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.876493 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.876871 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.878298 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-2rb7k" Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.882768 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.982140 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-config-data\") pod \"aodh-0\" (UID: \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\") " pod="openstack/aodh-0" Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.983386 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\") " pod="openstack/aodh-0" Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.983874 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-scripts\") pod \"aodh-0\" (UID: \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\") " pod="openstack/aodh-0" Mar 14 09:10:05 crc kubenswrapper[5129]: I0314 09:10:05.984071 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64g7d\" (UniqueName: \"kubernetes.io/projected/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-kube-api-access-64g7d\") pod \"aodh-0\" (UID: \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\") " pod="openstack/aodh-0" Mar 14 09:10:06 crc kubenswrapper[5129]: I0314 09:10:06.074714 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557984-t2fgw"] Mar 14 09:10:06 crc kubenswrapper[5129]: I0314 09:10:06.095067 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557984-t2fgw"] Mar 14 09:10:06 crc kubenswrapper[5129]: I0314 09:10:06.097318 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-scripts\") pod \"aodh-0\" (UID: \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\") " pod="openstack/aodh-0" Mar 14 09:10:06 crc kubenswrapper[5129]: I0314 09:10:06.097369 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64g7d\" (UniqueName: \"kubernetes.io/projected/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-kube-api-access-64g7d\") pod \"aodh-0\" (UID: \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\") " pod="openstack/aodh-0" Mar 14 09:10:06 crc kubenswrapper[5129]: I0314 09:10:06.097468 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-config-data\") pod \"aodh-0\" (UID: \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\") " 
pod="openstack/aodh-0" Mar 14 09:10:06 crc kubenswrapper[5129]: I0314 09:10:06.097514 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\") " pod="openstack/aodh-0" Mar 14 09:10:06 crc kubenswrapper[5129]: I0314 09:10:06.104439 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\") " pod="openstack/aodh-0" Mar 14 09:10:06 crc kubenswrapper[5129]: I0314 09:10:06.106549 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-config-data\") pod \"aodh-0\" (UID: \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\") " pod="openstack/aodh-0" Mar 14 09:10:06 crc kubenswrapper[5129]: I0314 09:10:06.109990 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-scripts\") pod \"aodh-0\" (UID: \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\") " pod="openstack/aodh-0" Mar 14 09:10:06 crc kubenswrapper[5129]: I0314 09:10:06.136710 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64g7d\" (UniqueName: \"kubernetes.io/projected/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-kube-api-access-64g7d\") pod \"aodh-0\" (UID: \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\") " pod="openstack/aodh-0" Mar 14 09:10:06 crc kubenswrapper[5129]: I0314 09:10:06.215333 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 14 09:10:06 crc kubenswrapper[5129]: I0314 09:10:06.837912 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 14 09:10:07 crc kubenswrapper[5129]: I0314 09:10:07.641804 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f1cf8b69-8f3c-4a1d-91da-12d2d8806285","Type":"ContainerStarted","Data":"98c5a627b38797caac479ca9ebb4dfd05409ac188a4f1d58b93f3432efa4b155"} Mar 14 09:10:07 crc kubenswrapper[5129]: I0314 09:10:07.642923 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f1cf8b69-8f3c-4a1d-91da-12d2d8806285","Type":"ContainerStarted","Data":"687b5bf31a581460fb41cd58a7e8bbd5f38c82bcf13ec2fafc405a0d26d7cb09"} Mar 14 09:10:08 crc kubenswrapper[5129]: I0314 09:10:08.075784 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3376bd41-1051-449e-a7cf-1e3500a04bcf" path="/var/lib/kubelet/pods/3376bd41-1051-449e-a7cf-1e3500a04bcf/volumes" Mar 14 09:10:08 crc kubenswrapper[5129]: I0314 09:10:08.591086 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:10:08 crc kubenswrapper[5129]: I0314 09:10:08.595068 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerName="ceilometer-notification-agent" containerID="cri-o://5fe99f618439dd8e7f7684ebb5bb4e826468a4333927629fe49685014597915d" gracePeriod=30 Mar 14 09:10:08 crc kubenswrapper[5129]: I0314 09:10:08.595332 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerName="ceilometer-central-agent" containerID="cri-o://2a687bcc0d95bdcae086f9da915c54b96a6a68555a1b8aa3f24b523d2ec09657" gracePeriod=30 Mar 14 09:10:08 crc kubenswrapper[5129]: I0314 09:10:08.595065 5129 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerName="sg-core" containerID="cri-o://af0fc9a09f53cf40a8651dde95fd578743aac92b1a21656d8fe8f0c95652b9fa" gracePeriod=30 Mar 14 09:10:08 crc kubenswrapper[5129]: I0314 09:10:08.595139 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerName="proxy-httpd" containerID="cri-o://ad78992bda7be7da2380fee9b733eeeeedd07c4d9410d36c72f0a7e9ea2faa79" gracePeriod=30 Mar 14 09:10:09 crc kubenswrapper[5129]: I0314 09:10:09.676951 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 14 09:10:09 crc kubenswrapper[5129]: I0314 09:10:09.684692 5129 generic.go:334] "Generic (PLEG): container finished" podID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerID="ad78992bda7be7da2380fee9b733eeeeedd07c4d9410d36c72f0a7e9ea2faa79" exitCode=0 Mar 14 09:10:09 crc kubenswrapper[5129]: I0314 09:10:09.684720 5129 generic.go:334] "Generic (PLEG): container finished" podID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerID="af0fc9a09f53cf40a8651dde95fd578743aac92b1a21656d8fe8f0c95652b9fa" exitCode=2 Mar 14 09:10:09 crc kubenswrapper[5129]: I0314 09:10:09.684726 5129 generic.go:334] "Generic (PLEG): container finished" podID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerID="5fe99f618439dd8e7f7684ebb5bb4e826468a4333927629fe49685014597915d" exitCode=0 Mar 14 09:10:09 crc kubenswrapper[5129]: I0314 09:10:09.684850 5129 generic.go:334] "Generic (PLEG): container finished" podID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerID="2a687bcc0d95bdcae086f9da915c54b96a6a68555a1b8aa3f24b523d2ec09657" exitCode=0 Mar 14 09:10:09 crc kubenswrapper[5129]: I0314 09:10:09.684898 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"707bb11e-41bb-4fc5-9481-d0b29dce3753","Type":"ContainerDied","Data":"ad78992bda7be7da2380fee9b733eeeeedd07c4d9410d36c72f0a7e9ea2faa79"} Mar 14 09:10:09 crc kubenswrapper[5129]: I0314 09:10:09.684922 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"707bb11e-41bb-4fc5-9481-d0b29dce3753","Type":"ContainerDied","Data":"af0fc9a09f53cf40a8651dde95fd578743aac92b1a21656d8fe8f0c95652b9fa"} Mar 14 09:10:09 crc kubenswrapper[5129]: I0314 09:10:09.684932 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"707bb11e-41bb-4fc5-9481-d0b29dce3753","Type":"ContainerDied","Data":"5fe99f618439dd8e7f7684ebb5bb4e826468a4333927629fe49685014597915d"} Mar 14 09:10:09 crc kubenswrapper[5129]: I0314 09:10:09.684940 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"707bb11e-41bb-4fc5-9481-d0b29dce3753","Type":"ContainerDied","Data":"2a687bcc0d95bdcae086f9da915c54b96a6a68555a1b8aa3f24b523d2ec09657"} Mar 14 09:10:09 crc kubenswrapper[5129]: I0314 09:10:09.687110 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f1cf8b69-8f3c-4a1d-91da-12d2d8806285","Type":"ContainerStarted","Data":"37b1ae2b8dc16fa899a3bfabdb1360a24ff8112ea41061fb93242eb0c05eeaf7"} Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.036247 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:10:10 crc kubenswrapper[5129]: E0314 09:10:10.037256 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.043638 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.104870 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/707bb11e-41bb-4fc5-9481-d0b29dce3753-run-httpd\") pod \"707bb11e-41bb-4fc5-9481-d0b29dce3753\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.104973 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbfk8\" (UniqueName: \"kubernetes.io/projected/707bb11e-41bb-4fc5-9481-d0b29dce3753-kube-api-access-dbfk8\") pod \"707bb11e-41bb-4fc5-9481-d0b29dce3753\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.105116 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-scripts\") pod \"707bb11e-41bb-4fc5-9481-d0b29dce3753\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.105136 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-sg-core-conf-yaml\") pod \"707bb11e-41bb-4fc5-9481-d0b29dce3753\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.105158 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-combined-ca-bundle\") pod \"707bb11e-41bb-4fc5-9481-d0b29dce3753\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " Mar 14 
09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.105236 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-config-data\") pod \"707bb11e-41bb-4fc5-9481-d0b29dce3753\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.105253 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/707bb11e-41bb-4fc5-9481-d0b29dce3753-log-httpd\") pod \"707bb11e-41bb-4fc5-9481-d0b29dce3753\" (UID: \"707bb11e-41bb-4fc5-9481-d0b29dce3753\") " Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.106141 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/707bb11e-41bb-4fc5-9481-d0b29dce3753-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "707bb11e-41bb-4fc5-9481-d0b29dce3753" (UID: "707bb11e-41bb-4fc5-9481-d0b29dce3753"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.107011 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/707bb11e-41bb-4fc5-9481-d0b29dce3753-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "707bb11e-41bb-4fc5-9481-d0b29dce3753" (UID: "707bb11e-41bb-4fc5-9481-d0b29dce3753"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.115817 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-scripts" (OuterVolumeSpecName: "scripts") pod "707bb11e-41bb-4fc5-9481-d0b29dce3753" (UID: "707bb11e-41bb-4fc5-9481-d0b29dce3753"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.116091 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707bb11e-41bb-4fc5-9481-d0b29dce3753-kube-api-access-dbfk8" (OuterVolumeSpecName: "kube-api-access-dbfk8") pod "707bb11e-41bb-4fc5-9481-d0b29dce3753" (UID: "707bb11e-41bb-4fc5-9481-d0b29dce3753"). InnerVolumeSpecName "kube-api-access-dbfk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.152660 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "707bb11e-41bb-4fc5-9481-d0b29dce3753" (UID: "707bb11e-41bb-4fc5-9481-d0b29dce3753"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.209527 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.209558 5129 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.209569 5129 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/707bb11e-41bb-4fc5-9481-d0b29dce3753-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.209578 5129 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/707bb11e-41bb-4fc5-9481-d0b29dce3753-run-httpd\") on node \"crc\" DevicePath \"\"" 
Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.209586 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbfk8\" (UniqueName: \"kubernetes.io/projected/707bb11e-41bb-4fc5-9481-d0b29dce3753-kube-api-access-dbfk8\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.243703 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "707bb11e-41bb-4fc5-9481-d0b29dce3753" (UID: "707bb11e-41bb-4fc5-9481-d0b29dce3753"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.259194 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-config-data" (OuterVolumeSpecName: "config-data") pod "707bb11e-41bb-4fc5-9481-d0b29dce3753" (UID: "707bb11e-41bb-4fc5-9481-d0b29dce3753"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.311274 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.311309 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707bb11e-41bb-4fc5-9481-d0b29dce3753-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.706589 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f1cf8b69-8f3c-4a1d-91da-12d2d8806285","Type":"ContainerStarted","Data":"4988b7dc5a626278027bef7f79187362250cc21b92f71ad6a258bfe9c44be91d"} Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.710990 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"707bb11e-41bb-4fc5-9481-d0b29dce3753","Type":"ContainerDied","Data":"aa5b9048b48d12469d981e476e73281aef73dc37021690e85a63958e7c3ccc13"} Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.711043 5129 scope.go:117] "RemoveContainer" containerID="ad78992bda7be7da2380fee9b733eeeeedd07c4d9410d36c72f0a7e9ea2faa79" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.711058 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.758394 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.769367 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.792166 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:10:10 crc kubenswrapper[5129]: E0314 09:10:10.794915 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerName="proxy-httpd" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.795170 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerName="proxy-httpd" Mar 14 09:10:10 crc kubenswrapper[5129]: E0314 09:10:10.795204 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerName="ceilometer-central-agent" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.795212 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerName="ceilometer-central-agent" Mar 14 09:10:10 crc kubenswrapper[5129]: E0314 09:10:10.795227 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerName="ceilometer-notification-agent" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.795233 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerName="ceilometer-notification-agent" Mar 14 09:10:10 crc kubenswrapper[5129]: E0314 09:10:10.795242 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerName="sg-core" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.795248 5129 
state_mem.go:107] "Deleted CPUSet assignment" podUID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerName="sg-core" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.795438 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerName="ceilometer-notification-agent" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.795460 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerName="proxy-httpd" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.795468 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerName="sg-core" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.795483 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="707bb11e-41bb-4fc5-9481-d0b29dce3753" containerName="ceilometer-central-agent" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.799439 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.804968 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.805255 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.825444 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.922123 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-scripts\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.922174 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-config-data\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.922195 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa0b8409-ab85-473f-96ff-127a34634ae9-log-httpd\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.922271 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa0b8409-ab85-473f-96ff-127a34634ae9-run-httpd\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 
09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.922314 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhvh2\" (UniqueName: \"kubernetes.io/projected/aa0b8409-ab85-473f-96ff-127a34634ae9-kube-api-access-vhvh2\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.922418 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:10 crc kubenswrapper[5129]: I0314 09:10:10.922653 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.024061 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-scripts\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.024114 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-config-data\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.024133 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/aa0b8409-ab85-473f-96ff-127a34634ae9-log-httpd\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.024175 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa0b8409-ab85-473f-96ff-127a34634ae9-run-httpd\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.024228 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhvh2\" (UniqueName: \"kubernetes.io/projected/aa0b8409-ab85-473f-96ff-127a34634ae9-kube-api-access-vhvh2\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.024251 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.024321 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.024823 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa0b8409-ab85-473f-96ff-127a34634ae9-log-httpd\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:11 crc 
kubenswrapper[5129]: I0314 09:10:11.024966 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa0b8409-ab85-473f-96ff-127a34634ae9-run-httpd\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.032201 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.032285 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-scripts\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.032914 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.043643 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-config-data\") pod \"ceilometer-0\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.060019 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhvh2\" (UniqueName: \"kubernetes.io/projected/aa0b8409-ab85-473f-96ff-127a34634ae9-kube-api-access-vhvh2\") pod \"ceilometer-0\" (UID: 
\"aa0b8409-ab85-473f-96ff-127a34634ae9\") " pod="openstack/ceilometer-0" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.132134 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.279886 5129 scope.go:117] "RemoveContainer" containerID="af0fc9a09f53cf40a8651dde95fd578743aac92b1a21656d8fe8f0c95652b9fa" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.314568 5129 scope.go:117] "RemoveContainer" containerID="5fe99f618439dd8e7f7684ebb5bb4e826468a4333927629fe49685014597915d" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.365171 5129 scope.go:117] "RemoveContainer" containerID="2a687bcc0d95bdcae086f9da915c54b96a6a68555a1b8aa3f24b523d2ec09657" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.920026 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.932365 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.958566 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:10:11 crc kubenswrapper[5129]: I0314 09:10:11.980920 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:10:12 crc kubenswrapper[5129]: W0314 09:10:12.003982 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa0b8409_ab85_473f_96ff_127a34634ae9.slice/crio-e86465b73dbb071c5b129431362c0e0d69ce36292846f33f16dce1fc3285cd1a WatchSource:0}: Error finding container e86465b73dbb071c5b129431362c0e0d69ce36292846f33f16dce1fc3285cd1a: Status 404 returned error can't find the container with id e86465b73dbb071c5b129431362c0e0d69ce36292846f33f16dce1fc3285cd1a Mar 14 09:10:12 crc 
kubenswrapper[5129]: I0314 09:10:12.051554 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="707bb11e-41bb-4fc5-9481-d0b29dce3753" path="/var/lib/kubelet/pods/707bb11e-41bb-4fc5-9481-d0b29dce3753/volumes" Mar 14 09:10:12 crc kubenswrapper[5129]: I0314 09:10:12.737306 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa0b8409-ab85-473f-96ff-127a34634ae9","Type":"ContainerStarted","Data":"e38711b51796d5b787c5a8ffdb9f06c9fbb7103044af0c7a8700968ee51f9f35"} Mar 14 09:10:12 crc kubenswrapper[5129]: I0314 09:10:12.738118 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa0b8409-ab85-473f-96ff-127a34634ae9","Type":"ContainerStarted","Data":"e86465b73dbb071c5b129431362c0e0d69ce36292846f33f16dce1fc3285cd1a"} Mar 14 09:10:12 crc kubenswrapper[5129]: I0314 09:10:12.740432 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f1cf8b69-8f3c-4a1d-91da-12d2d8806285","Type":"ContainerStarted","Data":"47a0ce9cb13d1c71888688d356b86e9938b03beb4d1fe8c4458b869671832354"} Mar 14 09:10:12 crc kubenswrapper[5129]: I0314 09:10:12.740673 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerName="aodh-api" containerID="cri-o://98c5a627b38797caac479ca9ebb4dfd05409ac188a4f1d58b93f3432efa4b155" gracePeriod=30 Mar 14 09:10:12 crc kubenswrapper[5129]: I0314 09:10:12.740732 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerName="aodh-listener" containerID="cri-o://47a0ce9cb13d1c71888688d356b86e9938b03beb4d1fe8c4458b869671832354" gracePeriod=30 Mar 14 09:10:12 crc kubenswrapper[5129]: I0314 09:10:12.740848 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" 
podUID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerName="aodh-evaluator" containerID="cri-o://37b1ae2b8dc16fa899a3bfabdb1360a24ff8112ea41061fb93242eb0c05eeaf7" gracePeriod=30 Mar 14 09:10:12 crc kubenswrapper[5129]: I0314 09:10:12.740801 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerName="aodh-notifier" containerID="cri-o://4988b7dc5a626278027bef7f79187362250cc21b92f71ad6a258bfe9c44be91d" gracePeriod=30 Mar 14 09:10:12 crc kubenswrapper[5129]: I0314 09:10:12.751085 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 14 09:10:12 crc kubenswrapper[5129]: I0314 09:10:12.831752 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.303766418 podStartE2EDuration="7.831719793s" podCreationTimestamp="2026-03-14 09:10:05 +0000 UTC" firstStartedPulling="2026-03-14 09:10:06.848918588 +0000 UTC m=+7869.600833772" lastFinishedPulling="2026-03-14 09:10:11.376871963 +0000 UTC m=+7874.128787147" observedRunningTime="2026-03-14 09:10:12.784664206 +0000 UTC m=+7875.536579400" watchObservedRunningTime="2026-03-14 09:10:12.831719793 +0000 UTC m=+7875.583634977" Mar 14 09:10:13 crc kubenswrapper[5129]: I0314 09:10:13.754212 5129 generic.go:334] "Generic (PLEG): container finished" podID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerID="4988b7dc5a626278027bef7f79187362250cc21b92f71ad6a258bfe9c44be91d" exitCode=0 Mar 14 09:10:13 crc kubenswrapper[5129]: I0314 09:10:13.754976 5129 generic.go:334] "Generic (PLEG): container finished" podID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerID="37b1ae2b8dc16fa899a3bfabdb1360a24ff8112ea41061fb93242eb0c05eeaf7" exitCode=0 Mar 14 09:10:13 crc kubenswrapper[5129]: I0314 09:10:13.754990 5129 generic.go:334] "Generic (PLEG): container finished" podID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" 
containerID="98c5a627b38797caac479ca9ebb4dfd05409ac188a4f1d58b93f3432efa4b155" exitCode=0 Mar 14 09:10:13 crc kubenswrapper[5129]: I0314 09:10:13.754302 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f1cf8b69-8f3c-4a1d-91da-12d2d8806285","Type":"ContainerDied","Data":"4988b7dc5a626278027bef7f79187362250cc21b92f71ad6a258bfe9c44be91d"} Mar 14 09:10:13 crc kubenswrapper[5129]: I0314 09:10:13.755111 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f1cf8b69-8f3c-4a1d-91da-12d2d8806285","Type":"ContainerDied","Data":"37b1ae2b8dc16fa899a3bfabdb1360a24ff8112ea41061fb93242eb0c05eeaf7"} Mar 14 09:10:13 crc kubenswrapper[5129]: I0314 09:10:13.755134 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f1cf8b69-8f3c-4a1d-91da-12d2d8806285","Type":"ContainerDied","Data":"98c5a627b38797caac479ca9ebb4dfd05409ac188a4f1d58b93f3432efa4b155"} Mar 14 09:10:13 crc kubenswrapper[5129]: I0314 09:10:13.759949 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa0b8409-ab85-473f-96ff-127a34634ae9","Type":"ContainerStarted","Data":"e8485f22da847dd03986bf7d3ccb855bf06bcaffc3014457dc7b4ec346ffc709"} Mar 14 09:10:14 crc kubenswrapper[5129]: I0314 09:10:14.352899 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mbv2v"] Mar 14 09:10:14 crc kubenswrapper[5129]: I0314 09:10:14.355173 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mbv2v" Mar 14 09:10:14 crc kubenswrapper[5129]: I0314 09:10:14.367820 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mbv2v"] Mar 14 09:10:14 crc kubenswrapper[5129]: I0314 09:10:14.442798 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b338d21-24c0-4117-81a1-bf2bea78c719-utilities\") pod \"community-operators-mbv2v\" (UID: \"3b338d21-24c0-4117-81a1-bf2bea78c719\") " pod="openshift-marketplace/community-operators-mbv2v" Mar 14 09:10:14 crc kubenswrapper[5129]: I0314 09:10:14.443462 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b338d21-24c0-4117-81a1-bf2bea78c719-catalog-content\") pod \"community-operators-mbv2v\" (UID: \"3b338d21-24c0-4117-81a1-bf2bea78c719\") " pod="openshift-marketplace/community-operators-mbv2v" Mar 14 09:10:14 crc kubenswrapper[5129]: I0314 09:10:14.443616 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bpmd\" (UniqueName: \"kubernetes.io/projected/3b338d21-24c0-4117-81a1-bf2bea78c719-kube-api-access-2bpmd\") pod \"community-operators-mbv2v\" (UID: \"3b338d21-24c0-4117-81a1-bf2bea78c719\") " pod="openshift-marketplace/community-operators-mbv2v" Mar 14 09:10:14 crc kubenswrapper[5129]: I0314 09:10:14.545138 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b338d21-24c0-4117-81a1-bf2bea78c719-catalog-content\") pod \"community-operators-mbv2v\" (UID: \"3b338d21-24c0-4117-81a1-bf2bea78c719\") " pod="openshift-marketplace/community-operators-mbv2v" Mar 14 09:10:14 crc kubenswrapper[5129]: I0314 09:10:14.545290 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2bpmd\" (UniqueName: \"kubernetes.io/projected/3b338d21-24c0-4117-81a1-bf2bea78c719-kube-api-access-2bpmd\") pod \"community-operators-mbv2v\" (UID: \"3b338d21-24c0-4117-81a1-bf2bea78c719\") " pod="openshift-marketplace/community-operators-mbv2v" Mar 14 09:10:14 crc kubenswrapper[5129]: I0314 09:10:14.545422 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b338d21-24c0-4117-81a1-bf2bea78c719-utilities\") pod \"community-operators-mbv2v\" (UID: \"3b338d21-24c0-4117-81a1-bf2bea78c719\") " pod="openshift-marketplace/community-operators-mbv2v" Mar 14 09:10:14 crc kubenswrapper[5129]: I0314 09:10:14.546067 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b338d21-24c0-4117-81a1-bf2bea78c719-catalog-content\") pod \"community-operators-mbv2v\" (UID: \"3b338d21-24c0-4117-81a1-bf2bea78c719\") " pod="openshift-marketplace/community-operators-mbv2v" Mar 14 09:10:14 crc kubenswrapper[5129]: I0314 09:10:14.546140 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b338d21-24c0-4117-81a1-bf2bea78c719-utilities\") pod \"community-operators-mbv2v\" (UID: \"3b338d21-24c0-4117-81a1-bf2bea78c719\") " pod="openshift-marketplace/community-operators-mbv2v" Mar 14 09:10:14 crc kubenswrapper[5129]: I0314 09:10:14.575843 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bpmd\" (UniqueName: \"kubernetes.io/projected/3b338d21-24c0-4117-81a1-bf2bea78c719-kube-api-access-2bpmd\") pod \"community-operators-mbv2v\" (UID: \"3b338d21-24c0-4117-81a1-bf2bea78c719\") " pod="openshift-marketplace/community-operators-mbv2v" Mar 14 09:10:14 crc kubenswrapper[5129]: I0314 09:10:14.677613 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mbv2v" Mar 14 09:10:14 crc kubenswrapper[5129]: I0314 09:10:14.787791 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa0b8409-ab85-473f-96ff-127a34634ae9","Type":"ContainerStarted","Data":"127c8bd0e76212ae2fc832e6f28179e0e16af3c786015d96af29e3969062c121"} Mar 14 09:10:15 crc kubenswrapper[5129]: I0314 09:10:15.322860 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mbv2v"] Mar 14 09:10:15 crc kubenswrapper[5129]: I0314 09:10:15.798183 5129 generic.go:334] "Generic (PLEG): container finished" podID="3b338d21-24c0-4117-81a1-bf2bea78c719" containerID="34102e15249e993aecf05f13ccc9b6a88da76ba132031c5bac5f9ea6179e19c1" exitCode=0 Mar 14 09:10:15 crc kubenswrapper[5129]: I0314 09:10:15.798664 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbv2v" event={"ID":"3b338d21-24c0-4117-81a1-bf2bea78c719","Type":"ContainerDied","Data":"34102e15249e993aecf05f13ccc9b6a88da76ba132031c5bac5f9ea6179e19c1"} Mar 14 09:10:15 crc kubenswrapper[5129]: I0314 09:10:15.800526 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbv2v" event={"ID":"3b338d21-24c0-4117-81a1-bf2bea78c719","Type":"ContainerStarted","Data":"1eee0227135c66bf1818c2b2713de4183b22bbd8ee838232270a30685a64a252"} Mar 14 09:10:16 crc kubenswrapper[5129]: I0314 09:10:16.818457 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa0b8409-ab85-473f-96ff-127a34634ae9","Type":"ContainerStarted","Data":"5b3b7d4ae74df8f9a98df95a47a91da4d7b6dc86ffb6093ca9c06f4d79fb1d28"} Mar 14 09:10:16 crc kubenswrapper[5129]: I0314 09:10:16.819091 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa0b8409-ab85-473f-96ff-127a34634ae9" 
containerName="ceilometer-central-agent" containerID="cri-o://e38711b51796d5b787c5a8ffdb9f06c9fbb7103044af0c7a8700968ee51f9f35" gracePeriod=30 Mar 14 09:10:16 crc kubenswrapper[5129]: I0314 09:10:16.819208 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 09:10:16 crc kubenswrapper[5129]: I0314 09:10:16.819731 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerName="proxy-httpd" containerID="cri-o://5b3b7d4ae74df8f9a98df95a47a91da4d7b6dc86ffb6093ca9c06f4d79fb1d28" gracePeriod=30 Mar 14 09:10:16 crc kubenswrapper[5129]: I0314 09:10:16.819849 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerName="sg-core" containerID="cri-o://127c8bd0e76212ae2fc832e6f28179e0e16af3c786015d96af29e3969062c121" gracePeriod=30 Mar 14 09:10:16 crc kubenswrapper[5129]: I0314 09:10:16.819909 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerName="ceilometer-notification-agent" containerID="cri-o://e8485f22da847dd03986bf7d3ccb855bf06bcaffc3014457dc7b4ec346ffc709" gracePeriod=30 Mar 14 09:10:16 crc kubenswrapper[5129]: I0314 09:10:16.873971 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.01443141 podStartE2EDuration="6.873950046s" podCreationTimestamp="2026-03-14 09:10:10 +0000 UTC" firstStartedPulling="2026-03-14 09:10:12.015985506 +0000 UTC m=+7874.767900690" lastFinishedPulling="2026-03-14 09:10:15.875504142 +0000 UTC m=+7878.627419326" observedRunningTime="2026-03-14 09:10:16.867357608 +0000 UTC m=+7879.619272792" watchObservedRunningTime="2026-03-14 09:10:16.873950046 +0000 UTC m=+7879.625865230" Mar 14 09:10:17 crc kubenswrapper[5129]: 
I0314 09:10:17.834832 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbv2v" event={"ID":"3b338d21-24c0-4117-81a1-bf2bea78c719","Type":"ContainerStarted","Data":"fcb638014d5e0993c204f55a8741d43123d829e44b70b6bd7f60a0f7b731f44f"} Mar 14 09:10:17 crc kubenswrapper[5129]: I0314 09:10:17.842432 5129 generic.go:334] "Generic (PLEG): container finished" podID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerID="127c8bd0e76212ae2fc832e6f28179e0e16af3c786015d96af29e3969062c121" exitCode=2 Mar 14 09:10:17 crc kubenswrapper[5129]: I0314 09:10:17.842471 5129 generic.go:334] "Generic (PLEG): container finished" podID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerID="e8485f22da847dd03986bf7d3ccb855bf06bcaffc3014457dc7b4ec346ffc709" exitCode=0 Mar 14 09:10:17 crc kubenswrapper[5129]: I0314 09:10:17.842523 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa0b8409-ab85-473f-96ff-127a34634ae9","Type":"ContainerDied","Data":"127c8bd0e76212ae2fc832e6f28179e0e16af3c786015d96af29e3969062c121"} Mar 14 09:10:17 crc kubenswrapper[5129]: I0314 09:10:17.842589 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa0b8409-ab85-473f-96ff-127a34634ae9","Type":"ContainerDied","Data":"e8485f22da847dd03986bf7d3ccb855bf06bcaffc3014457dc7b4ec346ffc709"} Mar 14 09:10:18 crc kubenswrapper[5129]: I0314 09:10:18.853097 5129 generic.go:334] "Generic (PLEG): container finished" podID="3b338d21-24c0-4117-81a1-bf2bea78c719" containerID="fcb638014d5e0993c204f55a8741d43123d829e44b70b6bd7f60a0f7b731f44f" exitCode=0 Mar 14 09:10:18 crc kubenswrapper[5129]: I0314 09:10:18.853205 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbv2v" event={"ID":"3b338d21-24c0-4117-81a1-bf2bea78c719","Type":"ContainerDied","Data":"fcb638014d5e0993c204f55a8741d43123d829e44b70b6bd7f60a0f7b731f44f"} Mar 14 09:10:19 crc 
kubenswrapper[5129]: I0314 09:10:19.867502 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbv2v" event={"ID":"3b338d21-24c0-4117-81a1-bf2bea78c719","Type":"ContainerStarted","Data":"401e11a972e6c59fa3dda16fa1d5c227f66cfca5384bb5f038ffd929d7105d6e"} Mar 14 09:10:19 crc kubenswrapper[5129]: I0314 09:10:19.902498 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mbv2v" podStartSLOduration=2.426028711 podStartE2EDuration="5.902474251s" podCreationTimestamp="2026-03-14 09:10:14 +0000 UTC" firstStartedPulling="2026-03-14 09:10:15.800960339 +0000 UTC m=+7878.552875533" lastFinishedPulling="2026-03-14 09:10:19.277405899 +0000 UTC m=+7882.029321073" observedRunningTime="2026-03-14 09:10:19.888567854 +0000 UTC m=+7882.640483038" watchObservedRunningTime="2026-03-14 09:10:19.902474251 +0000 UTC m=+7882.654389435" Mar 14 09:10:21 crc kubenswrapper[5129]: I0314 09:10:21.036550 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:10:21 crc kubenswrapper[5129]: E0314 09:10:21.037388 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:10:22 crc kubenswrapper[5129]: I0314 09:10:22.908366 5129 generic.go:334] "Generic (PLEG): container finished" podID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerID="e38711b51796d5b787c5a8ffdb9f06c9fbb7103044af0c7a8700968ee51f9f35" exitCode=0 Mar 14 09:10:22 crc kubenswrapper[5129]: I0314 09:10:22.908436 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"aa0b8409-ab85-473f-96ff-127a34634ae9","Type":"ContainerDied","Data":"e38711b51796d5b787c5a8ffdb9f06c9fbb7103044af0c7a8700968ee51f9f35"} Mar 14 09:10:24 crc kubenswrapper[5129]: I0314 09:10:24.678309 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mbv2v" Mar 14 09:10:24 crc kubenswrapper[5129]: I0314 09:10:24.678381 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mbv2v" Mar 14 09:10:25 crc kubenswrapper[5129]: I0314 09:10:25.725526 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-mbv2v" podUID="3b338d21-24c0-4117-81a1-bf2bea78c719" containerName="registry-server" probeResult="failure" output=< Mar 14 09:10:25 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 09:10:25 crc kubenswrapper[5129]: > Mar 14 09:10:32 crc kubenswrapper[5129]: I0314 09:10:32.466385 5129 scope.go:117] "RemoveContainer" containerID="5bd98513322662e1f7e7fc825437c846359baf0cb15c38a39da6ba98b9e0960b" Mar 14 09:10:34 crc kubenswrapper[5129]: I0314 09:10:34.741032 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mbv2v" Mar 14 09:10:34 crc kubenswrapper[5129]: I0314 09:10:34.800260 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mbv2v" Mar 14 09:10:34 crc kubenswrapper[5129]: I0314 09:10:34.990391 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mbv2v"] Mar 14 09:10:35 crc kubenswrapper[5129]: I0314 09:10:35.035645 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:10:35 crc kubenswrapper[5129]: E0314 09:10:35.035854 5129 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:10:36 crc kubenswrapper[5129]: I0314 09:10:36.049279 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mbv2v" podUID="3b338d21-24c0-4117-81a1-bf2bea78c719" containerName="registry-server" containerID="cri-o://401e11a972e6c59fa3dda16fa1d5c227f66cfca5384bb5f038ffd929d7105d6e" gracePeriod=2 Mar 14 09:10:36 crc kubenswrapper[5129]: I0314 09:10:36.726037 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbv2v" Mar 14 09:10:36 crc kubenswrapper[5129]: I0314 09:10:36.857185 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b338d21-24c0-4117-81a1-bf2bea78c719-catalog-content\") pod \"3b338d21-24c0-4117-81a1-bf2bea78c719\" (UID: \"3b338d21-24c0-4117-81a1-bf2bea78c719\") " Mar 14 09:10:36 crc kubenswrapper[5129]: I0314 09:10:36.857289 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bpmd\" (UniqueName: \"kubernetes.io/projected/3b338d21-24c0-4117-81a1-bf2bea78c719-kube-api-access-2bpmd\") pod \"3b338d21-24c0-4117-81a1-bf2bea78c719\" (UID: \"3b338d21-24c0-4117-81a1-bf2bea78c719\") " Mar 14 09:10:36 crc kubenswrapper[5129]: I0314 09:10:36.857344 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b338d21-24c0-4117-81a1-bf2bea78c719-utilities\") pod \"3b338d21-24c0-4117-81a1-bf2bea78c719\" (UID: 
\"3b338d21-24c0-4117-81a1-bf2bea78c719\") " Mar 14 09:10:36 crc kubenswrapper[5129]: I0314 09:10:36.859380 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b338d21-24c0-4117-81a1-bf2bea78c719-utilities" (OuterVolumeSpecName: "utilities") pod "3b338d21-24c0-4117-81a1-bf2bea78c719" (UID: "3b338d21-24c0-4117-81a1-bf2bea78c719"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:10:36 crc kubenswrapper[5129]: I0314 09:10:36.867178 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b338d21-24c0-4117-81a1-bf2bea78c719-kube-api-access-2bpmd" (OuterVolumeSpecName: "kube-api-access-2bpmd") pod "3b338d21-24c0-4117-81a1-bf2bea78c719" (UID: "3b338d21-24c0-4117-81a1-bf2bea78c719"). InnerVolumeSpecName "kube-api-access-2bpmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:10:36 crc kubenswrapper[5129]: I0314 09:10:36.919339 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b338d21-24c0-4117-81a1-bf2bea78c719-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b338d21-24c0-4117-81a1-bf2bea78c719" (UID: "3b338d21-24c0-4117-81a1-bf2bea78c719"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:10:36 crc kubenswrapper[5129]: I0314 09:10:36.960743 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bpmd\" (UniqueName: \"kubernetes.io/projected/3b338d21-24c0-4117-81a1-bf2bea78c719-kube-api-access-2bpmd\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:36 crc kubenswrapper[5129]: I0314 09:10:36.960840 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b338d21-24c0-4117-81a1-bf2bea78c719-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:36 crc kubenswrapper[5129]: I0314 09:10:36.960855 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b338d21-24c0-4117-81a1-bf2bea78c719-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:37 crc kubenswrapper[5129]: I0314 09:10:37.064733 5129 generic.go:334] "Generic (PLEG): container finished" podID="3b338d21-24c0-4117-81a1-bf2bea78c719" containerID="401e11a972e6c59fa3dda16fa1d5c227f66cfca5384bb5f038ffd929d7105d6e" exitCode=0 Mar 14 09:10:37 crc kubenswrapper[5129]: I0314 09:10:37.064800 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbv2v" event={"ID":"3b338d21-24c0-4117-81a1-bf2bea78c719","Type":"ContainerDied","Data":"401e11a972e6c59fa3dda16fa1d5c227f66cfca5384bb5f038ffd929d7105d6e"} Mar 14 09:10:37 crc kubenswrapper[5129]: I0314 09:10:37.064853 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbv2v" event={"ID":"3b338d21-24c0-4117-81a1-bf2bea78c719","Type":"ContainerDied","Data":"1eee0227135c66bf1818c2b2713de4183b22bbd8ee838232270a30685a64a252"} Mar 14 09:10:37 crc kubenswrapper[5129]: I0314 09:10:37.064880 5129 scope.go:117] "RemoveContainer" containerID="401e11a972e6c59fa3dda16fa1d5c227f66cfca5384bb5f038ffd929d7105d6e" Mar 14 09:10:37 crc kubenswrapper[5129]: I0314 
09:10:37.064910 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbv2v" Mar 14 09:10:37 crc kubenswrapper[5129]: I0314 09:10:37.099176 5129 scope.go:117] "RemoveContainer" containerID="fcb638014d5e0993c204f55a8741d43123d829e44b70b6bd7f60a0f7b731f44f" Mar 14 09:10:37 crc kubenswrapper[5129]: I0314 09:10:37.108758 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mbv2v"] Mar 14 09:10:37 crc kubenswrapper[5129]: I0314 09:10:37.143403 5129 scope.go:117] "RemoveContainer" containerID="34102e15249e993aecf05f13ccc9b6a88da76ba132031c5bac5f9ea6179e19c1" Mar 14 09:10:37 crc kubenswrapper[5129]: I0314 09:10:37.148673 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mbv2v"] Mar 14 09:10:37 crc kubenswrapper[5129]: I0314 09:10:37.198724 5129 scope.go:117] "RemoveContainer" containerID="401e11a972e6c59fa3dda16fa1d5c227f66cfca5384bb5f038ffd929d7105d6e" Mar 14 09:10:37 crc kubenswrapper[5129]: E0314 09:10:37.199292 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"401e11a972e6c59fa3dda16fa1d5c227f66cfca5384bb5f038ffd929d7105d6e\": container with ID starting with 401e11a972e6c59fa3dda16fa1d5c227f66cfca5384bb5f038ffd929d7105d6e not found: ID does not exist" containerID="401e11a972e6c59fa3dda16fa1d5c227f66cfca5384bb5f038ffd929d7105d6e" Mar 14 09:10:37 crc kubenswrapper[5129]: I0314 09:10:37.199392 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"401e11a972e6c59fa3dda16fa1d5c227f66cfca5384bb5f038ffd929d7105d6e"} err="failed to get container status \"401e11a972e6c59fa3dda16fa1d5c227f66cfca5384bb5f038ffd929d7105d6e\": rpc error: code = NotFound desc = could not find container \"401e11a972e6c59fa3dda16fa1d5c227f66cfca5384bb5f038ffd929d7105d6e\": container with ID starting with 
401e11a972e6c59fa3dda16fa1d5c227f66cfca5384bb5f038ffd929d7105d6e not found: ID does not exist" Mar 14 09:10:37 crc kubenswrapper[5129]: I0314 09:10:37.199458 5129 scope.go:117] "RemoveContainer" containerID="fcb638014d5e0993c204f55a8741d43123d829e44b70b6bd7f60a0f7b731f44f" Mar 14 09:10:37 crc kubenswrapper[5129]: E0314 09:10:37.200144 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcb638014d5e0993c204f55a8741d43123d829e44b70b6bd7f60a0f7b731f44f\": container with ID starting with fcb638014d5e0993c204f55a8741d43123d829e44b70b6bd7f60a0f7b731f44f not found: ID does not exist" containerID="fcb638014d5e0993c204f55a8741d43123d829e44b70b6bd7f60a0f7b731f44f" Mar 14 09:10:37 crc kubenswrapper[5129]: I0314 09:10:37.200184 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb638014d5e0993c204f55a8741d43123d829e44b70b6bd7f60a0f7b731f44f"} err="failed to get container status \"fcb638014d5e0993c204f55a8741d43123d829e44b70b6bd7f60a0f7b731f44f\": rpc error: code = NotFound desc = could not find container \"fcb638014d5e0993c204f55a8741d43123d829e44b70b6bd7f60a0f7b731f44f\": container with ID starting with fcb638014d5e0993c204f55a8741d43123d829e44b70b6bd7f60a0f7b731f44f not found: ID does not exist" Mar 14 09:10:37 crc kubenswrapper[5129]: I0314 09:10:37.200209 5129 scope.go:117] "RemoveContainer" containerID="34102e15249e993aecf05f13ccc9b6a88da76ba132031c5bac5f9ea6179e19c1" Mar 14 09:10:37 crc kubenswrapper[5129]: E0314 09:10:37.200546 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34102e15249e993aecf05f13ccc9b6a88da76ba132031c5bac5f9ea6179e19c1\": container with ID starting with 34102e15249e993aecf05f13ccc9b6a88da76ba132031c5bac5f9ea6179e19c1 not found: ID does not exist" containerID="34102e15249e993aecf05f13ccc9b6a88da76ba132031c5bac5f9ea6179e19c1" Mar 14 09:10:37 crc 
kubenswrapper[5129]: I0314 09:10:37.200573 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34102e15249e993aecf05f13ccc9b6a88da76ba132031c5bac5f9ea6179e19c1"} err="failed to get container status \"34102e15249e993aecf05f13ccc9b6a88da76ba132031c5bac5f9ea6179e19c1\": rpc error: code = NotFound desc = could not find container \"34102e15249e993aecf05f13ccc9b6a88da76ba132031c5bac5f9ea6179e19c1\": container with ID starting with 34102e15249e993aecf05f13ccc9b6a88da76ba132031c5bac5f9ea6179e19c1 not found: ID does not exist" Mar 14 09:10:38 crc kubenswrapper[5129]: I0314 09:10:38.056338 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b338d21-24c0-4117-81a1-bf2bea78c719" path="/var/lib/kubelet/pods/3b338d21-24c0-4117-81a1-bf2bea78c719/volumes" Mar 14 09:10:41 crc kubenswrapper[5129]: I0314 09:10:41.140352 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 14 09:10:43 crc kubenswrapper[5129]: I0314 09:10:43.145566 5129 generic.go:334] "Generic (PLEG): container finished" podID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerID="47a0ce9cb13d1c71888688d356b86e9938b03beb4d1fe8c4458b869671832354" exitCode=137 Mar 14 09:10:43 crc kubenswrapper[5129]: I0314 09:10:43.145652 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f1cf8b69-8f3c-4a1d-91da-12d2d8806285","Type":"ContainerDied","Data":"47a0ce9cb13d1c71888688d356b86e9938b03beb4d1fe8c4458b869671832354"} Mar 14 09:10:43 crc kubenswrapper[5129]: I0314 09:10:43.253719 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 14 09:10:43 crc kubenswrapper[5129]: I0314 09:10:43.324793 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-combined-ca-bundle\") pod \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\" (UID: \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\") " Mar 14 09:10:43 crc kubenswrapper[5129]: I0314 09:10:43.324873 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-scripts\") pod \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\" (UID: \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\") " Mar 14 09:10:43 crc kubenswrapper[5129]: I0314 09:10:43.324979 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64g7d\" (UniqueName: \"kubernetes.io/projected/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-kube-api-access-64g7d\") pod \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\" (UID: \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\") " Mar 14 09:10:43 crc kubenswrapper[5129]: I0314 09:10:43.324998 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-config-data\") pod \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\" (UID: \"f1cf8b69-8f3c-4a1d-91da-12d2d8806285\") " Mar 14 09:10:43 crc kubenswrapper[5129]: I0314 09:10:43.340854 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-kube-api-access-64g7d" (OuterVolumeSpecName: "kube-api-access-64g7d") pod "f1cf8b69-8f3c-4a1d-91da-12d2d8806285" (UID: "f1cf8b69-8f3c-4a1d-91da-12d2d8806285"). InnerVolumeSpecName "kube-api-access-64g7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:10:43 crc kubenswrapper[5129]: I0314 09:10:43.345434 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-scripts" (OuterVolumeSpecName: "scripts") pod "f1cf8b69-8f3c-4a1d-91da-12d2d8806285" (UID: "f1cf8b69-8f3c-4a1d-91da-12d2d8806285"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:10:43 crc kubenswrapper[5129]: I0314 09:10:43.428944 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64g7d\" (UniqueName: \"kubernetes.io/projected/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-kube-api-access-64g7d\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:43 crc kubenswrapper[5129]: I0314 09:10:43.429024 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:43 crc kubenswrapper[5129]: I0314 09:10:43.457996 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-config-data" (OuterVolumeSpecName: "config-data") pod "f1cf8b69-8f3c-4a1d-91da-12d2d8806285" (UID: "f1cf8b69-8f3c-4a1d-91da-12d2d8806285"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:10:43 crc kubenswrapper[5129]: I0314 09:10:43.470724 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1cf8b69-8f3c-4a1d-91da-12d2d8806285" (UID: "f1cf8b69-8f3c-4a1d-91da-12d2d8806285"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:10:43 crc kubenswrapper[5129]: I0314 09:10:43.531918 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:43 crc kubenswrapper[5129]: I0314 09:10:43.531953 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1cf8b69-8f3c-4a1d-91da-12d2d8806285-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.169070 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f1cf8b69-8f3c-4a1d-91da-12d2d8806285","Type":"ContainerDied","Data":"687b5bf31a581460fb41cd58a7e8bbd5f38c82bcf13ec2fafc405a0d26d7cb09"} Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.169534 5129 scope.go:117] "RemoveContainer" containerID="47a0ce9cb13d1c71888688d356b86e9938b03beb4d1fe8c4458b869671832354" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.169143 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.236066 5129 scope.go:117] "RemoveContainer" containerID="4988b7dc5a626278027bef7f79187362250cc21b92f71ad6a258bfe9c44be91d" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.237297 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.250283 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.263432 5129 scope.go:117] "RemoveContainer" containerID="37b1ae2b8dc16fa899a3bfabdb1360a24ff8112ea41061fb93242eb0c05eeaf7" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.276938 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 14 09:10:44 crc kubenswrapper[5129]: E0314 09:10:44.277533 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerName="aodh-listener" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.277556 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerName="aodh-listener" Mar 14 09:10:44 crc kubenswrapper[5129]: E0314 09:10:44.277581 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b338d21-24c0-4117-81a1-bf2bea78c719" containerName="extract-utilities" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.277589 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b338d21-24c0-4117-81a1-bf2bea78c719" containerName="extract-utilities" Mar 14 09:10:44 crc kubenswrapper[5129]: E0314 09:10:44.277609 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b338d21-24c0-4117-81a1-bf2bea78c719" containerName="extract-content" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.277637 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b338d21-24c0-4117-81a1-bf2bea78c719" 
containerName="extract-content" Mar 14 09:10:44 crc kubenswrapper[5129]: E0314 09:10:44.277663 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerName="aodh-api" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.277673 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerName="aodh-api" Mar 14 09:10:44 crc kubenswrapper[5129]: E0314 09:10:44.277682 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerName="aodh-evaluator" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.277689 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerName="aodh-evaluator" Mar 14 09:10:44 crc kubenswrapper[5129]: E0314 09:10:44.277715 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerName="aodh-notifier" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.277725 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerName="aodh-notifier" Mar 14 09:10:44 crc kubenswrapper[5129]: E0314 09:10:44.277740 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b338d21-24c0-4117-81a1-bf2bea78c719" containerName="registry-server" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.277748 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b338d21-24c0-4117-81a1-bf2bea78c719" containerName="registry-server" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.277992 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerName="aodh-api" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.278013 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b338d21-24c0-4117-81a1-bf2bea78c719" containerName="registry-server" 
Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.278034 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerName="aodh-notifier" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.278044 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerName="aodh-evaluator" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.278064 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" containerName="aodh-listener" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.280015 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.284504 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.284538 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-2rb7k" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.284741 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.285041 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.285512 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.294014 5129 scope.go:117] "RemoveContainer" containerID="98c5a627b38797caac479ca9ebb4dfd05409ac188a4f1d58b93f3432efa4b155" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.295287 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.383578 5129 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vbcn\" (UniqueName: \"kubernetes.io/projected/7a72b7ce-3f7c-46f2-b282-8bf3268588ca-kube-api-access-7vbcn\") pod \"aodh-0\" (UID: \"7a72b7ce-3f7c-46f2-b282-8bf3268588ca\") " pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.383798 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a72b7ce-3f7c-46f2-b282-8bf3268588ca-combined-ca-bundle\") pod \"aodh-0\" (UID: \"7a72b7ce-3f7c-46f2-b282-8bf3268588ca\") " pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.383828 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a72b7ce-3f7c-46f2-b282-8bf3268588ca-config-data\") pod \"aodh-0\" (UID: \"7a72b7ce-3f7c-46f2-b282-8bf3268588ca\") " pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.383869 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a72b7ce-3f7c-46f2-b282-8bf3268588ca-internal-tls-certs\") pod \"aodh-0\" (UID: \"7a72b7ce-3f7c-46f2-b282-8bf3268588ca\") " pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.384022 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a72b7ce-3f7c-46f2-b282-8bf3268588ca-public-tls-certs\") pod \"aodh-0\" (UID: \"7a72b7ce-3f7c-46f2-b282-8bf3268588ca\") " pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.384113 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7a72b7ce-3f7c-46f2-b282-8bf3268588ca-scripts\") pod \"aodh-0\" (UID: \"7a72b7ce-3f7c-46f2-b282-8bf3268588ca\") " pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.486662 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vbcn\" (UniqueName: \"kubernetes.io/projected/7a72b7ce-3f7c-46f2-b282-8bf3268588ca-kube-api-access-7vbcn\") pod \"aodh-0\" (UID: \"7a72b7ce-3f7c-46f2-b282-8bf3268588ca\") " pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.486778 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a72b7ce-3f7c-46f2-b282-8bf3268588ca-config-data\") pod \"aodh-0\" (UID: \"7a72b7ce-3f7c-46f2-b282-8bf3268588ca\") " pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.486802 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a72b7ce-3f7c-46f2-b282-8bf3268588ca-combined-ca-bundle\") pod \"aodh-0\" (UID: \"7a72b7ce-3f7c-46f2-b282-8bf3268588ca\") " pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.486825 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a72b7ce-3f7c-46f2-b282-8bf3268588ca-internal-tls-certs\") pod \"aodh-0\" (UID: \"7a72b7ce-3f7c-46f2-b282-8bf3268588ca\") " pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.486874 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a72b7ce-3f7c-46f2-b282-8bf3268588ca-public-tls-certs\") pod \"aodh-0\" (UID: \"7a72b7ce-3f7c-46f2-b282-8bf3268588ca\") " pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.486906 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a72b7ce-3f7c-46f2-b282-8bf3268588ca-scripts\") pod \"aodh-0\" (UID: \"7a72b7ce-3f7c-46f2-b282-8bf3268588ca\") " pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.493947 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a72b7ce-3f7c-46f2-b282-8bf3268588ca-public-tls-certs\") pod \"aodh-0\" (UID: \"7a72b7ce-3f7c-46f2-b282-8bf3268588ca\") " pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.495520 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a72b7ce-3f7c-46f2-b282-8bf3268588ca-config-data\") pod \"aodh-0\" (UID: \"7a72b7ce-3f7c-46f2-b282-8bf3268588ca\") " pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.497129 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a72b7ce-3f7c-46f2-b282-8bf3268588ca-internal-tls-certs\") pod \"aodh-0\" (UID: \"7a72b7ce-3f7c-46f2-b282-8bf3268588ca\") " pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.499166 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a72b7ce-3f7c-46f2-b282-8bf3268588ca-scripts\") pod \"aodh-0\" (UID: \"7a72b7ce-3f7c-46f2-b282-8bf3268588ca\") " pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.499232 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a72b7ce-3f7c-46f2-b282-8bf3268588ca-combined-ca-bundle\") pod \"aodh-0\" (UID: \"7a72b7ce-3f7c-46f2-b282-8bf3268588ca\") " pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.505267 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7vbcn\" (UniqueName: \"kubernetes.io/projected/7a72b7ce-3f7c-46f2-b282-8bf3268588ca-kube-api-access-7vbcn\") pod \"aodh-0\" (UID: \"7a72b7ce-3f7c-46f2-b282-8bf3268588ca\") " pod="openstack/aodh-0" Mar 14 09:10:44 crc kubenswrapper[5129]: I0314 09:10:44.614721 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 14 09:10:45 crc kubenswrapper[5129]: W0314 09:10:45.136877 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a72b7ce_3f7c_46f2_b282_8bf3268588ca.slice/crio-f3a15706f0bf92fc8e755b967062115d6cb4e99d7658788477bfecbf4c2de87b WatchSource:0}: Error finding container f3a15706f0bf92fc8e755b967062115d6cb4e99d7658788477bfecbf4c2de87b: Status 404 returned error can't find the container with id f3a15706f0bf92fc8e755b967062115d6cb4e99d7658788477bfecbf4c2de87b Mar 14 09:10:45 crc kubenswrapper[5129]: I0314 09:10:45.137395 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 14 09:10:45 crc kubenswrapper[5129]: I0314 09:10:45.178092 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7a72b7ce-3f7c-46f2-b282-8bf3268588ca","Type":"ContainerStarted","Data":"f3a15706f0bf92fc8e755b967062115d6cb4e99d7658788477bfecbf4c2de87b"} Mar 14 09:10:46 crc kubenswrapper[5129]: I0314 09:10:46.055425 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1cf8b69-8f3c-4a1d-91da-12d2d8806285" path="/var/lib/kubelet/pods/f1cf8b69-8f3c-4a1d-91da-12d2d8806285/volumes" Mar 14 09:10:46 crc kubenswrapper[5129]: I0314 09:10:46.191876 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7a72b7ce-3f7c-46f2-b282-8bf3268588ca","Type":"ContainerStarted","Data":"10602cb7cbeb419aca520fabe07e5b8a13cc2af2859b08565ced3ececeed8e7b"} Mar 14 09:10:46 crc kubenswrapper[5129]: I0314 09:10:46.191926 
5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7a72b7ce-3f7c-46f2-b282-8bf3268588ca","Type":"ContainerStarted","Data":"35def3a72da7e16424c2741816e6d9ad9f2bfcc71e812e0e2e751645d89294b4"} Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.211072 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7a72b7ce-3f7c-46f2-b282-8bf3268588ca","Type":"ContainerStarted","Data":"8689e8c3ea93b0fe700b9f802ef747e0fa4104603ee73186fd096cc0e7059c8f"} Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.211846 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7a72b7ce-3f7c-46f2-b282-8bf3268588ca","Type":"ContainerStarted","Data":"ab3f90e47a2c4ceadda70e32ea737d35663d122f1475504abb3691aa76882819"} Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.218847 5129 generic.go:334] "Generic (PLEG): container finished" podID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerID="5b3b7d4ae74df8f9a98df95a47a91da4d7b6dc86ffb6093ca9c06f4d79fb1d28" exitCode=137 Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.218972 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa0b8409-ab85-473f-96ff-127a34634ae9","Type":"ContainerDied","Data":"5b3b7d4ae74df8f9a98df95a47a91da4d7b6dc86ffb6093ca9c06f4d79fb1d28"} Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.219046 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa0b8409-ab85-473f-96ff-127a34634ae9","Type":"ContainerDied","Data":"e86465b73dbb071c5b129431362c0e0d69ce36292846f33f16dce1fc3285cd1a"} Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.219152 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e86465b73dbb071c5b129431362c0e0d69ce36292846f33f16dce1fc3285cd1a" Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.242067 5129 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.695807822 podStartE2EDuration="3.242049553s" podCreationTimestamp="2026-03-14 09:10:44 +0000 UTC" firstStartedPulling="2026-03-14 09:10:45.139123866 +0000 UTC m=+7907.891039050" lastFinishedPulling="2026-03-14 09:10:46.685365597 +0000 UTC m=+7909.437280781" observedRunningTime="2026-03-14 09:10:47.240261925 +0000 UTC m=+7909.992177109" watchObservedRunningTime="2026-03-14 09:10:47.242049553 +0000 UTC m=+7909.993964737" Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.255761 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.351315 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-config-data\") pod \"aa0b8409-ab85-473f-96ff-127a34634ae9\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.351400 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-scripts\") pod \"aa0b8409-ab85-473f-96ff-127a34634ae9\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.351521 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-sg-core-conf-yaml\") pod \"aa0b8409-ab85-473f-96ff-127a34634ae9\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.351597 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa0b8409-ab85-473f-96ff-127a34634ae9-log-httpd\") pod \"aa0b8409-ab85-473f-96ff-127a34634ae9\" 
(UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.351638 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa0b8409-ab85-473f-96ff-127a34634ae9-run-httpd\") pod \"aa0b8409-ab85-473f-96ff-127a34634ae9\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.351693 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-combined-ca-bundle\") pod \"aa0b8409-ab85-473f-96ff-127a34634ae9\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.351712 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhvh2\" (UniqueName: \"kubernetes.io/projected/aa0b8409-ab85-473f-96ff-127a34634ae9-kube-api-access-vhvh2\") pod \"aa0b8409-ab85-473f-96ff-127a34634ae9\" (UID: \"aa0b8409-ab85-473f-96ff-127a34634ae9\") " Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.354152 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa0b8409-ab85-473f-96ff-127a34634ae9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aa0b8409-ab85-473f-96ff-127a34634ae9" (UID: "aa0b8409-ab85-473f-96ff-127a34634ae9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.356061 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa0b8409-ab85-473f-96ff-127a34634ae9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aa0b8409-ab85-473f-96ff-127a34634ae9" (UID: "aa0b8409-ab85-473f-96ff-127a34634ae9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.361463 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa0b8409-ab85-473f-96ff-127a34634ae9-kube-api-access-vhvh2" (OuterVolumeSpecName: "kube-api-access-vhvh2") pod "aa0b8409-ab85-473f-96ff-127a34634ae9" (UID: "aa0b8409-ab85-473f-96ff-127a34634ae9"). InnerVolumeSpecName "kube-api-access-vhvh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.401331 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-scripts" (OuterVolumeSpecName: "scripts") pod "aa0b8409-ab85-473f-96ff-127a34634ae9" (UID: "aa0b8409-ab85-473f-96ff-127a34634ae9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.412851 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aa0b8409-ab85-473f-96ff-127a34634ae9" (UID: "aa0b8409-ab85-473f-96ff-127a34634ae9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.453755 5129 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa0b8409-ab85-473f-96ff-127a34634ae9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.453784 5129 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa0b8409-ab85-473f-96ff-127a34634ae9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.453793 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhvh2\" (UniqueName: \"kubernetes.io/projected/aa0b8409-ab85-473f-96ff-127a34634ae9-kube-api-access-vhvh2\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.453804 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.453812 5129 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.480549 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa0b8409-ab85-473f-96ff-127a34634ae9" (UID: "aa0b8409-ab85-473f-96ff-127a34634ae9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.504216 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-config-data" (OuterVolumeSpecName: "config-data") pod "aa0b8409-ab85-473f-96ff-127a34634ae9" (UID: "aa0b8409-ab85-473f-96ff-127a34634ae9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.555499 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:47 crc kubenswrapper[5129]: I0314 09:10:47.555547 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0b8409-ab85-473f-96ff-127a34634ae9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.229145 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.264777 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.284205 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.295867 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:10:48 crc kubenswrapper[5129]: E0314 09:10:48.296930 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerName="sg-core" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.297043 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerName="sg-core" Mar 14 09:10:48 crc kubenswrapper[5129]: E0314 09:10:48.297162 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerName="ceilometer-notification-agent" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.297222 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerName="ceilometer-notification-agent" Mar 14 09:10:48 crc kubenswrapper[5129]: E0314 09:10:48.297303 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerName="ceilometer-central-agent" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.297367 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerName="ceilometer-central-agent" Mar 14 09:10:48 crc kubenswrapper[5129]: E0314 09:10:48.297437 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerName="proxy-httpd" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.297490 5129 
state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerName="proxy-httpd" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.297800 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerName="ceilometer-notification-agent" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.297889 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerName="sg-core" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.297956 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerName="proxy-httpd" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.298061 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa0b8409-ab85-473f-96ff-127a34634ae9" containerName="ceilometer-central-agent" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.300470 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.307485 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.320397 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.323306 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.377266 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.378467 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-log-httpd\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.378645 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-config-data\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.378780 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv6mc\" (UniqueName: \"kubernetes.io/projected/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-kube-api-access-vv6mc\") pod \"ceilometer-0\" (UID: 
\"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.378954 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-scripts\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.378979 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-run-httpd\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.379159 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.481077 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-config-data\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.481169 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv6mc\" (UniqueName: \"kubernetes.io/projected/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-kube-api-access-vv6mc\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.481225 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-scripts\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.481240 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-run-httpd\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.481310 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.481334 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.481354 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-log-httpd\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.481827 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-log-httpd\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " 
pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.482876 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-run-httpd\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.488145 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.488222 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-scripts\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.488494 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-config-data\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.489064 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.502217 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv6mc\" (UniqueName: 
\"kubernetes.io/projected/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-kube-api-access-vv6mc\") pod \"ceilometer-0\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " pod="openstack/ceilometer-0" Mar 14 09:10:48 crc kubenswrapper[5129]: I0314 09:10:48.634079 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 09:10:49 crc kubenswrapper[5129]: I0314 09:10:49.036675 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:10:49 crc kubenswrapper[5129]: E0314 09:10:49.037449 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:10:49 crc kubenswrapper[5129]: I0314 09:10:49.153049 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:10:49 crc kubenswrapper[5129]: I0314 09:10:49.239231 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933","Type":"ContainerStarted","Data":"3f2ffa64367cc7225a6ff5db77bba2897c93096cb8172c8c9ad7349bcc8d688c"} Mar 14 09:10:50 crc kubenswrapper[5129]: I0314 09:10:50.050773 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa0b8409-ab85-473f-96ff-127a34634ae9" path="/var/lib/kubelet/pods/aa0b8409-ab85-473f-96ff-127a34634ae9/volumes" Mar 14 09:10:50 crc kubenswrapper[5129]: I0314 09:10:50.253068 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933","Type":"ContainerStarted","Data":"bf8dec1235f56a4f032a3dd3965a64da6ef04bf1988e3affa498b8eaae6438f5"} Mar 14 09:10:50 crc kubenswrapper[5129]: I0314 09:10:50.253124 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933","Type":"ContainerStarted","Data":"d508bcb1a691d588c1cf488bdc8049536ef901934a9449fc866055f3998183d5"} Mar 14 09:10:51 crc kubenswrapper[5129]: I0314 09:10:51.270531 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933","Type":"ContainerStarted","Data":"2b148081e3612f9945701f546758b96390e6ca61d9986e1b4f9888608d2a30f4"} Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.134773 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ff58d8665-hqpjr"] Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.138430 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.141312 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.156301 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ff58d8665-hqpjr"] Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.324525 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6flv\" (UniqueName: \"kubernetes.io/projected/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-kube-api-access-h6flv\") pod \"dnsmasq-dns-6ff58d8665-hqpjr\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.324580 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-dns-svc\") pod \"dnsmasq-dns-6ff58d8665-hqpjr\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.324648 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-ovsdbserver-nb\") pod \"dnsmasq-dns-6ff58d8665-hqpjr\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.324703 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-openstack-cell1\") pod \"dnsmasq-dns-6ff58d8665-hqpjr\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " 
pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.324776 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-ovsdbserver-sb\") pod \"dnsmasq-dns-6ff58d8665-hqpjr\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.324810 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-config\") pod \"dnsmasq-dns-6ff58d8665-hqpjr\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.357667 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ff58d8665-hqpjr"] Mar 14 09:10:53 crc kubenswrapper[5129]: E0314 09:10:53.358471 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-h6flv openstack-cell1 ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" podUID="cdf50c4a-630f-4100-9ce2-cad7bb52be1d" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.379550 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-697d6cf577-vtrd8"] Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.381883 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.385251 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-networker" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.401701 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-697d6cf577-vtrd8"] Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.427477 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-ovsdbserver-nb\") pod \"dnsmasq-dns-6ff58d8665-hqpjr\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.427678 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-openstack-cell1\") pod \"dnsmasq-dns-6ff58d8665-hqpjr\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.427832 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-ovsdbserver-sb\") pod \"dnsmasq-dns-6ff58d8665-hqpjr\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.427893 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-config\") pod \"dnsmasq-dns-6ff58d8665-hqpjr\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 
09:10:53.427956 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6flv\" (UniqueName: \"kubernetes.io/projected/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-kube-api-access-h6flv\") pod \"dnsmasq-dns-6ff58d8665-hqpjr\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.428007 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-dns-svc\") pod \"dnsmasq-dns-6ff58d8665-hqpjr\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.429297 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-dns-svc\") pod \"dnsmasq-dns-6ff58d8665-hqpjr\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.429919 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-ovsdbserver-nb\") pod \"dnsmasq-dns-6ff58d8665-hqpjr\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.430799 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-openstack-cell1\") pod \"dnsmasq-dns-6ff58d8665-hqpjr\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.431078 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-ovsdbserver-sb\") pod \"dnsmasq-dns-6ff58d8665-hqpjr\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.431205 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-config\") pod \"dnsmasq-dns-6ff58d8665-hqpjr\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.456345 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6flv\" (UniqueName: \"kubernetes.io/projected/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-kube-api-access-h6flv\") pod \"dnsmasq-dns-6ff58d8665-hqpjr\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.529868 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-openstack-cell1\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.530204 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-openstack-networker\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.530236 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-c4nvn\" (UniqueName: \"kubernetes.io/projected/54691f31-25f9-49a0-81fe-b7517bd39506-kube-api-access-c4nvn\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.530291 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-config\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.530330 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-dns-svc\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.530778 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-ovsdbserver-nb\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.530854 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-ovsdbserver-sb\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.632875 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-dns-svc\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.632999 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-ovsdbserver-nb\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.633027 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-ovsdbserver-sb\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.633074 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-openstack-cell1\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.633105 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-openstack-networker\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.633133 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4nvn\" (UniqueName: 
\"kubernetes.io/projected/54691f31-25f9-49a0-81fe-b7517bd39506-kube-api-access-c4nvn\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.633175 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-config\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.634213 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-dns-svc\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.634725 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-openstack-cell1\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.635051 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-openstack-networker\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.635067 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-ovsdbserver-nb\") pod 
\"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.635083 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-config\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.635385 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-ovsdbserver-sb\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.665747 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4nvn\" (UniqueName: \"kubernetes.io/projected/54691f31-25f9-49a0-81fe-b7517bd39506-kube-api-access-c4nvn\") pod \"dnsmasq-dns-697d6cf577-vtrd8\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:53 crc kubenswrapper[5129]: I0314 09:10:53.701148 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.065591 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-697d6cf577-vtrd8"] Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.317695 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933","Type":"ContainerStarted","Data":"6e3d7b98a2727f19b4709cedaf73021445b50d3b40032de742d1f6ac5a4f8950"} Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.318274 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.327415 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.327756 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" event={"ID":"54691f31-25f9-49a0-81fe-b7517bd39506","Type":"ContainerStarted","Data":"eb5de89006f97f0dc3f14f5d7ab1cf05bb0cdb88a1e392035bb8321df09fc3e9"} Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.341629 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.359092 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.542853047 podStartE2EDuration="6.359069337s" podCreationTimestamp="2026-03-14 09:10:48 +0000 UTC" firstStartedPulling="2026-03-14 09:10:49.139834523 +0000 UTC m=+7911.891749707" lastFinishedPulling="2026-03-14 09:10:52.956050803 +0000 UTC m=+7915.707965997" observedRunningTime="2026-03-14 09:10:54.351692026 +0000 UTC m=+7917.103607210" watchObservedRunningTime="2026-03-14 09:10:54.359069337 +0000 UTC m=+7917.110984521" Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.451260 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-ovsdbserver-sb\") pod \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.451393 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-config\") pod \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.451590 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-openstack-cell1\") pod \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.452318 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-ovsdbserver-nb\") pod 
\"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.452131 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cdf50c4a-630f-4100-9ce2-cad7bb52be1d" (UID: "cdf50c4a-630f-4100-9ce2-cad7bb52be1d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.452225 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-config" (OuterVolumeSpecName: "config") pod "cdf50c4a-630f-4100-9ce2-cad7bb52be1d" (UID: "cdf50c4a-630f-4100-9ce2-cad7bb52be1d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.452298 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "cdf50c4a-630f-4100-9ce2-cad7bb52be1d" (UID: "cdf50c4a-630f-4100-9ce2-cad7bb52be1d"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.452833 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cdf50c4a-630f-4100-9ce2-cad7bb52be1d" (UID: "cdf50c4a-630f-4100-9ce2-cad7bb52be1d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.454904 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-dns-svc\") pod \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.455027 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6flv\" (UniqueName: \"kubernetes.io/projected/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-kube-api-access-h6flv\") pod \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\" (UID: \"cdf50c4a-630f-4100-9ce2-cad7bb52be1d\") " Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.455282 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cdf50c4a-630f-4100-9ce2-cad7bb52be1d" (UID: "cdf50c4a-630f-4100-9ce2-cad7bb52be1d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.458399 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.458433 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.458445 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.458456 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.458467 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.463682 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-kube-api-access-h6flv" (OuterVolumeSpecName: "kube-api-access-h6flv") pod "cdf50c4a-630f-4100-9ce2-cad7bb52be1d" (UID: "cdf50c4a-630f-4100-9ce2-cad7bb52be1d"). InnerVolumeSpecName "kube-api-access-h6flv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:10:54 crc kubenswrapper[5129]: I0314 09:10:54.560267 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6flv\" (UniqueName: \"kubernetes.io/projected/cdf50c4a-630f-4100-9ce2-cad7bb52be1d-kube-api-access-h6flv\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:55 crc kubenswrapper[5129]: I0314 09:10:55.338107 5129 generic.go:334] "Generic (PLEG): container finished" podID="54691f31-25f9-49a0-81fe-b7517bd39506" containerID="879201c794c2b974d0f77ee064f2526b83f6426fe12456ecfb8571fd8008f872" exitCode=0 Mar 14 09:10:55 crc kubenswrapper[5129]: I0314 09:10:55.338225 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" event={"ID":"54691f31-25f9-49a0-81fe-b7517bd39506","Type":"ContainerDied","Data":"879201c794c2b974d0f77ee064f2526b83f6426fe12456ecfb8571fd8008f872"} Mar 14 09:10:55 crc kubenswrapper[5129]: I0314 09:10:55.338470 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6ff58d8665-hqpjr" Mar 14 09:10:55 crc kubenswrapper[5129]: I0314 09:10:55.574547 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ff58d8665-hqpjr"] Mar 14 09:10:55 crc kubenswrapper[5129]: I0314 09:10:55.583075 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6ff58d8665-hqpjr"] Mar 14 09:10:56 crc kubenswrapper[5129]: I0314 09:10:56.050501 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdf50c4a-630f-4100-9ce2-cad7bb52be1d" path="/var/lib/kubelet/pods/cdf50c4a-630f-4100-9ce2-cad7bb52be1d/volumes" Mar 14 09:10:56 crc kubenswrapper[5129]: I0314 09:10:56.351540 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" event={"ID":"54691f31-25f9-49a0-81fe-b7517bd39506","Type":"ContainerStarted","Data":"38fd26f3ee9a5aed35a5367b913b454652e7b86647d0c664461d438a48d56891"} Mar 14 09:10:56 crc kubenswrapper[5129]: I0314 09:10:56.351748 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:10:56 crc kubenswrapper[5129]: I0314 09:10:56.380987 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" podStartSLOduration=3.380967145 podStartE2EDuration="3.380967145s" podCreationTimestamp="2026-03-14 09:10:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:10:56.373570914 +0000 UTC m=+7919.125486098" watchObservedRunningTime="2026-03-14 09:10:56.380967145 +0000 UTC m=+7919.132882329" Mar 14 09:11:03 crc kubenswrapper[5129]: I0314 09:11:03.036514 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:11:03 crc kubenswrapper[5129]: E0314 09:11:03.037227 5129 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:11:03 crc kubenswrapper[5129]: I0314 09:11:03.702840 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:11:03 crc kubenswrapper[5129]: I0314 09:11:03.770474 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bdb48dc75-jmswb"] Mar 14 09:11:03 crc kubenswrapper[5129]: I0314 09:11:03.770823 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" podUID="40bec282-5c17-4ec5-ac05-5608aedeae3e" containerName="dnsmasq-dns" containerID="cri-o://aa7fe9b1cc9b7709bd9a0a3f1bbacc855c5fb097b3a050f4f193432b2e67111a" gracePeriod=10 Mar 14 09:11:03 crc kubenswrapper[5129]: I0314 09:11:03.950431 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cc78fbb6c-kc5wx"] Mar 14 09:11:03 crc kubenswrapper[5129]: I0314 09:11:03.952759 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.018788 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cc78fbb6c-kc5wx"] Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.080725 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-openstack-networker\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.080790 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsm54\" (UniqueName: \"kubernetes.io/projected/4ac6df17-a845-4a30-8fc6-582fca777ab7-kube-api-access-wsm54\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.080819 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-config\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.080867 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.080890 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-openstack-cell1\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.080921 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.080937 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-dns-svc\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.183257 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-openstack-networker\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.183318 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsm54\" (UniqueName: \"kubernetes.io/projected/4ac6df17-a845-4a30-8fc6-582fca777ab7-kube-api-access-wsm54\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.183348 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-config\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.183395 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.183417 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-openstack-cell1\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.183460 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.183476 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-dns-svc\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.184252 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-dns-svc\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.184790 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-openstack-networker\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.185530 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-config\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.185999 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.186677 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.186735 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-openstack-cell1\") pod 
\"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.243127 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsm54\" (UniqueName: \"kubernetes.io/projected/4ac6df17-a845-4a30-8fc6-582fca777ab7-kube-api-access-wsm54\") pod \"dnsmasq-dns-5cc78fbb6c-kc5wx\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.278936 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.460982 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.461222 5129 generic.go:334] "Generic (PLEG): container finished" podID="40bec282-5c17-4ec5-ac05-5608aedeae3e" containerID="aa7fe9b1cc9b7709bd9a0a3f1bbacc855c5fb097b3a050f4f193432b2e67111a" exitCode=0 Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.461247 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" event={"ID":"40bec282-5c17-4ec5-ac05-5608aedeae3e","Type":"ContainerDied","Data":"aa7fe9b1cc9b7709bd9a0a3f1bbacc855c5fb097b3a050f4f193432b2e67111a"} Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.461398 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" event={"ID":"40bec282-5c17-4ec5-ac05-5608aedeae3e","Type":"ContainerDied","Data":"de219b65384cc862895ba55ddaca50a24b1ccda11bef1390261ac25cb0729d5e"} Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.461415 5129 scope.go:117] "RemoveContainer" containerID="aa7fe9b1cc9b7709bd9a0a3f1bbacc855c5fb097b3a050f4f193432b2e67111a" Mar 14 09:11:04 crc kubenswrapper[5129]: 
I0314 09:11:04.484249 5129 scope.go:117] "RemoveContainer" containerID="956a74101a590321be54cab6284ccecb66ec4a415bdcaa9db841d48e94fe85af" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.536314 5129 scope.go:117] "RemoveContainer" containerID="aa7fe9b1cc9b7709bd9a0a3f1bbacc855c5fb097b3a050f4f193432b2e67111a" Mar 14 09:11:04 crc kubenswrapper[5129]: E0314 09:11:04.536825 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa7fe9b1cc9b7709bd9a0a3f1bbacc855c5fb097b3a050f4f193432b2e67111a\": container with ID starting with aa7fe9b1cc9b7709bd9a0a3f1bbacc855c5fb097b3a050f4f193432b2e67111a not found: ID does not exist" containerID="aa7fe9b1cc9b7709bd9a0a3f1bbacc855c5fb097b3a050f4f193432b2e67111a" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.536866 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa7fe9b1cc9b7709bd9a0a3f1bbacc855c5fb097b3a050f4f193432b2e67111a"} err="failed to get container status \"aa7fe9b1cc9b7709bd9a0a3f1bbacc855c5fb097b3a050f4f193432b2e67111a\": rpc error: code = NotFound desc = could not find container \"aa7fe9b1cc9b7709bd9a0a3f1bbacc855c5fb097b3a050f4f193432b2e67111a\": container with ID starting with aa7fe9b1cc9b7709bd9a0a3f1bbacc855c5fb097b3a050f4f193432b2e67111a not found: ID does not exist" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.536890 5129 scope.go:117] "RemoveContainer" containerID="956a74101a590321be54cab6284ccecb66ec4a415bdcaa9db841d48e94fe85af" Mar 14 09:11:04 crc kubenswrapper[5129]: E0314 09:11:04.537309 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"956a74101a590321be54cab6284ccecb66ec4a415bdcaa9db841d48e94fe85af\": container with ID starting with 956a74101a590321be54cab6284ccecb66ec4a415bdcaa9db841d48e94fe85af not found: ID does not exist" 
containerID="956a74101a590321be54cab6284ccecb66ec4a415bdcaa9db841d48e94fe85af" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.537340 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"956a74101a590321be54cab6284ccecb66ec4a415bdcaa9db841d48e94fe85af"} err="failed to get container status \"956a74101a590321be54cab6284ccecb66ec4a415bdcaa9db841d48e94fe85af\": rpc error: code = NotFound desc = could not find container \"956a74101a590321be54cab6284ccecb66ec4a415bdcaa9db841d48e94fe85af\": container with ID starting with 956a74101a590321be54cab6284ccecb66ec4a415bdcaa9db841d48e94fe85af not found: ID does not exist" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.590091 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx5xr\" (UniqueName: \"kubernetes.io/projected/40bec282-5c17-4ec5-ac05-5608aedeae3e-kube-api-access-hx5xr\") pod \"40bec282-5c17-4ec5-ac05-5608aedeae3e\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.590182 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-config\") pod \"40bec282-5c17-4ec5-ac05-5608aedeae3e\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.590288 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-ovsdbserver-nb\") pod \"40bec282-5c17-4ec5-ac05-5608aedeae3e\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.590316 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-ovsdbserver-sb\") pod 
\"40bec282-5c17-4ec5-ac05-5608aedeae3e\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.590639 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-dns-svc\") pod \"40bec282-5c17-4ec5-ac05-5608aedeae3e\" (UID: \"40bec282-5c17-4ec5-ac05-5608aedeae3e\") " Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.598040 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40bec282-5c17-4ec5-ac05-5608aedeae3e-kube-api-access-hx5xr" (OuterVolumeSpecName: "kube-api-access-hx5xr") pod "40bec282-5c17-4ec5-ac05-5608aedeae3e" (UID: "40bec282-5c17-4ec5-ac05-5608aedeae3e"). InnerVolumeSpecName "kube-api-access-hx5xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.645267 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "40bec282-5c17-4ec5-ac05-5608aedeae3e" (UID: "40bec282-5c17-4ec5-ac05-5608aedeae3e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.650863 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-config" (OuterVolumeSpecName: "config") pod "40bec282-5c17-4ec5-ac05-5608aedeae3e" (UID: "40bec282-5c17-4ec5-ac05-5608aedeae3e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.681039 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "40bec282-5c17-4ec5-ac05-5608aedeae3e" (UID: "40bec282-5c17-4ec5-ac05-5608aedeae3e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.686165 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "40bec282-5c17-4ec5-ac05-5608aedeae3e" (UID: "40bec282-5c17-4ec5-ac05-5608aedeae3e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.693298 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx5xr\" (UniqueName: \"kubernetes.io/projected/40bec282-5c17-4ec5-ac05-5608aedeae3e-kube-api-access-hx5xr\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.693348 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.693362 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.693373 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 
09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.693385 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40bec282-5c17-4ec5-ac05-5608aedeae3e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:04 crc kubenswrapper[5129]: I0314 09:11:04.891222 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cc78fbb6c-kc5wx"] Mar 14 09:11:05 crc kubenswrapper[5129]: I0314 09:11:05.065166 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-7rjql"] Mar 14 09:11:05 crc kubenswrapper[5129]: I0314 09:11:05.089475 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-7rjql"] Mar 14 09:11:05 crc kubenswrapper[5129]: I0314 09:11:05.480289 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" event={"ID":"4ac6df17-a845-4a30-8fc6-582fca777ab7","Type":"ContainerDied","Data":"cf6ca59f3e7c48c814bc4b676e1beef623e256ab379ed9b3b0056fbfd3d69404"} Mar 14 09:11:05 crc kubenswrapper[5129]: I0314 09:11:05.480168 5129 generic.go:334] "Generic (PLEG): container finished" podID="4ac6df17-a845-4a30-8fc6-582fca777ab7" containerID="cf6ca59f3e7c48c814bc4b676e1beef623e256ab379ed9b3b0056fbfd3d69404" exitCode=0 Mar 14 09:11:05 crc kubenswrapper[5129]: I0314 09:11:05.480519 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" event={"ID":"4ac6df17-a845-4a30-8fc6-582fca777ab7","Type":"ContainerStarted","Data":"455c893965527a815ba1bdd9c8849530069079736780be79cf07113085589a30"} Mar 14 09:11:05 crc kubenswrapper[5129]: I0314 09:11:05.483684 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bdb48dc75-jmswb" Mar 14 09:11:05 crc kubenswrapper[5129]: I0314 09:11:05.583615 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bdb48dc75-jmswb"] Mar 14 09:11:05 crc kubenswrapper[5129]: I0314 09:11:05.593396 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bdb48dc75-jmswb"] Mar 14 09:11:06 crc kubenswrapper[5129]: I0314 09:11:06.108448 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40bec282-5c17-4ec5-ac05-5608aedeae3e" path="/var/lib/kubelet/pods/40bec282-5c17-4ec5-ac05-5608aedeae3e/volumes" Mar 14 09:11:06 crc kubenswrapper[5129]: I0314 09:11:06.109131 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c093d97-b823-4772-a6ac-e324f8c64188" path="/var/lib/kubelet/pods/7c093d97-b823-4772-a6ac-e324f8c64188/volumes" Mar 14 09:11:06 crc kubenswrapper[5129]: I0314 09:11:06.110429 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-276d-account-create-update-v486s"] Mar 14 09:11:06 crc kubenswrapper[5129]: I0314 09:11:06.110459 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-276d-account-create-update-v486s"] Mar 14 09:11:06 crc kubenswrapper[5129]: I0314 09:11:06.497109 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" event={"ID":"4ac6df17-a845-4a30-8fc6-582fca777ab7","Type":"ContainerStarted","Data":"b5bae7706b70d592e45920284fc14e199c867141da750d3797c647681d60677b"} Mar 14 09:11:06 crc kubenswrapper[5129]: I0314 09:11:06.497262 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:06 crc kubenswrapper[5129]: I0314 09:11:06.521204 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" podStartSLOduration=3.521184367 podStartE2EDuration="3.521184367s" 
podCreationTimestamp="2026-03-14 09:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:11:06.518032442 +0000 UTC m=+7929.269947626" watchObservedRunningTime="2026-03-14 09:11:06.521184367 +0000 UTC m=+7929.273099551" Mar 14 09:11:08 crc kubenswrapper[5129]: I0314 09:11:08.059401 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b61da916-484d-43c8-98a1-b7d5846218cf" path="/var/lib/kubelet/pods/b61da916-484d-43c8-98a1-b7d5846218cf/volumes" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.280796 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.363364 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-697d6cf577-vtrd8"] Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.364644 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" podUID="54691f31-25f9-49a0-81fe-b7517bd39506" containerName="dnsmasq-dns" containerID="cri-o://38fd26f3ee9a5aed35a5367b913b454652e7b86647d0c664461d438a48d56891" gracePeriod=10 Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.598949 5129 generic.go:334] "Generic (PLEG): container finished" podID="54691f31-25f9-49a0-81fe-b7517bd39506" containerID="38fd26f3ee9a5aed35a5367b913b454652e7b86647d0c664461d438a48d56891" exitCode=0 Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.599022 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" event={"ID":"54691f31-25f9-49a0-81fe-b7517bd39506","Type":"ContainerDied","Data":"38fd26f3ee9a5aed35a5367b913b454652e7b86647d0c664461d438a48d56891"} Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.775718 5129 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-7bc4b66b87-lpkvz"] Mar 14 09:11:14 crc kubenswrapper[5129]: E0314 09:11:14.776341 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40bec282-5c17-4ec5-ac05-5608aedeae3e" containerName="init" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.776363 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="40bec282-5c17-4ec5-ac05-5608aedeae3e" containerName="init" Mar 14 09:11:14 crc kubenswrapper[5129]: E0314 09:11:14.776392 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40bec282-5c17-4ec5-ac05-5608aedeae3e" containerName="dnsmasq-dns" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.776400 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="40bec282-5c17-4ec5-ac05-5608aedeae3e" containerName="dnsmasq-dns" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.776633 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="40bec282-5c17-4ec5-ac05-5608aedeae3e" containerName="dnsmasq-dns" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.778152 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.787270 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bc4b66b87-lpkvz"] Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.852226 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxp7b\" (UniqueName: \"kubernetes.io/projected/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-kube-api-access-vxp7b\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.852686 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-openstack-cell1\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.852820 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-config\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.852891 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-ovsdbserver-sb\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.852985 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-dns-svc\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.853012 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-ovsdbserver-nb\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.853036 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-openstack-networker\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.955040 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-openstack-networker\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.955150 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxp7b\" (UniqueName: \"kubernetes.io/projected/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-kube-api-access-vxp7b\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.955231 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-openstack-cell1\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.955272 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-config\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.955323 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-ovsdbserver-sb\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.955422 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-dns-svc\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.955450 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-ovsdbserver-nb\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.955970 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-openstack-networker\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.956311 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-ovsdbserver-nb\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.956781 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-dns-svc\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.956948 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-openstack-cell1\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.957110 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-ovsdbserver-sb\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.957107 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-config\") 
pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.981037 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxp7b\" (UniqueName: \"kubernetes.io/projected/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-kube-api-access-vxp7b\") pod \"dnsmasq-dns-7bc4b66b87-lpkvz\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:14 crc kubenswrapper[5129]: I0314 09:11:14.983976 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.056871 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-ovsdbserver-sb\") pod \"54691f31-25f9-49a0-81fe-b7517bd39506\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.056950 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4nvn\" (UniqueName: \"kubernetes.io/projected/54691f31-25f9-49a0-81fe-b7517bd39506-kube-api-access-c4nvn\") pod \"54691f31-25f9-49a0-81fe-b7517bd39506\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.056999 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-ovsdbserver-nb\") pod \"54691f31-25f9-49a0-81fe-b7517bd39506\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.057171 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-networker\" (UniqueName: 
\"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-openstack-networker\") pod \"54691f31-25f9-49a0-81fe-b7517bd39506\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.057190 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-openstack-cell1\") pod \"54691f31-25f9-49a0-81fe-b7517bd39506\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.057210 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-dns-svc\") pod \"54691f31-25f9-49a0-81fe-b7517bd39506\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.057250 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-config\") pod \"54691f31-25f9-49a0-81fe-b7517bd39506\" (UID: \"54691f31-25f9-49a0-81fe-b7517bd39506\") " Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.062029 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54691f31-25f9-49a0-81fe-b7517bd39506-kube-api-access-c4nvn" (OuterVolumeSpecName: "kube-api-access-c4nvn") pod "54691f31-25f9-49a0-81fe-b7517bd39506" (UID: "54691f31-25f9-49a0-81fe-b7517bd39506"). InnerVolumeSpecName "kube-api-access-c4nvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.117254 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.119824 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-openstack-networker" (OuterVolumeSpecName: "openstack-networker") pod "54691f31-25f9-49a0-81fe-b7517bd39506" (UID: "54691f31-25f9-49a0-81fe-b7517bd39506"). InnerVolumeSpecName "openstack-networker". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.122429 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-config" (OuterVolumeSpecName: "config") pod "54691f31-25f9-49a0-81fe-b7517bd39506" (UID: "54691f31-25f9-49a0-81fe-b7517bd39506"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.127476 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "54691f31-25f9-49a0-81fe-b7517bd39506" (UID: "54691f31-25f9-49a0-81fe-b7517bd39506"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.132420 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "54691f31-25f9-49a0-81fe-b7517bd39506" (UID: "54691f31-25f9-49a0-81fe-b7517bd39506"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.134377 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "54691f31-25f9-49a0-81fe-b7517bd39506" (UID: "54691f31-25f9-49a0-81fe-b7517bd39506"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.138685 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "54691f31-25f9-49a0-81fe-b7517bd39506" (UID: "54691f31-25f9-49a0-81fe-b7517bd39506"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.166086 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.166123 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.166135 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.166147 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-config\") on node \"crc\" DevicePath \"\"" Mar 
14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.166156 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.166164 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4nvn\" (UniqueName: \"kubernetes.io/projected/54691f31-25f9-49a0-81fe-b7517bd39506-kube-api-access-c4nvn\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.166175 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54691f31-25f9-49a0-81fe-b7517bd39506-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.615886 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" event={"ID":"54691f31-25f9-49a0-81fe-b7517bd39506","Type":"ContainerDied","Data":"eb5de89006f97f0dc3f14f5d7ab1cf05bb0cdb88a1e392035bb8321df09fc3e9"} Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.615993 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-697d6cf577-vtrd8" Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.616359 5129 scope.go:117] "RemoveContainer" containerID="38fd26f3ee9a5aed35a5367b913b454652e7b86647d0c664461d438a48d56891" Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.634854 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bc4b66b87-lpkvz"] Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.657493 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-697d6cf577-vtrd8"] Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.663739 5129 scope.go:117] "RemoveContainer" containerID="879201c794c2b974d0f77ee064f2526b83f6426fe12456ecfb8571fd8008f872" Mar 14 09:11:15 crc kubenswrapper[5129]: W0314 09:11:15.669891 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52f5f08c_6373_4ac4_8fd9_278bb200b1ef.slice/crio-dd6e6a0244dfacf1495215e3c0a6d987426454575814991f5492564561c5ff48 WatchSource:0}: Error finding container dd6e6a0244dfacf1495215e3c0a6d987426454575814991f5492564561c5ff48: Status 404 returned error can't find the container with id dd6e6a0244dfacf1495215e3c0a6d987426454575814991f5492564561c5ff48 Mar 14 09:11:15 crc kubenswrapper[5129]: I0314 09:11:15.670045 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-697d6cf577-vtrd8"] Mar 14 09:11:16 crc kubenswrapper[5129]: I0314 09:11:16.040247 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:11:16 crc kubenswrapper[5129]: E0314 09:11:16.041936 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:11:16 crc kubenswrapper[5129]: I0314 09:11:16.056402 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54691f31-25f9-49a0-81fe-b7517bd39506" path="/var/lib/kubelet/pods/54691f31-25f9-49a0-81fe-b7517bd39506/volumes" Mar 14 09:11:16 crc kubenswrapper[5129]: I0314 09:11:16.631418 5129 generic.go:334] "Generic (PLEG): container finished" podID="52f5f08c-6373-4ac4-8fd9-278bb200b1ef" containerID="3e953d7615c6c835cb5b26484f6ee8e9e3aef2506bbdd2dd58068567eb332220" exitCode=0 Mar 14 09:11:16 crc kubenswrapper[5129]: I0314 09:11:16.632501 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" event={"ID":"52f5f08c-6373-4ac4-8fd9-278bb200b1ef","Type":"ContainerDied","Data":"3e953d7615c6c835cb5b26484f6ee8e9e3aef2506bbdd2dd58068567eb332220"} Mar 14 09:11:16 crc kubenswrapper[5129]: I0314 09:11:16.632529 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" event={"ID":"52f5f08c-6373-4ac4-8fd9-278bb200b1ef","Type":"ContainerStarted","Data":"dd6e6a0244dfacf1495215e3c0a6d987426454575814991f5492564561c5ff48"} Mar 14 09:11:17 crc kubenswrapper[5129]: I0314 09:11:17.647659 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" event={"ID":"52f5f08c-6373-4ac4-8fd9-278bb200b1ef","Type":"ContainerStarted","Data":"6d0016b4051a055bef1eedd7c861f429722b0d3c7a3ef2d21f8dd9c7bc64dd03"} Mar 14 09:11:17 crc kubenswrapper[5129]: I0314 09:11:17.647944 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:17 crc kubenswrapper[5129]: I0314 09:11:17.680227 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" podStartSLOduration=3.680197719 podStartE2EDuration="3.680197719s" podCreationTimestamp="2026-03-14 09:11:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:11:17.666672112 +0000 UTC m=+7940.418587316" watchObservedRunningTime="2026-03-14 09:11:17.680197719 +0000 UTC m=+7940.432112903" Mar 14 09:11:18 crc kubenswrapper[5129]: I0314 09:11:18.645536 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 14 09:11:22 crc kubenswrapper[5129]: I0314 09:11:22.288528 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 09:11:22 crc kubenswrapper[5129]: I0314 09:11:22.289383 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="318cba12-018f-4218-9fb6-6e3f1c5e2970" containerName="kube-state-metrics" containerID="cri-o://0861d3de50b790da5f4a9a807f4108900a0f99e60667fb8a6f3bb3ca2babb3f4" gracePeriod=30 Mar 14 09:11:22 crc kubenswrapper[5129]: I0314 09:11:22.708325 5129 generic.go:334] "Generic (PLEG): container finished" podID="318cba12-018f-4218-9fb6-6e3f1c5e2970" containerID="0861d3de50b790da5f4a9a807f4108900a0f99e60667fb8a6f3bb3ca2babb3f4" exitCode=2 Mar 14 09:11:22 crc kubenswrapper[5129]: I0314 09:11:22.708403 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"318cba12-018f-4218-9fb6-6e3f1c5e2970","Type":"ContainerDied","Data":"0861d3de50b790da5f4a9a807f4108900a0f99e60667fb8a6f3bb3ca2babb3f4"} Mar 14 09:11:22 crc kubenswrapper[5129]: I0314 09:11:22.708712 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"318cba12-018f-4218-9fb6-6e3f1c5e2970","Type":"ContainerDied","Data":"e307039545896866f50faa1a01f6cffa3348bc3cf1e13c1729d2d42b2a6497c9"} Mar 14 09:11:22 crc 
kubenswrapper[5129]: I0314 09:11:22.708739 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e307039545896866f50faa1a01f6cffa3348bc3cf1e13c1729d2d42b2a6497c9" Mar 14 09:11:22 crc kubenswrapper[5129]: I0314 09:11:22.821299 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 09:11:22 crc kubenswrapper[5129]: I0314 09:11:22.947638 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvtxv\" (UniqueName: \"kubernetes.io/projected/318cba12-018f-4218-9fb6-6e3f1c5e2970-kube-api-access-hvtxv\") pod \"318cba12-018f-4218-9fb6-6e3f1c5e2970\" (UID: \"318cba12-018f-4218-9fb6-6e3f1c5e2970\") " Mar 14 09:11:22 crc kubenswrapper[5129]: I0314 09:11:22.958150 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/318cba12-018f-4218-9fb6-6e3f1c5e2970-kube-api-access-hvtxv" (OuterVolumeSpecName: "kube-api-access-hvtxv") pod "318cba12-018f-4218-9fb6-6e3f1c5e2970" (UID: "318cba12-018f-4218-9fb6-6e3f1c5e2970"). InnerVolumeSpecName "kube-api-access-hvtxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.049942 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvtxv\" (UniqueName: \"kubernetes.io/projected/318cba12-018f-4218-9fb6-6e3f1c5e2970-kube-api-access-hvtxv\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.717333 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.762781 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.792707 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.800456 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 09:11:23 crc kubenswrapper[5129]: E0314 09:11:23.801221 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54691f31-25f9-49a0-81fe-b7517bd39506" containerName="dnsmasq-dns" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.801248 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="54691f31-25f9-49a0-81fe-b7517bd39506" containerName="dnsmasq-dns" Mar 14 09:11:23 crc kubenswrapper[5129]: E0314 09:11:23.801299 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54691f31-25f9-49a0-81fe-b7517bd39506" containerName="init" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.801310 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="54691f31-25f9-49a0-81fe-b7517bd39506" containerName="init" Mar 14 09:11:23 crc kubenswrapper[5129]: E0314 09:11:23.801333 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318cba12-018f-4218-9fb6-6e3f1c5e2970" containerName="kube-state-metrics" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.801341 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="318cba12-018f-4218-9fb6-6e3f1c5e2970" containerName="kube-state-metrics" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.801625 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="318cba12-018f-4218-9fb6-6e3f1c5e2970" containerName="kube-state-metrics" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.801680 5129 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="54691f31-25f9-49a0-81fe-b7517bd39506" containerName="dnsmasq-dns" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.802864 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.806446 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.809905 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.831270 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.869521 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f66a75c1-a404-4f9f-9017-3bb734aee917-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f66a75c1-a404-4f9f-9017-3bb734aee917\") " pod="openstack/kube-state-metrics-0" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.869808 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr9tf\" (UniqueName: \"kubernetes.io/projected/f66a75c1-a404-4f9f-9017-3bb734aee917-kube-api-access-dr9tf\") pod \"kube-state-metrics-0\" (UID: \"f66a75c1-a404-4f9f-9017-3bb734aee917\") " pod="openstack/kube-state-metrics-0" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.869893 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66a75c1-a404-4f9f-9017-3bb734aee917-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f66a75c1-a404-4f9f-9017-3bb734aee917\") " 
pod="openstack/kube-state-metrics-0" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.870036 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f66a75c1-a404-4f9f-9017-3bb734aee917-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f66a75c1-a404-4f9f-9017-3bb734aee917\") " pod="openstack/kube-state-metrics-0" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.972363 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f66a75c1-a404-4f9f-9017-3bb734aee917-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f66a75c1-a404-4f9f-9017-3bb734aee917\") " pod="openstack/kube-state-metrics-0" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.972460 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr9tf\" (UniqueName: \"kubernetes.io/projected/f66a75c1-a404-4f9f-9017-3bb734aee917-kube-api-access-dr9tf\") pod \"kube-state-metrics-0\" (UID: \"f66a75c1-a404-4f9f-9017-3bb734aee917\") " pod="openstack/kube-state-metrics-0" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.972491 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66a75c1-a404-4f9f-9017-3bb734aee917-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f66a75c1-a404-4f9f-9017-3bb734aee917\") " pod="openstack/kube-state-metrics-0" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.972529 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f66a75c1-a404-4f9f-9017-3bb734aee917-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f66a75c1-a404-4f9f-9017-3bb734aee917\") " 
pod="openstack/kube-state-metrics-0" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.979110 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f66a75c1-a404-4f9f-9017-3bb734aee917-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f66a75c1-a404-4f9f-9017-3bb734aee917\") " pod="openstack/kube-state-metrics-0" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.979321 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f66a75c1-a404-4f9f-9017-3bb734aee917-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f66a75c1-a404-4f9f-9017-3bb734aee917\") " pod="openstack/kube-state-metrics-0" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.979865 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66a75c1-a404-4f9f-9017-3bb734aee917-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f66a75c1-a404-4f9f-9017-3bb734aee917\") " pod="openstack/kube-state-metrics-0" Mar 14 09:11:23 crc kubenswrapper[5129]: I0314 09:11:23.990079 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr9tf\" (UniqueName: \"kubernetes.io/projected/f66a75c1-a404-4f9f-9017-3bb734aee917-kube-api-access-dr9tf\") pod \"kube-state-metrics-0\" (UID: \"f66a75c1-a404-4f9f-9017-3bb734aee917\") " pod="openstack/kube-state-metrics-0" Mar 14 09:11:24 crc kubenswrapper[5129]: I0314 09:11:24.053233 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="318cba12-018f-4218-9fb6-6e3f1c5e2970" path="/var/lib/kubelet/pods/318cba12-018f-4218-9fb6-6e3f1c5e2970/volumes" Mar 14 09:11:24 crc kubenswrapper[5129]: I0314 09:11:24.136712 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 09:11:24 crc kubenswrapper[5129]: I0314 09:11:24.381409 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:11:24 crc kubenswrapper[5129]: I0314 09:11:24.382290 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerName="ceilometer-central-agent" containerID="cri-o://d508bcb1a691d588c1cf488bdc8049536ef901934a9449fc866055f3998183d5" gracePeriod=30 Mar 14 09:11:24 crc kubenswrapper[5129]: I0314 09:11:24.382353 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerName="proxy-httpd" containerID="cri-o://6e3d7b98a2727f19b4709cedaf73021445b50d3b40032de742d1f6ac5a4f8950" gracePeriod=30 Mar 14 09:11:24 crc kubenswrapper[5129]: I0314 09:11:24.382408 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerName="sg-core" containerID="cri-o://2b148081e3612f9945701f546758b96390e6ca61d9986e1b4f9888608d2a30f4" gracePeriod=30 Mar 14 09:11:24 crc kubenswrapper[5129]: I0314 09:11:24.382447 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerName="ceilometer-notification-agent" containerID="cri-o://bf8dec1235f56a4f032a3dd3965a64da6ef04bf1988e3affa498b8eaae6438f5" gracePeriod=30 Mar 14 09:11:24 crc kubenswrapper[5129]: I0314 09:11:24.683891 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 09:11:24 crc kubenswrapper[5129]: W0314 09:11:24.687183 5129 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf66a75c1_a404_4f9f_9017_3bb734aee917.slice/crio-21924ca8cabc71fa29aa60952d54f3ad957ba5aea19479f741edc1e3ce9ba814 WatchSource:0}: Error finding container 21924ca8cabc71fa29aa60952d54f3ad957ba5aea19479f741edc1e3ce9ba814: Status 404 returned error can't find the container with id 21924ca8cabc71fa29aa60952d54f3ad957ba5aea19479f741edc1e3ce9ba814 Mar 14 09:11:24 crc kubenswrapper[5129]: I0314 09:11:24.690557 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:11:24 crc kubenswrapper[5129]: I0314 09:11:24.732020 5129 generic.go:334] "Generic (PLEG): container finished" podID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerID="6e3d7b98a2727f19b4709cedaf73021445b50d3b40032de742d1f6ac5a4f8950" exitCode=0 Mar 14 09:11:24 crc kubenswrapper[5129]: I0314 09:11:24.732661 5129 generic.go:334] "Generic (PLEG): container finished" podID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerID="2b148081e3612f9945701f546758b96390e6ca61d9986e1b4f9888608d2a30f4" exitCode=2 Mar 14 09:11:24 crc kubenswrapper[5129]: I0314 09:11:24.732088 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933","Type":"ContainerDied","Data":"6e3d7b98a2727f19b4709cedaf73021445b50d3b40032de742d1f6ac5a4f8950"} Mar 14 09:11:24 crc kubenswrapper[5129]: I0314 09:11:24.732782 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933","Type":"ContainerDied","Data":"2b148081e3612f9945701f546758b96390e6ca61d9986e1b4f9888608d2a30f4"} Mar 14 09:11:24 crc kubenswrapper[5129]: I0314 09:11:24.734961 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f66a75c1-a404-4f9f-9017-3bb734aee917","Type":"ContainerStarted","Data":"21924ca8cabc71fa29aa60952d54f3ad957ba5aea19479f741edc1e3ce9ba814"} Mar 14 
09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.118901 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.228884 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cc78fbb6c-kc5wx"] Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.229269 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" podUID="4ac6df17-a845-4a30-8fc6-582fca777ab7" containerName="dnsmasq-dns" containerID="cri-o://b5bae7706b70d592e45920284fc14e199c867141da750d3797c647681d60677b" gracePeriod=10 Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.690702 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.716844 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-dns-svc\") pod \"4ac6df17-a845-4a30-8fc6-582fca777ab7\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.716943 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-ovsdbserver-nb\") pod \"4ac6df17-a845-4a30-8fc6-582fca777ab7\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.716966 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-config\") pod \"4ac6df17-a845-4a30-8fc6-582fca777ab7\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.717010 5129 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsm54\" (UniqueName: \"kubernetes.io/projected/4ac6df17-a845-4a30-8fc6-582fca777ab7-kube-api-access-wsm54\") pod \"4ac6df17-a845-4a30-8fc6-582fca777ab7\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.717129 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-openstack-cell1\") pod \"4ac6df17-a845-4a30-8fc6-582fca777ab7\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.717149 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-openstack-networker\") pod \"4ac6df17-a845-4a30-8fc6-582fca777ab7\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.717236 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-ovsdbserver-sb\") pod \"4ac6df17-a845-4a30-8fc6-582fca777ab7\" (UID: \"4ac6df17-a845-4a30-8fc6-582fca777ab7\") " Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.729778 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ac6df17-a845-4a30-8fc6-582fca777ab7-kube-api-access-wsm54" (OuterVolumeSpecName: "kube-api-access-wsm54") pod "4ac6df17-a845-4a30-8fc6-582fca777ab7" (UID: "4ac6df17-a845-4a30-8fc6-582fca777ab7"). InnerVolumeSpecName "kube-api-access-wsm54". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.759490 5129 generic.go:334] "Generic (PLEG): container finished" podID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerID="d508bcb1a691d588c1cf488bdc8049536ef901934a9449fc866055f3998183d5" exitCode=0 Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.759569 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933","Type":"ContainerDied","Data":"d508bcb1a691d588c1cf488bdc8049536ef901934a9449fc866055f3998183d5"} Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.762279 5129 generic.go:334] "Generic (PLEG): container finished" podID="4ac6df17-a845-4a30-8fc6-582fca777ab7" containerID="b5bae7706b70d592e45920284fc14e199c867141da750d3797c647681d60677b" exitCode=0 Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.762323 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" event={"ID":"4ac6df17-a845-4a30-8fc6-582fca777ab7","Type":"ContainerDied","Data":"b5bae7706b70d592e45920284fc14e199c867141da750d3797c647681d60677b"} Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.762374 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.762382 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc78fbb6c-kc5wx" event={"ID":"4ac6df17-a845-4a30-8fc6-582fca777ab7","Type":"ContainerDied","Data":"455c893965527a815ba1bdd9c8849530069079736780be79cf07113085589a30"} Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.762406 5129 scope.go:117] "RemoveContainer" containerID="b5bae7706b70d592e45920284fc14e199c867141da750d3797c647681d60677b" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.764701 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f66a75c1-a404-4f9f-9017-3bb734aee917","Type":"ContainerStarted","Data":"7864a107180b1be64a6e66222390898e6f08982269c9c3ea45aee7167a67a651"} Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.764855 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.775259 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-config" (OuterVolumeSpecName: "config") pod "4ac6df17-a845-4a30-8fc6-582fca777ab7" (UID: "4ac6df17-a845-4a30-8fc6-582fca777ab7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.778934 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4ac6df17-a845-4a30-8fc6-582fca777ab7" (UID: "4ac6df17-a845-4a30-8fc6-582fca777ab7"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.793460 5129 scope.go:117] "RemoveContainer" containerID="cf6ca59f3e7c48c814bc4b676e1beef623e256ab379ed9b3b0056fbfd3d69404" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.795386 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-openstack-networker" (OuterVolumeSpecName: "openstack-networker") pod "4ac6df17-a845-4a30-8fc6-582fca777ab7" (UID: "4ac6df17-a845-4a30-8fc6-582fca777ab7"). InnerVolumeSpecName "openstack-networker". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.811537 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4ac6df17-a845-4a30-8fc6-582fca777ab7" (UID: "4ac6df17-a845-4a30-8fc6-582fca777ab7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.815699 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4ac6df17-a845-4a30-8fc6-582fca777ab7" (UID: "4ac6df17-a845-4a30-8fc6-582fca777ab7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.820505 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.820546 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.820564 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.820578 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.820590 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.820622 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsm54\" (UniqueName: \"kubernetes.io/projected/4ac6df17-a845-4a30-8fc6-582fca777ab7-kube-api-access-wsm54\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.820903 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "4ac6df17-a845-4a30-8fc6-582fca777ab7" (UID: 
"4ac6df17-a845-4a30-8fc6-582fca777ab7"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.833041 5129 scope.go:117] "RemoveContainer" containerID="b5bae7706b70d592e45920284fc14e199c867141da750d3797c647681d60677b" Mar 14 09:11:25 crc kubenswrapper[5129]: E0314 09:11:25.833666 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5bae7706b70d592e45920284fc14e199c867141da750d3797c647681d60677b\": container with ID starting with b5bae7706b70d592e45920284fc14e199c867141da750d3797c647681d60677b not found: ID does not exist" containerID="b5bae7706b70d592e45920284fc14e199c867141da750d3797c647681d60677b" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.833753 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5bae7706b70d592e45920284fc14e199c867141da750d3797c647681d60677b"} err="failed to get container status \"b5bae7706b70d592e45920284fc14e199c867141da750d3797c647681d60677b\": rpc error: code = NotFound desc = could not find container \"b5bae7706b70d592e45920284fc14e199c867141da750d3797c647681d60677b\": container with ID starting with b5bae7706b70d592e45920284fc14e199c867141da750d3797c647681d60677b not found: ID does not exist" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.833803 5129 scope.go:117] "RemoveContainer" containerID="cf6ca59f3e7c48c814bc4b676e1beef623e256ab379ed9b3b0056fbfd3d69404" Mar 14 09:11:25 crc kubenswrapper[5129]: E0314 09:11:25.834225 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf6ca59f3e7c48c814bc4b676e1beef623e256ab379ed9b3b0056fbfd3d69404\": container with ID starting with cf6ca59f3e7c48c814bc4b676e1beef623e256ab379ed9b3b0056fbfd3d69404 not found: ID does not exist" 
containerID="cf6ca59f3e7c48c814bc4b676e1beef623e256ab379ed9b3b0056fbfd3d69404" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.834267 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf6ca59f3e7c48c814bc4b676e1beef623e256ab379ed9b3b0056fbfd3d69404"} err="failed to get container status \"cf6ca59f3e7c48c814bc4b676e1beef623e256ab379ed9b3b0056fbfd3d69404\": rpc error: code = NotFound desc = could not find container \"cf6ca59f3e7c48c814bc4b676e1beef623e256ab379ed9b3b0056fbfd3d69404\": container with ID starting with cf6ca59f3e7c48c814bc4b676e1beef623e256ab379ed9b3b0056fbfd3d69404 not found: ID does not exist" Mar 14 09:11:25 crc kubenswrapper[5129]: I0314 09:11:25.923283 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/4ac6df17-a845-4a30-8fc6-582fca777ab7-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.096039 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.62698135 podStartE2EDuration="3.095999538s" podCreationTimestamp="2026-03-14 09:11:23 +0000 UTC" firstStartedPulling="2026-03-14 09:11:24.690373283 +0000 UTC m=+7947.442288467" lastFinishedPulling="2026-03-14 09:11:25.159391451 +0000 UTC m=+7947.911306655" observedRunningTime="2026-03-14 09:11:25.795621827 +0000 UTC m=+7948.547537011" watchObservedRunningTime="2026-03-14 09:11:26.095999538 +0000 UTC m=+7948.847914722" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.104019 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cc78fbb6c-kc5wx"] Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.116091 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cc78fbb6c-kc5wx"] Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.583360 5129 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.636994 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-config-data\") pod \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.637037 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv6mc\" (UniqueName: \"kubernetes.io/projected/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-kube-api-access-vv6mc\") pod \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.637169 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-run-httpd\") pod \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.637217 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-scripts\") pod \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.637351 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-log-httpd\") pod \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.637403 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-combined-ca-bundle\") pod \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.637430 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-sg-core-conf-yaml\") pod \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\" (UID: \"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933\") " Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.638270 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" (UID: "bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.638393 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" (UID: "bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.646039 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-scripts" (OuterVolumeSpecName: "scripts") pod "bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" (UID: "bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.646331 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-kube-api-access-vv6mc" (OuterVolumeSpecName: "kube-api-access-vv6mc") pod "bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" (UID: "bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933"). InnerVolumeSpecName "kube-api-access-vv6mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.671514 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" (UID: "bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.745555 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv6mc\" (UniqueName: \"kubernetes.io/projected/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-kube-api-access-vv6mc\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.745597 5129 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.745622 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.745631 5129 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-log-httpd\") on node \"crc\" 
DevicePath \"\"" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.745643 5129 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.745654 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" (UID: "bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.762847 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-config-data" (OuterVolumeSpecName: "config-data") pod "bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" (UID: "bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.783264 5129 generic.go:334] "Generic (PLEG): container finished" podID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerID="bf8dec1235f56a4f032a3dd3965a64da6ef04bf1988e3affa498b8eaae6438f5" exitCode=0 Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.783338 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.783373 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933","Type":"ContainerDied","Data":"bf8dec1235f56a4f032a3dd3965a64da6ef04bf1988e3affa498b8eaae6438f5"} Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.783502 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933","Type":"ContainerDied","Data":"3f2ffa64367cc7225a6ff5db77bba2897c93096cb8172c8c9ad7349bcc8d688c"} Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.783541 5129 scope.go:117] "RemoveContainer" containerID="6e3d7b98a2727f19b4709cedaf73021445b50d3b40032de742d1f6ac5a4f8950" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.831221 5129 scope.go:117] "RemoveContainer" containerID="2b148081e3612f9945701f546758b96390e6ca61d9986e1b4f9888608d2a30f4" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.832280 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.848500 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.848545 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.853641 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.871067 5129 scope.go:117] "RemoveContainer" 
containerID="bf8dec1235f56a4f032a3dd3965a64da6ef04bf1988e3affa498b8eaae6438f5" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.872535 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:11:26 crc kubenswrapper[5129]: E0314 09:11:26.872964 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ac6df17-a845-4a30-8fc6-582fca777ab7" containerName="dnsmasq-dns" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.872976 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac6df17-a845-4a30-8fc6-582fca777ab7" containerName="dnsmasq-dns" Mar 14 09:11:26 crc kubenswrapper[5129]: E0314 09:11:26.873000 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerName="proxy-httpd" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.873006 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerName="proxy-httpd" Mar 14 09:11:26 crc kubenswrapper[5129]: E0314 09:11:26.873026 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerName="ceilometer-notification-agent" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.873032 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerName="ceilometer-notification-agent" Mar 14 09:11:26 crc kubenswrapper[5129]: E0314 09:11:26.873050 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ac6df17-a845-4a30-8fc6-582fca777ab7" containerName="init" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.873056 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac6df17-a845-4a30-8fc6-582fca777ab7" containerName="init" Mar 14 09:11:26 crc kubenswrapper[5129]: E0314 09:11:26.873068 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" 
containerName="ceilometer-central-agent" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.873074 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerName="ceilometer-central-agent" Mar 14 09:11:26 crc kubenswrapper[5129]: E0314 09:11:26.873087 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerName="sg-core" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.873093 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerName="sg-core" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.873269 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ac6df17-a845-4a30-8fc6-582fca777ab7" containerName="dnsmasq-dns" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.873284 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerName="sg-core" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.873298 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerName="ceilometer-central-agent" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.873306 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerName="proxy-httpd" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.873321 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" containerName="ceilometer-notification-agent" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.885012 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.885292 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.887639 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.888139 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.891102 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.898110 5129 scope.go:117] "RemoveContainer" containerID="d508bcb1a691d588c1cf488bdc8049536ef901934a9449fc866055f3998183d5" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.929436 5129 scope.go:117] "RemoveContainer" containerID="6e3d7b98a2727f19b4709cedaf73021445b50d3b40032de742d1f6ac5a4f8950" Mar 14 09:11:26 crc kubenswrapper[5129]: E0314 09:11:26.930084 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e3d7b98a2727f19b4709cedaf73021445b50d3b40032de742d1f6ac5a4f8950\": container with ID starting with 6e3d7b98a2727f19b4709cedaf73021445b50d3b40032de742d1f6ac5a4f8950 not found: ID does not exist" containerID="6e3d7b98a2727f19b4709cedaf73021445b50d3b40032de742d1f6ac5a4f8950" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.930122 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3d7b98a2727f19b4709cedaf73021445b50d3b40032de742d1f6ac5a4f8950"} err="failed to get container status \"6e3d7b98a2727f19b4709cedaf73021445b50d3b40032de742d1f6ac5a4f8950\": rpc error: code = NotFound desc = could not find container \"6e3d7b98a2727f19b4709cedaf73021445b50d3b40032de742d1f6ac5a4f8950\": container with ID starting with 6e3d7b98a2727f19b4709cedaf73021445b50d3b40032de742d1f6ac5a4f8950 not found: ID does not exist" Mar 14 09:11:26 crc 
kubenswrapper[5129]: I0314 09:11:26.930145 5129 scope.go:117] "RemoveContainer" containerID="2b148081e3612f9945701f546758b96390e6ca61d9986e1b4f9888608d2a30f4" Mar 14 09:11:26 crc kubenswrapper[5129]: E0314 09:11:26.930532 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b148081e3612f9945701f546758b96390e6ca61d9986e1b4f9888608d2a30f4\": container with ID starting with 2b148081e3612f9945701f546758b96390e6ca61d9986e1b4f9888608d2a30f4 not found: ID does not exist" containerID="2b148081e3612f9945701f546758b96390e6ca61d9986e1b4f9888608d2a30f4" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.930584 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b148081e3612f9945701f546758b96390e6ca61d9986e1b4f9888608d2a30f4"} err="failed to get container status \"2b148081e3612f9945701f546758b96390e6ca61d9986e1b4f9888608d2a30f4\": rpc error: code = NotFound desc = could not find container \"2b148081e3612f9945701f546758b96390e6ca61d9986e1b4f9888608d2a30f4\": container with ID starting with 2b148081e3612f9945701f546758b96390e6ca61d9986e1b4f9888608d2a30f4 not found: ID does not exist" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.930645 5129 scope.go:117] "RemoveContainer" containerID="bf8dec1235f56a4f032a3dd3965a64da6ef04bf1988e3affa498b8eaae6438f5" Mar 14 09:11:26 crc kubenswrapper[5129]: E0314 09:11:26.931028 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf8dec1235f56a4f032a3dd3965a64da6ef04bf1988e3affa498b8eaae6438f5\": container with ID starting with bf8dec1235f56a4f032a3dd3965a64da6ef04bf1988e3affa498b8eaae6438f5 not found: ID does not exist" containerID="bf8dec1235f56a4f032a3dd3965a64da6ef04bf1988e3affa498b8eaae6438f5" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.931081 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bf8dec1235f56a4f032a3dd3965a64da6ef04bf1988e3affa498b8eaae6438f5"} err="failed to get container status \"bf8dec1235f56a4f032a3dd3965a64da6ef04bf1988e3affa498b8eaae6438f5\": rpc error: code = NotFound desc = could not find container \"bf8dec1235f56a4f032a3dd3965a64da6ef04bf1988e3affa498b8eaae6438f5\": container with ID starting with bf8dec1235f56a4f032a3dd3965a64da6ef04bf1988e3affa498b8eaae6438f5 not found: ID does not exist" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.931115 5129 scope.go:117] "RemoveContainer" containerID="d508bcb1a691d588c1cf488bdc8049536ef901934a9449fc866055f3998183d5" Mar 14 09:11:26 crc kubenswrapper[5129]: E0314 09:11:26.931447 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d508bcb1a691d588c1cf488bdc8049536ef901934a9449fc866055f3998183d5\": container with ID starting with d508bcb1a691d588c1cf488bdc8049536ef901934a9449fc866055f3998183d5 not found: ID does not exist" containerID="d508bcb1a691d588c1cf488bdc8049536ef901934a9449fc866055f3998183d5" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.931493 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d508bcb1a691d588c1cf488bdc8049536ef901934a9449fc866055f3998183d5"} err="failed to get container status \"d508bcb1a691d588c1cf488bdc8049536ef901934a9449fc866055f3998183d5\": rpc error: code = NotFound desc = could not find container \"d508bcb1a691d588c1cf488bdc8049536ef901934a9449fc866055f3998183d5\": container with ID starting with d508bcb1a691d588c1cf488bdc8049536ef901934a9449fc866055f3998183d5 not found: ID does not exist" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.950835 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c530f52-0988-468f-95d9-c45c3550a14c-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.950934 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c530f52-0988-468f-95d9-c45c3550a14c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.950971 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c530f52-0988-468f-95d9-c45c3550a14c-log-httpd\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.950998 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c530f52-0988-468f-95d9-c45c3550a14c-config-data\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.951018 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c530f52-0988-468f-95d9-c45c3550a14c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.951161 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c530f52-0988-468f-95d9-c45c3550a14c-run-httpd\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 
09:11:26.951555 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q8s5\" (UniqueName: \"kubernetes.io/projected/1c530f52-0988-468f-95d9-c45c3550a14c-kube-api-access-2q8s5\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:26 crc kubenswrapper[5129]: I0314 09:11:26.951967 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c530f52-0988-468f-95d9-c45c3550a14c-scripts\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:27 crc kubenswrapper[5129]: I0314 09:11:27.053686 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c530f52-0988-468f-95d9-c45c3550a14c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:27 crc kubenswrapper[5129]: I0314 09:11:27.053751 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c530f52-0988-468f-95d9-c45c3550a14c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:27 crc kubenswrapper[5129]: I0314 09:11:27.053773 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c530f52-0988-468f-95d9-c45c3550a14c-log-httpd\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:27 crc kubenswrapper[5129]: I0314 09:11:27.053801 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1c530f52-0988-468f-95d9-c45c3550a14c-config-data\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:27 crc kubenswrapper[5129]: I0314 09:11:27.053823 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c530f52-0988-468f-95d9-c45c3550a14c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:27 crc kubenswrapper[5129]: I0314 09:11:27.053852 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c530f52-0988-468f-95d9-c45c3550a14c-run-httpd\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:27 crc kubenswrapper[5129]: I0314 09:11:27.053932 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q8s5\" (UniqueName: \"kubernetes.io/projected/1c530f52-0988-468f-95d9-c45c3550a14c-kube-api-access-2q8s5\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:27 crc kubenswrapper[5129]: I0314 09:11:27.054024 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c530f52-0988-468f-95d9-c45c3550a14c-scripts\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:27 crc kubenswrapper[5129]: I0314 09:11:27.057106 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c530f52-0988-468f-95d9-c45c3550a14c-log-httpd\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:27 crc kubenswrapper[5129]: I0314 
09:11:27.057120 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c530f52-0988-468f-95d9-c45c3550a14c-run-httpd\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:27 crc kubenswrapper[5129]: I0314 09:11:27.060345 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c530f52-0988-468f-95d9-c45c3550a14c-config-data\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:27 crc kubenswrapper[5129]: I0314 09:11:27.060348 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c530f52-0988-468f-95d9-c45c3550a14c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:27 crc kubenswrapper[5129]: I0314 09:11:27.060724 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c530f52-0988-468f-95d9-c45c3550a14c-scripts\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:27 crc kubenswrapper[5129]: I0314 09:11:27.061095 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c530f52-0988-468f-95d9-c45c3550a14c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:27 crc kubenswrapper[5129]: I0314 09:11:27.062282 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c530f52-0988-468f-95d9-c45c3550a14c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " 
pod="openstack/ceilometer-0" Mar 14 09:11:27 crc kubenswrapper[5129]: I0314 09:11:27.075450 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q8s5\" (UniqueName: \"kubernetes.io/projected/1c530f52-0988-468f-95d9-c45c3550a14c-kube-api-access-2q8s5\") pod \"ceilometer-0\" (UID: \"1c530f52-0988-468f-95d9-c45c3550a14c\") " pod="openstack/ceilometer-0" Mar 14 09:11:27 crc kubenswrapper[5129]: I0314 09:11:27.203410 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 09:11:27 crc kubenswrapper[5129]: I0314 09:11:27.792344 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 09:11:27 crc kubenswrapper[5129]: W0314 09:11:27.795259 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c530f52_0988_468f_95d9_c45c3550a14c.slice/crio-ec39fcb904ccf498290af32cdc0312ca40254e94bb5fcb5deef288514c2e38f8 WatchSource:0}: Error finding container ec39fcb904ccf498290af32cdc0312ca40254e94bb5fcb5deef288514c2e38f8: Status 404 returned error can't find the container with id ec39fcb904ccf498290af32cdc0312ca40254e94bb5fcb5deef288514c2e38f8 Mar 14 09:11:28 crc kubenswrapper[5129]: I0314 09:11:28.050552 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ac6df17-a845-4a30-8fc6-582fca777ab7" path="/var/lib/kubelet/pods/4ac6df17-a845-4a30-8fc6-582fca777ab7/volumes" Mar 14 09:11:28 crc kubenswrapper[5129]: I0314 09:11:28.051744 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933" path="/var/lib/kubelet/pods/bcfa2cc5-76f3-4cfb-bbf4-bb9b80049933/volumes" Mar 14 09:11:28 crc kubenswrapper[5129]: I0314 09:11:28.811537 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1c530f52-0988-468f-95d9-c45c3550a14c","Type":"ContainerStarted","Data":"41391f471f7de0c88a0d118914565692a74f546e5e29d05e045a021bcc7a123c"} Mar 14 09:11:28 crc kubenswrapper[5129]: I0314 09:11:28.812826 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c530f52-0988-468f-95d9-c45c3550a14c","Type":"ContainerStarted","Data":"b18cbf63628e7cbebc5ba0cd6325e4006b1d72b385889d9c330233bd5d5fa2cd"} Mar 14 09:11:28 crc kubenswrapper[5129]: I0314 09:11:28.812915 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c530f52-0988-468f-95d9-c45c3550a14c","Type":"ContainerStarted","Data":"ec39fcb904ccf498290af32cdc0312ca40254e94bb5fcb5deef288514c2e38f8"} Mar 14 09:11:29 crc kubenswrapper[5129]: I0314 09:11:29.037213 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:11:29 crc kubenswrapper[5129]: E0314 09:11:29.037646 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:11:29 crc kubenswrapper[5129]: I0314 09:11:29.824067 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c530f52-0988-468f-95d9-c45c3550a14c","Type":"ContainerStarted","Data":"364c6ac31a285771413348b207f7ca9cf56be34a00caec256aca483d9ed4409d"} Mar 14 09:11:30 crc kubenswrapper[5129]: I0314 09:11:30.056360 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-q2sjz"] Mar 14 09:11:30 crc kubenswrapper[5129]: I0314 09:11:30.056411 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-db-sync-q2sjz"] Mar 14 09:11:31 crc kubenswrapper[5129]: I0314 09:11:31.845550 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c530f52-0988-468f-95d9-c45c3550a14c","Type":"ContainerStarted","Data":"202c4d64ad3372bf003defc2de558fd642f6eb3949e0d22930e317d75258bbb5"} Mar 14 09:11:31 crc kubenswrapper[5129]: I0314 09:11:31.847401 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 09:11:31 crc kubenswrapper[5129]: I0314 09:11:31.895532 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.893943295 podStartE2EDuration="5.895505978s" podCreationTimestamp="2026-03-14 09:11:26 +0000 UTC" firstStartedPulling="2026-03-14 09:11:27.797899692 +0000 UTC m=+7950.549814876" lastFinishedPulling="2026-03-14 09:11:30.799462375 +0000 UTC m=+7953.551377559" observedRunningTime="2026-03-14 09:11:31.875994389 +0000 UTC m=+7954.627909573" watchObservedRunningTime="2026-03-14 09:11:31.895505978 +0000 UTC m=+7954.647421162" Mar 14 09:11:32 crc kubenswrapper[5129]: I0314 09:11:32.074055 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d7cfecd-212c-45e6-b678-5e673ff37968" path="/var/lib/kubelet/pods/2d7cfecd-212c-45e6-b678-5e673ff37968/volumes" Mar 14 09:11:32 crc kubenswrapper[5129]: I0314 09:11:32.616647 5129 scope.go:117] "RemoveContainer" containerID="b04c7ca870d112e206a7ab30467f1f8f7dc9350cc8b0b085ee81ae36772ae0fd" Mar 14 09:11:32 crc kubenswrapper[5129]: I0314 09:11:32.642627 5129 scope.go:117] "RemoveContainer" containerID="b7ec48e09f49897d15d62ac69c4f6b5157fde2f77f6ab2e3d6369c7f088652c2" Mar 14 09:11:32 crc kubenswrapper[5129]: I0314 09:11:32.713075 5129 scope.go:117] "RemoveContainer" containerID="414a4b03f77f957031a3dcc3fa489ae983a68da4a1c13fb14da0287016ae3aee" Mar 14 09:11:34 crc kubenswrapper[5129]: I0314 09:11:34.151781 5129 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.558656 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4"] Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.562250 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.570507 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.571102 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.583589 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.584571 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.587768 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944"] Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.594232 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.602446 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-q642x" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.602892 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.609389 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4"] Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.621188 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944"] Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.656217 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5aeecff-523e-415e-bb7c-121a3ca25973-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4\" (UID: \"e5aeecff-523e-415e-bb7c-121a3ca25973\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.656268 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e5aeecff-523e-415e-bb7c-121a3ca25973-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4\" (UID: \"e5aeecff-523e-415e-bb7c-121a3ca25973\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.656335 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdkqv\" (UniqueName: \"kubernetes.io/projected/8719eb25-616a-4a7d-9f70-ebf7f6216b59-kube-api-access-hdkqv\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nhf944\" (UID: \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.656392 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/8719eb25-616a-4a7d-9f70-ebf7f6216b59-ssh-key-openstack-networker\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nhf944\" (UID: \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.656415 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8719eb25-616a-4a7d-9f70-ebf7f6216b59-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nhf944\" (UID: \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.656459 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8719eb25-616a-4a7d-9f70-ebf7f6216b59-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nhf944\" (UID: \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.656493 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt8j5\" (UniqueName: \"kubernetes.io/projected/e5aeecff-523e-415e-bb7c-121a3ca25973-kube-api-access-tt8j5\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4\" (UID: \"e5aeecff-523e-415e-bb7c-121a3ca25973\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.656530 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5aeecff-523e-415e-bb7c-121a3ca25973-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4\" (UID: \"e5aeecff-523e-415e-bb7c-121a3ca25973\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.758630 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8719eb25-616a-4a7d-9f70-ebf7f6216b59-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nhf944\" (UID: \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.758694 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt8j5\" (UniqueName: \"kubernetes.io/projected/e5aeecff-523e-415e-bb7c-121a3ca25973-kube-api-access-tt8j5\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4\" (UID: \"e5aeecff-523e-415e-bb7c-121a3ca25973\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.758737 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e5aeecff-523e-415e-bb7c-121a3ca25973-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4\" (UID: \"e5aeecff-523e-415e-bb7c-121a3ca25973\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.758765 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5aeecff-523e-415e-bb7c-121a3ca25973-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4\" (UID: \"e5aeecff-523e-415e-bb7c-121a3ca25973\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.758790 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e5aeecff-523e-415e-bb7c-121a3ca25973-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4\" (UID: \"e5aeecff-523e-415e-bb7c-121a3ca25973\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.758843 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdkqv\" (UniqueName: \"kubernetes.io/projected/8719eb25-616a-4a7d-9f70-ebf7f6216b59-kube-api-access-hdkqv\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nhf944\" (UID: \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.758898 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: 
\"kubernetes.io/secret/8719eb25-616a-4a7d-9f70-ebf7f6216b59-ssh-key-openstack-networker\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nhf944\" (UID: \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.758925 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8719eb25-616a-4a7d-9f70-ebf7f6216b59-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nhf944\" (UID: \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.765367 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8719eb25-616a-4a7d-9f70-ebf7f6216b59-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nhf944\" (UID: \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.765518 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e5aeecff-523e-415e-bb7c-121a3ca25973-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4\" (UID: \"e5aeecff-523e-415e-bb7c-121a3ca25973\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.766917 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8719eb25-616a-4a7d-9f70-ebf7f6216b59-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nhf944\" (UID: \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.768426 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5aeecff-523e-415e-bb7c-121a3ca25973-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4\" (UID: \"e5aeecff-523e-415e-bb7c-121a3ca25973\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.768688 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5aeecff-523e-415e-bb7c-121a3ca25973-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4\" (UID: \"e5aeecff-523e-415e-bb7c-121a3ca25973\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.770943 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/8719eb25-616a-4a7d-9f70-ebf7f6216b59-ssh-key-openstack-networker\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nhf944\" (UID: \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.780807 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdkqv\" (UniqueName: \"kubernetes.io/projected/8719eb25-616a-4a7d-9f70-ebf7f6216b59-kube-api-access-hdkqv\") pod 
\"pre-adoption-validation-openstack-pre-adoption-openstack-nhf944\" (UID: \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.786105 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt8j5\" (UniqueName: \"kubernetes.io/projected/e5aeecff-523e-415e-bb7c-121a3ca25973-kube-api-access-tt8j5\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4\" (UID: \"e5aeecff-523e-415e-bb7c-121a3ca25973\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.903969 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" Mar 14 09:11:39 crc kubenswrapper[5129]: I0314 09:11:39.934763 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" Mar 14 09:11:40 crc kubenswrapper[5129]: I0314 09:11:40.702781 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944"] Mar 14 09:11:40 crc kubenswrapper[5129]: I0314 09:11:40.954881 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" event={"ID":"8719eb25-616a-4a7d-9f70-ebf7f6216b59","Type":"ContainerStarted","Data":"9316b05623705ac647701aa3f4792f10bf018cdc591628bc5811366689145897"} Mar 14 09:11:41 crc kubenswrapper[5129]: W0314 09:11:41.233745 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5aeecff_523e_415e_bb7c_121a3ca25973.slice/crio-3852a210f6532efa90161ae00a554881e7c71e8b928a24a62fb4c1c7835760cf WatchSource:0}: Error finding 
container 3852a210f6532efa90161ae00a554881e7c71e8b928a24a62fb4c1c7835760cf: Status 404 returned error can't find the container with id 3852a210f6532efa90161ae00a554881e7c71e8b928a24a62fb4c1c7835760cf Mar 14 09:11:41 crc kubenswrapper[5129]: I0314 09:11:41.238845 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4"] Mar 14 09:11:41 crc kubenswrapper[5129]: I0314 09:11:41.964277 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" event={"ID":"e5aeecff-523e-415e-bb7c-121a3ca25973","Type":"ContainerStarted","Data":"3852a210f6532efa90161ae00a554881e7c71e8b928a24a62fb4c1c7835760cf"} Mar 14 09:11:44 crc kubenswrapper[5129]: I0314 09:11:44.037519 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:11:44 crc kubenswrapper[5129]: E0314 09:11:44.038072 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:11:50 crc kubenswrapper[5129]: I0314 09:11:50.072774 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" event={"ID":"8719eb25-616a-4a7d-9f70-ebf7f6216b59","Type":"ContainerStarted","Data":"4a6faa2dabdfd238a02796c08ee2b5fec3d098e6d6f515465a43c8f278b67244"} Mar 14 09:11:50 crc kubenswrapper[5129]: I0314 09:11:50.080706 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" 
event={"ID":"e5aeecff-523e-415e-bb7c-121a3ca25973","Type":"ContainerStarted","Data":"6f828b86b8f14aeb4d7814d1778313bdfe50bb21912f61b69f6e5937da05f150"} Mar 14 09:11:50 crc kubenswrapper[5129]: I0314 09:11:50.099299 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" podStartSLOduration=2.663250912 podStartE2EDuration="11.099280389s" podCreationTimestamp="2026-03-14 09:11:39 +0000 UTC" firstStartedPulling="2026-03-14 09:11:40.713229352 +0000 UTC m=+7963.465144536" lastFinishedPulling="2026-03-14 09:11:49.149258819 +0000 UTC m=+7971.901174013" observedRunningTime="2026-03-14 09:11:50.095362093 +0000 UTC m=+7972.847277277" watchObservedRunningTime="2026-03-14 09:11:50.099280389 +0000 UTC m=+7972.851195573" Mar 14 09:11:50 crc kubenswrapper[5129]: I0314 09:11:50.118543 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" podStartSLOduration=3.154942965 podStartE2EDuration="11.118526292s" podCreationTimestamp="2026-03-14 09:11:39 +0000 UTC" firstStartedPulling="2026-03-14 09:11:41.236066959 +0000 UTC m=+7963.987982143" lastFinishedPulling="2026-03-14 09:11:49.199650286 +0000 UTC m=+7971.951565470" observedRunningTime="2026-03-14 09:11:50.112497498 +0000 UTC m=+7972.864412682" watchObservedRunningTime="2026-03-14 09:11:50.118526292 +0000 UTC m=+7972.870441476" Mar 14 09:11:55 crc kubenswrapper[5129]: I0314 09:11:55.036184 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:11:55 crc kubenswrapper[5129]: E0314 09:11:55.037002 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 09:11:57 crc kubenswrapper[5129]: I0314 09:11:57.215043 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 14 09:11:58 crc kubenswrapper[5129]: I0314 09:11:58.062653 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-6cbzr"]
Mar 14 09:11:58 crc kubenswrapper[5129]: I0314 09:11:58.063047 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6cbzr"]
Mar 14 09:11:59 crc kubenswrapper[5129]: I0314 09:11:59.029792 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0580-account-create-update-2gjwf"]
Mar 14 09:11:59 crc kubenswrapper[5129]: I0314 09:11:59.037857 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0580-account-create-update-2gjwf"]
Mar 14 09:11:59 crc kubenswrapper[5129]: I0314 09:11:59.170100 5129 generic.go:334] "Generic (PLEG): container finished" podID="8719eb25-616a-4a7d-9f70-ebf7f6216b59" containerID="4a6faa2dabdfd238a02796c08ee2b5fec3d098e6d6f515465a43c8f278b67244" exitCode=0
Mar 14 09:11:59 crc kubenswrapper[5129]: I0314 09:11:59.170162 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" event={"ID":"8719eb25-616a-4a7d-9f70-ebf7f6216b59","Type":"ContainerDied","Data":"4a6faa2dabdfd238a02796c08ee2b5fec3d098e6d6f515465a43c8f278b67244"}
Mar 14 09:11:59 crc kubenswrapper[5129]: E0314 09:11:59.886238 5129 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5aeecff_523e_415e_bb7c_121a3ca25973.slice/crio-6f828b86b8f14aeb4d7814d1778313bdfe50bb21912f61b69f6e5937da05f150.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5aeecff_523e_415e_bb7c_121a3ca25973.slice/crio-conmon-6f828b86b8f14aeb4d7814d1778313bdfe50bb21912f61b69f6e5937da05f150.scope\": RecentStats: unable to find data in memory cache]"
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.052122 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c4d5345-6c9c-4eb4-872b-8674628d408c" path="/var/lib/kubelet/pods/3c4d5345-6c9c-4eb4-872b-8674628d408c/volumes"
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.052886 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8" path="/var/lib/kubelet/pods/4e84f4a2-b97a-4cd0-b1fc-afae5966d3f8/volumes"
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.144492 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557992-k6wtl"]
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.146300 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557992-k6wtl"
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.148252 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.148636 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf"
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.148833 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.159260 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557992-k6wtl"]
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.179647 5129 generic.go:334] "Generic (PLEG): container finished" podID="e5aeecff-523e-415e-bb7c-121a3ca25973" containerID="6f828b86b8f14aeb4d7814d1778313bdfe50bb21912f61b69f6e5937da05f150" exitCode=0
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.179826 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" event={"ID":"e5aeecff-523e-415e-bb7c-121a3ca25973","Type":"ContainerDied","Data":"6f828b86b8f14aeb4d7814d1778313bdfe50bb21912f61b69f6e5937da05f150"}
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.284109 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjnwg\" (UniqueName: \"kubernetes.io/projected/845c1e77-2312-4b8b-90af-042e6863f825-kube-api-access-rjnwg\") pod \"auto-csr-approver-29557992-k6wtl\" (UID: \"845c1e77-2312-4b8b-90af-042e6863f825\") " pod="openshift-infra/auto-csr-approver-29557992-k6wtl"
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.387869 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjnwg\" (UniqueName: \"kubernetes.io/projected/845c1e77-2312-4b8b-90af-042e6863f825-kube-api-access-rjnwg\") pod \"auto-csr-approver-29557992-k6wtl\" (UID: \"845c1e77-2312-4b8b-90af-042e6863f825\") " pod="openshift-infra/auto-csr-approver-29557992-k6wtl"
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.409686 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjnwg\" (UniqueName: \"kubernetes.io/projected/845c1e77-2312-4b8b-90af-042e6863f825-kube-api-access-rjnwg\") pod \"auto-csr-approver-29557992-k6wtl\" (UID: \"845c1e77-2312-4b8b-90af-042e6863f825\") " pod="openshift-infra/auto-csr-approver-29557992-k6wtl"
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.469133 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557992-k6wtl"
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.604749 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944"
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.697167 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8719eb25-616a-4a7d-9f70-ebf7f6216b59-inventory\") pod \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\" (UID: \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\") "
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.697310 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8719eb25-616a-4a7d-9f70-ebf7f6216b59-pre-adoption-validation-combined-ca-bundle\") pod \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\" (UID: \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\") "
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.697339 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdkqv\" (UniqueName: \"kubernetes.io/projected/8719eb25-616a-4a7d-9f70-ebf7f6216b59-kube-api-access-hdkqv\") pod \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\" (UID: \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\") "
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.697408 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/8719eb25-616a-4a7d-9f70-ebf7f6216b59-ssh-key-openstack-networker\") pod \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\" (UID: \"8719eb25-616a-4a7d-9f70-ebf7f6216b59\") "
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.703563 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8719eb25-616a-4a7d-9f70-ebf7f6216b59-kube-api-access-hdkqv" (OuterVolumeSpecName: "kube-api-access-hdkqv") pod "8719eb25-616a-4a7d-9f70-ebf7f6216b59" (UID: "8719eb25-616a-4a7d-9f70-ebf7f6216b59"). InnerVolumeSpecName "kube-api-access-hdkqv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.710820 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8719eb25-616a-4a7d-9f70-ebf7f6216b59-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "8719eb25-616a-4a7d-9f70-ebf7f6216b59" (UID: "8719eb25-616a-4a7d-9f70-ebf7f6216b59"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.728945 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8719eb25-616a-4a7d-9f70-ebf7f6216b59-inventory" (OuterVolumeSpecName: "inventory") pod "8719eb25-616a-4a7d-9f70-ebf7f6216b59" (UID: "8719eb25-616a-4a7d-9f70-ebf7f6216b59"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.730116 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8719eb25-616a-4a7d-9f70-ebf7f6216b59-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "8719eb25-616a-4a7d-9f70-ebf7f6216b59" (UID: "8719eb25-616a-4a7d-9f70-ebf7f6216b59"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.800748 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/8719eb25-616a-4a7d-9f70-ebf7f6216b59-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\""
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.800789 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8719eb25-616a-4a7d-9f70-ebf7f6216b59-inventory\") on node \"crc\" DevicePath \"\""
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.800802 5129 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8719eb25-616a-4a7d-9f70-ebf7f6216b59-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.800814 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdkqv\" (UniqueName: \"kubernetes.io/projected/8719eb25-616a-4a7d-9f70-ebf7f6216b59-kube-api-access-hdkqv\") on node \"crc\" DevicePath \"\""
Mar 14 09:12:00 crc kubenswrapper[5129]: I0314 09:12:00.931861 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557992-k6wtl"]
Mar 14 09:12:00 crc kubenswrapper[5129]: W0314 09:12:00.937362 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod845c1e77_2312_4b8b_90af_042e6863f825.slice/crio-656d3e4c3a478f9a56e6cffe662edb45033166aa674e818ab90852d2ef9fcdbe WatchSource:0}: Error finding container 656d3e4c3a478f9a56e6cffe662edb45033166aa674e818ab90852d2ef9fcdbe: Status 404 returned error can't find the container with id 656d3e4c3a478f9a56e6cffe662edb45033166aa674e818ab90852d2ef9fcdbe
Mar 14 09:12:01 crc kubenswrapper[5129]: I0314 09:12:01.191284 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557992-k6wtl" event={"ID":"845c1e77-2312-4b8b-90af-042e6863f825","Type":"ContainerStarted","Data":"656d3e4c3a478f9a56e6cffe662edb45033166aa674e818ab90852d2ef9fcdbe"}
Mar 14 09:12:01 crc kubenswrapper[5129]: I0314 09:12:01.193508 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944"
Mar 14 09:12:01 crc kubenswrapper[5129]: I0314 09:12:01.193590 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nhf944" event={"ID":"8719eb25-616a-4a7d-9f70-ebf7f6216b59","Type":"ContainerDied","Data":"9316b05623705ac647701aa3f4792f10bf018cdc591628bc5811366689145897"}
Mar 14 09:12:01 crc kubenswrapper[5129]: I0314 09:12:01.193634 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9316b05623705ac647701aa3f4792f10bf018cdc591628bc5811366689145897"
Mar 14 09:12:01 crc kubenswrapper[5129]: I0314 09:12:01.689139 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4"
Mar 14 09:12:01 crc kubenswrapper[5129]: I0314 09:12:01.819987 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5aeecff-523e-415e-bb7c-121a3ca25973-pre-adoption-validation-combined-ca-bundle\") pod \"e5aeecff-523e-415e-bb7c-121a3ca25973\" (UID: \"e5aeecff-523e-415e-bb7c-121a3ca25973\") "
Mar 14 09:12:01 crc kubenswrapper[5129]: I0314 09:12:01.820210 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5aeecff-523e-415e-bb7c-121a3ca25973-inventory\") pod \"e5aeecff-523e-415e-bb7c-121a3ca25973\" (UID: \"e5aeecff-523e-415e-bb7c-121a3ca25973\") "
Mar 14 09:12:01 crc kubenswrapper[5129]: I0314 09:12:01.820277 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt8j5\" (UniqueName: \"kubernetes.io/projected/e5aeecff-523e-415e-bb7c-121a3ca25973-kube-api-access-tt8j5\") pod \"e5aeecff-523e-415e-bb7c-121a3ca25973\" (UID: \"e5aeecff-523e-415e-bb7c-121a3ca25973\") "
Mar 14 09:12:01 crc kubenswrapper[5129]: I0314 09:12:01.820329 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e5aeecff-523e-415e-bb7c-121a3ca25973-ssh-key-openstack-cell1\") pod \"e5aeecff-523e-415e-bb7c-121a3ca25973\" (UID: \"e5aeecff-523e-415e-bb7c-121a3ca25973\") "
Mar 14 09:12:01 crc kubenswrapper[5129]: I0314 09:12:01.827255 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5aeecff-523e-415e-bb7c-121a3ca25973-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "e5aeecff-523e-415e-bb7c-121a3ca25973" (UID: "e5aeecff-523e-415e-bb7c-121a3ca25973"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:12:01 crc kubenswrapper[5129]: I0314 09:12:01.829789 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5aeecff-523e-415e-bb7c-121a3ca25973-kube-api-access-tt8j5" (OuterVolumeSpecName: "kube-api-access-tt8j5") pod "e5aeecff-523e-415e-bb7c-121a3ca25973" (UID: "e5aeecff-523e-415e-bb7c-121a3ca25973"). InnerVolumeSpecName "kube-api-access-tt8j5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:12:01 crc kubenswrapper[5129]: I0314 09:12:01.848220 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5aeecff-523e-415e-bb7c-121a3ca25973-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "e5aeecff-523e-415e-bb7c-121a3ca25973" (UID: "e5aeecff-523e-415e-bb7c-121a3ca25973"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:12:01 crc kubenswrapper[5129]: I0314 09:12:01.866299 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5aeecff-523e-415e-bb7c-121a3ca25973-inventory" (OuterVolumeSpecName: "inventory") pod "e5aeecff-523e-415e-bb7c-121a3ca25973" (UID: "e5aeecff-523e-415e-bb7c-121a3ca25973"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:12:01 crc kubenswrapper[5129]: I0314 09:12:01.922403 5129 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5aeecff-523e-415e-bb7c-121a3ca25973-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 09:12:01 crc kubenswrapper[5129]: I0314 09:12:01.922712 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5aeecff-523e-415e-bb7c-121a3ca25973-inventory\") on node \"crc\" DevicePath \"\""
Mar 14 09:12:01 crc kubenswrapper[5129]: I0314 09:12:01.922723 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt8j5\" (UniqueName: \"kubernetes.io/projected/e5aeecff-523e-415e-bb7c-121a3ca25973-kube-api-access-tt8j5\") on node \"crc\" DevicePath \"\""
Mar 14 09:12:01 crc kubenswrapper[5129]: I0314 09:12:01.922736 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e5aeecff-523e-415e-bb7c-121a3ca25973-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Mar 14 09:12:02 crc kubenswrapper[5129]: I0314 09:12:02.204419 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557992-k6wtl" event={"ID":"845c1e77-2312-4b8b-90af-042e6863f825","Type":"ContainerStarted","Data":"b18276702ee21ce85fc7cb4290368c316a6446ce845adfcb26f2310fbac407e9"}
Mar 14 09:12:02 crc kubenswrapper[5129]: I0314 09:12:02.206939 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4" event={"ID":"e5aeecff-523e-415e-bb7c-121a3ca25973","Type":"ContainerDied","Data":"3852a210f6532efa90161ae00a554881e7c71e8b928a24a62fb4c1c7835760cf"}
Mar 14 09:12:02 crc kubenswrapper[5129]: I0314 09:12:02.206978 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3852a210f6532efa90161ae00a554881e7c71e8b928a24a62fb4c1c7835760cf"
Mar 14 09:12:02 crc kubenswrapper[5129]: I0314 09:12:02.207034 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4"
Mar 14 09:12:02 crc kubenswrapper[5129]: I0314 09:12:02.243287 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557992-k6wtl" podStartSLOduration=1.4274649689999999 podStartE2EDuration="2.243265749s" podCreationTimestamp="2026-03-14 09:12:00 +0000 UTC" firstStartedPulling="2026-03-14 09:12:00.94102052 +0000 UTC m=+7983.692935704" lastFinishedPulling="2026-03-14 09:12:01.7568213 +0000 UTC m=+7984.508736484" observedRunningTime="2026-03-14 09:12:02.218899759 +0000 UTC m=+7984.970814943" watchObservedRunningTime="2026-03-14 09:12:02.243265749 +0000 UTC m=+7984.995180933"
Mar 14 09:12:03 crc kubenswrapper[5129]: I0314 09:12:03.218210 5129 generic.go:334] "Generic (PLEG): container finished" podID="845c1e77-2312-4b8b-90af-042e6863f825" containerID="b18276702ee21ce85fc7cb4290368c316a6446ce845adfcb26f2310fbac407e9" exitCode=0
Mar 14 09:12:03 crc kubenswrapper[5129]: I0314 09:12:03.218264 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557992-k6wtl" event={"ID":"845c1e77-2312-4b8b-90af-042e6863f825","Type":"ContainerDied","Data":"b18276702ee21ce85fc7cb4290368c316a6446ce845adfcb26f2310fbac407e9"}
Mar 14 09:12:04 crc kubenswrapper[5129]: I0314 09:12:04.549000 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557992-k6wtl"
Mar 14 09:12:04 crc kubenswrapper[5129]: I0314 09:12:04.681554 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjnwg\" (UniqueName: \"kubernetes.io/projected/845c1e77-2312-4b8b-90af-042e6863f825-kube-api-access-rjnwg\") pod \"845c1e77-2312-4b8b-90af-042e6863f825\" (UID: \"845c1e77-2312-4b8b-90af-042e6863f825\") "
Mar 14 09:12:04 crc kubenswrapper[5129]: I0314 09:12:04.688765 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/845c1e77-2312-4b8b-90af-042e6863f825-kube-api-access-rjnwg" (OuterVolumeSpecName: "kube-api-access-rjnwg") pod "845c1e77-2312-4b8b-90af-042e6863f825" (UID: "845c1e77-2312-4b8b-90af-042e6863f825"). InnerVolumeSpecName "kube-api-access-rjnwg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:12:04 crc kubenswrapper[5129]: I0314 09:12:04.783806 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjnwg\" (UniqueName: \"kubernetes.io/projected/845c1e77-2312-4b8b-90af-042e6863f825-kube-api-access-rjnwg\") on node \"crc\" DevicePath \"\""
Mar 14 09:12:05 crc kubenswrapper[5129]: I0314 09:12:05.240377 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557992-k6wtl" event={"ID":"845c1e77-2312-4b8b-90af-042e6863f825","Type":"ContainerDied","Data":"656d3e4c3a478f9a56e6cffe662edb45033166aa674e818ab90852d2ef9fcdbe"}
Mar 14 09:12:05 crc kubenswrapper[5129]: I0314 09:12:05.240424 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="656d3e4c3a478f9a56e6cffe662edb45033166aa674e818ab90852d2ef9fcdbe"
Mar 14 09:12:05 crc kubenswrapper[5129]: I0314 09:12:05.240486 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557992-k6wtl"
Mar 14 09:12:05 crc kubenswrapper[5129]: I0314 09:12:05.288146 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557986-cmf8d"]
Mar 14 09:12:05 crc kubenswrapper[5129]: I0314 09:12:05.297429 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557986-cmf8d"]
Mar 14 09:12:06 crc kubenswrapper[5129]: I0314 09:12:06.037912 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e"
Mar 14 09:12:06 crc kubenswrapper[5129]: E0314 09:12:06.038543 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 09:12:06 crc kubenswrapper[5129]: I0314 09:12:06.052675 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b8dcd92-d201-4e88-8c81-8b3083bc1a76" path="/var/lib/kubelet/pods/8b8dcd92-d201-4e88-8c81-8b3083bc1a76/volumes"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.398393 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch"]
Mar 14 09:12:07 crc kubenswrapper[5129]: E0314 09:12:07.399756 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8719eb25-616a-4a7d-9f70-ebf7f6216b59" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-networ"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.399783 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8719eb25-616a-4a7d-9f70-ebf7f6216b59" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-networ"
Mar 14 09:12:07 crc kubenswrapper[5129]: E0314 09:12:07.399803 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="845c1e77-2312-4b8b-90af-042e6863f825" containerName="oc"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.399812 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="845c1e77-2312-4b8b-90af-042e6863f825" containerName="oc"
Mar 14 09:12:07 crc kubenswrapper[5129]: E0314 09:12:07.399854 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5aeecff-523e-415e-bb7c-121a3ca25973" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.399861 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5aeecff-523e-415e-bb7c-121a3ca25973" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.400102 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="845c1e77-2312-4b8b-90af-042e6863f825" containerName="oc"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.400124 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8719eb25-616a-4a7d-9f70-ebf7f6216b59" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-networ"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.400139 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5aeecff-523e-415e-bb7c-121a3ca25973" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.401105 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.403768 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.404530 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.404873 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.405261 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.421684 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch"]
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.434709 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q"]
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.436745 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.442198 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.442518 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-q642x"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.445625 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q"]
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.539281 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93cba590-03e6-448b-a8d0-d61dbe971a6f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q\" (UID: \"93cba590-03e6-448b-a8d0-d61dbe971a6f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.539469 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/78038caa-d464-4b63-a304-057a4393be51-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch\" (UID: \"78038caa-d464-4b63-a304-057a4393be51\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.539549 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93cba590-03e6-448b-a8d0-d61dbe971a6f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q\" (UID: \"93cba590-03e6-448b-a8d0-d61dbe971a6f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.539582 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78038caa-d464-4b63-a304-057a4393be51-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch\" (UID: \"78038caa-d464-4b63-a304-057a4393be51\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.539649 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78038caa-d464-4b63-a304-057a4393be51-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch\" (UID: \"78038caa-d464-4b63-a304-057a4393be51\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.539674 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/93cba590-03e6-448b-a8d0-d61dbe971a6f-ssh-key-openstack-networker\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q\" (UID: \"93cba590-03e6-448b-a8d0-d61dbe971a6f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.539725 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w85fl\" (UniqueName: \"kubernetes.io/projected/78038caa-d464-4b63-a304-057a4393be51-kube-api-access-w85fl\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch\" (UID: \"78038caa-d464-4b63-a304-057a4393be51\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.539795 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgzrv\" (UniqueName: \"kubernetes.io/projected/93cba590-03e6-448b-a8d0-d61dbe971a6f-kube-api-access-jgzrv\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q\" (UID: \"93cba590-03e6-448b-a8d0-d61dbe971a6f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.641855 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/78038caa-d464-4b63-a304-057a4393be51-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch\" (UID: \"78038caa-d464-4b63-a304-057a4393be51\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.641960 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93cba590-03e6-448b-a8d0-d61dbe971a6f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q\" (UID: \"93cba590-03e6-448b-a8d0-d61dbe971a6f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.642002 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78038caa-d464-4b63-a304-057a4393be51-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch\" (UID: \"78038caa-d464-4b63-a304-057a4393be51\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.642037 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78038caa-d464-4b63-a304-057a4393be51-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch\" (UID: \"78038caa-d464-4b63-a304-057a4393be51\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.642071 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/93cba590-03e6-448b-a8d0-d61dbe971a6f-ssh-key-openstack-networker\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q\" (UID: \"93cba590-03e6-448b-a8d0-d61dbe971a6f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.642113 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w85fl\" (UniqueName: \"kubernetes.io/projected/78038caa-d464-4b63-a304-057a4393be51-kube-api-access-w85fl\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch\" (UID: \"78038caa-d464-4b63-a304-057a4393be51\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.642174 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgzrv\" (UniqueName: \"kubernetes.io/projected/93cba590-03e6-448b-a8d0-d61dbe971a6f-kube-api-access-jgzrv\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q\" (UID: \"93cba590-03e6-448b-a8d0-d61dbe971a6f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.642239 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93cba590-03e6-448b-a8d0-d61dbe971a6f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q\" (UID: \"93cba590-03e6-448b-a8d0-d61dbe971a6f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.650037 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/93cba590-03e6-448b-a8d0-d61dbe971a6f-ssh-key-openstack-networker\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q\" (UID: \"93cba590-03e6-448b-a8d0-d61dbe971a6f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.650068 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93cba590-03e6-448b-a8d0-d61dbe971a6f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q\" (UID: \"93cba590-03e6-448b-a8d0-d61dbe971a6f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.651368 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78038caa-d464-4b63-a304-057a4393be51-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch\" (UID: \"78038caa-d464-4b63-a304-057a4393be51\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.652264 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/78038caa-d464-4b63-a304-057a4393be51-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch\" (UID: \"78038caa-d464-4b63-a304-057a4393be51\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.657525 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78038caa-d464-4b63-a304-057a4393be51-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch\" (UID: \"78038caa-d464-4b63-a304-057a4393be51\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.661278 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w85fl\" (UniqueName: \"kubernetes.io/projected/78038caa-d464-4b63-a304-057a4393be51-kube-api-access-w85fl\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch\" (UID: \"78038caa-d464-4b63-a304-057a4393be51\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.661929 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgzrv\" (UniqueName: \"kubernetes.io/projected/93cba590-03e6-448b-a8d0-d61dbe971a6f-kube-api-access-jgzrv\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q\" (UID: \"93cba590-03e6-448b-a8d0-d61dbe971a6f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.661950 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93cba590-03e6-448b-a8d0-d61dbe971a6f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q\" (UID: \"93cba590-03e6-448b-a8d0-d61dbe971a6f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.731170 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch"
Mar 14 09:12:07 crc kubenswrapper[5129]: I0314 09:12:07.765979 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q"
Mar 14 09:12:08 crc kubenswrapper[5129]: I0314 09:12:08.054632 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-fdncd"]
Mar 14 09:12:08 crc kubenswrapper[5129]: I0314 09:12:08.062498 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-fdncd"]
Mar 14 09:12:08 crc kubenswrapper[5129]: I0314 09:12:08.348340 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch"]
Mar 14 09:12:08 crc kubenswrapper[5129]: W0314 09:12:08.436303 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93cba590_03e6_448b_a8d0_d61dbe971a6f.slice/crio-3ff450860f2eaab624e501db2f7a6286a64af35e8682211b74afab572c0be37e WatchSource:0}: Error finding container 3ff450860f2eaab624e501db2f7a6286a64af35e8682211b74afab572c0be37e: Status 404 returned error can't find the container with id 3ff450860f2eaab624e501db2f7a6286a64af35e8682211b74afab572c0be37e
Mar 14 09:12:08 crc kubenswrapper[5129]: I0314 09:12:08.437257 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q"]
Mar 14 09:12:09 crc kubenswrapper[5129]: I0314 09:12:09.281512 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q" event={"ID":"93cba590-03e6-448b-a8d0-d61dbe971a6f","Type":"ContainerStarted","Data":"1a6f3e906fc7089f90faafda269c88033aae2b12b2e32e32112bb18dcb43d3c7"}
Mar 14 09:12:09 crc kubenswrapper[5129]: I0314 09:12:09.281885 5129 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q" event={"ID":"93cba590-03e6-448b-a8d0-d61dbe971a6f","Type":"ContainerStarted","Data":"3ff450860f2eaab624e501db2f7a6286a64af35e8682211b74afab572c0be37e"} Mar 14 09:12:09 crc kubenswrapper[5129]: I0314 09:12:09.285050 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch" event={"ID":"78038caa-d464-4b63-a304-057a4393be51","Type":"ContainerStarted","Data":"649146b588b7dff41f016204c26bdc4ed1ce72acd9c650f86abd28cd761f77dd"} Mar 14 09:12:09 crc kubenswrapper[5129]: I0314 09:12:09.285101 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch" event={"ID":"78038caa-d464-4b63-a304-057a4393be51","Type":"ContainerStarted","Data":"767b01c54b582fd38b9f76edd7e57853ad2d361d3073dc4a3847ed07f96e1287"} Mar 14 09:12:09 crc kubenswrapper[5129]: I0314 09:12:09.313315 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q" podStartSLOduration=1.904270648 podStartE2EDuration="2.313296168s" podCreationTimestamp="2026-03-14 09:12:07 +0000 UTC" firstStartedPulling="2026-03-14 09:12:08.439347132 +0000 UTC m=+7991.191262326" lastFinishedPulling="2026-03-14 09:12:08.848372652 +0000 UTC m=+7991.600287846" observedRunningTime="2026-03-14 09:12:09.30896366 +0000 UTC m=+7992.060878854" watchObservedRunningTime="2026-03-14 09:12:09.313296168 +0000 UTC m=+7992.065211352" Mar 14 09:12:09 crc kubenswrapper[5129]: I0314 09:12:09.338126 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch" podStartSLOduration=1.8652622700000001 podStartE2EDuration="2.338107092s" podCreationTimestamp="2026-03-14 09:12:07 +0000 UTC" firstStartedPulling="2026-03-14 09:12:08.356445882 +0000 UTC 
m=+7991.108361066" lastFinishedPulling="2026-03-14 09:12:08.829290694 +0000 UTC m=+7991.581205888" observedRunningTime="2026-03-14 09:12:09.329750105 +0000 UTC m=+7992.081665299" watchObservedRunningTime="2026-03-14 09:12:09.338107092 +0000 UTC m=+7992.090022276" Mar 14 09:12:10 crc kubenswrapper[5129]: I0314 09:12:10.056878 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f5125ed-86c0-489c-892f-03f76a8ecc42" path="/var/lib/kubelet/pods/3f5125ed-86c0-489c-892f-03f76a8ecc42/volumes" Mar 14 09:12:18 crc kubenswrapper[5129]: I0314 09:12:18.043917 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:12:18 crc kubenswrapper[5129]: E0314 09:12:18.044709 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:12:31 crc kubenswrapper[5129]: I0314 09:12:31.036791 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:12:31 crc kubenswrapper[5129]: E0314 09:12:31.037679 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:12:32 crc kubenswrapper[5129]: I0314 09:12:32.928191 5129 scope.go:117] "RemoveContainer" 
containerID="bd48ded1d58df0ee5ea93d33c74a1fb6e912acc35b6a610adedfd18e7584ed0f" Mar 14 09:12:32 crc kubenswrapper[5129]: I0314 09:12:32.962120 5129 scope.go:117] "RemoveContainer" containerID="423a028bb5506319a7f6682b3693842a761b8815e832c155f3e31dbb92185903" Mar 14 09:12:33 crc kubenswrapper[5129]: I0314 09:12:33.016180 5129 scope.go:117] "RemoveContainer" containerID="28730beda4fa722818cea34bf81d00d50fed21bce4d72acf14170a6574183945" Mar 14 09:12:33 crc kubenswrapper[5129]: I0314 09:12:33.057671 5129 scope.go:117] "RemoveContainer" containerID="e11e6a139383e68a6e3b326589457eb23662847585bee3324637679fe054d7e5" Mar 14 09:12:42 crc kubenswrapper[5129]: I0314 09:12:42.036845 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:12:42 crc kubenswrapper[5129]: E0314 09:12:42.038250 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:12:53 crc kubenswrapper[5129]: I0314 09:12:53.037993 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:12:53 crc kubenswrapper[5129]: E0314 09:12:53.039921 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:13:04 crc 
kubenswrapper[5129]: I0314 09:13:04.036961 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:13:04 crc kubenswrapper[5129]: E0314 09:13:04.038251 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:13:08 crc kubenswrapper[5129]: I0314 09:13:08.926945 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-prgt4"] Mar 14 09:13:08 crc kubenswrapper[5129]: I0314 09:13:08.929784 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-prgt4" Mar 14 09:13:08 crc kubenswrapper[5129]: I0314 09:13:08.942910 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-prgt4"] Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.045733 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03fc8fc4-a59d-4f1d-9187-d4fd05999c10-utilities\") pod \"redhat-marketplace-prgt4\" (UID: \"03fc8fc4-a59d-4f1d-9187-d4fd05999c10\") " pod="openshift-marketplace/redhat-marketplace-prgt4" Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.045941 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03fc8fc4-a59d-4f1d-9187-d4fd05999c10-catalog-content\") pod \"redhat-marketplace-prgt4\" (UID: \"03fc8fc4-a59d-4f1d-9187-d4fd05999c10\") " 
pod="openshift-marketplace/redhat-marketplace-prgt4" Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.045987 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82jmj\" (UniqueName: \"kubernetes.io/projected/03fc8fc4-a59d-4f1d-9187-d4fd05999c10-kube-api-access-82jmj\") pod \"redhat-marketplace-prgt4\" (UID: \"03fc8fc4-a59d-4f1d-9187-d4fd05999c10\") " pod="openshift-marketplace/redhat-marketplace-prgt4" Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.057655 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-cgdm4"] Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.075162 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-tls68"] Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.085754 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wq6v2"] Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.098835 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d443-account-create-update-25726"] Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.110661 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-d443-account-create-update-25726"] Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.122992 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-cgdm4"] Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.138784 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-tls68"] Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.149969 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03fc8fc4-a59d-4f1d-9187-d4fd05999c10-catalog-content\") pod \"redhat-marketplace-prgt4\" (UID: \"03fc8fc4-a59d-4f1d-9187-d4fd05999c10\") " 
pod="openshift-marketplace/redhat-marketplace-prgt4" Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.149221 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03fc8fc4-a59d-4f1d-9187-d4fd05999c10-catalog-content\") pod \"redhat-marketplace-prgt4\" (UID: \"03fc8fc4-a59d-4f1d-9187-d4fd05999c10\") " pod="openshift-marketplace/redhat-marketplace-prgt4" Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.150553 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82jmj\" (UniqueName: \"kubernetes.io/projected/03fc8fc4-a59d-4f1d-9187-d4fd05999c10-kube-api-access-82jmj\") pod \"redhat-marketplace-prgt4\" (UID: \"03fc8fc4-a59d-4f1d-9187-d4fd05999c10\") " pod="openshift-marketplace/redhat-marketplace-prgt4" Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.151706 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03fc8fc4-a59d-4f1d-9187-d4fd05999c10-utilities\") pod \"redhat-marketplace-prgt4\" (UID: \"03fc8fc4-a59d-4f1d-9187-d4fd05999c10\") " pod="openshift-marketplace/redhat-marketplace-prgt4" Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.151772 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wq6v2"] Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.152733 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03fc8fc4-a59d-4f1d-9187-d4fd05999c10-utilities\") pod \"redhat-marketplace-prgt4\" (UID: \"03fc8fc4-a59d-4f1d-9187-d4fd05999c10\") " pod="openshift-marketplace/redhat-marketplace-prgt4" Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.175752 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82jmj\" (UniqueName: 
\"kubernetes.io/projected/03fc8fc4-a59d-4f1d-9187-d4fd05999c10-kube-api-access-82jmj\") pod \"redhat-marketplace-prgt4\" (UID: \"03fc8fc4-a59d-4f1d-9187-d4fd05999c10\") " pod="openshift-marketplace/redhat-marketplace-prgt4" Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.249217 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-prgt4" Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.575727 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-prgt4"] Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.881698 5129 generic.go:334] "Generic (PLEG): container finished" podID="03fc8fc4-a59d-4f1d-9187-d4fd05999c10" containerID="b86b9186f4ed9b1072d89e5a6f64d127445fb26adfe3f1b31e36bdd7acf7949f" exitCode=0 Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.881796 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prgt4" event={"ID":"03fc8fc4-a59d-4f1d-9187-d4fd05999c10","Type":"ContainerDied","Data":"b86b9186f4ed9b1072d89e5a6f64d127445fb26adfe3f1b31e36bdd7acf7949f"} Mar 14 09:13:09 crc kubenswrapper[5129]: I0314 09:13:09.882018 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prgt4" event={"ID":"03fc8fc4-a59d-4f1d-9187-d4fd05999c10","Type":"ContainerStarted","Data":"3bacf7688d5fa453617f4283d05f6c6361577bb03cb357ab670d412c0429514d"} Mar 14 09:13:10 crc kubenswrapper[5129]: I0314 09:13:10.032492 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2020-account-create-update-lxdjn"] Mar 14 09:13:10 crc kubenswrapper[5129]: I0314 09:13:10.049741 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c95390a-bb53-4827-8c22-07ebbd28ab75" path="/var/lib/kubelet/pods/1c95390a-bb53-4827-8c22-07ebbd28ab75/volumes" Mar 14 09:13:10 crc kubenswrapper[5129]: I0314 09:13:10.050550 5129 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="298699ac-5f93-42c8-aecb-d98ef33e5d0c" path="/var/lib/kubelet/pods/298699ac-5f93-42c8-aecb-d98ef33e5d0c/volumes" Mar 14 09:13:10 crc kubenswrapper[5129]: I0314 09:13:10.051281 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b843f393-c69e-418c-9fde-9c694dba8294" path="/var/lib/kubelet/pods/b843f393-c69e-418c-9fde-9c694dba8294/volumes" Mar 14 09:13:10 crc kubenswrapper[5129]: I0314 09:13:10.052005 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c114c0ae-fd24-4e0e-86d2-0586efa897cb" path="/var/lib/kubelet/pods/c114c0ae-fd24-4e0e-86d2-0586efa897cb/volumes" Mar 14 09:13:10 crc kubenswrapper[5129]: I0314 09:13:10.053329 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-adc2-account-create-update-pcn6r"] Mar 14 09:13:10 crc kubenswrapper[5129]: I0314 09:13:10.053368 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-2020-account-create-update-lxdjn"] Mar 14 09:13:10 crc kubenswrapper[5129]: I0314 09:13:10.065065 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-adc2-account-create-update-pcn6r"] Mar 14 09:13:10 crc kubenswrapper[5129]: I0314 09:13:10.892570 5129 generic.go:334] "Generic (PLEG): container finished" podID="03fc8fc4-a59d-4f1d-9187-d4fd05999c10" containerID="eef8c85c63e23a200ea09082955817a0405573afbbe60591052809789210aa4a" exitCode=0 Mar 14 09:13:10 crc kubenswrapper[5129]: I0314 09:13:10.892686 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prgt4" event={"ID":"03fc8fc4-a59d-4f1d-9187-d4fd05999c10","Type":"ContainerDied","Data":"eef8c85c63e23a200ea09082955817a0405573afbbe60591052809789210aa4a"} Mar 14 09:13:11 crc kubenswrapper[5129]: I0314 09:13:11.905242 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prgt4" 
event={"ID":"03fc8fc4-a59d-4f1d-9187-d4fd05999c10","Type":"ContainerStarted","Data":"0d6b455e92bbce5ae92593f0d6cd4968ab240187ba1fee6c5687864e3fdc9cdd"} Mar 14 09:13:11 crc kubenswrapper[5129]: I0314 09:13:11.927660 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-prgt4" podStartSLOduration=2.451965427 podStartE2EDuration="3.927639081s" podCreationTimestamp="2026-03-14 09:13:08 +0000 UTC" firstStartedPulling="2026-03-14 09:13:09.883545292 +0000 UTC m=+8052.635460476" lastFinishedPulling="2026-03-14 09:13:11.359218956 +0000 UTC m=+8054.111134130" observedRunningTime="2026-03-14 09:13:11.924484366 +0000 UTC m=+8054.676399550" watchObservedRunningTime="2026-03-14 09:13:11.927639081 +0000 UTC m=+8054.679554265" Mar 14 09:13:12 crc kubenswrapper[5129]: I0314 09:13:12.048757 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aab6a3ac-9b76-4c9d-8ec6-a132f4d19697" path="/var/lib/kubelet/pods/aab6a3ac-9b76-4c9d-8ec6-a132f4d19697/volumes" Mar 14 09:13:12 crc kubenswrapper[5129]: I0314 09:13:12.049368 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeeeb0dd-39ba-4a10-a223-6c0d1079c766" path="/var/lib/kubelet/pods/eeeeb0dd-39ba-4a10-a223-6c0d1079c766/volumes" Mar 14 09:13:19 crc kubenswrapper[5129]: I0314 09:13:19.037531 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:13:19 crc kubenswrapper[5129]: E0314 09:13:19.038960 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:13:19 crc kubenswrapper[5129]: 
I0314 09:13:19.250179 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-prgt4" Mar 14 09:13:19 crc kubenswrapper[5129]: I0314 09:13:19.250745 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-prgt4" Mar 14 09:13:19 crc kubenswrapper[5129]: I0314 09:13:19.302370 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-prgt4" Mar 14 09:13:20 crc kubenswrapper[5129]: I0314 09:13:20.069412 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-prgt4" Mar 14 09:13:20 crc kubenswrapper[5129]: I0314 09:13:20.123302 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-prgt4"] Mar 14 09:13:22 crc kubenswrapper[5129]: I0314 09:13:22.032877 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-prgt4" podUID="03fc8fc4-a59d-4f1d-9187-d4fd05999c10" containerName="registry-server" containerID="cri-o://0d6b455e92bbce5ae92593f0d6cd4968ab240187ba1fee6c5687864e3fdc9cdd" gracePeriod=2 Mar 14 09:13:22 crc kubenswrapper[5129]: I0314 09:13:22.500104 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-prgt4" Mar 14 09:13:22 crc kubenswrapper[5129]: I0314 09:13:22.581995 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03fc8fc4-a59d-4f1d-9187-d4fd05999c10-utilities\") pod \"03fc8fc4-a59d-4f1d-9187-d4fd05999c10\" (UID: \"03fc8fc4-a59d-4f1d-9187-d4fd05999c10\") " Mar 14 09:13:22 crc kubenswrapper[5129]: I0314 09:13:22.582636 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82jmj\" (UniqueName: \"kubernetes.io/projected/03fc8fc4-a59d-4f1d-9187-d4fd05999c10-kube-api-access-82jmj\") pod \"03fc8fc4-a59d-4f1d-9187-d4fd05999c10\" (UID: \"03fc8fc4-a59d-4f1d-9187-d4fd05999c10\") " Mar 14 09:13:22 crc kubenswrapper[5129]: I0314 09:13:22.582690 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03fc8fc4-a59d-4f1d-9187-d4fd05999c10-catalog-content\") pod \"03fc8fc4-a59d-4f1d-9187-d4fd05999c10\" (UID: \"03fc8fc4-a59d-4f1d-9187-d4fd05999c10\") " Mar 14 09:13:22 crc kubenswrapper[5129]: I0314 09:13:22.582955 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03fc8fc4-a59d-4f1d-9187-d4fd05999c10-utilities" (OuterVolumeSpecName: "utilities") pod "03fc8fc4-a59d-4f1d-9187-d4fd05999c10" (UID: "03fc8fc4-a59d-4f1d-9187-d4fd05999c10"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:13:22 crc kubenswrapper[5129]: I0314 09:13:22.583842 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03fc8fc4-a59d-4f1d-9187-d4fd05999c10-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:22 crc kubenswrapper[5129]: I0314 09:13:22.598991 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03fc8fc4-a59d-4f1d-9187-d4fd05999c10-kube-api-access-82jmj" (OuterVolumeSpecName: "kube-api-access-82jmj") pod "03fc8fc4-a59d-4f1d-9187-d4fd05999c10" (UID: "03fc8fc4-a59d-4f1d-9187-d4fd05999c10"). InnerVolumeSpecName "kube-api-access-82jmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:13:22 crc kubenswrapper[5129]: I0314 09:13:22.612313 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03fc8fc4-a59d-4f1d-9187-d4fd05999c10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03fc8fc4-a59d-4f1d-9187-d4fd05999c10" (UID: "03fc8fc4-a59d-4f1d-9187-d4fd05999c10"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:13:22 crc kubenswrapper[5129]: I0314 09:13:22.685429 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82jmj\" (UniqueName: \"kubernetes.io/projected/03fc8fc4-a59d-4f1d-9187-d4fd05999c10-kube-api-access-82jmj\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:22 crc kubenswrapper[5129]: I0314 09:13:22.685479 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03fc8fc4-a59d-4f1d-9187-d4fd05999c10-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:23 crc kubenswrapper[5129]: I0314 09:13:23.045969 5129 generic.go:334] "Generic (PLEG): container finished" podID="03fc8fc4-a59d-4f1d-9187-d4fd05999c10" containerID="0d6b455e92bbce5ae92593f0d6cd4968ab240187ba1fee6c5687864e3fdc9cdd" exitCode=0 Mar 14 09:13:23 crc kubenswrapper[5129]: I0314 09:13:23.046039 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prgt4" event={"ID":"03fc8fc4-a59d-4f1d-9187-d4fd05999c10","Type":"ContainerDied","Data":"0d6b455e92bbce5ae92593f0d6cd4968ab240187ba1fee6c5687864e3fdc9cdd"} Mar 14 09:13:23 crc kubenswrapper[5129]: I0314 09:13:23.046095 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prgt4" event={"ID":"03fc8fc4-a59d-4f1d-9187-d4fd05999c10","Type":"ContainerDied","Data":"3bacf7688d5fa453617f4283d05f6c6361577bb03cb357ab670d412c0429514d"} Mar 14 09:13:23 crc kubenswrapper[5129]: I0314 09:13:23.046111 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-prgt4" Mar 14 09:13:23 crc kubenswrapper[5129]: I0314 09:13:23.046128 5129 scope.go:117] "RemoveContainer" containerID="0d6b455e92bbce5ae92593f0d6cd4968ab240187ba1fee6c5687864e3fdc9cdd" Mar 14 09:13:23 crc kubenswrapper[5129]: I0314 09:13:23.105675 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-prgt4"] Mar 14 09:13:23 crc kubenswrapper[5129]: I0314 09:13:23.108901 5129 scope.go:117] "RemoveContainer" containerID="eef8c85c63e23a200ea09082955817a0405573afbbe60591052809789210aa4a" Mar 14 09:13:23 crc kubenswrapper[5129]: I0314 09:13:23.120876 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-prgt4"] Mar 14 09:13:23 crc kubenswrapper[5129]: I0314 09:13:23.139945 5129 scope.go:117] "RemoveContainer" containerID="b86b9186f4ed9b1072d89e5a6f64d127445fb26adfe3f1b31e36bdd7acf7949f" Mar 14 09:13:23 crc kubenswrapper[5129]: I0314 09:13:23.205417 5129 scope.go:117] "RemoveContainer" containerID="0d6b455e92bbce5ae92593f0d6cd4968ab240187ba1fee6c5687864e3fdc9cdd" Mar 14 09:13:23 crc kubenswrapper[5129]: E0314 09:13:23.206104 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d6b455e92bbce5ae92593f0d6cd4968ab240187ba1fee6c5687864e3fdc9cdd\": container with ID starting with 0d6b455e92bbce5ae92593f0d6cd4968ab240187ba1fee6c5687864e3fdc9cdd not found: ID does not exist" containerID="0d6b455e92bbce5ae92593f0d6cd4968ab240187ba1fee6c5687864e3fdc9cdd" Mar 14 09:13:23 crc kubenswrapper[5129]: I0314 09:13:23.206159 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d6b455e92bbce5ae92593f0d6cd4968ab240187ba1fee6c5687864e3fdc9cdd"} err="failed to get container status \"0d6b455e92bbce5ae92593f0d6cd4968ab240187ba1fee6c5687864e3fdc9cdd\": rpc error: code = NotFound desc = could not find container 
\"0d6b455e92bbce5ae92593f0d6cd4968ab240187ba1fee6c5687864e3fdc9cdd\": container with ID starting with 0d6b455e92bbce5ae92593f0d6cd4968ab240187ba1fee6c5687864e3fdc9cdd not found: ID does not exist" Mar 14 09:13:23 crc kubenswrapper[5129]: I0314 09:13:23.206190 5129 scope.go:117] "RemoveContainer" containerID="eef8c85c63e23a200ea09082955817a0405573afbbe60591052809789210aa4a" Mar 14 09:13:23 crc kubenswrapper[5129]: E0314 09:13:23.206737 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eef8c85c63e23a200ea09082955817a0405573afbbe60591052809789210aa4a\": container with ID starting with eef8c85c63e23a200ea09082955817a0405573afbbe60591052809789210aa4a not found: ID does not exist" containerID="eef8c85c63e23a200ea09082955817a0405573afbbe60591052809789210aa4a" Mar 14 09:13:23 crc kubenswrapper[5129]: I0314 09:13:23.206786 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef8c85c63e23a200ea09082955817a0405573afbbe60591052809789210aa4a"} err="failed to get container status \"eef8c85c63e23a200ea09082955817a0405573afbbe60591052809789210aa4a\": rpc error: code = NotFound desc = could not find container \"eef8c85c63e23a200ea09082955817a0405573afbbe60591052809789210aa4a\": container with ID starting with eef8c85c63e23a200ea09082955817a0405573afbbe60591052809789210aa4a not found: ID does not exist" Mar 14 09:13:23 crc kubenswrapper[5129]: I0314 09:13:23.206817 5129 scope.go:117] "RemoveContainer" containerID="b86b9186f4ed9b1072d89e5a6f64d127445fb26adfe3f1b31e36bdd7acf7949f" Mar 14 09:13:23 crc kubenswrapper[5129]: E0314 09:13:23.207136 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b86b9186f4ed9b1072d89e5a6f64d127445fb26adfe3f1b31e36bdd7acf7949f\": container with ID starting with b86b9186f4ed9b1072d89e5a6f64d127445fb26adfe3f1b31e36bdd7acf7949f not found: ID does not exist" 
containerID="b86b9186f4ed9b1072d89e5a6f64d127445fb26adfe3f1b31e36bdd7acf7949f" Mar 14 09:13:23 crc kubenswrapper[5129]: I0314 09:13:23.207186 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b86b9186f4ed9b1072d89e5a6f64d127445fb26adfe3f1b31e36bdd7acf7949f"} err="failed to get container status \"b86b9186f4ed9b1072d89e5a6f64d127445fb26adfe3f1b31e36bdd7acf7949f\": rpc error: code = NotFound desc = could not find container \"b86b9186f4ed9b1072d89e5a6f64d127445fb26adfe3f1b31e36bdd7acf7949f\": container with ID starting with b86b9186f4ed9b1072d89e5a6f64d127445fb26adfe3f1b31e36bdd7acf7949f not found: ID does not exist" Mar 14 09:13:24 crc kubenswrapper[5129]: I0314 09:13:24.059254 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03fc8fc4-a59d-4f1d-9187-d4fd05999c10" path="/var/lib/kubelet/pods/03fc8fc4-a59d-4f1d-9187-d4fd05999c10/volumes" Mar 14 09:13:31 crc kubenswrapper[5129]: I0314 09:13:31.032933 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x2lss"] Mar 14 09:13:31 crc kubenswrapper[5129]: I0314 09:13:31.048574 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x2lss"] Mar 14 09:13:32 crc kubenswrapper[5129]: I0314 09:13:32.052275 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b" path="/var/lib/kubelet/pods/5c8f48bc-be9e-4c7f-93a2-1ed9fabb641b/volumes" Mar 14 09:13:33 crc kubenswrapper[5129]: I0314 09:13:33.036820 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:13:33 crc kubenswrapper[5129]: E0314 09:13:33.037137 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:13:33 crc kubenswrapper[5129]: I0314 09:13:33.218318 5129 scope.go:117] "RemoveContainer" containerID="50deee67b4488f6a60d6560ebf294e6cbfb76833bb2c4e7a2626445b081cde39" Mar 14 09:13:33 crc kubenswrapper[5129]: I0314 09:13:33.249975 5129 scope.go:117] "RemoveContainer" containerID="e7eea9eaa7f6b76f08bded4a614e18255db83f1f1acb2f4a436c98a06f0c917f" Mar 14 09:13:33 crc kubenswrapper[5129]: I0314 09:13:33.313142 5129 scope.go:117] "RemoveContainer" containerID="bc9ce5c0f454c94ae8dbd2d23c4efa43351db43bed5bbcc474aa72f1d9bce281" Mar 14 09:13:33 crc kubenswrapper[5129]: I0314 09:13:33.371277 5129 scope.go:117] "RemoveContainer" containerID="6b84d60870179db615b3da66187291eb69748aa57bf32c6c0c0ecb423866f1b6" Mar 14 09:13:33 crc kubenswrapper[5129]: I0314 09:13:33.439040 5129 scope.go:117] "RemoveContainer" containerID="c8804c5e26971b8475808ad1b883f48f0d56e512ebade61e2da21953ffa44c28" Mar 14 09:13:33 crc kubenswrapper[5129]: I0314 09:13:33.476843 5129 scope.go:117] "RemoveContainer" containerID="da19bf0fcb2a790ee491ccd4ea17e5a675924cf478b0af80ac7159eea0c5989e" Mar 14 09:13:33 crc kubenswrapper[5129]: I0314 09:13:33.547804 5129 scope.go:117] "RemoveContainer" containerID="e3ae1c978c21c7c3c941a7bd0b3c16150c402fc62f311c3e5c8b036c013135b2" Mar 14 09:13:45 crc kubenswrapper[5129]: I0314 09:13:45.664124 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l7tl9"] Mar 14 09:13:45 crc kubenswrapper[5129]: E0314 09:13:45.665088 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03fc8fc4-a59d-4f1d-9187-d4fd05999c10" containerName="registry-server" Mar 14 09:13:45 crc kubenswrapper[5129]: I0314 09:13:45.665101 5129 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="03fc8fc4-a59d-4f1d-9187-d4fd05999c10" containerName="registry-server" Mar 14 09:13:45 crc kubenswrapper[5129]: E0314 09:13:45.665132 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03fc8fc4-a59d-4f1d-9187-d4fd05999c10" containerName="extract-content" Mar 14 09:13:45 crc kubenswrapper[5129]: I0314 09:13:45.665140 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="03fc8fc4-a59d-4f1d-9187-d4fd05999c10" containerName="extract-content" Mar 14 09:13:45 crc kubenswrapper[5129]: E0314 09:13:45.665150 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03fc8fc4-a59d-4f1d-9187-d4fd05999c10" containerName="extract-utilities" Mar 14 09:13:45 crc kubenswrapper[5129]: I0314 09:13:45.665158 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="03fc8fc4-a59d-4f1d-9187-d4fd05999c10" containerName="extract-utilities" Mar 14 09:13:45 crc kubenswrapper[5129]: I0314 09:13:45.665339 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="03fc8fc4-a59d-4f1d-9187-d4fd05999c10" containerName="registry-server" Mar 14 09:13:45 crc kubenswrapper[5129]: I0314 09:13:45.666716 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l7tl9" Mar 14 09:13:45 crc kubenswrapper[5129]: I0314 09:13:45.685719 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l7tl9"] Mar 14 09:13:45 crc kubenswrapper[5129]: I0314 09:13:45.851356 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e836640e-ff5a-4c1f-9957-3f92e64022bf-utilities\") pod \"certified-operators-l7tl9\" (UID: \"e836640e-ff5a-4c1f-9957-3f92e64022bf\") " pod="openshift-marketplace/certified-operators-l7tl9" Mar 14 09:13:45 crc kubenswrapper[5129]: I0314 09:13:45.851574 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e836640e-ff5a-4c1f-9957-3f92e64022bf-catalog-content\") pod \"certified-operators-l7tl9\" (UID: \"e836640e-ff5a-4c1f-9957-3f92e64022bf\") " pod="openshift-marketplace/certified-operators-l7tl9" Mar 14 09:13:45 crc kubenswrapper[5129]: I0314 09:13:45.851982 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qtr5\" (UniqueName: \"kubernetes.io/projected/e836640e-ff5a-4c1f-9957-3f92e64022bf-kube-api-access-9qtr5\") pod \"certified-operators-l7tl9\" (UID: \"e836640e-ff5a-4c1f-9957-3f92e64022bf\") " pod="openshift-marketplace/certified-operators-l7tl9" Mar 14 09:13:45 crc kubenswrapper[5129]: I0314 09:13:45.953950 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qtr5\" (UniqueName: \"kubernetes.io/projected/e836640e-ff5a-4c1f-9957-3f92e64022bf-kube-api-access-9qtr5\") pod \"certified-operators-l7tl9\" (UID: \"e836640e-ff5a-4c1f-9957-3f92e64022bf\") " pod="openshift-marketplace/certified-operators-l7tl9" Mar 14 09:13:45 crc kubenswrapper[5129]: I0314 09:13:45.954020 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e836640e-ff5a-4c1f-9957-3f92e64022bf-utilities\") pod \"certified-operators-l7tl9\" (UID: \"e836640e-ff5a-4c1f-9957-3f92e64022bf\") " pod="openshift-marketplace/certified-operators-l7tl9" Mar 14 09:13:45 crc kubenswrapper[5129]: I0314 09:13:45.954105 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e836640e-ff5a-4c1f-9957-3f92e64022bf-catalog-content\") pod \"certified-operators-l7tl9\" (UID: \"e836640e-ff5a-4c1f-9957-3f92e64022bf\") " pod="openshift-marketplace/certified-operators-l7tl9" Mar 14 09:13:45 crc kubenswrapper[5129]: I0314 09:13:45.954911 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e836640e-ff5a-4c1f-9957-3f92e64022bf-utilities\") pod \"certified-operators-l7tl9\" (UID: \"e836640e-ff5a-4c1f-9957-3f92e64022bf\") " pod="openshift-marketplace/certified-operators-l7tl9" Mar 14 09:13:45 crc kubenswrapper[5129]: I0314 09:13:45.954941 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e836640e-ff5a-4c1f-9957-3f92e64022bf-catalog-content\") pod \"certified-operators-l7tl9\" (UID: \"e836640e-ff5a-4c1f-9957-3f92e64022bf\") " pod="openshift-marketplace/certified-operators-l7tl9" Mar 14 09:13:45 crc kubenswrapper[5129]: I0314 09:13:45.974150 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qtr5\" (UniqueName: \"kubernetes.io/projected/e836640e-ff5a-4c1f-9957-3f92e64022bf-kube-api-access-9qtr5\") pod \"certified-operators-l7tl9\" (UID: \"e836640e-ff5a-4c1f-9957-3f92e64022bf\") " pod="openshift-marketplace/certified-operators-l7tl9" Mar 14 09:13:45 crc kubenswrapper[5129]: I0314 09:13:45.992543 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l7tl9" Mar 14 09:13:46 crc kubenswrapper[5129]: I0314 09:13:46.627590 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l7tl9"] Mar 14 09:13:47 crc kubenswrapper[5129]: I0314 09:13:47.036799 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:13:47 crc kubenswrapper[5129]: E0314 09:13:47.038501 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:13:47 crc kubenswrapper[5129]: I0314 09:13:47.049682 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2nsmt"] Mar 14 09:13:47 crc kubenswrapper[5129]: I0314 09:13:47.063907 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-2jltq"] Mar 14 09:13:47 crc kubenswrapper[5129]: I0314 09:13:47.073881 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2nsmt"] Mar 14 09:13:47 crc kubenswrapper[5129]: I0314 09:13:47.083448 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-2jltq"] Mar 14 09:13:47 crc kubenswrapper[5129]: I0314 09:13:47.327281 5129 generic.go:334] "Generic (PLEG): container finished" podID="e836640e-ff5a-4c1f-9957-3f92e64022bf" containerID="ed9be5d4c3d37252989da75dba37b58611690e7c40b6f39048c156ed78112805" exitCode=0 Mar 14 09:13:47 crc kubenswrapper[5129]: I0314 09:13:47.327320 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-l7tl9" event={"ID":"e836640e-ff5a-4c1f-9957-3f92e64022bf","Type":"ContainerDied","Data":"ed9be5d4c3d37252989da75dba37b58611690e7c40b6f39048c156ed78112805"} Mar 14 09:13:47 crc kubenswrapper[5129]: I0314 09:13:47.327346 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7tl9" event={"ID":"e836640e-ff5a-4c1f-9957-3f92e64022bf","Type":"ContainerStarted","Data":"70bc668602078944af6e3d416721f0cdceadf0331066e31a12a23c578bdc1339"} Mar 14 09:13:48 crc kubenswrapper[5129]: I0314 09:13:48.048489 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f7e6d99-1b35-48f2-aad6-6724ffe3629e" path="/var/lib/kubelet/pods/7f7e6d99-1b35-48f2-aad6-6724ffe3629e/volumes" Mar 14 09:13:48 crc kubenswrapper[5129]: I0314 09:13:48.049378 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682" path="/var/lib/kubelet/pods/c66b44f9-85eb-4e9f-b6ca-9b1d0a43c682/volumes" Mar 14 09:13:48 crc kubenswrapper[5129]: I0314 09:13:48.336726 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7tl9" event={"ID":"e836640e-ff5a-4c1f-9957-3f92e64022bf","Type":"ContainerStarted","Data":"6e7d488a0978589903c98480f33b37531c97e045b2a5952f9f2d9b359852c169"} Mar 14 09:13:50 crc kubenswrapper[5129]: I0314 09:13:50.361376 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7tl9" event={"ID":"e836640e-ff5a-4c1f-9957-3f92e64022bf","Type":"ContainerDied","Data":"6e7d488a0978589903c98480f33b37531c97e045b2a5952f9f2d9b359852c169"} Mar 14 09:13:50 crc kubenswrapper[5129]: I0314 09:13:50.361616 5129 generic.go:334] "Generic (PLEG): container finished" podID="e836640e-ff5a-4c1f-9957-3f92e64022bf" containerID="6e7d488a0978589903c98480f33b37531c97e045b2a5952f9f2d9b359852c169" exitCode=0 Mar 14 09:13:51 crc kubenswrapper[5129]: I0314 09:13:51.371840 5129 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7tl9" event={"ID":"e836640e-ff5a-4c1f-9957-3f92e64022bf","Type":"ContainerStarted","Data":"4f0c2ea444aae8ee74030cdb4fcc05ae11d26d344bc9d34fc0e042ab64127049"} Mar 14 09:13:51 crc kubenswrapper[5129]: I0314 09:13:51.399823 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l7tl9" podStartSLOduration=2.932098059 podStartE2EDuration="6.399802172s" podCreationTimestamp="2026-03-14 09:13:45 +0000 UTC" firstStartedPulling="2026-03-14 09:13:47.331361637 +0000 UTC m=+8090.083276821" lastFinishedPulling="2026-03-14 09:13:50.79906576 +0000 UTC m=+8093.550980934" observedRunningTime="2026-03-14 09:13:51.390668824 +0000 UTC m=+8094.142584018" watchObservedRunningTime="2026-03-14 09:13:51.399802172 +0000 UTC m=+8094.151717356" Mar 14 09:13:55 crc kubenswrapper[5129]: I0314 09:13:55.993703 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l7tl9" Mar 14 09:13:55 crc kubenswrapper[5129]: I0314 09:13:55.994376 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l7tl9" Mar 14 09:13:56 crc kubenswrapper[5129]: I0314 09:13:56.087486 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l7tl9" Mar 14 09:13:56 crc kubenswrapper[5129]: I0314 09:13:56.490394 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l7tl9" Mar 14 09:13:56 crc kubenswrapper[5129]: I0314 09:13:56.558019 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l7tl9"] Mar 14 09:13:58 crc kubenswrapper[5129]: I0314 09:13:58.460669 5129 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-l7tl9" podUID="e836640e-ff5a-4c1f-9957-3f92e64022bf" containerName="registry-server" containerID="cri-o://4f0c2ea444aae8ee74030cdb4fcc05ae11d26d344bc9d34fc0e042ab64127049" gracePeriod=2 Mar 14 09:13:58 crc kubenswrapper[5129]: I0314 09:13:58.965751 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l7tl9" Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.097385 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e836640e-ff5a-4c1f-9957-3f92e64022bf-catalog-content\") pod \"e836640e-ff5a-4c1f-9957-3f92e64022bf\" (UID: \"e836640e-ff5a-4c1f-9957-3f92e64022bf\") " Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.097814 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e836640e-ff5a-4c1f-9957-3f92e64022bf-utilities\") pod \"e836640e-ff5a-4c1f-9957-3f92e64022bf\" (UID: \"e836640e-ff5a-4c1f-9957-3f92e64022bf\") " Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.097857 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qtr5\" (UniqueName: \"kubernetes.io/projected/e836640e-ff5a-4c1f-9957-3f92e64022bf-kube-api-access-9qtr5\") pod \"e836640e-ff5a-4c1f-9957-3f92e64022bf\" (UID: \"e836640e-ff5a-4c1f-9957-3f92e64022bf\") " Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.099361 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e836640e-ff5a-4c1f-9957-3f92e64022bf-utilities" (OuterVolumeSpecName: "utilities") pod "e836640e-ff5a-4c1f-9957-3f92e64022bf" (UID: "e836640e-ff5a-4c1f-9957-3f92e64022bf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.109036 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e836640e-ff5a-4c1f-9957-3f92e64022bf-kube-api-access-9qtr5" (OuterVolumeSpecName: "kube-api-access-9qtr5") pod "e836640e-ff5a-4c1f-9957-3f92e64022bf" (UID: "e836640e-ff5a-4c1f-9957-3f92e64022bf"). InnerVolumeSpecName "kube-api-access-9qtr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.172591 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e836640e-ff5a-4c1f-9957-3f92e64022bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e836640e-ff5a-4c1f-9957-3f92e64022bf" (UID: "e836640e-ff5a-4c1f-9957-3f92e64022bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.200003 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e836640e-ff5a-4c1f-9957-3f92e64022bf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.200304 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e836640e-ff5a-4c1f-9957-3f92e64022bf-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.200396 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qtr5\" (UniqueName: \"kubernetes.io/projected/e836640e-ff5a-4c1f-9957-3f92e64022bf-kube-api-access-9qtr5\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.479662 5129 generic.go:334] "Generic (PLEG): container finished" podID="e836640e-ff5a-4c1f-9957-3f92e64022bf" 
containerID="4f0c2ea444aae8ee74030cdb4fcc05ae11d26d344bc9d34fc0e042ab64127049" exitCode=0 Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.479743 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7tl9" event={"ID":"e836640e-ff5a-4c1f-9957-3f92e64022bf","Type":"ContainerDied","Data":"4f0c2ea444aae8ee74030cdb4fcc05ae11d26d344bc9d34fc0e042ab64127049"} Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.479794 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7tl9" event={"ID":"e836640e-ff5a-4c1f-9957-3f92e64022bf","Type":"ContainerDied","Data":"70bc668602078944af6e3d416721f0cdceadf0331066e31a12a23c578bdc1339"} Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.479824 5129 scope.go:117] "RemoveContainer" containerID="4f0c2ea444aae8ee74030cdb4fcc05ae11d26d344bc9d34fc0e042ab64127049" Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.479953 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l7tl9" Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.511318 5129 scope.go:117] "RemoveContainer" containerID="6e7d488a0978589903c98480f33b37531c97e045b2a5952f9f2d9b359852c169" Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.545650 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l7tl9"] Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.546836 5129 scope.go:117] "RemoveContainer" containerID="ed9be5d4c3d37252989da75dba37b58611690e7c40b6f39048c156ed78112805" Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.560263 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l7tl9"] Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.617474 5129 scope.go:117] "RemoveContainer" containerID="4f0c2ea444aae8ee74030cdb4fcc05ae11d26d344bc9d34fc0e042ab64127049" Mar 14 09:13:59 crc kubenswrapper[5129]: E0314 09:13:59.618160 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f0c2ea444aae8ee74030cdb4fcc05ae11d26d344bc9d34fc0e042ab64127049\": container with ID starting with 4f0c2ea444aae8ee74030cdb4fcc05ae11d26d344bc9d34fc0e042ab64127049 not found: ID does not exist" containerID="4f0c2ea444aae8ee74030cdb4fcc05ae11d26d344bc9d34fc0e042ab64127049" Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.618195 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0c2ea444aae8ee74030cdb4fcc05ae11d26d344bc9d34fc0e042ab64127049"} err="failed to get container status \"4f0c2ea444aae8ee74030cdb4fcc05ae11d26d344bc9d34fc0e042ab64127049\": rpc error: code = NotFound desc = could not find container \"4f0c2ea444aae8ee74030cdb4fcc05ae11d26d344bc9d34fc0e042ab64127049\": container with ID starting with 4f0c2ea444aae8ee74030cdb4fcc05ae11d26d344bc9d34fc0e042ab64127049 not 
found: ID does not exist" Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.618223 5129 scope.go:117] "RemoveContainer" containerID="6e7d488a0978589903c98480f33b37531c97e045b2a5952f9f2d9b359852c169" Mar 14 09:13:59 crc kubenswrapper[5129]: E0314 09:13:59.618795 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e7d488a0978589903c98480f33b37531c97e045b2a5952f9f2d9b359852c169\": container with ID starting with 6e7d488a0978589903c98480f33b37531c97e045b2a5952f9f2d9b359852c169 not found: ID does not exist" containerID="6e7d488a0978589903c98480f33b37531c97e045b2a5952f9f2d9b359852c169" Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.618816 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e7d488a0978589903c98480f33b37531c97e045b2a5952f9f2d9b359852c169"} err="failed to get container status \"6e7d488a0978589903c98480f33b37531c97e045b2a5952f9f2d9b359852c169\": rpc error: code = NotFound desc = could not find container \"6e7d488a0978589903c98480f33b37531c97e045b2a5952f9f2d9b359852c169\": container with ID starting with 6e7d488a0978589903c98480f33b37531c97e045b2a5952f9f2d9b359852c169 not found: ID does not exist" Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.618828 5129 scope.go:117] "RemoveContainer" containerID="ed9be5d4c3d37252989da75dba37b58611690e7c40b6f39048c156ed78112805" Mar 14 09:13:59 crc kubenswrapper[5129]: E0314 09:13:59.619134 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed9be5d4c3d37252989da75dba37b58611690e7c40b6f39048c156ed78112805\": container with ID starting with ed9be5d4c3d37252989da75dba37b58611690e7c40b6f39048c156ed78112805 not found: ID does not exist" containerID="ed9be5d4c3d37252989da75dba37b58611690e7c40b6f39048c156ed78112805" Mar 14 09:13:59 crc kubenswrapper[5129]: I0314 09:13:59.619151 5129 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed9be5d4c3d37252989da75dba37b58611690e7c40b6f39048c156ed78112805"} err="failed to get container status \"ed9be5d4c3d37252989da75dba37b58611690e7c40b6f39048c156ed78112805\": rpc error: code = NotFound desc = could not find container \"ed9be5d4c3d37252989da75dba37b58611690e7c40b6f39048c156ed78112805\": container with ID starting with ed9be5d4c3d37252989da75dba37b58611690e7c40b6f39048c156ed78112805 not found: ID does not exist" Mar 14 09:14:00 crc kubenswrapper[5129]: I0314 09:14:00.038493 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:14:00 crc kubenswrapper[5129]: E0314 09:14:00.039293 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:14:00 crc kubenswrapper[5129]: I0314 09:14:00.048035 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e836640e-ff5a-4c1f-9957-3f92e64022bf" path="/var/lib/kubelet/pods/e836640e-ff5a-4c1f-9957-3f92e64022bf/volumes" Mar 14 09:14:00 crc kubenswrapper[5129]: I0314 09:14:00.149449 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557994-4lts8"] Mar 14 09:14:00 crc kubenswrapper[5129]: E0314 09:14:00.150133 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e836640e-ff5a-4c1f-9957-3f92e64022bf" containerName="extract-content" Mar 14 09:14:00 crc kubenswrapper[5129]: I0314 09:14:00.150157 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e836640e-ff5a-4c1f-9957-3f92e64022bf" 
containerName="extract-content" Mar 14 09:14:00 crc kubenswrapper[5129]: E0314 09:14:00.150178 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e836640e-ff5a-4c1f-9957-3f92e64022bf" containerName="extract-utilities" Mar 14 09:14:00 crc kubenswrapper[5129]: I0314 09:14:00.150186 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e836640e-ff5a-4c1f-9957-3f92e64022bf" containerName="extract-utilities" Mar 14 09:14:00 crc kubenswrapper[5129]: E0314 09:14:00.150202 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e836640e-ff5a-4c1f-9957-3f92e64022bf" containerName="registry-server" Mar 14 09:14:00 crc kubenswrapper[5129]: I0314 09:14:00.150208 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e836640e-ff5a-4c1f-9957-3f92e64022bf" containerName="registry-server" Mar 14 09:14:00 crc kubenswrapper[5129]: I0314 09:14:00.150442 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e836640e-ff5a-4c1f-9957-3f92e64022bf" containerName="registry-server" Mar 14 09:14:00 crc kubenswrapper[5129]: I0314 09:14:00.151303 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557994-4lts8" Mar 14 09:14:00 crc kubenswrapper[5129]: I0314 09:14:00.153786 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:14:00 crc kubenswrapper[5129]: I0314 09:14:00.153790 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:14:00 crc kubenswrapper[5129]: I0314 09:14:00.157517 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:14:00 crc kubenswrapper[5129]: I0314 09:14:00.159461 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557994-4lts8"] Mar 14 09:14:00 crc kubenswrapper[5129]: I0314 09:14:00.326819 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd2kb\" (UniqueName: \"kubernetes.io/projected/2019223e-c2a9-4105-aa7e-103a46334d67-kube-api-access-gd2kb\") pod \"auto-csr-approver-29557994-4lts8\" (UID: \"2019223e-c2a9-4105-aa7e-103a46334d67\") " pod="openshift-infra/auto-csr-approver-29557994-4lts8" Mar 14 09:14:00 crc kubenswrapper[5129]: I0314 09:14:00.428807 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd2kb\" (UniqueName: \"kubernetes.io/projected/2019223e-c2a9-4105-aa7e-103a46334d67-kube-api-access-gd2kb\") pod \"auto-csr-approver-29557994-4lts8\" (UID: \"2019223e-c2a9-4105-aa7e-103a46334d67\") " pod="openshift-infra/auto-csr-approver-29557994-4lts8" Mar 14 09:14:00 crc kubenswrapper[5129]: I0314 09:14:00.449106 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd2kb\" (UniqueName: \"kubernetes.io/projected/2019223e-c2a9-4105-aa7e-103a46334d67-kube-api-access-gd2kb\") pod \"auto-csr-approver-29557994-4lts8\" (UID: \"2019223e-c2a9-4105-aa7e-103a46334d67\") " 
pod="openshift-infra/auto-csr-approver-29557994-4lts8" Mar 14 09:14:00 crc kubenswrapper[5129]: I0314 09:14:00.471437 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557994-4lts8" Mar 14 09:14:00 crc kubenswrapper[5129]: W0314 09:14:00.926731 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2019223e_c2a9_4105_aa7e_103a46334d67.slice/crio-f98b08c44dc266aa98a7beb643ec19e995389b8dfa471de3856959b42795b72b WatchSource:0}: Error finding container f98b08c44dc266aa98a7beb643ec19e995389b8dfa471de3856959b42795b72b: Status 404 returned error can't find the container with id f98b08c44dc266aa98a7beb643ec19e995389b8dfa471de3856959b42795b72b Mar 14 09:14:00 crc kubenswrapper[5129]: I0314 09:14:00.927826 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557994-4lts8"] Mar 14 09:14:01 crc kubenswrapper[5129]: I0314 09:14:01.512314 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557994-4lts8" event={"ID":"2019223e-c2a9-4105-aa7e-103a46334d67","Type":"ContainerStarted","Data":"f98b08c44dc266aa98a7beb643ec19e995389b8dfa471de3856959b42795b72b"} Mar 14 09:14:02 crc kubenswrapper[5129]: I0314 09:14:02.527097 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557994-4lts8" event={"ID":"2019223e-c2a9-4105-aa7e-103a46334d67","Type":"ContainerStarted","Data":"ea696cfcd4dc020003e9f093b3b0d58d1bd2e07121923fc921b4acbc155766fe"} Mar 14 09:14:02 crc kubenswrapper[5129]: I0314 09:14:02.556927 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557994-4lts8" podStartSLOduration=1.64631806 podStartE2EDuration="2.556903141s" podCreationTimestamp="2026-03-14 09:14:00 +0000 UTC" firstStartedPulling="2026-03-14 09:14:00.932246703 +0000 UTC 
m=+8103.684161887" lastFinishedPulling="2026-03-14 09:14:01.842831784 +0000 UTC m=+8104.594746968" observedRunningTime="2026-03-14 09:14:02.546358965 +0000 UTC m=+8105.298274139" watchObservedRunningTime="2026-03-14 09:14:02.556903141 +0000 UTC m=+8105.308818325" Mar 14 09:14:03 crc kubenswrapper[5129]: I0314 09:14:03.542186 5129 generic.go:334] "Generic (PLEG): container finished" podID="2019223e-c2a9-4105-aa7e-103a46334d67" containerID="ea696cfcd4dc020003e9f093b3b0d58d1bd2e07121923fc921b4acbc155766fe" exitCode=0 Mar 14 09:14:03 crc kubenswrapper[5129]: I0314 09:14:03.542450 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557994-4lts8" event={"ID":"2019223e-c2a9-4105-aa7e-103a46334d67","Type":"ContainerDied","Data":"ea696cfcd4dc020003e9f093b3b0d58d1bd2e07121923fc921b4acbc155766fe"} Mar 14 09:14:04 crc kubenswrapper[5129]: I0314 09:14:04.927327 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557994-4lts8" Mar 14 09:14:05 crc kubenswrapper[5129]: I0314 09:14:05.039836 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd2kb\" (UniqueName: \"kubernetes.io/projected/2019223e-c2a9-4105-aa7e-103a46334d67-kube-api-access-gd2kb\") pod \"2019223e-c2a9-4105-aa7e-103a46334d67\" (UID: \"2019223e-c2a9-4105-aa7e-103a46334d67\") " Mar 14 09:14:05 crc kubenswrapper[5129]: I0314 09:14:05.045677 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2019223e-c2a9-4105-aa7e-103a46334d67-kube-api-access-gd2kb" (OuterVolumeSpecName: "kube-api-access-gd2kb") pod "2019223e-c2a9-4105-aa7e-103a46334d67" (UID: "2019223e-c2a9-4105-aa7e-103a46334d67"). InnerVolumeSpecName "kube-api-access-gd2kb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:14:05 crc kubenswrapper[5129]: I0314 09:14:05.142985 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd2kb\" (UniqueName: \"kubernetes.io/projected/2019223e-c2a9-4105-aa7e-103a46334d67-kube-api-access-gd2kb\") on node \"crc\" DevicePath \"\"" Mar 14 09:14:05 crc kubenswrapper[5129]: I0314 09:14:05.560517 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557994-4lts8" event={"ID":"2019223e-c2a9-4105-aa7e-103a46334d67","Type":"ContainerDied","Data":"f98b08c44dc266aa98a7beb643ec19e995389b8dfa471de3856959b42795b72b"} Mar 14 09:14:05 crc kubenswrapper[5129]: I0314 09:14:05.560564 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f98b08c44dc266aa98a7beb643ec19e995389b8dfa471de3856959b42795b72b" Mar 14 09:14:05 crc kubenswrapper[5129]: I0314 09:14:05.560595 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557994-4lts8" Mar 14 09:14:05 crc kubenswrapper[5129]: I0314 09:14:05.632593 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557988-84zvt"] Mar 14 09:14:05 crc kubenswrapper[5129]: I0314 09:14:05.642260 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557988-84zvt"] Mar 14 09:14:06 crc kubenswrapper[5129]: I0314 09:14:06.048756 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be1f1ae2-41a9-4559-a79d-3dd00f407d52" path="/var/lib/kubelet/pods/be1f1ae2-41a9-4559-a79d-3dd00f407d52/volumes" Mar 14 09:14:15 crc kubenswrapper[5129]: I0314 09:14:15.036905 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:14:15 crc kubenswrapper[5129]: E0314 09:14:15.037723 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:14:30 crc kubenswrapper[5129]: I0314 09:14:30.037072 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:14:30 crc kubenswrapper[5129]: I0314 09:14:30.850552 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"94aa21ce282c2cbabcf60006572ac233594c85a69c6212b612a74372c03ca9bb"} Mar 14 09:14:33 crc kubenswrapper[5129]: I0314 09:14:33.722970 5129 scope.go:117] "RemoveContainer" containerID="4fd366e6b4f84257df2dc2e4364d0afea5c792265d18ffa6544d62f22641b294" Mar 14 09:14:33 crc kubenswrapper[5129]: I0314 09:14:33.766451 5129 scope.go:117] "RemoveContainer" containerID="e016520c9199256e4a26a49ccdffb63f5b0cc903389cd9db3ab67bd4cb64a8c8" Mar 14 09:14:33 crc kubenswrapper[5129]: I0314 09:14:33.830227 5129 scope.go:117] "RemoveContainer" containerID="9341c9da77ed187d1d35be3fc1af0b5c5de34df01fdca6bcf9392e148cd4d364" Mar 14 09:14:36 crc kubenswrapper[5129]: I0314 09:14:36.053206 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-hznb2"] Mar 14 09:14:36 crc kubenswrapper[5129]: I0314 09:14:36.053524 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-hznb2"] Mar 14 09:14:38 crc kubenswrapper[5129]: I0314 09:14:38.050712 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf5ebad0-4050-4467-918f-18b373bd269a" path="/var/lib/kubelet/pods/bf5ebad0-4050-4467-918f-18b373bd269a/volumes" Mar 14 
09:15:00 crc kubenswrapper[5129]: I0314 09:15:00.157412 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm"] Mar 14 09:15:00 crc kubenswrapper[5129]: E0314 09:15:00.158731 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2019223e-c2a9-4105-aa7e-103a46334d67" containerName="oc" Mar 14 09:15:00 crc kubenswrapper[5129]: I0314 09:15:00.158752 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2019223e-c2a9-4105-aa7e-103a46334d67" containerName="oc" Mar 14 09:15:00 crc kubenswrapper[5129]: I0314 09:15:00.159049 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2019223e-c2a9-4105-aa7e-103a46334d67" containerName="oc" Mar 14 09:15:00 crc kubenswrapper[5129]: I0314 09:15:00.160204 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm" Mar 14 09:15:00 crc kubenswrapper[5129]: I0314 09:15:00.167337 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 09:15:00 crc kubenswrapper[5129]: I0314 09:15:00.167570 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 09:15:00 crc kubenswrapper[5129]: I0314 09:15:00.181860 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm"] Mar 14 09:15:00 crc kubenswrapper[5129]: I0314 09:15:00.262441 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/631a6666-44c1-4fdf-8624-a24d98b17cd0-secret-volume\") pod \"collect-profiles-29557995-wlwrm\" (UID: \"631a6666-44c1-4fdf-8624-a24d98b17cd0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm" Mar 14 
09:15:00 crc kubenswrapper[5129]: I0314 09:15:00.262535 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdr7r\" (UniqueName: \"kubernetes.io/projected/631a6666-44c1-4fdf-8624-a24d98b17cd0-kube-api-access-gdr7r\") pod \"collect-profiles-29557995-wlwrm\" (UID: \"631a6666-44c1-4fdf-8624-a24d98b17cd0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm" Mar 14 09:15:00 crc kubenswrapper[5129]: I0314 09:15:00.262681 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/631a6666-44c1-4fdf-8624-a24d98b17cd0-config-volume\") pod \"collect-profiles-29557995-wlwrm\" (UID: \"631a6666-44c1-4fdf-8624-a24d98b17cd0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm" Mar 14 09:15:00 crc kubenswrapper[5129]: I0314 09:15:00.365411 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/631a6666-44c1-4fdf-8624-a24d98b17cd0-secret-volume\") pod \"collect-profiles-29557995-wlwrm\" (UID: \"631a6666-44c1-4fdf-8624-a24d98b17cd0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm" Mar 14 09:15:00 crc kubenswrapper[5129]: I0314 09:15:00.365498 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdr7r\" (UniqueName: \"kubernetes.io/projected/631a6666-44c1-4fdf-8624-a24d98b17cd0-kube-api-access-gdr7r\") pod \"collect-profiles-29557995-wlwrm\" (UID: \"631a6666-44c1-4fdf-8624-a24d98b17cd0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm" Mar 14 09:15:00 crc kubenswrapper[5129]: I0314 09:15:00.365671 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/631a6666-44c1-4fdf-8624-a24d98b17cd0-config-volume\") pod 
\"collect-profiles-29557995-wlwrm\" (UID: \"631a6666-44c1-4fdf-8624-a24d98b17cd0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm" Mar 14 09:15:00 crc kubenswrapper[5129]: I0314 09:15:00.367188 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/631a6666-44c1-4fdf-8624-a24d98b17cd0-config-volume\") pod \"collect-profiles-29557995-wlwrm\" (UID: \"631a6666-44c1-4fdf-8624-a24d98b17cd0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm" Mar 14 09:15:00 crc kubenswrapper[5129]: I0314 09:15:00.373835 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/631a6666-44c1-4fdf-8624-a24d98b17cd0-secret-volume\") pod \"collect-profiles-29557995-wlwrm\" (UID: \"631a6666-44c1-4fdf-8624-a24d98b17cd0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm" Mar 14 09:15:00 crc kubenswrapper[5129]: I0314 09:15:00.399445 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdr7r\" (UniqueName: \"kubernetes.io/projected/631a6666-44c1-4fdf-8624-a24d98b17cd0-kube-api-access-gdr7r\") pod \"collect-profiles-29557995-wlwrm\" (UID: \"631a6666-44c1-4fdf-8624-a24d98b17cd0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm" Mar 14 09:15:00 crc kubenswrapper[5129]: I0314 09:15:00.482353 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm" Mar 14 09:15:01 crc kubenswrapper[5129]: I0314 09:15:01.031073 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm"] Mar 14 09:15:01 crc kubenswrapper[5129]: I0314 09:15:01.207700 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm" event={"ID":"631a6666-44c1-4fdf-8624-a24d98b17cd0","Type":"ContainerStarted","Data":"326384a4db5f88d74c50a0670e466bc07b98eed7043fff92ea573d7bb7877943"} Mar 14 09:15:02 crc kubenswrapper[5129]: I0314 09:15:02.221695 5129 generic.go:334] "Generic (PLEG): container finished" podID="631a6666-44c1-4fdf-8624-a24d98b17cd0" containerID="f1793d1eb809b9a7d523ecbacb2b6d2a88459ad7ae4c6cb13cec735cff050bb2" exitCode=0 Mar 14 09:15:02 crc kubenswrapper[5129]: I0314 09:15:02.221751 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm" event={"ID":"631a6666-44c1-4fdf-8624-a24d98b17cd0","Type":"ContainerDied","Data":"f1793d1eb809b9a7d523ecbacb2b6d2a88459ad7ae4c6cb13cec735cff050bb2"} Mar 14 09:15:03 crc kubenswrapper[5129]: I0314 09:15:03.589475 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm" Mar 14 09:15:03 crc kubenswrapper[5129]: I0314 09:15:03.760809 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/631a6666-44c1-4fdf-8624-a24d98b17cd0-config-volume\") pod \"631a6666-44c1-4fdf-8624-a24d98b17cd0\" (UID: \"631a6666-44c1-4fdf-8624-a24d98b17cd0\") " Mar 14 09:15:03 crc kubenswrapper[5129]: I0314 09:15:03.760964 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdr7r\" (UniqueName: \"kubernetes.io/projected/631a6666-44c1-4fdf-8624-a24d98b17cd0-kube-api-access-gdr7r\") pod \"631a6666-44c1-4fdf-8624-a24d98b17cd0\" (UID: \"631a6666-44c1-4fdf-8624-a24d98b17cd0\") " Mar 14 09:15:03 crc kubenswrapper[5129]: I0314 09:15:03.761028 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/631a6666-44c1-4fdf-8624-a24d98b17cd0-secret-volume\") pod \"631a6666-44c1-4fdf-8624-a24d98b17cd0\" (UID: \"631a6666-44c1-4fdf-8624-a24d98b17cd0\") " Mar 14 09:15:03 crc kubenswrapper[5129]: I0314 09:15:03.762029 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/631a6666-44c1-4fdf-8624-a24d98b17cd0-config-volume" (OuterVolumeSpecName: "config-volume") pod "631a6666-44c1-4fdf-8624-a24d98b17cd0" (UID: "631a6666-44c1-4fdf-8624-a24d98b17cd0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:15:03 crc kubenswrapper[5129]: I0314 09:15:03.768447 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/631a6666-44c1-4fdf-8624-a24d98b17cd0-kube-api-access-gdr7r" (OuterVolumeSpecName: "kube-api-access-gdr7r") pod "631a6666-44c1-4fdf-8624-a24d98b17cd0" (UID: "631a6666-44c1-4fdf-8624-a24d98b17cd0"). 
InnerVolumeSpecName "kube-api-access-gdr7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:15:03 crc kubenswrapper[5129]: I0314 09:15:03.770080 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/631a6666-44c1-4fdf-8624-a24d98b17cd0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "631a6666-44c1-4fdf-8624-a24d98b17cd0" (UID: "631a6666-44c1-4fdf-8624-a24d98b17cd0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:15:03 crc kubenswrapper[5129]: I0314 09:15:03.863309 5129 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/631a6666-44c1-4fdf-8624-a24d98b17cd0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:03 crc kubenswrapper[5129]: I0314 09:15:03.863366 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdr7r\" (UniqueName: \"kubernetes.io/projected/631a6666-44c1-4fdf-8624-a24d98b17cd0-kube-api-access-gdr7r\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:03 crc kubenswrapper[5129]: I0314 09:15:03.863379 5129 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/631a6666-44c1-4fdf-8624-a24d98b17cd0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:04 crc kubenswrapper[5129]: I0314 09:15:04.257290 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm" event={"ID":"631a6666-44c1-4fdf-8624-a24d98b17cd0","Type":"ContainerDied","Data":"326384a4db5f88d74c50a0670e466bc07b98eed7043fff92ea573d7bb7877943"} Mar 14 09:15:04 crc kubenswrapper[5129]: I0314 09:15:04.257346 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="326384a4db5f88d74c50a0670e466bc07b98eed7043fff92ea573d7bb7877943" Mar 14 09:15:04 crc kubenswrapper[5129]: I0314 09:15:04.257474 5129 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm" Mar 14 09:15:04 crc kubenswrapper[5129]: I0314 09:15:04.689902 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw"] Mar 14 09:15:04 crc kubenswrapper[5129]: I0314 09:15:04.699030 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557950-8f2sw"] Mar 14 09:15:06 crc kubenswrapper[5129]: I0314 09:15:06.054702 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92d908e0-7ff9-4857-9702-3e913e9750e0" path="/var/lib/kubelet/pods/92d908e0-7ff9-4857-9702-3e913e9750e0/volumes" Mar 14 09:15:07 crc kubenswrapper[5129]: I0314 09:15:07.260897 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ntdxw"] Mar 14 09:15:07 crc kubenswrapper[5129]: E0314 09:15:07.262093 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="631a6666-44c1-4fdf-8624-a24d98b17cd0" containerName="collect-profiles" Mar 14 09:15:07 crc kubenswrapper[5129]: I0314 09:15:07.262114 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="631a6666-44c1-4fdf-8624-a24d98b17cd0" containerName="collect-profiles" Mar 14 09:15:07 crc kubenswrapper[5129]: I0314 09:15:07.262426 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="631a6666-44c1-4fdf-8624-a24d98b17cd0" containerName="collect-profiles" Mar 14 09:15:07 crc kubenswrapper[5129]: I0314 09:15:07.266098 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ntdxw" Mar 14 09:15:07 crc kubenswrapper[5129]: I0314 09:15:07.283564 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ntdxw"] Mar 14 09:15:07 crc kubenswrapper[5129]: I0314 09:15:07.463957 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfl24\" (UniqueName: \"kubernetes.io/projected/e2092352-4e4f-4648-9413-1815466371b4-kube-api-access-dfl24\") pod \"redhat-operators-ntdxw\" (UID: \"e2092352-4e4f-4648-9413-1815466371b4\") " pod="openshift-marketplace/redhat-operators-ntdxw" Mar 14 09:15:07 crc kubenswrapper[5129]: I0314 09:15:07.464018 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2092352-4e4f-4648-9413-1815466371b4-utilities\") pod \"redhat-operators-ntdxw\" (UID: \"e2092352-4e4f-4648-9413-1815466371b4\") " pod="openshift-marketplace/redhat-operators-ntdxw" Mar 14 09:15:07 crc kubenswrapper[5129]: I0314 09:15:07.464095 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2092352-4e4f-4648-9413-1815466371b4-catalog-content\") pod \"redhat-operators-ntdxw\" (UID: \"e2092352-4e4f-4648-9413-1815466371b4\") " pod="openshift-marketplace/redhat-operators-ntdxw" Mar 14 09:15:07 crc kubenswrapper[5129]: I0314 09:15:07.566578 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfl24\" (UniqueName: \"kubernetes.io/projected/e2092352-4e4f-4648-9413-1815466371b4-kube-api-access-dfl24\") pod \"redhat-operators-ntdxw\" (UID: \"e2092352-4e4f-4648-9413-1815466371b4\") " pod="openshift-marketplace/redhat-operators-ntdxw" Mar 14 09:15:07 crc kubenswrapper[5129]: I0314 09:15:07.566660 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2092352-4e4f-4648-9413-1815466371b4-utilities\") pod \"redhat-operators-ntdxw\" (UID: \"e2092352-4e4f-4648-9413-1815466371b4\") " pod="openshift-marketplace/redhat-operators-ntdxw" Mar 14 09:15:07 crc kubenswrapper[5129]: I0314 09:15:07.566741 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2092352-4e4f-4648-9413-1815466371b4-catalog-content\") pod \"redhat-operators-ntdxw\" (UID: \"e2092352-4e4f-4648-9413-1815466371b4\") " pod="openshift-marketplace/redhat-operators-ntdxw" Mar 14 09:15:07 crc kubenswrapper[5129]: I0314 09:15:07.567368 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2092352-4e4f-4648-9413-1815466371b4-utilities\") pod \"redhat-operators-ntdxw\" (UID: \"e2092352-4e4f-4648-9413-1815466371b4\") " pod="openshift-marketplace/redhat-operators-ntdxw" Mar 14 09:15:07 crc kubenswrapper[5129]: I0314 09:15:07.567377 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2092352-4e4f-4648-9413-1815466371b4-catalog-content\") pod \"redhat-operators-ntdxw\" (UID: \"e2092352-4e4f-4648-9413-1815466371b4\") " pod="openshift-marketplace/redhat-operators-ntdxw" Mar 14 09:15:07 crc kubenswrapper[5129]: I0314 09:15:07.591069 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfl24\" (UniqueName: \"kubernetes.io/projected/e2092352-4e4f-4648-9413-1815466371b4-kube-api-access-dfl24\") pod \"redhat-operators-ntdxw\" (UID: \"e2092352-4e4f-4648-9413-1815466371b4\") " pod="openshift-marketplace/redhat-operators-ntdxw" Mar 14 09:15:07 crc kubenswrapper[5129]: I0314 09:15:07.886067 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ntdxw" Mar 14 09:15:08 crc kubenswrapper[5129]: I0314 09:15:08.423882 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ntdxw"] Mar 14 09:15:08 crc kubenswrapper[5129]: W0314 09:15:08.433798 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2092352_4e4f_4648_9413_1815466371b4.slice/crio-ab6a065e6e4de9ce0c943c3591b975d0a0384596446eb9f1ef051a12713c8df6 WatchSource:0}: Error finding container ab6a065e6e4de9ce0c943c3591b975d0a0384596446eb9f1ef051a12713c8df6: Status 404 returned error can't find the container with id ab6a065e6e4de9ce0c943c3591b975d0a0384596446eb9f1ef051a12713c8df6 Mar 14 09:15:09 crc kubenswrapper[5129]: I0314 09:15:09.304987 5129 generic.go:334] "Generic (PLEG): container finished" podID="e2092352-4e4f-4648-9413-1815466371b4" containerID="8a7e102b099676275a70c2331388dcc98f55b51967de5fcf9b4f25f990dffeb0" exitCode=0 Mar 14 09:15:09 crc kubenswrapper[5129]: I0314 09:15:09.305263 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntdxw" event={"ID":"e2092352-4e4f-4648-9413-1815466371b4","Type":"ContainerDied","Data":"8a7e102b099676275a70c2331388dcc98f55b51967de5fcf9b4f25f990dffeb0"} Mar 14 09:15:09 crc kubenswrapper[5129]: I0314 09:15:09.305291 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntdxw" event={"ID":"e2092352-4e4f-4648-9413-1815466371b4","Type":"ContainerStarted","Data":"ab6a065e6e4de9ce0c943c3591b975d0a0384596446eb9f1ef051a12713c8df6"} Mar 14 09:15:10 crc kubenswrapper[5129]: I0314 09:15:10.320543 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntdxw" 
event={"ID":"e2092352-4e4f-4648-9413-1815466371b4","Type":"ContainerStarted","Data":"b37893d9cd395b87d7038a711a6549bbd3ffef0db38382699cfb5fb35c2a2ea5"} Mar 14 09:15:15 crc kubenswrapper[5129]: E0314 09:15:15.021879 5129 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2092352_4e4f_4648_9413_1815466371b4.slice/crio-b37893d9cd395b87d7038a711a6549bbd3ffef0db38382699cfb5fb35c2a2ea5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2092352_4e4f_4648_9413_1815466371b4.slice/crio-conmon-b37893d9cd395b87d7038a711a6549bbd3ffef0db38382699cfb5fb35c2a2ea5.scope\": RecentStats: unable to find data in memory cache]" Mar 14 09:15:15 crc kubenswrapper[5129]: I0314 09:15:15.383222 5129 generic.go:334] "Generic (PLEG): container finished" podID="e2092352-4e4f-4648-9413-1815466371b4" containerID="b37893d9cd395b87d7038a711a6549bbd3ffef0db38382699cfb5fb35c2a2ea5" exitCode=0 Mar 14 09:15:15 crc kubenswrapper[5129]: I0314 09:15:15.383276 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntdxw" event={"ID":"e2092352-4e4f-4648-9413-1815466371b4","Type":"ContainerDied","Data":"b37893d9cd395b87d7038a711a6549bbd3ffef0db38382699cfb5fb35c2a2ea5"} Mar 14 09:15:16 crc kubenswrapper[5129]: I0314 09:15:16.397121 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntdxw" event={"ID":"e2092352-4e4f-4648-9413-1815466371b4","Type":"ContainerStarted","Data":"e6a63783aff314261cb4cbf78e15bb111799077fe2e6059185ab8133e8104d61"} Mar 14 09:15:16 crc kubenswrapper[5129]: I0314 09:15:16.421215 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ntdxw" podStartSLOduration=2.9454110289999997 podStartE2EDuration="9.421193301s" podCreationTimestamp="2026-03-14 
09:15:07 +0000 UTC" firstStartedPulling="2026-03-14 09:15:09.307370224 +0000 UTC m=+8172.059285408" lastFinishedPulling="2026-03-14 09:15:15.783152496 +0000 UTC m=+8178.535067680" observedRunningTime="2026-03-14 09:15:16.414240253 +0000 UTC m=+8179.166155447" watchObservedRunningTime="2026-03-14 09:15:16.421193301 +0000 UTC m=+8179.173108485" Mar 14 09:15:17 crc kubenswrapper[5129]: I0314 09:15:17.886460 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ntdxw" Mar 14 09:15:17 crc kubenswrapper[5129]: I0314 09:15:17.886562 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ntdxw" Mar 14 09:15:18 crc kubenswrapper[5129]: I0314 09:15:18.948494 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ntdxw" podUID="e2092352-4e4f-4648-9413-1815466371b4" containerName="registry-server" probeResult="failure" output=< Mar 14 09:15:18 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 09:15:18 crc kubenswrapper[5129]: > Mar 14 09:15:27 crc kubenswrapper[5129]: I0314 09:15:27.987939 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ntdxw" Mar 14 09:15:28 crc kubenswrapper[5129]: I0314 09:15:28.138419 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ntdxw" Mar 14 09:15:28 crc kubenswrapper[5129]: I0314 09:15:28.258552 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ntdxw"] Mar 14 09:15:29 crc kubenswrapper[5129]: I0314 09:15:29.535557 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ntdxw" podUID="e2092352-4e4f-4648-9413-1815466371b4" containerName="registry-server" 
containerID="cri-o://e6a63783aff314261cb4cbf78e15bb111799077fe2e6059185ab8133e8104d61" gracePeriod=2 Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.025936 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ntdxw" Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.110029 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2092352-4e4f-4648-9413-1815466371b4-catalog-content\") pod \"e2092352-4e4f-4648-9413-1815466371b4\" (UID: \"e2092352-4e4f-4648-9413-1815466371b4\") " Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.110116 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2092352-4e4f-4648-9413-1815466371b4-utilities\") pod \"e2092352-4e4f-4648-9413-1815466371b4\" (UID: \"e2092352-4e4f-4648-9413-1815466371b4\") " Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.110348 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfl24\" (UniqueName: \"kubernetes.io/projected/e2092352-4e4f-4648-9413-1815466371b4-kube-api-access-dfl24\") pod \"e2092352-4e4f-4648-9413-1815466371b4\" (UID: \"e2092352-4e4f-4648-9413-1815466371b4\") " Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.111320 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2092352-4e4f-4648-9413-1815466371b4-utilities" (OuterVolumeSpecName: "utilities") pod "e2092352-4e4f-4648-9413-1815466371b4" (UID: "e2092352-4e4f-4648-9413-1815466371b4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.121736 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2092352-4e4f-4648-9413-1815466371b4-kube-api-access-dfl24" (OuterVolumeSpecName: "kube-api-access-dfl24") pod "e2092352-4e4f-4648-9413-1815466371b4" (UID: "e2092352-4e4f-4648-9413-1815466371b4"). InnerVolumeSpecName "kube-api-access-dfl24". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.213544 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2092352-4e4f-4648-9413-1815466371b4-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.213641 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfl24\" (UniqueName: \"kubernetes.io/projected/e2092352-4e4f-4648-9413-1815466371b4-kube-api-access-dfl24\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.260702 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2092352-4e4f-4648-9413-1815466371b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2092352-4e4f-4648-9413-1815466371b4" (UID: "e2092352-4e4f-4648-9413-1815466371b4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.317097 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2092352-4e4f-4648-9413-1815466371b4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.547201 5129 generic.go:334] "Generic (PLEG): container finished" podID="e2092352-4e4f-4648-9413-1815466371b4" containerID="e6a63783aff314261cb4cbf78e15bb111799077fe2e6059185ab8133e8104d61" exitCode=0 Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.547256 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntdxw" event={"ID":"e2092352-4e4f-4648-9413-1815466371b4","Type":"ContainerDied","Data":"e6a63783aff314261cb4cbf78e15bb111799077fe2e6059185ab8133e8104d61"} Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.547294 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntdxw" event={"ID":"e2092352-4e4f-4648-9413-1815466371b4","Type":"ContainerDied","Data":"ab6a065e6e4de9ce0c943c3591b975d0a0384596446eb9f1ef051a12713c8df6"} Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.547333 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ntdxw" Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.547341 5129 scope.go:117] "RemoveContainer" containerID="e6a63783aff314261cb4cbf78e15bb111799077fe2e6059185ab8133e8104d61" Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.585130 5129 scope.go:117] "RemoveContainer" containerID="b37893d9cd395b87d7038a711a6549bbd3ffef0db38382699cfb5fb35c2a2ea5" Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.589564 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ntdxw"] Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.600236 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ntdxw"] Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.622418 5129 scope.go:117] "RemoveContainer" containerID="8a7e102b099676275a70c2331388dcc98f55b51967de5fcf9b4f25f990dffeb0" Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.694729 5129 scope.go:117] "RemoveContainer" containerID="e6a63783aff314261cb4cbf78e15bb111799077fe2e6059185ab8133e8104d61" Mar 14 09:15:30 crc kubenswrapper[5129]: E0314 09:15:30.695562 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6a63783aff314261cb4cbf78e15bb111799077fe2e6059185ab8133e8104d61\": container with ID starting with e6a63783aff314261cb4cbf78e15bb111799077fe2e6059185ab8133e8104d61 not found: ID does not exist" containerID="e6a63783aff314261cb4cbf78e15bb111799077fe2e6059185ab8133e8104d61" Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.695644 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6a63783aff314261cb4cbf78e15bb111799077fe2e6059185ab8133e8104d61"} err="failed to get container status \"e6a63783aff314261cb4cbf78e15bb111799077fe2e6059185ab8133e8104d61\": rpc error: code = NotFound desc = could not find container 
\"e6a63783aff314261cb4cbf78e15bb111799077fe2e6059185ab8133e8104d61\": container with ID starting with e6a63783aff314261cb4cbf78e15bb111799077fe2e6059185ab8133e8104d61 not found: ID does not exist" Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.695671 5129 scope.go:117] "RemoveContainer" containerID="b37893d9cd395b87d7038a711a6549bbd3ffef0db38382699cfb5fb35c2a2ea5" Mar 14 09:15:30 crc kubenswrapper[5129]: E0314 09:15:30.697906 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b37893d9cd395b87d7038a711a6549bbd3ffef0db38382699cfb5fb35c2a2ea5\": container with ID starting with b37893d9cd395b87d7038a711a6549bbd3ffef0db38382699cfb5fb35c2a2ea5 not found: ID does not exist" containerID="b37893d9cd395b87d7038a711a6549bbd3ffef0db38382699cfb5fb35c2a2ea5" Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.697970 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37893d9cd395b87d7038a711a6549bbd3ffef0db38382699cfb5fb35c2a2ea5"} err="failed to get container status \"b37893d9cd395b87d7038a711a6549bbd3ffef0db38382699cfb5fb35c2a2ea5\": rpc error: code = NotFound desc = could not find container \"b37893d9cd395b87d7038a711a6549bbd3ffef0db38382699cfb5fb35c2a2ea5\": container with ID starting with b37893d9cd395b87d7038a711a6549bbd3ffef0db38382699cfb5fb35c2a2ea5 not found: ID does not exist" Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.697986 5129 scope.go:117] "RemoveContainer" containerID="8a7e102b099676275a70c2331388dcc98f55b51967de5fcf9b4f25f990dffeb0" Mar 14 09:15:30 crc kubenswrapper[5129]: E0314 09:15:30.698260 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a7e102b099676275a70c2331388dcc98f55b51967de5fcf9b4f25f990dffeb0\": container with ID starting with 8a7e102b099676275a70c2331388dcc98f55b51967de5fcf9b4f25f990dffeb0 not found: ID does not exist" 
containerID="8a7e102b099676275a70c2331388dcc98f55b51967de5fcf9b4f25f990dffeb0" Mar 14 09:15:30 crc kubenswrapper[5129]: I0314 09:15:30.698280 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a7e102b099676275a70c2331388dcc98f55b51967de5fcf9b4f25f990dffeb0"} err="failed to get container status \"8a7e102b099676275a70c2331388dcc98f55b51967de5fcf9b4f25f990dffeb0\": rpc error: code = NotFound desc = could not find container \"8a7e102b099676275a70c2331388dcc98f55b51967de5fcf9b4f25f990dffeb0\": container with ID starting with 8a7e102b099676275a70c2331388dcc98f55b51967de5fcf9b4f25f990dffeb0 not found: ID does not exist" Mar 14 09:15:32 crc kubenswrapper[5129]: I0314 09:15:32.048023 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2092352-4e4f-4648-9413-1815466371b4" path="/var/lib/kubelet/pods/e2092352-4e4f-4648-9413-1815466371b4/volumes" Mar 14 09:15:33 crc kubenswrapper[5129]: I0314 09:15:33.969448 5129 scope.go:117] "RemoveContainer" containerID="0861d3de50b790da5f4a9a807f4108900a0f99e60667fb8a6f3bb3ca2babb3f4" Mar 14 09:15:33 crc kubenswrapper[5129]: I0314 09:15:33.991237 5129 scope.go:117] "RemoveContainer" containerID="a81c4c4b7be60df0c078b65c98bea1bd921c3091962c05a08a8532a20f292a02" Mar 14 09:15:34 crc kubenswrapper[5129]: I0314 09:15:34.010266 5129 scope.go:117] "RemoveContainer" containerID="e5d963130402a99df6b3b5de440dd2ced4e4272adf9e84093aee663e35af0886" Mar 14 09:16:00 crc kubenswrapper[5129]: I0314 09:16:00.144162 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557996-8wh64"] Mar 14 09:16:00 crc kubenswrapper[5129]: E0314 09:16:00.145146 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2092352-4e4f-4648-9413-1815466371b4" containerName="registry-server" Mar 14 09:16:00 crc kubenswrapper[5129]: I0314 09:16:00.145161 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2092352-4e4f-4648-9413-1815466371b4" 
containerName="registry-server" Mar 14 09:16:00 crc kubenswrapper[5129]: E0314 09:16:00.145183 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2092352-4e4f-4648-9413-1815466371b4" containerName="extract-content" Mar 14 09:16:00 crc kubenswrapper[5129]: I0314 09:16:00.145189 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2092352-4e4f-4648-9413-1815466371b4" containerName="extract-content" Mar 14 09:16:00 crc kubenswrapper[5129]: E0314 09:16:00.145203 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2092352-4e4f-4648-9413-1815466371b4" containerName="extract-utilities" Mar 14 09:16:00 crc kubenswrapper[5129]: I0314 09:16:00.145210 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2092352-4e4f-4648-9413-1815466371b4" containerName="extract-utilities" Mar 14 09:16:00 crc kubenswrapper[5129]: I0314 09:16:00.145385 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2092352-4e4f-4648-9413-1815466371b4" containerName="registry-server" Mar 14 09:16:00 crc kubenswrapper[5129]: I0314 09:16:00.146142 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557996-8wh64" Mar 14 09:16:00 crc kubenswrapper[5129]: I0314 09:16:00.148170 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:16:00 crc kubenswrapper[5129]: I0314 09:16:00.148679 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:16:00 crc kubenswrapper[5129]: I0314 09:16:00.149065 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:16:00 crc kubenswrapper[5129]: I0314 09:16:00.160411 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557996-8wh64"] Mar 14 09:16:00 crc kubenswrapper[5129]: I0314 09:16:00.255978 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79qkt\" (UniqueName: \"kubernetes.io/projected/c1e8f005-e776-479c-8946-e516b13555c7-kube-api-access-79qkt\") pod \"auto-csr-approver-29557996-8wh64\" (UID: \"c1e8f005-e776-479c-8946-e516b13555c7\") " pod="openshift-infra/auto-csr-approver-29557996-8wh64" Mar 14 09:16:00 crc kubenswrapper[5129]: I0314 09:16:00.358296 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79qkt\" (UniqueName: \"kubernetes.io/projected/c1e8f005-e776-479c-8946-e516b13555c7-kube-api-access-79qkt\") pod \"auto-csr-approver-29557996-8wh64\" (UID: \"c1e8f005-e776-479c-8946-e516b13555c7\") " pod="openshift-infra/auto-csr-approver-29557996-8wh64" Mar 14 09:16:00 crc kubenswrapper[5129]: I0314 09:16:00.379181 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79qkt\" (UniqueName: \"kubernetes.io/projected/c1e8f005-e776-479c-8946-e516b13555c7-kube-api-access-79qkt\") pod \"auto-csr-approver-29557996-8wh64\" (UID: \"c1e8f005-e776-479c-8946-e516b13555c7\") " 
pod="openshift-infra/auto-csr-approver-29557996-8wh64" Mar 14 09:16:00 crc kubenswrapper[5129]: I0314 09:16:00.468402 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557996-8wh64" Mar 14 09:16:00 crc kubenswrapper[5129]: I0314 09:16:00.939194 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557996-8wh64"] Mar 14 09:16:00 crc kubenswrapper[5129]: W0314 09:16:00.962295 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1e8f005_e776_479c_8946_e516b13555c7.slice/crio-b9c0613f096ab116e576333af33201a46287eac202568b9161a903a1d3641bbe WatchSource:0}: Error finding container b9c0613f096ab116e576333af33201a46287eac202568b9161a903a1d3641bbe: Status 404 returned error can't find the container with id b9c0613f096ab116e576333af33201a46287eac202568b9161a903a1d3641bbe Mar 14 09:16:01 crc kubenswrapper[5129]: I0314 09:16:01.900573 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557996-8wh64" event={"ID":"c1e8f005-e776-479c-8946-e516b13555c7","Type":"ContainerStarted","Data":"b9c0613f096ab116e576333af33201a46287eac202568b9161a903a1d3641bbe"} Mar 14 09:16:02 crc kubenswrapper[5129]: I0314 09:16:02.911959 5129 generic.go:334] "Generic (PLEG): container finished" podID="c1e8f005-e776-479c-8946-e516b13555c7" containerID="1c8d1b31196e1bac6fd195f3ed482fe9e2ae489d67afaffef42c8116433f847a" exitCode=0 Mar 14 09:16:02 crc kubenswrapper[5129]: I0314 09:16:02.912042 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557996-8wh64" event={"ID":"c1e8f005-e776-479c-8946-e516b13555c7","Type":"ContainerDied","Data":"1c8d1b31196e1bac6fd195f3ed482fe9e2ae489d67afaffef42c8116433f847a"} Mar 14 09:16:04 crc kubenswrapper[5129]: I0314 09:16:04.380883 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557996-8wh64" Mar 14 09:16:04 crc kubenswrapper[5129]: I0314 09:16:04.455241 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79qkt\" (UniqueName: \"kubernetes.io/projected/c1e8f005-e776-479c-8946-e516b13555c7-kube-api-access-79qkt\") pod \"c1e8f005-e776-479c-8946-e516b13555c7\" (UID: \"c1e8f005-e776-479c-8946-e516b13555c7\") " Mar 14 09:16:04 crc kubenswrapper[5129]: I0314 09:16:04.462322 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1e8f005-e776-479c-8946-e516b13555c7-kube-api-access-79qkt" (OuterVolumeSpecName: "kube-api-access-79qkt") pod "c1e8f005-e776-479c-8946-e516b13555c7" (UID: "c1e8f005-e776-479c-8946-e516b13555c7"). InnerVolumeSpecName "kube-api-access-79qkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:16:04 crc kubenswrapper[5129]: I0314 09:16:04.557692 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79qkt\" (UniqueName: \"kubernetes.io/projected/c1e8f005-e776-479c-8946-e516b13555c7-kube-api-access-79qkt\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:04 crc kubenswrapper[5129]: I0314 09:16:04.936210 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557996-8wh64" event={"ID":"c1e8f005-e776-479c-8946-e516b13555c7","Type":"ContainerDied","Data":"b9c0613f096ab116e576333af33201a46287eac202568b9161a903a1d3641bbe"} Mar 14 09:16:04 crc kubenswrapper[5129]: I0314 09:16:04.936254 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9c0613f096ab116e576333af33201a46287eac202568b9161a903a1d3641bbe" Mar 14 09:16:04 crc kubenswrapper[5129]: I0314 09:16:04.936265 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557996-8wh64" Mar 14 09:16:05 crc kubenswrapper[5129]: I0314 09:16:05.451666 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557990-swcvh"] Mar 14 09:16:05 crc kubenswrapper[5129]: I0314 09:16:05.462183 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557990-swcvh"] Mar 14 09:16:06 crc kubenswrapper[5129]: I0314 09:16:06.049097 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa91aa1c-d620-401e-a77f-2b5071e3338f" path="/var/lib/kubelet/pods/aa91aa1c-d620-401e-a77f-2b5071e3338f/volumes" Mar 14 09:16:34 crc kubenswrapper[5129]: I0314 09:16:34.203980 5129 scope.go:117] "RemoveContainer" containerID="5b3b7d4ae74df8f9a98df95a47a91da4d7b6dc86ffb6093ca9c06f4d79fb1d28" Mar 14 09:16:34 crc kubenswrapper[5129]: I0314 09:16:34.239674 5129 scope.go:117] "RemoveContainer" containerID="3c851339df5aa5000308021aaa1e8f5140c2ac9d984afd6a6f2a1009d42cf10f" Mar 14 09:16:34 crc kubenswrapper[5129]: I0314 09:16:34.294367 5129 scope.go:117] "RemoveContainer" containerID="e8485f22da847dd03986bf7d3ccb855bf06bcaffc3014457dc7b4ec346ffc709" Mar 14 09:16:34 crc kubenswrapper[5129]: I0314 09:16:34.339236 5129 scope.go:117] "RemoveContainer" containerID="e38711b51796d5b787c5a8ffdb9f06c9fbb7103044af0c7a8700968ee51f9f35" Mar 14 09:16:34 crc kubenswrapper[5129]: I0314 09:16:34.367800 5129 scope.go:117] "RemoveContainer" containerID="127c8bd0e76212ae2fc832e6f28179e0e16af3c786015d96af29e3969062c121" Mar 14 09:16:36 crc kubenswrapper[5129]: I0314 09:16:36.298062 5129 generic.go:334] "Generic (PLEG): container finished" podID="93cba590-03e6-448b-a8d0-d61dbe971a6f" containerID="1a6f3e906fc7089f90faafda269c88033aae2b12b2e32e32112bb18dcb43d3c7" exitCode=0 Mar 14 09:16:36 crc kubenswrapper[5129]: I0314 09:16:36.298152 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q" event={"ID":"93cba590-03e6-448b-a8d0-d61dbe971a6f","Type":"ContainerDied","Data":"1a6f3e906fc7089f90faafda269c88033aae2b12b2e32e32112bb18dcb43d3c7"} Mar 14 09:16:37 crc kubenswrapper[5129]: I0314 09:16:37.774482 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q" Mar 14 09:16:37 crc kubenswrapper[5129]: I0314 09:16:37.917799 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgzrv\" (UniqueName: \"kubernetes.io/projected/93cba590-03e6-448b-a8d0-d61dbe971a6f-kube-api-access-jgzrv\") pod \"93cba590-03e6-448b-a8d0-d61dbe971a6f\" (UID: \"93cba590-03e6-448b-a8d0-d61dbe971a6f\") " Mar 14 09:16:37 crc kubenswrapper[5129]: I0314 09:16:37.917953 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93cba590-03e6-448b-a8d0-d61dbe971a6f-tripleo-cleanup-combined-ca-bundle\") pod \"93cba590-03e6-448b-a8d0-d61dbe971a6f\" (UID: \"93cba590-03e6-448b-a8d0-d61dbe971a6f\") " Mar 14 09:16:37 crc kubenswrapper[5129]: I0314 09:16:37.918022 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/93cba590-03e6-448b-a8d0-d61dbe971a6f-ssh-key-openstack-networker\") pod \"93cba590-03e6-448b-a8d0-d61dbe971a6f\" (UID: \"93cba590-03e6-448b-a8d0-d61dbe971a6f\") " Mar 14 09:16:37 crc kubenswrapper[5129]: I0314 09:16:37.918251 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93cba590-03e6-448b-a8d0-d61dbe971a6f-inventory\") pod \"93cba590-03e6-448b-a8d0-d61dbe971a6f\" (UID: \"93cba590-03e6-448b-a8d0-d61dbe971a6f\") " Mar 14 09:16:37 crc kubenswrapper[5129]: I0314 09:16:37.927127 5129 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93cba590-03e6-448b-a8d0-d61dbe971a6f-kube-api-access-jgzrv" (OuterVolumeSpecName: "kube-api-access-jgzrv") pod "93cba590-03e6-448b-a8d0-d61dbe971a6f" (UID: "93cba590-03e6-448b-a8d0-d61dbe971a6f"). InnerVolumeSpecName "kube-api-access-jgzrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:16:37 crc kubenswrapper[5129]: I0314 09:16:37.927679 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93cba590-03e6-448b-a8d0-d61dbe971a6f-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "93cba590-03e6-448b-a8d0-d61dbe971a6f" (UID: "93cba590-03e6-448b-a8d0-d61dbe971a6f"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:16:37 crc kubenswrapper[5129]: I0314 09:16:37.955579 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93cba590-03e6-448b-a8d0-d61dbe971a6f-inventory" (OuterVolumeSpecName: "inventory") pod "93cba590-03e6-448b-a8d0-d61dbe971a6f" (UID: "93cba590-03e6-448b-a8d0-d61dbe971a6f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:16:37 crc kubenswrapper[5129]: I0314 09:16:37.966115 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93cba590-03e6-448b-a8d0-d61dbe971a6f-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "93cba590-03e6-448b-a8d0-d61dbe971a6f" (UID: "93cba590-03e6-448b-a8d0-d61dbe971a6f"). InnerVolumeSpecName "ssh-key-openstack-networker". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:16:38 crc kubenswrapper[5129]: I0314 09:16:38.021349 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93cba590-03e6-448b-a8d0-d61dbe971a6f-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:38 crc kubenswrapper[5129]: I0314 09:16:38.021395 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgzrv\" (UniqueName: \"kubernetes.io/projected/93cba590-03e6-448b-a8d0-d61dbe971a6f-kube-api-access-jgzrv\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:38 crc kubenswrapper[5129]: I0314 09:16:38.021410 5129 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93cba590-03e6-448b-a8d0-d61dbe971a6f-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:38 crc kubenswrapper[5129]: I0314 09:16:38.021426 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/93cba590-03e6-448b-a8d0-d61dbe971a6f-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:38 crc kubenswrapper[5129]: I0314 09:16:38.320299 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q" event={"ID":"93cba590-03e6-448b-a8d0-d61dbe971a6f","Type":"ContainerDied","Data":"3ff450860f2eaab624e501db2f7a6286a64af35e8682211b74afab572c0be37e"} Mar 14 09:16:38 crc kubenswrapper[5129]: I0314 09:16:38.320350 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ff450860f2eaab624e501db2f7a6286a64af35e8682211b74afab572c0be37e" Mar 14 09:16:38 crc kubenswrapper[5129]: I0314 09:16:38.320384 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q" Mar 14 09:16:49 crc kubenswrapper[5129]: I0314 09:16:49.574627 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:16:49 crc kubenswrapper[5129]: I0314 09:16:49.575588 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:16:52 crc kubenswrapper[5129]: I0314 09:16:52.058797 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-5tt5v"] Mar 14 09:16:52 crc kubenswrapper[5129]: I0314 09:16:52.068261 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-ca6a-account-create-update-9w5n4"] Mar 14 09:16:52 crc kubenswrapper[5129]: I0314 09:16:52.077377 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-ca6a-account-create-update-9w5n4"] Mar 14 09:16:52 crc kubenswrapper[5129]: I0314 09:16:52.087343 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-5tt5v"] Mar 14 09:16:54 crc kubenswrapper[5129]: I0314 09:16:54.057886 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e78b5da-50f2-4439-946c-a2abca564dd4" path="/var/lib/kubelet/pods/4e78b5da-50f2-4439-946c-a2abca564dd4/volumes" Mar 14 09:16:54 crc kubenswrapper[5129]: I0314 09:16:54.059286 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd776ac-2907-401a-a4b6-2f89df804f66" 
path="/var/lib/kubelet/pods/dcd776ac-2907-401a-a4b6-2f89df804f66/volumes" Mar 14 09:17:07 crc kubenswrapper[5129]: I0314 09:17:07.049444 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-qprmq"] Mar 14 09:17:07 crc kubenswrapper[5129]: I0314 09:17:07.058348 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-qprmq"] Mar 14 09:17:08 crc kubenswrapper[5129]: I0314 09:17:08.052792 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db9df730-0f02-4892-b3ed-2273cb131298" path="/var/lib/kubelet/pods/db9df730-0f02-4892-b3ed-2273cb131298/volumes" Mar 14 09:17:19 crc kubenswrapper[5129]: I0314 09:17:19.574149 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:17:19 crc kubenswrapper[5129]: I0314 09:17:19.574858 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:17:34 crc kubenswrapper[5129]: I0314 09:17:34.440731 5129 scope.go:117] "RemoveContainer" containerID="c525e0d94418798d9243a949c6127a27e15581ee18c2628c0270ff8236875e14" Mar 14 09:17:34 crc kubenswrapper[5129]: I0314 09:17:34.496335 5129 scope.go:117] "RemoveContainer" containerID="650d3b9af905853e36fcbab021532e7cb5ecef275faf34050170fc42e4af8787" Mar 14 09:17:34 crc kubenswrapper[5129]: I0314 09:17:34.552870 5129 scope.go:117] "RemoveContainer" containerID="72b068afd296394873ca0de2278f5daa53f60c53914a04d88f197d77078254c7" Mar 14 09:17:49 crc kubenswrapper[5129]: I0314 09:17:49.574863 5129 
patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:17:49 crc kubenswrapper[5129]: I0314 09:17:49.576586 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:17:49 crc kubenswrapper[5129]: I0314 09:17:49.576742 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 09:17:49 crc kubenswrapper[5129]: I0314 09:17:49.577660 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"94aa21ce282c2cbabcf60006572ac233594c85a69c6212b612a74372c03ca9bb"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:17:49 crc kubenswrapper[5129]: I0314 09:17:49.578276 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://94aa21ce282c2cbabcf60006572ac233594c85a69c6212b612a74372c03ca9bb" gracePeriod=600 Mar 14 09:17:50 crc kubenswrapper[5129]: I0314 09:17:50.118057 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"94aa21ce282c2cbabcf60006572ac233594c85a69c6212b612a74372c03ca9bb"} Mar 14 09:17:50 crc kubenswrapper[5129]: I0314 09:17:50.118785 5129 scope.go:117] "RemoveContainer" containerID="2e925fcb22da3809042544210c303c66024cb37e651cfd9519002ad77e827d9e" Mar 14 09:17:50 crc kubenswrapper[5129]: I0314 09:17:50.117434 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="94aa21ce282c2cbabcf60006572ac233594c85a69c6212b612a74372c03ca9bb" exitCode=0 Mar 14 09:17:50 crc kubenswrapper[5129]: I0314 09:17:50.118960 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38"} Mar 14 09:18:00 crc kubenswrapper[5129]: I0314 09:18:00.165532 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557998-748fl"] Mar 14 09:18:00 crc kubenswrapper[5129]: E0314 09:18:00.166856 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e8f005-e776-479c-8946-e516b13555c7" containerName="oc" Mar 14 09:18:00 crc kubenswrapper[5129]: I0314 09:18:00.166878 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e8f005-e776-479c-8946-e516b13555c7" containerName="oc" Mar 14 09:18:00 crc kubenswrapper[5129]: E0314 09:18:00.166925 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93cba590-03e6-448b-a8d0-d61dbe971a6f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker" Mar 14 09:18:00 crc kubenswrapper[5129]: I0314 09:18:00.166938 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="93cba590-03e6-448b-a8d0-d61dbe971a6f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker" Mar 14 09:18:00 crc kubenswrapper[5129]: I0314 09:18:00.167230 
5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1e8f005-e776-479c-8946-e516b13555c7" containerName="oc" Mar 14 09:18:00 crc kubenswrapper[5129]: I0314 09:18:00.167272 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="93cba590-03e6-448b-a8d0-d61dbe971a6f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker" Mar 14 09:18:00 crc kubenswrapper[5129]: I0314 09:18:00.168404 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557998-748fl" Mar 14 09:18:00 crc kubenswrapper[5129]: I0314 09:18:00.170526 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:18:00 crc kubenswrapper[5129]: I0314 09:18:00.171273 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:18:00 crc kubenswrapper[5129]: I0314 09:18:00.172212 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:18:00 crc kubenswrapper[5129]: I0314 09:18:00.185310 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557998-748fl"] Mar 14 09:18:00 crc kubenswrapper[5129]: I0314 09:18:00.255874 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7jqq\" (UniqueName: \"kubernetes.io/projected/73053e95-f728-45d8-9cc5-74195815d48d-kube-api-access-t7jqq\") pod \"auto-csr-approver-29557998-748fl\" (UID: \"73053e95-f728-45d8-9cc5-74195815d48d\") " pod="openshift-infra/auto-csr-approver-29557998-748fl" Mar 14 09:18:00 crc kubenswrapper[5129]: I0314 09:18:00.357262 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7jqq\" (UniqueName: \"kubernetes.io/projected/73053e95-f728-45d8-9cc5-74195815d48d-kube-api-access-t7jqq\") pod 
\"auto-csr-approver-29557998-748fl\" (UID: \"73053e95-f728-45d8-9cc5-74195815d48d\") " pod="openshift-infra/auto-csr-approver-29557998-748fl" Mar 14 09:18:00 crc kubenswrapper[5129]: I0314 09:18:00.382846 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7jqq\" (UniqueName: \"kubernetes.io/projected/73053e95-f728-45d8-9cc5-74195815d48d-kube-api-access-t7jqq\") pod \"auto-csr-approver-29557998-748fl\" (UID: \"73053e95-f728-45d8-9cc5-74195815d48d\") " pod="openshift-infra/auto-csr-approver-29557998-748fl" Mar 14 09:18:00 crc kubenswrapper[5129]: I0314 09:18:00.502865 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557998-748fl" Mar 14 09:18:00 crc kubenswrapper[5129]: I0314 09:18:00.995879 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557998-748fl"] Mar 14 09:18:01 crc kubenswrapper[5129]: I0314 09:18:01.002993 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:18:01 crc kubenswrapper[5129]: I0314 09:18:01.228850 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557998-748fl" event={"ID":"73053e95-f728-45d8-9cc5-74195815d48d","Type":"ContainerStarted","Data":"5e73f00e0d546153f076b79d6147ce379d9acf553422d6cfc6d3a9718f8c1268"} Mar 14 09:18:02 crc kubenswrapper[5129]: I0314 09:18:02.239192 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557998-748fl" event={"ID":"73053e95-f728-45d8-9cc5-74195815d48d","Type":"ContainerStarted","Data":"07aa01ffc2a4a7a3465158906b6c3cad8e479ba5df9bd1bfdbd449986fdac777"} Mar 14 09:18:02 crc kubenswrapper[5129]: I0314 09:18:02.264050 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557998-748fl" podStartSLOduration=1.521180279 podStartE2EDuration="2.264026016s" 
podCreationTimestamp="2026-03-14 09:18:00 +0000 UTC" firstStartedPulling="2026-03-14 09:18:01.002747621 +0000 UTC m=+8343.754662805" lastFinishedPulling="2026-03-14 09:18:01.745593338 +0000 UTC m=+8344.497508542" observedRunningTime="2026-03-14 09:18:02.256312928 +0000 UTC m=+8345.008228132" watchObservedRunningTime="2026-03-14 09:18:02.264026016 +0000 UTC m=+8345.015941200" Mar 14 09:18:03 crc kubenswrapper[5129]: I0314 09:18:03.251197 5129 generic.go:334] "Generic (PLEG): container finished" podID="73053e95-f728-45d8-9cc5-74195815d48d" containerID="07aa01ffc2a4a7a3465158906b6c3cad8e479ba5df9bd1bfdbd449986fdac777" exitCode=0 Mar 14 09:18:03 crc kubenswrapper[5129]: I0314 09:18:03.251279 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557998-748fl" event={"ID":"73053e95-f728-45d8-9cc5-74195815d48d","Type":"ContainerDied","Data":"07aa01ffc2a4a7a3465158906b6c3cad8e479ba5df9bd1bfdbd449986fdac777"} Mar 14 09:18:04 crc kubenswrapper[5129]: I0314 09:18:04.629697 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557998-748fl" Mar 14 09:18:04 crc kubenswrapper[5129]: I0314 09:18:04.752297 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7jqq\" (UniqueName: \"kubernetes.io/projected/73053e95-f728-45d8-9cc5-74195815d48d-kube-api-access-t7jqq\") pod \"73053e95-f728-45d8-9cc5-74195815d48d\" (UID: \"73053e95-f728-45d8-9cc5-74195815d48d\") " Mar 14 09:18:04 crc kubenswrapper[5129]: I0314 09:18:04.758630 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73053e95-f728-45d8-9cc5-74195815d48d-kube-api-access-t7jqq" (OuterVolumeSpecName: "kube-api-access-t7jqq") pod "73053e95-f728-45d8-9cc5-74195815d48d" (UID: "73053e95-f728-45d8-9cc5-74195815d48d"). InnerVolumeSpecName "kube-api-access-t7jqq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:18:04 crc kubenswrapper[5129]: I0314 09:18:04.854887 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7jqq\" (UniqueName: \"kubernetes.io/projected/73053e95-f728-45d8-9cc5-74195815d48d-kube-api-access-t7jqq\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:05 crc kubenswrapper[5129]: I0314 09:18:05.276828 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557998-748fl" event={"ID":"73053e95-f728-45d8-9cc5-74195815d48d","Type":"ContainerDied","Data":"5e73f00e0d546153f076b79d6147ce379d9acf553422d6cfc6d3a9718f8c1268"} Mar 14 09:18:05 crc kubenswrapper[5129]: I0314 09:18:05.276871 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e73f00e0d546153f076b79d6147ce379d9acf553422d6cfc6d3a9718f8c1268" Mar 14 09:18:05 crc kubenswrapper[5129]: I0314 09:18:05.276949 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557998-748fl" Mar 14 09:18:05 crc kubenswrapper[5129]: I0314 09:18:05.333942 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557992-k6wtl"] Mar 14 09:18:05 crc kubenswrapper[5129]: I0314 09:18:05.342203 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557992-k6wtl"] Mar 14 09:18:06 crc kubenswrapper[5129]: I0314 09:18:06.052257 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="845c1e77-2312-4b8b-90af-042e6863f825" path="/var/lib/kubelet/pods/845c1e77-2312-4b8b-90af-042e6863f825/volumes" Mar 14 09:18:34 crc kubenswrapper[5129]: I0314 09:18:34.656296 5129 scope.go:117] "RemoveContainer" containerID="b18276702ee21ce85fc7cb4290368c316a6446ce845adfcb26f2310fbac407e9" Mar 14 09:19:49 crc kubenswrapper[5129]: I0314 09:19:49.060166 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/aodh-a95b-account-create-update-kdg2n"] Mar 14 09:19:49 crc kubenswrapper[5129]: I0314 09:19:49.074086 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-72j5h"] Mar 14 09:19:49 crc kubenswrapper[5129]: I0314 09:19:49.085290 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-72j5h"] Mar 14 09:19:49 crc kubenswrapper[5129]: I0314 09:19:49.094631 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-a95b-account-create-update-kdg2n"] Mar 14 09:19:49 crc kubenswrapper[5129]: I0314 09:19:49.576906 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:19:49 crc kubenswrapper[5129]: I0314 09:19:49.577350 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:19:50 crc kubenswrapper[5129]: I0314 09:19:50.050945 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb1b092b-8bbc-4ed2-8952-7c5a84167130" path="/var/lib/kubelet/pods/cb1b092b-8bbc-4ed2-8952-7c5a84167130/volumes" Mar 14 09:19:50 crc kubenswrapper[5129]: I0314 09:19:50.051547 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d62447c8-ed83-43dd-9b97-a2ca49dde6ff" path="/var/lib/kubelet/pods/d62447c8-ed83-43dd-9b97-a2ca49dde6ff/volumes" Mar 14 09:20:00 crc kubenswrapper[5129]: I0314 09:20:00.148829 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558000-l224c"] Mar 14 09:20:00 crc 
kubenswrapper[5129]: E0314 09:20:00.149949 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73053e95-f728-45d8-9cc5-74195815d48d" containerName="oc" Mar 14 09:20:00 crc kubenswrapper[5129]: I0314 09:20:00.149970 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="73053e95-f728-45d8-9cc5-74195815d48d" containerName="oc" Mar 14 09:20:00 crc kubenswrapper[5129]: I0314 09:20:00.150232 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="73053e95-f728-45d8-9cc5-74195815d48d" containerName="oc" Mar 14 09:20:00 crc kubenswrapper[5129]: I0314 09:20:00.151189 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558000-l224c" Mar 14 09:20:00 crc kubenswrapper[5129]: I0314 09:20:00.153961 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:20:00 crc kubenswrapper[5129]: I0314 09:20:00.154709 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:20:00 crc kubenswrapper[5129]: I0314 09:20:00.154879 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:20:00 crc kubenswrapper[5129]: I0314 09:20:00.167690 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558000-l224c"] Mar 14 09:20:00 crc kubenswrapper[5129]: I0314 09:20:00.287226 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsxqp\" (UniqueName: \"kubernetes.io/projected/8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e-kube-api-access-bsxqp\") pod \"auto-csr-approver-29558000-l224c\" (UID: \"8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e\") " pod="openshift-infra/auto-csr-approver-29558000-l224c" Mar 14 09:20:00 crc kubenswrapper[5129]: I0314 09:20:00.389718 5129 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-bsxqp\" (UniqueName: \"kubernetes.io/projected/8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e-kube-api-access-bsxqp\") pod \"auto-csr-approver-29558000-l224c\" (UID: \"8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e\") " pod="openshift-infra/auto-csr-approver-29558000-l224c" Mar 14 09:20:00 crc kubenswrapper[5129]: I0314 09:20:00.412015 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsxqp\" (UniqueName: \"kubernetes.io/projected/8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e-kube-api-access-bsxqp\") pod \"auto-csr-approver-29558000-l224c\" (UID: \"8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e\") " pod="openshift-infra/auto-csr-approver-29558000-l224c" Mar 14 09:20:00 crc kubenswrapper[5129]: I0314 09:20:00.477847 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558000-l224c" Mar 14 09:20:00 crc kubenswrapper[5129]: I0314 09:20:00.940629 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558000-l224c"] Mar 14 09:20:01 crc kubenswrapper[5129]: I0314 09:20:01.087898 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-5kphx"] Mar 14 09:20:01 crc kubenswrapper[5129]: I0314 09:20:01.106547 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-5kphx"] Mar 14 09:20:01 crc kubenswrapper[5129]: I0314 09:20:01.551160 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558000-l224c" event={"ID":"8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e","Type":"ContainerStarted","Data":"714fab2861e4311b2c6bc33204c6035bc22c5b5e9b2ef87367a1a3014b123f5e"} Mar 14 09:20:02 crc kubenswrapper[5129]: I0314 09:20:02.055247 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d03179c2-ff34-4407-9ebd-89f120c07123" path="/var/lib/kubelet/pods/d03179c2-ff34-4407-9ebd-89f120c07123/volumes" Mar 14 09:20:02 crc kubenswrapper[5129]: I0314 
09:20:02.564932 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558000-l224c" event={"ID":"8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e","Type":"ContainerStarted","Data":"297c0cca3206d165cfc2322105d61b6b22834585b20c5f8d7f098f4db1e4ea42"} Mar 14 09:20:03 crc kubenswrapper[5129]: I0314 09:20:03.576714 5129 generic.go:334] "Generic (PLEG): container finished" podID="8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e" containerID="297c0cca3206d165cfc2322105d61b6b22834585b20c5f8d7f098f4db1e4ea42" exitCode=0 Mar 14 09:20:03 crc kubenswrapper[5129]: I0314 09:20:03.576914 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558000-l224c" event={"ID":"8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e","Type":"ContainerDied","Data":"297c0cca3206d165cfc2322105d61b6b22834585b20c5f8d7f098f4db1e4ea42"} Mar 14 09:20:03 crc kubenswrapper[5129]: I0314 09:20:03.964528 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558000-l224c" Mar 14 09:20:04 crc kubenswrapper[5129]: I0314 09:20:04.074544 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsxqp\" (UniqueName: \"kubernetes.io/projected/8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e-kube-api-access-bsxqp\") pod \"8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e\" (UID: \"8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e\") " Mar 14 09:20:04 crc kubenswrapper[5129]: I0314 09:20:04.085119 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e-kube-api-access-bsxqp" (OuterVolumeSpecName: "kube-api-access-bsxqp") pod "8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e" (UID: "8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e"). InnerVolumeSpecName "kube-api-access-bsxqp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:20:04 crc kubenswrapper[5129]: I0314 09:20:04.179757 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsxqp\" (UniqueName: \"kubernetes.io/projected/8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e-kube-api-access-bsxqp\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:04 crc kubenswrapper[5129]: I0314 09:20:04.593409 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558000-l224c" event={"ID":"8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e","Type":"ContainerDied","Data":"714fab2861e4311b2c6bc33204c6035bc22c5b5e9b2ef87367a1a3014b123f5e"} Mar 14 09:20:04 crc kubenswrapper[5129]: I0314 09:20:04.594471 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="714fab2861e4311b2c6bc33204c6035bc22c5b5e9b2ef87367a1a3014b123f5e" Mar 14 09:20:04 crc kubenswrapper[5129]: I0314 09:20:04.593459 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558000-l224c" Mar 14 09:20:05 crc kubenswrapper[5129]: I0314 09:20:05.051534 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557994-4lts8"] Mar 14 09:20:05 crc kubenswrapper[5129]: I0314 09:20:05.062561 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557994-4lts8"] Mar 14 09:20:06 crc kubenswrapper[5129]: I0314 09:20:06.050526 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2019223e-c2a9-4105-aa7e-103a46334d67" path="/var/lib/kubelet/pods/2019223e-c2a9-4105-aa7e-103a46334d67/volumes" Mar 14 09:20:19 crc kubenswrapper[5129]: I0314 09:20:19.574546 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 14 09:20:19 crc kubenswrapper[5129]: I0314 09:20:19.575314 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:20:34 crc kubenswrapper[5129]: I0314 09:20:34.781782 5129 scope.go:117] "RemoveContainer" containerID="abaa35f0d625fb25e18b7e8b71c187ccca30803e213f9203fa92f8900b9313d7" Mar 14 09:20:34 crc kubenswrapper[5129]: I0314 09:20:34.811143 5129 scope.go:117] "RemoveContainer" containerID="377d4cb9f048e4c344f41759c1679015c994f008dd29f7de9a5036586413ec47" Mar 14 09:20:34 crc kubenswrapper[5129]: I0314 09:20:34.858569 5129 scope.go:117] "RemoveContainer" containerID="ea696cfcd4dc020003e9f093b3b0d58d1bd2e07121923fc921b4acbc155766fe" Mar 14 09:20:34 crc kubenswrapper[5129]: I0314 09:20:34.926340 5129 scope.go:117] "RemoveContainer" containerID="c7ecd68941f37b693969af2ca495592910ebef03c9fefd3522844aec3e3d7816" Mar 14 09:20:41 crc kubenswrapper[5129]: I0314 09:20:41.980241 5129 generic.go:334] "Generic (PLEG): container finished" podID="78038caa-d464-4b63-a304-057a4393be51" containerID="649146b588b7dff41f016204c26bdc4ed1ce72acd9c650f86abd28cd761f77dd" exitCode=0 Mar 14 09:20:41 crc kubenswrapper[5129]: I0314 09:20:41.980898 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch" event={"ID":"78038caa-d464-4b63-a304-057a4393be51","Type":"ContainerDied","Data":"649146b588b7dff41f016204c26bdc4ed1ce72acd9c650f86abd28cd761f77dd"} Mar 14 09:20:43 crc kubenswrapper[5129]: I0314 09:20:43.473565 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch" Mar 14 09:20:43 crc kubenswrapper[5129]: I0314 09:20:43.613205 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78038caa-d464-4b63-a304-057a4393be51-inventory\") pod \"78038caa-d464-4b63-a304-057a4393be51\" (UID: \"78038caa-d464-4b63-a304-057a4393be51\") " Mar 14 09:20:43 crc kubenswrapper[5129]: I0314 09:20:43.613305 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78038caa-d464-4b63-a304-057a4393be51-tripleo-cleanup-combined-ca-bundle\") pod \"78038caa-d464-4b63-a304-057a4393be51\" (UID: \"78038caa-d464-4b63-a304-057a4393be51\") " Mar 14 09:20:43 crc kubenswrapper[5129]: I0314 09:20:43.613364 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w85fl\" (UniqueName: \"kubernetes.io/projected/78038caa-d464-4b63-a304-057a4393be51-kube-api-access-w85fl\") pod \"78038caa-d464-4b63-a304-057a4393be51\" (UID: \"78038caa-d464-4b63-a304-057a4393be51\") " Mar 14 09:20:43 crc kubenswrapper[5129]: I0314 09:20:43.613421 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/78038caa-d464-4b63-a304-057a4393be51-ssh-key-openstack-cell1\") pod \"78038caa-d464-4b63-a304-057a4393be51\" (UID: \"78038caa-d464-4b63-a304-057a4393be51\") " Mar 14 09:20:43 crc kubenswrapper[5129]: I0314 09:20:43.620058 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78038caa-d464-4b63-a304-057a4393be51-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "78038caa-d464-4b63-a304-057a4393be51" (UID: "78038caa-d464-4b63-a304-057a4393be51"). 
InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:20:43 crc kubenswrapper[5129]: I0314 09:20:43.622013 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78038caa-d464-4b63-a304-057a4393be51-kube-api-access-w85fl" (OuterVolumeSpecName: "kube-api-access-w85fl") pod "78038caa-d464-4b63-a304-057a4393be51" (UID: "78038caa-d464-4b63-a304-057a4393be51"). InnerVolumeSpecName "kube-api-access-w85fl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:20:43 crc kubenswrapper[5129]: I0314 09:20:43.645781 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78038caa-d464-4b63-a304-057a4393be51-inventory" (OuterVolumeSpecName: "inventory") pod "78038caa-d464-4b63-a304-057a4393be51" (UID: "78038caa-d464-4b63-a304-057a4393be51"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:20:43 crc kubenswrapper[5129]: I0314 09:20:43.653843 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78038caa-d464-4b63-a304-057a4393be51-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "78038caa-d464-4b63-a304-057a4393be51" (UID: "78038caa-d464-4b63-a304-057a4393be51"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:20:43 crc kubenswrapper[5129]: I0314 09:20:43.716563 5129 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78038caa-d464-4b63-a304-057a4393be51-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:43 crc kubenswrapper[5129]: I0314 09:20:43.716740 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w85fl\" (UniqueName: \"kubernetes.io/projected/78038caa-d464-4b63-a304-057a4393be51-kube-api-access-w85fl\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:43 crc kubenswrapper[5129]: I0314 09:20:43.716756 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/78038caa-d464-4b63-a304-057a4393be51-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:43 crc kubenswrapper[5129]: I0314 09:20:43.716774 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78038caa-d464-4b63-a304-057a4393be51-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:44 crc kubenswrapper[5129]: I0314 09:20:44.008321 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch" event={"ID":"78038caa-d464-4b63-a304-057a4393be51","Type":"ContainerDied","Data":"767b01c54b582fd38b9f76edd7e57853ad2d361d3073dc4a3847ed07f96e1287"} Mar 14 09:20:44 crc kubenswrapper[5129]: I0314 09:20:44.008385 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="767b01c54b582fd38b9f76edd7e57853ad2d361d3073dc4a3847ed07f96e1287" Mar 14 09:20:44 crc kubenswrapper[5129]: I0314 09:20:44.008388 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch" Mar 14 09:20:49 crc kubenswrapper[5129]: I0314 09:20:49.574962 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:20:49 crc kubenswrapper[5129]: I0314 09:20:49.576102 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:20:49 crc kubenswrapper[5129]: I0314 09:20:49.576188 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 09:20:49 crc kubenswrapper[5129]: I0314 09:20:49.577665 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:20:49 crc kubenswrapper[5129]: I0314 09:20:49.577764 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38" gracePeriod=600 Mar 14 09:20:49 crc kubenswrapper[5129]: E0314 09:20:49.703869 5129 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:20:50 crc kubenswrapper[5129]: I0314 09:20:50.080409 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38" exitCode=0 Mar 14 09:20:50 crc kubenswrapper[5129]: I0314 09:20:50.080535 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38"} Mar 14 09:20:50 crc kubenswrapper[5129]: I0314 09:20:50.081661 5129 scope.go:117] "RemoveContainer" containerID="94aa21ce282c2cbabcf60006572ac233594c85a69c6212b612a74372c03ca9bb" Mar 14 09:20:50 crc kubenswrapper[5129]: I0314 09:20:50.082856 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38" Mar 14 09:20:50 crc kubenswrapper[5129]: E0314 09:20:50.083239 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:21:02 crc kubenswrapper[5129]: I0314 09:21:02.037397 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38" Mar 
14 09:21:02 crc kubenswrapper[5129]: E0314 09:21:02.038321 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:21:10 crc kubenswrapper[5129]: I0314 09:21:10.779748 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kksh9"] Mar 14 09:21:10 crc kubenswrapper[5129]: E0314 09:21:10.781514 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78038caa-d464-4b63-a304-057a4393be51" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 14 09:21:10 crc kubenswrapper[5129]: I0314 09:21:10.781535 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="78038caa-d464-4b63-a304-057a4393be51" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 14 09:21:10 crc kubenswrapper[5129]: E0314 09:21:10.781591 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e" containerName="oc" Mar 14 09:21:10 crc kubenswrapper[5129]: I0314 09:21:10.781623 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e" containerName="oc" Mar 14 09:21:10 crc kubenswrapper[5129]: I0314 09:21:10.782127 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="78038caa-d464-4b63-a304-057a4393be51" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 14 09:21:10 crc kubenswrapper[5129]: I0314 09:21:10.782155 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e" containerName="oc" Mar 14 09:21:10 crc kubenswrapper[5129]: I0314 09:21:10.785635 5129 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kksh9" Mar 14 09:21:10 crc kubenswrapper[5129]: I0314 09:21:10.792767 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kksh9"] Mar 14 09:21:10 crc kubenswrapper[5129]: I0314 09:21:10.828081 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7grc8\" (UniqueName: \"kubernetes.io/projected/56987d09-fdb3-4de9-b8df-b75efc6c6fb2-kube-api-access-7grc8\") pod \"community-operators-kksh9\" (UID: \"56987d09-fdb3-4de9-b8df-b75efc6c6fb2\") " pod="openshift-marketplace/community-operators-kksh9" Mar 14 09:21:10 crc kubenswrapper[5129]: I0314 09:21:10.828489 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56987d09-fdb3-4de9-b8df-b75efc6c6fb2-catalog-content\") pod \"community-operators-kksh9\" (UID: \"56987d09-fdb3-4de9-b8df-b75efc6c6fb2\") " pod="openshift-marketplace/community-operators-kksh9" Mar 14 09:21:10 crc kubenswrapper[5129]: I0314 09:21:10.829364 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56987d09-fdb3-4de9-b8df-b75efc6c6fb2-utilities\") pod \"community-operators-kksh9\" (UID: \"56987d09-fdb3-4de9-b8df-b75efc6c6fb2\") " pod="openshift-marketplace/community-operators-kksh9" Mar 14 09:21:10 crc kubenswrapper[5129]: I0314 09:21:10.931538 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56987d09-fdb3-4de9-b8df-b75efc6c6fb2-catalog-content\") pod \"community-operators-kksh9\" (UID: \"56987d09-fdb3-4de9-b8df-b75efc6c6fb2\") " pod="openshift-marketplace/community-operators-kksh9" Mar 14 09:21:10 crc kubenswrapper[5129]: I0314 09:21:10.932166 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56987d09-fdb3-4de9-b8df-b75efc6c6fb2-utilities\") pod \"community-operators-kksh9\" (UID: \"56987d09-fdb3-4de9-b8df-b75efc6c6fb2\") " pod="openshift-marketplace/community-operators-kksh9" Mar 14 09:21:10 crc kubenswrapper[5129]: I0314 09:21:10.932283 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56987d09-fdb3-4de9-b8df-b75efc6c6fb2-catalog-content\") pod \"community-operators-kksh9\" (UID: \"56987d09-fdb3-4de9-b8df-b75efc6c6fb2\") " pod="openshift-marketplace/community-operators-kksh9" Mar 14 09:21:10 crc kubenswrapper[5129]: I0314 09:21:10.932336 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7grc8\" (UniqueName: \"kubernetes.io/projected/56987d09-fdb3-4de9-b8df-b75efc6c6fb2-kube-api-access-7grc8\") pod \"community-operators-kksh9\" (UID: \"56987d09-fdb3-4de9-b8df-b75efc6c6fb2\") " pod="openshift-marketplace/community-operators-kksh9" Mar 14 09:21:10 crc kubenswrapper[5129]: I0314 09:21:10.932762 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56987d09-fdb3-4de9-b8df-b75efc6c6fb2-utilities\") pod \"community-operators-kksh9\" (UID: \"56987d09-fdb3-4de9-b8df-b75efc6c6fb2\") " pod="openshift-marketplace/community-operators-kksh9" Mar 14 09:21:10 crc kubenswrapper[5129]: I0314 09:21:10.972683 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7grc8\" (UniqueName: \"kubernetes.io/projected/56987d09-fdb3-4de9-b8df-b75efc6c6fb2-kube-api-access-7grc8\") pod \"community-operators-kksh9\" (UID: \"56987d09-fdb3-4de9-b8df-b75efc6c6fb2\") " pod="openshift-marketplace/community-operators-kksh9" Mar 14 09:21:11 crc kubenswrapper[5129]: I0314 09:21:11.128781 5129 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-kksh9" Mar 14 09:21:11 crc kubenswrapper[5129]: I0314 09:21:11.736820 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kksh9"] Mar 14 09:21:12 crc kubenswrapper[5129]: I0314 09:21:12.364092 5129 generic.go:334] "Generic (PLEG): container finished" podID="56987d09-fdb3-4de9-b8df-b75efc6c6fb2" containerID="05c3fed835b7fdcce905ee93aef2f54d4e6240c27fe9e17cb898bae2355601d6" exitCode=0 Mar 14 09:21:12 crc kubenswrapper[5129]: I0314 09:21:12.364696 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kksh9" event={"ID":"56987d09-fdb3-4de9-b8df-b75efc6c6fb2","Type":"ContainerDied","Data":"05c3fed835b7fdcce905ee93aef2f54d4e6240c27fe9e17cb898bae2355601d6"} Mar 14 09:21:12 crc kubenswrapper[5129]: I0314 09:21:12.364743 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kksh9" event={"ID":"56987d09-fdb3-4de9-b8df-b75efc6c6fb2","Type":"ContainerStarted","Data":"d7f694fe8bd1630fb4639ee9a632c180ad692d00327da483e43c8279729d10f5"} Mar 14 09:21:14 crc kubenswrapper[5129]: I0314 09:21:14.398098 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kksh9" event={"ID":"56987d09-fdb3-4de9-b8df-b75efc6c6fb2","Type":"ContainerStarted","Data":"6d4d922be580b03872a80e15c09e3353d75032ac37a504a4ea93075444ce9a80"} Mar 14 09:21:15 crc kubenswrapper[5129]: I0314 09:21:15.036891 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38" Mar 14 09:21:15 crc kubenswrapper[5129]: E0314 09:21:15.037385 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:21:15 crc kubenswrapper[5129]: I0314 09:21:15.401781 5129 generic.go:334] "Generic (PLEG): container finished" podID="56987d09-fdb3-4de9-b8df-b75efc6c6fb2" containerID="6d4d922be580b03872a80e15c09e3353d75032ac37a504a4ea93075444ce9a80" exitCode=0 Mar 14 09:21:15 crc kubenswrapper[5129]: I0314 09:21:15.401831 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kksh9" event={"ID":"56987d09-fdb3-4de9-b8df-b75efc6c6fb2","Type":"ContainerDied","Data":"6d4d922be580b03872a80e15c09e3353d75032ac37a504a4ea93075444ce9a80"} Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.260299 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-m6vlz"] Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.262261 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.265803 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.269789 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.269820 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.270836 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.275680 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-6dnqp"] Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.277682 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.280419 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.283429 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-q642x" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.290371 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-m6vlz"] Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.315312 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-6dnqp"] Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.346745 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-m6vlz\" (UID: \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\") " pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.346918 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-m6vlz\" (UID: \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\") " pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.346957 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-inventory\") pod 
\"bootstrap-openstack-openstack-cell1-m6vlz\" (UID: \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\") " pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.347060 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbqh9\" (UniqueName: \"kubernetes.io/projected/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-kube-api-access-mbqh9\") pod \"bootstrap-openstack-openstack-cell1-m6vlz\" (UID: \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\") " pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.415263 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kksh9" event={"ID":"56987d09-fdb3-4de9-b8df-b75efc6c6fb2","Type":"ContainerStarted","Data":"21eac95b0ce62b812cf095e08b5a7ab5b1b9b7074f028bce7c0755dcfc0d4123"} Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.437123 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kksh9" podStartSLOduration=2.972014288 podStartE2EDuration="6.437098556s" podCreationTimestamp="2026-03-14 09:21:10 +0000 UTC" firstStartedPulling="2026-03-14 09:21:12.367752923 +0000 UTC m=+8535.119668107" lastFinishedPulling="2026-03-14 09:21:15.832837191 +0000 UTC m=+8538.584752375" observedRunningTime="2026-03-14 09:21:16.432712787 +0000 UTC m=+8539.184627981" watchObservedRunningTime="2026-03-14 09:21:16.437098556 +0000 UTC m=+8539.189013740" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.449930 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkzkx\" (UniqueName: \"kubernetes.io/projected/5e53276d-c8cb-4fb1-aea6-d436e50e4490-kube-api-access-dkzkx\") pod \"bootstrap-openstack-openstack-networker-6dnqp\" (UID: \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\") " 
pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.450048 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e53276d-c8cb-4fb1-aea6-d436e50e4490-inventory\") pod \"bootstrap-openstack-openstack-networker-6dnqp\" (UID: \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\") " pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.450108 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e53276d-c8cb-4fb1-aea6-d436e50e4490-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-6dnqp\" (UID: \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\") " pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.450171 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-m6vlz\" (UID: \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\") " pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.450208 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-inventory\") pod \"bootstrap-openstack-openstack-cell1-m6vlz\" (UID: \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\") " pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.450291 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbqh9\" (UniqueName: 
\"kubernetes.io/projected/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-kube-api-access-mbqh9\") pod \"bootstrap-openstack-openstack-cell1-m6vlz\" (UID: \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\") " pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.450338 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-m6vlz\" (UID: \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\") " pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.450369 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5e53276d-c8cb-4fb1-aea6-d436e50e4490-ssh-key-openstack-networker\") pod \"bootstrap-openstack-openstack-networker-6dnqp\" (UID: \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\") " pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.457306 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-inventory\") pod \"bootstrap-openstack-openstack-cell1-m6vlz\" (UID: \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\") " pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.457416 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-m6vlz\" (UID: \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\") " pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" Mar 14 09:21:16 crc 
kubenswrapper[5129]: I0314 09:21:16.459426 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-m6vlz\" (UID: \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\") " pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.477282 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbqh9\" (UniqueName: \"kubernetes.io/projected/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-kube-api-access-mbqh9\") pod \"bootstrap-openstack-openstack-cell1-m6vlz\" (UID: \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\") " pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.553296 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5e53276d-c8cb-4fb1-aea6-d436e50e4490-ssh-key-openstack-networker\") pod \"bootstrap-openstack-openstack-networker-6dnqp\" (UID: \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\") " pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.553365 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkzkx\" (UniqueName: \"kubernetes.io/projected/5e53276d-c8cb-4fb1-aea6-d436e50e4490-kube-api-access-dkzkx\") pod \"bootstrap-openstack-openstack-networker-6dnqp\" (UID: \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\") " pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.553409 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e53276d-c8cb-4fb1-aea6-d436e50e4490-inventory\") pod 
\"bootstrap-openstack-openstack-networker-6dnqp\" (UID: \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\") " pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.553451 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e53276d-c8cb-4fb1-aea6-d436e50e4490-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-6dnqp\" (UID: \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\") " pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.557567 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e53276d-c8cb-4fb1-aea6-d436e50e4490-inventory\") pod \"bootstrap-openstack-openstack-networker-6dnqp\" (UID: \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\") " pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.559723 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e53276d-c8cb-4fb1-aea6-d436e50e4490-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-6dnqp\" (UID: \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\") " pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.559834 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5e53276d-c8cb-4fb1-aea6-d436e50e4490-ssh-key-openstack-networker\") pod \"bootstrap-openstack-openstack-networker-6dnqp\" (UID: \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\") " pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.572267 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dkzkx\" (UniqueName: \"kubernetes.io/projected/5e53276d-c8cb-4fb1-aea6-d436e50e4490-kube-api-access-dkzkx\") pod \"bootstrap-openstack-openstack-networker-6dnqp\" (UID: \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\") " pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.580850 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" Mar 14 09:21:16 crc kubenswrapper[5129]: I0314 09:21:16.596906 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" Mar 14 09:21:17 crc kubenswrapper[5129]: I0314 09:21:17.191212 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-m6vlz"] Mar 14 09:21:17 crc kubenswrapper[5129]: I0314 09:21:17.350617 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-6dnqp"] Mar 14 09:21:17 crc kubenswrapper[5129]: W0314 09:21:17.363357 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e53276d_c8cb_4fb1_aea6_d436e50e4490.slice/crio-1c63e62df4d8ab151e01684c03fe79ebbb6677a1c5febc44f052a1a7b3bb84f2 WatchSource:0}: Error finding container 1c63e62df4d8ab151e01684c03fe79ebbb6677a1c5febc44f052a1a7b3bb84f2: Status 404 returned error can't find the container with id 1c63e62df4d8ab151e01684c03fe79ebbb6677a1c5febc44f052a1a7b3bb84f2 Mar 14 09:21:17 crc kubenswrapper[5129]: I0314 09:21:17.430110 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" event={"ID":"5e53276d-c8cb-4fb1-aea6-d436e50e4490","Type":"ContainerStarted","Data":"1c63e62df4d8ab151e01684c03fe79ebbb6677a1c5febc44f052a1a7b3bb84f2"} Mar 14 09:21:17 crc 
kubenswrapper[5129]: I0314 09:21:17.439005 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" event={"ID":"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64","Type":"ContainerStarted","Data":"ab30ed1e4f042bc5bb73d6eaa29d2ed5c3eabeda5d67b879e24ef2c3f164ab44"} Mar 14 09:21:19 crc kubenswrapper[5129]: I0314 09:21:19.683576 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" event={"ID":"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64","Type":"ContainerStarted","Data":"a13d66826c52749337612761c09fb2c3ea8d20f3ed1e0a0e4d4391ecffcd473d"} Mar 14 09:21:19 crc kubenswrapper[5129]: I0314 09:21:19.686117 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" event={"ID":"5e53276d-c8cb-4fb1-aea6-d436e50e4490","Type":"ContainerStarted","Data":"36912edf9951921f214c48e6e4f0d7c24003bd25a6670e94c4942b96f46406ff"} Mar 14 09:21:19 crc kubenswrapper[5129]: I0314 09:21:19.703891 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" podStartSLOduration=2.276235753 podStartE2EDuration="3.703871839s" podCreationTimestamp="2026-03-14 09:21:16 +0000 UTC" firstStartedPulling="2026-03-14 09:21:17.257177338 +0000 UTC m=+8540.009092522" lastFinishedPulling="2026-03-14 09:21:18.684813424 +0000 UTC m=+8541.436728608" observedRunningTime="2026-03-14 09:21:19.700502788 +0000 UTC m=+8542.452417982" watchObservedRunningTime="2026-03-14 09:21:19.703871839 +0000 UTC m=+8542.455787023" Mar 14 09:21:19 crc kubenswrapper[5129]: I0314 09:21:19.721486 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" podStartSLOduration=2.197333102 podStartE2EDuration="3.721468164s" podCreationTimestamp="2026-03-14 09:21:16 +0000 UTC" firstStartedPulling="2026-03-14 09:21:17.370701873 +0000 UTC 
m=+8540.122617057" lastFinishedPulling="2026-03-14 09:21:18.894836945 +0000 UTC m=+8541.646752119" observedRunningTime="2026-03-14 09:21:19.718790881 +0000 UTC m=+8542.470706065" watchObservedRunningTime="2026-03-14 09:21:19.721468164 +0000 UTC m=+8542.473383348" Mar 14 09:21:21 crc kubenswrapper[5129]: I0314 09:21:21.130190 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kksh9" Mar 14 09:21:21 crc kubenswrapper[5129]: I0314 09:21:21.130888 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kksh9" Mar 14 09:21:21 crc kubenswrapper[5129]: I0314 09:21:21.181344 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kksh9" Mar 14 09:21:21 crc kubenswrapper[5129]: I0314 09:21:21.779500 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kksh9" Mar 14 09:21:21 crc kubenswrapper[5129]: I0314 09:21:21.843256 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kksh9"] Mar 14 09:21:23 crc kubenswrapper[5129]: I0314 09:21:23.732962 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kksh9" podUID="56987d09-fdb3-4de9-b8df-b75efc6c6fb2" containerName="registry-server" containerID="cri-o://21eac95b0ce62b812cf095e08b5a7ab5b1b9b7074f028bce7c0755dcfc0d4123" gracePeriod=2 Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.259773 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kksh9" Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.345887 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56987d09-fdb3-4de9-b8df-b75efc6c6fb2-catalog-content\") pod \"56987d09-fdb3-4de9-b8df-b75efc6c6fb2\" (UID: \"56987d09-fdb3-4de9-b8df-b75efc6c6fb2\") " Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.345944 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7grc8\" (UniqueName: \"kubernetes.io/projected/56987d09-fdb3-4de9-b8df-b75efc6c6fb2-kube-api-access-7grc8\") pod \"56987d09-fdb3-4de9-b8df-b75efc6c6fb2\" (UID: \"56987d09-fdb3-4de9-b8df-b75efc6c6fb2\") " Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.345978 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56987d09-fdb3-4de9-b8df-b75efc6c6fb2-utilities\") pod \"56987d09-fdb3-4de9-b8df-b75efc6c6fb2\" (UID: \"56987d09-fdb3-4de9-b8df-b75efc6c6fb2\") " Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.347540 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56987d09-fdb3-4de9-b8df-b75efc6c6fb2-utilities" (OuterVolumeSpecName: "utilities") pod "56987d09-fdb3-4de9-b8df-b75efc6c6fb2" (UID: "56987d09-fdb3-4de9-b8df-b75efc6c6fb2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.355467 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56987d09-fdb3-4de9-b8df-b75efc6c6fb2-kube-api-access-7grc8" (OuterVolumeSpecName: "kube-api-access-7grc8") pod "56987d09-fdb3-4de9-b8df-b75efc6c6fb2" (UID: "56987d09-fdb3-4de9-b8df-b75efc6c6fb2"). InnerVolumeSpecName "kube-api-access-7grc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.448202 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7grc8\" (UniqueName: \"kubernetes.io/projected/56987d09-fdb3-4de9-b8df-b75efc6c6fb2-kube-api-access-7grc8\") on node \"crc\" DevicePath \"\"" Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.448571 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56987d09-fdb3-4de9-b8df-b75efc6c6fb2-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.750691 5129 generic.go:334] "Generic (PLEG): container finished" podID="56987d09-fdb3-4de9-b8df-b75efc6c6fb2" containerID="21eac95b0ce62b812cf095e08b5a7ab5b1b9b7074f028bce7c0755dcfc0d4123" exitCode=0 Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.750758 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kksh9" event={"ID":"56987d09-fdb3-4de9-b8df-b75efc6c6fb2","Type":"ContainerDied","Data":"21eac95b0ce62b812cf095e08b5a7ab5b1b9b7074f028bce7c0755dcfc0d4123"} Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.750831 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kksh9" event={"ID":"56987d09-fdb3-4de9-b8df-b75efc6c6fb2","Type":"ContainerDied","Data":"d7f694fe8bd1630fb4639ee9a632c180ad692d00327da483e43c8279729d10f5"} Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.750838 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kksh9" Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.750863 5129 scope.go:117] "RemoveContainer" containerID="21eac95b0ce62b812cf095e08b5a7ab5b1b9b7074f028bce7c0755dcfc0d4123" Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.778434 5129 scope.go:117] "RemoveContainer" containerID="6d4d922be580b03872a80e15c09e3353d75032ac37a504a4ea93075444ce9a80" Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.801766 5129 scope.go:117] "RemoveContainer" containerID="05c3fed835b7fdcce905ee93aef2f54d4e6240c27fe9e17cb898bae2355601d6" Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.878795 5129 scope.go:117] "RemoveContainer" containerID="21eac95b0ce62b812cf095e08b5a7ab5b1b9b7074f028bce7c0755dcfc0d4123" Mar 14 09:21:24 crc kubenswrapper[5129]: E0314 09:21:24.879414 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21eac95b0ce62b812cf095e08b5a7ab5b1b9b7074f028bce7c0755dcfc0d4123\": container with ID starting with 21eac95b0ce62b812cf095e08b5a7ab5b1b9b7074f028bce7c0755dcfc0d4123 not found: ID does not exist" containerID="21eac95b0ce62b812cf095e08b5a7ab5b1b9b7074f028bce7c0755dcfc0d4123" Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.879496 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21eac95b0ce62b812cf095e08b5a7ab5b1b9b7074f028bce7c0755dcfc0d4123"} err="failed to get container status \"21eac95b0ce62b812cf095e08b5a7ab5b1b9b7074f028bce7c0755dcfc0d4123\": rpc error: code = NotFound desc = could not find container \"21eac95b0ce62b812cf095e08b5a7ab5b1b9b7074f028bce7c0755dcfc0d4123\": container with ID starting with 21eac95b0ce62b812cf095e08b5a7ab5b1b9b7074f028bce7c0755dcfc0d4123 not found: ID does not exist" Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.879557 5129 scope.go:117] "RemoveContainer" 
containerID="6d4d922be580b03872a80e15c09e3353d75032ac37a504a4ea93075444ce9a80" Mar 14 09:21:24 crc kubenswrapper[5129]: E0314 09:21:24.880048 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d4d922be580b03872a80e15c09e3353d75032ac37a504a4ea93075444ce9a80\": container with ID starting with 6d4d922be580b03872a80e15c09e3353d75032ac37a504a4ea93075444ce9a80 not found: ID does not exist" containerID="6d4d922be580b03872a80e15c09e3353d75032ac37a504a4ea93075444ce9a80" Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.880077 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4d922be580b03872a80e15c09e3353d75032ac37a504a4ea93075444ce9a80"} err="failed to get container status \"6d4d922be580b03872a80e15c09e3353d75032ac37a504a4ea93075444ce9a80\": rpc error: code = NotFound desc = could not find container \"6d4d922be580b03872a80e15c09e3353d75032ac37a504a4ea93075444ce9a80\": container with ID starting with 6d4d922be580b03872a80e15c09e3353d75032ac37a504a4ea93075444ce9a80 not found: ID does not exist" Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.880097 5129 scope.go:117] "RemoveContainer" containerID="05c3fed835b7fdcce905ee93aef2f54d4e6240c27fe9e17cb898bae2355601d6" Mar 14 09:21:24 crc kubenswrapper[5129]: E0314 09:21:24.880428 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05c3fed835b7fdcce905ee93aef2f54d4e6240c27fe9e17cb898bae2355601d6\": container with ID starting with 05c3fed835b7fdcce905ee93aef2f54d4e6240c27fe9e17cb898bae2355601d6 not found: ID does not exist" containerID="05c3fed835b7fdcce905ee93aef2f54d4e6240c27fe9e17cb898bae2355601d6" Mar 14 09:21:24 crc kubenswrapper[5129]: I0314 09:21:24.880476 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"05c3fed835b7fdcce905ee93aef2f54d4e6240c27fe9e17cb898bae2355601d6"} err="failed to get container status \"05c3fed835b7fdcce905ee93aef2f54d4e6240c27fe9e17cb898bae2355601d6\": rpc error: code = NotFound desc = could not find container \"05c3fed835b7fdcce905ee93aef2f54d4e6240c27fe9e17cb898bae2355601d6\": container with ID starting with 05c3fed835b7fdcce905ee93aef2f54d4e6240c27fe9e17cb898bae2355601d6 not found: ID does not exist" Mar 14 09:21:25 crc kubenswrapper[5129]: I0314 09:21:25.085572 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56987d09-fdb3-4de9-b8df-b75efc6c6fb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56987d09-fdb3-4de9-b8df-b75efc6c6fb2" (UID: "56987d09-fdb3-4de9-b8df-b75efc6c6fb2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:21:25 crc kubenswrapper[5129]: I0314 09:21:25.167687 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56987d09-fdb3-4de9-b8df-b75efc6c6fb2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:21:25 crc kubenswrapper[5129]: I0314 09:21:25.390144 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kksh9"] Mar 14 09:21:25 crc kubenswrapper[5129]: I0314 09:21:25.400654 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kksh9"] Mar 14 09:21:26 crc kubenswrapper[5129]: I0314 09:21:26.036224 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38" Mar 14 09:21:26 crc kubenswrapper[5129]: E0314 09:21:26.036573 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:21:26 crc kubenswrapper[5129]: I0314 09:21:26.100107 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56987d09-fdb3-4de9-b8df-b75efc6c6fb2" path="/var/lib/kubelet/pods/56987d09-fdb3-4de9-b8df-b75efc6c6fb2/volumes" Mar 14 09:21:37 crc kubenswrapper[5129]: I0314 09:21:37.036835 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38" Mar 14 09:21:37 crc kubenswrapper[5129]: E0314 09:21:37.037640 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:21:48 crc kubenswrapper[5129]: I0314 09:21:48.044656 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38" Mar 14 09:21:48 crc kubenswrapper[5129]: E0314 09:21:48.048108 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:22:00 crc kubenswrapper[5129]: I0314 09:22:00.152704 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558002-np6rf"] Mar 14 
09:22:00 crc kubenswrapper[5129]: E0314 09:22:00.154450 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56987d09-fdb3-4de9-b8df-b75efc6c6fb2" containerName="extract-content" Mar 14 09:22:00 crc kubenswrapper[5129]: I0314 09:22:00.154471 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="56987d09-fdb3-4de9-b8df-b75efc6c6fb2" containerName="extract-content" Mar 14 09:22:00 crc kubenswrapper[5129]: E0314 09:22:00.154517 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56987d09-fdb3-4de9-b8df-b75efc6c6fb2" containerName="registry-server" Mar 14 09:22:00 crc kubenswrapper[5129]: I0314 09:22:00.154528 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="56987d09-fdb3-4de9-b8df-b75efc6c6fb2" containerName="registry-server" Mar 14 09:22:00 crc kubenswrapper[5129]: E0314 09:22:00.154569 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56987d09-fdb3-4de9-b8df-b75efc6c6fb2" containerName="extract-utilities" Mar 14 09:22:00 crc kubenswrapper[5129]: I0314 09:22:00.154581 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="56987d09-fdb3-4de9-b8df-b75efc6c6fb2" containerName="extract-utilities" Mar 14 09:22:00 crc kubenswrapper[5129]: I0314 09:22:00.154924 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="56987d09-fdb3-4de9-b8df-b75efc6c6fb2" containerName="registry-server" Mar 14 09:22:00 crc kubenswrapper[5129]: I0314 09:22:00.156294 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558002-np6rf"
Mar 14 09:22:00 crc kubenswrapper[5129]: I0314 09:22:00.159325 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 09:22:00 crc kubenswrapper[5129]: I0314 09:22:00.159353 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 09:22:00 crc kubenswrapper[5129]: I0314 09:22:00.159484 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf"
Mar 14 09:22:00 crc kubenswrapper[5129]: I0314 09:22:00.166354 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558002-np6rf"]
Mar 14 09:22:00 crc kubenswrapper[5129]: I0314 09:22:00.309212 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp8ct\" (UniqueName: \"kubernetes.io/projected/2a192670-ecbb-4862-a27e-e8c0536bde53-kube-api-access-zp8ct\") pod \"auto-csr-approver-29558002-np6rf\" (UID: \"2a192670-ecbb-4862-a27e-e8c0536bde53\") " pod="openshift-infra/auto-csr-approver-29558002-np6rf"
Mar 14 09:22:00 crc kubenswrapper[5129]: I0314 09:22:00.411945 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp8ct\" (UniqueName: \"kubernetes.io/projected/2a192670-ecbb-4862-a27e-e8c0536bde53-kube-api-access-zp8ct\") pod \"auto-csr-approver-29558002-np6rf\" (UID: \"2a192670-ecbb-4862-a27e-e8c0536bde53\") " pod="openshift-infra/auto-csr-approver-29558002-np6rf"
Mar 14 09:22:00 crc kubenswrapper[5129]: I0314 09:22:00.438433 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp8ct\" (UniqueName: \"kubernetes.io/projected/2a192670-ecbb-4862-a27e-e8c0536bde53-kube-api-access-zp8ct\") pod \"auto-csr-approver-29558002-np6rf\" (UID: \"2a192670-ecbb-4862-a27e-e8c0536bde53\") " pod="openshift-infra/auto-csr-approver-29558002-np6rf"
Mar 14 09:22:00 crc kubenswrapper[5129]: I0314 09:22:00.477438 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558002-np6rf"
Mar 14 09:22:01 crc kubenswrapper[5129]: I0314 09:22:01.016529 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558002-np6rf"]
Mar 14 09:22:01 crc kubenswrapper[5129]: I0314 09:22:01.154309 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558002-np6rf" event={"ID":"2a192670-ecbb-4862-a27e-e8c0536bde53","Type":"ContainerStarted","Data":"fd283900d25cf949254eb4681f4741f46d3a89f0116f5db783ac4d4d95f6f8dd"}
Mar 14 09:22:02 crc kubenswrapper[5129]: I0314 09:22:02.037390 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38"
Mar 14 09:22:02 crc kubenswrapper[5129]: E0314 09:22:02.038529 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 09:22:02 crc kubenswrapper[5129]: I0314 09:22:02.165672 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558002-np6rf" event={"ID":"2a192670-ecbb-4862-a27e-e8c0536bde53","Type":"ContainerStarted","Data":"57c2452833de7e3f4f61355cb63ec7942151b07421b72108c926529490922b85"}
Mar 14 09:22:03 crc kubenswrapper[5129]: I0314 09:22:03.177007 5129 generic.go:334] "Generic (PLEG): container finished" podID="2a192670-ecbb-4862-a27e-e8c0536bde53" containerID="57c2452833de7e3f4f61355cb63ec7942151b07421b72108c926529490922b85" exitCode=0
Mar 14 09:22:03 crc kubenswrapper[5129]: I0314 09:22:03.177206 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558002-np6rf" event={"ID":"2a192670-ecbb-4862-a27e-e8c0536bde53","Type":"ContainerDied","Data":"57c2452833de7e3f4f61355cb63ec7942151b07421b72108c926529490922b85"}
Mar 14 09:22:04 crc kubenswrapper[5129]: I0314 09:22:04.592354 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558002-np6rf"
Mar 14 09:22:04 crc kubenswrapper[5129]: I0314 09:22:04.721403 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp8ct\" (UniqueName: \"kubernetes.io/projected/2a192670-ecbb-4862-a27e-e8c0536bde53-kube-api-access-zp8ct\") pod \"2a192670-ecbb-4862-a27e-e8c0536bde53\" (UID: \"2a192670-ecbb-4862-a27e-e8c0536bde53\") "
Mar 14 09:22:04 crc kubenswrapper[5129]: I0314 09:22:04.729259 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a192670-ecbb-4862-a27e-e8c0536bde53-kube-api-access-zp8ct" (OuterVolumeSpecName: "kube-api-access-zp8ct") pod "2a192670-ecbb-4862-a27e-e8c0536bde53" (UID: "2a192670-ecbb-4862-a27e-e8c0536bde53"). InnerVolumeSpecName "kube-api-access-zp8ct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:22:04 crc kubenswrapper[5129]: I0314 09:22:04.825176 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp8ct\" (UniqueName: \"kubernetes.io/projected/2a192670-ecbb-4862-a27e-e8c0536bde53-kube-api-access-zp8ct\") on node \"crc\" DevicePath \"\""
Mar 14 09:22:05 crc kubenswrapper[5129]: I0314 09:22:05.203351 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558002-np6rf" event={"ID":"2a192670-ecbb-4862-a27e-e8c0536bde53","Type":"ContainerDied","Data":"fd283900d25cf949254eb4681f4741f46d3a89f0116f5db783ac4d4d95f6f8dd"}
Mar 14 09:22:05 crc kubenswrapper[5129]: I0314 09:22:05.203794 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd283900d25cf949254eb4681f4741f46d3a89f0116f5db783ac4d4d95f6f8dd"
Mar 14 09:22:05 crc kubenswrapper[5129]: I0314 09:22:05.203471 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558002-np6rf"
Mar 14 09:22:05 crc kubenswrapper[5129]: I0314 09:22:05.274812 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557996-8wh64"]
Mar 14 09:22:05 crc kubenswrapper[5129]: I0314 09:22:05.282936 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557996-8wh64"]
Mar 14 09:22:06 crc kubenswrapper[5129]: I0314 09:22:06.050944 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1e8f005-e776-479c-8946-e516b13555c7" path="/var/lib/kubelet/pods/c1e8f005-e776-479c-8946-e516b13555c7/volumes"
Mar 14 09:22:16 crc kubenswrapper[5129]: I0314 09:22:16.037288 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38"
Mar 14 09:22:16 crc kubenswrapper[5129]: E0314 09:22:16.038472 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 09:22:27 crc kubenswrapper[5129]: I0314 09:22:27.037243 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38"
Mar 14 09:22:27 crc kubenswrapper[5129]: E0314 09:22:27.039272 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 09:22:35 crc kubenswrapper[5129]: I0314 09:22:35.075470 5129 scope.go:117] "RemoveContainer" containerID="1c8d1b31196e1bac6fd195f3ed482fe9e2ae489d67afaffef42c8116433f847a"
Mar 14 09:22:40 crc kubenswrapper[5129]: I0314 09:22:40.037782 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38"
Mar 14 09:22:40 crc kubenswrapper[5129]: E0314 09:22:40.038518 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 09:22:51 crc kubenswrapper[5129]: I0314 09:22:51.036361 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38"
Mar 14 09:22:51 crc kubenswrapper[5129]: E0314 09:22:51.037896 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 09:23:05 crc kubenswrapper[5129]: I0314 09:23:05.037173 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38"
Mar 14 09:23:05 crc kubenswrapper[5129]: E0314 09:23:05.038039 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 09:23:16 crc kubenswrapper[5129]: I0314 09:23:16.037346 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38"
Mar 14 09:23:16 crc kubenswrapper[5129]: E0314 09:23:16.039036 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 09:23:29 crc kubenswrapper[5129]: I0314 09:23:29.036811 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38"
Mar 14 09:23:29 crc kubenswrapper[5129]: E0314 09:23:29.038024 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 09:23:41 crc kubenswrapper[5129]: I0314 09:23:41.037289 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38"
Mar 14 09:23:41 crc kubenswrapper[5129]: E0314 09:23:41.038452 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 09:23:52 crc kubenswrapper[5129]: I0314 09:23:52.037479 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38"
Mar 14 09:23:52 crc kubenswrapper[5129]: E0314 09:23:52.038868 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 09:24:00 crc kubenswrapper[5129]: I0314 09:24:00.141456 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558004-tt9qw"]
Mar 14 09:24:00 crc kubenswrapper[5129]: E0314 09:24:00.143245 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a192670-ecbb-4862-a27e-e8c0536bde53" containerName="oc"
Mar 14 09:24:00 crc kubenswrapper[5129]: I0314 09:24:00.143262 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a192670-ecbb-4862-a27e-e8c0536bde53" containerName="oc"
Mar 14 09:24:00 crc kubenswrapper[5129]: I0314 09:24:00.143488 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a192670-ecbb-4862-a27e-e8c0536bde53" containerName="oc"
Mar 14 09:24:00 crc kubenswrapper[5129]: I0314 09:24:00.144373 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558004-tt9qw"
Mar 14 09:24:00 crc kubenswrapper[5129]: I0314 09:24:00.146698 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf"
Mar 14 09:24:00 crc kubenswrapper[5129]: I0314 09:24:00.146920 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 09:24:00 crc kubenswrapper[5129]: I0314 09:24:00.147049 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 09:24:00 crc kubenswrapper[5129]: I0314 09:24:00.153066 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558004-tt9qw"]
Mar 14 09:24:00 crc kubenswrapper[5129]: I0314 09:24:00.263266 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tllt\" (UniqueName: \"kubernetes.io/projected/f09ccbf1-7724-4b26-bc21-2822d2ea8eaa-kube-api-access-2tllt\") pod \"auto-csr-approver-29558004-tt9qw\" (UID: \"f09ccbf1-7724-4b26-bc21-2822d2ea8eaa\") " pod="openshift-infra/auto-csr-approver-29558004-tt9qw"
Mar 14 09:24:00 crc kubenswrapper[5129]: I0314 09:24:00.365494 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tllt\" (UniqueName: \"kubernetes.io/projected/f09ccbf1-7724-4b26-bc21-2822d2ea8eaa-kube-api-access-2tllt\") pod \"auto-csr-approver-29558004-tt9qw\" (UID: \"f09ccbf1-7724-4b26-bc21-2822d2ea8eaa\") " pod="openshift-infra/auto-csr-approver-29558004-tt9qw"
Mar 14 09:24:00 crc kubenswrapper[5129]: I0314 09:24:00.384859 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tllt\" (UniqueName: \"kubernetes.io/projected/f09ccbf1-7724-4b26-bc21-2822d2ea8eaa-kube-api-access-2tllt\") pod \"auto-csr-approver-29558004-tt9qw\" (UID: \"f09ccbf1-7724-4b26-bc21-2822d2ea8eaa\") " pod="openshift-infra/auto-csr-approver-29558004-tt9qw"
Mar 14 09:24:00 crc kubenswrapper[5129]: I0314 09:24:00.466723 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558004-tt9qw"
Mar 14 09:24:00 crc kubenswrapper[5129]: I0314 09:24:00.919825 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558004-tt9qw"]
Mar 14 09:24:00 crc kubenswrapper[5129]: I0314 09:24:00.929090 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 14 09:24:01 crc kubenswrapper[5129]: I0314 09:24:01.529795 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558004-tt9qw" event={"ID":"f09ccbf1-7724-4b26-bc21-2822d2ea8eaa","Type":"ContainerStarted","Data":"82a58900a078b2348ef7384caba653885721030fc428ef43418aa110e6a0d989"}
Mar 14 09:24:02 crc kubenswrapper[5129]: I0314 09:24:02.542832 5129 generic.go:334] "Generic (PLEG): container finished" podID="f09ccbf1-7724-4b26-bc21-2822d2ea8eaa" containerID="590608f3b22bf63f9e7dc0645f708b7ccee925ba750fb564f457a6440badb487" exitCode=0
Mar 14 09:24:02 crc kubenswrapper[5129]: I0314 09:24:02.542940 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558004-tt9qw" event={"ID":"f09ccbf1-7724-4b26-bc21-2822d2ea8eaa","Type":"ContainerDied","Data":"590608f3b22bf63f9e7dc0645f708b7ccee925ba750fb564f457a6440badb487"}
Mar 14 09:24:03 crc kubenswrapper[5129]: I0314 09:24:03.037842 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38"
Mar 14 09:24:03 crc kubenswrapper[5129]: E0314 09:24:03.038201 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 09:24:03 crc kubenswrapper[5129]: I0314 09:24:03.928824 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558004-tt9qw"
Mar 14 09:24:04 crc kubenswrapper[5129]: I0314 09:24:04.052937 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tllt\" (UniqueName: \"kubernetes.io/projected/f09ccbf1-7724-4b26-bc21-2822d2ea8eaa-kube-api-access-2tllt\") pod \"f09ccbf1-7724-4b26-bc21-2822d2ea8eaa\" (UID: \"f09ccbf1-7724-4b26-bc21-2822d2ea8eaa\") "
Mar 14 09:24:04 crc kubenswrapper[5129]: I0314 09:24:04.059667 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f09ccbf1-7724-4b26-bc21-2822d2ea8eaa-kube-api-access-2tllt" (OuterVolumeSpecName: "kube-api-access-2tllt") pod "f09ccbf1-7724-4b26-bc21-2822d2ea8eaa" (UID: "f09ccbf1-7724-4b26-bc21-2822d2ea8eaa"). InnerVolumeSpecName "kube-api-access-2tllt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:24:04 crc kubenswrapper[5129]: I0314 09:24:04.157812 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tllt\" (UniqueName: \"kubernetes.io/projected/f09ccbf1-7724-4b26-bc21-2822d2ea8eaa-kube-api-access-2tllt\") on node \"crc\" DevicePath \"\""
Mar 14 09:24:04 crc kubenswrapper[5129]: I0314 09:24:04.566303 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558004-tt9qw" event={"ID":"f09ccbf1-7724-4b26-bc21-2822d2ea8eaa","Type":"ContainerDied","Data":"82a58900a078b2348ef7384caba653885721030fc428ef43418aa110e6a0d989"}
Mar 14 09:24:04 crc kubenswrapper[5129]: I0314 09:24:04.567686 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82a58900a078b2348ef7384caba653885721030fc428ef43418aa110e6a0d989"
Mar 14 09:24:04 crc kubenswrapper[5129]: I0314 09:24:04.566419 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558004-tt9qw"
Mar 14 09:24:05 crc kubenswrapper[5129]: I0314 09:24:05.027627 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557998-748fl"]
Mar 14 09:24:05 crc kubenswrapper[5129]: I0314 09:24:05.043570 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557998-748fl"]
Mar 14 09:24:06 crc kubenswrapper[5129]: I0314 09:24:06.049628 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73053e95-f728-45d8-9cc5-74195815d48d" path="/var/lib/kubelet/pods/73053e95-f728-45d8-9cc5-74195815d48d/volumes"
Mar 14 09:24:06 crc kubenswrapper[5129]: I0314 09:24:06.341539 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vz984"]
Mar 14 09:24:06 crc kubenswrapper[5129]: E0314 09:24:06.342851 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f09ccbf1-7724-4b26-bc21-2822d2ea8eaa" containerName="oc"
Mar 14 09:24:06 crc kubenswrapper[5129]: I0314 09:24:06.342879 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f09ccbf1-7724-4b26-bc21-2822d2ea8eaa" containerName="oc"
Mar 14 09:24:06 crc kubenswrapper[5129]: I0314 09:24:06.343259 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f09ccbf1-7724-4b26-bc21-2822d2ea8eaa" containerName="oc"
Mar 14 09:24:06 crc kubenswrapper[5129]: I0314 09:24:06.345566 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vz984"
Mar 14 09:24:06 crc kubenswrapper[5129]: I0314 09:24:06.470188 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vz984"]
Mar 14 09:24:06 crc kubenswrapper[5129]: I0314 09:24:06.521504 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c1d6709-7a13-49f5-805a-a58cc35dc857-catalog-content\") pod \"certified-operators-vz984\" (UID: \"3c1d6709-7a13-49f5-805a-a58cc35dc857\") " pod="openshift-marketplace/certified-operators-vz984"
Mar 14 09:24:06 crc kubenswrapper[5129]: I0314 09:24:06.521820 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c1d6709-7a13-49f5-805a-a58cc35dc857-utilities\") pod \"certified-operators-vz984\" (UID: \"3c1d6709-7a13-49f5-805a-a58cc35dc857\") " pod="openshift-marketplace/certified-operators-vz984"
Mar 14 09:24:06 crc kubenswrapper[5129]: I0314 09:24:06.522001 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sdjj\" (UniqueName: \"kubernetes.io/projected/3c1d6709-7a13-49f5-805a-a58cc35dc857-kube-api-access-5sdjj\") pod \"certified-operators-vz984\" (UID: \"3c1d6709-7a13-49f5-805a-a58cc35dc857\") " pod="openshift-marketplace/certified-operators-vz984"
Mar 14 09:24:06 crc kubenswrapper[5129]: I0314 09:24:06.624403 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c1d6709-7a13-49f5-805a-a58cc35dc857-utilities\") pod \"certified-operators-vz984\" (UID: \"3c1d6709-7a13-49f5-805a-a58cc35dc857\") " pod="openshift-marketplace/certified-operators-vz984"
Mar 14 09:24:06 crc kubenswrapper[5129]: I0314 09:24:06.624503 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sdjj\" (UniqueName: \"kubernetes.io/projected/3c1d6709-7a13-49f5-805a-a58cc35dc857-kube-api-access-5sdjj\") pod \"certified-operators-vz984\" (UID: \"3c1d6709-7a13-49f5-805a-a58cc35dc857\") " pod="openshift-marketplace/certified-operators-vz984"
Mar 14 09:24:06 crc kubenswrapper[5129]: I0314 09:24:06.624661 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c1d6709-7a13-49f5-805a-a58cc35dc857-catalog-content\") pod \"certified-operators-vz984\" (UID: \"3c1d6709-7a13-49f5-805a-a58cc35dc857\") " pod="openshift-marketplace/certified-operators-vz984"
Mar 14 09:24:06 crc kubenswrapper[5129]: I0314 09:24:06.625258 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c1d6709-7a13-49f5-805a-a58cc35dc857-catalog-content\") pod \"certified-operators-vz984\" (UID: \"3c1d6709-7a13-49f5-805a-a58cc35dc857\") " pod="openshift-marketplace/certified-operators-vz984"
Mar 14 09:24:06 crc kubenswrapper[5129]: I0314 09:24:06.625520 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c1d6709-7a13-49f5-805a-a58cc35dc857-utilities\") pod \"certified-operators-vz984\" (UID: \"3c1d6709-7a13-49f5-805a-a58cc35dc857\") " pod="openshift-marketplace/certified-operators-vz984"
Mar 14 09:24:06 crc kubenswrapper[5129]: I0314 09:24:06.664431 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sdjj\" (UniqueName: \"kubernetes.io/projected/3c1d6709-7a13-49f5-805a-a58cc35dc857-kube-api-access-5sdjj\") pod \"certified-operators-vz984\" (UID: \"3c1d6709-7a13-49f5-805a-a58cc35dc857\") " pod="openshift-marketplace/certified-operators-vz984"
Mar 14 09:24:06 crc kubenswrapper[5129]: I0314 09:24:06.676355 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vz984"
Mar 14 09:24:07 crc kubenswrapper[5129]: I0314 09:24:07.229780 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vz984"]
Mar 14 09:24:07 crc kubenswrapper[5129]: I0314 09:24:07.600555 5129 generic.go:334] "Generic (PLEG): container finished" podID="3c1d6709-7a13-49f5-805a-a58cc35dc857" containerID="2f82040411e7bc747be2b87eb6e5088bbf37d421ada2ed5b04d58d6893b4644d" exitCode=0
Mar 14 09:24:07 crc kubenswrapper[5129]: I0314 09:24:07.600640 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vz984" event={"ID":"3c1d6709-7a13-49f5-805a-a58cc35dc857","Type":"ContainerDied","Data":"2f82040411e7bc747be2b87eb6e5088bbf37d421ada2ed5b04d58d6893b4644d"}
Mar 14 09:24:07 crc kubenswrapper[5129]: I0314 09:24:07.600678 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vz984" event={"ID":"3c1d6709-7a13-49f5-805a-a58cc35dc857","Type":"ContainerStarted","Data":"01ba408dcc6d32b1bf86b75343aa562db03ae320a05c7419fc183d6a447a85c4"}
Mar 14 09:24:08 crc kubenswrapper[5129]: I0314 09:24:08.616530 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vz984" event={"ID":"3c1d6709-7a13-49f5-805a-a58cc35dc857","Type":"ContainerStarted","Data":"822f9b8c496546e962dffd1e55eebb633e2088aafa58eb134786a63857f2dfcd"}
Mar 14 09:24:08 crc kubenswrapper[5129]: I0314 09:24:08.715465 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c8t6g"]
Mar 14 09:24:08 crc kubenswrapper[5129]: I0314 09:24:08.718968 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c8t6g"
Mar 14 09:24:08 crc kubenswrapper[5129]: I0314 09:24:08.736626 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8t6g"]
Mar 14 09:24:08 crc kubenswrapper[5129]: I0314 09:24:08.899753 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94bqv\" (UniqueName: \"kubernetes.io/projected/039f2ab5-8a1d-451d-a6a0-3b227a11606a-kube-api-access-94bqv\") pod \"redhat-marketplace-c8t6g\" (UID: \"039f2ab5-8a1d-451d-a6a0-3b227a11606a\") " pod="openshift-marketplace/redhat-marketplace-c8t6g"
Mar 14 09:24:08 crc kubenswrapper[5129]: I0314 09:24:08.900030 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/039f2ab5-8a1d-451d-a6a0-3b227a11606a-utilities\") pod \"redhat-marketplace-c8t6g\" (UID: \"039f2ab5-8a1d-451d-a6a0-3b227a11606a\") " pod="openshift-marketplace/redhat-marketplace-c8t6g"
Mar 14 09:24:08 crc kubenswrapper[5129]: I0314 09:24:08.900074 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/039f2ab5-8a1d-451d-a6a0-3b227a11606a-catalog-content\") pod \"redhat-marketplace-c8t6g\" (UID: \"039f2ab5-8a1d-451d-a6a0-3b227a11606a\") " pod="openshift-marketplace/redhat-marketplace-c8t6g"
Mar 14 09:24:09 crc kubenswrapper[5129]: I0314 09:24:09.003426 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/039f2ab5-8a1d-451d-a6a0-3b227a11606a-utilities\") pod \"redhat-marketplace-c8t6g\" (UID: \"039f2ab5-8a1d-451d-a6a0-3b227a11606a\") " pod="openshift-marketplace/redhat-marketplace-c8t6g"
Mar 14 09:24:09 crc kubenswrapper[5129]: I0314 09:24:09.003843 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/039f2ab5-8a1d-451d-a6a0-3b227a11606a-catalog-content\") pod \"redhat-marketplace-c8t6g\" (UID: \"039f2ab5-8a1d-451d-a6a0-3b227a11606a\") " pod="openshift-marketplace/redhat-marketplace-c8t6g"
Mar 14 09:24:09 crc kubenswrapper[5129]: I0314 09:24:09.003975 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/039f2ab5-8a1d-451d-a6a0-3b227a11606a-utilities\") pod \"redhat-marketplace-c8t6g\" (UID: \"039f2ab5-8a1d-451d-a6a0-3b227a11606a\") " pod="openshift-marketplace/redhat-marketplace-c8t6g"
Mar 14 09:24:09 crc kubenswrapper[5129]: I0314 09:24:09.004581 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94bqv\" (UniqueName: \"kubernetes.io/projected/039f2ab5-8a1d-451d-a6a0-3b227a11606a-kube-api-access-94bqv\") pod \"redhat-marketplace-c8t6g\" (UID: \"039f2ab5-8a1d-451d-a6a0-3b227a11606a\") " pod="openshift-marketplace/redhat-marketplace-c8t6g"
Mar 14 09:24:09 crc kubenswrapper[5129]: I0314 09:24:09.004880 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/039f2ab5-8a1d-451d-a6a0-3b227a11606a-catalog-content\") pod \"redhat-marketplace-c8t6g\" (UID: \"039f2ab5-8a1d-451d-a6a0-3b227a11606a\") " pod="openshift-marketplace/redhat-marketplace-c8t6g"
Mar 14 09:24:09 crc kubenswrapper[5129]: I0314 09:24:09.026377 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94bqv\" (UniqueName: \"kubernetes.io/projected/039f2ab5-8a1d-451d-a6a0-3b227a11606a-kube-api-access-94bqv\") pod \"redhat-marketplace-c8t6g\" (UID: \"039f2ab5-8a1d-451d-a6a0-3b227a11606a\") " pod="openshift-marketplace/redhat-marketplace-c8t6g"
Mar 14 09:24:09 crc kubenswrapper[5129]: I0314 09:24:09.057775 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c8t6g"
Mar 14 09:24:09 crc kubenswrapper[5129]: I0314 09:24:09.573959 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8t6g"]
Mar 14 09:24:09 crc kubenswrapper[5129]: I0314 09:24:09.632915 5129 generic.go:334] "Generic (PLEG): container finished" podID="3c1d6709-7a13-49f5-805a-a58cc35dc857" containerID="822f9b8c496546e962dffd1e55eebb633e2088aafa58eb134786a63857f2dfcd" exitCode=0
Mar 14 09:24:09 crc kubenswrapper[5129]: I0314 09:24:09.633012 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vz984" event={"ID":"3c1d6709-7a13-49f5-805a-a58cc35dc857","Type":"ContainerDied","Data":"822f9b8c496546e962dffd1e55eebb633e2088aafa58eb134786a63857f2dfcd"}
Mar 14 09:24:09 crc kubenswrapper[5129]: I0314 09:24:09.640676 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8t6g" event={"ID":"039f2ab5-8a1d-451d-a6a0-3b227a11606a","Type":"ContainerStarted","Data":"5ce359c62c18b7d727eee2ebd6671231b93d8a11f6517f3b87fce3d889fe592c"}
Mar 14 09:24:10 crc kubenswrapper[5129]: I0314 09:24:10.656298 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vz984" event={"ID":"3c1d6709-7a13-49f5-805a-a58cc35dc857","Type":"ContainerStarted","Data":"bb5051ea6c5346841263497f2130299cf54e33908363687e082432ff7c18c811"}
Mar 14 09:24:10 crc kubenswrapper[5129]: I0314 09:24:10.658883 5129 generic.go:334] "Generic (PLEG): container finished" podID="039f2ab5-8a1d-451d-a6a0-3b227a11606a" containerID="ddf3e204feb00ea45f4bcf89b61b2f5e4fab31d41f4cf0a97cc3fc47122f99b8" exitCode=0
Mar 14 09:24:10 crc kubenswrapper[5129]: I0314 09:24:10.658946 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8t6g" event={"ID":"039f2ab5-8a1d-451d-a6a0-3b227a11606a","Type":"ContainerDied","Data":"ddf3e204feb00ea45f4bcf89b61b2f5e4fab31d41f4cf0a97cc3fc47122f99b8"}
Mar 14 09:24:10 crc kubenswrapper[5129]: I0314 09:24:10.693442 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vz984" podStartSLOduration=2.132151581 podStartE2EDuration="4.693420675s" podCreationTimestamp="2026-03-14 09:24:06 +0000 UTC" firstStartedPulling="2026-03-14 09:24:07.6028773 +0000 UTC m=+8710.354792484" lastFinishedPulling="2026-03-14 09:24:10.164146394 +0000 UTC m=+8712.916061578" observedRunningTime="2026-03-14 09:24:10.687992568 +0000 UTC m=+8713.439907782" watchObservedRunningTime="2026-03-14 09:24:10.693420675 +0000 UTC m=+8713.445335879"
Mar 14 09:24:11 crc kubenswrapper[5129]: I0314 09:24:11.672012 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8t6g" event={"ID":"039f2ab5-8a1d-451d-a6a0-3b227a11606a","Type":"ContainerStarted","Data":"91f988f43295e86dbc5ff38ada9fd8a358b76b8c5d4b4078687e32d570d03d03"}
Mar 14 09:24:12 crc kubenswrapper[5129]: I0314 09:24:12.717706 5129 generic.go:334] "Generic (PLEG): container finished" podID="039f2ab5-8a1d-451d-a6a0-3b227a11606a" containerID="91f988f43295e86dbc5ff38ada9fd8a358b76b8c5d4b4078687e32d570d03d03" exitCode=0
Mar 14 09:24:12 crc kubenswrapper[5129]: I0314 09:24:12.718370 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8t6g" event={"ID":"039f2ab5-8a1d-451d-a6a0-3b227a11606a","Type":"ContainerDied","Data":"91f988f43295e86dbc5ff38ada9fd8a358b76b8c5d4b4078687e32d570d03d03"}
Mar 14 09:24:13 crc kubenswrapper[5129]: I0314 09:24:13.738084 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8t6g" event={"ID":"039f2ab5-8a1d-451d-a6a0-3b227a11606a","Type":"ContainerStarted","Data":"c5a6f8d9c49db561e43b41cacf2d22629cc31dad77dbf006b75f3021c10cf3df"}
Mar 14 09:24:13 crc kubenswrapper[5129]: I0314 09:24:13.770267 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c8t6g" podStartSLOduration=3.063777686 podStartE2EDuration="5.77023729s" podCreationTimestamp="2026-03-14 09:24:08 +0000 UTC" firstStartedPulling="2026-03-14 09:24:10.660505157 +0000 UTC m=+8713.412420341" lastFinishedPulling="2026-03-14 09:24:13.366964761 +0000 UTC m=+8716.118879945" observedRunningTime="2026-03-14 09:24:13.767742742 +0000 UTC m=+8716.519657996" watchObservedRunningTime="2026-03-14 09:24:13.77023729 +0000 UTC m=+8716.522152514"
Mar 14 09:24:14 crc kubenswrapper[5129]: I0314 09:24:14.759374 5129 generic.go:334] "Generic (PLEG): container finished" podID="35d9420c-6bf5-4e2b-a80c-9e2b364f6a64" containerID="a13d66826c52749337612761c09fb2c3ea8d20f3ed1e0a0e4d4391ecffcd473d" exitCode=0
Mar 14 09:24:14 crc kubenswrapper[5129]: I0314 09:24:14.759842 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" event={"ID":"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64","Type":"ContainerDied","Data":"a13d66826c52749337612761c09fb2c3ea8d20f3ed1e0a0e4d4391ecffcd473d"}
Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.037992 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38"
Mar 14 09:24:16 crc kubenswrapper[5129]: E0314 09:24:16.039044 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000"
Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.298514 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz"
Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.320711 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-ssh-key-openstack-cell1\") pod \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\" (UID: \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\") "
Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.320797 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-bootstrap-combined-ca-bundle\") pod \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\" (UID: \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\") "
Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.320893 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbqh9\" (UniqueName: \"kubernetes.io/projected/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-kube-api-access-mbqh9\") pod \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\" (UID: \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\") "
Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.321152 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-inventory\") pod \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\" (UID: \"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64\") "
Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.328162 5129 operation_generator.go:803] UnmountVolume.TearDown
succeeded for volume "kubernetes.io/secret/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "35d9420c-6bf5-4e2b-a80c-9e2b364f6a64" (UID: "35d9420c-6bf5-4e2b-a80c-9e2b364f6a64"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.331438 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-kube-api-access-mbqh9" (OuterVolumeSpecName: "kube-api-access-mbqh9") pod "35d9420c-6bf5-4e2b-a80c-9e2b364f6a64" (UID: "35d9420c-6bf5-4e2b-a80c-9e2b364f6a64"). InnerVolumeSpecName "kube-api-access-mbqh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.358881 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-inventory" (OuterVolumeSpecName: "inventory") pod "35d9420c-6bf5-4e2b-a80c-9e2b364f6a64" (UID: "35d9420c-6bf5-4e2b-a80c-9e2b364f6a64"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.367495 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "35d9420c-6bf5-4e2b-a80c-9e2b364f6a64" (UID: "35d9420c-6bf5-4e2b-a80c-9e2b364f6a64"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.424843 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.425168 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.425241 5129 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.425316 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbqh9\" (UniqueName: \"kubernetes.io/projected/35d9420c-6bf5-4e2b-a80c-9e2b364f6a64-kube-api-access-mbqh9\") on node \"crc\" DevicePath \"\"" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.676747 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vz984" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.677017 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vz984" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.739356 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vz984" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.781501 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.781488 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-m6vlz" event={"ID":"35d9420c-6bf5-4e2b-a80c-9e2b364f6a64","Type":"ContainerDied","Data":"ab30ed1e4f042bc5bb73d6eaa29d2ed5c3eabeda5d67b879e24ef2c3f164ab44"} Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.781739 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab30ed1e4f042bc5bb73d6eaa29d2ed5c3eabeda5d67b879e24ef2c3f164ab44" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.838849 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vz984" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.945963 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-2vl8w"] Mar 14 09:24:16 crc kubenswrapper[5129]: E0314 09:24:16.946727 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d9420c-6bf5-4e2b-a80c-9e2b364f6a64" containerName="bootstrap-openstack-openstack-cell1" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.946756 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d9420c-6bf5-4e2b-a80c-9e2b364f6a64" containerName="bootstrap-openstack-openstack-cell1" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.947057 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d9420c-6bf5-4e2b-a80c-9e2b364f6a64" containerName="bootstrap-openstack-openstack-cell1" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.948035 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-2vl8w" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.952014 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.952013 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc" Mar 14 09:24:16 crc kubenswrapper[5129]: I0314 09:24:16.958504 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-2vl8w"] Mar 14 09:24:17 crc kubenswrapper[5129]: I0314 09:24:17.046537 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbwvk\" (UniqueName: \"kubernetes.io/projected/123dc78e-9c5e-4d0b-9b8a-16d2217f9f72-kube-api-access-bbwvk\") pod \"download-cache-openstack-openstack-cell1-2vl8w\" (UID: \"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72\") " pod="openstack/download-cache-openstack-openstack-cell1-2vl8w" Mar 14 09:24:17 crc kubenswrapper[5129]: I0314 09:24:17.046706 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/123dc78e-9c5e-4d0b-9b8a-16d2217f9f72-inventory\") pod \"download-cache-openstack-openstack-cell1-2vl8w\" (UID: \"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72\") " pod="openstack/download-cache-openstack-openstack-cell1-2vl8w" Mar 14 09:24:17 crc kubenswrapper[5129]: I0314 09:24:17.046788 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/123dc78e-9c5e-4d0b-9b8a-16d2217f9f72-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-2vl8w\" (UID: \"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72\") " pod="openstack/download-cache-openstack-openstack-cell1-2vl8w" Mar 14 09:24:17 crc 
kubenswrapper[5129]: I0314 09:24:17.148325 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/123dc78e-9c5e-4d0b-9b8a-16d2217f9f72-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-2vl8w\" (UID: \"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72\") " pod="openstack/download-cache-openstack-openstack-cell1-2vl8w" Mar 14 09:24:17 crc kubenswrapper[5129]: I0314 09:24:17.148458 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbwvk\" (UniqueName: \"kubernetes.io/projected/123dc78e-9c5e-4d0b-9b8a-16d2217f9f72-kube-api-access-bbwvk\") pod \"download-cache-openstack-openstack-cell1-2vl8w\" (UID: \"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72\") " pod="openstack/download-cache-openstack-openstack-cell1-2vl8w" Mar 14 09:24:17 crc kubenswrapper[5129]: I0314 09:24:17.148580 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/123dc78e-9c5e-4d0b-9b8a-16d2217f9f72-inventory\") pod \"download-cache-openstack-openstack-cell1-2vl8w\" (UID: \"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72\") " pod="openstack/download-cache-openstack-openstack-cell1-2vl8w" Mar 14 09:24:17 crc kubenswrapper[5129]: I0314 09:24:17.154167 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/123dc78e-9c5e-4d0b-9b8a-16d2217f9f72-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-2vl8w\" (UID: \"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72\") " pod="openstack/download-cache-openstack-openstack-cell1-2vl8w" Mar 14 09:24:17 crc kubenswrapper[5129]: I0314 09:24:17.154408 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/123dc78e-9c5e-4d0b-9b8a-16d2217f9f72-inventory\") pod \"download-cache-openstack-openstack-cell1-2vl8w\" (UID: 
\"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72\") " pod="openstack/download-cache-openstack-openstack-cell1-2vl8w" Mar 14 09:24:17 crc kubenswrapper[5129]: I0314 09:24:17.167454 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbwvk\" (UniqueName: \"kubernetes.io/projected/123dc78e-9c5e-4d0b-9b8a-16d2217f9f72-kube-api-access-bbwvk\") pod \"download-cache-openstack-openstack-cell1-2vl8w\" (UID: \"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72\") " pod="openstack/download-cache-openstack-openstack-cell1-2vl8w" Mar 14 09:24:17 crc kubenswrapper[5129]: I0314 09:24:17.281749 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-2vl8w" Mar 14 09:24:17 crc kubenswrapper[5129]: I0314 09:24:17.910971 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vz984"] Mar 14 09:24:17 crc kubenswrapper[5129]: I0314 09:24:17.933480 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-2vl8w"] Mar 14 09:24:18 crc kubenswrapper[5129]: I0314 09:24:18.810930 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-2vl8w" event={"ID":"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72","Type":"ContainerStarted","Data":"c353313245dc77d8d23ce36ccfbb3ea06859c440d7f6ca85c65873cb77dc03ee"} Mar 14 09:24:18 crc kubenswrapper[5129]: I0314 09:24:18.814278 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-2vl8w" event={"ID":"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72","Type":"ContainerStarted","Data":"83e66d72eadc7f0ab1fd8532a09c2c308e09da67b1dd52ae34a35d4e884f6949"} Mar 14 09:24:18 crc kubenswrapper[5129]: I0314 09:24:18.839220 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-2vl8w" podStartSLOduration=2.422536751 
podStartE2EDuration="2.839199801s" podCreationTimestamp="2026-03-14 09:24:16 +0000 UTC" firstStartedPulling="2026-03-14 09:24:17.944419942 +0000 UTC m=+8720.696335126" lastFinishedPulling="2026-03-14 09:24:18.361082992 +0000 UTC m=+8721.112998176" observedRunningTime="2026-03-14 09:24:18.827209938 +0000 UTC m=+8721.579125142" watchObservedRunningTime="2026-03-14 09:24:18.839199801 +0000 UTC m=+8721.591114985" Mar 14 09:24:19 crc kubenswrapper[5129]: I0314 09:24:19.058393 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c8t6g" Mar 14 09:24:19 crc kubenswrapper[5129]: I0314 09:24:19.058458 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c8t6g" Mar 14 09:24:19 crc kubenswrapper[5129]: I0314 09:24:19.110388 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c8t6g" Mar 14 09:24:19 crc kubenswrapper[5129]: I0314 09:24:19.820670 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vz984" podUID="3c1d6709-7a13-49f5-805a-a58cc35dc857" containerName="registry-server" containerID="cri-o://bb5051ea6c5346841263497f2130299cf54e33908363687e082432ff7c18c811" gracePeriod=2 Mar 14 09:24:19 crc kubenswrapper[5129]: I0314 09:24:19.874468 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c8t6g" Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.278527 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vz984" Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.304997 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8t6g"] Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.461306 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c1d6709-7a13-49f5-805a-a58cc35dc857-utilities\") pod \"3c1d6709-7a13-49f5-805a-a58cc35dc857\" (UID: \"3c1d6709-7a13-49f5-805a-a58cc35dc857\") " Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.461463 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c1d6709-7a13-49f5-805a-a58cc35dc857-catalog-content\") pod \"3c1d6709-7a13-49f5-805a-a58cc35dc857\" (UID: \"3c1d6709-7a13-49f5-805a-a58cc35dc857\") " Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.461540 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sdjj\" (UniqueName: \"kubernetes.io/projected/3c1d6709-7a13-49f5-805a-a58cc35dc857-kube-api-access-5sdjj\") pod \"3c1d6709-7a13-49f5-805a-a58cc35dc857\" (UID: \"3c1d6709-7a13-49f5-805a-a58cc35dc857\") " Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.462306 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c1d6709-7a13-49f5-805a-a58cc35dc857-utilities" (OuterVolumeSpecName: "utilities") pod "3c1d6709-7a13-49f5-805a-a58cc35dc857" (UID: "3c1d6709-7a13-49f5-805a-a58cc35dc857"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.462618 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c1d6709-7a13-49f5-805a-a58cc35dc857-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.467921 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c1d6709-7a13-49f5-805a-a58cc35dc857-kube-api-access-5sdjj" (OuterVolumeSpecName: "kube-api-access-5sdjj") pod "3c1d6709-7a13-49f5-805a-a58cc35dc857" (UID: "3c1d6709-7a13-49f5-805a-a58cc35dc857"). InnerVolumeSpecName "kube-api-access-5sdjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.520863 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c1d6709-7a13-49f5-805a-a58cc35dc857-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c1d6709-7a13-49f5-805a-a58cc35dc857" (UID: "3c1d6709-7a13-49f5-805a-a58cc35dc857"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.565282 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c1d6709-7a13-49f5-805a-a58cc35dc857-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.565325 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sdjj\" (UniqueName: \"kubernetes.io/projected/3c1d6709-7a13-49f5-805a-a58cc35dc857-kube-api-access-5sdjj\") on node \"crc\" DevicePath \"\"" Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.831697 5129 generic.go:334] "Generic (PLEG): container finished" podID="3c1d6709-7a13-49f5-805a-a58cc35dc857" containerID="bb5051ea6c5346841263497f2130299cf54e33908363687e082432ff7c18c811" exitCode=0 Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.831787 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vz984" event={"ID":"3c1d6709-7a13-49f5-805a-a58cc35dc857","Type":"ContainerDied","Data":"bb5051ea6c5346841263497f2130299cf54e33908363687e082432ff7c18c811"} Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.831822 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vz984" Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.831850 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vz984" event={"ID":"3c1d6709-7a13-49f5-805a-a58cc35dc857","Type":"ContainerDied","Data":"01ba408dcc6d32b1bf86b75343aa562db03ae320a05c7419fc183d6a447a85c4"} Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.831876 5129 scope.go:117] "RemoveContainer" containerID="bb5051ea6c5346841263497f2130299cf54e33908363687e082432ff7c18c811" Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.864002 5129 scope.go:117] "RemoveContainer" containerID="822f9b8c496546e962dffd1e55eebb633e2088aafa58eb134786a63857f2dfcd" Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.873213 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vz984"] Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.884091 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vz984"] Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.902959 5129 scope.go:117] "RemoveContainer" containerID="2f82040411e7bc747be2b87eb6e5088bbf37d421ada2ed5b04d58d6893b4644d" Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.928894 5129 scope.go:117] "RemoveContainer" containerID="bb5051ea6c5346841263497f2130299cf54e33908363687e082432ff7c18c811" Mar 14 09:24:20 crc kubenswrapper[5129]: E0314 09:24:20.929273 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb5051ea6c5346841263497f2130299cf54e33908363687e082432ff7c18c811\": container with ID starting with bb5051ea6c5346841263497f2130299cf54e33908363687e082432ff7c18c811 not found: ID does not exist" containerID="bb5051ea6c5346841263497f2130299cf54e33908363687e082432ff7c18c811" Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.929309 5129 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5051ea6c5346841263497f2130299cf54e33908363687e082432ff7c18c811"} err="failed to get container status \"bb5051ea6c5346841263497f2130299cf54e33908363687e082432ff7c18c811\": rpc error: code = NotFound desc = could not find container \"bb5051ea6c5346841263497f2130299cf54e33908363687e082432ff7c18c811\": container with ID starting with bb5051ea6c5346841263497f2130299cf54e33908363687e082432ff7c18c811 not found: ID does not exist" Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.929330 5129 scope.go:117] "RemoveContainer" containerID="822f9b8c496546e962dffd1e55eebb633e2088aafa58eb134786a63857f2dfcd" Mar 14 09:24:20 crc kubenswrapper[5129]: E0314 09:24:20.929582 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"822f9b8c496546e962dffd1e55eebb633e2088aafa58eb134786a63857f2dfcd\": container with ID starting with 822f9b8c496546e962dffd1e55eebb633e2088aafa58eb134786a63857f2dfcd not found: ID does not exist" containerID="822f9b8c496546e962dffd1e55eebb633e2088aafa58eb134786a63857f2dfcd" Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.929633 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"822f9b8c496546e962dffd1e55eebb633e2088aafa58eb134786a63857f2dfcd"} err="failed to get container status \"822f9b8c496546e962dffd1e55eebb633e2088aafa58eb134786a63857f2dfcd\": rpc error: code = NotFound desc = could not find container \"822f9b8c496546e962dffd1e55eebb633e2088aafa58eb134786a63857f2dfcd\": container with ID starting with 822f9b8c496546e962dffd1e55eebb633e2088aafa58eb134786a63857f2dfcd not found: ID does not exist" Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.929660 5129 scope.go:117] "RemoveContainer" containerID="2f82040411e7bc747be2b87eb6e5088bbf37d421ada2ed5b04d58d6893b4644d" Mar 14 09:24:20 crc kubenswrapper[5129]: E0314 
09:24:20.929924 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f82040411e7bc747be2b87eb6e5088bbf37d421ada2ed5b04d58d6893b4644d\": container with ID starting with 2f82040411e7bc747be2b87eb6e5088bbf37d421ada2ed5b04d58d6893b4644d not found: ID does not exist" containerID="2f82040411e7bc747be2b87eb6e5088bbf37d421ada2ed5b04d58d6893b4644d" Mar 14 09:24:20 crc kubenswrapper[5129]: I0314 09:24:20.929954 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f82040411e7bc747be2b87eb6e5088bbf37d421ada2ed5b04d58d6893b4644d"} err="failed to get container status \"2f82040411e7bc747be2b87eb6e5088bbf37d421ada2ed5b04d58d6893b4644d\": rpc error: code = NotFound desc = could not find container \"2f82040411e7bc747be2b87eb6e5088bbf37d421ada2ed5b04d58d6893b4644d\": container with ID starting with 2f82040411e7bc747be2b87eb6e5088bbf37d421ada2ed5b04d58d6893b4644d not found: ID does not exist" Mar 14 09:24:21 crc kubenswrapper[5129]: I0314 09:24:21.841372 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c8t6g" podUID="039f2ab5-8a1d-451d-a6a0-3b227a11606a" containerName="registry-server" containerID="cri-o://c5a6f8d9c49db561e43b41cacf2d22629cc31dad77dbf006b75f3021c10cf3df" gracePeriod=2 Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.049589 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c1d6709-7a13-49f5-805a-a58cc35dc857" path="/var/lib/kubelet/pods/3c1d6709-7a13-49f5-805a-a58cc35dc857/volumes" Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.321453 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c8t6g" Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.514756 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94bqv\" (UniqueName: \"kubernetes.io/projected/039f2ab5-8a1d-451d-a6a0-3b227a11606a-kube-api-access-94bqv\") pod \"039f2ab5-8a1d-451d-a6a0-3b227a11606a\" (UID: \"039f2ab5-8a1d-451d-a6a0-3b227a11606a\") " Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.514844 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/039f2ab5-8a1d-451d-a6a0-3b227a11606a-utilities\") pod \"039f2ab5-8a1d-451d-a6a0-3b227a11606a\" (UID: \"039f2ab5-8a1d-451d-a6a0-3b227a11606a\") " Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.514909 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/039f2ab5-8a1d-451d-a6a0-3b227a11606a-catalog-content\") pod \"039f2ab5-8a1d-451d-a6a0-3b227a11606a\" (UID: \"039f2ab5-8a1d-451d-a6a0-3b227a11606a\") " Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.516060 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/039f2ab5-8a1d-451d-a6a0-3b227a11606a-utilities" (OuterVolumeSpecName: "utilities") pod "039f2ab5-8a1d-451d-a6a0-3b227a11606a" (UID: "039f2ab5-8a1d-451d-a6a0-3b227a11606a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.518515 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/039f2ab5-8a1d-451d-a6a0-3b227a11606a-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.524093 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/039f2ab5-8a1d-451d-a6a0-3b227a11606a-kube-api-access-94bqv" (OuterVolumeSpecName: "kube-api-access-94bqv") pod "039f2ab5-8a1d-451d-a6a0-3b227a11606a" (UID: "039f2ab5-8a1d-451d-a6a0-3b227a11606a"). InnerVolumeSpecName "kube-api-access-94bqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.546733 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/039f2ab5-8a1d-451d-a6a0-3b227a11606a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "039f2ab5-8a1d-451d-a6a0-3b227a11606a" (UID: "039f2ab5-8a1d-451d-a6a0-3b227a11606a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.620349 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94bqv\" (UniqueName: \"kubernetes.io/projected/039f2ab5-8a1d-451d-a6a0-3b227a11606a-kube-api-access-94bqv\") on node \"crc\" DevicePath \"\"" Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.620393 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/039f2ab5-8a1d-451d-a6a0-3b227a11606a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.853197 5129 generic.go:334] "Generic (PLEG): container finished" podID="039f2ab5-8a1d-451d-a6a0-3b227a11606a" containerID="c5a6f8d9c49db561e43b41cacf2d22629cc31dad77dbf006b75f3021c10cf3df" exitCode=0 Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.853242 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8t6g" event={"ID":"039f2ab5-8a1d-451d-a6a0-3b227a11606a","Type":"ContainerDied","Data":"c5a6f8d9c49db561e43b41cacf2d22629cc31dad77dbf006b75f3021c10cf3df"} Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.853268 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8t6g" event={"ID":"039f2ab5-8a1d-451d-a6a0-3b227a11606a","Type":"ContainerDied","Data":"5ce359c62c18b7d727eee2ebd6671231b93d8a11f6517f3b87fce3d889fe592c"} Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.853287 5129 scope.go:117] "RemoveContainer" containerID="c5a6f8d9c49db561e43b41cacf2d22629cc31dad77dbf006b75f3021c10cf3df" Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.853317 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c8t6g" Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.874549 5129 scope.go:117] "RemoveContainer" containerID="91f988f43295e86dbc5ff38ada9fd8a358b76b8c5d4b4078687e32d570d03d03" Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.901873 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8t6g"] Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.910071 5129 scope.go:117] "RemoveContainer" containerID="ddf3e204feb00ea45f4bcf89b61b2f5e4fab31d41f4cf0a97cc3fc47122f99b8" Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.913853 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8t6g"] Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.943058 5129 scope.go:117] "RemoveContainer" containerID="c5a6f8d9c49db561e43b41cacf2d22629cc31dad77dbf006b75f3021c10cf3df" Mar 14 09:24:22 crc kubenswrapper[5129]: E0314 09:24:22.943445 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5a6f8d9c49db561e43b41cacf2d22629cc31dad77dbf006b75f3021c10cf3df\": container with ID starting with c5a6f8d9c49db561e43b41cacf2d22629cc31dad77dbf006b75f3021c10cf3df not found: ID does not exist" containerID="c5a6f8d9c49db561e43b41cacf2d22629cc31dad77dbf006b75f3021c10cf3df" Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.943476 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5a6f8d9c49db561e43b41cacf2d22629cc31dad77dbf006b75f3021c10cf3df"} err="failed to get container status \"c5a6f8d9c49db561e43b41cacf2d22629cc31dad77dbf006b75f3021c10cf3df\": rpc error: code = NotFound desc = could not find container \"c5a6f8d9c49db561e43b41cacf2d22629cc31dad77dbf006b75f3021c10cf3df\": container with ID starting with c5a6f8d9c49db561e43b41cacf2d22629cc31dad77dbf006b75f3021c10cf3df not found: 
ID does not exist" Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.943495 5129 scope.go:117] "RemoveContainer" containerID="91f988f43295e86dbc5ff38ada9fd8a358b76b8c5d4b4078687e32d570d03d03" Mar 14 09:24:22 crc kubenswrapper[5129]: E0314 09:24:22.943722 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91f988f43295e86dbc5ff38ada9fd8a358b76b8c5d4b4078687e32d570d03d03\": container with ID starting with 91f988f43295e86dbc5ff38ada9fd8a358b76b8c5d4b4078687e32d570d03d03 not found: ID does not exist" containerID="91f988f43295e86dbc5ff38ada9fd8a358b76b8c5d4b4078687e32d570d03d03" Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.943776 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91f988f43295e86dbc5ff38ada9fd8a358b76b8c5d4b4078687e32d570d03d03"} err="failed to get container status \"91f988f43295e86dbc5ff38ada9fd8a358b76b8c5d4b4078687e32d570d03d03\": rpc error: code = NotFound desc = could not find container \"91f988f43295e86dbc5ff38ada9fd8a358b76b8c5d4b4078687e32d570d03d03\": container with ID starting with 91f988f43295e86dbc5ff38ada9fd8a358b76b8c5d4b4078687e32d570d03d03 not found: ID does not exist" Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.943796 5129 scope.go:117] "RemoveContainer" containerID="ddf3e204feb00ea45f4bcf89b61b2f5e4fab31d41f4cf0a97cc3fc47122f99b8" Mar 14 09:24:22 crc kubenswrapper[5129]: E0314 09:24:22.944072 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddf3e204feb00ea45f4bcf89b61b2f5e4fab31d41f4cf0a97cc3fc47122f99b8\": container with ID starting with ddf3e204feb00ea45f4bcf89b61b2f5e4fab31d41f4cf0a97cc3fc47122f99b8 not found: ID does not exist" containerID="ddf3e204feb00ea45f4bcf89b61b2f5e4fab31d41f4cf0a97cc3fc47122f99b8" Mar 14 09:24:22 crc kubenswrapper[5129]: I0314 09:24:22.944100 5129 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddf3e204feb00ea45f4bcf89b61b2f5e4fab31d41f4cf0a97cc3fc47122f99b8"} err="failed to get container status \"ddf3e204feb00ea45f4bcf89b61b2f5e4fab31d41f4cf0a97cc3fc47122f99b8\": rpc error: code = NotFound desc = could not find container \"ddf3e204feb00ea45f4bcf89b61b2f5e4fab31d41f4cf0a97cc3fc47122f99b8\": container with ID starting with ddf3e204feb00ea45f4bcf89b61b2f5e4fab31d41f4cf0a97cc3fc47122f99b8 not found: ID does not exist" Mar 14 09:24:24 crc kubenswrapper[5129]: I0314 09:24:24.049753 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="039f2ab5-8a1d-451d-a6a0-3b227a11606a" path="/var/lib/kubelet/pods/039f2ab5-8a1d-451d-a6a0-3b227a11606a/volumes" Mar 14 09:24:27 crc kubenswrapper[5129]: I0314 09:24:27.038749 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38" Mar 14 09:24:27 crc kubenswrapper[5129]: E0314 09:24:27.039784 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:24:35 crc kubenswrapper[5129]: I0314 09:24:35.217029 5129 scope.go:117] "RemoveContainer" containerID="07aa01ffc2a4a7a3465158906b6c3cad8e479ba5df9bd1bfdbd449986fdac777" Mar 14 09:24:41 crc kubenswrapper[5129]: I0314 09:24:41.036535 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38" Mar 14 09:24:41 crc kubenswrapper[5129]: E0314 09:24:41.037569 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:24:53 crc kubenswrapper[5129]: I0314 09:24:53.037022 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38" Mar 14 09:24:53 crc kubenswrapper[5129]: E0314 09:24:53.037790 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:24:58 crc kubenswrapper[5129]: I0314 09:24:58.208070 5129 generic.go:334] "Generic (PLEG): container finished" podID="5e53276d-c8cb-4fb1-aea6-d436e50e4490" containerID="36912edf9951921f214c48e6e4f0d7c24003bd25a6670e94c4942b96f46406ff" exitCode=0 Mar 14 09:24:58 crc kubenswrapper[5129]: I0314 09:24:58.208167 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" event={"ID":"5e53276d-c8cb-4fb1-aea6-d436e50e4490","Type":"ContainerDied","Data":"36912edf9951921f214c48e6e4f0d7c24003bd25a6670e94c4942b96f46406ff"} Mar 14 09:24:59 crc kubenswrapper[5129]: I0314 09:24:59.691083 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" Mar 14 09:24:59 crc kubenswrapper[5129]: I0314 09:24:59.757643 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e53276d-c8cb-4fb1-aea6-d436e50e4490-bootstrap-combined-ca-bundle\") pod \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\" (UID: \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\") " Mar 14 09:24:59 crc kubenswrapper[5129]: I0314 09:24:59.757718 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5e53276d-c8cb-4fb1-aea6-d436e50e4490-ssh-key-openstack-networker\") pod \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\" (UID: \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\") " Mar 14 09:24:59 crc kubenswrapper[5129]: I0314 09:24:59.757829 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e53276d-c8cb-4fb1-aea6-d436e50e4490-inventory\") pod \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\" (UID: \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\") " Mar 14 09:24:59 crc kubenswrapper[5129]: I0314 09:24:59.758465 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkzkx\" (UniqueName: \"kubernetes.io/projected/5e53276d-c8cb-4fb1-aea6-d436e50e4490-kube-api-access-dkzkx\") pod \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\" (UID: \"5e53276d-c8cb-4fb1-aea6-d436e50e4490\") " Mar 14 09:24:59 crc kubenswrapper[5129]: I0314 09:24:59.763977 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e53276d-c8cb-4fb1-aea6-d436e50e4490-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5e53276d-c8cb-4fb1-aea6-d436e50e4490" (UID: "5e53276d-c8cb-4fb1-aea6-d436e50e4490"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:24:59 crc kubenswrapper[5129]: I0314 09:24:59.767720 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e53276d-c8cb-4fb1-aea6-d436e50e4490-kube-api-access-dkzkx" (OuterVolumeSpecName: "kube-api-access-dkzkx") pod "5e53276d-c8cb-4fb1-aea6-d436e50e4490" (UID: "5e53276d-c8cb-4fb1-aea6-d436e50e4490"). InnerVolumeSpecName "kube-api-access-dkzkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:24:59 crc kubenswrapper[5129]: I0314 09:24:59.788064 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e53276d-c8cb-4fb1-aea6-d436e50e4490-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "5e53276d-c8cb-4fb1-aea6-d436e50e4490" (UID: "5e53276d-c8cb-4fb1-aea6-d436e50e4490"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:24:59 crc kubenswrapper[5129]: I0314 09:24:59.790579 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e53276d-c8cb-4fb1-aea6-d436e50e4490-inventory" (OuterVolumeSpecName: "inventory") pod "5e53276d-c8cb-4fb1-aea6-d436e50e4490" (UID: "5e53276d-c8cb-4fb1-aea6-d436e50e4490"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:24:59 crc kubenswrapper[5129]: I0314 09:24:59.861381 5129 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e53276d-c8cb-4fb1-aea6-d436e50e4490-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:24:59 crc kubenswrapper[5129]: I0314 09:24:59.861655 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5e53276d-c8cb-4fb1-aea6-d436e50e4490-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 14 09:24:59 crc kubenswrapper[5129]: I0314 09:24:59.861668 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e53276d-c8cb-4fb1-aea6-d436e50e4490-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:24:59 crc kubenswrapper[5129]: I0314 09:24:59.861677 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkzkx\" (UniqueName: \"kubernetes.io/projected/5e53276d-c8cb-4fb1-aea6-d436e50e4490-kube-api-access-dkzkx\") on node \"crc\" DevicePath \"\"" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.233912 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" event={"ID":"5e53276d-c8cb-4fb1-aea6-d436e50e4490","Type":"ContainerDied","Data":"1c63e62df4d8ab151e01684c03fe79ebbb6677a1c5febc44f052a1a7b3bb84f2"} Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.233954 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c63e62df4d8ab151e01684c03fe79ebbb6677a1c5febc44f052a1a7b3bb84f2" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.234001 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-6dnqp" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.374769 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-networker-wdpqk"] Mar 14 09:25:00 crc kubenswrapper[5129]: E0314 09:25:00.375202 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c1d6709-7a13-49f5-805a-a58cc35dc857" containerName="extract-content" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.375220 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c1d6709-7a13-49f5-805a-a58cc35dc857" containerName="extract-content" Mar 14 09:25:00 crc kubenswrapper[5129]: E0314 09:25:00.375239 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c1d6709-7a13-49f5-805a-a58cc35dc857" containerName="registry-server" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.375282 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c1d6709-7a13-49f5-805a-a58cc35dc857" containerName="registry-server" Mar 14 09:25:00 crc kubenswrapper[5129]: E0314 09:25:00.375305 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e53276d-c8cb-4fb1-aea6-d436e50e4490" containerName="bootstrap-openstack-openstack-networker" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.375311 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e53276d-c8cb-4fb1-aea6-d436e50e4490" containerName="bootstrap-openstack-openstack-networker" Mar 14 09:25:00 crc kubenswrapper[5129]: E0314 09:25:00.375324 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="039f2ab5-8a1d-451d-a6a0-3b227a11606a" containerName="extract-utilities" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.375330 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="039f2ab5-8a1d-451d-a6a0-3b227a11606a" containerName="extract-utilities" Mar 14 09:25:00 crc kubenswrapper[5129]: E0314 09:25:00.375353 5129 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3c1d6709-7a13-49f5-805a-a58cc35dc857" containerName="extract-utilities" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.375359 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c1d6709-7a13-49f5-805a-a58cc35dc857" containerName="extract-utilities" Mar 14 09:25:00 crc kubenswrapper[5129]: E0314 09:25:00.375376 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="039f2ab5-8a1d-451d-a6a0-3b227a11606a" containerName="extract-content" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.375382 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="039f2ab5-8a1d-451d-a6a0-3b227a11606a" containerName="extract-content" Mar 14 09:25:00 crc kubenswrapper[5129]: E0314 09:25:00.375392 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="039f2ab5-8a1d-451d-a6a0-3b227a11606a" containerName="registry-server" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.375398 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="039f2ab5-8a1d-451d-a6a0-3b227a11606a" containerName="registry-server" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.375570 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="039f2ab5-8a1d-451d-a6a0-3b227a11606a" containerName="registry-server" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.375591 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c1d6709-7a13-49f5-805a-a58cc35dc857" containerName="registry-server" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.375622 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e53276d-c8cb-4fb1-aea6-d436e50e4490" containerName="bootstrap-openstack-openstack-networker" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.376411 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-wdpqk" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.380246 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-q642x" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.380628 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.387841 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-networker-wdpqk"] Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.471399 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/f314e854-c21b-4a43-aa81-87d2ff072c30-ssh-key-openstack-networker\") pod \"download-cache-openstack-openstack-networker-wdpqk\" (UID: \"f314e854-c21b-4a43-aa81-87d2ff072c30\") " pod="openstack/download-cache-openstack-openstack-networker-wdpqk" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.471494 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f314e854-c21b-4a43-aa81-87d2ff072c30-inventory\") pod \"download-cache-openstack-openstack-networker-wdpqk\" (UID: \"f314e854-c21b-4a43-aa81-87d2ff072c30\") " pod="openstack/download-cache-openstack-openstack-networker-wdpqk" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.471532 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vhkg\" (UniqueName: \"kubernetes.io/projected/f314e854-c21b-4a43-aa81-87d2ff072c30-kube-api-access-8vhkg\") pod \"download-cache-openstack-openstack-networker-wdpqk\" (UID: \"f314e854-c21b-4a43-aa81-87d2ff072c30\") " 
pod="openstack/download-cache-openstack-openstack-networker-wdpqk" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.573305 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f314e854-c21b-4a43-aa81-87d2ff072c30-inventory\") pod \"download-cache-openstack-openstack-networker-wdpqk\" (UID: \"f314e854-c21b-4a43-aa81-87d2ff072c30\") " pod="openstack/download-cache-openstack-openstack-networker-wdpqk" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.573382 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vhkg\" (UniqueName: \"kubernetes.io/projected/f314e854-c21b-4a43-aa81-87d2ff072c30-kube-api-access-8vhkg\") pod \"download-cache-openstack-openstack-networker-wdpqk\" (UID: \"f314e854-c21b-4a43-aa81-87d2ff072c30\") " pod="openstack/download-cache-openstack-openstack-networker-wdpqk" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.573493 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/f314e854-c21b-4a43-aa81-87d2ff072c30-ssh-key-openstack-networker\") pod \"download-cache-openstack-openstack-networker-wdpqk\" (UID: \"f314e854-c21b-4a43-aa81-87d2ff072c30\") " pod="openstack/download-cache-openstack-openstack-networker-wdpqk" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.578449 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f314e854-c21b-4a43-aa81-87d2ff072c30-inventory\") pod \"download-cache-openstack-openstack-networker-wdpqk\" (UID: \"f314e854-c21b-4a43-aa81-87d2ff072c30\") " pod="openstack/download-cache-openstack-openstack-networker-wdpqk" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.579197 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: 
\"kubernetes.io/secret/f314e854-c21b-4a43-aa81-87d2ff072c30-ssh-key-openstack-networker\") pod \"download-cache-openstack-openstack-networker-wdpqk\" (UID: \"f314e854-c21b-4a43-aa81-87d2ff072c30\") " pod="openstack/download-cache-openstack-openstack-networker-wdpqk" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.589681 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vhkg\" (UniqueName: \"kubernetes.io/projected/f314e854-c21b-4a43-aa81-87d2ff072c30-kube-api-access-8vhkg\") pod \"download-cache-openstack-openstack-networker-wdpqk\" (UID: \"f314e854-c21b-4a43-aa81-87d2ff072c30\") " pod="openstack/download-cache-openstack-openstack-networker-wdpqk" Mar 14 09:25:00 crc kubenswrapper[5129]: I0314 09:25:00.748837 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-wdpqk" Mar 14 09:25:01 crc kubenswrapper[5129]: I0314 09:25:01.302694 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-networker-wdpqk"] Mar 14 09:25:02 crc kubenswrapper[5129]: I0314 09:25:02.251261 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-wdpqk" event={"ID":"f314e854-c21b-4a43-aa81-87d2ff072c30","Type":"ContainerStarted","Data":"97696973264de1bc30b2f9dc3530e4e251e553a6edfbab5341211117bb5101f0"} Mar 14 09:25:02 crc kubenswrapper[5129]: I0314 09:25:02.251535 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-wdpqk" event={"ID":"f314e854-c21b-4a43-aa81-87d2ff072c30","Type":"ContainerStarted","Data":"41d8c31e9075bd9d88c255cee7bab21c6cdf5bf7e3d3dc660f83c45067514557"} Mar 14 09:25:02 crc kubenswrapper[5129]: I0314 09:25:02.281303 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-networker-wdpqk" 
podStartSLOduration=1.791815459 podStartE2EDuration="2.281276325s" podCreationTimestamp="2026-03-14 09:25:00 +0000 UTC" firstStartedPulling="2026-03-14 09:25:01.318528001 +0000 UTC m=+8764.070443185" lastFinishedPulling="2026-03-14 09:25:01.807988867 +0000 UTC m=+8764.559904051" observedRunningTime="2026-03-14 09:25:02.27218576 +0000 UTC m=+8765.024100944" watchObservedRunningTime="2026-03-14 09:25:02.281276325 +0000 UTC m=+8765.033191509" Mar 14 09:25:05 crc kubenswrapper[5129]: I0314 09:25:05.036649 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38" Mar 14 09:25:05 crc kubenswrapper[5129]: E0314 09:25:05.037685 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:25:18 crc kubenswrapper[5129]: I0314 09:25:18.042495 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38" Mar 14 09:25:18 crc kubenswrapper[5129]: E0314 09:25:18.043332 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:25:31 crc kubenswrapper[5129]: I0314 09:25:31.035868 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38" Mar 14 09:25:31 crc 
kubenswrapper[5129]: E0314 09:25:31.036678 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:25:40 crc kubenswrapper[5129]: I0314 09:25:40.600537 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zkwv5"] Mar 14 09:25:40 crc kubenswrapper[5129]: I0314 09:25:40.605797 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zkwv5" Mar 14 09:25:40 crc kubenswrapper[5129]: I0314 09:25:40.678420 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zkwv5"] Mar 14 09:25:40 crc kubenswrapper[5129]: I0314 09:25:40.746402 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c10d4bb-6686-4df1-9598-70b030d96a39-utilities\") pod \"redhat-operators-zkwv5\" (UID: \"7c10d4bb-6686-4df1-9598-70b030d96a39\") " pod="openshift-marketplace/redhat-operators-zkwv5" Mar 14 09:25:40 crc kubenswrapper[5129]: I0314 09:25:40.746658 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhd7j\" (UniqueName: \"kubernetes.io/projected/7c10d4bb-6686-4df1-9598-70b030d96a39-kube-api-access-dhd7j\") pod \"redhat-operators-zkwv5\" (UID: \"7c10d4bb-6686-4df1-9598-70b030d96a39\") " pod="openshift-marketplace/redhat-operators-zkwv5" Mar 14 09:25:40 crc kubenswrapper[5129]: I0314 09:25:40.747507 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c10d4bb-6686-4df1-9598-70b030d96a39-catalog-content\") pod \"redhat-operators-zkwv5\" (UID: \"7c10d4bb-6686-4df1-9598-70b030d96a39\") " pod="openshift-marketplace/redhat-operators-zkwv5" Mar 14 09:25:40 crc kubenswrapper[5129]: I0314 09:25:40.849569 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhd7j\" (UniqueName: \"kubernetes.io/projected/7c10d4bb-6686-4df1-9598-70b030d96a39-kube-api-access-dhd7j\") pod \"redhat-operators-zkwv5\" (UID: \"7c10d4bb-6686-4df1-9598-70b030d96a39\") " pod="openshift-marketplace/redhat-operators-zkwv5" Mar 14 09:25:40 crc kubenswrapper[5129]: I0314 09:25:40.849663 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c10d4bb-6686-4df1-9598-70b030d96a39-catalog-content\") pod \"redhat-operators-zkwv5\" (UID: \"7c10d4bb-6686-4df1-9598-70b030d96a39\") " pod="openshift-marketplace/redhat-operators-zkwv5" Mar 14 09:25:40 crc kubenswrapper[5129]: I0314 09:25:40.849726 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c10d4bb-6686-4df1-9598-70b030d96a39-utilities\") pod \"redhat-operators-zkwv5\" (UID: \"7c10d4bb-6686-4df1-9598-70b030d96a39\") " pod="openshift-marketplace/redhat-operators-zkwv5" Mar 14 09:25:40 crc kubenswrapper[5129]: I0314 09:25:40.850309 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c10d4bb-6686-4df1-9598-70b030d96a39-utilities\") pod \"redhat-operators-zkwv5\" (UID: \"7c10d4bb-6686-4df1-9598-70b030d96a39\") " pod="openshift-marketplace/redhat-operators-zkwv5" Mar 14 09:25:40 crc kubenswrapper[5129]: I0314 09:25:40.850919 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7c10d4bb-6686-4df1-9598-70b030d96a39-catalog-content\") pod \"redhat-operators-zkwv5\" (UID: \"7c10d4bb-6686-4df1-9598-70b030d96a39\") " pod="openshift-marketplace/redhat-operators-zkwv5" Mar 14 09:25:40 crc kubenswrapper[5129]: I0314 09:25:40.878481 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhd7j\" (UniqueName: \"kubernetes.io/projected/7c10d4bb-6686-4df1-9598-70b030d96a39-kube-api-access-dhd7j\") pod \"redhat-operators-zkwv5\" (UID: \"7c10d4bb-6686-4df1-9598-70b030d96a39\") " pod="openshift-marketplace/redhat-operators-zkwv5" Mar 14 09:25:40 crc kubenswrapper[5129]: I0314 09:25:40.931650 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zkwv5" Mar 14 09:25:41 crc kubenswrapper[5129]: I0314 09:25:41.465497 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zkwv5"] Mar 14 09:25:41 crc kubenswrapper[5129]: I0314 09:25:41.621311 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkwv5" event={"ID":"7c10d4bb-6686-4df1-9598-70b030d96a39","Type":"ContainerStarted","Data":"9a48e98bd0aaf353b3c9c0eb9efdfe0ed6f8669a0a1905c57e93685ddff265f5"} Mar 14 09:25:42 crc kubenswrapper[5129]: I0314 09:25:42.642518 5129 generic.go:334] "Generic (PLEG): container finished" podID="7c10d4bb-6686-4df1-9598-70b030d96a39" containerID="9eaa65bc515fa5a0ba4e3eed538cdb452cc617881afc315f8db005656b73f117" exitCode=0 Mar 14 09:25:42 crc kubenswrapper[5129]: I0314 09:25:42.642913 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkwv5" event={"ID":"7c10d4bb-6686-4df1-9598-70b030d96a39","Type":"ContainerDied","Data":"9eaa65bc515fa5a0ba4e3eed538cdb452cc617881afc315f8db005656b73f117"} Mar 14 09:25:44 crc kubenswrapper[5129]: I0314 09:25:44.037707 5129 scope.go:117] "RemoveContainer" 
containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38" Mar 14 09:25:44 crc kubenswrapper[5129]: E0314 09:25:44.039102 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:25:44 crc kubenswrapper[5129]: I0314 09:25:44.667234 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkwv5" event={"ID":"7c10d4bb-6686-4df1-9598-70b030d96a39","Type":"ContainerStarted","Data":"297fc2c094de3bd86cd4d23462876e61289876f25fe7401b16e9df9222f16b79"} Mar 14 09:25:49 crc kubenswrapper[5129]: I0314 09:25:49.721853 5129 generic.go:334] "Generic (PLEG): container finished" podID="7c10d4bb-6686-4df1-9598-70b030d96a39" containerID="297fc2c094de3bd86cd4d23462876e61289876f25fe7401b16e9df9222f16b79" exitCode=0 Mar 14 09:25:49 crc kubenswrapper[5129]: I0314 09:25:49.721897 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkwv5" event={"ID":"7c10d4bb-6686-4df1-9598-70b030d96a39","Type":"ContainerDied","Data":"297fc2c094de3bd86cd4d23462876e61289876f25fe7401b16e9df9222f16b79"} Mar 14 09:25:51 crc kubenswrapper[5129]: I0314 09:25:51.743436 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkwv5" event={"ID":"7c10d4bb-6686-4df1-9598-70b030d96a39","Type":"ContainerStarted","Data":"da0710ffb3d13c9a80066a22a7459c247c6aa223a002a7ae9fa089f4e9193230"} Mar 14 09:25:51 crc kubenswrapper[5129]: I0314 09:25:51.777305 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zkwv5" 
podStartSLOduration=3.804001018 podStartE2EDuration="11.777286527s" podCreationTimestamp="2026-03-14 09:25:40 +0000 UTC" firstStartedPulling="2026-03-14 09:25:42.648210421 +0000 UTC m=+8805.400125605" lastFinishedPulling="2026-03-14 09:25:50.62149594 +0000 UTC m=+8813.373411114" observedRunningTime="2026-03-14 09:25:51.765330994 +0000 UTC m=+8814.517246178" watchObservedRunningTime="2026-03-14 09:25:51.777286527 +0000 UTC m=+8814.529201711" Mar 14 09:25:57 crc kubenswrapper[5129]: I0314 09:25:57.037051 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38" Mar 14 09:25:57 crc kubenswrapper[5129]: I0314 09:25:57.802918 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"716bfedf95059e7409f096a660a732a3c112f45bc7a6d09224dd5044410d7300"} Mar 14 09:26:00 crc kubenswrapper[5129]: I0314 09:26:00.159530 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558006-7k6zz"] Mar 14 09:26:00 crc kubenswrapper[5129]: I0314 09:26:00.161713 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558006-7k6zz" Mar 14 09:26:00 crc kubenswrapper[5129]: I0314 09:26:00.164253 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:26:00 crc kubenswrapper[5129]: I0314 09:26:00.164492 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:26:00 crc kubenswrapper[5129]: I0314 09:26:00.165745 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:26:00 crc kubenswrapper[5129]: I0314 09:26:00.167580 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558006-7k6zz"] Mar 14 09:26:00 crc kubenswrapper[5129]: I0314 09:26:00.311013 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f554s\" (UniqueName: \"kubernetes.io/projected/0b7285f3-a213-4658-9762-55b0d8ed9836-kube-api-access-f554s\") pod \"auto-csr-approver-29558006-7k6zz\" (UID: \"0b7285f3-a213-4658-9762-55b0d8ed9836\") " pod="openshift-infra/auto-csr-approver-29558006-7k6zz" Mar 14 09:26:00 crc kubenswrapper[5129]: I0314 09:26:00.412935 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f554s\" (UniqueName: \"kubernetes.io/projected/0b7285f3-a213-4658-9762-55b0d8ed9836-kube-api-access-f554s\") pod \"auto-csr-approver-29558006-7k6zz\" (UID: \"0b7285f3-a213-4658-9762-55b0d8ed9836\") " pod="openshift-infra/auto-csr-approver-29558006-7k6zz" Mar 14 09:26:00 crc kubenswrapper[5129]: I0314 09:26:00.433621 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f554s\" (UniqueName: \"kubernetes.io/projected/0b7285f3-a213-4658-9762-55b0d8ed9836-kube-api-access-f554s\") pod \"auto-csr-approver-29558006-7k6zz\" (UID: \"0b7285f3-a213-4658-9762-55b0d8ed9836\") " 
pod="openshift-infra/auto-csr-approver-29558006-7k6zz" Mar 14 09:26:00 crc kubenswrapper[5129]: I0314 09:26:00.486128 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558006-7k6zz" Mar 14 09:26:00 crc kubenswrapper[5129]: I0314 09:26:00.932357 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zkwv5" Mar 14 09:26:00 crc kubenswrapper[5129]: I0314 09:26:00.932728 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zkwv5" Mar 14 09:26:00 crc kubenswrapper[5129]: I0314 09:26:00.980975 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zkwv5" Mar 14 09:26:01 crc kubenswrapper[5129]: I0314 09:26:01.019899 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558006-7k6zz"] Mar 14 09:26:01 crc kubenswrapper[5129]: W0314 09:26:01.025850 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b7285f3_a213_4658_9762_55b0d8ed9836.slice/crio-fa883cb3782be3fc8ead00f5f149fe8531ec7c56ca6e35d560b6848cb95d6265 WatchSource:0}: Error finding container fa883cb3782be3fc8ead00f5f149fe8531ec7c56ca6e35d560b6848cb95d6265: Status 404 returned error can't find the container with id fa883cb3782be3fc8ead00f5f149fe8531ec7c56ca6e35d560b6848cb95d6265 Mar 14 09:26:01 crc kubenswrapper[5129]: I0314 09:26:01.837507 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558006-7k6zz" event={"ID":"0b7285f3-a213-4658-9762-55b0d8ed9836","Type":"ContainerStarted","Data":"fa883cb3782be3fc8ead00f5f149fe8531ec7c56ca6e35d560b6848cb95d6265"} Mar 14 09:26:01 crc kubenswrapper[5129]: I0314 09:26:01.890180 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-zkwv5" Mar 14 09:26:01 crc kubenswrapper[5129]: I0314 09:26:01.947054 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zkwv5"] Mar 14 09:26:02 crc kubenswrapper[5129]: I0314 09:26:02.848540 5129 generic.go:334] "Generic (PLEG): container finished" podID="0b7285f3-a213-4658-9762-55b0d8ed9836" containerID="125c18eb5bf85f6759ee74d38b3afc05e6594ea239dc40ff7d7a0a02266a6393" exitCode=0 Mar 14 09:26:02 crc kubenswrapper[5129]: I0314 09:26:02.848624 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558006-7k6zz" event={"ID":"0b7285f3-a213-4658-9762-55b0d8ed9836","Type":"ContainerDied","Data":"125c18eb5bf85f6759ee74d38b3afc05e6594ea239dc40ff7d7a0a02266a6393"} Mar 14 09:26:03 crc kubenswrapper[5129]: I0314 09:26:03.860103 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zkwv5" podUID="7c10d4bb-6686-4df1-9598-70b030d96a39" containerName="registry-server" containerID="cri-o://da0710ffb3d13c9a80066a22a7459c247c6aa223a002a7ae9fa089f4e9193230" gracePeriod=2 Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.377877 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558006-7k6zz" Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.387730 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zkwv5" Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.500940 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c10d4bb-6686-4df1-9598-70b030d96a39-utilities\") pod \"7c10d4bb-6686-4df1-9598-70b030d96a39\" (UID: \"7c10d4bb-6686-4df1-9598-70b030d96a39\") " Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.501393 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c10d4bb-6686-4df1-9598-70b030d96a39-catalog-content\") pod \"7c10d4bb-6686-4df1-9598-70b030d96a39\" (UID: \"7c10d4bb-6686-4df1-9598-70b030d96a39\") " Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.501431 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f554s\" (UniqueName: \"kubernetes.io/projected/0b7285f3-a213-4658-9762-55b0d8ed9836-kube-api-access-f554s\") pod \"0b7285f3-a213-4658-9762-55b0d8ed9836\" (UID: \"0b7285f3-a213-4658-9762-55b0d8ed9836\") " Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.501552 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhd7j\" (UniqueName: \"kubernetes.io/projected/7c10d4bb-6686-4df1-9598-70b030d96a39-kube-api-access-dhd7j\") pod \"7c10d4bb-6686-4df1-9598-70b030d96a39\" (UID: \"7c10d4bb-6686-4df1-9598-70b030d96a39\") " Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.504052 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c10d4bb-6686-4df1-9598-70b030d96a39-utilities" (OuterVolumeSpecName: "utilities") pod "7c10d4bb-6686-4df1-9598-70b030d96a39" (UID: "7c10d4bb-6686-4df1-9598-70b030d96a39"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.508915 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b7285f3-a213-4658-9762-55b0d8ed9836-kube-api-access-f554s" (OuterVolumeSpecName: "kube-api-access-f554s") pod "0b7285f3-a213-4658-9762-55b0d8ed9836" (UID: "0b7285f3-a213-4658-9762-55b0d8ed9836"). InnerVolumeSpecName "kube-api-access-f554s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.509710 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c10d4bb-6686-4df1-9598-70b030d96a39-kube-api-access-dhd7j" (OuterVolumeSpecName: "kube-api-access-dhd7j") pod "7c10d4bb-6686-4df1-9598-70b030d96a39" (UID: "7c10d4bb-6686-4df1-9598-70b030d96a39"). InnerVolumeSpecName "kube-api-access-dhd7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.603923 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c10d4bb-6686-4df1-9598-70b030d96a39-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.603958 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f554s\" (UniqueName: \"kubernetes.io/projected/0b7285f3-a213-4658-9762-55b0d8ed9836-kube-api-access-f554s\") on node \"crc\" DevicePath \"\"" Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.603972 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhd7j\" (UniqueName: \"kubernetes.io/projected/7c10d4bb-6686-4df1-9598-70b030d96a39-kube-api-access-dhd7j\") on node \"crc\" DevicePath \"\"" Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.663074 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7c10d4bb-6686-4df1-9598-70b030d96a39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c10d4bb-6686-4df1-9598-70b030d96a39" (UID: "7c10d4bb-6686-4df1-9598-70b030d96a39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.707752 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c10d4bb-6686-4df1-9598-70b030d96a39-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.874409 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558006-7k6zz" event={"ID":"0b7285f3-a213-4658-9762-55b0d8ed9836","Type":"ContainerDied","Data":"fa883cb3782be3fc8ead00f5f149fe8531ec7c56ca6e35d560b6848cb95d6265"} Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.874503 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa883cb3782be3fc8ead00f5f149fe8531ec7c56ca6e35d560b6848cb95d6265" Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.874454 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558006-7k6zz" Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.877703 5129 generic.go:334] "Generic (PLEG): container finished" podID="7c10d4bb-6686-4df1-9598-70b030d96a39" containerID="da0710ffb3d13c9a80066a22a7459c247c6aa223a002a7ae9fa089f4e9193230" exitCode=0 Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.877750 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkwv5" event={"ID":"7c10d4bb-6686-4df1-9598-70b030d96a39","Type":"ContainerDied","Data":"da0710ffb3d13c9a80066a22a7459c247c6aa223a002a7ae9fa089f4e9193230"} Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.877781 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkwv5" event={"ID":"7c10d4bb-6686-4df1-9598-70b030d96a39","Type":"ContainerDied","Data":"9a48e98bd0aaf353b3c9c0eb9efdfe0ed6f8669a0a1905c57e93685ddff265f5"} Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.877803 5129 scope.go:117] "RemoveContainer" containerID="da0710ffb3d13c9a80066a22a7459c247c6aa223a002a7ae9fa089f4e9193230" Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.878195 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zkwv5" Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.917730 5129 scope.go:117] "RemoveContainer" containerID="297fc2c094de3bd86cd4d23462876e61289876f25fe7401b16e9df9222f16b79" Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.935710 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zkwv5"] Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.941777 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zkwv5"] Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.958092 5129 scope.go:117] "RemoveContainer" containerID="9eaa65bc515fa5a0ba4e3eed538cdb452cc617881afc315f8db005656b73f117" Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.987957 5129 scope.go:117] "RemoveContainer" containerID="da0710ffb3d13c9a80066a22a7459c247c6aa223a002a7ae9fa089f4e9193230" Mar 14 09:26:04 crc kubenswrapper[5129]: E0314 09:26:04.996285 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da0710ffb3d13c9a80066a22a7459c247c6aa223a002a7ae9fa089f4e9193230\": container with ID starting with da0710ffb3d13c9a80066a22a7459c247c6aa223a002a7ae9fa089f4e9193230 not found: ID does not exist" containerID="da0710ffb3d13c9a80066a22a7459c247c6aa223a002a7ae9fa089f4e9193230" Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.996336 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0710ffb3d13c9a80066a22a7459c247c6aa223a002a7ae9fa089f4e9193230"} err="failed to get container status \"da0710ffb3d13c9a80066a22a7459c247c6aa223a002a7ae9fa089f4e9193230\": rpc error: code = NotFound desc = could not find container \"da0710ffb3d13c9a80066a22a7459c247c6aa223a002a7ae9fa089f4e9193230\": container with ID starting with da0710ffb3d13c9a80066a22a7459c247c6aa223a002a7ae9fa089f4e9193230 not found: ID does 
not exist" Mar 14 09:26:04 crc kubenswrapper[5129]: I0314 09:26:04.996363 5129 scope.go:117] "RemoveContainer" containerID="297fc2c094de3bd86cd4d23462876e61289876f25fe7401b16e9df9222f16b79" Mar 14 09:26:05 crc kubenswrapper[5129]: E0314 09:26:05.000227 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"297fc2c094de3bd86cd4d23462876e61289876f25fe7401b16e9df9222f16b79\": container with ID starting with 297fc2c094de3bd86cd4d23462876e61289876f25fe7401b16e9df9222f16b79 not found: ID does not exist" containerID="297fc2c094de3bd86cd4d23462876e61289876f25fe7401b16e9df9222f16b79" Mar 14 09:26:05 crc kubenswrapper[5129]: I0314 09:26:05.000293 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"297fc2c094de3bd86cd4d23462876e61289876f25fe7401b16e9df9222f16b79"} err="failed to get container status \"297fc2c094de3bd86cd4d23462876e61289876f25fe7401b16e9df9222f16b79\": rpc error: code = NotFound desc = could not find container \"297fc2c094de3bd86cd4d23462876e61289876f25fe7401b16e9df9222f16b79\": container with ID starting with 297fc2c094de3bd86cd4d23462876e61289876f25fe7401b16e9df9222f16b79 not found: ID does not exist" Mar 14 09:26:05 crc kubenswrapper[5129]: I0314 09:26:05.000335 5129 scope.go:117] "RemoveContainer" containerID="9eaa65bc515fa5a0ba4e3eed538cdb452cc617881afc315f8db005656b73f117" Mar 14 09:26:05 crc kubenswrapper[5129]: E0314 09:26:05.001363 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eaa65bc515fa5a0ba4e3eed538cdb452cc617881afc315f8db005656b73f117\": container with ID starting with 9eaa65bc515fa5a0ba4e3eed538cdb452cc617881afc315f8db005656b73f117 not found: ID does not exist" containerID="9eaa65bc515fa5a0ba4e3eed538cdb452cc617881afc315f8db005656b73f117" Mar 14 09:26:05 crc kubenswrapper[5129]: I0314 09:26:05.001405 5129 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eaa65bc515fa5a0ba4e3eed538cdb452cc617881afc315f8db005656b73f117"} err="failed to get container status \"9eaa65bc515fa5a0ba4e3eed538cdb452cc617881afc315f8db005656b73f117\": rpc error: code = NotFound desc = could not find container \"9eaa65bc515fa5a0ba4e3eed538cdb452cc617881afc315f8db005656b73f117\": container with ID starting with 9eaa65bc515fa5a0ba4e3eed538cdb452cc617881afc315f8db005656b73f117 not found: ID does not exist" Mar 14 09:26:05 crc kubenswrapper[5129]: I0314 09:26:05.471100 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558000-l224c"] Mar 14 09:26:05 crc kubenswrapper[5129]: I0314 09:26:05.493496 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558000-l224c"] Mar 14 09:26:06 crc kubenswrapper[5129]: I0314 09:26:06.050033 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c10d4bb-6686-4df1-9598-70b030d96a39" path="/var/lib/kubelet/pods/7c10d4bb-6686-4df1-9598-70b030d96a39/volumes" Mar 14 09:26:06 crc kubenswrapper[5129]: I0314 09:26:06.051376 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e" path="/var/lib/kubelet/pods/8a3a36f8-1090-4d4f-a6da-0f7ac1ddce4e/volumes" Mar 14 09:26:10 crc kubenswrapper[5129]: I0314 09:26:10.947578 5129 generic.go:334] "Generic (PLEG): container finished" podID="f314e854-c21b-4a43-aa81-87d2ff072c30" containerID="97696973264de1bc30b2f9dc3530e4e251e553a6edfbab5341211117bb5101f0" exitCode=0 Mar 14 09:26:10 crc kubenswrapper[5129]: I0314 09:26:10.947685 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-wdpqk" event={"ID":"f314e854-c21b-4a43-aa81-87d2ff072c30","Type":"ContainerDied","Data":"97696973264de1bc30b2f9dc3530e4e251e553a6edfbab5341211117bb5101f0"} Mar 14 09:26:12 crc kubenswrapper[5129]: I0314 09:26:12.516165 
5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-wdpqk" Mar 14 09:26:12 crc kubenswrapper[5129]: I0314 09:26:12.604691 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f314e854-c21b-4a43-aa81-87d2ff072c30-inventory\") pod \"f314e854-c21b-4a43-aa81-87d2ff072c30\" (UID: \"f314e854-c21b-4a43-aa81-87d2ff072c30\") " Mar 14 09:26:12 crc kubenswrapper[5129]: I0314 09:26:12.604926 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vhkg\" (UniqueName: \"kubernetes.io/projected/f314e854-c21b-4a43-aa81-87d2ff072c30-kube-api-access-8vhkg\") pod \"f314e854-c21b-4a43-aa81-87d2ff072c30\" (UID: \"f314e854-c21b-4a43-aa81-87d2ff072c30\") " Mar 14 09:26:12 crc kubenswrapper[5129]: I0314 09:26:12.605038 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/f314e854-c21b-4a43-aa81-87d2ff072c30-ssh-key-openstack-networker\") pod \"f314e854-c21b-4a43-aa81-87d2ff072c30\" (UID: \"f314e854-c21b-4a43-aa81-87d2ff072c30\") " Mar 14 09:26:12 crc kubenswrapper[5129]: I0314 09:26:12.613391 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f314e854-c21b-4a43-aa81-87d2ff072c30-kube-api-access-8vhkg" (OuterVolumeSpecName: "kube-api-access-8vhkg") pod "f314e854-c21b-4a43-aa81-87d2ff072c30" (UID: "f314e854-c21b-4a43-aa81-87d2ff072c30"). InnerVolumeSpecName "kube-api-access-8vhkg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:26:12 crc kubenswrapper[5129]: I0314 09:26:12.638333 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f314e854-c21b-4a43-aa81-87d2ff072c30-inventory" (OuterVolumeSpecName: "inventory") pod "f314e854-c21b-4a43-aa81-87d2ff072c30" (UID: "f314e854-c21b-4a43-aa81-87d2ff072c30"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:26:12 crc kubenswrapper[5129]: I0314 09:26:12.640224 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f314e854-c21b-4a43-aa81-87d2ff072c30-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "f314e854-c21b-4a43-aa81-87d2ff072c30" (UID: "f314e854-c21b-4a43-aa81-87d2ff072c30"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:26:12 crc kubenswrapper[5129]: I0314 09:26:12.710210 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f314e854-c21b-4a43-aa81-87d2ff072c30-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:26:12 crc kubenswrapper[5129]: I0314 09:26:12.710317 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vhkg\" (UniqueName: \"kubernetes.io/projected/f314e854-c21b-4a43-aa81-87d2ff072c30-kube-api-access-8vhkg\") on node \"crc\" DevicePath \"\"" Mar 14 09:26:12 crc kubenswrapper[5129]: I0314 09:26:12.710352 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/f314e854-c21b-4a43-aa81-87d2ff072c30-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 14 09:26:12 crc kubenswrapper[5129]: I0314 09:26:12.978907 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-wdpqk" 
event={"ID":"f314e854-c21b-4a43-aa81-87d2ff072c30","Type":"ContainerDied","Data":"41d8c31e9075bd9d88c255cee7bab21c6cdf5bf7e3d3dc660f83c45067514557"} Mar 14 09:26:12 crc kubenswrapper[5129]: I0314 09:26:12.978953 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41d8c31e9075bd9d88c255cee7bab21c6cdf5bf7e3d3dc660f83c45067514557" Mar 14 09:26:12 crc kubenswrapper[5129]: I0314 09:26:12.979391 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-wdpqk" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.184153 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-networker-64ksz"] Mar 14 09:26:13 crc kubenswrapper[5129]: E0314 09:26:13.184570 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7285f3-a213-4658-9762-55b0d8ed9836" containerName="oc" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.184585 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7285f3-a213-4658-9762-55b0d8ed9836" containerName="oc" Mar 14 09:26:13 crc kubenswrapper[5129]: E0314 09:26:13.184637 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c10d4bb-6686-4df1-9598-70b030d96a39" containerName="extract-content" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.184644 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c10d4bb-6686-4df1-9598-70b030d96a39" containerName="extract-content" Mar 14 09:26:13 crc kubenswrapper[5129]: E0314 09:26:13.184657 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f314e854-c21b-4a43-aa81-87d2ff072c30" containerName="download-cache-openstack-openstack-networker" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.184667 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f314e854-c21b-4a43-aa81-87d2ff072c30" containerName="download-cache-openstack-openstack-networker" Mar 14 
09:26:13 crc kubenswrapper[5129]: E0314 09:26:13.184698 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c10d4bb-6686-4df1-9598-70b030d96a39" containerName="extract-utilities" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.184705 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c10d4bb-6686-4df1-9598-70b030d96a39" containerName="extract-utilities" Mar 14 09:26:13 crc kubenswrapper[5129]: E0314 09:26:13.184723 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c10d4bb-6686-4df1-9598-70b030d96a39" containerName="registry-server" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.184729 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c10d4bb-6686-4df1-9598-70b030d96a39" containerName="registry-server" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.184908 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f314e854-c21b-4a43-aa81-87d2ff072c30" containerName="download-cache-openstack-openstack-networker" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.184929 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b7285f3-a213-4658-9762-55b0d8ed9836" containerName="oc" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.184949 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c10d4bb-6686-4df1-9598-70b030d96a39" containerName="registry-server" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.185612 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-64ksz" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.192620 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-q642x" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.195127 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.205830 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-networker-64ksz"] Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.330319 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40be015e-e0ec-4758-ab71-d20347e005c2-inventory\") pod \"configure-network-openstack-openstack-networker-64ksz\" (UID: \"40be015e-e0ec-4758-ab71-d20347e005c2\") " pod="openstack/configure-network-openstack-openstack-networker-64ksz" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.330668 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6fms\" (UniqueName: \"kubernetes.io/projected/40be015e-e0ec-4758-ab71-d20347e005c2-kube-api-access-n6fms\") pod \"configure-network-openstack-openstack-networker-64ksz\" (UID: \"40be015e-e0ec-4758-ab71-d20347e005c2\") " pod="openstack/configure-network-openstack-openstack-networker-64ksz" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.330795 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/40be015e-e0ec-4758-ab71-d20347e005c2-ssh-key-openstack-networker\") pod \"configure-network-openstack-openstack-networker-64ksz\" (UID: \"40be015e-e0ec-4758-ab71-d20347e005c2\") " 
pod="openstack/configure-network-openstack-openstack-networker-64ksz" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.433301 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40be015e-e0ec-4758-ab71-d20347e005c2-inventory\") pod \"configure-network-openstack-openstack-networker-64ksz\" (UID: \"40be015e-e0ec-4758-ab71-d20347e005c2\") " pod="openstack/configure-network-openstack-openstack-networker-64ksz" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.433463 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6fms\" (UniqueName: \"kubernetes.io/projected/40be015e-e0ec-4758-ab71-d20347e005c2-kube-api-access-n6fms\") pod \"configure-network-openstack-openstack-networker-64ksz\" (UID: \"40be015e-e0ec-4758-ab71-d20347e005c2\") " pod="openstack/configure-network-openstack-openstack-networker-64ksz" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.433528 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/40be015e-e0ec-4758-ab71-d20347e005c2-ssh-key-openstack-networker\") pod \"configure-network-openstack-openstack-networker-64ksz\" (UID: \"40be015e-e0ec-4758-ab71-d20347e005c2\") " pod="openstack/configure-network-openstack-openstack-networker-64ksz" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.728190 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40be015e-e0ec-4758-ab71-d20347e005c2-inventory\") pod \"configure-network-openstack-openstack-networker-64ksz\" (UID: \"40be015e-e0ec-4758-ab71-d20347e005c2\") " pod="openstack/configure-network-openstack-openstack-networker-64ksz" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.728192 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" 
(UniqueName: \"kubernetes.io/secret/40be015e-e0ec-4758-ab71-d20347e005c2-ssh-key-openstack-networker\") pod \"configure-network-openstack-openstack-networker-64ksz\" (UID: \"40be015e-e0ec-4758-ab71-d20347e005c2\") " pod="openstack/configure-network-openstack-openstack-networker-64ksz" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.729823 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6fms\" (UniqueName: \"kubernetes.io/projected/40be015e-e0ec-4758-ab71-d20347e005c2-kube-api-access-n6fms\") pod \"configure-network-openstack-openstack-networker-64ksz\" (UID: \"40be015e-e0ec-4758-ab71-d20347e005c2\") " pod="openstack/configure-network-openstack-openstack-networker-64ksz" Mar 14 09:26:13 crc kubenswrapper[5129]: I0314 09:26:13.804554 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-64ksz" Mar 14 09:26:14 crc kubenswrapper[5129]: I0314 09:26:14.386263 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-networker-64ksz"] Mar 14 09:26:15 crc kubenswrapper[5129]: I0314 09:26:15.006073 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-64ksz" event={"ID":"40be015e-e0ec-4758-ab71-d20347e005c2","Type":"ContainerStarted","Data":"7eb3799fe1aed098ef05e246d94ed31f302227972b4b571a4041b231e316b779"} Mar 14 09:26:16 crc kubenswrapper[5129]: I0314 09:26:16.015996 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-64ksz" event={"ID":"40be015e-e0ec-4758-ab71-d20347e005c2","Type":"ContainerStarted","Data":"db7380742a49ca8e0e968fbe7cb0604966fb5c9594f0d14aa0eea6cd6fc3538a"} Mar 14 09:26:16 crc kubenswrapper[5129]: I0314 09:26:16.046075 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/configure-network-openstack-openstack-networker-64ksz" podStartSLOduration=2.5831307949999998 podStartE2EDuration="3.046052873s" podCreationTimestamp="2026-03-14 09:26:13 +0000 UTC" firstStartedPulling="2026-03-14 09:26:14.39332934 +0000 UTC m=+8837.145244524" lastFinishedPulling="2026-03-14 09:26:14.856251408 +0000 UTC m=+8837.608166602" observedRunningTime="2026-03-14 09:26:16.045625082 +0000 UTC m=+8838.797540276" watchObservedRunningTime="2026-03-14 09:26:16.046052873 +0000 UTC m=+8838.797968057" Mar 14 09:26:30 crc kubenswrapper[5129]: I0314 09:26:30.209590 5129 generic.go:334] "Generic (PLEG): container finished" podID="123dc78e-9c5e-4d0b-9b8a-16d2217f9f72" containerID="c353313245dc77d8d23ce36ccfbb3ea06859c440d7f6ca85c65873cb77dc03ee" exitCode=0 Mar 14 09:26:30 crc kubenswrapper[5129]: I0314 09:26:30.209667 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-2vl8w" event={"ID":"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72","Type":"ContainerDied","Data":"c353313245dc77d8d23ce36ccfbb3ea06859c440d7f6ca85c65873cb77dc03ee"} Mar 14 09:26:31 crc kubenswrapper[5129]: I0314 09:26:31.727986 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-2vl8w" Mar 14 09:26:31 crc kubenswrapper[5129]: I0314 09:26:31.906156 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/123dc78e-9c5e-4d0b-9b8a-16d2217f9f72-inventory\") pod \"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72\" (UID: \"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72\") " Mar 14 09:26:31 crc kubenswrapper[5129]: I0314 09:26:31.906220 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbwvk\" (UniqueName: \"kubernetes.io/projected/123dc78e-9c5e-4d0b-9b8a-16d2217f9f72-kube-api-access-bbwvk\") pod \"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72\" (UID: \"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72\") " Mar 14 09:26:31 crc kubenswrapper[5129]: I0314 09:26:31.906261 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/123dc78e-9c5e-4d0b-9b8a-16d2217f9f72-ssh-key-openstack-cell1\") pod \"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72\" (UID: \"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72\") " Mar 14 09:26:31 crc kubenswrapper[5129]: I0314 09:26:31.914708 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/123dc78e-9c5e-4d0b-9b8a-16d2217f9f72-kube-api-access-bbwvk" (OuterVolumeSpecName: "kube-api-access-bbwvk") pod "123dc78e-9c5e-4d0b-9b8a-16d2217f9f72" (UID: "123dc78e-9c5e-4d0b-9b8a-16d2217f9f72"). InnerVolumeSpecName "kube-api-access-bbwvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:26:31 crc kubenswrapper[5129]: I0314 09:26:31.941004 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/123dc78e-9c5e-4d0b-9b8a-16d2217f9f72-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "123dc78e-9c5e-4d0b-9b8a-16d2217f9f72" (UID: "123dc78e-9c5e-4d0b-9b8a-16d2217f9f72"). 
InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:26:31 crc kubenswrapper[5129]: I0314 09:26:31.960301 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/123dc78e-9c5e-4d0b-9b8a-16d2217f9f72-inventory" (OuterVolumeSpecName: "inventory") pod "123dc78e-9c5e-4d0b-9b8a-16d2217f9f72" (UID: "123dc78e-9c5e-4d0b-9b8a-16d2217f9f72"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.009138 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/123dc78e-9c5e-4d0b-9b8a-16d2217f9f72-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.009175 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbwvk\" (UniqueName: \"kubernetes.io/projected/123dc78e-9c5e-4d0b-9b8a-16d2217f9f72-kube-api-access-bbwvk\") on node \"crc\" DevicePath \"\"" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.009186 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/123dc78e-9c5e-4d0b-9b8a-16d2217f9f72-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.232091 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-2vl8w" event={"ID":"123dc78e-9c5e-4d0b-9b8a-16d2217f9f72","Type":"ContainerDied","Data":"83e66d72eadc7f0ab1fd8532a09c2c308e09da67b1dd52ae34a35d4e884f6949"} Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.232137 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83e66d72eadc7f0ab1fd8532a09c2c308e09da67b1dd52ae34a35d4e884f6949" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.232204 5129 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-2vl8w" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.362720 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-m6xpr"] Mar 14 09:26:32 crc kubenswrapper[5129]: E0314 09:26:32.363124 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="123dc78e-9c5e-4d0b-9b8a-16d2217f9f72" containerName="download-cache-openstack-openstack-cell1" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.363142 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="123dc78e-9c5e-4d0b-9b8a-16d2217f9f72" containerName="download-cache-openstack-openstack-cell1" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.363374 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="123dc78e-9c5e-4d0b-9b8a-16d2217f9f72" containerName="download-cache-openstack-openstack-cell1" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.364114 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-m6xpr" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.366630 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.366869 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.377482 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-m6xpr"] Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.520636 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-m6xpr\" (UID: \"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3\") " pod="openstack/configure-network-openstack-openstack-cell1-m6xpr" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.520846 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c88b\" (UniqueName: \"kubernetes.io/projected/1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3-kube-api-access-9c88b\") pod \"configure-network-openstack-openstack-cell1-m6xpr\" (UID: \"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3\") " pod="openstack/configure-network-openstack-openstack-cell1-m6xpr" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.521087 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3-inventory\") pod \"configure-network-openstack-openstack-cell1-m6xpr\" (UID: \"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3\") " pod="openstack/configure-network-openstack-openstack-cell1-m6xpr" Mar 
14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.628647 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c88b\" (UniqueName: \"kubernetes.io/projected/1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3-kube-api-access-9c88b\") pod \"configure-network-openstack-openstack-cell1-m6xpr\" (UID: \"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3\") " pod="openstack/configure-network-openstack-openstack-cell1-m6xpr" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.629083 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3-inventory\") pod \"configure-network-openstack-openstack-cell1-m6xpr\" (UID: \"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3\") " pod="openstack/configure-network-openstack-openstack-cell1-m6xpr" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.637400 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-m6xpr\" (UID: \"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3\") " pod="openstack/configure-network-openstack-openstack-cell1-m6xpr" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.641816 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3-inventory\") pod \"configure-network-openstack-openstack-cell1-m6xpr\" (UID: \"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3\") " pod="openstack/configure-network-openstack-openstack-cell1-m6xpr" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.646293 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3-ssh-key-openstack-cell1\") pod 
\"configure-network-openstack-openstack-cell1-m6xpr\" (UID: \"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3\") " pod="openstack/configure-network-openstack-openstack-cell1-m6xpr" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.652552 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c88b\" (UniqueName: \"kubernetes.io/projected/1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3-kube-api-access-9c88b\") pod \"configure-network-openstack-openstack-cell1-m6xpr\" (UID: \"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3\") " pod="openstack/configure-network-openstack-openstack-cell1-m6xpr" Mar 14 09:26:32 crc kubenswrapper[5129]: I0314 09:26:32.697937 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-m6xpr" Mar 14 09:26:33 crc kubenswrapper[5129]: I0314 09:26:33.261216 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-m6xpr"] Mar 14 09:26:34 crc kubenswrapper[5129]: I0314 09:26:34.255055 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-m6xpr" event={"ID":"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3","Type":"ContainerStarted","Data":"2fdb4bd53a4e06746ef03b22b74d531e824c03c147591556cb85d25bbbdc7ad9"} Mar 14 09:26:34 crc kubenswrapper[5129]: I0314 09:26:34.255790 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-m6xpr" event={"ID":"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3","Type":"ContainerStarted","Data":"6476bb5037b29b259435020f6e7df2335df68dbfffd3f59312461aa757c11442"} Mar 14 09:26:34 crc kubenswrapper[5129]: I0314 09:26:34.284242 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-m6xpr" podStartSLOduration=1.858132079 podStartE2EDuration="2.284213744s" podCreationTimestamp="2026-03-14 09:26:32 +0000 UTC" 
firstStartedPulling="2026-03-14 09:26:33.278463558 +0000 UTC m=+8856.030378752" lastFinishedPulling="2026-03-14 09:26:33.704545233 +0000 UTC m=+8856.456460417" observedRunningTime="2026-03-14 09:26:34.273089113 +0000 UTC m=+8857.025004297" watchObservedRunningTime="2026-03-14 09:26:34.284213744 +0000 UTC m=+8857.036128928" Mar 14 09:26:35 crc kubenswrapper[5129]: I0314 09:26:35.353763 5129 scope.go:117] "RemoveContainer" containerID="297c0cca3206d165cfc2322105d61b6b22834585b20c5f8d7f098f4db1e4ea42" Mar 14 09:27:20 crc kubenswrapper[5129]: I0314 09:27:20.805187 5129 generic.go:334] "Generic (PLEG): container finished" podID="40be015e-e0ec-4758-ab71-d20347e005c2" containerID="db7380742a49ca8e0e968fbe7cb0604966fb5c9594f0d14aa0eea6cd6fc3538a" exitCode=0 Mar 14 09:27:20 crc kubenswrapper[5129]: I0314 09:27:20.805272 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-64ksz" event={"ID":"40be015e-e0ec-4758-ab71-d20347e005c2","Type":"ContainerDied","Data":"db7380742a49ca8e0e968fbe7cb0604966fb5c9594f0d14aa0eea6cd6fc3538a"} Mar 14 09:27:23 crc kubenswrapper[5129]: I0314 09:27:23.328207 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-64ksz" Mar 14 09:27:23 crc kubenswrapper[5129]: I0314 09:27:23.503661 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/40be015e-e0ec-4758-ab71-d20347e005c2-ssh-key-openstack-networker\") pod \"40be015e-e0ec-4758-ab71-d20347e005c2\" (UID: \"40be015e-e0ec-4758-ab71-d20347e005c2\") " Mar 14 09:27:23 crc kubenswrapper[5129]: I0314 09:27:23.504381 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6fms\" (UniqueName: \"kubernetes.io/projected/40be015e-e0ec-4758-ab71-d20347e005c2-kube-api-access-n6fms\") pod \"40be015e-e0ec-4758-ab71-d20347e005c2\" (UID: \"40be015e-e0ec-4758-ab71-d20347e005c2\") " Mar 14 09:27:23 crc kubenswrapper[5129]: I0314 09:27:23.504526 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40be015e-e0ec-4758-ab71-d20347e005c2-inventory\") pod \"40be015e-e0ec-4758-ab71-d20347e005c2\" (UID: \"40be015e-e0ec-4758-ab71-d20347e005c2\") " Mar 14 09:27:23 crc kubenswrapper[5129]: I0314 09:27:23.526215 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40be015e-e0ec-4758-ab71-d20347e005c2-kube-api-access-n6fms" (OuterVolumeSpecName: "kube-api-access-n6fms") pod "40be015e-e0ec-4758-ab71-d20347e005c2" (UID: "40be015e-e0ec-4758-ab71-d20347e005c2"). InnerVolumeSpecName "kube-api-access-n6fms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:27:23 crc kubenswrapper[5129]: I0314 09:27:23.542794 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40be015e-e0ec-4758-ab71-d20347e005c2-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "40be015e-e0ec-4758-ab71-d20347e005c2" (UID: "40be015e-e0ec-4758-ab71-d20347e005c2"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:27:23 crc kubenswrapper[5129]: I0314 09:27:23.570668 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40be015e-e0ec-4758-ab71-d20347e005c2-inventory" (OuterVolumeSpecName: "inventory") pod "40be015e-e0ec-4758-ab71-d20347e005c2" (UID: "40be015e-e0ec-4758-ab71-d20347e005c2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:27:23 crc kubenswrapper[5129]: I0314 09:27:23.608369 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/40be015e-e0ec-4758-ab71-d20347e005c2-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 14 09:27:23 crc kubenswrapper[5129]: I0314 09:27:23.608399 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6fms\" (UniqueName: \"kubernetes.io/projected/40be015e-e0ec-4758-ab71-d20347e005c2-kube-api-access-n6fms\") on node \"crc\" DevicePath \"\"" Mar 14 09:27:23 crc kubenswrapper[5129]: I0314 09:27:23.608410 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40be015e-e0ec-4758-ab71-d20347e005c2-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:27:23 crc kubenswrapper[5129]: I0314 09:27:23.776072 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-64ksz" 
event={"ID":"40be015e-e0ec-4758-ab71-d20347e005c2","Type":"ContainerDied","Data":"7eb3799fe1aed098ef05e246d94ed31f302227972b4b571a4041b231e316b779"} Mar 14 09:27:23 crc kubenswrapper[5129]: I0314 09:27:23.776125 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7eb3799fe1aed098ef05e246d94ed31f302227972b4b571a4041b231e316b779" Mar 14 09:27:23 crc kubenswrapper[5129]: I0314 09:27:23.776195 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-64ksz" Mar 14 09:27:24 crc kubenswrapper[5129]: I0314 09:27:24.518444 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-networker-x4xn2"] Mar 14 09:27:24 crc kubenswrapper[5129]: E0314 09:27:24.519668 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40be015e-e0ec-4758-ab71-d20347e005c2" containerName="configure-network-openstack-openstack-networker" Mar 14 09:27:24 crc kubenswrapper[5129]: I0314 09:27:24.519684 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="40be015e-e0ec-4758-ab71-d20347e005c2" containerName="configure-network-openstack-openstack-networker" Mar 14 09:27:24 crc kubenswrapper[5129]: I0314 09:27:24.519883 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="40be015e-e0ec-4758-ab71-d20347e005c2" containerName="configure-network-openstack-openstack-networker" Mar 14 09:27:24 crc kubenswrapper[5129]: I0314 09:27:24.520626 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-x4xn2" Mar 14 09:27:24 crc kubenswrapper[5129]: I0314 09:27:24.525486 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 14 09:27:24 crc kubenswrapper[5129]: I0314 09:27:24.528908 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-q642x" Mar 14 09:27:24 crc kubenswrapper[5129]: I0314 09:27:24.554622 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-networker-x4xn2"] Mar 14 09:27:24 crc kubenswrapper[5129]: I0314 09:27:24.653452 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zgfb\" (UniqueName: \"kubernetes.io/projected/5f757610-030c-48d6-ae59-dfe99c5edb1a-kube-api-access-5zgfb\") pod \"validate-network-openstack-openstack-networker-x4xn2\" (UID: \"5f757610-030c-48d6-ae59-dfe99c5edb1a\") " pod="openstack/validate-network-openstack-openstack-networker-x4xn2" Mar 14 09:27:24 crc kubenswrapper[5129]: I0314 09:27:24.653529 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5f757610-030c-48d6-ae59-dfe99c5edb1a-ssh-key-openstack-networker\") pod \"validate-network-openstack-openstack-networker-x4xn2\" (UID: \"5f757610-030c-48d6-ae59-dfe99c5edb1a\") " pod="openstack/validate-network-openstack-openstack-networker-x4xn2" Mar 14 09:27:24 crc kubenswrapper[5129]: I0314 09:27:24.653560 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f757610-030c-48d6-ae59-dfe99c5edb1a-inventory\") pod \"validate-network-openstack-openstack-networker-x4xn2\" (UID: \"5f757610-030c-48d6-ae59-dfe99c5edb1a\") " 
pod="openstack/validate-network-openstack-openstack-networker-x4xn2" Mar 14 09:27:24 crc kubenswrapper[5129]: I0314 09:27:24.755169 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zgfb\" (UniqueName: \"kubernetes.io/projected/5f757610-030c-48d6-ae59-dfe99c5edb1a-kube-api-access-5zgfb\") pod \"validate-network-openstack-openstack-networker-x4xn2\" (UID: \"5f757610-030c-48d6-ae59-dfe99c5edb1a\") " pod="openstack/validate-network-openstack-openstack-networker-x4xn2" Mar 14 09:27:24 crc kubenswrapper[5129]: I0314 09:27:24.755230 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5f757610-030c-48d6-ae59-dfe99c5edb1a-ssh-key-openstack-networker\") pod \"validate-network-openstack-openstack-networker-x4xn2\" (UID: \"5f757610-030c-48d6-ae59-dfe99c5edb1a\") " pod="openstack/validate-network-openstack-openstack-networker-x4xn2" Mar 14 09:27:24 crc kubenswrapper[5129]: I0314 09:27:24.755257 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f757610-030c-48d6-ae59-dfe99c5edb1a-inventory\") pod \"validate-network-openstack-openstack-networker-x4xn2\" (UID: \"5f757610-030c-48d6-ae59-dfe99c5edb1a\") " pod="openstack/validate-network-openstack-openstack-networker-x4xn2" Mar 14 09:27:24 crc kubenswrapper[5129]: I0314 09:27:24.761742 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5f757610-030c-48d6-ae59-dfe99c5edb1a-ssh-key-openstack-networker\") pod \"validate-network-openstack-openstack-networker-x4xn2\" (UID: \"5f757610-030c-48d6-ae59-dfe99c5edb1a\") " pod="openstack/validate-network-openstack-openstack-networker-x4xn2" Mar 14 09:27:24 crc kubenswrapper[5129]: I0314 09:27:24.764730 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/5f757610-030c-48d6-ae59-dfe99c5edb1a-inventory\") pod \"validate-network-openstack-openstack-networker-x4xn2\" (UID: \"5f757610-030c-48d6-ae59-dfe99c5edb1a\") " pod="openstack/validate-network-openstack-openstack-networker-x4xn2" Mar 14 09:27:24 crc kubenswrapper[5129]: I0314 09:27:24.775639 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zgfb\" (UniqueName: \"kubernetes.io/projected/5f757610-030c-48d6-ae59-dfe99c5edb1a-kube-api-access-5zgfb\") pod \"validate-network-openstack-openstack-networker-x4xn2\" (UID: \"5f757610-030c-48d6-ae59-dfe99c5edb1a\") " pod="openstack/validate-network-openstack-openstack-networker-x4xn2" Mar 14 09:27:24 crc kubenswrapper[5129]: I0314 09:27:24.873516 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-x4xn2" Mar 14 09:27:25 crc kubenswrapper[5129]: I0314 09:27:25.485873 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-networker-x4xn2"] Mar 14 09:27:25 crc kubenswrapper[5129]: I0314 09:27:25.803909 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-x4xn2" event={"ID":"5f757610-030c-48d6-ae59-dfe99c5edb1a","Type":"ContainerStarted","Data":"8676a7a354a0121be2c4c8b0f281858b81a67d7e22edb523796c93dd0839c4b5"} Mar 14 09:27:26 crc kubenswrapper[5129]: I0314 09:27:26.816785 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-x4xn2" event={"ID":"5f757610-030c-48d6-ae59-dfe99c5edb1a","Type":"ContainerStarted","Data":"70dcc7ec8d484d802bc279dfdc8a486ae5ac6143b294b9c0fb9dca3536c19b75"} Mar 14 09:27:26 crc kubenswrapper[5129]: I0314 09:27:26.842877 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/validate-network-openstack-openstack-networker-x4xn2" podStartSLOduration=2.37773838 podStartE2EDuration="2.842855539s" podCreationTimestamp="2026-03-14 09:27:24 +0000 UTC" firstStartedPulling="2026-03-14 09:27:25.495439228 +0000 UTC m=+8908.247354412" lastFinishedPulling="2026-03-14 09:27:25.960556387 +0000 UTC m=+8908.712471571" observedRunningTime="2026-03-14 09:27:26.832789007 +0000 UTC m=+8909.584704191" watchObservedRunningTime="2026-03-14 09:27:26.842855539 +0000 UTC m=+8909.594770723" Mar 14 09:27:31 crc kubenswrapper[5129]: I0314 09:27:31.870967 5129 generic.go:334] "Generic (PLEG): container finished" podID="5f757610-030c-48d6-ae59-dfe99c5edb1a" containerID="70dcc7ec8d484d802bc279dfdc8a486ae5ac6143b294b9c0fb9dca3536c19b75" exitCode=0 Mar 14 09:27:31 crc kubenswrapper[5129]: I0314 09:27:31.871035 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-x4xn2" event={"ID":"5f757610-030c-48d6-ae59-dfe99c5edb1a","Type":"ContainerDied","Data":"70dcc7ec8d484d802bc279dfdc8a486ae5ac6143b294b9c0fb9dca3536c19b75"} Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.538525 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-x4xn2" Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.588812 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zgfb\" (UniqueName: \"kubernetes.io/projected/5f757610-030c-48d6-ae59-dfe99c5edb1a-kube-api-access-5zgfb\") pod \"5f757610-030c-48d6-ae59-dfe99c5edb1a\" (UID: \"5f757610-030c-48d6-ae59-dfe99c5edb1a\") " Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.588922 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f757610-030c-48d6-ae59-dfe99c5edb1a-inventory\") pod \"5f757610-030c-48d6-ae59-dfe99c5edb1a\" (UID: \"5f757610-030c-48d6-ae59-dfe99c5edb1a\") " Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.588974 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5f757610-030c-48d6-ae59-dfe99c5edb1a-ssh-key-openstack-networker\") pod \"5f757610-030c-48d6-ae59-dfe99c5edb1a\" (UID: \"5f757610-030c-48d6-ae59-dfe99c5edb1a\") " Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.598040 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f757610-030c-48d6-ae59-dfe99c5edb1a-kube-api-access-5zgfb" (OuterVolumeSpecName: "kube-api-access-5zgfb") pod "5f757610-030c-48d6-ae59-dfe99c5edb1a" (UID: "5f757610-030c-48d6-ae59-dfe99c5edb1a"). InnerVolumeSpecName "kube-api-access-5zgfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.621953 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f757610-030c-48d6-ae59-dfe99c5edb1a-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "5f757610-030c-48d6-ae59-dfe99c5edb1a" (UID: "5f757610-030c-48d6-ae59-dfe99c5edb1a"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.624106 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f757610-030c-48d6-ae59-dfe99c5edb1a-inventory" (OuterVolumeSpecName: "inventory") pod "5f757610-030c-48d6-ae59-dfe99c5edb1a" (UID: "5f757610-030c-48d6-ae59-dfe99c5edb1a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.690486 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f757610-030c-48d6-ae59-dfe99c5edb1a-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.690525 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5f757610-030c-48d6-ae59-dfe99c5edb1a-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.690544 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zgfb\" (UniqueName: \"kubernetes.io/projected/5f757610-030c-48d6-ae59-dfe99c5edb1a-kube-api-access-5zgfb\") on node \"crc\" DevicePath \"\"" Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.892859 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-x4xn2" 
event={"ID":"5f757610-030c-48d6-ae59-dfe99c5edb1a","Type":"ContainerDied","Data":"8676a7a354a0121be2c4c8b0f281858b81a67d7e22edb523796c93dd0839c4b5"} Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.892920 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8676a7a354a0121be2c4c8b0f281858b81a67d7e22edb523796c93dd0839c4b5" Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.892958 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-x4xn2" Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.988782 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-networker-m2ssc"] Mar 14 09:27:33 crc kubenswrapper[5129]: E0314 09:27:33.989404 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f757610-030c-48d6-ae59-dfe99c5edb1a" containerName="validate-network-openstack-openstack-networker" Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.989428 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f757610-030c-48d6-ae59-dfe99c5edb1a" containerName="validate-network-openstack-openstack-networker" Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.989708 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f757610-030c-48d6-ae59-dfe99c5edb1a" containerName="validate-network-openstack-openstack-networker" Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.990866 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-m2ssc" Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.998954 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 14 09:27:33 crc kubenswrapper[5129]: I0314 09:27:33.999079 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-q642x" Mar 14 09:27:34 crc kubenswrapper[5129]: I0314 09:27:34.019901 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-networker-m2ssc"] Mar 14 09:27:34 crc kubenswrapper[5129]: I0314 09:27:34.102536 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pkz7\" (UniqueName: \"kubernetes.io/projected/9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512-kube-api-access-5pkz7\") pod \"install-os-openstack-openstack-networker-m2ssc\" (UID: \"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512\") " pod="openstack/install-os-openstack-openstack-networker-m2ssc" Mar 14 09:27:34 crc kubenswrapper[5129]: I0314 09:27:34.102686 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512-inventory\") pod \"install-os-openstack-openstack-networker-m2ssc\" (UID: \"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512\") " pod="openstack/install-os-openstack-openstack-networker-m2ssc" Mar 14 09:27:34 crc kubenswrapper[5129]: I0314 09:27:34.102835 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512-ssh-key-openstack-networker\") pod \"install-os-openstack-openstack-networker-m2ssc\" (UID: \"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512\") " pod="openstack/install-os-openstack-openstack-networker-m2ssc" Mar 14 
09:27:34 crc kubenswrapper[5129]: I0314 09:27:34.205882 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pkz7\" (UniqueName: \"kubernetes.io/projected/9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512-kube-api-access-5pkz7\") pod \"install-os-openstack-openstack-networker-m2ssc\" (UID: \"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512\") " pod="openstack/install-os-openstack-openstack-networker-m2ssc" Mar 14 09:27:34 crc kubenswrapper[5129]: I0314 09:27:34.205964 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512-inventory\") pod \"install-os-openstack-openstack-networker-m2ssc\" (UID: \"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512\") " pod="openstack/install-os-openstack-openstack-networker-m2ssc" Mar 14 09:27:34 crc kubenswrapper[5129]: I0314 09:27:34.206040 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512-ssh-key-openstack-networker\") pod \"install-os-openstack-openstack-networker-m2ssc\" (UID: \"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512\") " pod="openstack/install-os-openstack-openstack-networker-m2ssc" Mar 14 09:27:34 crc kubenswrapper[5129]: I0314 09:27:34.210672 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512-inventory\") pod \"install-os-openstack-openstack-networker-m2ssc\" (UID: \"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512\") " pod="openstack/install-os-openstack-openstack-networker-m2ssc" Mar 14 09:27:34 crc kubenswrapper[5129]: I0314 09:27:34.211303 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512-ssh-key-openstack-networker\") pod 
\"install-os-openstack-openstack-networker-m2ssc\" (UID: \"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512\") " pod="openstack/install-os-openstack-openstack-networker-m2ssc" Mar 14 09:27:34 crc kubenswrapper[5129]: I0314 09:27:34.232215 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pkz7\" (UniqueName: \"kubernetes.io/projected/9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512-kube-api-access-5pkz7\") pod \"install-os-openstack-openstack-networker-m2ssc\" (UID: \"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512\") " pod="openstack/install-os-openstack-openstack-networker-m2ssc" Mar 14 09:27:34 crc kubenswrapper[5129]: I0314 09:27:34.333127 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-m2ssc" Mar 14 09:27:34 crc kubenswrapper[5129]: I0314 09:27:34.891538 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-networker-m2ssc"] Mar 14 09:27:35 crc kubenswrapper[5129]: I0314 09:27:35.919433 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-m2ssc" event={"ID":"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512","Type":"ContainerStarted","Data":"f900ba52f1f331e072d240825504b6b169455744bf4a6646eee1f9d2bb6eddd7"} Mar 14 09:27:35 crc kubenswrapper[5129]: I0314 09:27:35.919808 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-m2ssc" event={"ID":"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512","Type":"ContainerStarted","Data":"68c41db51d6d44d881b03c981e8bacf18c0dc7c14396fe10dd5b4f2a178e9235"} Mar 14 09:27:35 crc kubenswrapper[5129]: I0314 09:27:35.940272 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-networker-m2ssc" podStartSLOduration=2.514572216 podStartE2EDuration="2.940251539s" podCreationTimestamp="2026-03-14 09:27:33 +0000 UTC" firstStartedPulling="2026-03-14 
09:27:34.901657907 +0000 UTC m=+8917.653573091" lastFinishedPulling="2026-03-14 09:27:35.32733723 +0000 UTC m=+8918.079252414" observedRunningTime="2026-03-14 09:27:35.936657741 +0000 UTC m=+8918.688572915" watchObservedRunningTime="2026-03-14 09:27:35.940251539 +0000 UTC m=+8918.692166723" Mar 14 09:27:39 crc kubenswrapper[5129]: I0314 09:27:39.952581 5129 generic.go:334] "Generic (PLEG): container finished" podID="1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3" containerID="2fdb4bd53a4e06746ef03b22b74d531e824c03c147591556cb85d25bbbdc7ad9" exitCode=0 Mar 14 09:27:39 crc kubenswrapper[5129]: I0314 09:27:39.952670 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-m6xpr" event={"ID":"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3","Type":"ContainerDied","Data":"2fdb4bd53a4e06746ef03b22b74d531e824c03c147591556cb85d25bbbdc7ad9"} Mar 14 09:27:41 crc kubenswrapper[5129]: I0314 09:27:41.426926 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-m6xpr" Mar 14 09:27:41 crc kubenswrapper[5129]: I0314 09:27:41.455283 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3-ssh-key-openstack-cell1\") pod \"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3\" (UID: \"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3\") " Mar 14 09:27:41 crc kubenswrapper[5129]: I0314 09:27:41.455348 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c88b\" (UniqueName: \"kubernetes.io/projected/1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3-kube-api-access-9c88b\") pod \"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3\" (UID: \"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3\") " Mar 14 09:27:41 crc kubenswrapper[5129]: I0314 09:27:41.455439 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3-inventory\") pod \"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3\" (UID: \"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3\") " Mar 14 09:27:41 crc kubenswrapper[5129]: I0314 09:27:41.461070 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3-kube-api-access-9c88b" (OuterVolumeSpecName: "kube-api-access-9c88b") pod "1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3" (UID: "1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3"). InnerVolumeSpecName "kube-api-access-9c88b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:27:41 crc kubenswrapper[5129]: I0314 09:27:41.490714 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3" (UID: "1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:27:41 crc kubenswrapper[5129]: I0314 09:27:41.491688 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3-inventory" (OuterVolumeSpecName: "inventory") pod "1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3" (UID: "1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:27:41 crc kubenswrapper[5129]: I0314 09:27:41.558260 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c88b\" (UniqueName: \"kubernetes.io/projected/1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3-kube-api-access-9c88b\") on node \"crc\" DevicePath \"\"" Mar 14 09:27:41 crc kubenswrapper[5129]: I0314 09:27:41.558299 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:27:41 crc kubenswrapper[5129]: I0314 09:27:41.558311 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:27:41 crc kubenswrapper[5129]: I0314 09:27:41.975016 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-m6xpr" event={"ID":"1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3","Type":"ContainerDied","Data":"6476bb5037b29b259435020f6e7df2335df68dbfffd3f59312461aa757c11442"} Mar 14 09:27:41 crc kubenswrapper[5129]: I0314 09:27:41.975310 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6476bb5037b29b259435020f6e7df2335df68dbfffd3f59312461aa757c11442" Mar 14 09:27:41 crc kubenswrapper[5129]: I0314 09:27:41.975124 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-m6xpr" Mar 14 09:27:42 crc kubenswrapper[5129]: I0314 09:27:42.169306 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-zrt44"] Mar 14 09:27:42 crc kubenswrapper[5129]: E0314 09:27:42.169739 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3" containerName="configure-network-openstack-openstack-cell1" Mar 14 09:27:42 crc kubenswrapper[5129]: I0314 09:27:42.169754 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3" containerName="configure-network-openstack-openstack-cell1" Mar 14 09:27:42 crc kubenswrapper[5129]: I0314 09:27:42.169973 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3" containerName="configure-network-openstack-openstack-cell1" Mar 14 09:27:42 crc kubenswrapper[5129]: I0314 09:27:42.170671 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-zrt44" Mar 14 09:27:42 crc kubenswrapper[5129]: I0314 09:27:42.173514 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 14 09:27:42 crc kubenswrapper[5129]: I0314 09:27:42.174127 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc" Mar 14 09:27:42 crc kubenswrapper[5129]: I0314 09:27:42.196461 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-zrt44"] Mar 14 09:27:42 crc kubenswrapper[5129]: I0314 09:27:42.385437 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62bc5cd6-1f77-4220-ba45-f5376a5b0dec-inventory\") pod \"validate-network-openstack-openstack-cell1-zrt44\" (UID: \"62bc5cd6-1f77-4220-ba45-f5376a5b0dec\") " pod="openstack/validate-network-openstack-openstack-cell1-zrt44" Mar 14 09:27:42 crc kubenswrapper[5129]: I0314 09:27:42.386228 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frlsx\" (UniqueName: \"kubernetes.io/projected/62bc5cd6-1f77-4220-ba45-f5376a5b0dec-kube-api-access-frlsx\") pod \"validate-network-openstack-openstack-cell1-zrt44\" (UID: \"62bc5cd6-1f77-4220-ba45-f5376a5b0dec\") " pod="openstack/validate-network-openstack-openstack-cell1-zrt44" Mar 14 09:27:42 crc kubenswrapper[5129]: I0314 09:27:42.386495 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/62bc5cd6-1f77-4220-ba45-f5376a5b0dec-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-zrt44\" (UID: \"62bc5cd6-1f77-4220-ba45-f5376a5b0dec\") " pod="openstack/validate-network-openstack-openstack-cell1-zrt44" Mar 14 
09:27:42 crc kubenswrapper[5129]: I0314 09:27:42.488462 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62bc5cd6-1f77-4220-ba45-f5376a5b0dec-inventory\") pod \"validate-network-openstack-openstack-cell1-zrt44\" (UID: \"62bc5cd6-1f77-4220-ba45-f5376a5b0dec\") " pod="openstack/validate-network-openstack-openstack-cell1-zrt44" Mar 14 09:27:42 crc kubenswrapper[5129]: I0314 09:27:42.488574 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frlsx\" (UniqueName: \"kubernetes.io/projected/62bc5cd6-1f77-4220-ba45-f5376a5b0dec-kube-api-access-frlsx\") pod \"validate-network-openstack-openstack-cell1-zrt44\" (UID: \"62bc5cd6-1f77-4220-ba45-f5376a5b0dec\") " pod="openstack/validate-network-openstack-openstack-cell1-zrt44" Mar 14 09:27:42 crc kubenswrapper[5129]: I0314 09:27:42.488682 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/62bc5cd6-1f77-4220-ba45-f5376a5b0dec-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-zrt44\" (UID: \"62bc5cd6-1f77-4220-ba45-f5376a5b0dec\") " pod="openstack/validate-network-openstack-openstack-cell1-zrt44" Mar 14 09:27:42 crc kubenswrapper[5129]: I0314 09:27:42.494342 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62bc5cd6-1f77-4220-ba45-f5376a5b0dec-inventory\") pod \"validate-network-openstack-openstack-cell1-zrt44\" (UID: \"62bc5cd6-1f77-4220-ba45-f5376a5b0dec\") " pod="openstack/validate-network-openstack-openstack-cell1-zrt44" Mar 14 09:27:42 crc kubenswrapper[5129]: I0314 09:27:42.502281 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/62bc5cd6-1f77-4220-ba45-f5376a5b0dec-ssh-key-openstack-cell1\") pod 
\"validate-network-openstack-openstack-cell1-zrt44\" (UID: \"62bc5cd6-1f77-4220-ba45-f5376a5b0dec\") " pod="openstack/validate-network-openstack-openstack-cell1-zrt44" Mar 14 09:27:42 crc kubenswrapper[5129]: I0314 09:27:42.508163 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frlsx\" (UniqueName: \"kubernetes.io/projected/62bc5cd6-1f77-4220-ba45-f5376a5b0dec-kube-api-access-frlsx\") pod \"validate-network-openstack-openstack-cell1-zrt44\" (UID: \"62bc5cd6-1f77-4220-ba45-f5376a5b0dec\") " pod="openstack/validate-network-openstack-openstack-cell1-zrt44" Mar 14 09:27:42 crc kubenswrapper[5129]: I0314 09:27:42.529011 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-zrt44" Mar 14 09:27:43 crc kubenswrapper[5129]: I0314 09:27:43.092477 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-zrt44"] Mar 14 09:27:43 crc kubenswrapper[5129]: I0314 09:27:43.993669 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-zrt44" event={"ID":"62bc5cd6-1f77-4220-ba45-f5376a5b0dec","Type":"ContainerStarted","Data":"eb22e37a171c2cead1ae710a8d575be42aa685ada8966500cb5667f1c8ac4778"} Mar 14 09:27:43 crc kubenswrapper[5129]: I0314 09:27:43.994296 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-zrt44" event={"ID":"62bc5cd6-1f77-4220-ba45-f5376a5b0dec","Type":"ContainerStarted","Data":"853aa4397b94190ab70230f3fca4235b68d7c3a711f1a4de89a1b1ca2068adbf"} Mar 14 09:27:44 crc kubenswrapper[5129]: I0314 09:27:44.021482 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-zrt44" podStartSLOduration=1.635933181 podStartE2EDuration="2.02144444s" podCreationTimestamp="2026-03-14 09:27:42 +0000 UTC" 
firstStartedPulling="2026-03-14 09:27:43.100106654 +0000 UTC m=+8925.852021848" lastFinishedPulling="2026-03-14 09:27:43.485617923 +0000 UTC m=+8926.237533107" observedRunningTime="2026-03-14 09:27:44.014155214 +0000 UTC m=+8926.766070398" watchObservedRunningTime="2026-03-14 09:27:44.02144444 +0000 UTC m=+8926.773359624" Mar 14 09:27:50 crc kubenswrapper[5129]: I0314 09:27:50.072534 5129 generic.go:334] "Generic (PLEG): container finished" podID="62bc5cd6-1f77-4220-ba45-f5376a5b0dec" containerID="eb22e37a171c2cead1ae710a8d575be42aa685ada8966500cb5667f1c8ac4778" exitCode=0 Mar 14 09:27:50 crc kubenswrapper[5129]: I0314 09:27:50.072684 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-zrt44" event={"ID":"62bc5cd6-1f77-4220-ba45-f5376a5b0dec","Type":"ContainerDied","Data":"eb22e37a171c2cead1ae710a8d575be42aa685ada8966500cb5667f1c8ac4778"} Mar 14 09:27:51 crc kubenswrapper[5129]: I0314 09:27:51.575838 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-zrt44" Mar 14 09:27:51 crc kubenswrapper[5129]: I0314 09:27:51.737979 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/62bc5cd6-1f77-4220-ba45-f5376a5b0dec-ssh-key-openstack-cell1\") pod \"62bc5cd6-1f77-4220-ba45-f5376a5b0dec\" (UID: \"62bc5cd6-1f77-4220-ba45-f5376a5b0dec\") " Mar 14 09:27:51 crc kubenswrapper[5129]: I0314 09:27:51.738154 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frlsx\" (UniqueName: \"kubernetes.io/projected/62bc5cd6-1f77-4220-ba45-f5376a5b0dec-kube-api-access-frlsx\") pod \"62bc5cd6-1f77-4220-ba45-f5376a5b0dec\" (UID: \"62bc5cd6-1f77-4220-ba45-f5376a5b0dec\") " Mar 14 09:27:51 crc kubenswrapper[5129]: I0314 09:27:51.738206 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62bc5cd6-1f77-4220-ba45-f5376a5b0dec-inventory\") pod \"62bc5cd6-1f77-4220-ba45-f5376a5b0dec\" (UID: \"62bc5cd6-1f77-4220-ba45-f5376a5b0dec\") " Mar 14 09:27:51 crc kubenswrapper[5129]: I0314 09:27:51.761906 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62bc5cd6-1f77-4220-ba45-f5376a5b0dec-kube-api-access-frlsx" (OuterVolumeSpecName: "kube-api-access-frlsx") pod "62bc5cd6-1f77-4220-ba45-f5376a5b0dec" (UID: "62bc5cd6-1f77-4220-ba45-f5376a5b0dec"). InnerVolumeSpecName "kube-api-access-frlsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:27:51 crc kubenswrapper[5129]: I0314 09:27:51.782393 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62bc5cd6-1f77-4220-ba45-f5376a5b0dec-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "62bc5cd6-1f77-4220-ba45-f5376a5b0dec" (UID: "62bc5cd6-1f77-4220-ba45-f5376a5b0dec"). 
InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:27:51 crc kubenswrapper[5129]: I0314 09:27:51.805354 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62bc5cd6-1f77-4220-ba45-f5376a5b0dec-inventory" (OuterVolumeSpecName: "inventory") pod "62bc5cd6-1f77-4220-ba45-f5376a5b0dec" (UID: "62bc5cd6-1f77-4220-ba45-f5376a5b0dec"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:27:51 crc kubenswrapper[5129]: I0314 09:27:51.841002 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/62bc5cd6-1f77-4220-ba45-f5376a5b0dec-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:27:51 crc kubenswrapper[5129]: I0314 09:27:51.841525 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frlsx\" (UniqueName: \"kubernetes.io/projected/62bc5cd6-1f77-4220-ba45-f5376a5b0dec-kube-api-access-frlsx\") on node \"crc\" DevicePath \"\"" Mar 14 09:27:51 crc kubenswrapper[5129]: I0314 09:27:51.841541 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62bc5cd6-1f77-4220-ba45-f5376a5b0dec-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.091992 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-zrt44" event={"ID":"62bc5cd6-1f77-4220-ba45-f5376a5b0dec","Type":"ContainerDied","Data":"853aa4397b94190ab70230f3fca4235b68d7c3a711f1a4de89a1b1ca2068adbf"} Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.092038 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="853aa4397b94190ab70230f3fca4235b68d7c3a711f1a4de89a1b1ca2068adbf" Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.092065 5129 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-zrt44" Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.175823 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-fdhr7"] Mar 14 09:27:52 crc kubenswrapper[5129]: E0314 09:27:52.176319 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62bc5cd6-1f77-4220-ba45-f5376a5b0dec" containerName="validate-network-openstack-openstack-cell1" Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.176340 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="62bc5cd6-1f77-4220-ba45-f5376a5b0dec" containerName="validate-network-openstack-openstack-cell1" Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.176545 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="62bc5cd6-1f77-4220-ba45-f5376a5b0dec" containerName="validate-network-openstack-openstack-cell1" Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.177238 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-fdhr7" Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.182496 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.183243 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc" Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.184865 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-fdhr7"] Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.351463 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cfz2\" (UniqueName: \"kubernetes.io/projected/abf4eec2-f79d-4f05-af43-da60a28bde7e-kube-api-access-8cfz2\") pod \"install-os-openstack-openstack-cell1-fdhr7\" (UID: \"abf4eec2-f79d-4f05-af43-da60a28bde7e\") " pod="openstack/install-os-openstack-openstack-cell1-fdhr7" Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.351528 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abf4eec2-f79d-4f05-af43-da60a28bde7e-inventory\") pod \"install-os-openstack-openstack-cell1-fdhr7\" (UID: \"abf4eec2-f79d-4f05-af43-da60a28bde7e\") " pod="openstack/install-os-openstack-openstack-cell1-fdhr7" Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.351688 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/abf4eec2-f79d-4f05-af43-da60a28bde7e-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-fdhr7\" (UID: \"abf4eec2-f79d-4f05-af43-da60a28bde7e\") " pod="openstack/install-os-openstack-openstack-cell1-fdhr7" Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 
09:27:52.454251 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cfz2\" (UniqueName: \"kubernetes.io/projected/abf4eec2-f79d-4f05-af43-da60a28bde7e-kube-api-access-8cfz2\") pod \"install-os-openstack-openstack-cell1-fdhr7\" (UID: \"abf4eec2-f79d-4f05-af43-da60a28bde7e\") " pod="openstack/install-os-openstack-openstack-cell1-fdhr7" Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.454336 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abf4eec2-f79d-4f05-af43-da60a28bde7e-inventory\") pod \"install-os-openstack-openstack-cell1-fdhr7\" (UID: \"abf4eec2-f79d-4f05-af43-da60a28bde7e\") " pod="openstack/install-os-openstack-openstack-cell1-fdhr7" Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.454460 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/abf4eec2-f79d-4f05-af43-da60a28bde7e-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-fdhr7\" (UID: \"abf4eec2-f79d-4f05-af43-da60a28bde7e\") " pod="openstack/install-os-openstack-openstack-cell1-fdhr7" Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.458454 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/abf4eec2-f79d-4f05-af43-da60a28bde7e-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-fdhr7\" (UID: \"abf4eec2-f79d-4f05-af43-da60a28bde7e\") " pod="openstack/install-os-openstack-openstack-cell1-fdhr7" Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.458736 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abf4eec2-f79d-4f05-af43-da60a28bde7e-inventory\") pod \"install-os-openstack-openstack-cell1-fdhr7\" (UID: \"abf4eec2-f79d-4f05-af43-da60a28bde7e\") " 
pod="openstack/install-os-openstack-openstack-cell1-fdhr7" Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.473129 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cfz2\" (UniqueName: \"kubernetes.io/projected/abf4eec2-f79d-4f05-af43-da60a28bde7e-kube-api-access-8cfz2\") pod \"install-os-openstack-openstack-cell1-fdhr7\" (UID: \"abf4eec2-f79d-4f05-af43-da60a28bde7e\") " pod="openstack/install-os-openstack-openstack-cell1-fdhr7" Mar 14 09:27:52 crc kubenswrapper[5129]: I0314 09:27:52.495593 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-fdhr7" Mar 14 09:27:53 crc kubenswrapper[5129]: I0314 09:27:53.000019 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-fdhr7"] Mar 14 09:27:53 crc kubenswrapper[5129]: W0314 09:27:53.033680 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabf4eec2_f79d_4f05_af43_da60a28bde7e.slice/crio-55a0da12f8edce3f61c19bad5d0ce9b7eb5e38dcbcb5e6f6157f4f7f9b7fb84c WatchSource:0}: Error finding container 55a0da12f8edce3f61c19bad5d0ce9b7eb5e38dcbcb5e6f6157f4f7f9b7fb84c: Status 404 returned error can't find the container with id 55a0da12f8edce3f61c19bad5d0ce9b7eb5e38dcbcb5e6f6157f4f7f9b7fb84c Mar 14 09:27:53 crc kubenswrapper[5129]: I0314 09:27:53.107463 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-fdhr7" event={"ID":"abf4eec2-f79d-4f05-af43-da60a28bde7e","Type":"ContainerStarted","Data":"55a0da12f8edce3f61c19bad5d0ce9b7eb5e38dcbcb5e6f6157f4f7f9b7fb84c"} Mar 14 09:27:54 crc kubenswrapper[5129]: I0314 09:27:54.124294 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-fdhr7" 
event={"ID":"abf4eec2-f79d-4f05-af43-da60a28bde7e","Type":"ContainerStarted","Data":"e48c90fab19326c40cc218bc7817a4ab952925e67fb6d03f8323adf2159375b2"} Mar 14 09:27:54 crc kubenswrapper[5129]: I0314 09:27:54.147527 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-fdhr7" podStartSLOduration=1.753856144 podStartE2EDuration="2.147499573s" podCreationTimestamp="2026-03-14 09:27:52 +0000 UTC" firstStartedPulling="2026-03-14 09:27:53.036513106 +0000 UTC m=+8935.788428310" lastFinishedPulling="2026-03-14 09:27:53.430156555 +0000 UTC m=+8936.182071739" observedRunningTime="2026-03-14 09:27:54.140553065 +0000 UTC m=+8936.892468279" watchObservedRunningTime="2026-03-14 09:27:54.147499573 +0000 UTC m=+8936.899414797" Mar 14 09:28:00 crc kubenswrapper[5129]: I0314 09:28:00.137311 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558008-486gk"] Mar 14 09:28:00 crc kubenswrapper[5129]: I0314 09:28:00.139972 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558008-486gk" Mar 14 09:28:00 crc kubenswrapper[5129]: I0314 09:28:00.145303 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:28:00 crc kubenswrapper[5129]: I0314 09:28:00.145310 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:28:00 crc kubenswrapper[5129]: I0314 09:28:00.145392 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:28:00 crc kubenswrapper[5129]: I0314 09:28:00.152541 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558008-486gk"] Mar 14 09:28:00 crc kubenswrapper[5129]: I0314 09:28:00.244035 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qscq\" (UniqueName: \"kubernetes.io/projected/617d39f9-a13a-426f-9657-6b06f1eed2d5-kube-api-access-6qscq\") pod \"auto-csr-approver-29558008-486gk\" (UID: \"617d39f9-a13a-426f-9657-6b06f1eed2d5\") " pod="openshift-infra/auto-csr-approver-29558008-486gk" Mar 14 09:28:00 crc kubenswrapper[5129]: I0314 09:28:00.347403 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qscq\" (UniqueName: \"kubernetes.io/projected/617d39f9-a13a-426f-9657-6b06f1eed2d5-kube-api-access-6qscq\") pod \"auto-csr-approver-29558008-486gk\" (UID: \"617d39f9-a13a-426f-9657-6b06f1eed2d5\") " pod="openshift-infra/auto-csr-approver-29558008-486gk" Mar 14 09:28:00 crc kubenswrapper[5129]: I0314 09:28:00.368019 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qscq\" (UniqueName: \"kubernetes.io/projected/617d39f9-a13a-426f-9657-6b06f1eed2d5-kube-api-access-6qscq\") pod \"auto-csr-approver-29558008-486gk\" (UID: \"617d39f9-a13a-426f-9657-6b06f1eed2d5\") " 
pod="openshift-infra/auto-csr-approver-29558008-486gk" Mar 14 09:28:00 crc kubenswrapper[5129]: I0314 09:28:00.460295 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558008-486gk" Mar 14 09:28:00 crc kubenswrapper[5129]: I0314 09:28:00.905617 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558008-486gk"] Mar 14 09:28:00 crc kubenswrapper[5129]: W0314 09:28:00.911359 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod617d39f9_a13a_426f_9657_6b06f1eed2d5.slice/crio-333a632137f33e71f0d9ca9fed4c4a44e50a2e7058db186db621a8b29d2a2a59 WatchSource:0}: Error finding container 333a632137f33e71f0d9ca9fed4c4a44e50a2e7058db186db621a8b29d2a2a59: Status 404 returned error can't find the container with id 333a632137f33e71f0d9ca9fed4c4a44e50a2e7058db186db621a8b29d2a2a59 Mar 14 09:28:01 crc kubenswrapper[5129]: I0314 09:28:01.199199 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558008-486gk" event={"ID":"617d39f9-a13a-426f-9657-6b06f1eed2d5","Type":"ContainerStarted","Data":"333a632137f33e71f0d9ca9fed4c4a44e50a2e7058db186db621a8b29d2a2a59"} Mar 14 09:28:02 crc kubenswrapper[5129]: I0314 09:28:02.212268 5129 generic.go:334] "Generic (PLEG): container finished" podID="617d39f9-a13a-426f-9657-6b06f1eed2d5" containerID="55f1bdebc58ca8b23caeb2cbda7a0bf3975f7bee40b03aa542be3f65029499a1" exitCode=0 Mar 14 09:28:02 crc kubenswrapper[5129]: I0314 09:28:02.212677 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558008-486gk" event={"ID":"617d39f9-a13a-426f-9657-6b06f1eed2d5","Type":"ContainerDied","Data":"55f1bdebc58ca8b23caeb2cbda7a0bf3975f7bee40b03aa542be3f65029499a1"} Mar 14 09:28:03 crc kubenswrapper[5129]: I0314 09:28:03.599554 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558008-486gk" Mar 14 09:28:03 crc kubenswrapper[5129]: I0314 09:28:03.753788 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qscq\" (UniqueName: \"kubernetes.io/projected/617d39f9-a13a-426f-9657-6b06f1eed2d5-kube-api-access-6qscq\") pod \"617d39f9-a13a-426f-9657-6b06f1eed2d5\" (UID: \"617d39f9-a13a-426f-9657-6b06f1eed2d5\") " Mar 14 09:28:03 crc kubenswrapper[5129]: I0314 09:28:03.761360 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/617d39f9-a13a-426f-9657-6b06f1eed2d5-kube-api-access-6qscq" (OuterVolumeSpecName: "kube-api-access-6qscq") pod "617d39f9-a13a-426f-9657-6b06f1eed2d5" (UID: "617d39f9-a13a-426f-9657-6b06f1eed2d5"). InnerVolumeSpecName "kube-api-access-6qscq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:28:03 crc kubenswrapper[5129]: I0314 09:28:03.856864 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qscq\" (UniqueName: \"kubernetes.io/projected/617d39f9-a13a-426f-9657-6b06f1eed2d5-kube-api-access-6qscq\") on node \"crc\" DevicePath \"\"" Mar 14 09:28:04 crc kubenswrapper[5129]: I0314 09:28:04.237572 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558008-486gk" event={"ID":"617d39f9-a13a-426f-9657-6b06f1eed2d5","Type":"ContainerDied","Data":"333a632137f33e71f0d9ca9fed4c4a44e50a2e7058db186db621a8b29d2a2a59"} Mar 14 09:28:04 crc kubenswrapper[5129]: I0314 09:28:04.238063 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="333a632137f33e71f0d9ca9fed4c4a44e50a2e7058db186db621a8b29d2a2a59" Mar 14 09:28:04 crc kubenswrapper[5129]: I0314 09:28:04.237646 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558008-486gk" Mar 14 09:28:04 crc kubenswrapper[5129]: I0314 09:28:04.688403 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558002-np6rf"] Mar 14 09:28:04 crc kubenswrapper[5129]: I0314 09:28:04.698337 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558002-np6rf"] Mar 14 09:28:06 crc kubenswrapper[5129]: I0314 09:28:06.051511 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a192670-ecbb-4862-a27e-e8c0536bde53" path="/var/lib/kubelet/pods/2a192670-ecbb-4862-a27e-e8c0536bde53/volumes" Mar 14 09:28:19 crc kubenswrapper[5129]: I0314 09:28:19.573969 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:28:19 crc kubenswrapper[5129]: I0314 09:28:19.574865 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:28:30 crc kubenswrapper[5129]: I0314 09:28:30.267581 5129 generic.go:334] "Generic (PLEG): container finished" podID="9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512" containerID="f900ba52f1f331e072d240825504b6b169455744bf4a6646eee1f9d2bb6eddd7" exitCode=0 Mar 14 09:28:30 crc kubenswrapper[5129]: I0314 09:28:30.267670 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-m2ssc" 
event={"ID":"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512","Type":"ContainerDied","Data":"f900ba52f1f331e072d240825504b6b169455744bf4a6646eee1f9d2bb6eddd7"} Mar 14 09:28:31 crc kubenswrapper[5129]: I0314 09:28:31.724534 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-m2ssc" Mar 14 09:28:31 crc kubenswrapper[5129]: I0314 09:28:31.835540 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512-ssh-key-openstack-networker\") pod \"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512\" (UID: \"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512\") " Mar 14 09:28:31 crc kubenswrapper[5129]: I0314 09:28:31.835995 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pkz7\" (UniqueName: \"kubernetes.io/projected/9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512-kube-api-access-5pkz7\") pod \"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512\" (UID: \"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512\") " Mar 14 09:28:31 crc kubenswrapper[5129]: I0314 09:28:31.836048 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512-inventory\") pod \"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512\" (UID: \"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512\") " Mar 14 09:28:31 crc kubenswrapper[5129]: I0314 09:28:31.843459 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512-kube-api-access-5pkz7" (OuterVolumeSpecName: "kube-api-access-5pkz7") pod "9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512" (UID: "9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512"). InnerVolumeSpecName "kube-api-access-5pkz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:28:31 crc kubenswrapper[5129]: I0314 09:28:31.866409 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512-inventory" (OuterVolumeSpecName: "inventory") pod "9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512" (UID: "9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:28:31 crc kubenswrapper[5129]: I0314 09:28:31.877155 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512" (UID: "9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:28:31 crc kubenswrapper[5129]: I0314 09:28:31.939792 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 14 09:28:31 crc kubenswrapper[5129]: I0314 09:28:31.939842 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pkz7\" (UniqueName: \"kubernetes.io/projected/9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512-kube-api-access-5pkz7\") on node \"crc\" DevicePath \"\"" Mar 14 09:28:31 crc kubenswrapper[5129]: I0314 09:28:31.939856 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.293915 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-m2ssc" 
event={"ID":"9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512","Type":"ContainerDied","Data":"68c41db51d6d44d881b03c981e8bacf18c0dc7c14396fe10dd5b4f2a178e9235"} Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.294026 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68c41db51d6d44d881b03c981e8bacf18c0dc7c14396fe10dd5b4f2a178e9235" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.293941 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-m2ssc" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.413922 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-networker-p48r8"] Mar 14 09:28:32 crc kubenswrapper[5129]: E0314 09:28:32.414662 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617d39f9-a13a-426f-9657-6b06f1eed2d5" containerName="oc" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.414693 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="617d39f9-a13a-426f-9657-6b06f1eed2d5" containerName="oc" Mar 14 09:28:32 crc kubenswrapper[5129]: E0314 09:28:32.414711 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512" containerName="install-os-openstack-openstack-networker" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.414722 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512" containerName="install-os-openstack-openstack-networker" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.415057 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512" containerName="install-os-openstack-openstack-networker" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.415104 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="617d39f9-a13a-426f-9657-6b06f1eed2d5" containerName="oc" Mar 14 
09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.416187 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-p48r8" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.418402 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-q642x" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.419410 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.438298 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-networker-p48r8"] Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.555851 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cc591e9-0dd5-47e9-b60e-fd1476f5a130-inventory\") pod \"configure-os-openstack-openstack-networker-p48r8\" (UID: \"9cc591e9-0dd5-47e9-b60e-fd1476f5a130\") " pod="openstack/configure-os-openstack-openstack-networker-p48r8" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.556028 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb2ls\" (UniqueName: \"kubernetes.io/projected/9cc591e9-0dd5-47e9-b60e-fd1476f5a130-kube-api-access-wb2ls\") pod \"configure-os-openstack-openstack-networker-p48r8\" (UID: \"9cc591e9-0dd5-47e9-b60e-fd1476f5a130\") " pod="openstack/configure-os-openstack-openstack-networker-p48r8" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.556083 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9cc591e9-0dd5-47e9-b60e-fd1476f5a130-ssh-key-openstack-networker\") pod \"configure-os-openstack-openstack-networker-p48r8\" 
(UID: \"9cc591e9-0dd5-47e9-b60e-fd1476f5a130\") " pod="openstack/configure-os-openstack-openstack-networker-p48r8" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.658308 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb2ls\" (UniqueName: \"kubernetes.io/projected/9cc591e9-0dd5-47e9-b60e-fd1476f5a130-kube-api-access-wb2ls\") pod \"configure-os-openstack-openstack-networker-p48r8\" (UID: \"9cc591e9-0dd5-47e9-b60e-fd1476f5a130\") " pod="openstack/configure-os-openstack-openstack-networker-p48r8" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.658374 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9cc591e9-0dd5-47e9-b60e-fd1476f5a130-ssh-key-openstack-networker\") pod \"configure-os-openstack-openstack-networker-p48r8\" (UID: \"9cc591e9-0dd5-47e9-b60e-fd1476f5a130\") " pod="openstack/configure-os-openstack-openstack-networker-p48r8" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.658535 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cc591e9-0dd5-47e9-b60e-fd1476f5a130-inventory\") pod \"configure-os-openstack-openstack-networker-p48r8\" (UID: \"9cc591e9-0dd5-47e9-b60e-fd1476f5a130\") " pod="openstack/configure-os-openstack-openstack-networker-p48r8" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.663033 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cc591e9-0dd5-47e9-b60e-fd1476f5a130-inventory\") pod \"configure-os-openstack-openstack-networker-p48r8\" (UID: \"9cc591e9-0dd5-47e9-b60e-fd1476f5a130\") " pod="openstack/configure-os-openstack-openstack-networker-p48r8" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.663032 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9cc591e9-0dd5-47e9-b60e-fd1476f5a130-ssh-key-openstack-networker\") pod \"configure-os-openstack-openstack-networker-p48r8\" (UID: \"9cc591e9-0dd5-47e9-b60e-fd1476f5a130\") " pod="openstack/configure-os-openstack-openstack-networker-p48r8" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.700295 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb2ls\" (UniqueName: \"kubernetes.io/projected/9cc591e9-0dd5-47e9-b60e-fd1476f5a130-kube-api-access-wb2ls\") pod \"configure-os-openstack-openstack-networker-p48r8\" (UID: \"9cc591e9-0dd5-47e9-b60e-fd1476f5a130\") " pod="openstack/configure-os-openstack-openstack-networker-p48r8" Mar 14 09:28:32 crc kubenswrapper[5129]: I0314 09:28:32.747483 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-p48r8" Mar 14 09:28:33 crc kubenswrapper[5129]: I0314 09:28:33.341178 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-networker-p48r8"] Mar 14 09:28:34 crc kubenswrapper[5129]: I0314 09:28:34.319131 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-p48r8" event={"ID":"9cc591e9-0dd5-47e9-b60e-fd1476f5a130","Type":"ContainerStarted","Data":"8199dcf2faa88693f079e738f0595d75524d7b24829be42ebfcf6029b6200157"} Mar 14 09:28:34 crc kubenswrapper[5129]: I0314 09:28:34.320084 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-p48r8" event={"ID":"9cc591e9-0dd5-47e9-b60e-fd1476f5a130","Type":"ContainerStarted","Data":"415485dff29aab90b4c2920206de3c59af56eccc2e28ab51de3898877ecef3ab"} Mar 14 09:28:34 crc kubenswrapper[5129]: I0314 09:28:34.340679 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/configure-os-openstack-openstack-networker-p48r8" podStartSLOduration=1.844544344 podStartE2EDuration="2.340658989s" podCreationTimestamp="2026-03-14 09:28:32 +0000 UTC" firstStartedPulling="2026-03-14 09:28:33.356572778 +0000 UTC m=+8976.108487962" lastFinishedPulling="2026-03-14 09:28:33.852687423 +0000 UTC m=+8976.604602607" observedRunningTime="2026-03-14 09:28:34.339078076 +0000 UTC m=+8977.090993270" watchObservedRunningTime="2026-03-14 09:28:34.340658989 +0000 UTC m=+8977.092574173" Mar 14 09:28:35 crc kubenswrapper[5129]: I0314 09:28:35.469567 5129 scope.go:117] "RemoveContainer" containerID="57c2452833de7e3f4f61355cb63ec7942151b07421b72108c926529490922b85" Mar 14 09:28:48 crc kubenswrapper[5129]: I0314 09:28:48.467299 5129 generic.go:334] "Generic (PLEG): container finished" podID="abf4eec2-f79d-4f05-af43-da60a28bde7e" containerID="e48c90fab19326c40cc218bc7817a4ab952925e67fb6d03f8323adf2159375b2" exitCode=0 Mar 14 09:28:48 crc kubenswrapper[5129]: I0314 09:28:48.467374 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-fdhr7" event={"ID":"abf4eec2-f79d-4f05-af43-da60a28bde7e","Type":"ContainerDied","Data":"e48c90fab19326c40cc218bc7817a4ab952925e67fb6d03f8323adf2159375b2"} Mar 14 09:28:49 crc kubenswrapper[5129]: I0314 09:28:49.574916 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:28:49 crc kubenswrapper[5129]: I0314 09:28:49.575018 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.003130 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-fdhr7" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.064225 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abf4eec2-f79d-4f05-af43-da60a28bde7e-inventory\") pod \"abf4eec2-f79d-4f05-af43-da60a28bde7e\" (UID: \"abf4eec2-f79d-4f05-af43-da60a28bde7e\") " Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.064394 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/abf4eec2-f79d-4f05-af43-da60a28bde7e-ssh-key-openstack-cell1\") pod \"abf4eec2-f79d-4f05-af43-da60a28bde7e\" (UID: \"abf4eec2-f79d-4f05-af43-da60a28bde7e\") " Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.064422 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cfz2\" (UniqueName: \"kubernetes.io/projected/abf4eec2-f79d-4f05-af43-da60a28bde7e-kube-api-access-8cfz2\") pod \"abf4eec2-f79d-4f05-af43-da60a28bde7e\" (UID: \"abf4eec2-f79d-4f05-af43-da60a28bde7e\") " Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.074370 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abf4eec2-f79d-4f05-af43-da60a28bde7e-kube-api-access-8cfz2" (OuterVolumeSpecName: "kube-api-access-8cfz2") pod "abf4eec2-f79d-4f05-af43-da60a28bde7e" (UID: "abf4eec2-f79d-4f05-af43-da60a28bde7e"). InnerVolumeSpecName "kube-api-access-8cfz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.096468 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf4eec2-f79d-4f05-af43-da60a28bde7e-inventory" (OuterVolumeSpecName: "inventory") pod "abf4eec2-f79d-4f05-af43-da60a28bde7e" (UID: "abf4eec2-f79d-4f05-af43-da60a28bde7e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.096499 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf4eec2-f79d-4f05-af43-da60a28bde7e-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "abf4eec2-f79d-4f05-af43-da60a28bde7e" (UID: "abf4eec2-f79d-4f05-af43-da60a28bde7e"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.167314 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abf4eec2-f79d-4f05-af43-da60a28bde7e-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.167360 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/abf4eec2-f79d-4f05-af43-da60a28bde7e-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.167375 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cfz2\" (UniqueName: \"kubernetes.io/projected/abf4eec2-f79d-4f05-af43-da60a28bde7e-kube-api-access-8cfz2\") on node \"crc\" DevicePath \"\"" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.493116 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-fdhr7" 
event={"ID":"abf4eec2-f79d-4f05-af43-da60a28bde7e","Type":"ContainerDied","Data":"55a0da12f8edce3f61c19bad5d0ce9b7eb5e38dcbcb5e6f6157f4f7f9b7fb84c"} Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.493486 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55a0da12f8edce3f61c19bad5d0ce9b7eb5e38dcbcb5e6f6157f4f7f9b7fb84c" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.493246 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-fdhr7" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.609165 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-wdjj9"] Mar 14 09:28:50 crc kubenswrapper[5129]: E0314 09:28:50.609872 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf4eec2-f79d-4f05-af43-da60a28bde7e" containerName="install-os-openstack-openstack-cell1" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.609903 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf4eec2-f79d-4f05-af43-da60a28bde7e" containerName="install-os-openstack-openstack-cell1" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.610426 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf4eec2-f79d-4f05-af43-da60a28bde7e" containerName="install-os-openstack-openstack-cell1" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.612009 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-wdjj9" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.614759 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.614993 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.622014 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-wdjj9"] Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.677470 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c0e67d08-ffac-4bd2-ad70-190d2a1808df-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-wdjj9\" (UID: \"c0e67d08-ffac-4bd2-ad70-190d2a1808df\") " pod="openstack/configure-os-openstack-openstack-cell1-wdjj9" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.677714 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0e67d08-ffac-4bd2-ad70-190d2a1808df-inventory\") pod \"configure-os-openstack-openstack-cell1-wdjj9\" (UID: \"c0e67d08-ffac-4bd2-ad70-190d2a1808df\") " pod="openstack/configure-os-openstack-openstack-cell1-wdjj9" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.677798 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6nvk\" (UniqueName: \"kubernetes.io/projected/c0e67d08-ffac-4bd2-ad70-190d2a1808df-kube-api-access-k6nvk\") pod \"configure-os-openstack-openstack-cell1-wdjj9\" (UID: \"c0e67d08-ffac-4bd2-ad70-190d2a1808df\") " pod="openstack/configure-os-openstack-openstack-cell1-wdjj9" Mar 14 09:28:50 crc kubenswrapper[5129]: 
I0314 09:28:50.780227 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c0e67d08-ffac-4bd2-ad70-190d2a1808df-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-wdjj9\" (UID: \"c0e67d08-ffac-4bd2-ad70-190d2a1808df\") " pod="openstack/configure-os-openstack-openstack-cell1-wdjj9" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.780384 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0e67d08-ffac-4bd2-ad70-190d2a1808df-inventory\") pod \"configure-os-openstack-openstack-cell1-wdjj9\" (UID: \"c0e67d08-ffac-4bd2-ad70-190d2a1808df\") " pod="openstack/configure-os-openstack-openstack-cell1-wdjj9" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.780446 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6nvk\" (UniqueName: \"kubernetes.io/projected/c0e67d08-ffac-4bd2-ad70-190d2a1808df-kube-api-access-k6nvk\") pod \"configure-os-openstack-openstack-cell1-wdjj9\" (UID: \"c0e67d08-ffac-4bd2-ad70-190d2a1808df\") " pod="openstack/configure-os-openstack-openstack-cell1-wdjj9" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.819010 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0e67d08-ffac-4bd2-ad70-190d2a1808df-inventory\") pod \"configure-os-openstack-openstack-cell1-wdjj9\" (UID: \"c0e67d08-ffac-4bd2-ad70-190d2a1808df\") " pod="openstack/configure-os-openstack-openstack-cell1-wdjj9" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.819472 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6nvk\" (UniqueName: \"kubernetes.io/projected/c0e67d08-ffac-4bd2-ad70-190d2a1808df-kube-api-access-k6nvk\") pod \"configure-os-openstack-openstack-cell1-wdjj9\" (UID: 
\"c0e67d08-ffac-4bd2-ad70-190d2a1808df\") " pod="openstack/configure-os-openstack-openstack-cell1-wdjj9" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.821279 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c0e67d08-ffac-4bd2-ad70-190d2a1808df-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-wdjj9\" (UID: \"c0e67d08-ffac-4bd2-ad70-190d2a1808df\") " pod="openstack/configure-os-openstack-openstack-cell1-wdjj9" Mar 14 09:28:50 crc kubenswrapper[5129]: I0314 09:28:50.955624 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-wdjj9" Mar 14 09:28:51 crc kubenswrapper[5129]: I0314 09:28:51.557066 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-wdjj9"] Mar 14 09:28:52 crc kubenswrapper[5129]: I0314 09:28:52.518719 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-wdjj9" event={"ID":"c0e67d08-ffac-4bd2-ad70-190d2a1808df","Type":"ContainerStarted","Data":"401ce3e55448d65c9115de47f8c29e6cce05101aba94a40acfabdef58bd9e5b1"} Mar 14 09:28:52 crc kubenswrapper[5129]: I0314 09:28:52.519311 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-wdjj9" event={"ID":"c0e67d08-ffac-4bd2-ad70-190d2a1808df","Type":"ContainerStarted","Data":"861a40ec8d68048a6051f67ad4a411595f0a823bc97a3c128f6dc0f59f40d8d7"} Mar 14 09:28:52 crc kubenswrapper[5129]: I0314 09:28:52.539375 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-wdjj9" podStartSLOduration=2.14109006 podStartE2EDuration="2.539344673s" podCreationTimestamp="2026-03-14 09:28:50 +0000 UTC" firstStartedPulling="2026-03-14 09:28:51.558199282 +0000 UTC m=+8994.310114466" 
lastFinishedPulling="2026-03-14 09:28:51.956453895 +0000 UTC m=+8994.708369079" observedRunningTime="2026-03-14 09:28:52.539030085 +0000 UTC m=+8995.290945269" watchObservedRunningTime="2026-03-14 09:28:52.539344673 +0000 UTC m=+8995.291259877" Mar 14 09:29:19 crc kubenswrapper[5129]: I0314 09:29:19.574209 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:29:19 crc kubenswrapper[5129]: I0314 09:29:19.575287 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:29:19 crc kubenswrapper[5129]: I0314 09:29:19.575375 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 09:29:19 crc kubenswrapper[5129]: I0314 09:29:19.576660 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"716bfedf95059e7409f096a660a732a3c112f45bc7a6d09224dd5044410d7300"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:29:19 crc kubenswrapper[5129]: I0314 09:29:19.576750 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" 
containerID="cri-o://716bfedf95059e7409f096a660a732a3c112f45bc7a6d09224dd5044410d7300" gracePeriod=600 Mar 14 09:29:19 crc kubenswrapper[5129]: I0314 09:29:19.798036 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="716bfedf95059e7409f096a660a732a3c112f45bc7a6d09224dd5044410d7300" exitCode=0 Mar 14 09:29:19 crc kubenswrapper[5129]: I0314 09:29:19.798126 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"716bfedf95059e7409f096a660a732a3c112f45bc7a6d09224dd5044410d7300"} Mar 14 09:29:19 crc kubenswrapper[5129]: I0314 09:29:19.798505 5129 scope.go:117] "RemoveContainer" containerID="c9ef70f39d7818a1f2fc7144f75a83cfce15668c98b78d5d39e5ce9757fffd38" Mar 14 09:29:20 crc kubenswrapper[5129]: I0314 09:29:20.815467 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816"} Mar 14 09:29:28 crc kubenswrapper[5129]: I0314 09:29:28.906784 5129 generic.go:334] "Generic (PLEG): container finished" podID="9cc591e9-0dd5-47e9-b60e-fd1476f5a130" containerID="8199dcf2faa88693f079e738f0595d75524d7b24829be42ebfcf6029b6200157" exitCode=0 Mar 14 09:29:28 crc kubenswrapper[5129]: I0314 09:29:28.906867 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-p48r8" event={"ID":"9cc591e9-0dd5-47e9-b60e-fd1476f5a130","Type":"ContainerDied","Data":"8199dcf2faa88693f079e738f0595d75524d7b24829be42ebfcf6029b6200157"} Mar 14 09:29:30 crc kubenswrapper[5129]: I0314 09:29:30.404172 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-p48r8" Mar 14 09:29:30 crc kubenswrapper[5129]: I0314 09:29:30.593855 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cc591e9-0dd5-47e9-b60e-fd1476f5a130-inventory\") pod \"9cc591e9-0dd5-47e9-b60e-fd1476f5a130\" (UID: \"9cc591e9-0dd5-47e9-b60e-fd1476f5a130\") " Mar 14 09:29:30 crc kubenswrapper[5129]: I0314 09:29:30.594518 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb2ls\" (UniqueName: \"kubernetes.io/projected/9cc591e9-0dd5-47e9-b60e-fd1476f5a130-kube-api-access-wb2ls\") pod \"9cc591e9-0dd5-47e9-b60e-fd1476f5a130\" (UID: \"9cc591e9-0dd5-47e9-b60e-fd1476f5a130\") " Mar 14 09:29:30 crc kubenswrapper[5129]: I0314 09:29:30.594594 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9cc591e9-0dd5-47e9-b60e-fd1476f5a130-ssh-key-openstack-networker\") pod \"9cc591e9-0dd5-47e9-b60e-fd1476f5a130\" (UID: \"9cc591e9-0dd5-47e9-b60e-fd1476f5a130\") " Mar 14 09:29:30 crc kubenswrapper[5129]: I0314 09:29:30.603878 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cc591e9-0dd5-47e9-b60e-fd1476f5a130-kube-api-access-wb2ls" (OuterVolumeSpecName: "kube-api-access-wb2ls") pod "9cc591e9-0dd5-47e9-b60e-fd1476f5a130" (UID: "9cc591e9-0dd5-47e9-b60e-fd1476f5a130"). InnerVolumeSpecName "kube-api-access-wb2ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:29:30 crc kubenswrapper[5129]: I0314 09:29:30.630291 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc591e9-0dd5-47e9-b60e-fd1476f5a130-inventory" (OuterVolumeSpecName: "inventory") pod "9cc591e9-0dd5-47e9-b60e-fd1476f5a130" (UID: "9cc591e9-0dd5-47e9-b60e-fd1476f5a130"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:29:30 crc kubenswrapper[5129]: I0314 09:29:30.642301 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc591e9-0dd5-47e9-b60e-fd1476f5a130-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "9cc591e9-0dd5-47e9-b60e-fd1476f5a130" (UID: "9cc591e9-0dd5-47e9-b60e-fd1476f5a130"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:29:30 crc kubenswrapper[5129]: I0314 09:29:30.697169 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb2ls\" (UniqueName: \"kubernetes.io/projected/9cc591e9-0dd5-47e9-b60e-fd1476f5a130-kube-api-access-wb2ls\") on node \"crc\" DevicePath \"\"" Mar 14 09:29:30 crc kubenswrapper[5129]: I0314 09:29:30.697211 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9cc591e9-0dd5-47e9-b60e-fd1476f5a130-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 14 09:29:30 crc kubenswrapper[5129]: I0314 09:29:30.697222 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cc591e9-0dd5-47e9-b60e-fd1476f5a130-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:29:30 crc kubenswrapper[5129]: I0314 09:29:30.930185 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-p48r8" event={"ID":"9cc591e9-0dd5-47e9-b60e-fd1476f5a130","Type":"ContainerDied","Data":"415485dff29aab90b4c2920206de3c59af56eccc2e28ab51de3898877ecef3ab"} Mar 14 09:29:30 crc kubenswrapper[5129]: I0314 09:29:30.930229 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="415485dff29aab90b4c2920206de3c59af56eccc2e28ab51de3898877ecef3ab" Mar 14 09:29:30 crc kubenswrapper[5129]: I0314 
09:29:30.930302 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-p48r8" Mar 14 09:29:31 crc kubenswrapper[5129]: I0314 09:29:31.061825 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-networker-7dm4w"] Mar 14 09:29:31 crc kubenswrapper[5129]: E0314 09:29:31.062454 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc591e9-0dd5-47e9-b60e-fd1476f5a130" containerName="configure-os-openstack-openstack-networker" Mar 14 09:29:31 crc kubenswrapper[5129]: I0314 09:29:31.062479 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc591e9-0dd5-47e9-b60e-fd1476f5a130" containerName="configure-os-openstack-openstack-networker" Mar 14 09:29:31 crc kubenswrapper[5129]: I0314 09:29:31.062804 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cc591e9-0dd5-47e9-b60e-fd1476f5a130" containerName="configure-os-openstack-openstack-networker" Mar 14 09:29:31 crc kubenswrapper[5129]: I0314 09:29:31.063883 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-7dm4w" Mar 14 09:29:31 crc kubenswrapper[5129]: I0314 09:29:31.072786 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 14 09:29:31 crc kubenswrapper[5129]: I0314 09:29:31.073050 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-q642x" Mar 14 09:29:31 crc kubenswrapper[5129]: I0314 09:29:31.080778 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-networker-7dm4w"] Mar 14 09:29:31 crc kubenswrapper[5129]: I0314 09:29:31.105325 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e48906b-10f5-468a-97d2-be4873be5eaa-inventory\") pod \"run-os-openstack-openstack-networker-7dm4w\" (UID: \"7e48906b-10f5-468a-97d2-be4873be5eaa\") " pod="openstack/run-os-openstack-openstack-networker-7dm4w" Mar 14 09:29:31 crc kubenswrapper[5129]: I0314 09:29:31.105442 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7e48906b-10f5-468a-97d2-be4873be5eaa-ssh-key-openstack-networker\") pod \"run-os-openstack-openstack-networker-7dm4w\" (UID: \"7e48906b-10f5-468a-97d2-be4873be5eaa\") " pod="openstack/run-os-openstack-openstack-networker-7dm4w" Mar 14 09:29:31 crc kubenswrapper[5129]: I0314 09:29:31.105548 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2jv8\" (UniqueName: \"kubernetes.io/projected/7e48906b-10f5-468a-97d2-be4873be5eaa-kube-api-access-l2jv8\") pod \"run-os-openstack-openstack-networker-7dm4w\" (UID: \"7e48906b-10f5-468a-97d2-be4873be5eaa\") " pod="openstack/run-os-openstack-openstack-networker-7dm4w" Mar 14 09:29:31 crc kubenswrapper[5129]: 
I0314 09:29:31.207050 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2jv8\" (UniqueName: \"kubernetes.io/projected/7e48906b-10f5-468a-97d2-be4873be5eaa-kube-api-access-l2jv8\") pod \"run-os-openstack-openstack-networker-7dm4w\" (UID: \"7e48906b-10f5-468a-97d2-be4873be5eaa\") " pod="openstack/run-os-openstack-openstack-networker-7dm4w" Mar 14 09:29:31 crc kubenswrapper[5129]: I0314 09:29:31.207168 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e48906b-10f5-468a-97d2-be4873be5eaa-inventory\") pod \"run-os-openstack-openstack-networker-7dm4w\" (UID: \"7e48906b-10f5-468a-97d2-be4873be5eaa\") " pod="openstack/run-os-openstack-openstack-networker-7dm4w" Mar 14 09:29:31 crc kubenswrapper[5129]: I0314 09:29:31.207307 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7e48906b-10f5-468a-97d2-be4873be5eaa-ssh-key-openstack-networker\") pod \"run-os-openstack-openstack-networker-7dm4w\" (UID: \"7e48906b-10f5-468a-97d2-be4873be5eaa\") " pod="openstack/run-os-openstack-openstack-networker-7dm4w" Mar 14 09:29:31 crc kubenswrapper[5129]: I0314 09:29:31.211330 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e48906b-10f5-468a-97d2-be4873be5eaa-inventory\") pod \"run-os-openstack-openstack-networker-7dm4w\" (UID: \"7e48906b-10f5-468a-97d2-be4873be5eaa\") " pod="openstack/run-os-openstack-openstack-networker-7dm4w" Mar 14 09:29:31 crc kubenswrapper[5129]: I0314 09:29:31.226130 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7e48906b-10f5-468a-97d2-be4873be5eaa-ssh-key-openstack-networker\") pod \"run-os-openstack-openstack-networker-7dm4w\" (UID: 
\"7e48906b-10f5-468a-97d2-be4873be5eaa\") " pod="openstack/run-os-openstack-openstack-networker-7dm4w" Mar 14 09:29:31 crc kubenswrapper[5129]: I0314 09:29:31.227003 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2jv8\" (UniqueName: \"kubernetes.io/projected/7e48906b-10f5-468a-97d2-be4873be5eaa-kube-api-access-l2jv8\") pod \"run-os-openstack-openstack-networker-7dm4w\" (UID: \"7e48906b-10f5-468a-97d2-be4873be5eaa\") " pod="openstack/run-os-openstack-openstack-networker-7dm4w" Mar 14 09:29:31 crc kubenswrapper[5129]: I0314 09:29:31.401512 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-7dm4w" Mar 14 09:29:32 crc kubenswrapper[5129]: I0314 09:29:32.008526 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-networker-7dm4w"] Mar 14 09:29:32 crc kubenswrapper[5129]: I0314 09:29:32.015965 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:29:32 crc kubenswrapper[5129]: I0314 09:29:32.952095 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-7dm4w" event={"ID":"7e48906b-10f5-468a-97d2-be4873be5eaa","Type":"ContainerStarted","Data":"e386473433a053d1150a5fca412740bd88b7853c366bc0be3d22979de0c21efc"} Mar 14 09:29:32 crc kubenswrapper[5129]: I0314 09:29:32.952528 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-7dm4w" event={"ID":"7e48906b-10f5-468a-97d2-be4873be5eaa","Type":"ContainerStarted","Data":"c523bb119550a3c6a34d7135428ee8aa0c7a8470db4a8f5f43086c77fc37c66d"} Mar 14 09:29:32 crc kubenswrapper[5129]: I0314 09:29:32.975878 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-networker-7dm4w" podStartSLOduration=1.542278661 podStartE2EDuration="1.975852027s" 
podCreationTimestamp="2026-03-14 09:29:31 +0000 UTC" firstStartedPulling="2026-03-14 09:29:32.015698574 +0000 UTC m=+9034.767613758" lastFinishedPulling="2026-03-14 09:29:32.44927194 +0000 UTC m=+9035.201187124" observedRunningTime="2026-03-14 09:29:32.97225281 +0000 UTC m=+9035.724167994" watchObservedRunningTime="2026-03-14 09:29:32.975852027 +0000 UTC m=+9035.727767231" Mar 14 09:29:43 crc kubenswrapper[5129]: I0314 09:29:43.096012 5129 generic.go:334] "Generic (PLEG): container finished" podID="7e48906b-10f5-468a-97d2-be4873be5eaa" containerID="e386473433a053d1150a5fca412740bd88b7853c366bc0be3d22979de0c21efc" exitCode=0 Mar 14 09:29:43 crc kubenswrapper[5129]: I0314 09:29:43.096497 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-7dm4w" event={"ID":"7e48906b-10f5-468a-97d2-be4873be5eaa","Type":"ContainerDied","Data":"e386473433a053d1150a5fca412740bd88b7853c366bc0be3d22979de0c21efc"} Mar 14 09:29:44 crc kubenswrapper[5129]: I0314 09:29:44.112939 5129 generic.go:334] "Generic (PLEG): container finished" podID="c0e67d08-ffac-4bd2-ad70-190d2a1808df" containerID="401ce3e55448d65c9115de47f8c29e6cce05101aba94a40acfabdef58bd9e5b1" exitCode=0 Mar 14 09:29:44 crc kubenswrapper[5129]: I0314 09:29:44.113030 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-wdjj9" event={"ID":"c0e67d08-ffac-4bd2-ad70-190d2a1808df","Type":"ContainerDied","Data":"401ce3e55448d65c9115de47f8c29e6cce05101aba94a40acfabdef58bd9e5b1"} Mar 14 09:29:44 crc kubenswrapper[5129]: I0314 09:29:44.743273 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-7dm4w" Mar 14 09:29:44 crc kubenswrapper[5129]: I0314 09:29:44.912332 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7e48906b-10f5-468a-97d2-be4873be5eaa-ssh-key-openstack-networker\") pod \"7e48906b-10f5-468a-97d2-be4873be5eaa\" (UID: \"7e48906b-10f5-468a-97d2-be4873be5eaa\") " Mar 14 09:29:44 crc kubenswrapper[5129]: I0314 09:29:44.912686 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e48906b-10f5-468a-97d2-be4873be5eaa-inventory\") pod \"7e48906b-10f5-468a-97d2-be4873be5eaa\" (UID: \"7e48906b-10f5-468a-97d2-be4873be5eaa\") " Mar 14 09:29:44 crc kubenswrapper[5129]: I0314 09:29:44.912733 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2jv8\" (UniqueName: \"kubernetes.io/projected/7e48906b-10f5-468a-97d2-be4873be5eaa-kube-api-access-l2jv8\") pod \"7e48906b-10f5-468a-97d2-be4873be5eaa\" (UID: \"7e48906b-10f5-468a-97d2-be4873be5eaa\") " Mar 14 09:29:44 crc kubenswrapper[5129]: I0314 09:29:44.921998 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e48906b-10f5-468a-97d2-be4873be5eaa-kube-api-access-l2jv8" (OuterVolumeSpecName: "kube-api-access-l2jv8") pod "7e48906b-10f5-468a-97d2-be4873be5eaa" (UID: "7e48906b-10f5-468a-97d2-be4873be5eaa"). InnerVolumeSpecName "kube-api-access-l2jv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:29:44 crc kubenswrapper[5129]: I0314 09:29:44.944535 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e48906b-10f5-468a-97d2-be4873be5eaa-inventory" (OuterVolumeSpecName: "inventory") pod "7e48906b-10f5-468a-97d2-be4873be5eaa" (UID: "7e48906b-10f5-468a-97d2-be4873be5eaa"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:29:44 crc kubenswrapper[5129]: I0314 09:29:44.952329 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e48906b-10f5-468a-97d2-be4873be5eaa-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "7e48906b-10f5-468a-97d2-be4873be5eaa" (UID: "7e48906b-10f5-468a-97d2-be4873be5eaa"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.015776 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e48906b-10f5-468a-97d2-be4873be5eaa-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.015815 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2jv8\" (UniqueName: \"kubernetes.io/projected/7e48906b-10f5-468a-97d2-be4873be5eaa-kube-api-access-l2jv8\") on node \"crc\" DevicePath \"\"" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.015828 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7e48906b-10f5-468a-97d2-be4873be5eaa-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.127097 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-7dm4w" event={"ID":"7e48906b-10f5-468a-97d2-be4873be5eaa","Type":"ContainerDied","Data":"c523bb119550a3c6a34d7135428ee8aa0c7a8470db4a8f5f43086c77fc37c66d"} Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.127140 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-7dm4w" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.127177 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c523bb119550a3c6a34d7135428ee8aa0c7a8470db4a8f5f43086c77fc37c66d" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.230994 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-8hjzt"] Mar 14 09:29:45 crc kubenswrapper[5129]: E0314 09:29:45.231629 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e48906b-10f5-468a-97d2-be4873be5eaa" containerName="run-os-openstack-openstack-networker" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.231664 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e48906b-10f5-468a-97d2-be4873be5eaa" containerName="run-os-openstack-openstack-networker" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.231969 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e48906b-10f5-468a-97d2-be4873be5eaa" containerName="run-os-openstack-openstack-networker" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.233008 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-8hjzt" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.237553 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.239499 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-q642x" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.265685 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-8hjzt"] Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.428331 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e24d1244-561f-4d4a-a744-dd7255702ffb-inventory\") pod \"reboot-os-openstack-openstack-networker-8hjzt\" (UID: \"e24d1244-561f-4d4a-a744-dd7255702ffb\") " pod="openstack/reboot-os-openstack-openstack-networker-8hjzt" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.429049 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x2tx\" (UniqueName: \"kubernetes.io/projected/e24d1244-561f-4d4a-a744-dd7255702ffb-kube-api-access-9x2tx\") pod \"reboot-os-openstack-openstack-networker-8hjzt\" (UID: \"e24d1244-561f-4d4a-a744-dd7255702ffb\") " pod="openstack/reboot-os-openstack-openstack-networker-8hjzt" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.429175 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e24d1244-561f-4d4a-a744-dd7255702ffb-ssh-key-openstack-networker\") pod \"reboot-os-openstack-openstack-networker-8hjzt\" (UID: \"e24d1244-561f-4d4a-a744-dd7255702ffb\") " pod="openstack/reboot-os-openstack-openstack-networker-8hjzt" Mar 14 09:29:45 crc 
kubenswrapper[5129]: I0314 09:29:45.530959 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e24d1244-561f-4d4a-a744-dd7255702ffb-inventory\") pod \"reboot-os-openstack-openstack-networker-8hjzt\" (UID: \"e24d1244-561f-4d4a-a744-dd7255702ffb\") " pod="openstack/reboot-os-openstack-openstack-networker-8hjzt" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.531026 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x2tx\" (UniqueName: \"kubernetes.io/projected/e24d1244-561f-4d4a-a744-dd7255702ffb-kube-api-access-9x2tx\") pod \"reboot-os-openstack-openstack-networker-8hjzt\" (UID: \"e24d1244-561f-4d4a-a744-dd7255702ffb\") " pod="openstack/reboot-os-openstack-openstack-networker-8hjzt" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.531099 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e24d1244-561f-4d4a-a744-dd7255702ffb-ssh-key-openstack-networker\") pod \"reboot-os-openstack-openstack-networker-8hjzt\" (UID: \"e24d1244-561f-4d4a-a744-dd7255702ffb\") " pod="openstack/reboot-os-openstack-openstack-networker-8hjzt" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.538457 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e24d1244-561f-4d4a-a744-dd7255702ffb-inventory\") pod \"reboot-os-openstack-openstack-networker-8hjzt\" (UID: \"e24d1244-561f-4d4a-a744-dd7255702ffb\") " pod="openstack/reboot-os-openstack-openstack-networker-8hjzt" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.540323 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e24d1244-561f-4d4a-a744-dd7255702ffb-ssh-key-openstack-networker\") pod 
\"reboot-os-openstack-openstack-networker-8hjzt\" (UID: \"e24d1244-561f-4d4a-a744-dd7255702ffb\") " pod="openstack/reboot-os-openstack-openstack-networker-8hjzt" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.549485 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x2tx\" (UniqueName: \"kubernetes.io/projected/e24d1244-561f-4d4a-a744-dd7255702ffb-kube-api-access-9x2tx\") pod \"reboot-os-openstack-openstack-networker-8hjzt\" (UID: \"e24d1244-561f-4d4a-a744-dd7255702ffb\") " pod="openstack/reboot-os-openstack-openstack-networker-8hjzt" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.555551 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-8hjzt" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.680313 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-wdjj9" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.836319 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c0e67d08-ffac-4bd2-ad70-190d2a1808df-ssh-key-openstack-cell1\") pod \"c0e67d08-ffac-4bd2-ad70-190d2a1808df\" (UID: \"c0e67d08-ffac-4bd2-ad70-190d2a1808df\") " Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.837071 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6nvk\" (UniqueName: \"kubernetes.io/projected/c0e67d08-ffac-4bd2-ad70-190d2a1808df-kube-api-access-k6nvk\") pod \"c0e67d08-ffac-4bd2-ad70-190d2a1808df\" (UID: \"c0e67d08-ffac-4bd2-ad70-190d2a1808df\") " Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.838007 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0e67d08-ffac-4bd2-ad70-190d2a1808df-inventory\") pod 
\"c0e67d08-ffac-4bd2-ad70-190d2a1808df\" (UID: \"c0e67d08-ffac-4bd2-ad70-190d2a1808df\") " Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.842837 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0e67d08-ffac-4bd2-ad70-190d2a1808df-kube-api-access-k6nvk" (OuterVolumeSpecName: "kube-api-access-k6nvk") pod "c0e67d08-ffac-4bd2-ad70-190d2a1808df" (UID: "c0e67d08-ffac-4bd2-ad70-190d2a1808df"). InnerVolumeSpecName "kube-api-access-k6nvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.864755 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e67d08-ffac-4bd2-ad70-190d2a1808df-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "c0e67d08-ffac-4bd2-ad70-190d2a1808df" (UID: "c0e67d08-ffac-4bd2-ad70-190d2a1808df"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.872836 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e67d08-ffac-4bd2-ad70-190d2a1808df-inventory" (OuterVolumeSpecName: "inventory") pod "c0e67d08-ffac-4bd2-ad70-190d2a1808df" (UID: "c0e67d08-ffac-4bd2-ad70-190d2a1808df"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.941825 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c0e67d08-ffac-4bd2-ad70-190d2a1808df-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.941868 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6nvk\" (UniqueName: \"kubernetes.io/projected/c0e67d08-ffac-4bd2-ad70-190d2a1808df-kube-api-access-k6nvk\") on node \"crc\" DevicePath \"\"" Mar 14 09:29:45 crc kubenswrapper[5129]: I0314 09:29:45.941879 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0e67d08-ffac-4bd2-ad70-190d2a1808df-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.165637 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-8hjzt"] Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.180697 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-wdjj9" event={"ID":"c0e67d08-ffac-4bd2-ad70-190d2a1808df","Type":"ContainerDied","Data":"861a40ec8d68048a6051f67ad4a411595f0a823bc97a3c128f6dc0f59f40d8d7"} Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.180754 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="861a40ec8d68048a6051f67ad4a411595f0a823bc97a3c128f6dc0f59f40d8d7" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.180842 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-wdjj9" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.265468 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-49tl9"] Mar 14 09:29:46 crc kubenswrapper[5129]: E0314 09:29:46.266317 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e67d08-ffac-4bd2-ad70-190d2a1808df" containerName="configure-os-openstack-openstack-cell1" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.266337 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e67d08-ffac-4bd2-ad70-190d2a1808df" containerName="configure-os-openstack-openstack-cell1" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.266593 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e67d08-ffac-4bd2-ad70-190d2a1808df" containerName="configure-os-openstack-openstack-cell1" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.267403 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.271195 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.271430 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.299788 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-49tl9"] Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.358375 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-49tl9\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.358446 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-inventory-1\") pod \"ssh-known-hosts-openstack-49tl9\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.358512 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-inventory-0\") pod \"ssh-known-hosts-openstack-49tl9\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.358547 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-z2vvx\" (UniqueName: \"kubernetes.io/projected/72f9865d-1bad-481b-8303-99a656a45ea5-kube-api-access-z2vvx\") pod \"ssh-known-hosts-openstack-49tl9\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.358716 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-49tl9\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.461179 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-49tl9\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.461314 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-49tl9\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.461346 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-inventory-1\") pod \"ssh-known-hosts-openstack-49tl9\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.461389 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-inventory-0\") pod \"ssh-known-hosts-openstack-49tl9\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.461411 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2vvx\" (UniqueName: \"kubernetes.io/projected/72f9865d-1bad-481b-8303-99a656a45ea5-kube-api-access-z2vvx\") pod \"ssh-known-hosts-openstack-49tl9\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.467628 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-49tl9\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.467981 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-49tl9\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.468144 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-inventory-1\") pod \"ssh-known-hosts-openstack-49tl9\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.477317 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-inventory-0\") pod \"ssh-known-hosts-openstack-49tl9\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.486065 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2vvx\" (UniqueName: \"kubernetes.io/projected/72f9865d-1bad-481b-8303-99a656a45ea5-kube-api-access-z2vvx\") pod \"ssh-known-hosts-openstack-49tl9\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:29:46 crc kubenswrapper[5129]: I0314 09:29:46.592145 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:29:47 crc kubenswrapper[5129]: I0314 09:29:47.180056 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-49tl9"] Mar 14 09:29:47 crc kubenswrapper[5129]: I0314 09:29:47.198381 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-49tl9" event={"ID":"72f9865d-1bad-481b-8303-99a656a45ea5","Type":"ContainerStarted","Data":"7d4b300d376c1c20a7a1e489faeb2006747fe17e3860e17b2675f3c8dffb59a3"} Mar 14 09:29:47 crc kubenswrapper[5129]: I0314 09:29:47.200555 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-8hjzt" event={"ID":"e24d1244-561f-4d4a-a744-dd7255702ffb","Type":"ContainerStarted","Data":"81e1fda33f8381a3e8789c450a16ae75107f57055e5460268a7d87432dc93ded"} Mar 14 09:29:47 crc kubenswrapper[5129]: I0314 09:29:47.200674 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-8hjzt" 
event={"ID":"e24d1244-561f-4d4a-a744-dd7255702ffb","Type":"ContainerStarted","Data":"527faecb21883bfad8f2d8750e7a1c6246a626e7d78c4a82ee96c8d49958c8da"} Mar 14 09:29:47 crc kubenswrapper[5129]: I0314 09:29:47.229074 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-networker-8hjzt" podStartSLOduration=1.753346859 podStartE2EDuration="2.229049883s" podCreationTimestamp="2026-03-14 09:29:45 +0000 UTC" firstStartedPulling="2026-03-14 09:29:46.194116161 +0000 UTC m=+9048.946031345" lastFinishedPulling="2026-03-14 09:29:46.669819185 +0000 UTC m=+9049.421734369" observedRunningTime="2026-03-14 09:29:47.222768384 +0000 UTC m=+9049.974683578" watchObservedRunningTime="2026-03-14 09:29:47.229049883 +0000 UTC m=+9049.980965067" Mar 14 09:29:48 crc kubenswrapper[5129]: I0314 09:29:48.212914 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-49tl9" event={"ID":"72f9865d-1bad-481b-8303-99a656a45ea5","Type":"ContainerStarted","Data":"cd18c9d6447c95c0a87e5819aab35950932aebfe2ba95e1a4f7d68b2eed47f96"} Mar 14 09:29:48 crc kubenswrapper[5129]: I0314 09:29:48.242746 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-49tl9" podStartSLOduration=1.809004802 podStartE2EDuration="2.242718763s" podCreationTimestamp="2026-03-14 09:29:46 +0000 UTC" firstStartedPulling="2026-03-14 09:29:47.17484854 +0000 UTC m=+9049.926763724" lastFinishedPulling="2026-03-14 09:29:47.608562501 +0000 UTC m=+9050.360477685" observedRunningTime="2026-03-14 09:29:48.240167834 +0000 UTC m=+9050.992083018" watchObservedRunningTime="2026-03-14 09:29:48.242718763 +0000 UTC m=+9050.994633957" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.165976 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558010-26868"] Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.168988 5129 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558010-26868" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.171849 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.171965 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.172070 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.180199 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922"] Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.182399 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.186285 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.186419 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.191400 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558010-26868"] Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.236726 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922"] Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.283508 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-b42kg\" (UniqueName: \"kubernetes.io/projected/90ecf9e6-a1ff-4ce4-8e57-931450dd5261-kube-api-access-b42kg\") pod \"auto-csr-approver-29558010-26868\" (UID: \"90ecf9e6-a1ff-4ce4-8e57-931450dd5261\") " pod="openshift-infra/auto-csr-approver-29558010-26868" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.283626 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db-secret-volume\") pod \"collect-profiles-29558010-wv922\" (UID: \"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.283701 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx57n\" (UniqueName: \"kubernetes.io/projected/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db-kube-api-access-gx57n\") pod \"collect-profiles-29558010-wv922\" (UID: \"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.283799 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db-config-volume\") pod \"collect-profiles-29558010-wv922\" (UID: \"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.386374 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db-config-volume\") pod \"collect-profiles-29558010-wv922\" (UID: \"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.386500 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b42kg\" (UniqueName: \"kubernetes.io/projected/90ecf9e6-a1ff-4ce4-8e57-931450dd5261-kube-api-access-b42kg\") pod \"auto-csr-approver-29558010-26868\" (UID: \"90ecf9e6-a1ff-4ce4-8e57-931450dd5261\") " pod="openshift-infra/auto-csr-approver-29558010-26868" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.386554 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db-secret-volume\") pod \"collect-profiles-29558010-wv922\" (UID: \"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.386639 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx57n\" (UniqueName: \"kubernetes.io/projected/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db-kube-api-access-gx57n\") pod \"collect-profiles-29558010-wv922\" (UID: \"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.387383 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db-config-volume\") pod \"collect-profiles-29558010-wv922\" (UID: \"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.393797 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db-secret-volume\") pod \"collect-profiles-29558010-wv922\" (UID: \"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.408623 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx57n\" (UniqueName: \"kubernetes.io/projected/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db-kube-api-access-gx57n\") pod \"collect-profiles-29558010-wv922\" (UID: \"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.411051 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b42kg\" (UniqueName: \"kubernetes.io/projected/90ecf9e6-a1ff-4ce4-8e57-931450dd5261-kube-api-access-b42kg\") pod \"auto-csr-approver-29558010-26868\" (UID: \"90ecf9e6-a1ff-4ce4-8e57-931450dd5261\") " pod="openshift-infra/auto-csr-approver-29558010-26868" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.498526 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558010-26868" Mar 14 09:30:00 crc kubenswrapper[5129]: I0314 09:30:00.514711 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922" Mar 14 09:30:01 crc kubenswrapper[5129]: I0314 09:30:01.001422 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922"] Mar 14 09:30:01 crc kubenswrapper[5129]: I0314 09:30:01.063224 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558010-26868"] Mar 14 09:30:01 crc kubenswrapper[5129]: I0314 09:30:01.340957 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922" event={"ID":"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db","Type":"ContainerStarted","Data":"145a96c7a443b4a53dd82374ba4ed34c2101a2144a6533cd8c8736d71661ac69"} Mar 14 09:30:01 crc kubenswrapper[5129]: I0314 09:30:01.341014 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922" event={"ID":"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db","Type":"ContainerStarted","Data":"9e65d86e7c03244df77e6c1d07f8fbde59b1c33524552cdf563ebe99dfe79bdb"} Mar 14 09:30:01 crc kubenswrapper[5129]: I0314 09:30:01.343456 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558010-26868" event={"ID":"90ecf9e6-a1ff-4ce4-8e57-931450dd5261","Type":"ContainerStarted","Data":"c803ed1c072c1b49c6432a45c55625c8bfd3167bcbdeb4288187c6451f4bb557"} Mar 14 09:30:01 crc kubenswrapper[5129]: I0314 09:30:01.372936 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922" podStartSLOduration=1.372917948 podStartE2EDuration="1.372917948s" podCreationTimestamp="2026-03-14 09:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:30:01.368849608 +0000 UTC m=+9064.120764812" 
watchObservedRunningTime="2026-03-14 09:30:01.372917948 +0000 UTC m=+9064.124833132" Mar 14 09:30:02 crc kubenswrapper[5129]: I0314 09:30:02.354174 5129 generic.go:334] "Generic (PLEG): container finished" podID="736ecdc4-7395-4bb0-b3e5-fc6556e7c8db" containerID="145a96c7a443b4a53dd82374ba4ed34c2101a2144a6533cd8c8736d71661ac69" exitCode=0 Mar 14 09:30:02 crc kubenswrapper[5129]: I0314 09:30:02.354222 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922" event={"ID":"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db","Type":"ContainerDied","Data":"145a96c7a443b4a53dd82374ba4ed34c2101a2144a6533cd8c8736d71661ac69"} Mar 14 09:30:02 crc kubenswrapper[5129]: I0314 09:30:02.358311 5129 generic.go:334] "Generic (PLEG): container finished" podID="e24d1244-561f-4d4a-a744-dd7255702ffb" containerID="81e1fda33f8381a3e8789c450a16ae75107f57055e5460268a7d87432dc93ded" exitCode=0 Mar 14 09:30:02 crc kubenswrapper[5129]: I0314 09:30:02.358392 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-8hjzt" event={"ID":"e24d1244-561f-4d4a-a744-dd7255702ffb","Type":"ContainerDied","Data":"81e1fda33f8381a3e8789c450a16ae75107f57055e5460268a7d87432dc93ded"} Mar 14 09:30:03 crc kubenswrapper[5129]: I0314 09:30:03.368668 5129 generic.go:334] "Generic (PLEG): container finished" podID="90ecf9e6-a1ff-4ce4-8e57-931450dd5261" containerID="d48ab269e23093c0eb28fc66ad6848f5499cc9c45566e6d196a92080f995a12e" exitCode=0 Mar 14 09:30:03 crc kubenswrapper[5129]: I0314 09:30:03.369719 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558010-26868" event={"ID":"90ecf9e6-a1ff-4ce4-8e57-931450dd5261","Type":"ContainerDied","Data":"d48ab269e23093c0eb28fc66ad6848f5499cc9c45566e6d196a92080f995a12e"} Mar 14 09:30:03 crc kubenswrapper[5129]: I0314 09:30:03.721911 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922" Mar 14 09:30:03 crc kubenswrapper[5129]: I0314 09:30:03.838069 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-8hjzt" Mar 14 09:30:03 crc kubenswrapper[5129]: I0314 09:30:03.871592 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx57n\" (UniqueName: \"kubernetes.io/projected/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db-kube-api-access-gx57n\") pod \"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db\" (UID: \"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db\") " Mar 14 09:30:03 crc kubenswrapper[5129]: I0314 09:30:03.871742 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db-secret-volume\") pod \"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db\" (UID: \"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db\") " Mar 14 09:30:03 crc kubenswrapper[5129]: I0314 09:30:03.871931 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db-config-volume\") pod \"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db\" (UID: \"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db\") " Mar 14 09:30:03 crc kubenswrapper[5129]: I0314 09:30:03.873410 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db-config-volume" (OuterVolumeSpecName: "config-volume") pod "736ecdc4-7395-4bb0-b3e5-fc6556e7c8db" (UID: "736ecdc4-7395-4bb0-b3e5-fc6556e7c8db"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:30:03 crc kubenswrapper[5129]: I0314 09:30:03.881201 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db-kube-api-access-gx57n" (OuterVolumeSpecName: "kube-api-access-gx57n") pod "736ecdc4-7395-4bb0-b3e5-fc6556e7c8db" (UID: "736ecdc4-7395-4bb0-b3e5-fc6556e7c8db"). InnerVolumeSpecName "kube-api-access-gx57n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:30:03 crc kubenswrapper[5129]: I0314 09:30:03.886738 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "736ecdc4-7395-4bb0-b3e5-fc6556e7c8db" (UID: "736ecdc4-7395-4bb0-b3e5-fc6556e7c8db"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:30:03 crc kubenswrapper[5129]: I0314 09:30:03.973718 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e24d1244-561f-4d4a-a744-dd7255702ffb-inventory\") pod \"e24d1244-561f-4d4a-a744-dd7255702ffb\" (UID: \"e24d1244-561f-4d4a-a744-dd7255702ffb\") " Mar 14 09:30:03 crc kubenswrapper[5129]: I0314 09:30:03.973827 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x2tx\" (UniqueName: \"kubernetes.io/projected/e24d1244-561f-4d4a-a744-dd7255702ffb-kube-api-access-9x2tx\") pod \"e24d1244-561f-4d4a-a744-dd7255702ffb\" (UID: \"e24d1244-561f-4d4a-a744-dd7255702ffb\") " Mar 14 09:30:03 crc kubenswrapper[5129]: I0314 09:30:03.973871 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e24d1244-561f-4d4a-a744-dd7255702ffb-ssh-key-openstack-networker\") pod \"e24d1244-561f-4d4a-a744-dd7255702ffb\" (UID: 
\"e24d1244-561f-4d4a-a744-dd7255702ffb\") " Mar 14 09:30:03 crc kubenswrapper[5129]: I0314 09:30:03.974433 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx57n\" (UniqueName: \"kubernetes.io/projected/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db-kube-api-access-gx57n\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:03 crc kubenswrapper[5129]: I0314 09:30:03.974447 5129 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:03 crc kubenswrapper[5129]: I0314 09:30:03.974457 5129 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:03 crc kubenswrapper[5129]: I0314 09:30:03.981512 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e24d1244-561f-4d4a-a744-dd7255702ffb-kube-api-access-9x2tx" (OuterVolumeSpecName: "kube-api-access-9x2tx") pod "e24d1244-561f-4d4a-a744-dd7255702ffb" (UID: "e24d1244-561f-4d4a-a744-dd7255702ffb"). InnerVolumeSpecName "kube-api-access-9x2tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.011955 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e24d1244-561f-4d4a-a744-dd7255702ffb-inventory" (OuterVolumeSpecName: "inventory") pod "e24d1244-561f-4d4a-a744-dd7255702ffb" (UID: "e24d1244-561f-4d4a-a744-dd7255702ffb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.027468 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e24d1244-561f-4d4a-a744-dd7255702ffb-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "e24d1244-561f-4d4a-a744-dd7255702ffb" (UID: "e24d1244-561f-4d4a-a744-dd7255702ffb"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.076594 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e24d1244-561f-4d4a-a744-dd7255702ffb-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.076702 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e24d1244-561f-4d4a-a744-dd7255702ffb-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.076713 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x2tx\" (UniqueName: \"kubernetes.io/projected/e24d1244-561f-4d4a-a744-dd7255702ffb-kube-api-access-9x2tx\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.379708 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-8hjzt" event={"ID":"e24d1244-561f-4d4a-a744-dd7255702ffb","Type":"ContainerDied","Data":"527faecb21883bfad8f2d8750e7a1c6246a626e7d78c4a82ee96c8d49958c8da"} Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.379759 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="527faecb21883bfad8f2d8750e7a1c6246a626e7d78c4a82ee96c8d49958c8da" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.379765 5129 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-8hjzt" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.381362 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.381358 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922" event={"ID":"736ecdc4-7395-4bb0-b3e5-fc6556e7c8db","Type":"ContainerDied","Data":"9e65d86e7c03244df77e6c1d07f8fbde59b1c33524552cdf563ebe99dfe79bdb"} Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.381482 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e65d86e7c03244df77e6c1d07f8fbde59b1c33524552cdf563ebe99dfe79bdb" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.383043 5129 generic.go:334] "Generic (PLEG): container finished" podID="72f9865d-1bad-481b-8303-99a656a45ea5" containerID="cd18c9d6447c95c0a87e5819aab35950932aebfe2ba95e1a4f7d68b2eed47f96" exitCode=0 Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.383130 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-49tl9" event={"ID":"72f9865d-1bad-481b-8303-99a656a45ea5","Type":"ContainerDied","Data":"cd18c9d6447c95c0a87e5819aab35950932aebfe2ba95e1a4f7d68b2eed47f96"} Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.510418 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-networker-74szz"] Mar 14 09:30:04 crc kubenswrapper[5129]: E0314 09:30:04.510961 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24d1244-561f-4d4a-a744-dd7255702ffb" containerName="reboot-os-openstack-openstack-networker" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.510984 5129 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e24d1244-561f-4d4a-a744-dd7255702ffb" containerName="reboot-os-openstack-openstack-networker" Mar 14 09:30:04 crc kubenswrapper[5129]: E0314 09:30:04.511011 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736ecdc4-7395-4bb0-b3e5-fc6556e7c8db" containerName="collect-profiles" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.511019 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="736ecdc4-7395-4bb0-b3e5-fc6556e7c8db" containerName="collect-profiles" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.511297 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="736ecdc4-7395-4bb0-b3e5-fc6556e7c8db" containerName="collect-profiles" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.511359 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e24d1244-561f-4d4a-a744-dd7255702ffb" containerName="reboot-os-openstack-openstack-networker" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.512286 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.522714 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-neutron-metadata-default-certs-0" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.522756 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-q642x" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.522709 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-ovn-default-certs-0" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.529694 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-networker-74szz"] Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.588308 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.588409 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djbbw\" (UniqueName: \"kubernetes.io/projected/330ed053-e545-4c73-987a-0289f46396ff-kube-api-access-djbbw\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.588469 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/330ed053-e545-4c73-987a-0289f46396ff-openstack-networker-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.588516 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.588558 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.588642 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-ssh-key-openstack-networker\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.588684 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-inventory\") pod 
\"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.588760 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/330ed053-e545-4c73-987a-0289f46396ff-openstack-networker-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.691174 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.691292 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djbbw\" (UniqueName: \"kubernetes.io/projected/330ed053-e545-4c73-987a-0289f46396ff-kube-api-access-djbbw\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.691375 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/330ed053-e545-4c73-987a-0289f46396ff-openstack-networker-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " 
pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.691628 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.691687 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.691764 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-ssh-key-openstack-networker\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.691809 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-inventory\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.691920 5129 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"openstack-networker-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/330ed053-e545-4c73-987a-0289f46396ff-openstack-networker-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.697688 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.698177 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-networker-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/330ed053-e545-4c73-987a-0289f46396ff-openstack-networker-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.698265 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-ssh-key-openstack-networker\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.698538 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-ovn-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.699416 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-networker-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/330ed053-e545-4c73-987a-0289f46396ff-openstack-networker-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.700472 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.702485 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-inventory\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.711267 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djbbw\" (UniqueName: \"kubernetes.io/projected/330ed053-e545-4c73-987a-0289f46396ff-kube-api-access-djbbw\") pod \"install-certs-openstack-openstack-networker-74szz\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc 
kubenswrapper[5129]: I0314 09:30:04.806817 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558010-26868" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.817462 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9"] Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.829769 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557965-g7ld9"] Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.841011 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.894726 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b42kg\" (UniqueName: \"kubernetes.io/projected/90ecf9e6-a1ff-4ce4-8e57-931450dd5261-kube-api-access-b42kg\") pod \"90ecf9e6-a1ff-4ce4-8e57-931450dd5261\" (UID: \"90ecf9e6-a1ff-4ce4-8e57-931450dd5261\") " Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.900011 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90ecf9e6-a1ff-4ce4-8e57-931450dd5261-kube-api-access-b42kg" (OuterVolumeSpecName: "kube-api-access-b42kg") pod "90ecf9e6-a1ff-4ce4-8e57-931450dd5261" (UID: "90ecf9e6-a1ff-4ce4-8e57-931450dd5261"). InnerVolumeSpecName "kube-api-access-b42kg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:30:04 crc kubenswrapper[5129]: I0314 09:30:04.997613 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b42kg\" (UniqueName: \"kubernetes.io/projected/90ecf9e6-a1ff-4ce4-8e57-931450dd5261-kube-api-access-b42kg\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:05 crc kubenswrapper[5129]: I0314 09:30:05.393705 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558010-26868" Mar 14 09:30:05 crc kubenswrapper[5129]: I0314 09:30:05.398730 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558010-26868" event={"ID":"90ecf9e6-a1ff-4ce4-8e57-931450dd5261","Type":"ContainerDied","Data":"c803ed1c072c1b49c6432a45c55625c8bfd3167bcbdeb4288187c6451f4bb557"} Mar 14 09:30:05 crc kubenswrapper[5129]: I0314 09:30:05.398776 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c803ed1c072c1b49c6432a45c55625c8bfd3167bcbdeb4288187c6451f4bb557" Mar 14 09:30:05 crc kubenswrapper[5129]: I0314 09:30:05.429909 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-networker-74szz"] Mar 14 09:30:05 crc kubenswrapper[5129]: W0314 09:30:05.432271 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod330ed053_e545_4c73_987a_0289f46396ff.slice/crio-1ff3c7c962b7f9e78dae82498589583aa862b888a7905b805c312c67aeabb512 WatchSource:0}: Error finding container 1ff3c7c962b7f9e78dae82498589583aa862b888a7905b805c312c67aeabb512: Status 404 returned error can't find the container with id 1ff3c7c962b7f9e78dae82498589583aa862b888a7905b805c312c67aeabb512 Mar 14 09:30:05 crc kubenswrapper[5129]: I0314 09:30:05.883091 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558004-tt9qw"] Mar 14 09:30:05 crc 
kubenswrapper[5129]: I0314 09:30:05.892425 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558004-tt9qw"] Mar 14 09:30:05 crc kubenswrapper[5129]: I0314 09:30:05.894446 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.026465 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2vvx\" (UniqueName: \"kubernetes.io/projected/72f9865d-1bad-481b-8303-99a656a45ea5-kube-api-access-z2vvx\") pod \"72f9865d-1bad-481b-8303-99a656a45ea5\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.026545 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-ssh-key-openstack-networker\") pod \"72f9865d-1bad-481b-8303-99a656a45ea5\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.026664 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-inventory-1\") pod \"72f9865d-1bad-481b-8303-99a656a45ea5\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.026774 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-ssh-key-openstack-cell1\") pod \"72f9865d-1bad-481b-8303-99a656a45ea5\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.026814 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: 
\"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-inventory-0\") pod \"72f9865d-1bad-481b-8303-99a656a45ea5\" (UID: \"72f9865d-1bad-481b-8303-99a656a45ea5\") " Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.039378 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72f9865d-1bad-481b-8303-99a656a45ea5-kube-api-access-z2vvx" (OuterVolumeSpecName: "kube-api-access-z2vvx") pod "72f9865d-1bad-481b-8303-99a656a45ea5" (UID: "72f9865d-1bad-481b-8303-99a656a45ea5"). InnerVolumeSpecName "kube-api-access-z2vvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.062090 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210287bb-cc01-4243-ac52-f907bb970aa4" path="/var/lib/kubelet/pods/210287bb-cc01-4243-ac52-f907bb970aa4/volumes" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.063368 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f09ccbf1-7724-4b26-bc21-2822d2ea8eaa" path="/var/lib/kubelet/pods/f09ccbf1-7724-4b26-bc21-2822d2ea8eaa/volumes" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.066025 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "72f9865d-1bad-481b-8303-99a656a45ea5" (UID: "72f9865d-1bad-481b-8303-99a656a45ea5"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.075929 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "72f9865d-1bad-481b-8303-99a656a45ea5" (UID: "72f9865d-1bad-481b-8303-99a656a45ea5"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.083784 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "72f9865d-1bad-481b-8303-99a656a45ea5" (UID: "72f9865d-1bad-481b-8303-99a656a45ea5"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.088373 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-inventory-1" (OuterVolumeSpecName: "inventory-1") pod "72f9865d-1bad-481b-8303-99a656a45ea5" (UID: "72f9865d-1bad-481b-8303-99a656a45ea5"). InnerVolumeSpecName "inventory-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.130289 5129 reconciler_common.go:293] "Volume detached for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-inventory-1\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.130332 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.130342 5129 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.130354 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2vvx\" (UniqueName: 
\"kubernetes.io/projected/72f9865d-1bad-481b-8303-99a656a45ea5-kube-api-access-z2vvx\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.130367 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/72f9865d-1bad-481b-8303-99a656a45ea5-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.404444 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-49tl9" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.404431 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-49tl9" event={"ID":"72f9865d-1bad-481b-8303-99a656a45ea5","Type":"ContainerDied","Data":"7d4b300d376c1c20a7a1e489faeb2006747fe17e3860e17b2675f3c8dffb59a3"} Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.404839 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d4b300d376c1c20a7a1e489faeb2006747fe17e3860e17b2675f3c8dffb59a3" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.406075 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-74szz" event={"ID":"330ed053-e545-4c73-987a-0289f46396ff","Type":"ContainerStarted","Data":"1ff3c7c962b7f9e78dae82498589583aa862b888a7905b805c312c67aeabb512"} Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.507232 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fdmgm"] Mar 14 09:30:06 crc kubenswrapper[5129]: E0314 09:30:06.507718 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90ecf9e6-a1ff-4ce4-8e57-931450dd5261" containerName="oc" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.507739 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="90ecf9e6-a1ff-4ce4-8e57-931450dd5261" 
containerName="oc" Mar 14 09:30:06 crc kubenswrapper[5129]: E0314 09:30:06.507754 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72f9865d-1bad-481b-8303-99a656a45ea5" containerName="ssh-known-hosts-openstack" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.507760 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="72f9865d-1bad-481b-8303-99a656a45ea5" containerName="ssh-known-hosts-openstack" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.507919 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="90ecf9e6-a1ff-4ce4-8e57-931450dd5261" containerName="oc" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.507966 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="72f9865d-1bad-481b-8303-99a656a45ea5" containerName="ssh-known-hosts-openstack" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.508682 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fdmgm" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.511152 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.511510 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.526431 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fdmgm"] Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.641216 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d4f670da-3d07-4a74-a292-c7359810c516-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-fdmgm\" (UID: \"d4f670da-3d07-4a74-a292-c7359810c516\") " 
pod="openstack/run-os-openstack-openstack-cell1-fdmgm" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.641386 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj2bx\" (UniqueName: \"kubernetes.io/projected/d4f670da-3d07-4a74-a292-c7359810c516-kube-api-access-fj2bx\") pod \"run-os-openstack-openstack-cell1-fdmgm\" (UID: \"d4f670da-3d07-4a74-a292-c7359810c516\") " pod="openstack/run-os-openstack-openstack-cell1-fdmgm" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.641427 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4f670da-3d07-4a74-a292-c7359810c516-inventory\") pod \"run-os-openstack-openstack-cell1-fdmgm\" (UID: \"d4f670da-3d07-4a74-a292-c7359810c516\") " pod="openstack/run-os-openstack-openstack-cell1-fdmgm" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.744460 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj2bx\" (UniqueName: \"kubernetes.io/projected/d4f670da-3d07-4a74-a292-c7359810c516-kube-api-access-fj2bx\") pod \"run-os-openstack-openstack-cell1-fdmgm\" (UID: \"d4f670da-3d07-4a74-a292-c7359810c516\") " pod="openstack/run-os-openstack-openstack-cell1-fdmgm" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.744524 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4f670da-3d07-4a74-a292-c7359810c516-inventory\") pod \"run-os-openstack-openstack-cell1-fdmgm\" (UID: \"d4f670da-3d07-4a74-a292-c7359810c516\") " pod="openstack/run-os-openstack-openstack-cell1-fdmgm" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.744764 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/d4f670da-3d07-4a74-a292-c7359810c516-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-fdmgm\" (UID: \"d4f670da-3d07-4a74-a292-c7359810c516\") " pod="openstack/run-os-openstack-openstack-cell1-fdmgm" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.751762 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d4f670da-3d07-4a74-a292-c7359810c516-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-fdmgm\" (UID: \"d4f670da-3d07-4a74-a292-c7359810c516\") " pod="openstack/run-os-openstack-openstack-cell1-fdmgm" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.752109 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4f670da-3d07-4a74-a292-c7359810c516-inventory\") pod \"run-os-openstack-openstack-cell1-fdmgm\" (UID: \"d4f670da-3d07-4a74-a292-c7359810c516\") " pod="openstack/run-os-openstack-openstack-cell1-fdmgm" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.766104 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj2bx\" (UniqueName: \"kubernetes.io/projected/d4f670da-3d07-4a74-a292-c7359810c516-kube-api-access-fj2bx\") pod \"run-os-openstack-openstack-cell1-fdmgm\" (UID: \"d4f670da-3d07-4a74-a292-c7359810c516\") " pod="openstack/run-os-openstack-openstack-cell1-fdmgm" Mar 14 09:30:06 crc kubenswrapper[5129]: I0314 09:30:06.879494 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fdmgm" Mar 14 09:30:07 crc kubenswrapper[5129]: I0314 09:30:07.434290 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-74szz" event={"ID":"330ed053-e545-4c73-987a-0289f46396ff","Type":"ContainerStarted","Data":"62c176aca546a4e10517520ca0031d9f7668a3e7d9d503ab1e0e8e3aa3687d5b"} Mar 14 09:30:07 crc kubenswrapper[5129]: I0314 09:30:07.468094 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-networker-74szz" podStartSLOduration=1.920034291 podStartE2EDuration="3.468072487s" podCreationTimestamp="2026-03-14 09:30:04 +0000 UTC" firstStartedPulling="2026-03-14 09:30:05.438833658 +0000 UTC m=+9068.190748842" lastFinishedPulling="2026-03-14 09:30:06.986871854 +0000 UTC m=+9069.738787038" observedRunningTime="2026-03-14 09:30:07.460627466 +0000 UTC m=+9070.212542650" watchObservedRunningTime="2026-03-14 09:30:07.468072487 +0000 UTC m=+9070.219987671" Mar 14 09:30:07 crc kubenswrapper[5129]: W0314 09:30:07.488467 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4f670da_3d07_4a74_a292_c7359810c516.slice/crio-18b49c0b6a1898660841da41ddacfa5b70fef9cc1ee12c7f63642e0b97cbdbf2 WatchSource:0}: Error finding container 18b49c0b6a1898660841da41ddacfa5b70fef9cc1ee12c7f63642e0b97cbdbf2: Status 404 returned error can't find the container with id 18b49c0b6a1898660841da41ddacfa5b70fef9cc1ee12c7f63642e0b97cbdbf2 Mar 14 09:30:07 crc kubenswrapper[5129]: I0314 09:30:07.488522 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fdmgm"] Mar 14 09:30:08 crc kubenswrapper[5129]: I0314 09:30:08.444785 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fdmgm" 
event={"ID":"d4f670da-3d07-4a74-a292-c7359810c516","Type":"ContainerStarted","Data":"fbf5f57d7b41ff21de1ec87cbab35c25c91e942f44be0d0136ace8e145c72df9"} Mar 14 09:30:08 crc kubenswrapper[5129]: I0314 09:30:08.446725 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fdmgm" event={"ID":"d4f670da-3d07-4a74-a292-c7359810c516","Type":"ContainerStarted","Data":"18b49c0b6a1898660841da41ddacfa5b70fef9cc1ee12c7f63642e0b97cbdbf2"} Mar 14 09:30:20 crc kubenswrapper[5129]: I0314 09:30:20.579151 5129 generic.go:334] "Generic (PLEG): container finished" podID="d4f670da-3d07-4a74-a292-c7359810c516" containerID="fbf5f57d7b41ff21de1ec87cbab35c25c91e942f44be0d0136ace8e145c72df9" exitCode=0 Mar 14 09:30:20 crc kubenswrapper[5129]: I0314 09:30:20.580146 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fdmgm" event={"ID":"d4f670da-3d07-4a74-a292-c7359810c516","Type":"ContainerDied","Data":"fbf5f57d7b41ff21de1ec87cbab35c25c91e942f44be0d0136ace8e145c72df9"} Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.443722 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fdmgm" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.530705 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d4f670da-3d07-4a74-a292-c7359810c516-ssh-key-openstack-cell1\") pod \"d4f670da-3d07-4a74-a292-c7359810c516\" (UID: \"d4f670da-3d07-4a74-a292-c7359810c516\") " Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.530818 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4f670da-3d07-4a74-a292-c7359810c516-inventory\") pod \"d4f670da-3d07-4a74-a292-c7359810c516\" (UID: \"d4f670da-3d07-4a74-a292-c7359810c516\") " Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.530974 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj2bx\" (UniqueName: \"kubernetes.io/projected/d4f670da-3d07-4a74-a292-c7359810c516-kube-api-access-fj2bx\") pod \"d4f670da-3d07-4a74-a292-c7359810c516\" (UID: \"d4f670da-3d07-4a74-a292-c7359810c516\") " Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.603395 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fdmgm" event={"ID":"d4f670da-3d07-4a74-a292-c7359810c516","Type":"ContainerDied","Data":"18b49c0b6a1898660841da41ddacfa5b70fef9cc1ee12c7f63642e0b97cbdbf2"} Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.603454 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18b49c0b6a1898660841da41ddacfa5b70fef9cc1ee12c7f63642e0b97cbdbf2" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.603475 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fdmgm" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.615114 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4f670da-3d07-4a74-a292-c7359810c516-kube-api-access-fj2bx" (OuterVolumeSpecName: "kube-api-access-fj2bx") pod "d4f670da-3d07-4a74-a292-c7359810c516" (UID: "d4f670da-3d07-4a74-a292-c7359810c516"). InnerVolumeSpecName "kube-api-access-fj2bx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.621787 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4f670da-3d07-4a74-a292-c7359810c516-inventory" (OuterVolumeSpecName: "inventory") pod "d4f670da-3d07-4a74-a292-c7359810c516" (UID: "d4f670da-3d07-4a74-a292-c7359810c516"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.622052 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4f670da-3d07-4a74-a292-c7359810c516-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d4f670da-3d07-4a74-a292-c7359810c516" (UID: "d4f670da-3d07-4a74-a292-c7359810c516"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.632842 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj2bx\" (UniqueName: \"kubernetes.io/projected/d4f670da-3d07-4a74-a292-c7359810c516-kube-api-access-fj2bx\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.632885 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d4f670da-3d07-4a74-a292-c7359810c516-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.632899 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4f670da-3d07-4a74-a292-c7359810c516-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.728391 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-qkvqk"] Mar 14 09:30:22 crc kubenswrapper[5129]: E0314 09:30:22.728978 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4f670da-3d07-4a74-a292-c7359810c516" containerName="run-os-openstack-openstack-cell1" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.728997 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4f670da-3d07-4a74-a292-c7359810c516" containerName="run-os-openstack-openstack-cell1" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.729255 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4f670da-3d07-4a74-a292-c7359810c516" containerName="run-os-openstack-openstack-cell1" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.730182 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-qkvqk" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.734331 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml9tk\" (UniqueName: \"kubernetes.io/projected/dd1ca20d-4a40-4708-a05d-645ce7f315a2-kube-api-access-ml9tk\") pod \"reboot-os-openstack-openstack-cell1-qkvqk\" (UID: \"dd1ca20d-4a40-4708-a05d-645ce7f315a2\") " pod="openstack/reboot-os-openstack-openstack-cell1-qkvqk" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.734422 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd1ca20d-4a40-4708-a05d-645ce7f315a2-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-qkvqk\" (UID: \"dd1ca20d-4a40-4708-a05d-645ce7f315a2\") " pod="openstack/reboot-os-openstack-openstack-cell1-qkvqk" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.734450 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd1ca20d-4a40-4708-a05d-645ce7f315a2-inventory\") pod \"reboot-os-openstack-openstack-cell1-qkvqk\" (UID: \"dd1ca20d-4a40-4708-a05d-645ce7f315a2\") " pod="openstack/reboot-os-openstack-openstack-cell1-qkvqk" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.749999 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-qkvqk"] Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.836992 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml9tk\" (UniqueName: \"kubernetes.io/projected/dd1ca20d-4a40-4708-a05d-645ce7f315a2-kube-api-access-ml9tk\") pod \"reboot-os-openstack-openstack-cell1-qkvqk\" (UID: \"dd1ca20d-4a40-4708-a05d-645ce7f315a2\") " 
pod="openstack/reboot-os-openstack-openstack-cell1-qkvqk" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.837134 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd1ca20d-4a40-4708-a05d-645ce7f315a2-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-qkvqk\" (UID: \"dd1ca20d-4a40-4708-a05d-645ce7f315a2\") " pod="openstack/reboot-os-openstack-openstack-cell1-qkvqk" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.837174 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd1ca20d-4a40-4708-a05d-645ce7f315a2-inventory\") pod \"reboot-os-openstack-openstack-cell1-qkvqk\" (UID: \"dd1ca20d-4a40-4708-a05d-645ce7f315a2\") " pod="openstack/reboot-os-openstack-openstack-cell1-qkvqk" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.841386 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd1ca20d-4a40-4708-a05d-645ce7f315a2-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-qkvqk\" (UID: \"dd1ca20d-4a40-4708-a05d-645ce7f315a2\") " pod="openstack/reboot-os-openstack-openstack-cell1-qkvqk" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.841563 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd1ca20d-4a40-4708-a05d-645ce7f315a2-inventory\") pod \"reboot-os-openstack-openstack-cell1-qkvqk\" (UID: \"dd1ca20d-4a40-4708-a05d-645ce7f315a2\") " pod="openstack/reboot-os-openstack-openstack-cell1-qkvqk" Mar 14 09:30:22 crc kubenswrapper[5129]: I0314 09:30:22.862180 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml9tk\" (UniqueName: \"kubernetes.io/projected/dd1ca20d-4a40-4708-a05d-645ce7f315a2-kube-api-access-ml9tk\") pod 
\"reboot-os-openstack-openstack-cell1-qkvqk\" (UID: \"dd1ca20d-4a40-4708-a05d-645ce7f315a2\") " pod="openstack/reboot-os-openstack-openstack-cell1-qkvqk" Mar 14 09:30:23 crc kubenswrapper[5129]: I0314 09:30:23.046834 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-qkvqk" Mar 14 09:30:23 crc kubenswrapper[5129]: I0314 09:30:23.611536 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-qkvqk"] Mar 14 09:30:24 crc kubenswrapper[5129]: I0314 09:30:24.634045 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-qkvqk" event={"ID":"dd1ca20d-4a40-4708-a05d-645ce7f315a2","Type":"ContainerStarted","Data":"c676b3c8aff222d29daf42c57a57c21a2b82711663595e88bacee064991b04c0"} Mar 14 09:30:25 crc kubenswrapper[5129]: I0314 09:30:25.683955 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-qkvqk" event={"ID":"dd1ca20d-4a40-4708-a05d-645ce7f315a2","Type":"ContainerStarted","Data":"70f2f768c07767c8d3bf62b6f02f3615b541637b5acd58c12ea054a37aa21d7d"} Mar 14 09:30:25 crc kubenswrapper[5129]: I0314 09:30:25.706629 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-qkvqk" podStartSLOduration=3.211746545 podStartE2EDuration="3.706585716s" podCreationTimestamp="2026-03-14 09:30:22 +0000 UTC" firstStartedPulling="2026-03-14 09:30:23.624944843 +0000 UTC m=+9086.376860027" lastFinishedPulling="2026-03-14 09:30:24.119783994 +0000 UTC m=+9086.871699198" observedRunningTime="2026-03-14 09:30:25.704980264 +0000 UTC m=+9088.456895448" watchObservedRunningTime="2026-03-14 09:30:25.706585716 +0000 UTC m=+9088.458500900" Mar 14 09:30:30 crc kubenswrapper[5129]: I0314 09:30:30.750795 5129 generic.go:334] "Generic (PLEG): container finished" podID="330ed053-e545-4c73-987a-0289f46396ff" 
containerID="62c176aca546a4e10517520ca0031d9f7668a3e7d9d503ab1e0e8e3aa3687d5b" exitCode=0 Mar 14 09:30:30 crc kubenswrapper[5129]: I0314 09:30:30.750885 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-74szz" event={"ID":"330ed053-e545-4c73-987a-0289f46396ff","Type":"ContainerDied","Data":"62c176aca546a4e10517520ca0031d9f7668a3e7d9d503ab1e0e8e3aa3687d5b"} Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.193913 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.351415 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djbbw\" (UniqueName: \"kubernetes.io/projected/330ed053-e545-4c73-987a-0289f46396ff-kube-api-access-djbbw\") pod \"330ed053-e545-4c73-987a-0289f46396ff\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.351478 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-neutron-metadata-combined-ca-bundle\") pod \"330ed053-e545-4c73-987a-0289f46396ff\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.351511 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-networker-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/330ed053-e545-4c73-987a-0289f46396ff-openstack-networker-ovn-default-certs-0\") pod \"330ed053-e545-4c73-987a-0289f46396ff\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.351602 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: 
\"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-ssh-key-openstack-networker\") pod \"330ed053-e545-4c73-987a-0289f46396ff\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.351660 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-bootstrap-combined-ca-bundle\") pod \"330ed053-e545-4c73-987a-0289f46396ff\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.351793 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-inventory\") pod \"330ed053-e545-4c73-987a-0289f46396ff\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.351864 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-networker-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/330ed053-e545-4c73-987a-0289f46396ff-openstack-networker-neutron-metadata-default-certs-0\") pod \"330ed053-e545-4c73-987a-0289f46396ff\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.351890 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-ovn-combined-ca-bundle\") pod \"330ed053-e545-4c73-987a-0289f46396ff\" (UID: \"330ed053-e545-4c73-987a-0289f46396ff\") " Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.357385 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod 
"330ed053-e545-4c73-987a-0289f46396ff" (UID: "330ed053-e545-4c73-987a-0289f46396ff"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.357506 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330ed053-e545-4c73-987a-0289f46396ff-kube-api-access-djbbw" (OuterVolumeSpecName: "kube-api-access-djbbw") pod "330ed053-e545-4c73-987a-0289f46396ff" (UID: "330ed053-e545-4c73-987a-0289f46396ff"). InnerVolumeSpecName "kube-api-access-djbbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.357615 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330ed053-e545-4c73-987a-0289f46396ff-openstack-networker-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-networker-ovn-default-certs-0") pod "330ed053-e545-4c73-987a-0289f46396ff" (UID: "330ed053-e545-4c73-987a-0289f46396ff"). InnerVolumeSpecName "openstack-networker-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.358726 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "330ed053-e545-4c73-987a-0289f46396ff" (UID: "330ed053-e545-4c73-987a-0289f46396ff"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.364334 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "330ed053-e545-4c73-987a-0289f46396ff" (UID: "330ed053-e545-4c73-987a-0289f46396ff"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.369058 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330ed053-e545-4c73-987a-0289f46396ff-openstack-networker-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-networker-neutron-metadata-default-certs-0") pod "330ed053-e545-4c73-987a-0289f46396ff" (UID: "330ed053-e545-4c73-987a-0289f46396ff"). InnerVolumeSpecName "openstack-networker-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.380930 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "330ed053-e545-4c73-987a-0289f46396ff" (UID: "330ed053-e545-4c73-987a-0289f46396ff"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.394936 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-inventory" (OuterVolumeSpecName: "inventory") pod "330ed053-e545-4c73-987a-0289f46396ff" (UID: "330ed053-e545-4c73-987a-0289f46396ff"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.454501 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djbbw\" (UniqueName: \"kubernetes.io/projected/330ed053-e545-4c73-987a-0289f46396ff-kube-api-access-djbbw\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.454543 5129 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.454557 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-networker-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/330ed053-e545-4c73-987a-0289f46396ff-openstack-networker-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.454567 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.454576 5129 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.454590 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.454604 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-networker-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/330ed053-e545-4c73-987a-0289f46396ff-openstack-networker-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.454616 5129 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/330ed053-e545-4c73-987a-0289f46396ff-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.781307 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-74szz" event={"ID":"330ed053-e545-4c73-987a-0289f46396ff","Type":"ContainerDied","Data":"1ff3c7c962b7f9e78dae82498589583aa862b888a7905b805c312c67aeabb512"} Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.781724 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ff3c7c962b7f9e78dae82498589583aa862b888a7905b805c312c67aeabb512" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.781581 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-74szz" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.846795 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-networker-2fl4w"] Mar 14 09:30:32 crc kubenswrapper[5129]: E0314 09:30:32.847595 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330ed053-e545-4c73-987a-0289f46396ff" containerName="install-certs-openstack-openstack-networker" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.847641 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="330ed053-e545-4c73-987a-0289f46396ff" containerName="install-certs-openstack-openstack-networker" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.847985 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="330ed053-e545-4c73-987a-0289f46396ff" containerName="install-certs-openstack-openstack-networker" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.849322 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.852308 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.852578 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.864680 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-networker-2fl4w"] Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.864757 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cgwh\" (UniqueName: \"kubernetes.io/projected/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-kube-api-access-9cgwh\") pod \"ovn-openstack-openstack-networker-2fl4w\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.864835 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-ssh-key-openstack-networker\") pod \"ovn-openstack-openstack-networker-2fl4w\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.864898 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-2fl4w\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 
09:30:32.864963 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-inventory\") pod \"ovn-openstack-openstack-networker-2fl4w\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.865008 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-2fl4w\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.868987 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-q642x" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.966528 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-2fl4w\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.966632 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cgwh\" (UniqueName: \"kubernetes.io/projected/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-kube-api-access-9cgwh\") pod \"ovn-openstack-openstack-networker-2fl4w\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.966699 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-ssh-key-openstack-networker\") pod \"ovn-openstack-openstack-networker-2fl4w\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.966724 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-2fl4w\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.966781 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-inventory\") pod \"ovn-openstack-openstack-networker-2fl4w\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.967480 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-2fl4w\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.970459 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-ssh-key-openstack-networker\") pod \"ovn-openstack-openstack-networker-2fl4w\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:30:32 crc kubenswrapper[5129]: 
I0314 09:30:32.971279 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-2fl4w\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.971318 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-inventory\") pod \"ovn-openstack-openstack-networker-2fl4w\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:30:32 crc kubenswrapper[5129]: I0314 09:30:32.997543 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cgwh\" (UniqueName: \"kubernetes.io/projected/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-kube-api-access-9cgwh\") pod \"ovn-openstack-openstack-networker-2fl4w\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:30:33 crc kubenswrapper[5129]: I0314 09:30:33.181399 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:30:33 crc kubenswrapper[5129]: I0314 09:30:33.779397 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-networker-2fl4w"] Mar 14 09:30:33 crc kubenswrapper[5129]: I0314 09:30:33.790340 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-2fl4w" event={"ID":"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1","Type":"ContainerStarted","Data":"40efd332f70e1d917e20ad1f80bddb1fee80c93e616602a693dc375324f1f8e9"} Mar 14 09:30:35 crc kubenswrapper[5129]: I0314 09:30:35.598351 5129 scope.go:117] "RemoveContainer" containerID="3b4f9c4100a2d7748487245e7c1632f84d707d6d12c708883f11dfb9f537a6d9" Mar 14 09:30:35 crc kubenswrapper[5129]: I0314 09:30:35.646572 5129 scope.go:117] "RemoveContainer" containerID="590608f3b22bf63f9e7dc0645f708b7ccee925ba750fb564f457a6440badb487" Mar 14 09:30:35 crc kubenswrapper[5129]: I0314 09:30:35.820348 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-2fl4w" event={"ID":"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1","Type":"ContainerStarted","Data":"c589927f01986320657af82d0bb0de255073ebdb5d8b174d994ff303584cc82a"} Mar 14 09:30:35 crc kubenswrapper[5129]: I0314 09:30:35.881081 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-networker-2fl4w" podStartSLOduration=3.33966233 podStartE2EDuration="3.881054267s" podCreationTimestamp="2026-03-14 09:30:32 +0000 UTC" firstStartedPulling="2026-03-14 09:30:33.788016365 +0000 UTC m=+9096.539931539" lastFinishedPulling="2026-03-14 09:30:34.329408292 +0000 UTC m=+9097.081323476" observedRunningTime="2026-03-14 09:30:35.853326098 +0000 UTC m=+9098.605241292" watchObservedRunningTime="2026-03-14 09:30:35.881054267 +0000 UTC m=+9098.632969451" Mar 14 09:30:44 crc kubenswrapper[5129]: I0314 09:30:44.919693 5129 generic.go:334] "Generic 
(PLEG): container finished" podID="dd1ca20d-4a40-4708-a05d-645ce7f315a2" containerID="70f2f768c07767c8d3bf62b6f02f3615b541637b5acd58c12ea054a37aa21d7d" exitCode=0 Mar 14 09:30:44 crc kubenswrapper[5129]: I0314 09:30:44.919817 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-qkvqk" event={"ID":"dd1ca20d-4a40-4708-a05d-645ce7f315a2","Type":"ContainerDied","Data":"70f2f768c07767c8d3bf62b6f02f3615b541637b5acd58c12ea054a37aa21d7d"} Mar 14 09:30:46 crc kubenswrapper[5129]: I0314 09:30:46.441797 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-qkvqk" Mar 14 09:30:46 crc kubenswrapper[5129]: I0314 09:30:46.586964 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml9tk\" (UniqueName: \"kubernetes.io/projected/dd1ca20d-4a40-4708-a05d-645ce7f315a2-kube-api-access-ml9tk\") pod \"dd1ca20d-4a40-4708-a05d-645ce7f315a2\" (UID: \"dd1ca20d-4a40-4708-a05d-645ce7f315a2\") " Mar 14 09:30:46 crc kubenswrapper[5129]: I0314 09:30:46.587108 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd1ca20d-4a40-4708-a05d-645ce7f315a2-ssh-key-openstack-cell1\") pod \"dd1ca20d-4a40-4708-a05d-645ce7f315a2\" (UID: \"dd1ca20d-4a40-4708-a05d-645ce7f315a2\") " Mar 14 09:30:46 crc kubenswrapper[5129]: I0314 09:30:46.587259 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd1ca20d-4a40-4708-a05d-645ce7f315a2-inventory\") pod \"dd1ca20d-4a40-4708-a05d-645ce7f315a2\" (UID: \"dd1ca20d-4a40-4708-a05d-645ce7f315a2\") " Mar 14 09:30:46 crc kubenswrapper[5129]: I0314 09:30:46.605853 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd1ca20d-4a40-4708-a05d-645ce7f315a2-kube-api-access-ml9tk" 
(OuterVolumeSpecName: "kube-api-access-ml9tk") pod "dd1ca20d-4a40-4708-a05d-645ce7f315a2" (UID: "dd1ca20d-4a40-4708-a05d-645ce7f315a2"). InnerVolumeSpecName "kube-api-access-ml9tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:30:46 crc kubenswrapper[5129]: I0314 09:30:46.674931 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1ca20d-4a40-4708-a05d-645ce7f315a2-inventory" (OuterVolumeSpecName: "inventory") pod "dd1ca20d-4a40-4708-a05d-645ce7f315a2" (UID: "dd1ca20d-4a40-4708-a05d-645ce7f315a2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:30:46 crc kubenswrapper[5129]: I0314 09:30:46.685965 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1ca20d-4a40-4708-a05d-645ce7f315a2-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "dd1ca20d-4a40-4708-a05d-645ce7f315a2" (UID: "dd1ca20d-4a40-4708-a05d-645ce7f315a2"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:30:46 crc kubenswrapper[5129]: I0314 09:30:46.689987 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml9tk\" (UniqueName: \"kubernetes.io/projected/dd1ca20d-4a40-4708-a05d-645ce7f315a2-kube-api-access-ml9tk\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:46 crc kubenswrapper[5129]: I0314 09:30:46.690030 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd1ca20d-4a40-4708-a05d-645ce7f315a2-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:46 crc kubenswrapper[5129]: I0314 09:30:46.690046 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd1ca20d-4a40-4708-a05d-645ce7f315a2-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:46 crc kubenswrapper[5129]: I0314 09:30:46.979263 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-qkvqk" event={"ID":"dd1ca20d-4a40-4708-a05d-645ce7f315a2","Type":"ContainerDied","Data":"c676b3c8aff222d29daf42c57a57c21a2b82711663595e88bacee064991b04c0"} Mar 14 09:30:46 crc kubenswrapper[5129]: I0314 09:30:46.979338 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c676b3c8aff222d29daf42c57a57c21a2b82711663595e88bacee064991b04c0" Mar 14 09:30:46 crc kubenswrapper[5129]: I0314 09:30:46.979363 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-qkvqk" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.048772 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-pvr9b"] Mar 14 09:30:47 crc kubenswrapper[5129]: E0314 09:30:47.049669 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1ca20d-4a40-4708-a05d-645ce7f315a2" containerName="reboot-os-openstack-openstack-cell1" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.049699 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1ca20d-4a40-4708-a05d-645ce7f315a2" containerName="reboot-os-openstack-openstack-cell1" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.050153 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1ca20d-4a40-4708-a05d-645ce7f315a2" containerName="reboot-os-openstack-openstack-cell1" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.051434 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-pvr9b" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.057563 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-ovn-default-certs-0" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.057851 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-telemetry-default-certs-0" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.069751 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-libvirt-default-certs-0" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.070230 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.070431 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.071302 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-neutron-metadata-default-certs-0" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.078095 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-pvr9b"] Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.200558 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.200642 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.200957 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.201027 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.201201 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.201286 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.201351 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.201416 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.201444 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-inventory\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b" Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.201523 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-neutron-metadata-default-certs-0\") pod 
\"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.201667 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwfmf\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-kube-api-access-fwfmf\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.201782 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.201815 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.201833 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.201854 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.304154 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.304233 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.304310 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.304344 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.304399 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.304421 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.304449 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.304473 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.304500 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-inventory\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.304528 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.304560 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwfmf\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-kube-api-access-fwfmf\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.304594 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.304627 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.304647 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.304670 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.309858 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.310616 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.310716 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.311578 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.313206 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.313493 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.313776 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.314594 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.315500 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.315580 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-inventory\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.323110 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.327649 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwfmf\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-kube-api-access-fwfmf\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.327658 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.331111 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.332022 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-pvr9b\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.374457 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:30:47 crc kubenswrapper[5129]: I0314 09:30:47.934819 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-pvr9b"]
Mar 14 09:30:49 crc kubenswrapper[5129]: I0314 09:30:49.002494 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-pvr9b" event={"ID":"7829da62-c2ff-4358-9ed3-147321c2292c","Type":"ContainerStarted","Data":"bd8e07a920cc9edef0486e93612d31c1bbfda395715e3b54ead9840e8d67004b"}
Mar 14 09:30:50 crc kubenswrapper[5129]: I0314 09:30:50.017109 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-pvr9b" event={"ID":"7829da62-c2ff-4358-9ed3-147321c2292c","Type":"ContainerStarted","Data":"a21ff6534ec0f10d3a4e253d1b0f9cb0aeef49e22ad984d8d029126fdf14563d"}
Mar 14 09:30:50 crc kubenswrapper[5129]: I0314 09:30:50.042947 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-pvr9b" podStartSLOduration=2.544338896 podStartE2EDuration="3.042927568s" podCreationTimestamp="2026-03-14 09:30:47 +0000 UTC" firstStartedPulling="2026-03-14 09:30:48.532901208 +0000 UTC m=+9111.284816392" lastFinishedPulling="2026-03-14 09:30:49.03148988 +0000 UTC m=+9111.783405064" observedRunningTime="2026-03-14 09:30:50.041387976 +0000 UTC m=+9112.793303180" watchObservedRunningTime="2026-03-14 09:30:50.042927568 +0000 UTC m=+9112.794842752"
Mar 14 09:31:11 crc kubenswrapper[5129]: I0314 09:31:11.169339 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sxs2g"]
Mar 14 09:31:11 crc kubenswrapper[5129]: I0314 09:31:11.173548 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxs2g"
Mar 14 09:31:11 crc kubenswrapper[5129]: I0314 09:31:11.196764 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sxs2g"]
Mar 14 09:31:11 crc kubenswrapper[5129]: I0314 09:31:11.271890 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sjd6\" (UniqueName: \"kubernetes.io/projected/c01e228d-1686-48db-8cee-a4e5bc5cdf29-kube-api-access-9sjd6\") pod \"community-operators-sxs2g\" (UID: \"c01e228d-1686-48db-8cee-a4e5bc5cdf29\") " pod="openshift-marketplace/community-operators-sxs2g"
Mar 14 09:31:11 crc kubenswrapper[5129]: I0314 09:31:11.271978 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c01e228d-1686-48db-8cee-a4e5bc5cdf29-utilities\") pod \"community-operators-sxs2g\" (UID: \"c01e228d-1686-48db-8cee-a4e5bc5cdf29\") " pod="openshift-marketplace/community-operators-sxs2g"
Mar 14 09:31:11 crc kubenswrapper[5129]: I0314 09:31:11.272111 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c01e228d-1686-48db-8cee-a4e5bc5cdf29-catalog-content\") pod \"community-operators-sxs2g\" (UID: \"c01e228d-1686-48db-8cee-a4e5bc5cdf29\") " pod="openshift-marketplace/community-operators-sxs2g"
Mar 14 09:31:11 crc kubenswrapper[5129]: I0314 09:31:11.374507 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sjd6\" (UniqueName: \"kubernetes.io/projected/c01e228d-1686-48db-8cee-a4e5bc5cdf29-kube-api-access-9sjd6\") pod \"community-operators-sxs2g\" (UID: \"c01e228d-1686-48db-8cee-a4e5bc5cdf29\") " pod="openshift-marketplace/community-operators-sxs2g"
Mar 14 09:31:11 crc kubenswrapper[5129]: I0314 09:31:11.374589 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c01e228d-1686-48db-8cee-a4e5bc5cdf29-utilities\") pod \"community-operators-sxs2g\" (UID: \"c01e228d-1686-48db-8cee-a4e5bc5cdf29\") " pod="openshift-marketplace/community-operators-sxs2g"
Mar 14 09:31:11 crc kubenswrapper[5129]: I0314 09:31:11.374738 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c01e228d-1686-48db-8cee-a4e5bc5cdf29-catalog-content\") pod \"community-operators-sxs2g\" (UID: \"c01e228d-1686-48db-8cee-a4e5bc5cdf29\") " pod="openshift-marketplace/community-operators-sxs2g"
Mar 14 09:31:11 crc kubenswrapper[5129]: I0314 09:31:11.375249 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c01e228d-1686-48db-8cee-a4e5bc5cdf29-utilities\") pod \"community-operators-sxs2g\" (UID: \"c01e228d-1686-48db-8cee-a4e5bc5cdf29\") " pod="openshift-marketplace/community-operators-sxs2g"
Mar 14 09:31:11 crc kubenswrapper[5129]: I0314 09:31:11.375264 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c01e228d-1686-48db-8cee-a4e5bc5cdf29-catalog-content\") pod \"community-operators-sxs2g\" (UID: \"c01e228d-1686-48db-8cee-a4e5bc5cdf29\") " pod="openshift-marketplace/community-operators-sxs2g"
Mar 14 09:31:11 crc kubenswrapper[5129]: I0314 09:31:11.395844 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sjd6\" (UniqueName: \"kubernetes.io/projected/c01e228d-1686-48db-8cee-a4e5bc5cdf29-kube-api-access-9sjd6\") pod \"community-operators-sxs2g\" (UID: \"c01e228d-1686-48db-8cee-a4e5bc5cdf29\") " pod="openshift-marketplace/community-operators-sxs2g"
Mar 14 09:31:11 crc kubenswrapper[5129]: I0314 09:31:11.503728 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxs2g"
Mar 14 09:31:12 crc kubenswrapper[5129]: I0314 09:31:12.114508 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sxs2g"]
Mar 14 09:31:12 crc kubenswrapper[5129]: W0314 09:31:12.119945 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc01e228d_1686_48db_8cee_a4e5bc5cdf29.slice/crio-a50ede5dabe7d812534f836b3dfc61f73b44b894f74008e62353975361edc6fd WatchSource:0}: Error finding container a50ede5dabe7d812534f836b3dfc61f73b44b894f74008e62353975361edc6fd: Status 404 returned error can't find the container with id a50ede5dabe7d812534f836b3dfc61f73b44b894f74008e62353975361edc6fd
Mar 14 09:31:12 crc kubenswrapper[5129]: I0314 09:31:12.241643 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxs2g" event={"ID":"c01e228d-1686-48db-8cee-a4e5bc5cdf29","Type":"ContainerStarted","Data":"a50ede5dabe7d812534f836b3dfc61f73b44b894f74008e62353975361edc6fd"}
Mar 14 09:31:13 crc kubenswrapper[5129]: I0314 09:31:13.253526 5129 generic.go:334] "Generic (PLEG): container finished" podID="c01e228d-1686-48db-8cee-a4e5bc5cdf29" containerID="1d51e2ea5b96b7e922fb1b4dc9f6682fba538a867cb67e63bed91c1460f41b47" exitCode=0
Mar 14 09:31:13 crc kubenswrapper[5129]: I0314 09:31:13.253654 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxs2g" event={"ID":"c01e228d-1686-48db-8cee-a4e5bc5cdf29","Type":"ContainerDied","Data":"1d51e2ea5b96b7e922fb1b4dc9f6682fba538a867cb67e63bed91c1460f41b47"}
Mar 14 09:31:14 crc kubenswrapper[5129]: I0314 09:31:14.264728 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxs2g" event={"ID":"c01e228d-1686-48db-8cee-a4e5bc5cdf29","Type":"ContainerStarted","Data":"38d12ba82ca712daa258325f12ccf8bfd4c4ff7ba7eaa270136f6df2a52d7149"}
Mar 14 09:31:16 crc kubenswrapper[5129]: I0314 09:31:16.307350 5129 generic.go:334] "Generic (PLEG): container finished" podID="c01e228d-1686-48db-8cee-a4e5bc5cdf29" containerID="38d12ba82ca712daa258325f12ccf8bfd4c4ff7ba7eaa270136f6df2a52d7149" exitCode=0
Mar 14 09:31:16 crc kubenswrapper[5129]: I0314 09:31:16.307456 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxs2g" event={"ID":"c01e228d-1686-48db-8cee-a4e5bc5cdf29","Type":"ContainerDied","Data":"38d12ba82ca712daa258325f12ccf8bfd4c4ff7ba7eaa270136f6df2a52d7149"}
Mar 14 09:31:18 crc kubenswrapper[5129]: I0314 09:31:18.326799 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxs2g" event={"ID":"c01e228d-1686-48db-8cee-a4e5bc5cdf29","Type":"ContainerStarted","Data":"b69d5e0d33f6a17d9b9cea07a08d31a67facf5783be69a0ec1c8bb7c102ee062"}
Mar 14 09:31:18 crc kubenswrapper[5129]: I0314 09:31:18.346330 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sxs2g" podStartSLOduration=3.760305218 podStartE2EDuration="7.34631027s" podCreationTimestamp="2026-03-14 09:31:11 +0000 UTC" firstStartedPulling="2026-03-14 09:31:13.255841747 +0000 UTC m=+9136.007756931" lastFinishedPulling="2026-03-14 09:31:16.841846799 +0000 UTC m=+9139.593761983" observedRunningTime="2026-03-14 09:31:18.343515654 +0000 UTC m=+9141.095430838" watchObservedRunningTime="2026-03-14 09:31:18.34631027 +0000 UTC m=+9141.098225454"
Mar 14 09:31:19 crc kubenswrapper[5129]: I0314 09:31:19.574843 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 09:31:19 crc kubenswrapper[5129]: I0314 09:31:19.575210 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 09:31:21 crc kubenswrapper[5129]: I0314 09:31:21.504912 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sxs2g"
Mar 14 09:31:21 crc kubenswrapper[5129]: I0314 09:31:21.505332 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sxs2g"
Mar 14 09:31:21 crc kubenswrapper[5129]: I0314 09:31:21.561407 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sxs2g"
Mar 14 09:31:22 crc kubenswrapper[5129]: I0314 09:31:22.411696 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sxs2g"
Mar 14 09:31:22 crc kubenswrapper[5129]: I0314 09:31:22.463777 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sxs2g"]
Mar 14 09:31:24 crc kubenswrapper[5129]: I0314 09:31:24.384671 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sxs2g" podUID="c01e228d-1686-48db-8cee-a4e5bc5cdf29" containerName="registry-server" containerID="cri-o://b69d5e0d33f6a17d9b9cea07a08d31a67facf5783be69a0ec1c8bb7c102ee062" gracePeriod=2
Mar 14 09:31:24 crc kubenswrapper[5129]: I0314 09:31:24.905300 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxs2g"
Mar 14 09:31:24 crc kubenswrapper[5129]: I0314 09:31:24.953701 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c01e228d-1686-48db-8cee-a4e5bc5cdf29-catalog-content\") pod \"c01e228d-1686-48db-8cee-a4e5bc5cdf29\" (UID: \"c01e228d-1686-48db-8cee-a4e5bc5cdf29\") "
Mar 14 09:31:24 crc kubenswrapper[5129]: I0314 09:31:24.953853 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sjd6\" (UniqueName: \"kubernetes.io/projected/c01e228d-1686-48db-8cee-a4e5bc5cdf29-kube-api-access-9sjd6\") pod \"c01e228d-1686-48db-8cee-a4e5bc5cdf29\" (UID: \"c01e228d-1686-48db-8cee-a4e5bc5cdf29\") "
Mar 14 09:31:24 crc kubenswrapper[5129]: I0314 09:31:24.953990 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c01e228d-1686-48db-8cee-a4e5bc5cdf29-utilities\") pod \"c01e228d-1686-48db-8cee-a4e5bc5cdf29\" (UID: \"c01e228d-1686-48db-8cee-a4e5bc5cdf29\") "
Mar 14 09:31:24 crc kubenswrapper[5129]: I0314 09:31:24.955428 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c01e228d-1686-48db-8cee-a4e5bc5cdf29-utilities" (OuterVolumeSpecName: "utilities") pod "c01e228d-1686-48db-8cee-a4e5bc5cdf29" (UID: "c01e228d-1686-48db-8cee-a4e5bc5cdf29"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 09:31:24 crc kubenswrapper[5129]: I0314 09:31:24.963685 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c01e228d-1686-48db-8cee-a4e5bc5cdf29-kube-api-access-9sjd6" (OuterVolumeSpecName: "kube-api-access-9sjd6") pod "c01e228d-1686-48db-8cee-a4e5bc5cdf29" (UID: "c01e228d-1686-48db-8cee-a4e5bc5cdf29"). InnerVolumeSpecName "kube-api-access-9sjd6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.008066 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c01e228d-1686-48db-8cee-a4e5bc5cdf29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c01e228d-1686-48db-8cee-a4e5bc5cdf29" (UID: "c01e228d-1686-48db-8cee-a4e5bc5cdf29"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.056728 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c01e228d-1686-48db-8cee-a4e5bc5cdf29-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.056782 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sjd6\" (UniqueName: \"kubernetes.io/projected/c01e228d-1686-48db-8cee-a4e5bc5cdf29-kube-api-access-9sjd6\") on node \"crc\" DevicePath \"\""
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.056798 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c01e228d-1686-48db-8cee-a4e5bc5cdf29-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.395719 5129 generic.go:334] "Generic (PLEG): container finished" podID="c01e228d-1686-48db-8cee-a4e5bc5cdf29" containerID="b69d5e0d33f6a17d9b9cea07a08d31a67facf5783be69a0ec1c8bb7c102ee062" exitCode=0
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.395764 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxs2g" event={"ID":"c01e228d-1686-48db-8cee-a4e5bc5cdf29","Type":"ContainerDied","Data":"b69d5e0d33f6a17d9b9cea07a08d31a67facf5783be69a0ec1c8bb7c102ee062"}
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.395788 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxs2g"
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.395807 5129 scope.go:117] "RemoveContainer" containerID="b69d5e0d33f6a17d9b9cea07a08d31a67facf5783be69a0ec1c8bb7c102ee062"
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.395793 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxs2g" event={"ID":"c01e228d-1686-48db-8cee-a4e5bc5cdf29","Type":"ContainerDied","Data":"a50ede5dabe7d812534f836b3dfc61f73b44b894f74008e62353975361edc6fd"}
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.425572 5129 scope.go:117] "RemoveContainer" containerID="38d12ba82ca712daa258325f12ccf8bfd4c4ff7ba7eaa270136f6df2a52d7149"
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.440006 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sxs2g"]
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.452750 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sxs2g"]
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.454312 5129 scope.go:117] "RemoveContainer" containerID="1d51e2ea5b96b7e922fb1b4dc9f6682fba538a867cb67e63bed91c1460f41b47"
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.518844 5129 scope.go:117] "RemoveContainer" containerID="b69d5e0d33f6a17d9b9cea07a08d31a67facf5783be69a0ec1c8bb7c102ee062"
Mar 14 09:31:25 crc kubenswrapper[5129]: E0314 09:31:25.519313 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b69d5e0d33f6a17d9b9cea07a08d31a67facf5783be69a0ec1c8bb7c102ee062\": container with ID starting with b69d5e0d33f6a17d9b9cea07a08d31a67facf5783be69a0ec1c8bb7c102ee062 not found: ID does not exist" containerID="b69d5e0d33f6a17d9b9cea07a08d31a67facf5783be69a0ec1c8bb7c102ee062"
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.519362 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b69d5e0d33f6a17d9b9cea07a08d31a67facf5783be69a0ec1c8bb7c102ee062"} err="failed to get container status \"b69d5e0d33f6a17d9b9cea07a08d31a67facf5783be69a0ec1c8bb7c102ee062\": rpc error: code = NotFound desc = could not find container \"b69d5e0d33f6a17d9b9cea07a08d31a67facf5783be69a0ec1c8bb7c102ee062\": container with ID starting with b69d5e0d33f6a17d9b9cea07a08d31a67facf5783be69a0ec1c8bb7c102ee062 not found: ID does not exist"
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.519387 5129 scope.go:117] "RemoveContainer" containerID="38d12ba82ca712daa258325f12ccf8bfd4c4ff7ba7eaa270136f6df2a52d7149"
Mar 14 09:31:25 crc kubenswrapper[5129]: E0314 09:31:25.520096 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38d12ba82ca712daa258325f12ccf8bfd4c4ff7ba7eaa270136f6df2a52d7149\": container with ID starting with 38d12ba82ca712daa258325f12ccf8bfd4c4ff7ba7eaa270136f6df2a52d7149 not found: ID does not exist" containerID="38d12ba82ca712daa258325f12ccf8bfd4c4ff7ba7eaa270136f6df2a52d7149"
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.520149 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38d12ba82ca712daa258325f12ccf8bfd4c4ff7ba7eaa270136f6df2a52d7149"} err="failed to get container status \"38d12ba82ca712daa258325f12ccf8bfd4c4ff7ba7eaa270136f6df2a52d7149\": rpc error: code = NotFound desc = could not find container \"38d12ba82ca712daa258325f12ccf8bfd4c4ff7ba7eaa270136f6df2a52d7149\": container with ID starting with 38d12ba82ca712daa258325f12ccf8bfd4c4ff7ba7eaa270136f6df2a52d7149 not found: ID does not exist"
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.520177 5129 scope.go:117] "RemoveContainer" containerID="1d51e2ea5b96b7e922fb1b4dc9f6682fba538a867cb67e63bed91c1460f41b47"
Mar 14 09:31:25 crc kubenswrapper[5129]: E0314 09:31:25.520749 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d51e2ea5b96b7e922fb1b4dc9f6682fba538a867cb67e63bed91c1460f41b47\": container with ID starting with 1d51e2ea5b96b7e922fb1b4dc9f6682fba538a867cb67e63bed91c1460f41b47 not found: ID does not exist" containerID="1d51e2ea5b96b7e922fb1b4dc9f6682fba538a867cb67e63bed91c1460f41b47"
Mar 14 09:31:25 crc kubenswrapper[5129]: I0314 09:31:25.520769 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d51e2ea5b96b7e922fb1b4dc9f6682fba538a867cb67e63bed91c1460f41b47"} err="failed to get container status \"1d51e2ea5b96b7e922fb1b4dc9f6682fba538a867cb67e63bed91c1460f41b47\": rpc error: code = NotFound desc = could not find container \"1d51e2ea5b96b7e922fb1b4dc9f6682fba538a867cb67e63bed91c1460f41b47\": container with ID starting with 1d51e2ea5b96b7e922fb1b4dc9f6682fba538a867cb67e63bed91c1460f41b47 not found: ID does not exist"
Mar 14 09:31:26 crc kubenswrapper[5129]: I0314 09:31:26.054639 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c01e228d-1686-48db-8cee-a4e5bc5cdf29" path="/var/lib/kubelet/pods/c01e228d-1686-48db-8cee-a4e5bc5cdf29/volumes"
Mar 14 09:31:37 crc kubenswrapper[5129]: I0314 09:31:37.519032 5129 generic.go:334] "Generic (PLEG): container finished" podID="7829da62-c2ff-4358-9ed3-147321c2292c" containerID="a21ff6534ec0f10d3a4e253d1b0f9cb0aeef49e22ad984d8d029126fdf14563d" exitCode=0
Mar 14 09:31:37 crc kubenswrapper[5129]: I0314 09:31:37.519098 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-pvr9b" event={"ID":"7829da62-c2ff-4358-9ed3-147321c2292c","Type":"ContainerDied","Data":"a21ff6534ec0f10d3a4e253d1b0f9cb0aeef49e22ad984d8d029126fdf14563d"}
Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.037082 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-pvr9b"
Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.079200 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwfmf\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-kube-api-access-fwfmf\") pod \"7829da62-c2ff-4358-9ed3-147321c2292c\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") "
Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.079760 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-nova-combined-ca-bundle\") pod \"7829da62-c2ff-4358-9ed3-147321c2292c\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") "
Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.079883 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-neutron-dhcp-combined-ca-bundle\") pod \"7829da62-c2ff-4358-9ed3-147321c2292c\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") "
Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.080020 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-neutron-sriov-combined-ca-bundle\") pod \"7829da62-c2ff-4358-9ed3-147321c2292c\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") "
Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.080157 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-telemetry-default-certs-0\") pod \"7829da62-c2ff-4358-9ed3-147321c2292c\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") "
Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.080338 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-ovn-default-certs-0\") pod \"7829da62-c2ff-4358-9ed3-147321c2292c\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") "
Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.080472 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-libvirt-combined-ca-bundle\") pod \"7829da62-c2ff-4358-9ed3-147321c2292c\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") "
Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.080741 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-neutron-metadata-combined-ca-bundle\") pod \"7829da62-c2ff-4358-9ed3-147321c2292c\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") "
Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.081090 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-inventory\") pod \"7829da62-c2ff-4358-9ed3-147321c2292c\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") "
Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.081488 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-ssh-key-openstack-cell1\") pod \"7829da62-c2ff-4358-9ed3-147321c2292c\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") "
Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.081692 5129 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-ovn-combined-ca-bundle\") pod \"7829da62-c2ff-4358-9ed3-147321c2292c\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.081797 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-bootstrap-combined-ca-bundle\") pod \"7829da62-c2ff-4358-9ed3-147321c2292c\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.081942 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-telemetry-combined-ca-bundle\") pod \"7829da62-c2ff-4358-9ed3-147321c2292c\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.082042 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-neutron-metadata-default-certs-0\") pod \"7829da62-c2ff-4358-9ed3-147321c2292c\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.082200 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-libvirt-default-certs-0\") pod \"7829da62-c2ff-4358-9ed3-147321c2292c\" (UID: \"7829da62-c2ff-4358-9ed3-147321c2292c\") " Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.090901 5129 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7829da62-c2ff-4358-9ed3-147321c2292c" (UID: "7829da62-c2ff-4358-9ed3-147321c2292c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.091055 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "7829da62-c2ff-4358-9ed3-147321c2292c" (UID: "7829da62-c2ff-4358-9ed3-147321c2292c"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.091078 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-telemetry-default-certs-0") pod "7829da62-c2ff-4358-9ed3-147321c2292c" (UID: "7829da62-c2ff-4358-9ed3-147321c2292c"). InnerVolumeSpecName "openstack-cell1-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.091134 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7829da62-c2ff-4358-9ed3-147321c2292c" (UID: "7829da62-c2ff-4358-9ed3-147321c2292c"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.092502 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-kube-api-access-fwfmf" (OuterVolumeSpecName: "kube-api-access-fwfmf") pod "7829da62-c2ff-4358-9ed3-147321c2292c" (UID: "7829da62-c2ff-4358-9ed3-147321c2292c"). InnerVolumeSpecName "kube-api-access-fwfmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.093752 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7829da62-c2ff-4358-9ed3-147321c2292c" (UID: "7829da62-c2ff-4358-9ed3-147321c2292c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.106007 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "7829da62-c2ff-4358-9ed3-147321c2292c" (UID: "7829da62-c2ff-4358-9ed3-147321c2292c"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.106069 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-ovn-default-certs-0") pod "7829da62-c2ff-4358-9ed3-147321c2292c" (UID: "7829da62-c2ff-4358-9ed3-147321c2292c"). InnerVolumeSpecName "openstack-cell1-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.106159 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7829da62-c2ff-4358-9ed3-147321c2292c" (UID: "7829da62-c2ff-4358-9ed3-147321c2292c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.106163 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7829da62-c2ff-4358-9ed3-147321c2292c" (UID: "7829da62-c2ff-4358-9ed3-147321c2292c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.106220 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7829da62-c2ff-4358-9ed3-147321c2292c" (UID: "7829da62-c2ff-4358-9ed3-147321c2292c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.106197 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-neutron-metadata-default-certs-0") pod "7829da62-c2ff-4358-9ed3-147321c2292c" (UID: "7829da62-c2ff-4358-9ed3-147321c2292c"). InnerVolumeSpecName "openstack-cell1-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.108597 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-libvirt-default-certs-0") pod "7829da62-c2ff-4358-9ed3-147321c2292c" (UID: "7829da62-c2ff-4358-9ed3-147321c2292c"). InnerVolumeSpecName "openstack-cell1-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.129356 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "7829da62-c2ff-4358-9ed3-147321c2292c" (UID: "7829da62-c2ff-4358-9ed3-147321c2292c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.129938 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-inventory" (OuterVolumeSpecName: "inventory") pod "7829da62-c2ff-4358-9ed3-147321c2292c" (UID: "7829da62-c2ff-4358-9ed3-147321c2292c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.186754 5129 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.186808 5129 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.186820 5129 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.186834 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.186848 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.186861 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwfmf\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-kube-api-access-fwfmf\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.186874 5129 reconciler_common.go:293] "Volume detached for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.186888 5129 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.186900 5129 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.186911 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.186923 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7829da62-c2ff-4358-9ed3-147321c2292c-openstack-cell1-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.186938 5129 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.186949 5129 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 
09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.186962 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.186973 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7829da62-c2ff-4358-9ed3-147321c2292c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.542912 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-pvr9b" event={"ID":"7829da62-c2ff-4358-9ed3-147321c2292c","Type":"ContainerDied","Data":"bd8e07a920cc9edef0486e93612d31c1bbfda395715e3b54ead9840e8d67004b"} Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.543326 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd8e07a920cc9edef0486e93612d31c1bbfda395715e3b54ead9840e8d67004b" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.543009 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-pvr9b" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.685897 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-9g2lm"] Mar 14 09:31:39 crc kubenswrapper[5129]: E0314 09:31:39.686375 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c01e228d-1686-48db-8cee-a4e5bc5cdf29" containerName="extract-utilities" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.686393 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c01e228d-1686-48db-8cee-a4e5bc5cdf29" containerName="extract-utilities" Mar 14 09:31:39 crc kubenswrapper[5129]: E0314 09:31:39.686415 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7829da62-c2ff-4358-9ed3-147321c2292c" containerName="install-certs-openstack-openstack-cell1" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.686422 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7829da62-c2ff-4358-9ed3-147321c2292c" containerName="install-certs-openstack-openstack-cell1" Mar 14 09:31:39 crc kubenswrapper[5129]: E0314 09:31:39.686438 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c01e228d-1686-48db-8cee-a4e5bc5cdf29" containerName="registry-server" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.686444 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c01e228d-1686-48db-8cee-a4e5bc5cdf29" containerName="registry-server" Mar 14 09:31:39 crc kubenswrapper[5129]: E0314 09:31:39.686471 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c01e228d-1686-48db-8cee-a4e5bc5cdf29" containerName="extract-content" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.686477 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c01e228d-1686-48db-8cee-a4e5bc5cdf29" containerName="extract-content" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.686744 5129 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c01e228d-1686-48db-8cee-a4e5bc5cdf29" containerName="registry-server" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.686771 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="7829da62-c2ff-4358-9ed3-147321c2292c" containerName="install-certs-openstack-openstack-cell1" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.687490 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.693308 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.696273 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.697075 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-9g2lm"] Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.798644 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70bcb47f-9330-44d8-8527-f73c8066eac0-inventory\") pod \"ovn-openstack-openstack-cell1-9g2lm\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.798703 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/70bcb47f-9330-44d8-8527-f73c8066eac0-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-9g2lm\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.799048 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/70bcb47f-9330-44d8-8527-f73c8066eac0-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-9g2lm\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.799140 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcg7q\" (UniqueName: \"kubernetes.io/projected/70bcb47f-9330-44d8-8527-f73c8066eac0-kube-api-access-mcg7q\") pod \"ovn-openstack-openstack-cell1-9g2lm\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.799195 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70bcb47f-9330-44d8-8527-f73c8066eac0-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-9g2lm\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.900837 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70bcb47f-9330-44d8-8527-f73c8066eac0-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-9g2lm\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.900977 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70bcb47f-9330-44d8-8527-f73c8066eac0-inventory\") pod \"ovn-openstack-openstack-cell1-9g2lm\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " 
pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.901010 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/70bcb47f-9330-44d8-8527-f73c8066eac0-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-9g2lm\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.901102 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/70bcb47f-9330-44d8-8527-f73c8066eac0-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-9g2lm\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.901132 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcg7q\" (UniqueName: \"kubernetes.io/projected/70bcb47f-9330-44d8-8527-f73c8066eac0-kube-api-access-mcg7q\") pod \"ovn-openstack-openstack-cell1-9g2lm\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.902230 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/70bcb47f-9330-44d8-8527-f73c8066eac0-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-9g2lm\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.905292 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70bcb47f-9330-44d8-8527-f73c8066eac0-inventory\") pod \"ovn-openstack-openstack-cell1-9g2lm\" 
(UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.906909 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70bcb47f-9330-44d8-8527-f73c8066eac0-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-9g2lm\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.907035 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/70bcb47f-9330-44d8-8527-f73c8066eac0-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-9g2lm\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:31:39 crc kubenswrapper[5129]: I0314 09:31:39.918589 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcg7q\" (UniqueName: \"kubernetes.io/projected/70bcb47f-9330-44d8-8527-f73c8066eac0-kube-api-access-mcg7q\") pod \"ovn-openstack-openstack-cell1-9g2lm\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:31:40 crc kubenswrapper[5129]: I0314 09:31:40.018966 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:31:40 crc kubenswrapper[5129]: I0314 09:31:40.553254 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-9g2lm"] Mar 14 09:31:41 crc kubenswrapper[5129]: I0314 09:31:41.563391 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-9g2lm" event={"ID":"70bcb47f-9330-44d8-8527-f73c8066eac0","Type":"ContainerStarted","Data":"90eb34016ec801a75e39e4005ab0b300b3702edd0f541ed0406c33fc9794eeda"} Mar 14 09:31:42 crc kubenswrapper[5129]: I0314 09:31:42.577628 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-9g2lm" event={"ID":"70bcb47f-9330-44d8-8527-f73c8066eac0","Type":"ContainerStarted","Data":"eef79a580ba9653816d1824a0fba29c82d58d0c677c3f379cf943f3a75d6ee54"} Mar 14 09:31:42 crc kubenswrapper[5129]: I0314 09:31:42.605447 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-9g2lm" podStartSLOduration=3.113049584 podStartE2EDuration="3.605417447s" podCreationTimestamp="2026-03-14 09:31:39 +0000 UTC" firstStartedPulling="2026-03-14 09:31:41.028226153 +0000 UTC m=+9163.780141347" lastFinishedPulling="2026-03-14 09:31:41.520594026 +0000 UTC m=+9164.272509210" observedRunningTime="2026-03-14 09:31:42.596468416 +0000 UTC m=+9165.348383610" watchObservedRunningTime="2026-03-14 09:31:42.605417447 +0000 UTC m=+9165.357332631" Mar 14 09:31:49 crc kubenswrapper[5129]: I0314 09:31:49.574181 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:31:49 crc kubenswrapper[5129]: I0314 09:31:49.574721 5129 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:31:52 crc kubenswrapper[5129]: I0314 09:31:52.697199 5129 generic.go:334] "Generic (PLEG): container finished" podID="98b9e1f1-5d2f-44b8-b30c-9ec2958abac1" containerID="c589927f01986320657af82d0bb0de255073ebdb5d8b174d994ff303584cc82a" exitCode=0 Mar 14 09:31:52 crc kubenswrapper[5129]: I0314 09:31:52.697264 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-2fl4w" event={"ID":"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1","Type":"ContainerDied","Data":"c589927f01986320657af82d0bb0de255073ebdb5d8b174d994ff303584cc82a"} Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.194381 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.255779 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-ovncontroller-config-0\") pod \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.256352 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-ssh-key-openstack-networker\") pod \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.256467 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cgwh\" (UniqueName: 
\"kubernetes.io/projected/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-kube-api-access-9cgwh\") pod \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.256544 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-inventory\") pod \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.256694 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-ovn-combined-ca-bundle\") pod \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\" (UID: \"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1\") " Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.264952 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "98b9e1f1-5d2f-44b8-b30c-9ec2958abac1" (UID: "98b9e1f1-5d2f-44b8-b30c-9ec2958abac1"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.265210 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-kube-api-access-9cgwh" (OuterVolumeSpecName: "kube-api-access-9cgwh") pod "98b9e1f1-5d2f-44b8-b30c-9ec2958abac1" (UID: "98b9e1f1-5d2f-44b8-b30c-9ec2958abac1"). InnerVolumeSpecName "kube-api-access-9cgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.297728 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "98b9e1f1-5d2f-44b8-b30c-9ec2958abac1" (UID: "98b9e1f1-5d2f-44b8-b30c-9ec2958abac1"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.299135 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-inventory" (OuterVolumeSpecName: "inventory") pod "98b9e1f1-5d2f-44b8-b30c-9ec2958abac1" (UID: "98b9e1f1-5d2f-44b8-b30c-9ec2958abac1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.300831 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "98b9e1f1-5d2f-44b8-b30c-9ec2958abac1" (UID: "98b9e1f1-5d2f-44b8-b30c-9ec2958abac1"). InnerVolumeSpecName "ssh-key-openstack-networker". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.361434 5129 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.361512 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.361534 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cgwh\" (UniqueName: \"kubernetes.io/projected/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-kube-api-access-9cgwh\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.361554 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.361576 5129 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98b9e1f1-5d2f-44b8-b30c-9ec2958abac1-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.733428 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-2fl4w" event={"ID":"98b9e1f1-5d2f-44b8-b30c-9ec2958abac1","Type":"ContainerDied","Data":"40efd332f70e1d917e20ad1f80bddb1fee80c93e616602a693dc375324f1f8e9"} Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.733502 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40efd332f70e1d917e20ad1f80bddb1fee80c93e616602a693dc375324f1f8e9" Mar 14 
09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.733968 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-2fl4w" Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.982410 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-hpflm"] Mar 14 09:31:54 crc kubenswrapper[5129]: E0314 09:31:54.982890 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98b9e1f1-5d2f-44b8-b30c-9ec2958abac1" containerName="ovn-openstack-openstack-networker" Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.982917 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="98b9e1f1-5d2f-44b8-b30c-9ec2958abac1" containerName="ovn-openstack-openstack-networker" Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.983168 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="98b9e1f1-5d2f-44b8-b30c-9ec2958abac1" containerName="ovn-openstack-openstack-networker" Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.984008 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.987622 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.988255 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 14 09:31:54 crc kubenswrapper[5129]: I0314 09:31:54.988767 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-q642x" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.001692 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.019869 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-hpflm"] Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.078798 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xngv\" (UniqueName: \"kubernetes.io/projected/62dd10c7-1cda-42f7-891e-6ec9740de425-kube-api-access-7xngv\") pod \"neutron-metadata-openstack-openstack-networker-hpflm\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.078857 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-hpflm\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:55 crc kubenswrapper[5129]: 
I0314 09:31:55.079021 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-inventory\") pod \"neutron-metadata-openstack-openstack-networker-hpflm\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.079097 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-ssh-key-openstack-networker\") pod \"neutron-metadata-openstack-openstack-networker-hpflm\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.079127 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-hpflm\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.079156 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-hpflm\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.181632 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-inventory\") pod \"neutron-metadata-openstack-openstack-networker-hpflm\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.181771 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-ssh-key-openstack-networker\") pod \"neutron-metadata-openstack-openstack-networker-hpflm\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.181823 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-hpflm\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.181856 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-hpflm\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.182878 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xngv\" (UniqueName: 
\"kubernetes.io/projected/62dd10c7-1cda-42f7-891e-6ec9740de425-kube-api-access-7xngv\") pod \"neutron-metadata-openstack-openstack-networker-hpflm\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.183403 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-hpflm\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.718170 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-hpflm\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.718475 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-hpflm\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.718655 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-openstack-openstack-networker-hpflm\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.718912 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xngv\" (UniqueName: \"kubernetes.io/projected/62dd10c7-1cda-42f7-891e-6ec9740de425-kube-api-access-7xngv\") pod \"neutron-metadata-openstack-openstack-networker-hpflm\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.722178 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-ssh-key-openstack-networker\") pod \"neutron-metadata-openstack-openstack-networker-hpflm\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.733052 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-inventory\") pod \"neutron-metadata-openstack-openstack-networker-hpflm\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:55 crc kubenswrapper[5129]: I0314 09:31:55.902067 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:31:56 crc kubenswrapper[5129]: W0314 09:31:56.473865 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62dd10c7_1cda_42f7_891e_6ec9740de425.slice/crio-16614e9adf3fc3362d8fe73069f6d96cfddb9e5f263cc716bd566c7111e5459a WatchSource:0}: Error finding container 16614e9adf3fc3362d8fe73069f6d96cfddb9e5f263cc716bd566c7111e5459a: Status 404 returned error can't find the container with id 16614e9adf3fc3362d8fe73069f6d96cfddb9e5f263cc716bd566c7111e5459a Mar 14 09:31:56 crc kubenswrapper[5129]: I0314 09:31:56.479500 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-hpflm"] Mar 14 09:31:56 crc kubenswrapper[5129]: I0314 09:31:56.760116 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" event={"ID":"62dd10c7-1cda-42f7-891e-6ec9740de425","Type":"ContainerStarted","Data":"16614e9adf3fc3362d8fe73069f6d96cfddb9e5f263cc716bd566c7111e5459a"} Mar 14 09:31:57 crc kubenswrapper[5129]: I0314 09:31:57.797517 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" event={"ID":"62dd10c7-1cda-42f7-891e-6ec9740de425","Type":"ContainerStarted","Data":"b1eee2e8659836eb5a731520c9048acfed8a16796c0088d80d17412531929ae8"} Mar 14 09:31:57 crc kubenswrapper[5129]: I0314 09:31:57.830998 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" podStartSLOduration=3.307311378 podStartE2EDuration="3.830973017s" podCreationTimestamp="2026-03-14 09:31:54 +0000 UTC" firstStartedPulling="2026-03-14 09:31:56.476508646 +0000 UTC m=+9179.228423830" lastFinishedPulling="2026-03-14 09:31:57.000170255 +0000 UTC m=+9179.752085469" 
observedRunningTime="2026-03-14 09:31:57.8244516 +0000 UTC m=+9180.576366854" watchObservedRunningTime="2026-03-14 09:31:57.830973017 +0000 UTC m=+9180.582888201" Mar 14 09:32:00 crc kubenswrapper[5129]: I0314 09:32:00.137897 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558012-jbc2x"] Mar 14 09:32:00 crc kubenswrapper[5129]: I0314 09:32:00.141008 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558012-jbc2x" Mar 14 09:32:00 crc kubenswrapper[5129]: I0314 09:32:00.144369 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:32:00 crc kubenswrapper[5129]: I0314 09:32:00.144676 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:32:00 crc kubenswrapper[5129]: I0314 09:32:00.145153 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:32:00 crc kubenswrapper[5129]: I0314 09:32:00.150092 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558012-jbc2x"] Mar 14 09:32:00 crc kubenswrapper[5129]: I0314 09:32:00.213684 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9nct\" (UniqueName: \"kubernetes.io/projected/2240b996-b085-40b6-bc39-f2466ca00def-kube-api-access-f9nct\") pod \"auto-csr-approver-29558012-jbc2x\" (UID: \"2240b996-b085-40b6-bc39-f2466ca00def\") " pod="openshift-infra/auto-csr-approver-29558012-jbc2x" Mar 14 09:32:00 crc kubenswrapper[5129]: I0314 09:32:00.316746 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9nct\" (UniqueName: \"kubernetes.io/projected/2240b996-b085-40b6-bc39-f2466ca00def-kube-api-access-f9nct\") pod \"auto-csr-approver-29558012-jbc2x\" (UID: 
\"2240b996-b085-40b6-bc39-f2466ca00def\") " pod="openshift-infra/auto-csr-approver-29558012-jbc2x" Mar 14 09:32:00 crc kubenswrapper[5129]: I0314 09:32:00.340350 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9nct\" (UniqueName: \"kubernetes.io/projected/2240b996-b085-40b6-bc39-f2466ca00def-kube-api-access-f9nct\") pod \"auto-csr-approver-29558012-jbc2x\" (UID: \"2240b996-b085-40b6-bc39-f2466ca00def\") " pod="openshift-infra/auto-csr-approver-29558012-jbc2x" Mar 14 09:32:00 crc kubenswrapper[5129]: I0314 09:32:00.463336 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558012-jbc2x" Mar 14 09:32:00 crc kubenswrapper[5129]: I0314 09:32:00.906544 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558012-jbc2x"] Mar 14 09:32:00 crc kubenswrapper[5129]: W0314 09:32:00.909839 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2240b996_b085_40b6_bc39_f2466ca00def.slice/crio-5ad53fdd41f0dda6f65cfade23f958fa3ed5937042baca2765b25e5e7054ba56 WatchSource:0}: Error finding container 5ad53fdd41f0dda6f65cfade23f958fa3ed5937042baca2765b25e5e7054ba56: Status 404 returned error can't find the container with id 5ad53fdd41f0dda6f65cfade23f958fa3ed5937042baca2765b25e5e7054ba56 Mar 14 09:32:01 crc kubenswrapper[5129]: I0314 09:32:01.832258 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558012-jbc2x" event={"ID":"2240b996-b085-40b6-bc39-f2466ca00def","Type":"ContainerStarted","Data":"5ad53fdd41f0dda6f65cfade23f958fa3ed5937042baca2765b25e5e7054ba56"} Mar 14 09:32:02 crc kubenswrapper[5129]: I0314 09:32:02.843107 5129 generic.go:334] "Generic (PLEG): container finished" podID="2240b996-b085-40b6-bc39-f2466ca00def" containerID="d2436b469869623fd27216a0bc87d82dab1b87682bf32eec3839e49c624136e3" exitCode=0 
Mar 14 09:32:02 crc kubenswrapper[5129]: I0314 09:32:02.843156 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558012-jbc2x" event={"ID":"2240b996-b085-40b6-bc39-f2466ca00def","Type":"ContainerDied","Data":"d2436b469869623fd27216a0bc87d82dab1b87682bf32eec3839e49c624136e3"} Mar 14 09:32:04 crc kubenswrapper[5129]: I0314 09:32:04.243326 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558012-jbc2x" Mar 14 09:32:04 crc kubenswrapper[5129]: I0314 09:32:04.306861 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9nct\" (UniqueName: \"kubernetes.io/projected/2240b996-b085-40b6-bc39-f2466ca00def-kube-api-access-f9nct\") pod \"2240b996-b085-40b6-bc39-f2466ca00def\" (UID: \"2240b996-b085-40b6-bc39-f2466ca00def\") " Mar 14 09:32:04 crc kubenswrapper[5129]: I0314 09:32:04.317485 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2240b996-b085-40b6-bc39-f2466ca00def-kube-api-access-f9nct" (OuterVolumeSpecName: "kube-api-access-f9nct") pod "2240b996-b085-40b6-bc39-f2466ca00def" (UID: "2240b996-b085-40b6-bc39-f2466ca00def"). InnerVolumeSpecName "kube-api-access-f9nct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:32:04 crc kubenswrapper[5129]: I0314 09:32:04.410468 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9nct\" (UniqueName: \"kubernetes.io/projected/2240b996-b085-40b6-bc39-f2466ca00def-kube-api-access-f9nct\") on node \"crc\" DevicePath \"\"" Mar 14 09:32:04 crc kubenswrapper[5129]: I0314 09:32:04.869085 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558012-jbc2x" event={"ID":"2240b996-b085-40b6-bc39-f2466ca00def","Type":"ContainerDied","Data":"5ad53fdd41f0dda6f65cfade23f958fa3ed5937042baca2765b25e5e7054ba56"} Mar 14 09:32:04 crc kubenswrapper[5129]: I0314 09:32:04.869132 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ad53fdd41f0dda6f65cfade23f958fa3ed5937042baca2765b25e5e7054ba56" Mar 14 09:32:04 crc kubenswrapper[5129]: I0314 09:32:04.869183 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558012-jbc2x" Mar 14 09:32:05 crc kubenswrapper[5129]: I0314 09:32:05.329890 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558006-7k6zz"] Mar 14 09:32:05 crc kubenswrapper[5129]: I0314 09:32:05.340420 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558006-7k6zz"] Mar 14 09:32:06 crc kubenswrapper[5129]: I0314 09:32:06.051130 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b7285f3-a213-4658-9762-55b0d8ed9836" path="/var/lib/kubelet/pods/0b7285f3-a213-4658-9762-55b0d8ed9836/volumes" Mar 14 09:32:19 crc kubenswrapper[5129]: I0314 09:32:19.574009 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 14 09:32:19 crc kubenswrapper[5129]: I0314 09:32:19.574700 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:32:19 crc kubenswrapper[5129]: I0314 09:32:19.574751 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 09:32:19 crc kubenswrapper[5129]: I0314 09:32:19.575581 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:32:19 crc kubenswrapper[5129]: I0314 09:32:19.575647 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" gracePeriod=600 Mar 14 09:32:20 crc kubenswrapper[5129]: E0314 09:32:20.749823 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:32:21 crc kubenswrapper[5129]: 
I0314 09:32:21.043772 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" exitCode=0 Mar 14 09:32:21 crc kubenswrapper[5129]: I0314 09:32:21.043872 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816"} Mar 14 09:32:21 crc kubenswrapper[5129]: I0314 09:32:21.043919 5129 scope.go:117] "RemoveContainer" containerID="716bfedf95059e7409f096a660a732a3c112f45bc7a6d09224dd5044410d7300" Mar 14 09:32:21 crc kubenswrapper[5129]: I0314 09:32:21.045146 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:32:21 crc kubenswrapper[5129]: E0314 09:32:21.045582 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:32:35 crc kubenswrapper[5129]: I0314 09:32:35.853225 5129 scope.go:117] "RemoveContainer" containerID="125c18eb5bf85f6759ee74d38b3afc05e6594ea239dc40ff7d7a0a02266a6393" Mar 14 09:32:36 crc kubenswrapper[5129]: I0314 09:32:36.037655 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:32:36 crc kubenswrapper[5129]: E0314 09:32:36.038836 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:32:47 crc kubenswrapper[5129]: I0314 09:32:47.036840 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:32:47 crc kubenswrapper[5129]: E0314 09:32:47.038179 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:32:59 crc kubenswrapper[5129]: I0314 09:32:59.447780 5129 generic.go:334] "Generic (PLEG): container finished" podID="62dd10c7-1cda-42f7-891e-6ec9740de425" containerID="b1eee2e8659836eb5a731520c9048acfed8a16796c0088d80d17412531929ae8" exitCode=0 Mar 14 09:32:59 crc kubenswrapper[5129]: I0314 09:32:59.448356 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" event={"ID":"62dd10c7-1cda-42f7-891e-6ec9740de425","Type":"ContainerDied","Data":"b1eee2e8659836eb5a731520c9048acfed8a16796c0088d80d17412531929ae8"} Mar 14 09:33:00 crc kubenswrapper[5129]: I0314 09:33:00.912323 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.088001 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-inventory\") pod \"62dd10c7-1cda-42f7-891e-6ec9740de425\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.088097 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-ssh-key-openstack-networker\") pod \"62dd10c7-1cda-42f7-891e-6ec9740de425\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.088141 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-neutron-metadata-combined-ca-bundle\") pod \"62dd10c7-1cda-42f7-891e-6ec9740de425\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.088220 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-neutron-ovn-metadata-agent-neutron-config-0\") pod \"62dd10c7-1cda-42f7-891e-6ec9740de425\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.088322 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-nova-metadata-neutron-config-0\") pod \"62dd10c7-1cda-42f7-891e-6ec9740de425\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " 
Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.088421 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xngv\" (UniqueName: \"kubernetes.io/projected/62dd10c7-1cda-42f7-891e-6ec9740de425-kube-api-access-7xngv\") pod \"62dd10c7-1cda-42f7-891e-6ec9740de425\" (UID: \"62dd10c7-1cda-42f7-891e-6ec9740de425\") " Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.095106 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62dd10c7-1cda-42f7-891e-6ec9740de425-kube-api-access-7xngv" (OuterVolumeSpecName: "kube-api-access-7xngv") pod "62dd10c7-1cda-42f7-891e-6ec9740de425" (UID: "62dd10c7-1cda-42f7-891e-6ec9740de425"). InnerVolumeSpecName "kube-api-access-7xngv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.095832 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "62dd10c7-1cda-42f7-891e-6ec9740de425" (UID: "62dd10c7-1cda-42f7-891e-6ec9740de425"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.121281 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-inventory" (OuterVolumeSpecName: "inventory") pod "62dd10c7-1cda-42f7-891e-6ec9740de425" (UID: "62dd10c7-1cda-42f7-891e-6ec9740de425"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.122164 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "62dd10c7-1cda-42f7-891e-6ec9740de425" (UID: "62dd10c7-1cda-42f7-891e-6ec9740de425"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.125414 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "62dd10c7-1cda-42f7-891e-6ec9740de425" (UID: "62dd10c7-1cda-42f7-891e-6ec9740de425"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.135516 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "62dd10c7-1cda-42f7-891e-6ec9740de425" (UID: "62dd10c7-1cda-42f7-891e-6ec9740de425"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.191204 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.191814 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.191921 5129 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.191985 5129 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.192051 5129 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/62dd10c7-1cda-42f7-891e-6ec9740de425-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.192105 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xngv\" (UniqueName: \"kubernetes.io/projected/62dd10c7-1cda-42f7-891e-6ec9740de425-kube-api-access-7xngv\") on node \"crc\" DevicePath \"\"" Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.470833 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" 
event={"ID":"62dd10c7-1cda-42f7-891e-6ec9740de425","Type":"ContainerDied","Data":"16614e9adf3fc3362d8fe73069f6d96cfddb9e5f263cc716bd566c7111e5459a"} Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.471177 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16614e9adf3fc3362d8fe73069f6d96cfddb9e5f263cc716bd566c7111e5459a" Mar 14 09:33:01 crc kubenswrapper[5129]: I0314 09:33:01.471044 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-hpflm" Mar 14 09:33:02 crc kubenswrapper[5129]: I0314 09:33:02.037184 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:33:02 crc kubenswrapper[5129]: E0314 09:33:02.038223 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:33:14 crc kubenswrapper[5129]: I0314 09:33:14.614215 5129 generic.go:334] "Generic (PLEG): container finished" podID="70bcb47f-9330-44d8-8527-f73c8066eac0" containerID="eef79a580ba9653816d1824a0fba29c82d58d0c677c3f379cf943f3a75d6ee54" exitCode=0 Mar 14 09:33:14 crc kubenswrapper[5129]: I0314 09:33:14.614299 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-9g2lm" event={"ID":"70bcb47f-9330-44d8-8527-f73c8066eac0","Type":"ContainerDied","Data":"eef79a580ba9653816d1824a0fba29c82d58d0c677c3f379cf943f3a75d6ee54"} Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.024507 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.036404 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:33:16 crc kubenswrapper[5129]: E0314 09:33:16.036799 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.116449 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcg7q\" (UniqueName: \"kubernetes.io/projected/70bcb47f-9330-44d8-8527-f73c8066eac0-kube-api-access-mcg7q\") pod \"70bcb47f-9330-44d8-8527-f73c8066eac0\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.116498 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70bcb47f-9330-44d8-8527-f73c8066eac0-ovn-combined-ca-bundle\") pod \"70bcb47f-9330-44d8-8527-f73c8066eac0\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.116556 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70bcb47f-9330-44d8-8527-f73c8066eac0-inventory\") pod \"70bcb47f-9330-44d8-8527-f73c8066eac0\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.116637 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" 
(UniqueName: \"kubernetes.io/secret/70bcb47f-9330-44d8-8527-f73c8066eac0-ssh-key-openstack-cell1\") pod \"70bcb47f-9330-44d8-8527-f73c8066eac0\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.116747 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/70bcb47f-9330-44d8-8527-f73c8066eac0-ovncontroller-config-0\") pod \"70bcb47f-9330-44d8-8527-f73c8066eac0\" (UID: \"70bcb47f-9330-44d8-8527-f73c8066eac0\") " Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.125117 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70bcb47f-9330-44d8-8527-f73c8066eac0-kube-api-access-mcg7q" (OuterVolumeSpecName: "kube-api-access-mcg7q") pod "70bcb47f-9330-44d8-8527-f73c8066eac0" (UID: "70bcb47f-9330-44d8-8527-f73c8066eac0"). InnerVolumeSpecName "kube-api-access-mcg7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.144059 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70bcb47f-9330-44d8-8527-f73c8066eac0-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "70bcb47f-9330-44d8-8527-f73c8066eac0" (UID: "70bcb47f-9330-44d8-8527-f73c8066eac0"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.157692 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70bcb47f-9330-44d8-8527-f73c8066eac0-inventory" (OuterVolumeSpecName: "inventory") pod "70bcb47f-9330-44d8-8527-f73c8066eac0" (UID: "70bcb47f-9330-44d8-8527-f73c8066eac0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.170710 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70bcb47f-9330-44d8-8527-f73c8066eac0-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "70bcb47f-9330-44d8-8527-f73c8066eac0" (UID: "70bcb47f-9330-44d8-8527-f73c8066eac0"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.174907 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70bcb47f-9330-44d8-8527-f73c8066eac0-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "70bcb47f-9330-44d8-8527-f73c8066eac0" (UID: "70bcb47f-9330-44d8-8527-f73c8066eac0"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.219406 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/70bcb47f-9330-44d8-8527-f73c8066eac0-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.219444 5129 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/70bcb47f-9330-44d8-8527-f73c8066eac0-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.219453 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcg7q\" (UniqueName: \"kubernetes.io/projected/70bcb47f-9330-44d8-8527-f73c8066eac0-kube-api-access-mcg7q\") on node \"crc\" DevicePath \"\"" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.219462 5129 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/70bcb47f-9330-44d8-8527-f73c8066eac0-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.219472 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70bcb47f-9330-44d8-8527-f73c8066eac0-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.635367 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-9g2lm" event={"ID":"70bcb47f-9330-44d8-8527-f73c8066eac0","Type":"ContainerDied","Data":"90eb34016ec801a75e39e4005ab0b300b3702edd0f541ed0406c33fc9794eeda"} Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.635919 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90eb34016ec801a75e39e4005ab0b300b3702edd0f541ed0406c33fc9794eeda" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.635454 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-9g2lm" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.770164 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-t694n"] Mar 14 09:33:16 crc kubenswrapper[5129]: E0314 09:33:16.770736 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70bcb47f-9330-44d8-8527-f73c8066eac0" containerName="ovn-openstack-openstack-cell1" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.770757 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="70bcb47f-9330-44d8-8527-f73c8066eac0" containerName="ovn-openstack-openstack-cell1" Mar 14 09:33:16 crc kubenswrapper[5129]: E0314 09:33:16.770781 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2240b996-b085-40b6-bc39-f2466ca00def" containerName="oc" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.770788 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2240b996-b085-40b6-bc39-f2466ca00def" containerName="oc" Mar 14 09:33:16 crc kubenswrapper[5129]: E0314 09:33:16.770813 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62dd10c7-1cda-42f7-891e-6ec9740de425" containerName="neutron-metadata-openstack-openstack-networker" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.770822 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="62dd10c7-1cda-42f7-891e-6ec9740de425" containerName="neutron-metadata-openstack-openstack-networker" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.771029 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2240b996-b085-40b6-bc39-f2466ca00def" containerName="oc" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.771047 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="62dd10c7-1cda-42f7-891e-6ec9740de425" containerName="neutron-metadata-openstack-openstack-networker" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.771069 
5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="70bcb47f-9330-44d8-8527-f73c8066eac0" containerName="ovn-openstack-openstack-cell1" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.772087 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.774812 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.775009 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.775615 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.775743 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.775867 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.778521 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.788219 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-t694n"] Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.932633 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28mdw\" (UniqueName: \"kubernetes.io/projected/cad1b172-56bf-4ac5-b262-f336f90e825c-kube-api-access-28mdw\") pod \"neutron-metadata-openstack-openstack-cell1-t694n\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.932713 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-t694n\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.932834 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-t694n\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.932891 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-t694n\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.932999 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-t694n\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 
14 09:33:16 crc kubenswrapper[5129]: I0314 09:33:16.933066 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-t694n\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:17 crc kubenswrapper[5129]: I0314 09:33:17.035150 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28mdw\" (UniqueName: \"kubernetes.io/projected/cad1b172-56bf-4ac5-b262-f336f90e825c-kube-api-access-28mdw\") pod \"neutron-metadata-openstack-openstack-cell1-t694n\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:17 crc kubenswrapper[5129]: I0314 09:33:17.035199 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-t694n\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:17 crc kubenswrapper[5129]: I0314 09:33:17.035260 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-t694n\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:17 crc kubenswrapper[5129]: I0314 09:33:17.035302 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" 
(UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-t694n\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:17 crc kubenswrapper[5129]: I0314 09:33:17.035344 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-t694n\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:17 crc kubenswrapper[5129]: I0314 09:33:17.035393 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-t694n\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:17 crc kubenswrapper[5129]: I0314 09:33:17.042513 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-t694n\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:17 crc kubenswrapper[5129]: I0314 09:33:17.042651 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-t694n\" (UID: 
\"cad1b172-56bf-4ac5-b262-f336f90e825c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:17 crc kubenswrapper[5129]: I0314 09:33:17.044141 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-t694n\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:17 crc kubenswrapper[5129]: I0314 09:33:17.044784 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-t694n\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:17 crc kubenswrapper[5129]: I0314 09:33:17.053493 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28mdw\" (UniqueName: \"kubernetes.io/projected/cad1b172-56bf-4ac5-b262-f336f90e825c-kube-api-access-28mdw\") pod \"neutron-metadata-openstack-openstack-cell1-t694n\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:17 crc kubenswrapper[5129]: I0314 09:33:17.054551 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-t694n\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:17 crc kubenswrapper[5129]: I0314 09:33:17.280663 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:33:17 crc kubenswrapper[5129]: I0314 09:33:17.912314 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-t694n"] Mar 14 09:33:18 crc kubenswrapper[5129]: I0314 09:33:18.655261 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" event={"ID":"cad1b172-56bf-4ac5-b262-f336f90e825c","Type":"ContainerStarted","Data":"bcece9fdaa7a852a0b1df4969741419c57c47164da6dfca71da8b3d3ea534e0e"} Mar 14 09:33:19 crc kubenswrapper[5129]: I0314 09:33:19.673165 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" event={"ID":"cad1b172-56bf-4ac5-b262-f336f90e825c","Type":"ContainerStarted","Data":"28cf2180f28393aded44c097457d1e4f6f2d35fe35948e8b0acc8c350c87491a"} Mar 14 09:33:19 crc kubenswrapper[5129]: I0314 09:33:19.695196 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" podStartSLOduration=3.210189335 podStartE2EDuration="3.69517812s" podCreationTimestamp="2026-03-14 09:33:16 +0000 UTC" firstStartedPulling="2026-03-14 09:33:17.909205319 +0000 UTC m=+9260.661120513" lastFinishedPulling="2026-03-14 09:33:18.394194104 +0000 UTC m=+9261.146109298" observedRunningTime="2026-03-14 09:33:19.693820254 +0000 UTC m=+9262.445735448" watchObservedRunningTime="2026-03-14 09:33:19.69517812 +0000 UTC m=+9262.447093304" Mar 14 09:33:27 crc kubenswrapper[5129]: I0314 09:33:27.036815 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:33:27 crc kubenswrapper[5129]: E0314 09:33:27.038048 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:33:38 crc kubenswrapper[5129]: I0314 09:33:38.051707 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:33:38 crc kubenswrapper[5129]: E0314 09:33:38.053138 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:33:52 crc kubenswrapper[5129]: I0314 09:33:52.036018 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:33:52 crc kubenswrapper[5129]: E0314 09:33:52.036828 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:34:00 crc kubenswrapper[5129]: I0314 09:34:00.151226 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558014-p9zlg"] Mar 14 09:34:00 crc kubenswrapper[5129]: I0314 09:34:00.153221 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558014-p9zlg" Mar 14 09:34:00 crc kubenswrapper[5129]: I0314 09:34:00.156220 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:34:00 crc kubenswrapper[5129]: I0314 09:34:00.156385 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:34:00 crc kubenswrapper[5129]: I0314 09:34:00.162443 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558014-p9zlg"] Mar 14 09:34:00 crc kubenswrapper[5129]: I0314 09:34:00.166712 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:34:00 crc kubenswrapper[5129]: I0314 09:34:00.308164 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fslbl\" (UniqueName: \"kubernetes.io/projected/82a626e8-0fb6-4e1b-b698-766eb0e72ea0-kube-api-access-fslbl\") pod \"auto-csr-approver-29558014-p9zlg\" (UID: \"82a626e8-0fb6-4e1b-b698-766eb0e72ea0\") " pod="openshift-infra/auto-csr-approver-29558014-p9zlg" Mar 14 09:34:00 crc kubenswrapper[5129]: I0314 09:34:00.410887 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fslbl\" (UniqueName: \"kubernetes.io/projected/82a626e8-0fb6-4e1b-b698-766eb0e72ea0-kube-api-access-fslbl\") pod \"auto-csr-approver-29558014-p9zlg\" (UID: \"82a626e8-0fb6-4e1b-b698-766eb0e72ea0\") " pod="openshift-infra/auto-csr-approver-29558014-p9zlg" Mar 14 09:34:00 crc kubenswrapper[5129]: I0314 09:34:00.436805 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fslbl\" (UniqueName: \"kubernetes.io/projected/82a626e8-0fb6-4e1b-b698-766eb0e72ea0-kube-api-access-fslbl\") pod \"auto-csr-approver-29558014-p9zlg\" (UID: \"82a626e8-0fb6-4e1b-b698-766eb0e72ea0\") " 
pod="openshift-infra/auto-csr-approver-29558014-p9zlg" Mar 14 09:34:00 crc kubenswrapper[5129]: I0314 09:34:00.487112 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558014-p9zlg" Mar 14 09:34:00 crc kubenswrapper[5129]: I0314 09:34:00.949876 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558014-p9zlg"] Mar 14 09:34:02 crc kubenswrapper[5129]: I0314 09:34:02.172220 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558014-p9zlg" event={"ID":"82a626e8-0fb6-4e1b-b698-766eb0e72ea0","Type":"ContainerStarted","Data":"6971d149c7b4e25889c021578d944382308c6b0fdbb752aaee094fb89396fbde"} Mar 14 09:34:03 crc kubenswrapper[5129]: I0314 09:34:03.186765 5129 generic.go:334] "Generic (PLEG): container finished" podID="82a626e8-0fb6-4e1b-b698-766eb0e72ea0" containerID="734841ac4ae95ab43993577d941b7ba43891b42cc34ffe936ced9f471e480767" exitCode=0 Mar 14 09:34:03 crc kubenswrapper[5129]: I0314 09:34:03.186919 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558014-p9zlg" event={"ID":"82a626e8-0fb6-4e1b-b698-766eb0e72ea0","Type":"ContainerDied","Data":"734841ac4ae95ab43993577d941b7ba43891b42cc34ffe936ced9f471e480767"} Mar 14 09:34:04 crc kubenswrapper[5129]: I0314 09:34:04.037226 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:34:04 crc kubenswrapper[5129]: E0314 09:34:04.038420 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" 
Mar 14 09:34:04 crc kubenswrapper[5129]: I0314 09:34:04.626833 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558014-p9zlg" Mar 14 09:34:04 crc kubenswrapper[5129]: I0314 09:34:04.711768 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fslbl\" (UniqueName: \"kubernetes.io/projected/82a626e8-0fb6-4e1b-b698-766eb0e72ea0-kube-api-access-fslbl\") pod \"82a626e8-0fb6-4e1b-b698-766eb0e72ea0\" (UID: \"82a626e8-0fb6-4e1b-b698-766eb0e72ea0\") " Mar 14 09:34:04 crc kubenswrapper[5129]: I0314 09:34:04.726272 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82a626e8-0fb6-4e1b-b698-766eb0e72ea0-kube-api-access-fslbl" (OuterVolumeSpecName: "kube-api-access-fslbl") pod "82a626e8-0fb6-4e1b-b698-766eb0e72ea0" (UID: "82a626e8-0fb6-4e1b-b698-766eb0e72ea0"). InnerVolumeSpecName "kube-api-access-fslbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:34:04 crc kubenswrapper[5129]: I0314 09:34:04.814352 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fslbl\" (UniqueName: \"kubernetes.io/projected/82a626e8-0fb6-4e1b-b698-766eb0e72ea0-kube-api-access-fslbl\") on node \"crc\" DevicePath \"\"" Mar 14 09:34:05 crc kubenswrapper[5129]: I0314 09:34:05.210833 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558014-p9zlg" event={"ID":"82a626e8-0fb6-4e1b-b698-766eb0e72ea0","Type":"ContainerDied","Data":"6971d149c7b4e25889c021578d944382308c6b0fdbb752aaee094fb89396fbde"} Mar 14 09:34:05 crc kubenswrapper[5129]: I0314 09:34:05.211106 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6971d149c7b4e25889c021578d944382308c6b0fdbb752aaee094fb89396fbde" Mar 14 09:34:05 crc kubenswrapper[5129]: I0314 09:34:05.210893 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558014-p9zlg" Mar 14 09:34:05 crc kubenswrapper[5129]: I0314 09:34:05.698722 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558008-486gk"] Mar 14 09:34:05 crc kubenswrapper[5129]: I0314 09:34:05.708137 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558008-486gk"] Mar 14 09:34:06 crc kubenswrapper[5129]: I0314 09:34:06.048757 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="617d39f9-a13a-426f-9657-6b06f1eed2d5" path="/var/lib/kubelet/pods/617d39f9-a13a-426f-9657-6b06f1eed2d5/volumes" Mar 14 09:34:16 crc kubenswrapper[5129]: I0314 09:34:16.345106 5129 generic.go:334] "Generic (PLEG): container finished" podID="cad1b172-56bf-4ac5-b262-f336f90e825c" containerID="28cf2180f28393aded44c097457d1e4f6f2d35fe35948e8b0acc8c350c87491a" exitCode=0 Mar 14 09:34:16 crc kubenswrapper[5129]: I0314 09:34:16.345224 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" event={"ID":"cad1b172-56bf-4ac5-b262-f336f90e825c","Type":"ContainerDied","Data":"28cf2180f28393aded44c097457d1e4f6f2d35fe35948e8b0acc8c350c87491a"} Mar 14 09:34:17 crc kubenswrapper[5129]: I0314 09:34:17.808081 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:34:17 crc kubenswrapper[5129]: I0314 09:34:17.969921 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-nova-metadata-neutron-config-0\") pod \"cad1b172-56bf-4ac5-b262-f336f90e825c\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " Mar 14 09:34:17 crc kubenswrapper[5129]: I0314 09:34:17.970021 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-inventory\") pod \"cad1b172-56bf-4ac5-b262-f336f90e825c\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " Mar 14 09:34:17 crc kubenswrapper[5129]: I0314 09:34:17.970046 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-neutron-metadata-combined-ca-bundle\") pod \"cad1b172-56bf-4ac5-b262-f336f90e825c\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " Mar 14 09:34:17 crc kubenswrapper[5129]: I0314 09:34:17.970164 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-ssh-key-openstack-cell1\") pod \"cad1b172-56bf-4ac5-b262-f336f90e825c\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " Mar 14 09:34:17 crc kubenswrapper[5129]: I0314 09:34:17.970195 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28mdw\" (UniqueName: \"kubernetes.io/projected/cad1b172-56bf-4ac5-b262-f336f90e825c-kube-api-access-28mdw\") pod \"cad1b172-56bf-4ac5-b262-f336f90e825c\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " Mar 14 09:34:17 crc kubenswrapper[5129]: I0314 
09:34:17.970224 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"cad1b172-56bf-4ac5-b262-f336f90e825c\" (UID: \"cad1b172-56bf-4ac5-b262-f336f90e825c\") " Mar 14 09:34:17 crc kubenswrapper[5129]: I0314 09:34:17.976730 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "cad1b172-56bf-4ac5-b262-f336f90e825c" (UID: "cad1b172-56bf-4ac5-b262-f336f90e825c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:34:17 crc kubenswrapper[5129]: I0314 09:34:17.976994 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cad1b172-56bf-4ac5-b262-f336f90e825c-kube-api-access-28mdw" (OuterVolumeSpecName: "kube-api-access-28mdw") pod "cad1b172-56bf-4ac5-b262-f336f90e825c" (UID: "cad1b172-56bf-4ac5-b262-f336f90e825c"). InnerVolumeSpecName "kube-api-access-28mdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.001156 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "cad1b172-56bf-4ac5-b262-f336f90e825c" (UID: "cad1b172-56bf-4ac5-b262-f336f90e825c"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.002755 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "cad1b172-56bf-4ac5-b262-f336f90e825c" (UID: "cad1b172-56bf-4ac5-b262-f336f90e825c"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.012979 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "cad1b172-56bf-4ac5-b262-f336f90e825c" (UID: "cad1b172-56bf-4ac5-b262-f336f90e825c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.013144 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-inventory" (OuterVolumeSpecName: "inventory") pod "cad1b172-56bf-4ac5-b262-f336f90e825c" (UID: "cad1b172-56bf-4ac5-b262-f336f90e825c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.044741 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:34:18 crc kubenswrapper[5129]: E0314 09:34:18.045028 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.078346 5129 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.078401 5129 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.078417 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.078427 5129 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.078436 5129 
reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cad1b172-56bf-4ac5-b262-f336f90e825c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.078446 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28mdw\" (UniqueName: \"kubernetes.io/projected/cad1b172-56bf-4ac5-b262-f336f90e825c-kube-api-access-28mdw\") on node \"crc\" DevicePath \"\"" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.367496 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" event={"ID":"cad1b172-56bf-4ac5-b262-f336f90e825c","Type":"ContainerDied","Data":"bcece9fdaa7a852a0b1df4969741419c57c47164da6dfca71da8b3d3ea534e0e"} Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.368044 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcece9fdaa7a852a0b1df4969741419c57c47164da6dfca71da8b3d3ea534e0e" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.367569 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-t694n" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.563445 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-p8qzl"] Mar 14 09:34:18 crc kubenswrapper[5129]: E0314 09:34:18.563864 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cad1b172-56bf-4ac5-b262-f336f90e825c" containerName="neutron-metadata-openstack-openstack-cell1" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.563894 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad1b172-56bf-4ac5-b262-f336f90e825c" containerName="neutron-metadata-openstack-openstack-cell1" Mar 14 09:34:18 crc kubenswrapper[5129]: E0314 09:34:18.563905 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a626e8-0fb6-4e1b-b698-766eb0e72ea0" containerName="oc" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.563913 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a626e8-0fb6-4e1b-b698-766eb0e72ea0" containerName="oc" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.564101 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="cad1b172-56bf-4ac5-b262-f336f90e825c" containerName="neutron-metadata-openstack-openstack-cell1" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.564126 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="82a626e8-0fb6-4e1b-b698-766eb0e72ea0" containerName="oc" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.564806 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.567150 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.567349 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.567399 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.567429 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.569401 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.582378 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-p8qzl"] Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.695174 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-p8qzl\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.695260 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj9sp\" (UniqueName: \"kubernetes.io/projected/623aba17-af6f-4ec2-8d79-1d71984816d2-kube-api-access-zj9sp\") pod \"libvirt-openstack-openstack-cell1-p8qzl\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " 
pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.695330 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-p8qzl\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.695355 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-inventory\") pod \"libvirt-openstack-openstack-cell1-p8qzl\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.695941 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-p8qzl\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.798131 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-p8qzl\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.798181 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-p8qzl\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.798226 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj9sp\" (UniqueName: \"kubernetes.io/projected/623aba17-af6f-4ec2-8d79-1d71984816d2-kube-api-access-zj9sp\") pod \"libvirt-openstack-openstack-cell1-p8qzl\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.798267 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-p8qzl\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.798289 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-inventory\") pod \"libvirt-openstack-openstack-cell1-p8qzl\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.802229 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-inventory\") pod \"libvirt-openstack-openstack-cell1-p8qzl\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.803926 5129 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-p8qzl\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.805099 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-p8qzl\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.805565 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-p8qzl\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.815840 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj9sp\" (UniqueName: \"kubernetes.io/projected/623aba17-af6f-4ec2-8d79-1d71984816d2-kube-api-access-zj9sp\") pod \"libvirt-openstack-openstack-cell1-p8qzl\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:34:18 crc kubenswrapper[5129]: I0314 09:34:18.886529 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:34:19 crc kubenswrapper[5129]: I0314 09:34:19.406872 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-p8qzl"] Mar 14 09:34:20 crc kubenswrapper[5129]: I0314 09:34:20.388569 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" event={"ID":"623aba17-af6f-4ec2-8d79-1d71984816d2","Type":"ContainerStarted","Data":"d4ce7fa6729a0852719287482f1d8b484fc51084c781b28eb53ea054d9b4fa65"} Mar 14 09:34:20 crc kubenswrapper[5129]: I0314 09:34:20.388640 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" event={"ID":"623aba17-af6f-4ec2-8d79-1d71984816d2","Type":"ContainerStarted","Data":"012410b0805dccc1d3bdc50df57f4b15f14302c323bb38c31b05cd1b34fac0b9"} Mar 14 09:34:20 crc kubenswrapper[5129]: I0314 09:34:20.404035 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" podStartSLOduration=1.944330076 podStartE2EDuration="2.404012473s" podCreationTimestamp="2026-03-14 09:34:18 +0000 UTC" firstStartedPulling="2026-03-14 09:34:19.412470001 +0000 UTC m=+9322.164385195" lastFinishedPulling="2026-03-14 09:34:19.872152388 +0000 UTC m=+9322.624067592" observedRunningTime="2026-03-14 09:34:20.402236266 +0000 UTC m=+9323.154151460" watchObservedRunningTime="2026-03-14 09:34:20.404012473 +0000 UTC m=+9323.155927667" Mar 14 09:34:22 crc kubenswrapper[5129]: I0314 09:34:22.491215 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b54nz"] Mar 14 09:34:22 crc kubenswrapper[5129]: I0314 09:34:22.493884 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b54nz" Mar 14 09:34:22 crc kubenswrapper[5129]: I0314 09:34:22.508385 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b54nz"] Mar 14 09:34:22 crc kubenswrapper[5129]: I0314 09:34:22.679159 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbszd\" (UniqueName: \"kubernetes.io/projected/bc01c711-b02b-48e2-8d63-bccc6b1babd5-kube-api-access-jbszd\") pod \"certified-operators-b54nz\" (UID: \"bc01c711-b02b-48e2-8d63-bccc6b1babd5\") " pod="openshift-marketplace/certified-operators-b54nz" Mar 14 09:34:22 crc kubenswrapper[5129]: I0314 09:34:22.679225 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc01c711-b02b-48e2-8d63-bccc6b1babd5-utilities\") pod \"certified-operators-b54nz\" (UID: \"bc01c711-b02b-48e2-8d63-bccc6b1babd5\") " pod="openshift-marketplace/certified-operators-b54nz" Mar 14 09:34:22 crc kubenswrapper[5129]: I0314 09:34:22.679299 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc01c711-b02b-48e2-8d63-bccc6b1babd5-catalog-content\") pod \"certified-operators-b54nz\" (UID: \"bc01c711-b02b-48e2-8d63-bccc6b1babd5\") " pod="openshift-marketplace/certified-operators-b54nz" Mar 14 09:34:22 crc kubenswrapper[5129]: I0314 09:34:22.780709 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc01c711-b02b-48e2-8d63-bccc6b1babd5-catalog-content\") pod \"certified-operators-b54nz\" (UID: \"bc01c711-b02b-48e2-8d63-bccc6b1babd5\") " pod="openshift-marketplace/certified-operators-b54nz" Mar 14 09:34:22 crc kubenswrapper[5129]: I0314 09:34:22.781249 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jbszd\" (UniqueName: \"kubernetes.io/projected/bc01c711-b02b-48e2-8d63-bccc6b1babd5-kube-api-access-jbszd\") pod \"certified-operators-b54nz\" (UID: \"bc01c711-b02b-48e2-8d63-bccc6b1babd5\") " pod="openshift-marketplace/certified-operators-b54nz" Mar 14 09:34:22 crc kubenswrapper[5129]: I0314 09:34:22.781274 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc01c711-b02b-48e2-8d63-bccc6b1babd5-utilities\") pod \"certified-operators-b54nz\" (UID: \"bc01c711-b02b-48e2-8d63-bccc6b1babd5\") " pod="openshift-marketplace/certified-operators-b54nz" Mar 14 09:34:22 crc kubenswrapper[5129]: I0314 09:34:22.781283 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc01c711-b02b-48e2-8d63-bccc6b1babd5-catalog-content\") pod \"certified-operators-b54nz\" (UID: \"bc01c711-b02b-48e2-8d63-bccc6b1babd5\") " pod="openshift-marketplace/certified-operators-b54nz" Mar 14 09:34:22 crc kubenswrapper[5129]: I0314 09:34:22.781665 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc01c711-b02b-48e2-8d63-bccc6b1babd5-utilities\") pod \"certified-operators-b54nz\" (UID: \"bc01c711-b02b-48e2-8d63-bccc6b1babd5\") " pod="openshift-marketplace/certified-operators-b54nz" Mar 14 09:34:22 crc kubenswrapper[5129]: I0314 09:34:22.804522 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbszd\" (UniqueName: \"kubernetes.io/projected/bc01c711-b02b-48e2-8d63-bccc6b1babd5-kube-api-access-jbszd\") pod \"certified-operators-b54nz\" (UID: \"bc01c711-b02b-48e2-8d63-bccc6b1babd5\") " pod="openshift-marketplace/certified-operators-b54nz" Mar 14 09:34:22 crc kubenswrapper[5129]: I0314 09:34:22.826028 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b54nz" Mar 14 09:34:23 crc kubenswrapper[5129]: I0314 09:34:23.456050 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b54nz"] Mar 14 09:34:23 crc kubenswrapper[5129]: W0314 09:34:23.459730 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc01c711_b02b_48e2_8d63_bccc6b1babd5.slice/crio-d977793918308cb0fa4e5319ebada769caf930826fc7102507f4fbe93d26e941 WatchSource:0}: Error finding container d977793918308cb0fa4e5319ebada769caf930826fc7102507f4fbe93d26e941: Status 404 returned error can't find the container with id d977793918308cb0fa4e5319ebada769caf930826fc7102507f4fbe93d26e941 Mar 14 09:34:24 crc kubenswrapper[5129]: I0314 09:34:24.425362 5129 generic.go:334] "Generic (PLEG): container finished" podID="bc01c711-b02b-48e2-8d63-bccc6b1babd5" containerID="265709864d4bbe81365dd3ca384cde6ecd5556373744960a9509dcfe5ba20279" exitCode=0 Mar 14 09:34:24 crc kubenswrapper[5129]: I0314 09:34:24.425440 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b54nz" event={"ID":"bc01c711-b02b-48e2-8d63-bccc6b1babd5","Type":"ContainerDied","Data":"265709864d4bbe81365dd3ca384cde6ecd5556373744960a9509dcfe5ba20279"} Mar 14 09:34:24 crc kubenswrapper[5129]: I0314 09:34:24.425845 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b54nz" event={"ID":"bc01c711-b02b-48e2-8d63-bccc6b1babd5","Type":"ContainerStarted","Data":"d977793918308cb0fa4e5319ebada769caf930826fc7102507f4fbe93d26e941"} Mar 14 09:34:25 crc kubenswrapper[5129]: I0314 09:34:25.440567 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b54nz" 
event={"ID":"bc01c711-b02b-48e2-8d63-bccc6b1babd5","Type":"ContainerStarted","Data":"f2e87746e908b4e2d950512f06d5c91ccbf9da6dd82fd77371a2f4ceedea6233"} Mar 14 09:34:26 crc kubenswrapper[5129]: I0314 09:34:26.450377 5129 generic.go:334] "Generic (PLEG): container finished" podID="bc01c711-b02b-48e2-8d63-bccc6b1babd5" containerID="f2e87746e908b4e2d950512f06d5c91ccbf9da6dd82fd77371a2f4ceedea6233" exitCode=0 Mar 14 09:34:26 crc kubenswrapper[5129]: I0314 09:34:26.450738 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b54nz" event={"ID":"bc01c711-b02b-48e2-8d63-bccc6b1babd5","Type":"ContainerDied","Data":"f2e87746e908b4e2d950512f06d5c91ccbf9da6dd82fd77371a2f4ceedea6233"} Mar 14 09:34:27 crc kubenswrapper[5129]: I0314 09:34:27.463579 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b54nz" event={"ID":"bc01c711-b02b-48e2-8d63-bccc6b1babd5","Type":"ContainerStarted","Data":"4e877f253ccb8d0dd3cabf885b2fa802ad137a367817e652844c43c11badd5cb"} Mar 14 09:34:27 crc kubenswrapper[5129]: I0314 09:34:27.485385 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b54nz" podStartSLOduration=3.038313855 podStartE2EDuration="5.485360816s" podCreationTimestamp="2026-03-14 09:34:22 +0000 UTC" firstStartedPulling="2026-03-14 09:34:24.428093881 +0000 UTC m=+9327.180009065" lastFinishedPulling="2026-03-14 09:34:26.875140842 +0000 UTC m=+9329.627056026" observedRunningTime="2026-03-14 09:34:27.479366693 +0000 UTC m=+9330.231281897" watchObservedRunningTime="2026-03-14 09:34:27.485360816 +0000 UTC m=+9330.237276000" Mar 14 09:34:30 crc kubenswrapper[5129]: I0314 09:34:30.036638 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:34:30 crc kubenswrapper[5129]: E0314 09:34:30.037444 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:34:32 crc kubenswrapper[5129]: I0314 09:34:32.826391 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b54nz" Mar 14 09:34:32 crc kubenswrapper[5129]: I0314 09:34:32.827534 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b54nz" Mar 14 09:34:33 crc kubenswrapper[5129]: I0314 09:34:33.316028 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b54nz" Mar 14 09:34:33 crc kubenswrapper[5129]: I0314 09:34:33.566977 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b54nz" Mar 14 09:34:33 crc kubenswrapper[5129]: I0314 09:34:33.620546 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b54nz"] Mar 14 09:34:35 crc kubenswrapper[5129]: I0314 09:34:35.542910 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b54nz" podUID="bc01c711-b02b-48e2-8d63-bccc6b1babd5" containerName="registry-server" containerID="cri-o://4e877f253ccb8d0dd3cabf885b2fa802ad137a367817e652844c43c11badd5cb" gracePeriod=2 Mar 14 09:34:36 crc kubenswrapper[5129]: I0314 09:34:36.008710 5129 scope.go:117] "RemoveContainer" containerID="55f1bdebc58ca8b23caeb2cbda7a0bf3975f7bee40b03aa542be3f65029499a1" Mar 14 09:34:36 crc kubenswrapper[5129]: I0314 09:34:36.559070 5129 generic.go:334] "Generic (PLEG): container finished" 
podID="bc01c711-b02b-48e2-8d63-bccc6b1babd5" containerID="4e877f253ccb8d0dd3cabf885b2fa802ad137a367817e652844c43c11badd5cb" exitCode=0 Mar 14 09:34:36 crc kubenswrapper[5129]: I0314 09:34:36.559159 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b54nz" event={"ID":"bc01c711-b02b-48e2-8d63-bccc6b1babd5","Type":"ContainerDied","Data":"4e877f253ccb8d0dd3cabf885b2fa802ad137a367817e652844c43c11badd5cb"} Mar 14 09:34:37 crc kubenswrapper[5129]: I0314 09:34:37.272500 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b54nz" Mar 14 09:34:37 crc kubenswrapper[5129]: I0314 09:34:37.400124 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc01c711-b02b-48e2-8d63-bccc6b1babd5-utilities\") pod \"bc01c711-b02b-48e2-8d63-bccc6b1babd5\" (UID: \"bc01c711-b02b-48e2-8d63-bccc6b1babd5\") " Mar 14 09:34:37 crc kubenswrapper[5129]: I0314 09:34:37.400512 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbszd\" (UniqueName: \"kubernetes.io/projected/bc01c711-b02b-48e2-8d63-bccc6b1babd5-kube-api-access-jbszd\") pod \"bc01c711-b02b-48e2-8d63-bccc6b1babd5\" (UID: \"bc01c711-b02b-48e2-8d63-bccc6b1babd5\") " Mar 14 09:34:37 crc kubenswrapper[5129]: I0314 09:34:37.400823 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc01c711-b02b-48e2-8d63-bccc6b1babd5-catalog-content\") pod \"bc01c711-b02b-48e2-8d63-bccc6b1babd5\" (UID: \"bc01c711-b02b-48e2-8d63-bccc6b1babd5\") " Mar 14 09:34:37 crc kubenswrapper[5129]: I0314 09:34:37.401388 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc01c711-b02b-48e2-8d63-bccc6b1babd5-utilities" (OuterVolumeSpecName: "utilities") pod 
"bc01c711-b02b-48e2-8d63-bccc6b1babd5" (UID: "bc01c711-b02b-48e2-8d63-bccc6b1babd5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:34:37 crc kubenswrapper[5129]: I0314 09:34:37.412854 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc01c711-b02b-48e2-8d63-bccc6b1babd5-kube-api-access-jbszd" (OuterVolumeSpecName: "kube-api-access-jbszd") pod "bc01c711-b02b-48e2-8d63-bccc6b1babd5" (UID: "bc01c711-b02b-48e2-8d63-bccc6b1babd5"). InnerVolumeSpecName "kube-api-access-jbszd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:34:37 crc kubenswrapper[5129]: I0314 09:34:37.482837 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc01c711-b02b-48e2-8d63-bccc6b1babd5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc01c711-b02b-48e2-8d63-bccc6b1babd5" (UID: "bc01c711-b02b-48e2-8d63-bccc6b1babd5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:34:37 crc kubenswrapper[5129]: I0314 09:34:37.503701 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc01c711-b02b-48e2-8d63-bccc6b1babd5-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:34:37 crc kubenswrapper[5129]: I0314 09:34:37.503767 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbszd\" (UniqueName: \"kubernetes.io/projected/bc01c711-b02b-48e2-8d63-bccc6b1babd5-kube-api-access-jbszd\") on node \"crc\" DevicePath \"\"" Mar 14 09:34:37 crc kubenswrapper[5129]: I0314 09:34:37.503783 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc01c711-b02b-48e2-8d63-bccc6b1babd5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:34:37 crc kubenswrapper[5129]: I0314 09:34:37.573034 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b54nz" event={"ID":"bc01c711-b02b-48e2-8d63-bccc6b1babd5","Type":"ContainerDied","Data":"d977793918308cb0fa4e5319ebada769caf930826fc7102507f4fbe93d26e941"} Mar 14 09:34:37 crc kubenswrapper[5129]: I0314 09:34:37.573117 5129 scope.go:117] "RemoveContainer" containerID="4e877f253ccb8d0dd3cabf885b2fa802ad137a367817e652844c43c11badd5cb" Mar 14 09:34:37 crc kubenswrapper[5129]: I0314 09:34:37.573135 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b54nz" Mar 14 09:34:37 crc kubenswrapper[5129]: I0314 09:34:37.598165 5129 scope.go:117] "RemoveContainer" containerID="f2e87746e908b4e2d950512f06d5c91ccbf9da6dd82fd77371a2f4ceedea6233" Mar 14 09:34:37 crc kubenswrapper[5129]: I0314 09:34:37.629677 5129 scope.go:117] "RemoveContainer" containerID="265709864d4bbe81365dd3ca384cde6ecd5556373744960a9509dcfe5ba20279" Mar 14 09:34:37 crc kubenswrapper[5129]: I0314 09:34:37.642385 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b54nz"] Mar 14 09:34:37 crc kubenswrapper[5129]: I0314 09:34:37.667917 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b54nz"] Mar 14 09:34:38 crc kubenswrapper[5129]: I0314 09:34:38.071884 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc01c711-b02b-48e2-8d63-bccc6b1babd5" path="/var/lib/kubelet/pods/bc01c711-b02b-48e2-8d63-bccc6b1babd5/volumes" Mar 14 09:34:42 crc kubenswrapper[5129]: I0314 09:34:42.037508 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:34:42 crc kubenswrapper[5129]: E0314 09:34:42.038687 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:34:53 crc kubenswrapper[5129]: I0314 09:34:53.036043 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:34:53 crc kubenswrapper[5129]: E0314 09:34:53.037491 5129 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:35:05 crc kubenswrapper[5129]: I0314 09:35:05.036525 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:35:05 crc kubenswrapper[5129]: E0314 09:35:05.037549 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:35:18 crc kubenswrapper[5129]: I0314 09:35:18.046119 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:35:18 crc kubenswrapper[5129]: E0314 09:35:18.047347 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:35:29 crc kubenswrapper[5129]: I0314 09:35:29.037650 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:35:29 crc kubenswrapper[5129]: E0314 09:35:29.038499 5129 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:35:32 crc kubenswrapper[5129]: I0314 09:35:32.442486 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2fm5f"] Mar 14 09:35:32 crc kubenswrapper[5129]: E0314 09:35:32.445877 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc01c711-b02b-48e2-8d63-bccc6b1babd5" containerName="extract-utilities" Mar 14 09:35:32 crc kubenswrapper[5129]: I0314 09:35:32.445894 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc01c711-b02b-48e2-8d63-bccc6b1babd5" containerName="extract-utilities" Mar 14 09:35:32 crc kubenswrapper[5129]: E0314 09:35:32.445910 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc01c711-b02b-48e2-8d63-bccc6b1babd5" containerName="registry-server" Mar 14 09:35:32 crc kubenswrapper[5129]: I0314 09:35:32.445921 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc01c711-b02b-48e2-8d63-bccc6b1babd5" containerName="registry-server" Mar 14 09:35:32 crc kubenswrapper[5129]: E0314 09:35:32.445947 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc01c711-b02b-48e2-8d63-bccc6b1babd5" containerName="extract-content" Mar 14 09:35:32 crc kubenswrapper[5129]: I0314 09:35:32.445954 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc01c711-b02b-48e2-8d63-bccc6b1babd5" containerName="extract-content" Mar 14 09:35:32 crc kubenswrapper[5129]: I0314 09:35:32.446187 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc01c711-b02b-48e2-8d63-bccc6b1babd5" containerName="registry-server" Mar 14 09:35:32 crc 
kubenswrapper[5129]: I0314 09:35:32.447744 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fm5f" Mar 14 09:35:32 crc kubenswrapper[5129]: I0314 09:35:32.464853 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fm5f"] Mar 14 09:35:32 crc kubenswrapper[5129]: I0314 09:35:32.580361 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89c5cdf3-4690-46be-ae26-c211f5d0126c-catalog-content\") pod \"redhat-marketplace-2fm5f\" (UID: \"89c5cdf3-4690-46be-ae26-c211f5d0126c\") " pod="openshift-marketplace/redhat-marketplace-2fm5f" Mar 14 09:35:32 crc kubenswrapper[5129]: I0314 09:35:32.580531 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtc4z\" (UniqueName: \"kubernetes.io/projected/89c5cdf3-4690-46be-ae26-c211f5d0126c-kube-api-access-mtc4z\") pod \"redhat-marketplace-2fm5f\" (UID: \"89c5cdf3-4690-46be-ae26-c211f5d0126c\") " pod="openshift-marketplace/redhat-marketplace-2fm5f" Mar 14 09:35:32 crc kubenswrapper[5129]: I0314 09:35:32.580630 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89c5cdf3-4690-46be-ae26-c211f5d0126c-utilities\") pod \"redhat-marketplace-2fm5f\" (UID: \"89c5cdf3-4690-46be-ae26-c211f5d0126c\") " pod="openshift-marketplace/redhat-marketplace-2fm5f" Mar 14 09:35:32 crc kubenswrapper[5129]: I0314 09:35:32.682894 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89c5cdf3-4690-46be-ae26-c211f5d0126c-utilities\") pod \"redhat-marketplace-2fm5f\" (UID: \"89c5cdf3-4690-46be-ae26-c211f5d0126c\") " pod="openshift-marketplace/redhat-marketplace-2fm5f" Mar 14 09:35:32 crc 
kubenswrapper[5129]: I0314 09:35:32.683037 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89c5cdf3-4690-46be-ae26-c211f5d0126c-catalog-content\") pod \"redhat-marketplace-2fm5f\" (UID: \"89c5cdf3-4690-46be-ae26-c211f5d0126c\") " pod="openshift-marketplace/redhat-marketplace-2fm5f" Mar 14 09:35:32 crc kubenswrapper[5129]: I0314 09:35:32.683155 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtc4z\" (UniqueName: \"kubernetes.io/projected/89c5cdf3-4690-46be-ae26-c211f5d0126c-kube-api-access-mtc4z\") pod \"redhat-marketplace-2fm5f\" (UID: \"89c5cdf3-4690-46be-ae26-c211f5d0126c\") " pod="openshift-marketplace/redhat-marketplace-2fm5f" Mar 14 09:35:32 crc kubenswrapper[5129]: I0314 09:35:32.683692 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89c5cdf3-4690-46be-ae26-c211f5d0126c-utilities\") pod \"redhat-marketplace-2fm5f\" (UID: \"89c5cdf3-4690-46be-ae26-c211f5d0126c\") " pod="openshift-marketplace/redhat-marketplace-2fm5f" Mar 14 09:35:32 crc kubenswrapper[5129]: I0314 09:35:32.684154 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89c5cdf3-4690-46be-ae26-c211f5d0126c-catalog-content\") pod \"redhat-marketplace-2fm5f\" (UID: \"89c5cdf3-4690-46be-ae26-c211f5d0126c\") " pod="openshift-marketplace/redhat-marketplace-2fm5f" Mar 14 09:35:32 crc kubenswrapper[5129]: I0314 09:35:32.736700 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtc4z\" (UniqueName: \"kubernetes.io/projected/89c5cdf3-4690-46be-ae26-c211f5d0126c-kube-api-access-mtc4z\") pod \"redhat-marketplace-2fm5f\" (UID: \"89c5cdf3-4690-46be-ae26-c211f5d0126c\") " pod="openshift-marketplace/redhat-marketplace-2fm5f" Mar 14 09:35:32 crc kubenswrapper[5129]: I0314 
09:35:32.783627 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fm5f" Mar 14 09:35:33 crc kubenswrapper[5129]: I0314 09:35:33.357439 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fm5f"] Mar 14 09:35:33 crc kubenswrapper[5129]: I0314 09:35:33.401849 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fm5f" event={"ID":"89c5cdf3-4690-46be-ae26-c211f5d0126c","Type":"ContainerStarted","Data":"1bbce135767aa3fe9724f278aaecd9a0b920afd92a2fabb3c109287b7a0b2566"} Mar 14 09:35:34 crc kubenswrapper[5129]: I0314 09:35:34.412918 5129 generic.go:334] "Generic (PLEG): container finished" podID="89c5cdf3-4690-46be-ae26-c211f5d0126c" containerID="689da005c1ec7010d5469c36a92a8e3b7cd318a03911805526f367a775a633c5" exitCode=0 Mar 14 09:35:34 crc kubenswrapper[5129]: I0314 09:35:34.412990 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fm5f" event={"ID":"89c5cdf3-4690-46be-ae26-c211f5d0126c","Type":"ContainerDied","Data":"689da005c1ec7010d5469c36a92a8e3b7cd318a03911805526f367a775a633c5"} Mar 14 09:35:34 crc kubenswrapper[5129]: I0314 09:35:34.415307 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:35:35 crc kubenswrapper[5129]: I0314 09:35:35.426589 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fm5f" event={"ID":"89c5cdf3-4690-46be-ae26-c211f5d0126c","Type":"ContainerStarted","Data":"d16e09e6e79a8e672fb9190a8e69826db43c3f528ef7a0e17671b03f6a86d7e6"} Mar 14 09:35:36 crc kubenswrapper[5129]: I0314 09:35:36.438729 5129 generic.go:334] "Generic (PLEG): container finished" podID="89c5cdf3-4690-46be-ae26-c211f5d0126c" containerID="d16e09e6e79a8e672fb9190a8e69826db43c3f528ef7a0e17671b03f6a86d7e6" exitCode=0 Mar 14 09:35:36 crc kubenswrapper[5129]: 
I0314 09:35:36.438846 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fm5f" event={"ID":"89c5cdf3-4690-46be-ae26-c211f5d0126c","Type":"ContainerDied","Data":"d16e09e6e79a8e672fb9190a8e69826db43c3f528ef7a0e17671b03f6a86d7e6"} Mar 14 09:35:37 crc kubenswrapper[5129]: I0314 09:35:37.451842 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fm5f" event={"ID":"89c5cdf3-4690-46be-ae26-c211f5d0126c","Type":"ContainerStarted","Data":"6f2e8c74484593a422a7bf7bf072cb9caf16a76f10d0e03cc285c2b686c7cb3c"} Mar 14 09:35:37 crc kubenswrapper[5129]: I0314 09:35:37.487638 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2fm5f" podStartSLOduration=2.958291934 podStartE2EDuration="5.487611779s" podCreationTimestamp="2026-03-14 09:35:32 +0000 UTC" firstStartedPulling="2026-03-14 09:35:34.415040229 +0000 UTC m=+9397.166955413" lastFinishedPulling="2026-03-14 09:35:36.944360074 +0000 UTC m=+9399.696275258" observedRunningTime="2026-03-14 09:35:37.477967047 +0000 UTC m=+9400.229882231" watchObservedRunningTime="2026-03-14 09:35:37.487611779 +0000 UTC m=+9400.239526983" Mar 14 09:35:42 crc kubenswrapper[5129]: I0314 09:35:42.037886 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:35:42 crc kubenswrapper[5129]: E0314 09:35:42.039183 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:35:42 crc kubenswrapper[5129]: I0314 09:35:42.784893 5129 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2fm5f" Mar 14 09:35:42 crc kubenswrapper[5129]: I0314 09:35:42.784983 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2fm5f" Mar 14 09:35:42 crc kubenswrapper[5129]: I0314 09:35:42.860949 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2fm5f" Mar 14 09:35:43 crc kubenswrapper[5129]: I0314 09:35:43.597937 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2fm5f" Mar 14 09:35:43 crc kubenswrapper[5129]: I0314 09:35:43.651562 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fm5f"] Mar 14 09:35:45 crc kubenswrapper[5129]: I0314 09:35:45.572881 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2fm5f" podUID="89c5cdf3-4690-46be-ae26-c211f5d0126c" containerName="registry-server" containerID="cri-o://6f2e8c74484593a422a7bf7bf072cb9caf16a76f10d0e03cc285c2b686c7cb3c" gracePeriod=2 Mar 14 09:35:46 crc kubenswrapper[5129]: I0314 09:35:46.587719 5129 generic.go:334] "Generic (PLEG): container finished" podID="89c5cdf3-4690-46be-ae26-c211f5d0126c" containerID="6f2e8c74484593a422a7bf7bf072cb9caf16a76f10d0e03cc285c2b686c7cb3c" exitCode=0 Mar 14 09:35:46 crc kubenswrapper[5129]: I0314 09:35:46.587796 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fm5f" event={"ID":"89c5cdf3-4690-46be-ae26-c211f5d0126c","Type":"ContainerDied","Data":"6f2e8c74484593a422a7bf7bf072cb9caf16a76f10d0e03cc285c2b686c7cb3c"} Mar 14 09:35:46 crc kubenswrapper[5129]: I0314 09:35:46.588238 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fm5f" 
event={"ID":"89c5cdf3-4690-46be-ae26-c211f5d0126c","Type":"ContainerDied","Data":"1bbce135767aa3fe9724f278aaecd9a0b920afd92a2fabb3c109287b7a0b2566"} Mar 14 09:35:46 crc kubenswrapper[5129]: I0314 09:35:46.588260 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bbce135767aa3fe9724f278aaecd9a0b920afd92a2fabb3c109287b7a0b2566" Mar 14 09:35:46 crc kubenswrapper[5129]: I0314 09:35:46.645907 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fm5f" Mar 14 09:35:46 crc kubenswrapper[5129]: I0314 09:35:46.763520 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtc4z\" (UniqueName: \"kubernetes.io/projected/89c5cdf3-4690-46be-ae26-c211f5d0126c-kube-api-access-mtc4z\") pod \"89c5cdf3-4690-46be-ae26-c211f5d0126c\" (UID: \"89c5cdf3-4690-46be-ae26-c211f5d0126c\") " Mar 14 09:35:46 crc kubenswrapper[5129]: I0314 09:35:46.764102 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89c5cdf3-4690-46be-ae26-c211f5d0126c-utilities\") pod \"89c5cdf3-4690-46be-ae26-c211f5d0126c\" (UID: \"89c5cdf3-4690-46be-ae26-c211f5d0126c\") " Mar 14 09:35:46 crc kubenswrapper[5129]: I0314 09:35:46.764187 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89c5cdf3-4690-46be-ae26-c211f5d0126c-catalog-content\") pod \"89c5cdf3-4690-46be-ae26-c211f5d0126c\" (UID: \"89c5cdf3-4690-46be-ae26-c211f5d0126c\") " Mar 14 09:35:46 crc kubenswrapper[5129]: I0314 09:35:46.766253 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89c5cdf3-4690-46be-ae26-c211f5d0126c-utilities" (OuterVolumeSpecName: "utilities") pod "89c5cdf3-4690-46be-ae26-c211f5d0126c" (UID: "89c5cdf3-4690-46be-ae26-c211f5d0126c"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:35:46 crc kubenswrapper[5129]: I0314 09:35:46.779151 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89c5cdf3-4690-46be-ae26-c211f5d0126c-kube-api-access-mtc4z" (OuterVolumeSpecName: "kube-api-access-mtc4z") pod "89c5cdf3-4690-46be-ae26-c211f5d0126c" (UID: "89c5cdf3-4690-46be-ae26-c211f5d0126c"). InnerVolumeSpecName "kube-api-access-mtc4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:35:46 crc kubenswrapper[5129]: I0314 09:35:46.798527 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89c5cdf3-4690-46be-ae26-c211f5d0126c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89c5cdf3-4690-46be-ae26-c211f5d0126c" (UID: "89c5cdf3-4690-46be-ae26-c211f5d0126c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:35:46 crc kubenswrapper[5129]: I0314 09:35:46.868238 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtc4z\" (UniqueName: \"kubernetes.io/projected/89c5cdf3-4690-46be-ae26-c211f5d0126c-kube-api-access-mtc4z\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:46 crc kubenswrapper[5129]: I0314 09:35:46.868326 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89c5cdf3-4690-46be-ae26-c211f5d0126c-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:46 crc kubenswrapper[5129]: I0314 09:35:46.868343 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89c5cdf3-4690-46be-ae26-c211f5d0126c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:35:47 crc kubenswrapper[5129]: I0314 09:35:47.599858 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fm5f" Mar 14 09:35:47 crc kubenswrapper[5129]: I0314 09:35:47.649812 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fm5f"] Mar 14 09:35:47 crc kubenswrapper[5129]: I0314 09:35:47.668509 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fm5f"] Mar 14 09:35:47 crc kubenswrapper[5129]: E0314 09:35:47.749366 5129 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89c5cdf3_4690_46be_ae26_c211f5d0126c.slice/crio-1bbce135767aa3fe9724f278aaecd9a0b920afd92a2fabb3c109287b7a0b2566\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89c5cdf3_4690_46be_ae26_c211f5d0126c.slice\": RecentStats: unable to find data in memory cache]" Mar 14 09:35:48 crc kubenswrapper[5129]: I0314 09:35:48.084796 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89c5cdf3-4690-46be-ae26-c211f5d0126c" path="/var/lib/kubelet/pods/89c5cdf3-4690-46be-ae26-c211f5d0126c/volumes" Mar 14 09:35:50 crc kubenswrapper[5129]: I0314 09:35:50.417374 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="8731d474-6329-4a56-be08-fe3d12bb33cd" containerName="galera" probeResult="failure" output="command timed out" Mar 14 09:35:50 crc kubenswrapper[5129]: I0314 09:35:50.418429 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="8731d474-6329-4a56-be08-fe3d12bb33cd" containerName="galera" probeResult="failure" output="command timed out" Mar 14 09:35:57 crc kubenswrapper[5129]: I0314 09:35:57.037121 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:35:57 crc 
kubenswrapper[5129]: E0314 09:35:57.038089 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:36:00 crc kubenswrapper[5129]: I0314 09:36:00.174569 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558016-422qk"] Mar 14 09:36:00 crc kubenswrapper[5129]: E0314 09:36:00.176829 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c5cdf3-4690-46be-ae26-c211f5d0126c" containerName="extract-utilities" Mar 14 09:36:00 crc kubenswrapper[5129]: I0314 09:36:00.176859 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c5cdf3-4690-46be-ae26-c211f5d0126c" containerName="extract-utilities" Mar 14 09:36:00 crc kubenswrapper[5129]: E0314 09:36:00.176902 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c5cdf3-4690-46be-ae26-c211f5d0126c" containerName="registry-server" Mar 14 09:36:00 crc kubenswrapper[5129]: I0314 09:36:00.176916 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c5cdf3-4690-46be-ae26-c211f5d0126c" containerName="registry-server" Mar 14 09:36:00 crc kubenswrapper[5129]: E0314 09:36:00.176992 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c5cdf3-4690-46be-ae26-c211f5d0126c" containerName="extract-content" Mar 14 09:36:00 crc kubenswrapper[5129]: I0314 09:36:00.177007 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c5cdf3-4690-46be-ae26-c211f5d0126c" containerName="extract-content" Mar 14 09:36:00 crc kubenswrapper[5129]: I0314 09:36:00.177417 5129 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="89c5cdf3-4690-46be-ae26-c211f5d0126c" containerName="registry-server" Mar 14 09:36:00 crc kubenswrapper[5129]: I0314 09:36:00.178954 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558016-422qk" Mar 14 09:36:00 crc kubenswrapper[5129]: I0314 09:36:00.185491 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:36:00 crc kubenswrapper[5129]: I0314 09:36:00.185565 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:36:00 crc kubenswrapper[5129]: I0314 09:36:00.186302 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:36:00 crc kubenswrapper[5129]: I0314 09:36:00.190368 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558016-422qk"] Mar 14 09:36:00 crc kubenswrapper[5129]: I0314 09:36:00.260202 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsw4c\" (UniqueName: \"kubernetes.io/projected/1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98-kube-api-access-qsw4c\") pod \"auto-csr-approver-29558016-422qk\" (UID: \"1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98\") " pod="openshift-infra/auto-csr-approver-29558016-422qk" Mar 14 09:36:00 crc kubenswrapper[5129]: I0314 09:36:00.364273 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsw4c\" (UniqueName: \"kubernetes.io/projected/1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98-kube-api-access-qsw4c\") pod \"auto-csr-approver-29558016-422qk\" (UID: \"1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98\") " pod="openshift-infra/auto-csr-approver-29558016-422qk" Mar 14 09:36:00 crc kubenswrapper[5129]: I0314 09:36:00.397977 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsw4c\" 
(UniqueName: \"kubernetes.io/projected/1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98-kube-api-access-qsw4c\") pod \"auto-csr-approver-29558016-422qk\" (UID: \"1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98\") " pod="openshift-infra/auto-csr-approver-29558016-422qk" Mar 14 09:36:00 crc kubenswrapper[5129]: I0314 09:36:00.504972 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558016-422qk" Mar 14 09:36:01 crc kubenswrapper[5129]: I0314 09:36:01.361895 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558016-422qk"] Mar 14 09:36:01 crc kubenswrapper[5129]: W0314 09:36:01.368384 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e66e8d9_d262_4c9f_8ef3_7eb33ea19a98.slice/crio-028dd9ac63c2940727d450058c7cff921aad111bd23a7861928f2ef5f7f4afe7 WatchSource:0}: Error finding container 028dd9ac63c2940727d450058c7cff921aad111bd23a7861928f2ef5f7f4afe7: Status 404 returned error can't find the container with id 028dd9ac63c2940727d450058c7cff921aad111bd23a7861928f2ef5f7f4afe7 Mar 14 09:36:01 crc kubenswrapper[5129]: I0314 09:36:01.881858 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558016-422qk" event={"ID":"1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98","Type":"ContainerStarted","Data":"028dd9ac63c2940727d450058c7cff921aad111bd23a7861928f2ef5f7f4afe7"} Mar 14 09:36:02 crc kubenswrapper[5129]: I0314 09:36:02.897949 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558016-422qk" event={"ID":"1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98","Type":"ContainerStarted","Data":"6ece8fe87ecff869cd2fe17822616508f2ab609ade8ff9d985ef99da7fc343c4"} Mar 14 09:36:02 crc kubenswrapper[5129]: I0314 09:36:02.928653 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558016-422qk" 
podStartSLOduration=1.894485465 podStartE2EDuration="2.928623115s" podCreationTimestamp="2026-03-14 09:36:00 +0000 UTC" firstStartedPulling="2026-03-14 09:36:01.374705337 +0000 UTC m=+9424.126620521" lastFinishedPulling="2026-03-14 09:36:02.408842967 +0000 UTC m=+9425.160758171" observedRunningTime="2026-03-14 09:36:02.922032486 +0000 UTC m=+9425.673947690" watchObservedRunningTime="2026-03-14 09:36:02.928623115 +0000 UTC m=+9425.680538329" Mar 14 09:36:03 crc kubenswrapper[5129]: I0314 09:36:03.911305 5129 generic.go:334] "Generic (PLEG): container finished" podID="1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98" containerID="6ece8fe87ecff869cd2fe17822616508f2ab609ade8ff9d985ef99da7fc343c4" exitCode=0 Mar 14 09:36:03 crc kubenswrapper[5129]: I0314 09:36:03.911403 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558016-422qk" event={"ID":"1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98","Type":"ContainerDied","Data":"6ece8fe87ecff869cd2fe17822616508f2ab609ade8ff9d985ef99da7fc343c4"} Mar 14 09:36:05 crc kubenswrapper[5129]: I0314 09:36:05.940485 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558016-422qk" event={"ID":"1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98","Type":"ContainerDied","Data":"028dd9ac63c2940727d450058c7cff921aad111bd23a7861928f2ef5f7f4afe7"} Mar 14 09:36:05 crc kubenswrapper[5129]: I0314 09:36:05.940907 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="028dd9ac63c2940727d450058c7cff921aad111bd23a7861928f2ef5f7f4afe7" Mar 14 09:36:06 crc kubenswrapper[5129]: I0314 09:36:06.073756 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558016-422qk" Mar 14 09:36:06 crc kubenswrapper[5129]: I0314 09:36:06.244667 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsw4c\" (UniqueName: \"kubernetes.io/projected/1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98-kube-api-access-qsw4c\") pod \"1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98\" (UID: \"1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98\") " Mar 14 09:36:06 crc kubenswrapper[5129]: I0314 09:36:06.250877 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98-kube-api-access-qsw4c" (OuterVolumeSpecName: "kube-api-access-qsw4c") pod "1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98" (UID: "1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98"). InnerVolumeSpecName "kube-api-access-qsw4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:06 crc kubenswrapper[5129]: I0314 09:36:06.348816 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsw4c\" (UniqueName: \"kubernetes.io/projected/1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98-kube-api-access-qsw4c\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:06 crc kubenswrapper[5129]: I0314 09:36:06.954690 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558016-422qk" Mar 14 09:36:07 crc kubenswrapper[5129]: I0314 09:36:07.199559 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558010-26868"] Mar 14 09:36:07 crc kubenswrapper[5129]: I0314 09:36:07.216774 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558010-26868"] Mar 14 09:36:08 crc kubenswrapper[5129]: I0314 09:36:08.054947 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90ecf9e6-a1ff-4ce4-8e57-931450dd5261" path="/var/lib/kubelet/pods/90ecf9e6-a1ff-4ce4-8e57-931450dd5261/volumes" Mar 14 09:36:12 crc kubenswrapper[5129]: I0314 09:36:12.037887 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:36:12 crc kubenswrapper[5129]: E0314 09:36:12.039762 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:36:17 crc kubenswrapper[5129]: I0314 09:36:17.539564 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-blq5l"] Mar 14 09:36:17 crc kubenswrapper[5129]: E0314 09:36:17.541575 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98" containerName="oc" Mar 14 09:36:17 crc kubenswrapper[5129]: I0314 09:36:17.541631 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98" containerName="oc" Mar 14 09:36:17 crc kubenswrapper[5129]: I0314 09:36:17.542054 5129 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98" containerName="oc" Mar 14 09:36:17 crc kubenswrapper[5129]: I0314 09:36:17.545459 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-blq5l" Mar 14 09:36:17 crc kubenswrapper[5129]: I0314 09:36:17.557431 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-blq5l"] Mar 14 09:36:17 crc kubenswrapper[5129]: I0314 09:36:17.684271 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq57f\" (UniqueName: \"kubernetes.io/projected/63f9d51f-dca6-4557-b8f0-d79815c8682a-kube-api-access-lq57f\") pod \"redhat-operators-blq5l\" (UID: \"63f9d51f-dca6-4557-b8f0-d79815c8682a\") " pod="openshift-marketplace/redhat-operators-blq5l" Mar 14 09:36:17 crc kubenswrapper[5129]: I0314 09:36:17.684395 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f9d51f-dca6-4557-b8f0-d79815c8682a-utilities\") pod \"redhat-operators-blq5l\" (UID: \"63f9d51f-dca6-4557-b8f0-d79815c8682a\") " pod="openshift-marketplace/redhat-operators-blq5l" Mar 14 09:36:17 crc kubenswrapper[5129]: I0314 09:36:17.684437 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f9d51f-dca6-4557-b8f0-d79815c8682a-catalog-content\") pod \"redhat-operators-blq5l\" (UID: \"63f9d51f-dca6-4557-b8f0-d79815c8682a\") " pod="openshift-marketplace/redhat-operators-blq5l" Mar 14 09:36:17 crc kubenswrapper[5129]: I0314 09:36:17.786733 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq57f\" (UniqueName: \"kubernetes.io/projected/63f9d51f-dca6-4557-b8f0-d79815c8682a-kube-api-access-lq57f\") pod \"redhat-operators-blq5l\" (UID: 
\"63f9d51f-dca6-4557-b8f0-d79815c8682a\") " pod="openshift-marketplace/redhat-operators-blq5l" Mar 14 09:36:17 crc kubenswrapper[5129]: I0314 09:36:17.786846 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f9d51f-dca6-4557-b8f0-d79815c8682a-utilities\") pod \"redhat-operators-blq5l\" (UID: \"63f9d51f-dca6-4557-b8f0-d79815c8682a\") " pod="openshift-marketplace/redhat-operators-blq5l" Mar 14 09:36:17 crc kubenswrapper[5129]: I0314 09:36:17.786887 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f9d51f-dca6-4557-b8f0-d79815c8682a-catalog-content\") pod \"redhat-operators-blq5l\" (UID: \"63f9d51f-dca6-4557-b8f0-d79815c8682a\") " pod="openshift-marketplace/redhat-operators-blq5l" Mar 14 09:36:17 crc kubenswrapper[5129]: I0314 09:36:17.787497 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f9d51f-dca6-4557-b8f0-d79815c8682a-catalog-content\") pod \"redhat-operators-blq5l\" (UID: \"63f9d51f-dca6-4557-b8f0-d79815c8682a\") " pod="openshift-marketplace/redhat-operators-blq5l" Mar 14 09:36:17 crc kubenswrapper[5129]: I0314 09:36:17.787592 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f9d51f-dca6-4557-b8f0-d79815c8682a-utilities\") pod \"redhat-operators-blq5l\" (UID: \"63f9d51f-dca6-4557-b8f0-d79815c8682a\") " pod="openshift-marketplace/redhat-operators-blq5l" Mar 14 09:36:17 crc kubenswrapper[5129]: I0314 09:36:17.810925 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq57f\" (UniqueName: \"kubernetes.io/projected/63f9d51f-dca6-4557-b8f0-d79815c8682a-kube-api-access-lq57f\") pod \"redhat-operators-blq5l\" (UID: \"63f9d51f-dca6-4557-b8f0-d79815c8682a\") " 
pod="openshift-marketplace/redhat-operators-blq5l" Mar 14 09:36:17 crc kubenswrapper[5129]: I0314 09:36:17.876587 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-blq5l" Mar 14 09:36:18 crc kubenswrapper[5129]: I0314 09:36:18.383830 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-blq5l"] Mar 14 09:36:19 crc kubenswrapper[5129]: I0314 09:36:19.128058 5129 generic.go:334] "Generic (PLEG): container finished" podID="63f9d51f-dca6-4557-b8f0-d79815c8682a" containerID="43148d809efcedc1ffaee9ea83bb8107f74a01e19771fb734858f29de507e361" exitCode=0 Mar 14 09:36:19 crc kubenswrapper[5129]: I0314 09:36:19.128193 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blq5l" event={"ID":"63f9d51f-dca6-4557-b8f0-d79815c8682a","Type":"ContainerDied","Data":"43148d809efcedc1ffaee9ea83bb8107f74a01e19771fb734858f29de507e361"} Mar 14 09:36:19 crc kubenswrapper[5129]: I0314 09:36:19.128805 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blq5l" event={"ID":"63f9d51f-dca6-4557-b8f0-d79815c8682a","Type":"ContainerStarted","Data":"cf572b5a4463855c19a935a71d2483c2050aad1d51b1ff1b621f8e09de8d8914"} Mar 14 09:36:20 crc kubenswrapper[5129]: I0314 09:36:20.147877 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blq5l" event={"ID":"63f9d51f-dca6-4557-b8f0-d79815c8682a","Type":"ContainerStarted","Data":"40fab1d0121ad2f52f6f1bd539ab95c58899e2dff135b0d06545ab01c65b2896"} Mar 14 09:36:23 crc kubenswrapper[5129]: I0314 09:36:23.180387 5129 generic.go:334] "Generic (PLEG): container finished" podID="63f9d51f-dca6-4557-b8f0-d79815c8682a" containerID="40fab1d0121ad2f52f6f1bd539ab95c58899e2dff135b0d06545ab01c65b2896" exitCode=0 Mar 14 09:36:23 crc kubenswrapper[5129]: I0314 09:36:23.180642 5129 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-blq5l" event={"ID":"63f9d51f-dca6-4557-b8f0-d79815c8682a","Type":"ContainerDied","Data":"40fab1d0121ad2f52f6f1bd539ab95c58899e2dff135b0d06545ab01c65b2896"} Mar 14 09:36:24 crc kubenswrapper[5129]: I0314 09:36:24.037168 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:36:24 crc kubenswrapper[5129]: E0314 09:36:24.038171 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:36:24 crc kubenswrapper[5129]: I0314 09:36:24.197008 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blq5l" event={"ID":"63f9d51f-dca6-4557-b8f0-d79815c8682a","Type":"ContainerStarted","Data":"deed4955f375105e80e746bbccc19398ee23030c9a1d69235866282153a3e60e"} Mar 14 09:36:24 crc kubenswrapper[5129]: I0314 09:36:24.232306 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-blq5l" podStartSLOduration=2.7295754309999998 podStartE2EDuration="7.23228498s" podCreationTimestamp="2026-03-14 09:36:17 +0000 UTC" firstStartedPulling="2026-03-14 09:36:19.13125066 +0000 UTC m=+9441.883165844" lastFinishedPulling="2026-03-14 09:36:23.633960189 +0000 UTC m=+9446.385875393" observedRunningTime="2026-03-14 09:36:24.221628081 +0000 UTC m=+9446.973543265" watchObservedRunningTime="2026-03-14 09:36:24.23228498 +0000 UTC m=+9446.984200154" Mar 14 09:36:27 crc kubenswrapper[5129]: I0314 09:36:27.877817 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-blq5l" Mar 14 09:36:27 crc kubenswrapper[5129]: I0314 09:36:27.879256 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-blq5l" Mar 14 09:36:28 crc kubenswrapper[5129]: I0314 09:36:28.957790 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-blq5l" podUID="63f9d51f-dca6-4557-b8f0-d79815c8682a" containerName="registry-server" probeResult="failure" output=< Mar 14 09:36:28 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 09:36:28 crc kubenswrapper[5129]: > Mar 14 09:36:35 crc kubenswrapper[5129]: I0314 09:36:35.038497 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:36:35 crc kubenswrapper[5129]: E0314 09:36:35.040266 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:36:36 crc kubenswrapper[5129]: I0314 09:36:36.151658 5129 scope.go:117] "RemoveContainer" containerID="d48ab269e23093c0eb28fc66ad6848f5499cc9c45566e6d196a92080f995a12e" Mar 14 09:36:37 crc kubenswrapper[5129]: I0314 09:36:37.929677 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-blq5l" Mar 14 09:36:37 crc kubenswrapper[5129]: I0314 09:36:37.989005 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-blq5l" Mar 14 09:36:38 crc kubenswrapper[5129]: I0314 09:36:38.175526 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-blq5l"] Mar 14 09:36:39 crc kubenswrapper[5129]: I0314 09:36:39.416788 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-blq5l" podUID="63f9d51f-dca6-4557-b8f0-d79815c8682a" containerName="registry-server" containerID="cri-o://deed4955f375105e80e746bbccc19398ee23030c9a1d69235866282153a3e60e" gracePeriod=2 Mar 14 09:36:39 crc kubenswrapper[5129]: I0314 09:36:39.981304 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-blq5l" Mar 14 09:36:40 crc kubenswrapper[5129]: I0314 09:36:40.121192 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f9d51f-dca6-4557-b8f0-d79815c8682a-utilities\") pod \"63f9d51f-dca6-4557-b8f0-d79815c8682a\" (UID: \"63f9d51f-dca6-4557-b8f0-d79815c8682a\") " Mar 14 09:36:40 crc kubenswrapper[5129]: I0314 09:36:40.121451 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq57f\" (UniqueName: \"kubernetes.io/projected/63f9d51f-dca6-4557-b8f0-d79815c8682a-kube-api-access-lq57f\") pod \"63f9d51f-dca6-4557-b8f0-d79815c8682a\" (UID: \"63f9d51f-dca6-4557-b8f0-d79815c8682a\") " Mar 14 09:36:40 crc kubenswrapper[5129]: I0314 09:36:40.121573 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f9d51f-dca6-4557-b8f0-d79815c8682a-catalog-content\") pod \"63f9d51f-dca6-4557-b8f0-d79815c8682a\" (UID: \"63f9d51f-dca6-4557-b8f0-d79815c8682a\") " Mar 14 09:36:40 crc kubenswrapper[5129]: I0314 09:36:40.122883 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f9d51f-dca6-4557-b8f0-d79815c8682a-utilities" (OuterVolumeSpecName: "utilities") pod "63f9d51f-dca6-4557-b8f0-d79815c8682a" (UID: 
"63f9d51f-dca6-4557-b8f0-d79815c8682a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:36:40 crc kubenswrapper[5129]: I0314 09:36:40.123479 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f9d51f-dca6-4557-b8f0-d79815c8682a-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:40 crc kubenswrapper[5129]: I0314 09:36:40.132829 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63f9d51f-dca6-4557-b8f0-d79815c8682a-kube-api-access-lq57f" (OuterVolumeSpecName: "kube-api-access-lq57f") pod "63f9d51f-dca6-4557-b8f0-d79815c8682a" (UID: "63f9d51f-dca6-4557-b8f0-d79815c8682a"). InnerVolumeSpecName "kube-api-access-lq57f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:40 crc kubenswrapper[5129]: I0314 09:36:40.225303 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq57f\" (UniqueName: \"kubernetes.io/projected/63f9d51f-dca6-4557-b8f0-d79815c8682a-kube-api-access-lq57f\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:40 crc kubenswrapper[5129]: I0314 09:36:40.265130 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f9d51f-dca6-4557-b8f0-d79815c8682a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63f9d51f-dca6-4557-b8f0-d79815c8682a" (UID: "63f9d51f-dca6-4557-b8f0-d79815c8682a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:36:40 crc kubenswrapper[5129]: I0314 09:36:40.327878 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f9d51f-dca6-4557-b8f0-d79815c8682a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:40 crc kubenswrapper[5129]: I0314 09:36:40.433210 5129 generic.go:334] "Generic (PLEG): container finished" podID="63f9d51f-dca6-4557-b8f0-d79815c8682a" containerID="deed4955f375105e80e746bbccc19398ee23030c9a1d69235866282153a3e60e" exitCode=0 Mar 14 09:36:40 crc kubenswrapper[5129]: I0314 09:36:40.433419 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blq5l" event={"ID":"63f9d51f-dca6-4557-b8f0-d79815c8682a","Type":"ContainerDied","Data":"deed4955f375105e80e746bbccc19398ee23030c9a1d69235866282153a3e60e"} Mar 14 09:36:40 crc kubenswrapper[5129]: I0314 09:36:40.434122 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-blq5l" Mar 14 09:36:40 crc kubenswrapper[5129]: I0314 09:36:40.434931 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-blq5l" event={"ID":"63f9d51f-dca6-4557-b8f0-d79815c8682a","Type":"ContainerDied","Data":"cf572b5a4463855c19a935a71d2483c2050aad1d51b1ff1b621f8e09de8d8914"} Mar 14 09:36:40 crc kubenswrapper[5129]: I0314 09:36:40.434983 5129 scope.go:117] "RemoveContainer" containerID="deed4955f375105e80e746bbccc19398ee23030c9a1d69235866282153a3e60e" Mar 14 09:36:40 crc kubenswrapper[5129]: I0314 09:36:40.493824 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-blq5l"] Mar 14 09:36:40 crc kubenswrapper[5129]: I0314 09:36:40.494108 5129 scope.go:117] "RemoveContainer" containerID="40fab1d0121ad2f52f6f1bd539ab95c58899e2dff135b0d06545ab01c65b2896" Mar 14 09:36:40 crc kubenswrapper[5129]: I0314 09:36:40.503908 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-blq5l"] Mar 14 09:36:40 crc kubenswrapper[5129]: I0314 09:36:40.541485 5129 scope.go:117] "RemoveContainer" containerID="43148d809efcedc1ffaee9ea83bb8107f74a01e19771fb734858f29de507e361" Mar 14 09:36:41 crc kubenswrapper[5129]: I0314 09:36:41.396384 5129 scope.go:117] "RemoveContainer" containerID="deed4955f375105e80e746bbccc19398ee23030c9a1d69235866282153a3e60e" Mar 14 09:36:41 crc kubenswrapper[5129]: E0314 09:36:41.397117 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deed4955f375105e80e746bbccc19398ee23030c9a1d69235866282153a3e60e\": container with ID starting with deed4955f375105e80e746bbccc19398ee23030c9a1d69235866282153a3e60e not found: ID does not exist" containerID="deed4955f375105e80e746bbccc19398ee23030c9a1d69235866282153a3e60e" Mar 14 09:36:41 crc kubenswrapper[5129]: I0314 09:36:41.397193 5129 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deed4955f375105e80e746bbccc19398ee23030c9a1d69235866282153a3e60e"} err="failed to get container status \"deed4955f375105e80e746bbccc19398ee23030c9a1d69235866282153a3e60e\": rpc error: code = NotFound desc = could not find container \"deed4955f375105e80e746bbccc19398ee23030c9a1d69235866282153a3e60e\": container with ID starting with deed4955f375105e80e746bbccc19398ee23030c9a1d69235866282153a3e60e not found: ID does not exist" Mar 14 09:36:41 crc kubenswrapper[5129]: I0314 09:36:41.397236 5129 scope.go:117] "RemoveContainer" containerID="40fab1d0121ad2f52f6f1bd539ab95c58899e2dff135b0d06545ab01c65b2896" Mar 14 09:36:41 crc kubenswrapper[5129]: E0314 09:36:41.397812 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40fab1d0121ad2f52f6f1bd539ab95c58899e2dff135b0d06545ab01c65b2896\": container with ID starting with 40fab1d0121ad2f52f6f1bd539ab95c58899e2dff135b0d06545ab01c65b2896 not found: ID does not exist" containerID="40fab1d0121ad2f52f6f1bd539ab95c58899e2dff135b0d06545ab01c65b2896" Mar 14 09:36:41 crc kubenswrapper[5129]: I0314 09:36:41.397857 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40fab1d0121ad2f52f6f1bd539ab95c58899e2dff135b0d06545ab01c65b2896"} err="failed to get container status \"40fab1d0121ad2f52f6f1bd539ab95c58899e2dff135b0d06545ab01c65b2896\": rpc error: code = NotFound desc = could not find container \"40fab1d0121ad2f52f6f1bd539ab95c58899e2dff135b0d06545ab01c65b2896\": container with ID starting with 40fab1d0121ad2f52f6f1bd539ab95c58899e2dff135b0d06545ab01c65b2896 not found: ID does not exist" Mar 14 09:36:41 crc kubenswrapper[5129]: I0314 09:36:41.397900 5129 scope.go:117] "RemoveContainer" containerID="43148d809efcedc1ffaee9ea83bb8107f74a01e19771fb734858f29de507e361" Mar 14 09:36:41 crc kubenswrapper[5129]: E0314 
09:36:41.398312 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43148d809efcedc1ffaee9ea83bb8107f74a01e19771fb734858f29de507e361\": container with ID starting with 43148d809efcedc1ffaee9ea83bb8107f74a01e19771fb734858f29de507e361 not found: ID does not exist" containerID="43148d809efcedc1ffaee9ea83bb8107f74a01e19771fb734858f29de507e361" Mar 14 09:36:41 crc kubenswrapper[5129]: I0314 09:36:41.398337 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43148d809efcedc1ffaee9ea83bb8107f74a01e19771fb734858f29de507e361"} err="failed to get container status \"43148d809efcedc1ffaee9ea83bb8107f74a01e19771fb734858f29de507e361\": rpc error: code = NotFound desc = could not find container \"43148d809efcedc1ffaee9ea83bb8107f74a01e19771fb734858f29de507e361\": container with ID starting with 43148d809efcedc1ffaee9ea83bb8107f74a01e19771fb734858f29de507e361 not found: ID does not exist" Mar 14 09:36:42 crc kubenswrapper[5129]: I0314 09:36:42.062049 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63f9d51f-dca6-4557-b8f0-d79815c8682a" path="/var/lib/kubelet/pods/63f9d51f-dca6-4557-b8f0-d79815c8682a/volumes" Mar 14 09:36:47 crc kubenswrapper[5129]: I0314 09:36:47.036995 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:36:47 crc kubenswrapper[5129]: E0314 09:36:47.038031 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:36:59 crc kubenswrapper[5129]: I0314 09:36:59.036884 
5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:36:59 crc kubenswrapper[5129]: E0314 09:36:59.038227 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:37:12 crc kubenswrapper[5129]: I0314 09:37:12.056620 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:37:12 crc kubenswrapper[5129]: E0314 09:37:12.057891 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:37:27 crc kubenswrapper[5129]: I0314 09:37:27.036694 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816" Mar 14 09:37:28 crc kubenswrapper[5129]: I0314 09:37:28.208824 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"6c6c70ed35abb8104cdb80e74bbb70d10c64039fda720bf97b8f982caf34cf4c"} Mar 14 09:38:00 crc kubenswrapper[5129]: I0314 09:38:00.160494 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558018-gdb87"] Mar 14 09:38:00 crc 
kubenswrapper[5129]: E0314 09:38:00.161490 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f9d51f-dca6-4557-b8f0-d79815c8682a" containerName="registry-server" Mar 14 09:38:00 crc kubenswrapper[5129]: I0314 09:38:00.161508 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f9d51f-dca6-4557-b8f0-d79815c8682a" containerName="registry-server" Mar 14 09:38:00 crc kubenswrapper[5129]: E0314 09:38:00.161539 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f9d51f-dca6-4557-b8f0-d79815c8682a" containerName="extract-content" Mar 14 09:38:00 crc kubenswrapper[5129]: I0314 09:38:00.161545 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f9d51f-dca6-4557-b8f0-d79815c8682a" containerName="extract-content" Mar 14 09:38:00 crc kubenswrapper[5129]: E0314 09:38:00.161559 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f9d51f-dca6-4557-b8f0-d79815c8682a" containerName="extract-utilities" Mar 14 09:38:00 crc kubenswrapper[5129]: I0314 09:38:00.161568 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f9d51f-dca6-4557-b8f0-d79815c8682a" containerName="extract-utilities" Mar 14 09:38:00 crc kubenswrapper[5129]: I0314 09:38:00.161768 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="63f9d51f-dca6-4557-b8f0-d79815c8682a" containerName="registry-server" Mar 14 09:38:00 crc kubenswrapper[5129]: I0314 09:38:00.162430 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558018-gdb87" Mar 14 09:38:00 crc kubenswrapper[5129]: I0314 09:38:00.165830 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:38:00 crc kubenswrapper[5129]: I0314 09:38:00.166250 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:38:00 crc kubenswrapper[5129]: I0314 09:38:00.166545 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:38:00 crc kubenswrapper[5129]: I0314 09:38:00.187267 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558018-gdb87"] Mar 14 09:38:00 crc kubenswrapper[5129]: I0314 09:38:00.334791 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzfzf\" (UniqueName: \"kubernetes.io/projected/aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7-kube-api-access-bzfzf\") pod \"auto-csr-approver-29558018-gdb87\" (UID: \"aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7\") " pod="openshift-infra/auto-csr-approver-29558018-gdb87" Mar 14 09:38:00 crc kubenswrapper[5129]: I0314 09:38:00.438933 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzfzf\" (UniqueName: \"kubernetes.io/projected/aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7-kube-api-access-bzfzf\") pod \"auto-csr-approver-29558018-gdb87\" (UID: \"aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7\") " pod="openshift-infra/auto-csr-approver-29558018-gdb87" Mar 14 09:38:00 crc kubenswrapper[5129]: I0314 09:38:00.465576 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzfzf\" (UniqueName: \"kubernetes.io/projected/aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7-kube-api-access-bzfzf\") pod \"auto-csr-approver-29558018-gdb87\" (UID: \"aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7\") " 
pod="openshift-infra/auto-csr-approver-29558018-gdb87" Mar 14 09:38:00 crc kubenswrapper[5129]: I0314 09:38:00.487799 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558018-gdb87" Mar 14 09:38:01 crc kubenswrapper[5129]: I0314 09:38:01.021580 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558018-gdb87"] Mar 14 09:38:01 crc kubenswrapper[5129]: I0314 09:38:01.707373 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558018-gdb87" event={"ID":"aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7","Type":"ContainerStarted","Data":"0cf2fc7816fa21330452fa835e039fcb328b60891a210e5c97cc58d3224ef7cf"} Mar 14 09:38:02 crc kubenswrapper[5129]: I0314 09:38:02.719959 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558018-gdb87" event={"ID":"aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7","Type":"ContainerStarted","Data":"7221b561d1673c8b9037fede56f840f460a32c47e3362518d093389c1a3b2d1e"} Mar 14 09:38:02 crc kubenswrapper[5129]: I0314 09:38:02.742344 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558018-gdb87" podStartSLOduration=1.9328550660000001 podStartE2EDuration="2.742323318s" podCreationTimestamp="2026-03-14 09:38:00 +0000 UTC" firstStartedPulling="2026-03-14 09:38:01.029486385 +0000 UTC m=+9543.781401569" lastFinishedPulling="2026-03-14 09:38:01.838954637 +0000 UTC m=+9544.590869821" observedRunningTime="2026-03-14 09:38:02.740509008 +0000 UTC m=+9545.492424192" watchObservedRunningTime="2026-03-14 09:38:02.742323318 +0000 UTC m=+9545.494238502" Mar 14 09:38:03 crc kubenswrapper[5129]: I0314 09:38:03.730658 5129 generic.go:334] "Generic (PLEG): container finished" podID="aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7" containerID="7221b561d1673c8b9037fede56f840f460a32c47e3362518d093389c1a3b2d1e" exitCode=0 Mar 14 09:38:03 crc 
kubenswrapper[5129]: I0314 09:38:03.730750 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558018-gdb87" event={"ID":"aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7","Type":"ContainerDied","Data":"7221b561d1673c8b9037fede56f840f460a32c47e3362518d093389c1a3b2d1e"} Mar 14 09:38:05 crc kubenswrapper[5129]: I0314 09:38:05.166636 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558018-gdb87" Mar 14 09:38:05 crc kubenswrapper[5129]: I0314 09:38:05.193996 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzfzf\" (UniqueName: \"kubernetes.io/projected/aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7-kube-api-access-bzfzf\") pod \"aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7\" (UID: \"aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7\") " Mar 14 09:38:05 crc kubenswrapper[5129]: I0314 09:38:05.204512 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7-kube-api-access-bzfzf" (OuterVolumeSpecName: "kube-api-access-bzfzf") pod "aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7" (UID: "aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7"). InnerVolumeSpecName "kube-api-access-bzfzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:38:05 crc kubenswrapper[5129]: I0314 09:38:05.297370 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzfzf\" (UniqueName: \"kubernetes.io/projected/aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7-kube-api-access-bzfzf\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:05 crc kubenswrapper[5129]: I0314 09:38:05.796647 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558018-gdb87" event={"ID":"aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7","Type":"ContainerDied","Data":"0cf2fc7816fa21330452fa835e039fcb328b60891a210e5c97cc58d3224ef7cf"} Mar 14 09:38:05 crc kubenswrapper[5129]: I0314 09:38:05.796733 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cf2fc7816fa21330452fa835e039fcb328b60891a210e5c97cc58d3224ef7cf" Mar 14 09:38:05 crc kubenswrapper[5129]: I0314 09:38:05.796867 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558018-gdb87" Mar 14 09:38:05 crc kubenswrapper[5129]: I0314 09:38:05.836165 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558012-jbc2x"] Mar 14 09:38:05 crc kubenswrapper[5129]: I0314 09:38:05.847341 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558012-jbc2x"] Mar 14 09:38:06 crc kubenswrapper[5129]: I0314 09:38:06.061531 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2240b996-b085-40b6-bc39-f2466ca00def" path="/var/lib/kubelet/pods/2240b996-b085-40b6-bc39-f2466ca00def/volumes" Mar 14 09:38:36 crc kubenswrapper[5129]: I0314 09:38:36.331888 5129 scope.go:117] "RemoveContainer" containerID="d2436b469869623fd27216a0bc87d82dab1b87682bf32eec3839e49c624136e3" Mar 14 09:38:40 crc kubenswrapper[5129]: I0314 09:38:40.269292 5129 generic.go:334] "Generic (PLEG): container finished" 
podID="623aba17-af6f-4ec2-8d79-1d71984816d2" containerID="d4ce7fa6729a0852719287482f1d8b484fc51084c781b28eb53ea054d9b4fa65" exitCode=0 Mar 14 09:38:40 crc kubenswrapper[5129]: I0314 09:38:40.269443 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" event={"ID":"623aba17-af6f-4ec2-8d79-1d71984816d2","Type":"ContainerDied","Data":"d4ce7fa6729a0852719287482f1d8b484fc51084c781b28eb53ea054d9b4fa65"} Mar 14 09:38:41 crc kubenswrapper[5129]: I0314 09:38:41.851451 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.025300 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-libvirt-secret-0\") pod \"623aba17-af6f-4ec2-8d79-1d71984816d2\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.025459 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj9sp\" (UniqueName: \"kubernetes.io/projected/623aba17-af6f-4ec2-8d79-1d71984816d2-kube-api-access-zj9sp\") pod \"623aba17-af6f-4ec2-8d79-1d71984816d2\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.025672 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-libvirt-combined-ca-bundle\") pod \"623aba17-af6f-4ec2-8d79-1d71984816d2\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.025712 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-ssh-key-openstack-cell1\") pod \"623aba17-af6f-4ec2-8d79-1d71984816d2\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.025747 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-inventory\") pod \"623aba17-af6f-4ec2-8d79-1d71984816d2\" (UID: \"623aba17-af6f-4ec2-8d79-1d71984816d2\") " Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.035006 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/623aba17-af6f-4ec2-8d79-1d71984816d2-kube-api-access-zj9sp" (OuterVolumeSpecName: "kube-api-access-zj9sp") pod "623aba17-af6f-4ec2-8d79-1d71984816d2" (UID: "623aba17-af6f-4ec2-8d79-1d71984816d2"). InnerVolumeSpecName "kube-api-access-zj9sp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.039794 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "623aba17-af6f-4ec2-8d79-1d71984816d2" (UID: "623aba17-af6f-4ec2-8d79-1d71984816d2"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.063204 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-inventory" (OuterVolumeSpecName: "inventory") pod "623aba17-af6f-4ec2-8d79-1d71984816d2" (UID: "623aba17-af6f-4ec2-8d79-1d71984816d2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.072027 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "623aba17-af6f-4ec2-8d79-1d71984816d2" (UID: "623aba17-af6f-4ec2-8d79-1d71984816d2"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.078096 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "623aba17-af6f-4ec2-8d79-1d71984816d2" (UID: "623aba17-af6f-4ec2-8d79-1d71984816d2"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.129874 5129 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.129913 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.129924 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.129934 5129 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/623aba17-af6f-4ec2-8d79-1d71984816d2-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.129946 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj9sp\" (UniqueName: \"kubernetes.io/projected/623aba17-af6f-4ec2-8d79-1d71984816d2-kube-api-access-zj9sp\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.294717 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" event={"ID":"623aba17-af6f-4ec2-8d79-1d71984816d2","Type":"ContainerDied","Data":"012410b0805dccc1d3bdc50df57f4b15f14302c323bb38c31b05cd1b34fac0b9"} Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.294797 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="012410b0805dccc1d3bdc50df57f4b15f14302c323bb38c31b05cd1b34fac0b9" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.294804 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-p8qzl" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.597367 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-86mwr"] Mar 14 09:38:42 crc kubenswrapper[5129]: E0314 09:38:42.597801 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="623aba17-af6f-4ec2-8d79-1d71984816d2" containerName="libvirt-openstack-openstack-cell1" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.597818 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="623aba17-af6f-4ec2-8d79-1d71984816d2" containerName="libvirt-openstack-openstack-cell1" Mar 14 09:38:42 crc kubenswrapper[5129]: E0314 09:38:42.597863 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7" containerName="oc" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.597870 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7" containerName="oc" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.598033 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="623aba17-af6f-4ec2-8d79-1d71984816d2" containerName="libvirt-openstack-openstack-cell1" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.598050 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7" containerName="oc" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.598743 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.601620 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.601791 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.602123 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.602215 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.602466 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.602751 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.605099 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.611815 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-86mwr"] Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.754457 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.754570 5129 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.754650 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.755042 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.755125 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.755161 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.755213 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-inventory\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.755242 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.755280 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x4kv\" (UniqueName: \"kubernetes.io/projected/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-kube-api-access-4x4kv\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.755325 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.755409 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.857112 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.857207 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.857263 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.857292 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.857358 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.857389 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.857411 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.857435 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-inventory\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.857455 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.857474 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x4kv\" (UniqueName: \"kubernetes.io/projected/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-kube-api-access-4x4kv\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.857494 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.858838 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.867487 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-inventory\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.868098 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.868195 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.868805 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.868910 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 
09:38:42.868958 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.869115 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.884456 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.886683 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.888006 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x4kv\" (UniqueName: \"kubernetes.io/projected/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-kube-api-access-4x4kv\") pod \"nova-cell1-openstack-openstack-cell1-86mwr\" (UID: 
\"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:42 crc kubenswrapper[5129]: I0314 09:38:42.970038 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:38:43 crc kubenswrapper[5129]: I0314 09:38:43.619133 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-86mwr"] Mar 14 09:38:44 crc kubenswrapper[5129]: I0314 09:38:44.367480 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" event={"ID":"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9","Type":"ContainerStarted","Data":"40fb9927776c45f9f43a851569b480a9fbdbb8ce6aaae0833238ac5255eac2c3"} Mar 14 09:38:45 crc kubenswrapper[5129]: I0314 09:38:45.381728 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" event={"ID":"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9","Type":"ContainerStarted","Data":"75683916f9a502c48651ee21f0f0a752a86f9538dcc193cd658d083d30afd030"} Mar 14 09:38:45 crc kubenswrapper[5129]: I0314 09:38:45.417888 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" podStartSLOduration=2.99356507 podStartE2EDuration="3.417859728s" podCreationTimestamp="2026-03-14 09:38:42 +0000 UTC" firstStartedPulling="2026-03-14 09:38:43.624402637 +0000 UTC m=+9586.376317821" lastFinishedPulling="2026-03-14 09:38:44.048697285 +0000 UTC m=+9586.800612479" observedRunningTime="2026-03-14 09:38:45.408325169 +0000 UTC m=+9588.160240353" watchObservedRunningTime="2026-03-14 09:38:45.417859728 +0000 UTC m=+9588.169774912" Mar 14 09:39:49 crc kubenswrapper[5129]: I0314 09:39:49.574924 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 09:39:49 crc kubenswrapper[5129]: I0314 09:39:49.575934 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 09:40:00 crc kubenswrapper[5129]: I0314 09:40:00.178355 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558020-q9vqx"]
Mar 14 09:40:00 crc kubenswrapper[5129]: I0314 09:40:00.180970 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558020-q9vqx"
Mar 14 09:40:00 crc kubenswrapper[5129]: I0314 09:40:00.187261 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 09:40:00 crc kubenswrapper[5129]: I0314 09:40:00.188273 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf"
Mar 14 09:40:00 crc kubenswrapper[5129]: I0314 09:40:00.188501 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 09:40:00 crc kubenswrapper[5129]: I0314 09:40:00.202515 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558020-q9vqx"]
Mar 14 09:40:00 crc kubenswrapper[5129]: I0314 09:40:00.280940 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcrgq\" (UniqueName: \"kubernetes.io/projected/5463f64e-9ca8-4ff7-8e75-25d58deedadf-kube-api-access-jcrgq\") pod \"auto-csr-approver-29558020-q9vqx\" (UID: \"5463f64e-9ca8-4ff7-8e75-25d58deedadf\") " pod="openshift-infra/auto-csr-approver-29558020-q9vqx"
Mar 14 09:40:00 crc kubenswrapper[5129]: I0314 09:40:00.382398 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcrgq\" (UniqueName: \"kubernetes.io/projected/5463f64e-9ca8-4ff7-8e75-25d58deedadf-kube-api-access-jcrgq\") pod \"auto-csr-approver-29558020-q9vqx\" (UID: \"5463f64e-9ca8-4ff7-8e75-25d58deedadf\") " pod="openshift-infra/auto-csr-approver-29558020-q9vqx"
Mar 14 09:40:00 crc kubenswrapper[5129]: I0314 09:40:00.921500 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcrgq\" (UniqueName: \"kubernetes.io/projected/5463f64e-9ca8-4ff7-8e75-25d58deedadf-kube-api-access-jcrgq\") pod \"auto-csr-approver-29558020-q9vqx\" (UID: \"5463f64e-9ca8-4ff7-8e75-25d58deedadf\") " pod="openshift-infra/auto-csr-approver-29558020-q9vqx"
Mar 14 09:40:01 crc kubenswrapper[5129]: I0314 09:40:01.107227 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558020-q9vqx"
Mar 14 09:40:01 crc kubenswrapper[5129]: I0314 09:40:01.602350 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558020-q9vqx"]
Mar 14 09:40:01 crc kubenswrapper[5129]: W0314 09:40:01.611346 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5463f64e_9ca8_4ff7_8e75_25d58deedadf.slice/crio-8a9e53e1cfd44d513bee9c17ffcbc46a714df2e4d81818d76ed0b2ad39560dfc WatchSource:0}: Error finding container 8a9e53e1cfd44d513bee9c17ffcbc46a714df2e4d81818d76ed0b2ad39560dfc: Status 404 returned error can't find the container with id 8a9e53e1cfd44d513bee9c17ffcbc46a714df2e4d81818d76ed0b2ad39560dfc
Mar 14 09:40:02 crc kubenswrapper[5129]: I0314 09:40:02.465277 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558020-q9vqx" event={"ID":"5463f64e-9ca8-4ff7-8e75-25d58deedadf","Type":"ContainerStarted","Data":"8a9e53e1cfd44d513bee9c17ffcbc46a714df2e4d81818d76ed0b2ad39560dfc"}
Mar 14 09:40:03 crc kubenswrapper[5129]: I0314 09:40:03.478938 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558020-q9vqx" event={"ID":"5463f64e-9ca8-4ff7-8e75-25d58deedadf","Type":"ContainerStarted","Data":"cf87037207a0cf1f4c446a27dfad5daac5e4fa823171734c3b313ef2a5061051"}
Mar 14 09:40:03 crc kubenswrapper[5129]: I0314 09:40:03.504314 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558020-q9vqx" podStartSLOduration=2.253177199 podStartE2EDuration="3.504290868s" podCreationTimestamp="2026-03-14 09:40:00 +0000 UTC" firstStartedPulling="2026-03-14 09:40:01.615571652 +0000 UTC m=+9664.367486836" lastFinishedPulling="2026-03-14 09:40:02.866685311 +0000 UTC m=+9665.618600505" observedRunningTime="2026-03-14 09:40:03.494632216 +0000 UTC m=+9666.246547410" watchObservedRunningTime="2026-03-14 09:40:03.504290868 +0000 UTC m=+9666.256206062"
Mar 14 09:40:04 crc kubenswrapper[5129]: I0314 09:40:04.491991 5129 generic.go:334] "Generic (PLEG): container finished" podID="5463f64e-9ca8-4ff7-8e75-25d58deedadf" containerID="cf87037207a0cf1f4c446a27dfad5daac5e4fa823171734c3b313ef2a5061051" exitCode=0
Mar 14 09:40:04 crc kubenswrapper[5129]: I0314 09:40:04.492080 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558020-q9vqx" event={"ID":"5463f64e-9ca8-4ff7-8e75-25d58deedadf","Type":"ContainerDied","Data":"cf87037207a0cf1f4c446a27dfad5daac5e4fa823171734c3b313ef2a5061051"}
Mar 14 09:40:05 crc kubenswrapper[5129]: I0314 09:40:05.917234 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558020-q9vqx"
Mar 14 09:40:05 crc kubenswrapper[5129]: I0314 09:40:05.945278 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcrgq\" (UniqueName: \"kubernetes.io/projected/5463f64e-9ca8-4ff7-8e75-25d58deedadf-kube-api-access-jcrgq\") pod \"5463f64e-9ca8-4ff7-8e75-25d58deedadf\" (UID: \"5463f64e-9ca8-4ff7-8e75-25d58deedadf\") "
Mar 14 09:40:05 crc kubenswrapper[5129]: I0314 09:40:05.952985 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5463f64e-9ca8-4ff7-8e75-25d58deedadf-kube-api-access-jcrgq" (OuterVolumeSpecName: "kube-api-access-jcrgq") pod "5463f64e-9ca8-4ff7-8e75-25d58deedadf" (UID: "5463f64e-9ca8-4ff7-8e75-25d58deedadf"). InnerVolumeSpecName "kube-api-access-jcrgq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:40:06 crc kubenswrapper[5129]: I0314 09:40:06.050164 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcrgq\" (UniqueName: \"kubernetes.io/projected/5463f64e-9ca8-4ff7-8e75-25d58deedadf-kube-api-access-jcrgq\") on node \"crc\" DevicePath \"\""
Mar 14 09:40:06 crc kubenswrapper[5129]: I0314 09:40:06.556967 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558020-q9vqx" event={"ID":"5463f64e-9ca8-4ff7-8e75-25d58deedadf","Type":"ContainerDied","Data":"8a9e53e1cfd44d513bee9c17ffcbc46a714df2e4d81818d76ed0b2ad39560dfc"}
Mar 14 09:40:06 crc kubenswrapper[5129]: I0314 09:40:06.557032 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a9e53e1cfd44d513bee9c17ffcbc46a714df2e4d81818d76ed0b2ad39560dfc"
Mar 14 09:40:06 crc kubenswrapper[5129]: I0314 09:40:06.557034 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558020-q9vqx"
Mar 14 09:40:06 crc kubenswrapper[5129]: I0314 09:40:06.612689 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558014-p9zlg"]
Mar 14 09:40:06 crc kubenswrapper[5129]: I0314 09:40:06.623888 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558014-p9zlg"]
Mar 14 09:40:08 crc kubenswrapper[5129]: I0314 09:40:08.059461 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82a626e8-0fb6-4e1b-b698-766eb0e72ea0" path="/var/lib/kubelet/pods/82a626e8-0fb6-4e1b-b698-766eb0e72ea0/volumes"
Mar 14 09:40:19 crc kubenswrapper[5129]: I0314 09:40:19.575083 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 09:40:19 crc kubenswrapper[5129]: I0314 09:40:19.575992 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 09:40:36 crc kubenswrapper[5129]: I0314 09:40:36.783931 5129 scope.go:117] "RemoveContainer" containerID="734841ac4ae95ab43993577d941b7ba43891b42cc34ffe936ced9f471e480767"
Mar 14 09:40:49 crc kubenswrapper[5129]: I0314 09:40:49.578852 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 09:40:49 crc kubenswrapper[5129]: I0314 09:40:49.579861 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 09:40:49 crc kubenswrapper[5129]: I0314 09:40:49.580002 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh"
Mar 14 09:40:49 crc kubenswrapper[5129]: I0314 09:40:49.581540 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c6c70ed35abb8104cdb80e74bbb70d10c64039fda720bf97b8f982caf34cf4c"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 14 09:40:49 crc kubenswrapper[5129]: I0314 09:40:49.581776 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://6c6c70ed35abb8104cdb80e74bbb70d10c64039fda720bf97b8f982caf34cf4c" gracePeriod=600
Mar 14 09:40:50 crc kubenswrapper[5129]: I0314 09:40:50.179588 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="6c6c70ed35abb8104cdb80e74bbb70d10c64039fda720bf97b8f982caf34cf4c" exitCode=0
Mar 14 09:40:50 crc kubenswrapper[5129]: I0314 09:40:50.179658 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"6c6c70ed35abb8104cdb80e74bbb70d10c64039fda720bf97b8f982caf34cf4c"}
Mar 14 09:40:50 crc kubenswrapper[5129]: I0314 09:40:50.180587 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6"}
Mar 14 09:40:50 crc kubenswrapper[5129]: I0314 09:40:50.180668 5129 scope.go:117] "RemoveContainer" containerID="e282027e797b4880be26836792718e0bbc60f410fecc95d0c1f66bcb05a66816"
Mar 14 09:41:19 crc kubenswrapper[5129]: I0314 09:41:19.187114 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" podUID="e789f354-e686-4cc9-a705-3af685a25988" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.90:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 09:41:19 crc kubenswrapper[5129]: I0314 09:41:19.187173 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7744kbgv" podUID="e789f354-e686-4cc9-a705-3af685a25988" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.90:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 09:41:23 crc kubenswrapper[5129]: I0314 09:41:23.183693 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c8c95"]
Mar 14 09:41:23 crc kubenswrapper[5129]: E0314 09:41:23.185038 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5463f64e-9ca8-4ff7-8e75-25d58deedadf" containerName="oc"
Mar 14 09:41:23 crc kubenswrapper[5129]: I0314 09:41:23.185057 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="5463f64e-9ca8-4ff7-8e75-25d58deedadf" containerName="oc"
Mar 14 09:41:23 crc kubenswrapper[5129]: I0314 09:41:23.185315 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="5463f64e-9ca8-4ff7-8e75-25d58deedadf" containerName="oc"
Mar 14 09:41:23 crc kubenswrapper[5129]: I0314 09:41:23.187401 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c8c95"
Mar 14 09:41:23 crc kubenswrapper[5129]: I0314 09:41:23.196659 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c8c95"]
Mar 14 09:41:23 crc kubenswrapper[5129]: I0314 09:41:23.286561 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f0c397a-768e-48f7-85cb-9ad6d194f60f-utilities\") pod \"community-operators-c8c95\" (UID: \"1f0c397a-768e-48f7-85cb-9ad6d194f60f\") " pod="openshift-marketplace/community-operators-c8c95"
Mar 14 09:41:23 crc kubenswrapper[5129]: I0314 09:41:23.286693 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn8wv\" (UniqueName: \"kubernetes.io/projected/1f0c397a-768e-48f7-85cb-9ad6d194f60f-kube-api-access-mn8wv\") pod \"community-operators-c8c95\" (UID: \"1f0c397a-768e-48f7-85cb-9ad6d194f60f\") " pod="openshift-marketplace/community-operators-c8c95"
Mar 14 09:41:23 crc kubenswrapper[5129]: I0314 09:41:23.286745 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f0c397a-768e-48f7-85cb-9ad6d194f60f-catalog-content\") pod \"community-operators-c8c95\" (UID: \"1f0c397a-768e-48f7-85cb-9ad6d194f60f\") " pod="openshift-marketplace/community-operators-c8c95"
Mar 14 09:41:23 crc kubenswrapper[5129]: I0314 09:41:23.389110 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f0c397a-768e-48f7-85cb-9ad6d194f60f-utilities\") pod \"community-operators-c8c95\" (UID: \"1f0c397a-768e-48f7-85cb-9ad6d194f60f\") " pod="openshift-marketplace/community-operators-c8c95"
Mar 14 09:41:23 crc kubenswrapper[5129]: I0314 09:41:23.389200 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn8wv\" (UniqueName: \"kubernetes.io/projected/1f0c397a-768e-48f7-85cb-9ad6d194f60f-kube-api-access-mn8wv\") pod \"community-operators-c8c95\" (UID: \"1f0c397a-768e-48f7-85cb-9ad6d194f60f\") " pod="openshift-marketplace/community-operators-c8c95"
Mar 14 09:41:23 crc kubenswrapper[5129]: I0314 09:41:23.389236 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f0c397a-768e-48f7-85cb-9ad6d194f60f-catalog-content\") pod \"community-operators-c8c95\" (UID: \"1f0c397a-768e-48f7-85cb-9ad6d194f60f\") " pod="openshift-marketplace/community-operators-c8c95"
Mar 14 09:41:23 crc kubenswrapper[5129]: I0314 09:41:23.389906 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f0c397a-768e-48f7-85cb-9ad6d194f60f-catalog-content\") pod \"community-operators-c8c95\" (UID: \"1f0c397a-768e-48f7-85cb-9ad6d194f60f\") " pod="openshift-marketplace/community-operators-c8c95"
Mar 14 09:41:23 crc kubenswrapper[5129]: I0314 09:41:23.389914 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f0c397a-768e-48f7-85cb-9ad6d194f60f-utilities\") pod \"community-operators-c8c95\" (UID: \"1f0c397a-768e-48f7-85cb-9ad6d194f60f\") " pod="openshift-marketplace/community-operators-c8c95"
Mar 14 09:41:23 crc kubenswrapper[5129]: I0314 09:41:23.415816 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn8wv\" (UniqueName: \"kubernetes.io/projected/1f0c397a-768e-48f7-85cb-9ad6d194f60f-kube-api-access-mn8wv\") pod \"community-operators-c8c95\" (UID: \"1f0c397a-768e-48f7-85cb-9ad6d194f60f\") " pod="openshift-marketplace/community-operators-c8c95"
Mar 14 09:41:23 crc kubenswrapper[5129]: I0314 09:41:23.553238 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c8c95"
Mar 14 09:41:24 crc kubenswrapper[5129]: I0314 09:41:24.225841 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c8c95"]
Mar 14 09:41:24 crc kubenswrapper[5129]: W0314 09:41:24.227990 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f0c397a_768e_48f7_85cb_9ad6d194f60f.slice/crio-cb52365f9258a7b455df44f8123c0bf5d1a1d83bcd7460d53f7a985b04b12d1a WatchSource:0}: Error finding container cb52365f9258a7b455df44f8123c0bf5d1a1d83bcd7460d53f7a985b04b12d1a: Status 404 returned error can't find the container with id cb52365f9258a7b455df44f8123c0bf5d1a1d83bcd7460d53f7a985b04b12d1a
Mar 14 09:41:24 crc kubenswrapper[5129]: I0314 09:41:24.638474 5129 generic.go:334] "Generic (PLEG): container finished" podID="1f0c397a-768e-48f7-85cb-9ad6d194f60f" containerID="f8d40d63a9b18700b2827ceb77f8f71218bf4f07ca82e87bf6781c07e505219f" exitCode=0
Mar 14 09:41:24 crc kubenswrapper[5129]: I0314 09:41:24.638536 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8c95" event={"ID":"1f0c397a-768e-48f7-85cb-9ad6d194f60f","Type":"ContainerDied","Data":"f8d40d63a9b18700b2827ceb77f8f71218bf4f07ca82e87bf6781c07e505219f"}
Mar 14 09:41:24 crc kubenswrapper[5129]: I0314 09:41:24.639017 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8c95" event={"ID":"1f0c397a-768e-48f7-85cb-9ad6d194f60f","Type":"ContainerStarted","Data":"cb52365f9258a7b455df44f8123c0bf5d1a1d83bcd7460d53f7a985b04b12d1a"}
Mar 14 09:41:24 crc kubenswrapper[5129]: I0314 09:41:24.641949 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 14 09:41:26 crc kubenswrapper[5129]: I0314 09:41:26.668208 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8c95" event={"ID":"1f0c397a-768e-48f7-85cb-9ad6d194f60f","Type":"ContainerStarted","Data":"d2143f09f48c62c4e044308987b0d936402370eb7f8362d33f7073e49c61d173"}
Mar 14 09:41:27 crc kubenswrapper[5129]: I0314 09:41:27.683799 5129 generic.go:334] "Generic (PLEG): container finished" podID="1f0c397a-768e-48f7-85cb-9ad6d194f60f" containerID="d2143f09f48c62c4e044308987b0d936402370eb7f8362d33f7073e49c61d173" exitCode=0
Mar 14 09:41:27 crc kubenswrapper[5129]: I0314 09:41:27.683945 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8c95" event={"ID":"1f0c397a-768e-48f7-85cb-9ad6d194f60f","Type":"ContainerDied","Data":"d2143f09f48c62c4e044308987b0d936402370eb7f8362d33f7073e49c61d173"}
Mar 14 09:41:28 crc kubenswrapper[5129]: I0314 09:41:28.703239 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8c95" event={"ID":"1f0c397a-768e-48f7-85cb-9ad6d194f60f","Type":"ContainerStarted","Data":"1e5879da3995417fd13a878e5f379989f96fc65494da595647c2f971b60b3237"}
Mar 14 09:41:28 crc kubenswrapper[5129]: I0314 09:41:28.746355 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c8c95" podStartSLOduration=2.272246812 podStartE2EDuration="5.746333561s" podCreationTimestamp="2026-03-14 09:41:23 +0000 UTC" firstStartedPulling="2026-03-14 09:41:24.641668876 +0000 UTC m=+9747.393584060" lastFinishedPulling="2026-03-14 09:41:28.115755625 +0000 UTC m=+9750.867670809" observedRunningTime="2026-03-14 09:41:28.738939921 +0000 UTC m=+9751.490855105" watchObservedRunningTime="2026-03-14 09:41:28.746333561 +0000 UTC m=+9751.498248755"
Mar 14 09:41:33 crc kubenswrapper[5129]: I0314 09:41:33.554368 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c8c95"
Mar 14 09:41:33 crc kubenswrapper[5129]: I0314 09:41:33.554939 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c8c95"
Mar 14 09:41:33 crc kubenswrapper[5129]: I0314 09:41:33.634863 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c8c95"
Mar 14 09:41:33 crc kubenswrapper[5129]: I0314 09:41:33.842145 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c8c95"
Mar 14 09:41:36 crc kubenswrapper[5129]: I0314 09:41:36.878781 5129 scope.go:117] "RemoveContainer" containerID="d16e09e6e79a8e672fb9190a8e69826db43c3f528ef7a0e17671b03f6a86d7e6"
Mar 14 09:41:36 crc kubenswrapper[5129]: I0314 09:41:36.915824 5129 scope.go:117] "RemoveContainer" containerID="689da005c1ec7010d5469c36a92a8e3b7cd318a03911805526f367a775a633c5"
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.214797 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c8c95"]
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.215168 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c8c95" podUID="1f0c397a-768e-48f7-85cb-9ad6d194f60f" containerName="registry-server" containerID="cri-o://1e5879da3995417fd13a878e5f379989f96fc65494da595647c2f971b60b3237" gracePeriod=2
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.788616 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c8c95"
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.859660 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8c95" event={"ID":"1f0c397a-768e-48f7-85cb-9ad6d194f60f","Type":"ContainerDied","Data":"1e5879da3995417fd13a878e5f379989f96fc65494da595647c2f971b60b3237"}
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.859720 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c8c95"
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.859748 5129 scope.go:117] "RemoveContainer" containerID="1e5879da3995417fd13a878e5f379989f96fc65494da595647c2f971b60b3237"
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.859518 5129 generic.go:334] "Generic (PLEG): container finished" podID="1f0c397a-768e-48f7-85cb-9ad6d194f60f" containerID="1e5879da3995417fd13a878e5f379989f96fc65494da595647c2f971b60b3237" exitCode=0
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.859850 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8c95" event={"ID":"1f0c397a-768e-48f7-85cb-9ad6d194f60f","Type":"ContainerDied","Data":"cb52365f9258a7b455df44f8123c0bf5d1a1d83bcd7460d53f7a985b04b12d1a"}
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.890611 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f0c397a-768e-48f7-85cb-9ad6d194f60f-catalog-content\") pod \"1f0c397a-768e-48f7-85cb-9ad6d194f60f\" (UID: \"1f0c397a-768e-48f7-85cb-9ad6d194f60f\") "
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.890802 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn8wv\" (UniqueName: \"kubernetes.io/projected/1f0c397a-768e-48f7-85cb-9ad6d194f60f-kube-api-access-mn8wv\") pod \"1f0c397a-768e-48f7-85cb-9ad6d194f60f\" (UID: \"1f0c397a-768e-48f7-85cb-9ad6d194f60f\") "
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.891291 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f0c397a-768e-48f7-85cb-9ad6d194f60f-utilities\") pod \"1f0c397a-768e-48f7-85cb-9ad6d194f60f\" (UID: \"1f0c397a-768e-48f7-85cb-9ad6d194f60f\") "
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.893844 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f0c397a-768e-48f7-85cb-9ad6d194f60f-utilities" (OuterVolumeSpecName: "utilities") pod "1f0c397a-768e-48f7-85cb-9ad6d194f60f" (UID: "1f0c397a-768e-48f7-85cb-9ad6d194f60f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.897806 5129 scope.go:117] "RemoveContainer" containerID="d2143f09f48c62c4e044308987b0d936402370eb7f8362d33f7073e49c61d173"
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.906438 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f0c397a-768e-48f7-85cb-9ad6d194f60f-kube-api-access-mn8wv" (OuterVolumeSpecName: "kube-api-access-mn8wv") pod "1f0c397a-768e-48f7-85cb-9ad6d194f60f" (UID: "1f0c397a-768e-48f7-85cb-9ad6d194f60f"). InnerVolumeSpecName "kube-api-access-mn8wv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.956729 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f0c397a-768e-48f7-85cb-9ad6d194f60f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f0c397a-768e-48f7-85cb-9ad6d194f60f" (UID: "1f0c397a-768e-48f7-85cb-9ad6d194f60f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.968851 5129 scope.go:117] "RemoveContainer" containerID="f8d40d63a9b18700b2827ceb77f8f71218bf4f07ca82e87bf6781c07e505219f"
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.995125 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f0c397a-768e-48f7-85cb-9ad6d194f60f-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.995184 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f0c397a-768e-48f7-85cb-9ad6d194f60f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 09:41:37 crc kubenswrapper[5129]: I0314 09:41:37.995203 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn8wv\" (UniqueName: \"kubernetes.io/projected/1f0c397a-768e-48f7-85cb-9ad6d194f60f-kube-api-access-mn8wv\") on node \"crc\" DevicePath \"\""
Mar 14 09:41:38 crc kubenswrapper[5129]: I0314 09:41:38.006804 5129 scope.go:117] "RemoveContainer" containerID="1e5879da3995417fd13a878e5f379989f96fc65494da595647c2f971b60b3237"
Mar 14 09:41:38 crc kubenswrapper[5129]: E0314 09:41:38.008997 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e5879da3995417fd13a878e5f379989f96fc65494da595647c2f971b60b3237\": container with ID starting with 1e5879da3995417fd13a878e5f379989f96fc65494da595647c2f971b60b3237 not found: ID does not exist" containerID="1e5879da3995417fd13a878e5f379989f96fc65494da595647c2f971b60b3237"
Mar 14 09:41:38 crc kubenswrapper[5129]: I0314 09:41:38.009070 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e5879da3995417fd13a878e5f379989f96fc65494da595647c2f971b60b3237"} err="failed to get container status \"1e5879da3995417fd13a878e5f379989f96fc65494da595647c2f971b60b3237\": rpc error: code = NotFound desc = could not find container \"1e5879da3995417fd13a878e5f379989f96fc65494da595647c2f971b60b3237\": container with ID starting with 1e5879da3995417fd13a878e5f379989f96fc65494da595647c2f971b60b3237 not found: ID does not exist"
Mar 14 09:41:38 crc kubenswrapper[5129]: I0314 09:41:38.009102 5129 scope.go:117] "RemoveContainer" containerID="d2143f09f48c62c4e044308987b0d936402370eb7f8362d33f7073e49c61d173"
Mar 14 09:41:38 crc kubenswrapper[5129]: E0314 09:41:38.009647 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2143f09f48c62c4e044308987b0d936402370eb7f8362d33f7073e49c61d173\": container with ID starting with d2143f09f48c62c4e044308987b0d936402370eb7f8362d33f7073e49c61d173 not found: ID does not exist" containerID="d2143f09f48c62c4e044308987b0d936402370eb7f8362d33f7073e49c61d173"
Mar 14 09:41:38 crc kubenswrapper[5129]: I0314 09:41:38.009701 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2143f09f48c62c4e044308987b0d936402370eb7f8362d33f7073e49c61d173"} err="failed to get container status \"d2143f09f48c62c4e044308987b0d936402370eb7f8362d33f7073e49c61d173\": rpc error: code = NotFound desc = could not find container \"d2143f09f48c62c4e044308987b0d936402370eb7f8362d33f7073e49c61d173\": container with ID starting with d2143f09f48c62c4e044308987b0d936402370eb7f8362d33f7073e49c61d173 not found: ID does not exist"
Mar 14 09:41:38 crc kubenswrapper[5129]: I0314 09:41:38.009732 5129 scope.go:117] "RemoveContainer" containerID="f8d40d63a9b18700b2827ceb77f8f71218bf4f07ca82e87bf6781c07e505219f"
Mar 14 09:41:38 crc kubenswrapper[5129]: E0314 09:41:38.011280 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8d40d63a9b18700b2827ceb77f8f71218bf4f07ca82e87bf6781c07e505219f\": container with ID starting with f8d40d63a9b18700b2827ceb77f8f71218bf4f07ca82e87bf6781c07e505219f not found: ID does not exist" containerID="f8d40d63a9b18700b2827ceb77f8f71218bf4f07ca82e87bf6781c07e505219f"
Mar 14 09:41:38 crc kubenswrapper[5129]: I0314 09:41:38.011332 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d40d63a9b18700b2827ceb77f8f71218bf4f07ca82e87bf6781c07e505219f"} err="failed to get container status \"f8d40d63a9b18700b2827ceb77f8f71218bf4f07ca82e87bf6781c07e505219f\": rpc error: code = NotFound desc = could not find container \"f8d40d63a9b18700b2827ceb77f8f71218bf4f07ca82e87bf6781c07e505219f\": container with ID starting with f8d40d63a9b18700b2827ceb77f8f71218bf4f07ca82e87bf6781c07e505219f not found: ID does not exist"
Mar 14 09:41:38 crc kubenswrapper[5129]: I0314 09:41:38.191491 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c8c95"]
Mar 14 09:41:38 crc kubenswrapper[5129]: I0314 09:41:38.213224 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c8c95"]
Mar 14 09:41:40 crc kubenswrapper[5129]: I0314 09:41:40.053761 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f0c397a-768e-48f7-85cb-9ad6d194f60f" path="/var/lib/kubelet/pods/1f0c397a-768e-48f7-85cb-9ad6d194f60f/volumes"
Mar 14 09:42:00 crc kubenswrapper[5129]: I0314 09:42:00.165545 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558022-cjhmx"]
Mar 14 09:42:00 crc kubenswrapper[5129]: E0314 09:42:00.166919 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f0c397a-768e-48f7-85cb-9ad6d194f60f" containerName="extract-utilities"
Mar 14 09:42:00 crc kubenswrapper[5129]: I0314 09:42:00.166934 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0c397a-768e-48f7-85cb-9ad6d194f60f" containerName="extract-utilities"
Mar 14 09:42:00 crc kubenswrapper[5129]: E0314 09:42:00.166965 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f0c397a-768e-48f7-85cb-9ad6d194f60f" containerName="registry-server"
Mar 14 09:42:00 crc kubenswrapper[5129]: I0314 09:42:00.166971 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0c397a-768e-48f7-85cb-9ad6d194f60f" containerName="registry-server"
Mar 14 09:42:00 crc kubenswrapper[5129]: E0314 09:42:00.167016 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f0c397a-768e-48f7-85cb-9ad6d194f60f" containerName="extract-content"
Mar 14 09:42:00 crc kubenswrapper[5129]: I0314 09:42:00.167025 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0c397a-768e-48f7-85cb-9ad6d194f60f" containerName="extract-content"
Mar 14 09:42:00 crc kubenswrapper[5129]: I0314 09:42:00.167241 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f0c397a-768e-48f7-85cb-9ad6d194f60f" containerName="registry-server"
Mar 14 09:42:00 crc kubenswrapper[5129]: I0314 09:42:00.168276 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558022-cjhmx"
Mar 14 09:42:00 crc kubenswrapper[5129]: I0314 09:42:00.170888 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf"
Mar 14 09:42:00 crc kubenswrapper[5129]: I0314 09:42:00.171064 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 09:42:00 crc kubenswrapper[5129]: I0314 09:42:00.171170 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 09:42:00 crc kubenswrapper[5129]: I0314 09:42:00.181184 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558022-cjhmx"]
Mar 14 09:42:00 crc kubenswrapper[5129]: I0314 09:42:00.245838 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfkqr\" (UniqueName: \"kubernetes.io/projected/90cdb4f4-8350-4920-b5a3-5097987bf81a-kube-api-access-bfkqr\") pod \"auto-csr-approver-29558022-cjhmx\" (UID: \"90cdb4f4-8350-4920-b5a3-5097987bf81a\") " pod="openshift-infra/auto-csr-approver-29558022-cjhmx"
Mar 14 09:42:00 crc kubenswrapper[5129]: I0314 09:42:00.348745 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfkqr\" (UniqueName: \"kubernetes.io/projected/90cdb4f4-8350-4920-b5a3-5097987bf81a-kube-api-access-bfkqr\") pod \"auto-csr-approver-29558022-cjhmx\" (UID: \"90cdb4f4-8350-4920-b5a3-5097987bf81a\") " pod="openshift-infra/auto-csr-approver-29558022-cjhmx"
Mar 14 09:42:00 crc kubenswrapper[5129]: I0314 09:42:00.374905 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfkqr\" (UniqueName: \"kubernetes.io/projected/90cdb4f4-8350-4920-b5a3-5097987bf81a-kube-api-access-bfkqr\") pod \"auto-csr-approver-29558022-cjhmx\" (UID: \"90cdb4f4-8350-4920-b5a3-5097987bf81a\") " pod="openshift-infra/auto-csr-approver-29558022-cjhmx"
Mar 14 09:42:00 crc kubenswrapper[5129]: I0314 09:42:00.495972 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558022-cjhmx"
Mar 14 09:42:01 crc kubenswrapper[5129]: I0314 09:42:01.014656 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558022-cjhmx"]
Mar 14 09:42:01 crc kubenswrapper[5129]: W0314 09:42:01.024146 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90cdb4f4_8350_4920_b5a3_5097987bf81a.slice/crio-cc80f6aaac52f1e49999e4a4eebeb752588d32bf667c635abfa2722d9c7d7a22 WatchSource:0}: Error finding container cc80f6aaac52f1e49999e4a4eebeb752588d32bf667c635abfa2722d9c7d7a22: Status 404 returned error can't find the container with id cc80f6aaac52f1e49999e4a4eebeb752588d32bf667c635abfa2722d9c7d7a22
Mar 14 09:42:01 crc kubenswrapper[5129]: I0314 09:42:01.198090 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558022-cjhmx" event={"ID":"90cdb4f4-8350-4920-b5a3-5097987bf81a","Type":"ContainerStarted","Data":"cc80f6aaac52f1e49999e4a4eebeb752588d32bf667c635abfa2722d9c7d7a22"}
Mar 14 09:42:02 crc kubenswrapper[5129]: I0314 09:42:02.213415 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558022-cjhmx" event={"ID":"90cdb4f4-8350-4920-b5a3-5097987bf81a","Type":"ContainerStarted","Data":"af09795cd96bddd051e6a54886fbb3c130c9ce0f35174a98ee6144185ac3efd2"}
Mar 14 09:42:02 crc kubenswrapper[5129]: I0314 09:42:02.235633 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558022-cjhmx" podStartSLOduration=1.437733365 podStartE2EDuration="2.235611443s" podCreationTimestamp="2026-03-14 09:42:00 +0000 UTC" firstStartedPulling="2026-03-14 09:42:01.027943983 +0000 UTC m=+9783.779859167" lastFinishedPulling="2026-03-14 09:42:01.825822061 +0000 UTC m=+9784.577737245" observedRunningTime="2026-03-14 09:42:02.228918981 +0000 UTC m=+9784.980834225" watchObservedRunningTime="2026-03-14 09:42:02.235611443 +0000 UTC m=+9784.987526627"
Mar 14 09:42:03 crc kubenswrapper[5129]: I0314 09:42:03.232402 5129 generic.go:334] "Generic (PLEG): container finished" podID="90cdb4f4-8350-4920-b5a3-5097987bf81a" containerID="af09795cd96bddd051e6a54886fbb3c130c9ce0f35174a98ee6144185ac3efd2" exitCode=0
Mar 14 09:42:03 crc kubenswrapper[5129]: I0314 09:42:03.233511 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558022-cjhmx" event={"ID":"90cdb4f4-8350-4920-b5a3-5097987bf81a","Type":"ContainerDied","Data":"af09795cd96bddd051e6a54886fbb3c130c9ce0f35174a98ee6144185ac3efd2"}
Mar 14 09:42:04 crc kubenswrapper[5129]: I0314 09:42:04.691766 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558022-cjhmx"
Mar 14 09:42:04 crc kubenswrapper[5129]: I0314 09:42:04.764573 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfkqr\" (UniqueName: \"kubernetes.io/projected/90cdb4f4-8350-4920-b5a3-5097987bf81a-kube-api-access-bfkqr\") pod \"90cdb4f4-8350-4920-b5a3-5097987bf81a\" (UID: \"90cdb4f4-8350-4920-b5a3-5097987bf81a\") " 
Mar 14 09:42:04 crc kubenswrapper[5129]: I0314 09:42:04.780807 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90cdb4f4-8350-4920-b5a3-5097987bf81a-kube-api-access-bfkqr" (OuterVolumeSpecName: "kube-api-access-bfkqr") pod "90cdb4f4-8350-4920-b5a3-5097987bf81a" (UID: "90cdb4f4-8350-4920-b5a3-5097987bf81a"). InnerVolumeSpecName "kube-api-access-bfkqr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:42:04 crc kubenswrapper[5129]: I0314 09:42:04.867747 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfkqr\" (UniqueName: \"kubernetes.io/projected/90cdb4f4-8350-4920-b5a3-5097987bf81a-kube-api-access-bfkqr\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:05 crc kubenswrapper[5129]: I0314 09:42:05.268581 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558022-cjhmx" event={"ID":"90cdb4f4-8350-4920-b5a3-5097987bf81a","Type":"ContainerDied","Data":"cc80f6aaac52f1e49999e4a4eebeb752588d32bf667c635abfa2722d9c7d7a22"} Mar 14 09:42:05 crc kubenswrapper[5129]: I0314 09:42:05.268945 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc80f6aaac52f1e49999e4a4eebeb752588d32bf667c635abfa2722d9c7d7a22" Mar 14 09:42:05 crc kubenswrapper[5129]: I0314 09:42:05.268671 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558022-cjhmx" Mar 14 09:42:05 crc kubenswrapper[5129]: I0314 09:42:05.344117 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558016-422qk"] Mar 14 09:42:05 crc kubenswrapper[5129]: I0314 09:42:05.356229 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558016-422qk"] Mar 14 09:42:06 crc kubenswrapper[5129]: I0314 09:42:06.053396 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98" path="/var/lib/kubelet/pods/1e66e8d9-d262-4c9f-8ef3-7eb33ea19a98/volumes" Mar 14 09:42:07 crc kubenswrapper[5129]: I0314 09:42:07.299233 5129 generic.go:334] "Generic (PLEG): container finished" podID="ed1b368f-4cc5-4887-90ba-59b7ea16c6e9" containerID="75683916f9a502c48651ee21f0f0a752a86f9538dcc193cd658d083d30afd030" exitCode=0 Mar 14 09:42:07 crc kubenswrapper[5129]: I0314 09:42:07.299362 5129 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" event={"ID":"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9","Type":"ContainerDied","Data":"75683916f9a502c48651ee21f0f0a752a86f9538dcc193cd658d083d30afd030"} Mar 14 09:42:08 crc kubenswrapper[5129]: I0314 09:42:08.863573 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:42:08 crc kubenswrapper[5129]: I0314 09:42:08.993782 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-inventory\") pod \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " Mar 14 09:42:08 crc kubenswrapper[5129]: I0314 09:42:08.994747 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-migration-ssh-key-0\") pod \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " Mar 14 09:42:08 crc kubenswrapper[5129]: I0314 09:42:08.994787 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-ssh-key-openstack-cell1\") pod \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " Mar 14 09:42:08 crc kubenswrapper[5129]: I0314 09:42:08.995172 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-2\") pod \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " Mar 14 09:42:08 crc kubenswrapper[5129]: I0314 09:42:08.995781 5129 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-3\") pod \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " Mar 14 09:42:08 crc kubenswrapper[5129]: I0314 09:42:08.995854 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-migration-ssh-key-1\") pod \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " Mar 14 09:42:08 crc kubenswrapper[5129]: I0314 09:42:08.995914 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cells-global-config-0\") pod \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " Mar 14 09:42:08 crc kubenswrapper[5129]: I0314 09:42:08.996031 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x4kv\" (UniqueName: \"kubernetes.io/projected/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-kube-api-access-4x4kv\") pod \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " Mar 14 09:42:08 crc kubenswrapper[5129]: I0314 09:42:08.996091 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-1\") pod \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " Mar 14 09:42:08 crc kubenswrapper[5129]: I0314 09:42:08.996126 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-combined-ca-bundle\") pod \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " Mar 14 09:42:08 crc kubenswrapper[5129]: I0314 09:42:08.996200 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-0\") pod \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\" (UID: \"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9\") " Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.005101 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-kube-api-access-4x4kv" (OuterVolumeSpecName: "kube-api-access-4x4kv") pod "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9" (UID: "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9"). InnerVolumeSpecName "kube-api-access-4x4kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.005414 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9" (UID: "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.029534 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9" (UID: "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.034707 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9" (UID: "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.039006 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9" (UID: "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.050245 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-inventory" (OuterVolumeSpecName: "inventory") pod "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9" (UID: "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.053084 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9" (UID: "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.055354 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9" (UID: "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.055975 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9" (UID: "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.057836 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9" (UID: "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.069683 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9" (UID: "ed1b368f-4cc5-4887-90ba-59b7ea16c6e9"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.100942 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x4kv\" (UniqueName: \"kubernetes.io/projected/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-kube-api-access-4x4kv\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.100998 5129 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.101021 5129 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.101042 5129 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.101065 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.101086 5129 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.101103 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.101120 5129 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.101139 5129 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.101157 5129 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.101175 5129 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/ed1b368f-4cc5-4887-90ba-59b7ea16c6e9-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.336070 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" event={"ID":"ed1b368f-4cc5-4887-90ba-59b7ea16c6e9","Type":"ContainerDied","Data":"40fb9927776c45f9f43a851569b480a9fbdbb8ce6aaae0833238ac5255eac2c3"} Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.336133 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-86mwr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.336164 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40fb9927776c45f9f43a851569b480a9fbdbb8ce6aaae0833238ac5255eac2c3" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.589504 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-9zmjr"] Mar 14 09:42:09 crc kubenswrapper[5129]: E0314 09:42:09.590102 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed1b368f-4cc5-4887-90ba-59b7ea16c6e9" containerName="nova-cell1-openstack-openstack-cell1" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.590125 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed1b368f-4cc5-4887-90ba-59b7ea16c6e9" containerName="nova-cell1-openstack-openstack-cell1" Mar 14 09:42:09 crc kubenswrapper[5129]: E0314 09:42:09.590140 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90cdb4f4-8350-4920-b5a3-5097987bf81a" containerName="oc" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.590146 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="90cdb4f4-8350-4920-b5a3-5097987bf81a" containerName="oc" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.590359 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="90cdb4f4-8350-4920-b5a3-5097987bf81a" containerName="oc" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.590381 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed1b368f-4cc5-4887-90ba-59b7ea16c6e9" containerName="nova-cell1-openstack-openstack-cell1" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.591165 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.594247 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.597423 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.597717 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.597871 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.597992 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.607026 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-9zmjr"] Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.664280 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2h8n\" (UniqueName: \"kubernetes.io/projected/5e83070c-b8b4-468d-bf05-414509537764-kube-api-access-b2h8n\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.664327 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-inventory\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" 
Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.664352 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.664436 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.664470 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.664497 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.664528 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.767465 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.767639 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.767714 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.767792 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: 
\"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.768069 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2h8n\" (UniqueName: \"kubernetes.io/projected/5e83070c-b8b4-468d-bf05-414509537764-kube-api-access-b2h8n\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.768153 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-inventory\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.768205 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.774193 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.774757 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.776523 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.776751 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-inventory\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.777517 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.785521 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc 
kubenswrapper[5129]: I0314 09:42:09.795687 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2h8n\" (UniqueName: \"kubernetes.io/projected/5e83070c-b8b4-468d-bf05-414509537764-kube-api-access-b2h8n\") pod \"telemetry-openstack-openstack-cell1-9zmjr\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:09 crc kubenswrapper[5129]: I0314 09:42:09.984312 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:42:10 crc kubenswrapper[5129]: I0314 09:42:10.433746 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-9zmjr"] Mar 14 09:42:11 crc kubenswrapper[5129]: I0314 09:42:11.376390 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" event={"ID":"5e83070c-b8b4-468d-bf05-414509537764","Type":"ContainerStarted","Data":"d3ed89ac6338ebf8f3f2df40f7555a885fe36cd3bf93233cbf6d80f0c5a24f60"} Mar 14 09:42:11 crc kubenswrapper[5129]: I0314 09:42:11.376951 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" event={"ID":"5e83070c-b8b4-468d-bf05-414509537764","Type":"ContainerStarted","Data":"593468c32e02946874469ed1a8b9c37601ccf6be3b4f4daa1f83f68ee4c09365"} Mar 14 09:42:11 crc kubenswrapper[5129]: I0314 09:42:11.399980 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" podStartSLOduration=1.93776385 podStartE2EDuration="2.399957765s" podCreationTimestamp="2026-03-14 09:42:09 +0000 UTC" firstStartedPulling="2026-03-14 09:42:10.434393596 +0000 UTC m=+9793.186308790" lastFinishedPulling="2026-03-14 09:42:10.896587511 +0000 UTC m=+9793.648502705" observedRunningTime="2026-03-14 09:42:11.393234643 +0000 UTC m=+9794.145149827" 
watchObservedRunningTime="2026-03-14 09:42:11.399957765 +0000 UTC m=+9794.151872949" Mar 14 09:42:37 crc kubenswrapper[5129]: I0314 09:42:37.018530 5129 scope.go:117] "RemoveContainer" containerID="6f2e8c74484593a422a7bf7bf072cb9caf16a76f10d0e03cc285c2b686c7cb3c" Mar 14 09:42:37 crc kubenswrapper[5129]: I0314 09:42:37.058225 5129 scope.go:117] "RemoveContainer" containerID="6ece8fe87ecff869cd2fe17822616508f2ab609ade8ff9d985ef99da7fc343c4" Mar 14 09:42:49 crc kubenswrapper[5129]: I0314 09:42:49.574706 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:42:49 crc kubenswrapper[5129]: I0314 09:42:49.575737 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:43:19 crc kubenswrapper[5129]: I0314 09:43:19.574384 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:43:19 crc kubenswrapper[5129]: I0314 09:43:19.576591 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:43:49 crc kubenswrapper[5129]: I0314 09:43:49.574523 
5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:43:49 crc kubenswrapper[5129]: I0314 09:43:49.575395 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:43:49 crc kubenswrapper[5129]: I0314 09:43:49.575473 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 09:43:49 crc kubenswrapper[5129]: I0314 09:43:49.577284 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:43:49 crc kubenswrapper[5129]: I0314 09:43:49.577422 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" gracePeriod=600 Mar 14 09:43:49 crc kubenswrapper[5129]: E0314 09:43:49.710458 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:43:49 crc kubenswrapper[5129]: I0314 09:43:49.872994 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" exitCode=0 Mar 14 09:43:49 crc kubenswrapper[5129]: I0314 09:43:49.873065 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6"} Mar 14 09:43:49 crc kubenswrapper[5129]: I0314 09:43:49.873131 5129 scope.go:117] "RemoveContainer" containerID="6c6c70ed35abb8104cdb80e74bbb70d10c64039fda720bf97b8f982caf34cf4c" Mar 14 09:43:49 crc kubenswrapper[5129]: I0314 09:43:49.874143 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:43:49 crc kubenswrapper[5129]: E0314 09:43:49.874428 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:44:00 crc kubenswrapper[5129]: I0314 09:44:00.179988 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558024-vmbd2"] Mar 14 09:44:00 crc kubenswrapper[5129]: I0314 09:44:00.187450 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558024-vmbd2" Mar 14 09:44:00 crc kubenswrapper[5129]: I0314 09:44:00.195630 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:44:00 crc kubenswrapper[5129]: I0314 09:44:00.196026 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:44:00 crc kubenswrapper[5129]: I0314 09:44:00.196184 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:44:00 crc kubenswrapper[5129]: I0314 09:44:00.224670 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558024-vmbd2"] Mar 14 09:44:00 crc kubenswrapper[5129]: I0314 09:44:00.287182 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhtdc\" (UniqueName: \"kubernetes.io/projected/89441e20-499b-4f4f-b396-fc01294f25bb-kube-api-access-nhtdc\") pod \"auto-csr-approver-29558024-vmbd2\" (UID: \"89441e20-499b-4f4f-b396-fc01294f25bb\") " pod="openshift-infra/auto-csr-approver-29558024-vmbd2" Mar 14 09:44:00 crc kubenswrapper[5129]: I0314 09:44:00.392256 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhtdc\" (UniqueName: \"kubernetes.io/projected/89441e20-499b-4f4f-b396-fc01294f25bb-kube-api-access-nhtdc\") pod \"auto-csr-approver-29558024-vmbd2\" (UID: \"89441e20-499b-4f4f-b396-fc01294f25bb\") " pod="openshift-infra/auto-csr-approver-29558024-vmbd2" Mar 14 09:44:00 crc kubenswrapper[5129]: I0314 09:44:00.414538 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhtdc\" (UniqueName: \"kubernetes.io/projected/89441e20-499b-4f4f-b396-fc01294f25bb-kube-api-access-nhtdc\") pod \"auto-csr-approver-29558024-vmbd2\" (UID: \"89441e20-499b-4f4f-b396-fc01294f25bb\") " 
pod="openshift-infra/auto-csr-approver-29558024-vmbd2" Mar 14 09:44:00 crc kubenswrapper[5129]: I0314 09:44:00.515333 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558024-vmbd2" Mar 14 09:44:01 crc kubenswrapper[5129]: I0314 09:44:01.041810 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558024-vmbd2"] Mar 14 09:44:02 crc kubenswrapper[5129]: I0314 09:44:02.038382 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:44:02 crc kubenswrapper[5129]: E0314 09:44:02.039576 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:44:02 crc kubenswrapper[5129]: I0314 09:44:02.070113 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558024-vmbd2" event={"ID":"89441e20-499b-4f4f-b396-fc01294f25bb","Type":"ContainerStarted","Data":"f6ba21689ae07a87e16b5b02e28644f0d57c756a7ed89afbf4edfe6f5cf83caa"} Mar 14 09:44:04 crc kubenswrapper[5129]: I0314 09:44:04.102551 5129 generic.go:334] "Generic (PLEG): container finished" podID="89441e20-499b-4f4f-b396-fc01294f25bb" containerID="f614b3538de930975380c263fa0bccecbf741da9aba92c25e74789d20933f386" exitCode=0 Mar 14 09:44:04 crc kubenswrapper[5129]: I0314 09:44:04.102752 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558024-vmbd2" event={"ID":"89441e20-499b-4f4f-b396-fc01294f25bb","Type":"ContainerDied","Data":"f614b3538de930975380c263fa0bccecbf741da9aba92c25e74789d20933f386"} 
Mar 14 09:44:05 crc kubenswrapper[5129]: I0314 09:44:05.530621 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558024-vmbd2" Mar 14 09:44:05 crc kubenswrapper[5129]: I0314 09:44:05.649455 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhtdc\" (UniqueName: \"kubernetes.io/projected/89441e20-499b-4f4f-b396-fc01294f25bb-kube-api-access-nhtdc\") pod \"89441e20-499b-4f4f-b396-fc01294f25bb\" (UID: \"89441e20-499b-4f4f-b396-fc01294f25bb\") " Mar 14 09:44:05 crc kubenswrapper[5129]: I0314 09:44:05.687101 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89441e20-499b-4f4f-b396-fc01294f25bb-kube-api-access-nhtdc" (OuterVolumeSpecName: "kube-api-access-nhtdc") pod "89441e20-499b-4f4f-b396-fc01294f25bb" (UID: "89441e20-499b-4f4f-b396-fc01294f25bb"). InnerVolumeSpecName "kube-api-access-nhtdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:44:05 crc kubenswrapper[5129]: I0314 09:44:05.755933 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhtdc\" (UniqueName: \"kubernetes.io/projected/89441e20-499b-4f4f-b396-fc01294f25bb-kube-api-access-nhtdc\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:06 crc kubenswrapper[5129]: I0314 09:44:06.131147 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558024-vmbd2" event={"ID":"89441e20-499b-4f4f-b396-fc01294f25bb","Type":"ContainerDied","Data":"f6ba21689ae07a87e16b5b02e28644f0d57c756a7ed89afbf4edfe6f5cf83caa"} Mar 14 09:44:06 crc kubenswrapper[5129]: I0314 09:44:06.131206 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6ba21689ae07a87e16b5b02e28644f0d57c756a7ed89afbf4edfe6f5cf83caa" Mar 14 09:44:06 crc kubenswrapper[5129]: I0314 09:44:06.131279 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558024-vmbd2" Mar 14 09:44:06 crc kubenswrapper[5129]: I0314 09:44:06.645925 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558018-gdb87"] Mar 14 09:44:06 crc kubenswrapper[5129]: I0314 09:44:06.655294 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558018-gdb87"] Mar 14 09:44:08 crc kubenswrapper[5129]: I0314 09:44:08.057368 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7" path="/var/lib/kubelet/pods/aeacf7cd-0a54-43fa-8f6d-a6ffdd2590a7/volumes" Mar 14 09:44:13 crc kubenswrapper[5129]: I0314 09:44:13.036598 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:44:13 crc kubenswrapper[5129]: E0314 09:44:13.038004 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:44:28 crc kubenswrapper[5129]: I0314 09:44:28.050152 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:44:28 crc kubenswrapper[5129]: E0314 09:44:28.051540 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:44:37 crc kubenswrapper[5129]: I0314 09:44:37.248762 5129 scope.go:117] "RemoveContainer" containerID="7221b561d1673c8b9037fede56f840f460a32c47e3362518d093389c1a3b2d1e" Mar 14 09:44:43 crc kubenswrapper[5129]: I0314 09:44:43.037768 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:44:43 crc kubenswrapper[5129]: E0314 09:44:43.039362 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:44:50 crc kubenswrapper[5129]: I0314 09:44:50.417323 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="8731d474-6329-4a56-be08-fe3d12bb33cd" containerName="galera" probeResult="failure" output="command timed out" Mar 14 09:44:50 crc kubenswrapper[5129]: I0314 09:44:50.418122 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="8731d474-6329-4a56-be08-fe3d12bb33cd" containerName="galera" probeResult="failure" output="command timed out" Mar 14 09:44:55 crc kubenswrapper[5129]: I0314 09:44:55.037496 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:44:55 crc kubenswrapper[5129]: E0314 09:44:55.039491 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:45:00 crc kubenswrapper[5129]: I0314 09:45:00.165842 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp"] Mar 14 09:45:00 crc kubenswrapper[5129]: E0314 09:45:00.167490 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89441e20-499b-4f4f-b396-fc01294f25bb" containerName="oc" Mar 14 09:45:00 crc kubenswrapper[5129]: I0314 09:45:00.167520 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="89441e20-499b-4f4f-b396-fc01294f25bb" containerName="oc" Mar 14 09:45:00 crc kubenswrapper[5129]: I0314 09:45:00.167934 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="89441e20-499b-4f4f-b396-fc01294f25bb" containerName="oc" Mar 14 09:45:00 crc kubenswrapper[5129]: I0314 09:45:00.169141 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp" Mar 14 09:45:00 crc kubenswrapper[5129]: I0314 09:45:00.172498 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 09:45:00 crc kubenswrapper[5129]: I0314 09:45:00.179403 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp"] Mar 14 09:45:00 crc kubenswrapper[5129]: I0314 09:45:00.179529 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 09:45:00 crc kubenswrapper[5129]: I0314 09:45:00.355831 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pr5s\" (UniqueName: \"kubernetes.io/projected/0ed61f96-26b6-4993-a226-76f32ee3e8fe-kube-api-access-5pr5s\") pod \"collect-profiles-29558025-fjsgp\" (UID: \"0ed61f96-26b6-4993-a226-76f32ee3e8fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp" Mar 14 09:45:00 crc kubenswrapper[5129]: I0314 09:45:00.355925 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ed61f96-26b6-4993-a226-76f32ee3e8fe-secret-volume\") pod \"collect-profiles-29558025-fjsgp\" (UID: \"0ed61f96-26b6-4993-a226-76f32ee3e8fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp" Mar 14 09:45:00 crc kubenswrapper[5129]: I0314 09:45:00.355976 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ed61f96-26b6-4993-a226-76f32ee3e8fe-config-volume\") pod \"collect-profiles-29558025-fjsgp\" (UID: \"0ed61f96-26b6-4993-a226-76f32ee3e8fe\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp" Mar 14 09:45:00 crc kubenswrapper[5129]: I0314 09:45:00.458418 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ed61f96-26b6-4993-a226-76f32ee3e8fe-config-volume\") pod \"collect-profiles-29558025-fjsgp\" (UID: \"0ed61f96-26b6-4993-a226-76f32ee3e8fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp" Mar 14 09:45:00 crc kubenswrapper[5129]: I0314 09:45:00.458630 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pr5s\" (UniqueName: \"kubernetes.io/projected/0ed61f96-26b6-4993-a226-76f32ee3e8fe-kube-api-access-5pr5s\") pod \"collect-profiles-29558025-fjsgp\" (UID: \"0ed61f96-26b6-4993-a226-76f32ee3e8fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp" Mar 14 09:45:00 crc kubenswrapper[5129]: I0314 09:45:00.458689 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ed61f96-26b6-4993-a226-76f32ee3e8fe-secret-volume\") pod \"collect-profiles-29558025-fjsgp\" (UID: \"0ed61f96-26b6-4993-a226-76f32ee3e8fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp" Mar 14 09:45:00 crc kubenswrapper[5129]: I0314 09:45:00.459518 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ed61f96-26b6-4993-a226-76f32ee3e8fe-config-volume\") pod \"collect-profiles-29558025-fjsgp\" (UID: \"0ed61f96-26b6-4993-a226-76f32ee3e8fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp" Mar 14 09:45:00 crc kubenswrapper[5129]: I0314 09:45:00.468199 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0ed61f96-26b6-4993-a226-76f32ee3e8fe-secret-volume\") pod \"collect-profiles-29558025-fjsgp\" (UID: \"0ed61f96-26b6-4993-a226-76f32ee3e8fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp" Mar 14 09:45:00 crc kubenswrapper[5129]: I0314 09:45:00.482139 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pr5s\" (UniqueName: \"kubernetes.io/projected/0ed61f96-26b6-4993-a226-76f32ee3e8fe-kube-api-access-5pr5s\") pod \"collect-profiles-29558025-fjsgp\" (UID: \"0ed61f96-26b6-4993-a226-76f32ee3e8fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp" Mar 14 09:45:00 crc kubenswrapper[5129]: I0314 09:45:00.497030 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp" Mar 14 09:45:01 crc kubenswrapper[5129]: I0314 09:45:01.019726 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp"] Mar 14 09:45:01 crc kubenswrapper[5129]: I0314 09:45:01.943788 5129 generic.go:334] "Generic (PLEG): container finished" podID="0ed61f96-26b6-4993-a226-76f32ee3e8fe" containerID="2fa27fa7e32cf9d821cb253ec4e112efb5f921164ab10d2042408a7b48721cbd" exitCode=0 Mar 14 09:45:01 crc kubenswrapper[5129]: I0314 09:45:01.943905 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp" event={"ID":"0ed61f96-26b6-4993-a226-76f32ee3e8fe","Type":"ContainerDied","Data":"2fa27fa7e32cf9d821cb253ec4e112efb5f921164ab10d2042408a7b48721cbd"} Mar 14 09:45:01 crc kubenswrapper[5129]: I0314 09:45:01.944356 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp" 
event={"ID":"0ed61f96-26b6-4993-a226-76f32ee3e8fe","Type":"ContainerStarted","Data":"7c1e9de590507f483a84f247a9a65c19e630f245b89a8554ed5153fa7a92428a"} Mar 14 09:45:03 crc kubenswrapper[5129]: I0314 09:45:03.297126 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp" Mar 14 09:45:03 crc kubenswrapper[5129]: I0314 09:45:03.437300 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pr5s\" (UniqueName: \"kubernetes.io/projected/0ed61f96-26b6-4993-a226-76f32ee3e8fe-kube-api-access-5pr5s\") pod \"0ed61f96-26b6-4993-a226-76f32ee3e8fe\" (UID: \"0ed61f96-26b6-4993-a226-76f32ee3e8fe\") " Mar 14 09:45:03 crc kubenswrapper[5129]: I0314 09:45:03.437549 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ed61f96-26b6-4993-a226-76f32ee3e8fe-config-volume\") pod \"0ed61f96-26b6-4993-a226-76f32ee3e8fe\" (UID: \"0ed61f96-26b6-4993-a226-76f32ee3e8fe\") " Mar 14 09:45:03 crc kubenswrapper[5129]: I0314 09:45:03.437625 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ed61f96-26b6-4993-a226-76f32ee3e8fe-secret-volume\") pod \"0ed61f96-26b6-4993-a226-76f32ee3e8fe\" (UID: \"0ed61f96-26b6-4993-a226-76f32ee3e8fe\") " Mar 14 09:45:03 crc kubenswrapper[5129]: I0314 09:45:03.438926 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ed61f96-26b6-4993-a226-76f32ee3e8fe-config-volume" (OuterVolumeSpecName: "config-volume") pod "0ed61f96-26b6-4993-a226-76f32ee3e8fe" (UID: "0ed61f96-26b6-4993-a226-76f32ee3e8fe"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:45:03 crc kubenswrapper[5129]: I0314 09:45:03.447289 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ed61f96-26b6-4993-a226-76f32ee3e8fe-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0ed61f96-26b6-4993-a226-76f32ee3e8fe" (UID: "0ed61f96-26b6-4993-a226-76f32ee3e8fe"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:45:03 crc kubenswrapper[5129]: I0314 09:45:03.447872 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ed61f96-26b6-4993-a226-76f32ee3e8fe-kube-api-access-5pr5s" (OuterVolumeSpecName: "kube-api-access-5pr5s") pod "0ed61f96-26b6-4993-a226-76f32ee3e8fe" (UID: "0ed61f96-26b6-4993-a226-76f32ee3e8fe"). InnerVolumeSpecName "kube-api-access-5pr5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:45:03 crc kubenswrapper[5129]: I0314 09:45:03.542253 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pr5s\" (UniqueName: \"kubernetes.io/projected/0ed61f96-26b6-4993-a226-76f32ee3e8fe-kube-api-access-5pr5s\") on node \"crc\" DevicePath \"\"" Mar 14 09:45:03 crc kubenswrapper[5129]: I0314 09:45:03.542921 5129 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ed61f96-26b6-4993-a226-76f32ee3e8fe-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:45:03 crc kubenswrapper[5129]: I0314 09:45:03.542938 5129 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ed61f96-26b6-4993-a226-76f32ee3e8fe-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:45:03 crc kubenswrapper[5129]: I0314 09:45:03.970751 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp" 
event={"ID":"0ed61f96-26b6-4993-a226-76f32ee3e8fe","Type":"ContainerDied","Data":"7c1e9de590507f483a84f247a9a65c19e630f245b89a8554ed5153fa7a92428a"} Mar 14 09:45:03 crc kubenswrapper[5129]: I0314 09:45:03.971121 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c1e9de590507f483a84f247a9a65c19e630f245b89a8554ed5153fa7a92428a" Mar 14 09:45:03 crc kubenswrapper[5129]: I0314 09:45:03.970815 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp" Mar 14 09:45:04 crc kubenswrapper[5129]: E0314 09:45:04.090096 5129 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ed61f96_26b6_4993_a226_76f32ee3e8fe.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ed61f96_26b6_4993_a226_76f32ee3e8fe.slice/crio-7c1e9de590507f483a84f247a9a65c19e630f245b89a8554ed5153fa7a92428a\": RecentStats: unable to find data in memory cache]" Mar 14 09:45:04 crc kubenswrapper[5129]: I0314 09:45:04.426316 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv"] Mar 14 09:45:04 crc kubenswrapper[5129]: I0314 09:45:04.438823 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557980-w4tqv"] Mar 14 09:45:06 crc kubenswrapper[5129]: I0314 09:45:06.053675 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fee75f5-d499-42e6-92a2-794e1b325cd4" path="/var/lib/kubelet/pods/6fee75f5-d499-42e6-92a2-794e1b325cd4/volumes" Mar 14 09:45:10 crc kubenswrapper[5129]: I0314 09:45:10.037238 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:45:10 crc 
kubenswrapper[5129]: E0314 09:45:10.038226 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:45:23 crc kubenswrapper[5129]: I0314 09:45:23.037086 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:45:23 crc kubenswrapper[5129]: E0314 09:45:23.038675 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:45:30 crc kubenswrapper[5129]: I0314 09:45:30.416751 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="8731d474-6329-4a56-be08-fe3d12bb33cd" containerName="galera" probeResult="failure" output="command timed out" Mar 14 09:45:37 crc kubenswrapper[5129]: I0314 09:45:37.040089 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:45:37 crc kubenswrapper[5129]: E0314 09:45:37.042231 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:45:37 crc kubenswrapper[5129]: I0314 09:45:37.352899 5129 scope.go:117] "RemoveContainer" containerID="d8362e92a7ee0f3cab0ef02001ba9898a72aeb469bcf6d5766fa8ddb3d5486af" Mar 14 09:45:51 crc kubenswrapper[5129]: I0314 09:45:51.902705 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hkh9b"] Mar 14 09:45:51 crc kubenswrapper[5129]: E0314 09:45:51.904332 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed61f96-26b6-4993-a226-76f32ee3e8fe" containerName="collect-profiles" Mar 14 09:45:51 crc kubenswrapper[5129]: I0314 09:45:51.904359 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed61f96-26b6-4993-a226-76f32ee3e8fe" containerName="collect-profiles" Mar 14 09:45:51 crc kubenswrapper[5129]: I0314 09:45:51.904654 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ed61f96-26b6-4993-a226-76f32ee3e8fe" containerName="collect-profiles" Mar 14 09:45:51 crc kubenswrapper[5129]: I0314 09:45:51.906870 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hkh9b" Mar 14 09:45:51 crc kubenswrapper[5129]: I0314 09:45:51.928287 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hkh9b"] Mar 14 09:45:52 crc kubenswrapper[5129]: I0314 09:45:52.037274 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:45:52 crc kubenswrapper[5129]: E0314 09:45:52.038109 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:45:52 crc kubenswrapper[5129]: I0314 09:45:52.074519 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5p4j\" (UniqueName: \"kubernetes.io/projected/e4369b6a-4f02-4e3f-a478-613b5973edb1-kube-api-access-g5p4j\") pod \"certified-operators-hkh9b\" (UID: \"e4369b6a-4f02-4e3f-a478-613b5973edb1\") " pod="openshift-marketplace/certified-operators-hkh9b" Mar 14 09:45:52 crc kubenswrapper[5129]: I0314 09:45:52.074761 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4369b6a-4f02-4e3f-a478-613b5973edb1-catalog-content\") pod \"certified-operators-hkh9b\" (UID: \"e4369b6a-4f02-4e3f-a478-613b5973edb1\") " pod="openshift-marketplace/certified-operators-hkh9b" Mar 14 09:45:52 crc kubenswrapper[5129]: I0314 09:45:52.074949 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e4369b6a-4f02-4e3f-a478-613b5973edb1-utilities\") pod \"certified-operators-hkh9b\" (UID: \"e4369b6a-4f02-4e3f-a478-613b5973edb1\") " pod="openshift-marketplace/certified-operators-hkh9b" Mar 14 09:45:52 crc kubenswrapper[5129]: I0314 09:45:52.177144 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5p4j\" (UniqueName: \"kubernetes.io/projected/e4369b6a-4f02-4e3f-a478-613b5973edb1-kube-api-access-g5p4j\") pod \"certified-operators-hkh9b\" (UID: \"e4369b6a-4f02-4e3f-a478-613b5973edb1\") " pod="openshift-marketplace/certified-operators-hkh9b" Mar 14 09:45:52 crc kubenswrapper[5129]: I0314 09:45:52.177257 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4369b6a-4f02-4e3f-a478-613b5973edb1-catalog-content\") pod \"certified-operators-hkh9b\" (UID: \"e4369b6a-4f02-4e3f-a478-613b5973edb1\") " pod="openshift-marketplace/certified-operators-hkh9b" Mar 14 09:45:52 crc kubenswrapper[5129]: I0314 09:45:52.178332 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4369b6a-4f02-4e3f-a478-613b5973edb1-utilities\") pod \"certified-operators-hkh9b\" (UID: \"e4369b6a-4f02-4e3f-a478-613b5973edb1\") " pod="openshift-marketplace/certified-operators-hkh9b" Mar 14 09:45:52 crc kubenswrapper[5129]: I0314 09:45:52.178459 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4369b6a-4f02-4e3f-a478-613b5973edb1-catalog-content\") pod \"certified-operators-hkh9b\" (UID: \"e4369b6a-4f02-4e3f-a478-613b5973edb1\") " pod="openshift-marketplace/certified-operators-hkh9b" Mar 14 09:45:52 crc kubenswrapper[5129]: I0314 09:45:52.178838 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e4369b6a-4f02-4e3f-a478-613b5973edb1-utilities\") pod \"certified-operators-hkh9b\" (UID: \"e4369b6a-4f02-4e3f-a478-613b5973edb1\") " pod="openshift-marketplace/certified-operators-hkh9b" Mar 14 09:45:52 crc kubenswrapper[5129]: I0314 09:45:52.228697 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5p4j\" (UniqueName: \"kubernetes.io/projected/e4369b6a-4f02-4e3f-a478-613b5973edb1-kube-api-access-g5p4j\") pod \"certified-operators-hkh9b\" (UID: \"e4369b6a-4f02-4e3f-a478-613b5973edb1\") " pod="openshift-marketplace/certified-operators-hkh9b" Mar 14 09:45:52 crc kubenswrapper[5129]: I0314 09:45:52.274547 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hkh9b" Mar 14 09:45:52 crc kubenswrapper[5129]: I0314 09:45:52.824166 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hkh9b"] Mar 14 09:45:53 crc kubenswrapper[5129]: I0314 09:45:53.640193 5129 generic.go:334] "Generic (PLEG): container finished" podID="e4369b6a-4f02-4e3f-a478-613b5973edb1" containerID="d003cfba9fb9a170785583361692cd3545833bc198fb63639bafcdbdf0ef986c" exitCode=0 Mar 14 09:45:53 crc kubenswrapper[5129]: I0314 09:45:53.640319 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkh9b" event={"ID":"e4369b6a-4f02-4e3f-a478-613b5973edb1","Type":"ContainerDied","Data":"d003cfba9fb9a170785583361692cd3545833bc198fb63639bafcdbdf0ef986c"} Mar 14 09:45:53 crc kubenswrapper[5129]: I0314 09:45:53.640700 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkh9b" event={"ID":"e4369b6a-4f02-4e3f-a478-613b5973edb1","Type":"ContainerStarted","Data":"87896c7de6da9e5b7adce744ef8f1985320fa5cc244d041d68bdf07597e9bc97"} Mar 14 09:45:55 crc kubenswrapper[5129]: E0314 09:45:55.636364 5129 cadvisor_stats_provider.go:516] "Partial failure 
issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4369b6a_4f02_4e3f_a478_613b5973edb1.slice/crio-af98746363dc4d50405dc3dddf69160d7a1b14759376d334a8cc39a70c938e77.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4369b6a_4f02_4e3f_a478_613b5973edb1.slice/crio-conmon-af98746363dc4d50405dc3dddf69160d7a1b14759376d334a8cc39a70c938e77.scope\": RecentStats: unable to find data in memory cache]" Mar 14 09:45:55 crc kubenswrapper[5129]: I0314 09:45:55.669993 5129 generic.go:334] "Generic (PLEG): container finished" podID="e4369b6a-4f02-4e3f-a478-613b5973edb1" containerID="af98746363dc4d50405dc3dddf69160d7a1b14759376d334a8cc39a70c938e77" exitCode=0 Mar 14 09:45:55 crc kubenswrapper[5129]: I0314 09:45:55.670079 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkh9b" event={"ID":"e4369b6a-4f02-4e3f-a478-613b5973edb1","Type":"ContainerDied","Data":"af98746363dc4d50405dc3dddf69160d7a1b14759376d334a8cc39a70c938e77"} Mar 14 09:45:56 crc kubenswrapper[5129]: I0314 09:45:56.686421 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkh9b" event={"ID":"e4369b6a-4f02-4e3f-a478-613b5973edb1","Type":"ContainerStarted","Data":"014ba63a6b86fe6b5811b840c86e1783b7aca695ce194d32b1fc59c265e1226b"} Mar 14 09:45:56 crc kubenswrapper[5129]: I0314 09:45:56.722412 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hkh9b" podStartSLOduration=3.221707736 podStartE2EDuration="5.722375733s" podCreationTimestamp="2026-03-14 09:45:51 +0000 UTC" firstStartedPulling="2026-03-14 09:45:53.642322649 +0000 UTC m=+10016.394237853" lastFinishedPulling="2026-03-14 09:45:56.142990676 +0000 UTC m=+10018.894905850" observedRunningTime="2026-03-14 09:45:56.709317358 +0000 UTC m=+10019.461232552" 
watchObservedRunningTime="2026-03-14 09:45:56.722375733 +0000 UTC m=+10019.474290917" Mar 14 09:46:00 crc kubenswrapper[5129]: I0314 09:46:00.163395 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558026-dsq9s"] Mar 14 09:46:00 crc kubenswrapper[5129]: I0314 09:46:00.165324 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558026-dsq9s" Mar 14 09:46:00 crc kubenswrapper[5129]: I0314 09:46:00.168115 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:46:00 crc kubenswrapper[5129]: I0314 09:46:00.168579 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:46:00 crc kubenswrapper[5129]: I0314 09:46:00.168878 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:46:00 crc kubenswrapper[5129]: I0314 09:46:00.183862 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558026-dsq9s"] Mar 14 09:46:00 crc kubenswrapper[5129]: I0314 09:46:00.286577 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfjrk\" (UniqueName: \"kubernetes.io/projected/8c7bbc39-64bc-4d2f-b9c9-9a515ac64414-kube-api-access-kfjrk\") pod \"auto-csr-approver-29558026-dsq9s\" (UID: \"8c7bbc39-64bc-4d2f-b9c9-9a515ac64414\") " pod="openshift-infra/auto-csr-approver-29558026-dsq9s" Mar 14 09:46:00 crc kubenswrapper[5129]: I0314 09:46:00.389648 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfjrk\" (UniqueName: \"kubernetes.io/projected/8c7bbc39-64bc-4d2f-b9c9-9a515ac64414-kube-api-access-kfjrk\") pod \"auto-csr-approver-29558026-dsq9s\" (UID: \"8c7bbc39-64bc-4d2f-b9c9-9a515ac64414\") " 
pod="openshift-infra/auto-csr-approver-29558026-dsq9s" Mar 14 09:46:00 crc kubenswrapper[5129]: I0314 09:46:00.416285 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfjrk\" (UniqueName: \"kubernetes.io/projected/8c7bbc39-64bc-4d2f-b9c9-9a515ac64414-kube-api-access-kfjrk\") pod \"auto-csr-approver-29558026-dsq9s\" (UID: \"8c7bbc39-64bc-4d2f-b9c9-9a515ac64414\") " pod="openshift-infra/auto-csr-approver-29558026-dsq9s" Mar 14 09:46:00 crc kubenswrapper[5129]: I0314 09:46:00.491365 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558026-dsq9s" Mar 14 09:46:01 crc kubenswrapper[5129]: I0314 09:46:01.064591 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558026-dsq9s"] Mar 14 09:46:01 crc kubenswrapper[5129]: W0314 09:46:01.072334 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c7bbc39_64bc_4d2f_b9c9_9a515ac64414.slice/crio-844d840c15cdd653ca57be281036845ea7314ef5557478972113ebd13c98e10c WatchSource:0}: Error finding container 844d840c15cdd653ca57be281036845ea7314ef5557478972113ebd13c98e10c: Status 404 returned error can't find the container with id 844d840c15cdd653ca57be281036845ea7314ef5557478972113ebd13c98e10c Mar 14 09:46:01 crc kubenswrapper[5129]: I0314 09:46:01.780177 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558026-dsq9s" event={"ID":"8c7bbc39-64bc-4d2f-b9c9-9a515ac64414","Type":"ContainerStarted","Data":"844d840c15cdd653ca57be281036845ea7314ef5557478972113ebd13c98e10c"} Mar 14 09:46:02 crc kubenswrapper[5129]: I0314 09:46:02.274689 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hkh9b" Mar 14 09:46:02 crc kubenswrapper[5129]: I0314 09:46:02.276936 5129 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hkh9b" Mar 14 09:46:02 crc kubenswrapper[5129]: I0314 09:46:02.339630 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hkh9b" Mar 14 09:46:02 crc kubenswrapper[5129]: I0314 09:46:02.798464 5129 generic.go:334] "Generic (PLEG): container finished" podID="8c7bbc39-64bc-4d2f-b9c9-9a515ac64414" containerID="726cbfa4d02721a7db3d794a673b33f7e05a56b25008eaa61cadbd47290ca253" exitCode=0 Mar 14 09:46:02 crc kubenswrapper[5129]: I0314 09:46:02.798888 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558026-dsq9s" event={"ID":"8c7bbc39-64bc-4d2f-b9c9-9a515ac64414","Type":"ContainerDied","Data":"726cbfa4d02721a7db3d794a673b33f7e05a56b25008eaa61cadbd47290ca253"} Mar 14 09:46:02 crc kubenswrapper[5129]: I0314 09:46:02.849488 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hkh9b" Mar 14 09:46:02 crc kubenswrapper[5129]: I0314 09:46:02.913157 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hkh9b"] Mar 14 09:46:04 crc kubenswrapper[5129]: I0314 09:46:04.247279 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558026-dsq9s" Mar 14 09:46:04 crc kubenswrapper[5129]: I0314 09:46:04.412557 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfjrk\" (UniqueName: \"kubernetes.io/projected/8c7bbc39-64bc-4d2f-b9c9-9a515ac64414-kube-api-access-kfjrk\") pod \"8c7bbc39-64bc-4d2f-b9c9-9a515ac64414\" (UID: \"8c7bbc39-64bc-4d2f-b9c9-9a515ac64414\") " Mar 14 09:46:04 crc kubenswrapper[5129]: I0314 09:46:04.421044 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c7bbc39-64bc-4d2f-b9c9-9a515ac64414-kube-api-access-kfjrk" (OuterVolumeSpecName: "kube-api-access-kfjrk") pod "8c7bbc39-64bc-4d2f-b9c9-9a515ac64414" (UID: "8c7bbc39-64bc-4d2f-b9c9-9a515ac64414"). InnerVolumeSpecName "kube-api-access-kfjrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:46:04 crc kubenswrapper[5129]: I0314 09:46:04.515897 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfjrk\" (UniqueName: \"kubernetes.io/projected/8c7bbc39-64bc-4d2f-b9c9-9a515ac64414-kube-api-access-kfjrk\") on node \"crc\" DevicePath \"\"" Mar 14 09:46:04 crc kubenswrapper[5129]: I0314 09:46:04.828260 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hkh9b" podUID="e4369b6a-4f02-4e3f-a478-613b5973edb1" containerName="registry-server" containerID="cri-o://014ba63a6b86fe6b5811b840c86e1783b7aca695ce194d32b1fc59c265e1226b" gracePeriod=2 Mar 14 09:46:04 crc kubenswrapper[5129]: I0314 09:46:04.828708 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558026-dsq9s" Mar 14 09:46:04 crc kubenswrapper[5129]: I0314 09:46:04.829255 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558026-dsq9s" event={"ID":"8c7bbc39-64bc-4d2f-b9c9-9a515ac64414","Type":"ContainerDied","Data":"844d840c15cdd653ca57be281036845ea7314ef5557478972113ebd13c98e10c"} Mar 14 09:46:04 crc kubenswrapper[5129]: I0314 09:46:04.829334 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="844d840c15cdd653ca57be281036845ea7314ef5557478972113ebd13c98e10c" Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.037753 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:46:05 crc kubenswrapper[5129]: E0314 09:46:05.038154 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.362714 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558020-q9vqx"] Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.375676 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558020-q9vqx"] Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.563301 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hkh9b" Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.673896 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5p4j\" (UniqueName: \"kubernetes.io/projected/e4369b6a-4f02-4e3f-a478-613b5973edb1-kube-api-access-g5p4j\") pod \"e4369b6a-4f02-4e3f-a478-613b5973edb1\" (UID: \"e4369b6a-4f02-4e3f-a478-613b5973edb1\") " Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.674280 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4369b6a-4f02-4e3f-a478-613b5973edb1-utilities\") pod \"e4369b6a-4f02-4e3f-a478-613b5973edb1\" (UID: \"e4369b6a-4f02-4e3f-a478-613b5973edb1\") " Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.674421 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4369b6a-4f02-4e3f-a478-613b5973edb1-catalog-content\") pod \"e4369b6a-4f02-4e3f-a478-613b5973edb1\" (UID: \"e4369b6a-4f02-4e3f-a478-613b5973edb1\") " Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.675717 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4369b6a-4f02-4e3f-a478-613b5973edb1-utilities" (OuterVolumeSpecName: "utilities") pod "e4369b6a-4f02-4e3f-a478-613b5973edb1" (UID: "e4369b6a-4f02-4e3f-a478-613b5973edb1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.681946 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4369b6a-4f02-4e3f-a478-613b5973edb1-kube-api-access-g5p4j" (OuterVolumeSpecName: "kube-api-access-g5p4j") pod "e4369b6a-4f02-4e3f-a478-613b5973edb1" (UID: "e4369b6a-4f02-4e3f-a478-613b5973edb1"). InnerVolumeSpecName "kube-api-access-g5p4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.752822 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4369b6a-4f02-4e3f-a478-613b5973edb1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4369b6a-4f02-4e3f-a478-613b5973edb1" (UID: "e4369b6a-4f02-4e3f-a478-613b5973edb1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.778684 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4369b6a-4f02-4e3f-a478-613b5973edb1-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.779088 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4369b6a-4f02-4e3f-a478-613b5973edb1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.779174 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5p4j\" (UniqueName: \"kubernetes.io/projected/e4369b6a-4f02-4e3f-a478-613b5973edb1-kube-api-access-g5p4j\") on node \"crc\" DevicePath \"\"" Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.871789 5129 generic.go:334] "Generic (PLEG): container finished" podID="e4369b6a-4f02-4e3f-a478-613b5973edb1" containerID="014ba63a6b86fe6b5811b840c86e1783b7aca695ce194d32b1fc59c265e1226b" exitCode=0 Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.871907 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkh9b" event={"ID":"e4369b6a-4f02-4e3f-a478-613b5973edb1","Type":"ContainerDied","Data":"014ba63a6b86fe6b5811b840c86e1783b7aca695ce194d32b1fc59c265e1226b"} Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.871992 5129 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-hkh9b" event={"ID":"e4369b6a-4f02-4e3f-a478-613b5973edb1","Type":"ContainerDied","Data":"87896c7de6da9e5b7adce744ef8f1985320fa5cc244d041d68bdf07597e9bc97"} Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.872025 5129 scope.go:117] "RemoveContainer" containerID="014ba63a6b86fe6b5811b840c86e1783b7aca695ce194d32b1fc59c265e1226b" Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.873083 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hkh9b" Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.923105 5129 scope.go:117] "RemoveContainer" containerID="af98746363dc4d50405dc3dddf69160d7a1b14759376d334a8cc39a70c938e77" Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.945166 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hkh9b"] Mar 14 09:46:05 crc kubenswrapper[5129]: I0314 09:46:05.955408 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hkh9b"] Mar 14 09:46:06 crc kubenswrapper[5129]: I0314 09:46:06.051723 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5463f64e-9ca8-4ff7-8e75-25d58deedadf" path="/var/lib/kubelet/pods/5463f64e-9ca8-4ff7-8e75-25d58deedadf/volumes" Mar 14 09:46:06 crc kubenswrapper[5129]: I0314 09:46:06.052923 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4369b6a-4f02-4e3f-a478-613b5973edb1" path="/var/lib/kubelet/pods/e4369b6a-4f02-4e3f-a478-613b5973edb1/volumes" Mar 14 09:46:06 crc kubenswrapper[5129]: I0314 09:46:06.456545 5129 scope.go:117] "RemoveContainer" containerID="d003cfba9fb9a170785583361692cd3545833bc198fb63639bafcdbdf0ef986c" Mar 14 09:46:06 crc kubenswrapper[5129]: I0314 09:46:06.518294 5129 scope.go:117] "RemoveContainer" containerID="014ba63a6b86fe6b5811b840c86e1783b7aca695ce194d32b1fc59c265e1226b" Mar 14 09:46:06 crc 
kubenswrapper[5129]: E0314 09:46:06.520823 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"014ba63a6b86fe6b5811b840c86e1783b7aca695ce194d32b1fc59c265e1226b\": container with ID starting with 014ba63a6b86fe6b5811b840c86e1783b7aca695ce194d32b1fc59c265e1226b not found: ID does not exist" containerID="014ba63a6b86fe6b5811b840c86e1783b7aca695ce194d32b1fc59c265e1226b" Mar 14 09:46:06 crc kubenswrapper[5129]: I0314 09:46:06.520880 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"014ba63a6b86fe6b5811b840c86e1783b7aca695ce194d32b1fc59c265e1226b"} err="failed to get container status \"014ba63a6b86fe6b5811b840c86e1783b7aca695ce194d32b1fc59c265e1226b\": rpc error: code = NotFound desc = could not find container \"014ba63a6b86fe6b5811b840c86e1783b7aca695ce194d32b1fc59c265e1226b\": container with ID starting with 014ba63a6b86fe6b5811b840c86e1783b7aca695ce194d32b1fc59c265e1226b not found: ID does not exist" Mar 14 09:46:06 crc kubenswrapper[5129]: I0314 09:46:06.520909 5129 scope.go:117] "RemoveContainer" containerID="af98746363dc4d50405dc3dddf69160d7a1b14759376d334a8cc39a70c938e77" Mar 14 09:46:06 crc kubenswrapper[5129]: E0314 09:46:06.521349 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af98746363dc4d50405dc3dddf69160d7a1b14759376d334a8cc39a70c938e77\": container with ID starting with af98746363dc4d50405dc3dddf69160d7a1b14759376d334a8cc39a70c938e77 not found: ID does not exist" containerID="af98746363dc4d50405dc3dddf69160d7a1b14759376d334a8cc39a70c938e77" Mar 14 09:46:06 crc kubenswrapper[5129]: I0314 09:46:06.521411 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af98746363dc4d50405dc3dddf69160d7a1b14759376d334a8cc39a70c938e77"} err="failed to get container status 
\"af98746363dc4d50405dc3dddf69160d7a1b14759376d334a8cc39a70c938e77\": rpc error: code = NotFound desc = could not find container \"af98746363dc4d50405dc3dddf69160d7a1b14759376d334a8cc39a70c938e77\": container with ID starting with af98746363dc4d50405dc3dddf69160d7a1b14759376d334a8cc39a70c938e77 not found: ID does not exist" Mar 14 09:46:06 crc kubenswrapper[5129]: I0314 09:46:06.521449 5129 scope.go:117] "RemoveContainer" containerID="d003cfba9fb9a170785583361692cd3545833bc198fb63639bafcdbdf0ef986c" Mar 14 09:46:06 crc kubenswrapper[5129]: E0314 09:46:06.521831 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d003cfba9fb9a170785583361692cd3545833bc198fb63639bafcdbdf0ef986c\": container with ID starting with d003cfba9fb9a170785583361692cd3545833bc198fb63639bafcdbdf0ef986c not found: ID does not exist" containerID="d003cfba9fb9a170785583361692cd3545833bc198fb63639bafcdbdf0ef986c" Mar 14 09:46:06 crc kubenswrapper[5129]: I0314 09:46:06.521859 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d003cfba9fb9a170785583361692cd3545833bc198fb63639bafcdbdf0ef986c"} err="failed to get container status \"d003cfba9fb9a170785583361692cd3545833bc198fb63639bafcdbdf0ef986c\": rpc error: code = NotFound desc = could not find container \"d003cfba9fb9a170785583361692cd3545833bc198fb63639bafcdbdf0ef986c\": container with ID starting with d003cfba9fb9a170785583361692cd3545833bc198fb63639bafcdbdf0ef986c not found: ID does not exist" Mar 14 09:46:15 crc kubenswrapper[5129]: I0314 09:46:15.809849 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l7c7m"] Mar 14 09:46:15 crc kubenswrapper[5129]: E0314 09:46:15.811006 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7bbc39-64bc-4d2f-b9c9-9a515ac64414" containerName="oc" Mar 14 09:46:15 crc kubenswrapper[5129]: I0314 09:46:15.811022 5129 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7bbc39-64bc-4d2f-b9c9-9a515ac64414" containerName="oc" Mar 14 09:46:15 crc kubenswrapper[5129]: E0314 09:46:15.811039 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4369b6a-4f02-4e3f-a478-613b5973edb1" containerName="extract-utilities" Mar 14 09:46:15 crc kubenswrapper[5129]: I0314 09:46:15.811046 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4369b6a-4f02-4e3f-a478-613b5973edb1" containerName="extract-utilities" Mar 14 09:46:15 crc kubenswrapper[5129]: E0314 09:46:15.811066 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4369b6a-4f02-4e3f-a478-613b5973edb1" containerName="extract-content" Mar 14 09:46:15 crc kubenswrapper[5129]: I0314 09:46:15.811073 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4369b6a-4f02-4e3f-a478-613b5973edb1" containerName="extract-content" Mar 14 09:46:15 crc kubenswrapper[5129]: E0314 09:46:15.811096 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4369b6a-4f02-4e3f-a478-613b5973edb1" containerName="registry-server" Mar 14 09:46:15 crc kubenswrapper[5129]: I0314 09:46:15.811121 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4369b6a-4f02-4e3f-a478-613b5973edb1" containerName="registry-server" Mar 14 09:46:15 crc kubenswrapper[5129]: I0314 09:46:15.811352 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c7bbc39-64bc-4d2f-b9c9-9a515ac64414" containerName="oc" Mar 14 09:46:15 crc kubenswrapper[5129]: I0314 09:46:15.811368 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4369b6a-4f02-4e3f-a478-613b5973edb1" containerName="registry-server" Mar 14 09:46:15 crc kubenswrapper[5129]: I0314 09:46:15.812937 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l7c7m" Mar 14 09:46:15 crc kubenswrapper[5129]: I0314 09:46:15.837116 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv44g\" (UniqueName: \"kubernetes.io/projected/ca2cf854-ae2a-4556-a431-a7a0f0d82da6-kube-api-access-fv44g\") pod \"redhat-marketplace-l7c7m\" (UID: \"ca2cf854-ae2a-4556-a431-a7a0f0d82da6\") " pod="openshift-marketplace/redhat-marketplace-l7c7m" Mar 14 09:46:15 crc kubenswrapper[5129]: I0314 09:46:15.837797 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca2cf854-ae2a-4556-a431-a7a0f0d82da6-utilities\") pod \"redhat-marketplace-l7c7m\" (UID: \"ca2cf854-ae2a-4556-a431-a7a0f0d82da6\") " pod="openshift-marketplace/redhat-marketplace-l7c7m" Mar 14 09:46:15 crc kubenswrapper[5129]: I0314 09:46:15.837949 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca2cf854-ae2a-4556-a431-a7a0f0d82da6-catalog-content\") pod \"redhat-marketplace-l7c7m\" (UID: \"ca2cf854-ae2a-4556-a431-a7a0f0d82da6\") " pod="openshift-marketplace/redhat-marketplace-l7c7m" Mar 14 09:46:15 crc kubenswrapper[5129]: I0314 09:46:15.856731 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7c7m"] Mar 14 09:46:15 crc kubenswrapper[5129]: I0314 09:46:15.940761 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv44g\" (UniqueName: \"kubernetes.io/projected/ca2cf854-ae2a-4556-a431-a7a0f0d82da6-kube-api-access-fv44g\") pod \"redhat-marketplace-l7c7m\" (UID: \"ca2cf854-ae2a-4556-a431-a7a0f0d82da6\") " pod="openshift-marketplace/redhat-marketplace-l7c7m" Mar 14 09:46:15 crc kubenswrapper[5129]: I0314 09:46:15.940878 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca2cf854-ae2a-4556-a431-a7a0f0d82da6-utilities\") pod \"redhat-marketplace-l7c7m\" (UID: \"ca2cf854-ae2a-4556-a431-a7a0f0d82da6\") " pod="openshift-marketplace/redhat-marketplace-l7c7m" Mar 14 09:46:15 crc kubenswrapper[5129]: I0314 09:46:15.940907 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca2cf854-ae2a-4556-a431-a7a0f0d82da6-catalog-content\") pod \"redhat-marketplace-l7c7m\" (UID: \"ca2cf854-ae2a-4556-a431-a7a0f0d82da6\") " pod="openshift-marketplace/redhat-marketplace-l7c7m" Mar 14 09:46:15 crc kubenswrapper[5129]: I0314 09:46:15.941408 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca2cf854-ae2a-4556-a431-a7a0f0d82da6-catalog-content\") pod \"redhat-marketplace-l7c7m\" (UID: \"ca2cf854-ae2a-4556-a431-a7a0f0d82da6\") " pod="openshift-marketplace/redhat-marketplace-l7c7m" Mar 14 09:46:15 crc kubenswrapper[5129]: I0314 09:46:15.941702 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca2cf854-ae2a-4556-a431-a7a0f0d82da6-utilities\") pod \"redhat-marketplace-l7c7m\" (UID: \"ca2cf854-ae2a-4556-a431-a7a0f0d82da6\") " pod="openshift-marketplace/redhat-marketplace-l7c7m" Mar 14 09:46:15 crc kubenswrapper[5129]: I0314 09:46:15.974956 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv44g\" (UniqueName: \"kubernetes.io/projected/ca2cf854-ae2a-4556-a431-a7a0f0d82da6-kube-api-access-fv44g\") pod \"redhat-marketplace-l7c7m\" (UID: \"ca2cf854-ae2a-4556-a431-a7a0f0d82da6\") " pod="openshift-marketplace/redhat-marketplace-l7c7m" Mar 14 09:46:16 crc kubenswrapper[5129]: I0314 09:46:16.147041 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l7c7m" Mar 14 09:46:16 crc kubenswrapper[5129]: I0314 09:46:16.716080 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7c7m"] Mar 14 09:46:17 crc kubenswrapper[5129]: I0314 09:46:17.012398 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7c7m" event={"ID":"ca2cf854-ae2a-4556-a431-a7a0f0d82da6","Type":"ContainerStarted","Data":"7aa95cdc597e4c4fd20cf21c0de18401393e763754c4f084e43f8bcc320710d9"} Mar 14 09:46:17 crc kubenswrapper[5129]: I0314 09:46:17.013025 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7c7m" event={"ID":"ca2cf854-ae2a-4556-a431-a7a0f0d82da6","Type":"ContainerStarted","Data":"4f30ed1c52c03291d5d10a374de79ac054efa63a646ba2719ab234298f41c405"} Mar 14 09:46:18 crc kubenswrapper[5129]: I0314 09:46:18.044790 5129 generic.go:334] "Generic (PLEG): container finished" podID="ca2cf854-ae2a-4556-a431-a7a0f0d82da6" containerID="7aa95cdc597e4c4fd20cf21c0de18401393e763754c4f084e43f8bcc320710d9" exitCode=0 Mar 14 09:46:18 crc kubenswrapper[5129]: I0314 09:46:18.061461 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:46:18 crc kubenswrapper[5129]: E0314 09:46:18.062002 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:46:18 crc kubenswrapper[5129]: I0314 09:46:18.082301 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7c7m" 
event={"ID":"ca2cf854-ae2a-4556-a431-a7a0f0d82da6","Type":"ContainerDied","Data":"7aa95cdc597e4c4fd20cf21c0de18401393e763754c4f084e43f8bcc320710d9"} Mar 14 09:46:18 crc kubenswrapper[5129]: I0314 09:46:18.082355 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7c7m" event={"ID":"ca2cf854-ae2a-4556-a431-a7a0f0d82da6","Type":"ContainerStarted","Data":"3134fe30a202624462840f277c7f1c315b98a106557c96d24ade4319448d486c"} Mar 14 09:46:19 crc kubenswrapper[5129]: I0314 09:46:19.062223 5129 generic.go:334] "Generic (PLEG): container finished" podID="ca2cf854-ae2a-4556-a431-a7a0f0d82da6" containerID="3134fe30a202624462840f277c7f1c315b98a106557c96d24ade4319448d486c" exitCode=0 Mar 14 09:46:19 crc kubenswrapper[5129]: I0314 09:46:19.062466 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7c7m" event={"ID":"ca2cf854-ae2a-4556-a431-a7a0f0d82da6","Type":"ContainerDied","Data":"3134fe30a202624462840f277c7f1c315b98a106557c96d24ade4319448d486c"} Mar 14 09:46:20 crc kubenswrapper[5129]: I0314 09:46:20.093918 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7c7m" event={"ID":"ca2cf854-ae2a-4556-a431-a7a0f0d82da6","Type":"ContainerStarted","Data":"96c002750fff9cad8e9dd0639da1704cc60b6aa335af8a98846e5e5b8931d94c"} Mar 14 09:46:20 crc kubenswrapper[5129]: I0314 09:46:20.121133 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l7c7m" podStartSLOduration=2.677470466 podStartE2EDuration="5.121110865s" podCreationTimestamp="2026-03-14 09:46:15 +0000 UTC" firstStartedPulling="2026-03-14 09:46:17.015722094 +0000 UTC m=+10039.767637278" lastFinishedPulling="2026-03-14 09:46:19.459362473 +0000 UTC m=+10042.211277677" observedRunningTime="2026-03-14 09:46:20.117356633 +0000 UTC m=+10042.869271817" watchObservedRunningTime="2026-03-14 09:46:20.121110865 +0000 UTC 
m=+10042.873026049" Mar 14 09:46:26 crc kubenswrapper[5129]: I0314 09:46:26.147804 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l7c7m" Mar 14 09:46:26 crc kubenswrapper[5129]: I0314 09:46:26.149328 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l7c7m" Mar 14 09:46:26 crc kubenswrapper[5129]: I0314 09:46:26.205685 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l7c7m" Mar 14 09:46:27 crc kubenswrapper[5129]: I0314 09:46:27.264748 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l7c7m" Mar 14 09:46:27 crc kubenswrapper[5129]: I0314 09:46:27.330696 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7c7m"] Mar 14 09:46:29 crc kubenswrapper[5129]: I0314 09:46:29.222568 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l7c7m" podUID="ca2cf854-ae2a-4556-a431-a7a0f0d82da6" containerName="registry-server" containerID="cri-o://96c002750fff9cad8e9dd0639da1704cc60b6aa335af8a98846e5e5b8931d94c" gracePeriod=2 Mar 14 09:46:29 crc kubenswrapper[5129]: I0314 09:46:29.795995 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l7c7m" Mar 14 09:46:29 crc kubenswrapper[5129]: I0314 09:46:29.846889 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca2cf854-ae2a-4556-a431-a7a0f0d82da6-utilities\") pod \"ca2cf854-ae2a-4556-a431-a7a0f0d82da6\" (UID: \"ca2cf854-ae2a-4556-a431-a7a0f0d82da6\") " Mar 14 09:46:29 crc kubenswrapper[5129]: I0314 09:46:29.847051 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv44g\" (UniqueName: \"kubernetes.io/projected/ca2cf854-ae2a-4556-a431-a7a0f0d82da6-kube-api-access-fv44g\") pod \"ca2cf854-ae2a-4556-a431-a7a0f0d82da6\" (UID: \"ca2cf854-ae2a-4556-a431-a7a0f0d82da6\") " Mar 14 09:46:29 crc kubenswrapper[5129]: I0314 09:46:29.847360 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca2cf854-ae2a-4556-a431-a7a0f0d82da6-catalog-content\") pod \"ca2cf854-ae2a-4556-a431-a7a0f0d82da6\" (UID: \"ca2cf854-ae2a-4556-a431-a7a0f0d82da6\") " Mar 14 09:46:29 crc kubenswrapper[5129]: I0314 09:46:29.849316 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca2cf854-ae2a-4556-a431-a7a0f0d82da6-utilities" (OuterVolumeSpecName: "utilities") pod "ca2cf854-ae2a-4556-a431-a7a0f0d82da6" (UID: "ca2cf854-ae2a-4556-a431-a7a0f0d82da6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:46:29 crc kubenswrapper[5129]: I0314 09:46:29.877063 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca2cf854-ae2a-4556-a431-a7a0f0d82da6-kube-api-access-fv44g" (OuterVolumeSpecName: "kube-api-access-fv44g") pod "ca2cf854-ae2a-4556-a431-a7a0f0d82da6" (UID: "ca2cf854-ae2a-4556-a431-a7a0f0d82da6"). InnerVolumeSpecName "kube-api-access-fv44g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:46:29 crc kubenswrapper[5129]: I0314 09:46:29.893200 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca2cf854-ae2a-4556-a431-a7a0f0d82da6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca2cf854-ae2a-4556-a431-a7a0f0d82da6" (UID: "ca2cf854-ae2a-4556-a431-a7a0f0d82da6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:46:29 crc kubenswrapper[5129]: I0314 09:46:29.949845 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv44g\" (UniqueName: \"kubernetes.io/projected/ca2cf854-ae2a-4556-a431-a7a0f0d82da6-kube-api-access-fv44g\") on node \"crc\" DevicePath \"\"" Mar 14 09:46:29 crc kubenswrapper[5129]: I0314 09:46:29.949919 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca2cf854-ae2a-4556-a431-a7a0f0d82da6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:46:29 crc kubenswrapper[5129]: I0314 09:46:29.949930 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca2cf854-ae2a-4556-a431-a7a0f0d82da6-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:46:30 crc kubenswrapper[5129]: I0314 09:46:30.038093 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:46:30 crc kubenswrapper[5129]: E0314 09:46:30.038349 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:46:30 
crc kubenswrapper[5129]: I0314 09:46:30.238849 5129 generic.go:334] "Generic (PLEG): container finished" podID="ca2cf854-ae2a-4556-a431-a7a0f0d82da6" containerID="96c002750fff9cad8e9dd0639da1704cc60b6aa335af8a98846e5e5b8931d94c" exitCode=0 Mar 14 09:46:30 crc kubenswrapper[5129]: I0314 09:46:30.238926 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7c7m" event={"ID":"ca2cf854-ae2a-4556-a431-a7a0f0d82da6","Type":"ContainerDied","Data":"96c002750fff9cad8e9dd0639da1704cc60b6aa335af8a98846e5e5b8931d94c"} Mar 14 09:46:30 crc kubenswrapper[5129]: I0314 09:46:30.239033 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7c7m" event={"ID":"ca2cf854-ae2a-4556-a431-a7a0f0d82da6","Type":"ContainerDied","Data":"4f30ed1c52c03291d5d10a374de79ac054efa63a646ba2719ab234298f41c405"} Mar 14 09:46:30 crc kubenswrapper[5129]: I0314 09:46:30.239038 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l7c7m" Mar 14 09:46:30 crc kubenswrapper[5129]: I0314 09:46:30.239077 5129 scope.go:117] "RemoveContainer" containerID="96c002750fff9cad8e9dd0639da1704cc60b6aa335af8a98846e5e5b8931d94c" Mar 14 09:46:30 crc kubenswrapper[5129]: I0314 09:46:30.273661 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7c7m"] Mar 14 09:46:30 crc kubenswrapper[5129]: I0314 09:46:30.284354 5129 scope.go:117] "RemoveContainer" containerID="3134fe30a202624462840f277c7f1c315b98a106557c96d24ade4319448d486c" Mar 14 09:46:30 crc kubenswrapper[5129]: I0314 09:46:30.285858 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7c7m"] Mar 14 09:46:30 crc kubenswrapper[5129]: I0314 09:46:30.319829 5129 scope.go:117] "RemoveContainer" containerID="7aa95cdc597e4c4fd20cf21c0de18401393e763754c4f084e43f8bcc320710d9" Mar 14 09:46:30 crc kubenswrapper[5129]: 
I0314 09:46:30.393389 5129 scope.go:117] "RemoveContainer" containerID="96c002750fff9cad8e9dd0639da1704cc60b6aa335af8a98846e5e5b8931d94c" Mar 14 09:46:30 crc kubenswrapper[5129]: E0314 09:46:30.395699 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96c002750fff9cad8e9dd0639da1704cc60b6aa335af8a98846e5e5b8931d94c\": container with ID starting with 96c002750fff9cad8e9dd0639da1704cc60b6aa335af8a98846e5e5b8931d94c not found: ID does not exist" containerID="96c002750fff9cad8e9dd0639da1704cc60b6aa335af8a98846e5e5b8931d94c" Mar 14 09:46:30 crc kubenswrapper[5129]: I0314 09:46:30.395761 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c002750fff9cad8e9dd0639da1704cc60b6aa335af8a98846e5e5b8931d94c"} err="failed to get container status \"96c002750fff9cad8e9dd0639da1704cc60b6aa335af8a98846e5e5b8931d94c\": rpc error: code = NotFound desc = could not find container \"96c002750fff9cad8e9dd0639da1704cc60b6aa335af8a98846e5e5b8931d94c\": container with ID starting with 96c002750fff9cad8e9dd0639da1704cc60b6aa335af8a98846e5e5b8931d94c not found: ID does not exist" Mar 14 09:46:30 crc kubenswrapper[5129]: I0314 09:46:30.395790 5129 scope.go:117] "RemoveContainer" containerID="3134fe30a202624462840f277c7f1c315b98a106557c96d24ade4319448d486c" Mar 14 09:46:30 crc kubenswrapper[5129]: E0314 09:46:30.396557 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3134fe30a202624462840f277c7f1c315b98a106557c96d24ade4319448d486c\": container with ID starting with 3134fe30a202624462840f277c7f1c315b98a106557c96d24ade4319448d486c not found: ID does not exist" containerID="3134fe30a202624462840f277c7f1c315b98a106557c96d24ade4319448d486c" Mar 14 09:46:30 crc kubenswrapper[5129]: I0314 09:46:30.396640 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3134fe30a202624462840f277c7f1c315b98a106557c96d24ade4319448d486c"} err="failed to get container status \"3134fe30a202624462840f277c7f1c315b98a106557c96d24ade4319448d486c\": rpc error: code = NotFound desc = could not find container \"3134fe30a202624462840f277c7f1c315b98a106557c96d24ade4319448d486c\": container with ID starting with 3134fe30a202624462840f277c7f1c315b98a106557c96d24ade4319448d486c not found: ID does not exist" Mar 14 09:46:30 crc kubenswrapper[5129]: I0314 09:46:30.396667 5129 scope.go:117] "RemoveContainer" containerID="7aa95cdc597e4c4fd20cf21c0de18401393e763754c4f084e43f8bcc320710d9" Mar 14 09:46:30 crc kubenswrapper[5129]: E0314 09:46:30.397186 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aa95cdc597e4c4fd20cf21c0de18401393e763754c4f084e43f8bcc320710d9\": container with ID starting with 7aa95cdc597e4c4fd20cf21c0de18401393e763754c4f084e43f8bcc320710d9 not found: ID does not exist" containerID="7aa95cdc597e4c4fd20cf21c0de18401393e763754c4f084e43f8bcc320710d9" Mar 14 09:46:30 crc kubenswrapper[5129]: I0314 09:46:30.397278 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa95cdc597e4c4fd20cf21c0de18401393e763754c4f084e43f8bcc320710d9"} err="failed to get container status \"7aa95cdc597e4c4fd20cf21c0de18401393e763754c4f084e43f8bcc320710d9\": rpc error: code = NotFound desc = could not find container \"7aa95cdc597e4c4fd20cf21c0de18401393e763754c4f084e43f8bcc320710d9\": container with ID starting with 7aa95cdc597e4c4fd20cf21c0de18401393e763754c4f084e43f8bcc320710d9 not found: ID does not exist" Mar 14 09:46:32 crc kubenswrapper[5129]: I0314 09:46:32.053795 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca2cf854-ae2a-4556-a431-a7a0f0d82da6" path="/var/lib/kubelet/pods/ca2cf854-ae2a-4556-a431-a7a0f0d82da6/volumes" Mar 14 09:46:37 crc kubenswrapper[5129]: I0314 
09:46:37.433880 5129 scope.go:117] "RemoveContainer" containerID="cf87037207a0cf1f4c446a27dfad5daac5e4fa823171734c3b313ef2a5061051" Mar 14 09:46:44 crc kubenswrapper[5129]: I0314 09:46:44.039415 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:46:44 crc kubenswrapper[5129]: E0314 09:46:44.048506 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:46:56 crc kubenswrapper[5129]: I0314 09:46:56.037285 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:46:56 crc kubenswrapper[5129]: E0314 09:46:56.038537 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:47:11 crc kubenswrapper[5129]: I0314 09:47:11.037067 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:47:11 crc kubenswrapper[5129]: E0314 09:47:11.038332 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:47:23 crc kubenswrapper[5129]: I0314 09:47:23.038292 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:47:23 crc kubenswrapper[5129]: E0314 09:47:23.040027 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:47:32 crc kubenswrapper[5129]: I0314 09:47:32.904569 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lrktq"] Mar 14 09:47:32 crc kubenswrapper[5129]: E0314 09:47:32.905943 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca2cf854-ae2a-4556-a431-a7a0f0d82da6" containerName="extract-content" Mar 14 09:47:32 crc kubenswrapper[5129]: I0314 09:47:32.905961 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2cf854-ae2a-4556-a431-a7a0f0d82da6" containerName="extract-content" Mar 14 09:47:32 crc kubenswrapper[5129]: E0314 09:47:32.905991 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca2cf854-ae2a-4556-a431-a7a0f0d82da6" containerName="extract-utilities" Mar 14 09:47:32 crc kubenswrapper[5129]: I0314 09:47:32.905998 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2cf854-ae2a-4556-a431-a7a0f0d82da6" containerName="extract-utilities" Mar 14 09:47:32 crc kubenswrapper[5129]: E0314 09:47:32.906022 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ca2cf854-ae2a-4556-a431-a7a0f0d82da6" containerName="registry-server" Mar 14 09:47:32 crc kubenswrapper[5129]: I0314 09:47:32.906028 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2cf854-ae2a-4556-a431-a7a0f0d82da6" containerName="registry-server" Mar 14 09:47:32 crc kubenswrapper[5129]: I0314 09:47:32.906262 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca2cf854-ae2a-4556-a431-a7a0f0d82da6" containerName="registry-server" Mar 14 09:47:32 crc kubenswrapper[5129]: I0314 09:47:32.908174 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lrktq" Mar 14 09:47:32 crc kubenswrapper[5129]: I0314 09:47:32.934101 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lrktq"] Mar 14 09:47:32 crc kubenswrapper[5129]: I0314 09:47:32.981447 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c9b7\" (UniqueName: \"kubernetes.io/projected/b1d49833-c9d6-4787-9002-bac2efd7f2b4-kube-api-access-7c9b7\") pod \"redhat-operators-lrktq\" (UID: \"b1d49833-c9d6-4787-9002-bac2efd7f2b4\") " pod="openshift-marketplace/redhat-operators-lrktq" Mar 14 09:47:32 crc kubenswrapper[5129]: I0314 09:47:32.981545 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1d49833-c9d6-4787-9002-bac2efd7f2b4-utilities\") pod \"redhat-operators-lrktq\" (UID: \"b1d49833-c9d6-4787-9002-bac2efd7f2b4\") " pod="openshift-marketplace/redhat-operators-lrktq" Mar 14 09:47:32 crc kubenswrapper[5129]: I0314 09:47:32.981692 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1d49833-c9d6-4787-9002-bac2efd7f2b4-catalog-content\") pod \"redhat-operators-lrktq\" (UID: 
\"b1d49833-c9d6-4787-9002-bac2efd7f2b4\") " pod="openshift-marketplace/redhat-operators-lrktq" Mar 14 09:47:33 crc kubenswrapper[5129]: I0314 09:47:33.085747 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c9b7\" (UniqueName: \"kubernetes.io/projected/b1d49833-c9d6-4787-9002-bac2efd7f2b4-kube-api-access-7c9b7\") pod \"redhat-operators-lrktq\" (UID: \"b1d49833-c9d6-4787-9002-bac2efd7f2b4\") " pod="openshift-marketplace/redhat-operators-lrktq" Mar 14 09:47:33 crc kubenswrapper[5129]: I0314 09:47:33.086311 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1d49833-c9d6-4787-9002-bac2efd7f2b4-utilities\") pod \"redhat-operators-lrktq\" (UID: \"b1d49833-c9d6-4787-9002-bac2efd7f2b4\") " pod="openshift-marketplace/redhat-operators-lrktq" Mar 14 09:47:33 crc kubenswrapper[5129]: I0314 09:47:33.086859 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1d49833-c9d6-4787-9002-bac2efd7f2b4-utilities\") pod \"redhat-operators-lrktq\" (UID: \"b1d49833-c9d6-4787-9002-bac2efd7f2b4\") " pod="openshift-marketplace/redhat-operators-lrktq" Mar 14 09:47:33 crc kubenswrapper[5129]: I0314 09:47:33.087130 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1d49833-c9d6-4787-9002-bac2efd7f2b4-catalog-content\") pod \"redhat-operators-lrktq\" (UID: \"b1d49833-c9d6-4787-9002-bac2efd7f2b4\") " pod="openshift-marketplace/redhat-operators-lrktq" Mar 14 09:47:33 crc kubenswrapper[5129]: I0314 09:47:33.087534 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1d49833-c9d6-4787-9002-bac2efd7f2b4-catalog-content\") pod \"redhat-operators-lrktq\" (UID: \"b1d49833-c9d6-4787-9002-bac2efd7f2b4\") " 
pod="openshift-marketplace/redhat-operators-lrktq" Mar 14 09:47:33 crc kubenswrapper[5129]: I0314 09:47:33.109921 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c9b7\" (UniqueName: \"kubernetes.io/projected/b1d49833-c9d6-4787-9002-bac2efd7f2b4-kube-api-access-7c9b7\") pod \"redhat-operators-lrktq\" (UID: \"b1d49833-c9d6-4787-9002-bac2efd7f2b4\") " pod="openshift-marketplace/redhat-operators-lrktq" Mar 14 09:47:33 crc kubenswrapper[5129]: I0314 09:47:33.237116 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lrktq" Mar 14 09:47:33 crc kubenswrapper[5129]: I0314 09:47:33.831505 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lrktq"] Mar 14 09:47:35 crc kubenswrapper[5129]: I0314 09:47:35.037748 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:47:35 crc kubenswrapper[5129]: E0314 09:47:35.038815 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:47:35 crc kubenswrapper[5129]: I0314 09:47:35.147232 5129 generic.go:334] "Generic (PLEG): container finished" podID="b1d49833-c9d6-4787-9002-bac2efd7f2b4" containerID="89c5f83c798d4bc1a993305fa1cbf5640a5d7029ca189ab88c79bf1edb7652f5" exitCode=0 Mar 14 09:47:35 crc kubenswrapper[5129]: I0314 09:47:35.147665 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lrktq" 
event={"ID":"b1d49833-c9d6-4787-9002-bac2efd7f2b4","Type":"ContainerDied","Data":"89c5f83c798d4bc1a993305fa1cbf5640a5d7029ca189ab88c79bf1edb7652f5"} Mar 14 09:47:35 crc kubenswrapper[5129]: I0314 09:47:35.147832 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lrktq" event={"ID":"b1d49833-c9d6-4787-9002-bac2efd7f2b4","Type":"ContainerStarted","Data":"60c9dd556b1281e36617b25a076af3b4112212533baffe3d55ea15e01c1af947"} Mar 14 09:47:35 crc kubenswrapper[5129]: I0314 09:47:35.152922 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:47:36 crc kubenswrapper[5129]: I0314 09:47:36.162283 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lrktq" event={"ID":"b1d49833-c9d6-4787-9002-bac2efd7f2b4","Type":"ContainerStarted","Data":"01aa0943b03357160074f14c61ff6520303a63f1978081536e7147a35aa5d60b"} Mar 14 09:47:40 crc kubenswrapper[5129]: I0314 09:47:40.219168 5129 generic.go:334] "Generic (PLEG): container finished" podID="b1d49833-c9d6-4787-9002-bac2efd7f2b4" containerID="01aa0943b03357160074f14c61ff6520303a63f1978081536e7147a35aa5d60b" exitCode=0 Mar 14 09:47:40 crc kubenswrapper[5129]: I0314 09:47:40.219231 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lrktq" event={"ID":"b1d49833-c9d6-4787-9002-bac2efd7f2b4","Type":"ContainerDied","Data":"01aa0943b03357160074f14c61ff6520303a63f1978081536e7147a35aa5d60b"} Mar 14 09:47:41 crc kubenswrapper[5129]: I0314 09:47:41.239961 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lrktq" event={"ID":"b1d49833-c9d6-4787-9002-bac2efd7f2b4","Type":"ContainerStarted","Data":"31c40146571f2181ab44b22f7496bed7e8e2b0e7bd1ea22b5e0560e00409a2ec"} Mar 14 09:47:41 crc kubenswrapper[5129]: I0314 09:47:41.287966 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-lrktq" podStartSLOduration=3.757442352 podStartE2EDuration="9.287941659s" podCreationTimestamp="2026-03-14 09:47:32 +0000 UTC" firstStartedPulling="2026-03-14 09:47:35.152481701 +0000 UTC m=+10117.904396915" lastFinishedPulling="2026-03-14 09:47:40.682980998 +0000 UTC m=+10123.434896222" observedRunningTime="2026-03-14 09:47:41.273688862 +0000 UTC m=+10124.025604056" watchObservedRunningTime="2026-03-14 09:47:41.287941659 +0000 UTC m=+10124.039856843" Mar 14 09:47:43 crc kubenswrapper[5129]: I0314 09:47:43.238203 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lrktq" Mar 14 09:47:43 crc kubenswrapper[5129]: I0314 09:47:43.238269 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lrktq" Mar 14 09:47:44 crc kubenswrapper[5129]: I0314 09:47:44.305583 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lrktq" podUID="b1d49833-c9d6-4787-9002-bac2efd7f2b4" containerName="registry-server" probeResult="failure" output=< Mar 14 09:47:44 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 09:47:44 crc kubenswrapper[5129]: > Mar 14 09:47:46 crc kubenswrapper[5129]: I0314 09:47:46.037824 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:47:46 crc kubenswrapper[5129]: E0314 09:47:46.038931 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:47:52 crc 
kubenswrapper[5129]: I0314 09:47:52.402302 5129 generic.go:334] "Generic (PLEG): container finished" podID="5e83070c-b8b4-468d-bf05-414509537764" containerID="d3ed89ac6338ebf8f3f2df40f7555a885fe36cd3bf93233cbf6d80f0c5a24f60" exitCode=0 Mar 14 09:47:52 crc kubenswrapper[5129]: I0314 09:47:52.402556 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" event={"ID":"5e83070c-b8b4-468d-bf05-414509537764","Type":"ContainerDied","Data":"d3ed89ac6338ebf8f3f2df40f7555a885fe36cd3bf93233cbf6d80f0c5a24f60"} Mar 14 09:47:53 crc kubenswrapper[5129]: I0314 09:47:53.337283 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lrktq" Mar 14 09:47:53 crc kubenswrapper[5129]: I0314 09:47:53.418930 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lrktq" Mar 14 09:47:53 crc kubenswrapper[5129]: I0314 09:47:53.593220 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lrktq"] Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.022790 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.194056 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ceilometer-compute-config-data-0\") pod \"5e83070c-b8b4-468d-bf05-414509537764\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.194139 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ceilometer-compute-config-data-1\") pod \"5e83070c-b8b4-468d-bf05-414509537764\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.194173 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ssh-key-openstack-cell1\") pod \"5e83070c-b8b4-468d-bf05-414509537764\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.194258 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2h8n\" (UniqueName: \"kubernetes.io/projected/5e83070c-b8b4-468d-bf05-414509537764-kube-api-access-b2h8n\") pod \"5e83070c-b8b4-468d-bf05-414509537764\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.194356 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ceilometer-compute-config-data-2\") pod \"5e83070c-b8b4-468d-bf05-414509537764\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " Mar 14 09:47:54 
crc kubenswrapper[5129]: I0314 09:47:54.194383 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-telemetry-combined-ca-bundle\") pod \"5e83070c-b8b4-468d-bf05-414509537764\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.194496 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-inventory\") pod \"5e83070c-b8b4-468d-bf05-414509537764\" (UID: \"5e83070c-b8b4-468d-bf05-414509537764\") " Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.201817 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e83070c-b8b4-468d-bf05-414509537764-kube-api-access-b2h8n" (OuterVolumeSpecName: "kube-api-access-b2h8n") pod "5e83070c-b8b4-468d-bf05-414509537764" (UID: "5e83070c-b8b4-468d-bf05-414509537764"). InnerVolumeSpecName "kube-api-access-b2h8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.202931 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "5e83070c-b8b4-468d-bf05-414509537764" (UID: "5e83070c-b8b4-468d-bf05-414509537764"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.228869 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "5e83070c-b8b4-468d-bf05-414509537764" (UID: "5e83070c-b8b4-468d-bf05-414509537764"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.229554 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5e83070c-b8b4-468d-bf05-414509537764" (UID: "5e83070c-b8b4-468d-bf05-414509537764"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.230920 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "5e83070c-b8b4-468d-bf05-414509537764" (UID: "5e83070c-b8b4-468d-bf05-414509537764"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.239519 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-inventory" (OuterVolumeSpecName: "inventory") pod "5e83070c-b8b4-468d-bf05-414509537764" (UID: "5e83070c-b8b4-468d-bf05-414509537764"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.242818 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "5e83070c-b8b4-468d-bf05-414509537764" (UID: "5e83070c-b8b4-468d-bf05-414509537764"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.298386 5129 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.299000 5129 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.299018 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.299033 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2h8n\" (UniqueName: \"kubernetes.io/projected/5e83070c-b8b4-468d-bf05-414509537764-kube-api-access-b2h8n\") on node \"crc\" DevicePath \"\"" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.299045 5129 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-ceilometer-compute-config-data-2\") on node \"crc\" 
DevicePath \"\"" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.299060 5129 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.299073 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e83070c-b8b4-468d-bf05-414509537764-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.437755 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" event={"ID":"5e83070c-b8b4-468d-bf05-414509537764","Type":"ContainerDied","Data":"593468c32e02946874469ed1a8b9c37601ccf6be3b4f4daa1f83f68ee4c09365"} Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.438792 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="593468c32e02946874469ed1a8b9c37601ccf6be3b4f4daa1f83f68ee4c09365" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.438413 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lrktq" podUID="b1d49833-c9d6-4787-9002-bac2efd7f2b4" containerName="registry-server" containerID="cri-o://31c40146571f2181ab44b22f7496bed7e8e2b0e7bd1ea22b5e0560e00409a2ec" gracePeriod=2 Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.437833 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9zmjr" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.726118 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-56qrq"] Mar 14 09:47:54 crc kubenswrapper[5129]: E0314 09:47:54.726689 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e83070c-b8b4-468d-bf05-414509537764" containerName="telemetry-openstack-openstack-cell1" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.726712 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e83070c-b8b4-468d-bf05-414509537764" containerName="telemetry-openstack-openstack-cell1" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.726982 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e83070c-b8b4-468d-bf05-414509537764" containerName="telemetry-openstack-openstack-cell1" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.727922 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.731435 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.731697 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.731847 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.731993 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.732139 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.748900 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-56qrq"] Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.888460 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lrktq" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.915034 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-56qrq\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.915222 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-56qrq\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.915272 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-56qrq\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.915432 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-56qrq\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:47:54 crc kubenswrapper[5129]: I0314 09:47:54.915469 5129 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjdfg\" (UniqueName: \"kubernetes.io/projected/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-kube-api-access-qjdfg\") pod \"neutron-sriov-openstack-openstack-cell1-56qrq\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.017081 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1d49833-c9d6-4787-9002-bac2efd7f2b4-utilities\") pod \"b1d49833-c9d6-4787-9002-bac2efd7f2b4\" (UID: \"b1d49833-c9d6-4787-9002-bac2efd7f2b4\") " Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.017260 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1d49833-c9d6-4787-9002-bac2efd7f2b4-catalog-content\") pod \"b1d49833-c9d6-4787-9002-bac2efd7f2b4\" (UID: \"b1d49833-c9d6-4787-9002-bac2efd7f2b4\") " Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.017375 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c9b7\" (UniqueName: \"kubernetes.io/projected/b1d49833-c9d6-4787-9002-bac2efd7f2b4-kube-api-access-7c9b7\") pod \"b1d49833-c9d6-4787-9002-bac2efd7f2b4\" (UID: \"b1d49833-c9d6-4787-9002-bac2efd7f2b4\") " Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.017927 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-56qrq\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.017969 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjdfg\" (UniqueName: \"kubernetes.io/projected/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-kube-api-access-qjdfg\") pod \"neutron-sriov-openstack-openstack-cell1-56qrq\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.018085 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-56qrq\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.018198 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-56qrq\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.018242 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-56qrq\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.019384 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1d49833-c9d6-4787-9002-bac2efd7f2b4-utilities" (OuterVolumeSpecName: "utilities") pod "b1d49833-c9d6-4787-9002-bac2efd7f2b4" (UID: 
"b1d49833-c9d6-4787-9002-bac2efd7f2b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.025883 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-56qrq\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.025891 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-56qrq\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.026467 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-56qrq\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.026705 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d49833-c9d6-4787-9002-bac2efd7f2b4-kube-api-access-7c9b7" (OuterVolumeSpecName: "kube-api-access-7c9b7") pod "b1d49833-c9d6-4787-9002-bac2efd7f2b4" (UID: "b1d49833-c9d6-4787-9002-bac2efd7f2b4"). InnerVolumeSpecName "kube-api-access-7c9b7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.040653 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-56qrq\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.044015 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjdfg\" (UniqueName: \"kubernetes.io/projected/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-kube-api-access-qjdfg\") pod \"neutron-sriov-openstack-openstack-cell1-56qrq\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.059305 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.120532 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c9b7\" (UniqueName: \"kubernetes.io/projected/b1d49833-c9d6-4787-9002-bac2efd7f2b4-kube-api-access-7c9b7\") on node \"crc\" DevicePath \"\"" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.120571 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1d49833-c9d6-4787-9002-bac2efd7f2b4-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.228959 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1d49833-c9d6-4787-9002-bac2efd7f2b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1d49833-c9d6-4787-9002-bac2efd7f2b4" (UID: "b1d49833-c9d6-4787-9002-bac2efd7f2b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.337098 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1d49833-c9d6-4787-9002-bac2efd7f2b4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.456272 5129 generic.go:334] "Generic (PLEG): container finished" podID="b1d49833-c9d6-4787-9002-bac2efd7f2b4" containerID="31c40146571f2181ab44b22f7496bed7e8e2b0e7bd1ea22b5e0560e00409a2ec" exitCode=0 Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.456340 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lrktq" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.456383 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lrktq" event={"ID":"b1d49833-c9d6-4787-9002-bac2efd7f2b4","Type":"ContainerDied","Data":"31c40146571f2181ab44b22f7496bed7e8e2b0e7bd1ea22b5e0560e00409a2ec"} Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.456457 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lrktq" event={"ID":"b1d49833-c9d6-4787-9002-bac2efd7f2b4","Type":"ContainerDied","Data":"60c9dd556b1281e36617b25a076af3b4112212533baffe3d55ea15e01c1af947"} Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.456511 5129 scope.go:117] "RemoveContainer" containerID="31c40146571f2181ab44b22f7496bed7e8e2b0e7bd1ea22b5e0560e00409a2ec" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.510377 5129 scope.go:117] "RemoveContainer" containerID="01aa0943b03357160074f14c61ff6520303a63f1978081536e7147a35aa5d60b" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.516156 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lrktq"] Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.543231 5129 scope.go:117] "RemoveContainer" containerID="89c5f83c798d4bc1a993305fa1cbf5640a5d7029ca189ab88c79bf1edb7652f5" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.544343 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lrktq"] Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.615858 5129 scope.go:117] "RemoveContainer" containerID="31c40146571f2181ab44b22f7496bed7e8e2b0e7bd1ea22b5e0560e00409a2ec" Mar 14 09:47:55 crc kubenswrapper[5129]: E0314 09:47:55.617269 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"31c40146571f2181ab44b22f7496bed7e8e2b0e7bd1ea22b5e0560e00409a2ec\": container with ID starting with 31c40146571f2181ab44b22f7496bed7e8e2b0e7bd1ea22b5e0560e00409a2ec not found: ID does not exist" containerID="31c40146571f2181ab44b22f7496bed7e8e2b0e7bd1ea22b5e0560e00409a2ec" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.617708 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31c40146571f2181ab44b22f7496bed7e8e2b0e7bd1ea22b5e0560e00409a2ec"} err="failed to get container status \"31c40146571f2181ab44b22f7496bed7e8e2b0e7bd1ea22b5e0560e00409a2ec\": rpc error: code = NotFound desc = could not find container \"31c40146571f2181ab44b22f7496bed7e8e2b0e7bd1ea22b5e0560e00409a2ec\": container with ID starting with 31c40146571f2181ab44b22f7496bed7e8e2b0e7bd1ea22b5e0560e00409a2ec not found: ID does not exist" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.617826 5129 scope.go:117] "RemoveContainer" containerID="01aa0943b03357160074f14c61ff6520303a63f1978081536e7147a35aa5d60b" Mar 14 09:47:55 crc kubenswrapper[5129]: E0314 09:47:55.618410 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01aa0943b03357160074f14c61ff6520303a63f1978081536e7147a35aa5d60b\": container with ID starting with 01aa0943b03357160074f14c61ff6520303a63f1978081536e7147a35aa5d60b not found: ID does not exist" containerID="01aa0943b03357160074f14c61ff6520303a63f1978081536e7147a35aa5d60b" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.618476 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01aa0943b03357160074f14c61ff6520303a63f1978081536e7147a35aa5d60b"} err="failed to get container status \"01aa0943b03357160074f14c61ff6520303a63f1978081536e7147a35aa5d60b\": rpc error: code = NotFound desc = could not find container \"01aa0943b03357160074f14c61ff6520303a63f1978081536e7147a35aa5d60b\": container with ID 
starting with 01aa0943b03357160074f14c61ff6520303a63f1978081536e7147a35aa5d60b not found: ID does not exist" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.618519 5129 scope.go:117] "RemoveContainer" containerID="89c5f83c798d4bc1a993305fa1cbf5640a5d7029ca189ab88c79bf1edb7652f5" Mar 14 09:47:55 crc kubenswrapper[5129]: E0314 09:47:55.618883 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89c5f83c798d4bc1a993305fa1cbf5640a5d7029ca189ab88c79bf1edb7652f5\": container with ID starting with 89c5f83c798d4bc1a993305fa1cbf5640a5d7029ca189ab88c79bf1edb7652f5 not found: ID does not exist" containerID="89c5f83c798d4bc1a993305fa1cbf5640a5d7029ca189ab88c79bf1edb7652f5" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.618908 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c5f83c798d4bc1a993305fa1cbf5640a5d7029ca189ab88c79bf1edb7652f5"} err="failed to get container status \"89c5f83c798d4bc1a993305fa1cbf5640a5d7029ca189ab88c79bf1edb7652f5\": rpc error: code = NotFound desc = could not find container \"89c5f83c798d4bc1a993305fa1cbf5640a5d7029ca189ab88c79bf1edb7652f5\": container with ID starting with 89c5f83c798d4bc1a993305fa1cbf5640a5d7029ca189ab88c79bf1edb7652f5 not found: ID does not exist" Mar 14 09:47:55 crc kubenswrapper[5129]: I0314 09:47:55.728301 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-56qrq"] Mar 14 09:47:56 crc kubenswrapper[5129]: I0314 09:47:56.052037 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1d49833-c9d6-4787-9002-bac2efd7f2b4" path="/var/lib/kubelet/pods/b1d49833-c9d6-4787-9002-bac2efd7f2b4/volumes" Mar 14 09:47:56 crc kubenswrapper[5129]: I0314 09:47:56.468568 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" 
event={"ID":"1b7e8f66-393d-491c-b548-2eb8e08b7b1c","Type":"ContainerStarted","Data":"d4768b8b22a5a6d06ee20d3cbff743292bcd62cd77f633c97be097eccaf24410"} Mar 14 09:47:57 crc kubenswrapper[5129]: I0314 09:47:57.484434 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" event={"ID":"1b7e8f66-393d-491c-b548-2eb8e08b7b1c","Type":"ContainerStarted","Data":"6d2a48e47ff8ea112ccbf5c3849923012d2c97dff57382b015f6a59bff68fea3"} Mar 14 09:47:57 crc kubenswrapper[5129]: I0314 09:47:57.515653 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" podStartSLOduration=3.002716601 podStartE2EDuration="3.515597642s" podCreationTimestamp="2026-03-14 09:47:54 +0000 UTC" firstStartedPulling="2026-03-14 09:47:55.736714047 +0000 UTC m=+10138.488629251" lastFinishedPulling="2026-03-14 09:47:56.249595108 +0000 UTC m=+10139.001510292" observedRunningTime="2026-03-14 09:47:57.512974361 +0000 UTC m=+10140.264889585" watchObservedRunningTime="2026-03-14 09:47:57.515597642 +0000 UTC m=+10140.267512866" Mar 14 09:48:00 crc kubenswrapper[5129]: I0314 09:48:00.039181 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:48:00 crc kubenswrapper[5129]: E0314 09:48:00.040592 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:48:00 crc kubenswrapper[5129]: I0314 09:48:00.147265 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558028-9c48n"] Mar 14 09:48:00 crc 
kubenswrapper[5129]: E0314 09:48:00.147901 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d49833-c9d6-4787-9002-bac2efd7f2b4" containerName="extract-utilities" Mar 14 09:48:00 crc kubenswrapper[5129]: I0314 09:48:00.147925 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d49833-c9d6-4787-9002-bac2efd7f2b4" containerName="extract-utilities" Mar 14 09:48:00 crc kubenswrapper[5129]: E0314 09:48:00.147940 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d49833-c9d6-4787-9002-bac2efd7f2b4" containerName="extract-content" Mar 14 09:48:00 crc kubenswrapper[5129]: I0314 09:48:00.147948 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d49833-c9d6-4787-9002-bac2efd7f2b4" containerName="extract-content" Mar 14 09:48:00 crc kubenswrapper[5129]: E0314 09:48:00.147961 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d49833-c9d6-4787-9002-bac2efd7f2b4" containerName="registry-server" Mar 14 09:48:00 crc kubenswrapper[5129]: I0314 09:48:00.147970 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d49833-c9d6-4787-9002-bac2efd7f2b4" containerName="registry-server" Mar 14 09:48:00 crc kubenswrapper[5129]: I0314 09:48:00.148281 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1d49833-c9d6-4787-9002-bac2efd7f2b4" containerName="registry-server" Mar 14 09:48:00 crc kubenswrapper[5129]: I0314 09:48:00.149227 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558028-9c48n" Mar 14 09:48:00 crc kubenswrapper[5129]: I0314 09:48:00.151934 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:48:00 crc kubenswrapper[5129]: I0314 09:48:00.152290 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:48:00 crc kubenswrapper[5129]: I0314 09:48:00.152570 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:48:00 crc kubenswrapper[5129]: I0314 09:48:00.161566 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558028-9c48n"] Mar 14 09:48:00 crc kubenswrapper[5129]: I0314 09:48:00.180434 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqg8r\" (UniqueName: \"kubernetes.io/projected/7e3134ce-77da-48c0-a333-5ba15fc92075-kube-api-access-wqg8r\") pod \"auto-csr-approver-29558028-9c48n\" (UID: \"7e3134ce-77da-48c0-a333-5ba15fc92075\") " pod="openshift-infra/auto-csr-approver-29558028-9c48n" Mar 14 09:48:00 crc kubenswrapper[5129]: I0314 09:48:00.283072 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqg8r\" (UniqueName: \"kubernetes.io/projected/7e3134ce-77da-48c0-a333-5ba15fc92075-kube-api-access-wqg8r\") pod \"auto-csr-approver-29558028-9c48n\" (UID: \"7e3134ce-77da-48c0-a333-5ba15fc92075\") " pod="openshift-infra/auto-csr-approver-29558028-9c48n" Mar 14 09:48:00 crc kubenswrapper[5129]: I0314 09:48:00.311534 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqg8r\" (UniqueName: \"kubernetes.io/projected/7e3134ce-77da-48c0-a333-5ba15fc92075-kube-api-access-wqg8r\") pod \"auto-csr-approver-29558028-9c48n\" (UID: \"7e3134ce-77da-48c0-a333-5ba15fc92075\") " 
pod="openshift-infra/auto-csr-approver-29558028-9c48n" Mar 14 09:48:00 crc kubenswrapper[5129]: I0314 09:48:00.481733 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558028-9c48n" Mar 14 09:48:00 crc kubenswrapper[5129]: I0314 09:48:00.996570 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558028-9c48n"] Mar 14 09:48:01 crc kubenswrapper[5129]: I0314 09:48:01.543413 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558028-9c48n" event={"ID":"7e3134ce-77da-48c0-a333-5ba15fc92075","Type":"ContainerStarted","Data":"86fdf744d8eb47658a04ff71ed3c0b0cea7d26c525cf448abba822467f409d57"} Mar 14 09:48:02 crc kubenswrapper[5129]: I0314 09:48:02.558590 5129 generic.go:334] "Generic (PLEG): container finished" podID="7e3134ce-77da-48c0-a333-5ba15fc92075" containerID="6bcc3d092f44b453c23c9935b660f292c545da33bae0cf773f0de7c6e3cbfa65" exitCode=0 Mar 14 09:48:02 crc kubenswrapper[5129]: I0314 09:48:02.558701 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558028-9c48n" event={"ID":"7e3134ce-77da-48c0-a333-5ba15fc92075","Type":"ContainerDied","Data":"6bcc3d092f44b453c23c9935b660f292c545da33bae0cf773f0de7c6e3cbfa65"} Mar 14 09:48:04 crc kubenswrapper[5129]: I0314 09:48:04.007996 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558028-9c48n" Mar 14 09:48:04 crc kubenswrapper[5129]: I0314 09:48:04.179556 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqg8r\" (UniqueName: \"kubernetes.io/projected/7e3134ce-77da-48c0-a333-5ba15fc92075-kube-api-access-wqg8r\") pod \"7e3134ce-77da-48c0-a333-5ba15fc92075\" (UID: \"7e3134ce-77da-48c0-a333-5ba15fc92075\") " Mar 14 09:48:04 crc kubenswrapper[5129]: I0314 09:48:04.185479 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e3134ce-77da-48c0-a333-5ba15fc92075-kube-api-access-wqg8r" (OuterVolumeSpecName: "kube-api-access-wqg8r") pod "7e3134ce-77da-48c0-a333-5ba15fc92075" (UID: "7e3134ce-77da-48c0-a333-5ba15fc92075"). InnerVolumeSpecName "kube-api-access-wqg8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:48:04 crc kubenswrapper[5129]: I0314 09:48:04.283321 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqg8r\" (UniqueName: \"kubernetes.io/projected/7e3134ce-77da-48c0-a333-5ba15fc92075-kube-api-access-wqg8r\") on node \"crc\" DevicePath \"\"" Mar 14 09:48:04 crc kubenswrapper[5129]: I0314 09:48:04.592675 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558028-9c48n" event={"ID":"7e3134ce-77da-48c0-a333-5ba15fc92075","Type":"ContainerDied","Data":"86fdf744d8eb47658a04ff71ed3c0b0cea7d26c525cf448abba822467f409d57"} Mar 14 09:48:04 crc kubenswrapper[5129]: I0314 09:48:04.592731 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86fdf744d8eb47658a04ff71ed3c0b0cea7d26c525cf448abba822467f409d57" Mar 14 09:48:04 crc kubenswrapper[5129]: I0314 09:48:04.592787 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558028-9c48n" Mar 14 09:48:05 crc kubenswrapper[5129]: I0314 09:48:05.118398 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558022-cjhmx"] Mar 14 09:48:05 crc kubenswrapper[5129]: I0314 09:48:05.165474 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558022-cjhmx"] Mar 14 09:48:06 crc kubenswrapper[5129]: I0314 09:48:06.056863 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90cdb4f4-8350-4920-b5a3-5097987bf81a" path="/var/lib/kubelet/pods/90cdb4f4-8350-4920-b5a3-5097987bf81a/volumes" Mar 14 09:48:15 crc kubenswrapper[5129]: I0314 09:48:15.038349 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:48:15 crc kubenswrapper[5129]: E0314 09:48:15.040157 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:48:27 crc kubenswrapper[5129]: I0314 09:48:27.037567 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:48:27 crc kubenswrapper[5129]: E0314 09:48:27.038911 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:48:37 crc kubenswrapper[5129]: I0314 09:48:37.600403 5129 scope.go:117] "RemoveContainer" containerID="af09795cd96bddd051e6a54886fbb3c130c9ce0f35174a98ee6144185ac3efd2" Mar 14 09:48:39 crc kubenswrapper[5129]: I0314 09:48:39.037719 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:48:39 crc kubenswrapper[5129]: E0314 09:48:39.038581 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:48:41 crc kubenswrapper[5129]: I0314 09:48:41.141637 5129 generic.go:334] "Generic (PLEG): container finished" podID="1b7e8f66-393d-491c-b548-2eb8e08b7b1c" containerID="6d2a48e47ff8ea112ccbf5c3849923012d2c97dff57382b015f6a59bff68fea3" exitCode=0 Mar 14 09:48:41 crc kubenswrapper[5129]: I0314 09:48:41.141682 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" event={"ID":"1b7e8f66-393d-491c-b548-2eb8e08b7b1c","Type":"ContainerDied","Data":"6d2a48e47ff8ea112ccbf5c3849923012d2c97dff57382b015f6a59bff68fea3"} Mar 14 09:48:42 crc kubenswrapper[5129]: I0314 09:48:42.683768 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:48:42 crc kubenswrapper[5129]: I0314 09:48:42.774259 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-ssh-key-openstack-cell1\") pod \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " Mar 14 09:48:42 crc kubenswrapper[5129]: I0314 09:48:42.774403 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-neutron-sriov-agent-neutron-config-0\") pod \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " Mar 14 09:48:42 crc kubenswrapper[5129]: I0314 09:48:42.774507 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-neutron-sriov-combined-ca-bundle\") pod \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " Mar 14 09:48:42 crc kubenswrapper[5129]: I0314 09:48:42.774660 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-inventory\") pod \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " Mar 14 09:48:42 crc kubenswrapper[5129]: I0314 09:48:42.774815 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjdfg\" (UniqueName: \"kubernetes.io/projected/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-kube-api-access-qjdfg\") pod \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\" (UID: \"1b7e8f66-393d-491c-b548-2eb8e08b7b1c\") " Mar 14 09:48:42 crc kubenswrapper[5129]: I0314 
09:48:42.794908 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "1b7e8f66-393d-491c-b548-2eb8e08b7b1c" (UID: "1b7e8f66-393d-491c-b548-2eb8e08b7b1c"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:48:42 crc kubenswrapper[5129]: I0314 09:48:42.795160 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-kube-api-access-qjdfg" (OuterVolumeSpecName: "kube-api-access-qjdfg") pod "1b7e8f66-393d-491c-b548-2eb8e08b7b1c" (UID: "1b7e8f66-393d-491c-b548-2eb8e08b7b1c"). InnerVolumeSpecName "kube-api-access-qjdfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:48:42 crc kubenswrapper[5129]: I0314 09:48:42.810685 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "1b7e8f66-393d-491c-b548-2eb8e08b7b1c" (UID: "1b7e8f66-393d-491c-b548-2eb8e08b7b1c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:48:42 crc kubenswrapper[5129]: I0314 09:48:42.816533 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "1b7e8f66-393d-491c-b548-2eb8e08b7b1c" (UID: "1b7e8f66-393d-491c-b548-2eb8e08b7b1c"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:48:42 crc kubenswrapper[5129]: I0314 09:48:42.832253 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-inventory" (OuterVolumeSpecName: "inventory") pod "1b7e8f66-393d-491c-b548-2eb8e08b7b1c" (UID: "1b7e8f66-393d-491c-b548-2eb8e08b7b1c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:48:42 crc kubenswrapper[5129]: I0314 09:48:42.878091 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjdfg\" (UniqueName: \"kubernetes.io/projected/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-kube-api-access-qjdfg\") on node \"crc\" DevicePath \"\"" Mar 14 09:48:42 crc kubenswrapper[5129]: I0314 09:48:42.878371 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:48:42 crc kubenswrapper[5129]: I0314 09:48:42.878560 5129 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:48:42 crc kubenswrapper[5129]: I0314 09:48:42.878757 5129 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:48:42 crc kubenswrapper[5129]: I0314 09:48:42.878888 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b7e8f66-393d-491c-b548-2eb8e08b7b1c-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.173565 5129 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" event={"ID":"1b7e8f66-393d-491c-b548-2eb8e08b7b1c","Type":"ContainerDied","Data":"d4768b8b22a5a6d06ee20d3cbff743292bcd62cd77f633c97be097eccaf24410"} Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.173773 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4768b8b22a5a6d06ee20d3cbff743292bcd62cd77f633c97be097eccaf24410" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.173667 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-56qrq" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.389748 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5"] Mar 14 09:48:43 crc kubenswrapper[5129]: E0314 09:48:43.391310 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e3134ce-77da-48c0-a333-5ba15fc92075" containerName="oc" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.391347 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e3134ce-77da-48c0-a333-5ba15fc92075" containerName="oc" Mar 14 09:48:43 crc kubenswrapper[5129]: E0314 09:48:43.391418 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b7e8f66-393d-491c-b548-2eb8e08b7b1c" containerName="neutron-sriov-openstack-openstack-cell1" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.391430 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b7e8f66-393d-491c-b548-2eb8e08b7b1c" containerName="neutron-sriov-openstack-openstack-cell1" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.391730 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e3134ce-77da-48c0-a333-5ba15fc92075" containerName="oc" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.391790 5129 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1b7e8f66-393d-491c-b548-2eb8e08b7b1c" containerName="neutron-sriov-openstack-openstack-cell1" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.392951 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.395522 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.397132 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.398191 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.398951 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.399159 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.403827 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5"] Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.508885 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-tg2d5\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.509177 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-tg2d5\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.509377 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-tg2d5\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.509559 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkngk\" (UniqueName: \"kubernetes.io/projected/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-kube-api-access-gkngk\") pod \"neutron-dhcp-openstack-openstack-cell1-tg2d5\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.509665 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-tg2d5\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.612326 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-tg2d5\" 
(UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.612473 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkngk\" (UniqueName: \"kubernetes.io/projected/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-kube-api-access-gkngk\") pod \"neutron-dhcp-openstack-openstack-cell1-tg2d5\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.612535 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-tg2d5\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.612744 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-tg2d5\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.612806 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-tg2d5\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.619437 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-tg2d5\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.619548 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-tg2d5\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.620367 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-tg2d5\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.621414 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-tg2d5\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.639800 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkngk\" (UniqueName: \"kubernetes.io/projected/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-kube-api-access-gkngk\") pod \"neutron-dhcp-openstack-openstack-cell1-tg2d5\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:48:43 crc kubenswrapper[5129]: I0314 09:48:43.716505 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:48:44 crc kubenswrapper[5129]: I0314 09:48:44.393288 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5"] Mar 14 09:48:45 crc kubenswrapper[5129]: I0314 09:48:45.197174 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" event={"ID":"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a","Type":"ContainerStarted","Data":"e21d26d52c4083fd9ef7f9597e3a8ac094dc98bf28465868f96caec94278f6c4"} Mar 14 09:48:45 crc kubenswrapper[5129]: I0314 09:48:45.197765 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" event={"ID":"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a","Type":"ContainerStarted","Data":"4541df561ceddf64f1f62500abd03cabfce416a0a74d1256e0f4d6aac8bc792d"} Mar 14 09:48:45 crc kubenswrapper[5129]: I0314 09:48:45.232558 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" podStartSLOduration=1.829992758 podStartE2EDuration="2.232533643s" podCreationTimestamp="2026-03-14 09:48:43 +0000 UTC" firstStartedPulling="2026-03-14 09:48:44.379840809 +0000 UTC m=+10187.131756023" lastFinishedPulling="2026-03-14 09:48:44.782381704 +0000 UTC m=+10187.534296908" observedRunningTime="2026-03-14 09:48:45.223121088 +0000 UTC m=+10187.975036342" watchObservedRunningTime="2026-03-14 09:48:45.232533643 +0000 UTC m=+10187.984448837" Mar 14 09:48:53 crc kubenswrapper[5129]: I0314 09:48:53.037343 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:48:54 crc kubenswrapper[5129]: I0314 09:48:54.325627 5129 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"39ea69b02689a042635c743d7d2e440d716c42761508e3ed743fe8476218b55d"} Mar 14 09:49:09 crc kubenswrapper[5129]: I0314 09:49:09.136930 5129 trace.go:236] Trace[97804955]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-server-0" (14-Mar-2026 09:49:08.096) (total time: 1040ms): Mar 14 09:49:09 crc kubenswrapper[5129]: Trace[97804955]: [1.040533483s] [1.040533483s] END Mar 14 09:49:49 crc kubenswrapper[5129]: I0314 09:49:49.203480 5129 generic.go:334] "Generic (PLEG): container finished" podID="0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a" containerID="e21d26d52c4083fd9ef7f9597e3a8ac094dc98bf28465868f96caec94278f6c4" exitCode=0 Mar 14 09:49:49 crc kubenswrapper[5129]: I0314 09:49:49.203578 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" event={"ID":"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a","Type":"ContainerDied","Data":"e21d26d52c4083fd9ef7f9597e3a8ac094dc98bf28465868f96caec94278f6c4"} Mar 14 09:49:50 crc kubenswrapper[5129]: I0314 09:49:50.706520 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:49:50 crc kubenswrapper[5129]: I0314 09:49:50.808265 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkngk\" (UniqueName: \"kubernetes.io/projected/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-kube-api-access-gkngk\") pod \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " Mar 14 09:49:50 crc kubenswrapper[5129]: I0314 09:49:50.808489 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-ssh-key-openstack-cell1\") pod \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " Mar 14 09:49:50 crc kubenswrapper[5129]: I0314 09:49:50.808570 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-neutron-dhcp-agent-neutron-config-0\") pod \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " Mar 14 09:49:50 crc kubenswrapper[5129]: I0314 09:49:50.808596 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-inventory\") pod \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " Mar 14 09:49:50 crc kubenswrapper[5129]: I0314 09:49:50.808770 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-neutron-dhcp-combined-ca-bundle\") pod \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\" (UID: \"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a\") " Mar 14 09:49:50 crc kubenswrapper[5129]: I0314 
09:49:50.822979 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-kube-api-access-gkngk" (OuterVolumeSpecName: "kube-api-access-gkngk") pod "0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a" (UID: "0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a"). InnerVolumeSpecName "kube-api-access-gkngk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:49:50 crc kubenswrapper[5129]: I0314 09:49:50.832926 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a" (UID: "0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:49:50 crc kubenswrapper[5129]: I0314 09:49:50.854803 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a" (UID: "0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:49:50 crc kubenswrapper[5129]: I0314 09:49:50.866947 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-inventory" (OuterVolumeSpecName: "inventory") pod "0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a" (UID: "0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:49:50 crc kubenswrapper[5129]: I0314 09:49:50.880908 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a" (UID: "0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:49:50 crc kubenswrapper[5129]: I0314 09:49:50.912077 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkngk\" (UniqueName: \"kubernetes.io/projected/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-kube-api-access-gkngk\") on node \"crc\" DevicePath \"\"" Mar 14 09:49:50 crc kubenswrapper[5129]: I0314 09:49:50.912130 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:49:50 crc kubenswrapper[5129]: I0314 09:49:50.912151 5129 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:49:50 crc kubenswrapper[5129]: I0314 09:49:50.912170 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:49:50 crc kubenswrapper[5129]: I0314 09:49:50.912193 5129 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:49:51 crc kubenswrapper[5129]: I0314 
09:49:51.230317 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" event={"ID":"0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a","Type":"ContainerDied","Data":"4541df561ceddf64f1f62500abd03cabfce416a0a74d1256e0f4d6aac8bc792d"} Mar 14 09:49:51 crc kubenswrapper[5129]: I0314 09:49:51.230372 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4541df561ceddf64f1f62500abd03cabfce416a0a74d1256e0f4d6aac8bc792d" Mar 14 09:49:51 crc kubenswrapper[5129]: I0314 09:49:51.230386 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-tg2d5" Mar 14 09:50:00 crc kubenswrapper[5129]: I0314 09:50:00.177442 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558030-l827m"] Mar 14 09:50:00 crc kubenswrapper[5129]: E0314 09:50:00.179212 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a" containerName="neutron-dhcp-openstack-openstack-cell1" Mar 14 09:50:00 crc kubenswrapper[5129]: I0314 09:50:00.179239 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a" containerName="neutron-dhcp-openstack-openstack-cell1" Mar 14 09:50:00 crc kubenswrapper[5129]: I0314 09:50:00.179836 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a" containerName="neutron-dhcp-openstack-openstack-cell1" Mar 14 09:50:00 crc kubenswrapper[5129]: I0314 09:50:00.181960 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558030-l827m" Mar 14 09:50:00 crc kubenswrapper[5129]: I0314 09:50:00.185393 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:50:00 crc kubenswrapper[5129]: I0314 09:50:00.185837 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:50:00 crc kubenswrapper[5129]: I0314 09:50:00.186071 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:50:00 crc kubenswrapper[5129]: I0314 09:50:00.197017 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558030-l827m"] Mar 14 09:50:00 crc kubenswrapper[5129]: I0314 09:50:00.302982 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrkmf\" (UniqueName: \"kubernetes.io/projected/c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25-kube-api-access-mrkmf\") pod \"auto-csr-approver-29558030-l827m\" (UID: \"c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25\") " pod="openshift-infra/auto-csr-approver-29558030-l827m" Mar 14 09:50:00 crc kubenswrapper[5129]: I0314 09:50:00.405710 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrkmf\" (UniqueName: \"kubernetes.io/projected/c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25-kube-api-access-mrkmf\") pod \"auto-csr-approver-29558030-l827m\" (UID: \"c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25\") " pod="openshift-infra/auto-csr-approver-29558030-l827m" Mar 14 09:50:00 crc kubenswrapper[5129]: I0314 09:50:00.437579 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrkmf\" (UniqueName: \"kubernetes.io/projected/c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25-kube-api-access-mrkmf\") pod \"auto-csr-approver-29558030-l827m\" (UID: \"c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25\") " 
pod="openshift-infra/auto-csr-approver-29558030-l827m" Mar 14 09:50:00 crc kubenswrapper[5129]: I0314 09:50:00.511758 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558030-l827m" Mar 14 09:50:01 crc kubenswrapper[5129]: I0314 09:50:01.093928 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558030-l827m"] Mar 14 09:50:01 crc kubenswrapper[5129]: W0314 09:50:01.116260 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4d00e8e_a3ff_464e_87bf_b5cfbd4d0c25.slice/crio-bd8b12190c66e01dac391a532eb6aaada8880a181c5a0402500e1a7befe76a86 WatchSource:0}: Error finding container bd8b12190c66e01dac391a532eb6aaada8880a181c5a0402500e1a7befe76a86: Status 404 returned error can't find the container with id bd8b12190c66e01dac391a532eb6aaada8880a181c5a0402500e1a7befe76a86 Mar 14 09:50:01 crc kubenswrapper[5129]: I0314 09:50:01.354769 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558030-l827m" event={"ID":"c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25","Type":"ContainerStarted","Data":"bd8b12190c66e01dac391a532eb6aaada8880a181c5a0402500e1a7befe76a86"} Mar 14 09:50:02 crc kubenswrapper[5129]: E0314 09:50:02.873640 5129 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4d00e8e_a3ff_464e_87bf_b5cfbd4d0c25.slice/crio-conmon-f97b2630266a6a99f241ea7371794aa2573a6f8651c7d6b6c5bf8d37f6ee3c5e.scope\": RecentStats: unable to find data in memory cache]" Mar 14 09:50:03 crc kubenswrapper[5129]: I0314 09:50:03.387005 5129 generic.go:334] "Generic (PLEG): container finished" podID="c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25" containerID="f97b2630266a6a99f241ea7371794aa2573a6f8651c7d6b6c5bf8d37f6ee3c5e" exitCode=0 Mar 14 09:50:03 crc 
kubenswrapper[5129]: I0314 09:50:03.387088 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558030-l827m" event={"ID":"c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25","Type":"ContainerDied","Data":"f97b2630266a6a99f241ea7371794aa2573a6f8651c7d6b6c5bf8d37f6ee3c5e"} Mar 14 09:50:04 crc kubenswrapper[5129]: I0314 09:50:04.831926 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558030-l827m" Mar 14 09:50:04 crc kubenswrapper[5129]: I0314 09:50:04.934025 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrkmf\" (UniqueName: \"kubernetes.io/projected/c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25-kube-api-access-mrkmf\") pod \"c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25\" (UID: \"c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25\") " Mar 14 09:50:04 crc kubenswrapper[5129]: I0314 09:50:04.940906 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25-kube-api-access-mrkmf" (OuterVolumeSpecName: "kube-api-access-mrkmf") pod "c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25" (UID: "c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25"). InnerVolumeSpecName "kube-api-access-mrkmf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:50:05 crc kubenswrapper[5129]: I0314 09:50:05.037746 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrkmf\" (UniqueName: \"kubernetes.io/projected/c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25-kube-api-access-mrkmf\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:05 crc kubenswrapper[5129]: I0314 09:50:05.417661 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558030-l827m" event={"ID":"c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25","Type":"ContainerDied","Data":"bd8b12190c66e01dac391a532eb6aaada8880a181c5a0402500e1a7befe76a86"} Mar 14 09:50:05 crc kubenswrapper[5129]: I0314 09:50:05.417726 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd8b12190c66e01dac391a532eb6aaada8880a181c5a0402500e1a7befe76a86" Mar 14 09:50:05 crc kubenswrapper[5129]: I0314 09:50:05.417752 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558030-l827m" Mar 14 09:50:05 crc kubenswrapper[5129]: I0314 09:50:05.932515 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558024-vmbd2"] Mar 14 09:50:05 crc kubenswrapper[5129]: I0314 09:50:05.948892 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558024-vmbd2"] Mar 14 09:50:06 crc kubenswrapper[5129]: I0314 09:50:06.053329 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89441e20-499b-4f4f-b396-fc01294f25bb" path="/var/lib/kubelet/pods/89441e20-499b-4f4f-b396-fc01294f25bb/volumes" Mar 14 09:50:13 crc kubenswrapper[5129]: I0314 09:50:13.232808 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 09:50:13 crc kubenswrapper[5129]: I0314 09:50:13.233964 5129 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-cell0-conductor-0" podUID="6f79de3c-8258-4bcf-a312-057813424e32" containerName="nova-cell0-conductor-conductor" containerID="cri-o://574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088" gracePeriod=30 Mar 14 09:50:13 crc kubenswrapper[5129]: I0314 09:50:13.808745 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 14 09:50:13 crc kubenswrapper[5129]: I0314 09:50:13.809472 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="21a58896-9f4d-4489-9153-a3b8afe3cf4d" containerName="nova-cell1-conductor-conductor" containerID="cri-o://83c7ffc57bf3bf2e0bf3ecf8b5a8730366d28b41616007b4398844ae1165bbff" gracePeriod=30 Mar 14 09:50:13 crc kubenswrapper[5129]: I0314 09:50:13.966275 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 09:50:13 crc kubenswrapper[5129]: I0314 09:50:13.966537 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="03ba9afa-3daa-45ae-849c-ebfa4cbc08a5" containerName="nova-scheduler-scheduler" containerID="cri-o://472634747f52059da3d761b3a3bdd41094664ee0ace37406a9752c847b14d94e" gracePeriod=30 Mar 14 09:50:13 crc kubenswrapper[5129]: I0314 09:50:13.998200 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 09:50:13 crc kubenswrapper[5129]: I0314 09:50:13.998575 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="af1b5d6b-7c27-46cc-932c-cccec1fd8eca" containerName="nova-api-log" containerID="cri-o://9e8a4770203afb39f79326205afcb095edbeea6008e03a7fe92d9458e7d894a6" gracePeriod=30 Mar 14 09:50:13 crc kubenswrapper[5129]: I0314 09:50:13.999207 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="af1b5d6b-7c27-46cc-932c-cccec1fd8eca" 
containerName="nova-api-api" containerID="cri-o://cbdfb8b0ab3287febbf034c3a49f809061aa20d007aa0cba7d27620d21d4a9b7" gracePeriod=30 Mar 14 09:50:14 crc kubenswrapper[5129]: I0314 09:50:14.010991 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:50:14 crc kubenswrapper[5129]: I0314 09:50:14.011280 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fc194f90-df3e-4808-a5a6-8cbff23f04a0" containerName="nova-metadata-log" containerID="cri-o://4bf71143b0559213036068f1f3d659044bcc0bf54ddd158b895ec54f9bae65f0" gracePeriod=30 Mar 14 09:50:14 crc kubenswrapper[5129]: I0314 09:50:14.011402 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fc194f90-df3e-4808-a5a6-8cbff23f04a0" containerName="nova-metadata-metadata" containerID="cri-o://180360d91d5ab64db48f11db1ed5a475e0f0bcbb71ddadbcad80fb051c73ecb0" gracePeriod=30 Mar 14 09:50:14 crc kubenswrapper[5129]: I0314 09:50:14.527965 5129 generic.go:334] "Generic (PLEG): container finished" podID="af1b5d6b-7c27-46cc-932c-cccec1fd8eca" containerID="9e8a4770203afb39f79326205afcb095edbeea6008e03a7fe92d9458e7d894a6" exitCode=143 Mar 14 09:50:14 crc kubenswrapper[5129]: I0314 09:50:14.528326 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af1b5d6b-7c27-46cc-932c-cccec1fd8eca","Type":"ContainerDied","Data":"9e8a4770203afb39f79326205afcb095edbeea6008e03a7fe92d9458e7d894a6"} Mar 14 09:50:14 crc kubenswrapper[5129]: I0314 09:50:14.546947 5129 generic.go:334] "Generic (PLEG): container finished" podID="fc194f90-df3e-4808-a5a6-8cbff23f04a0" containerID="4bf71143b0559213036068f1f3d659044bcc0bf54ddd158b895ec54f9bae65f0" exitCode=143 Mar 14 09:50:14 crc kubenswrapper[5129]: I0314 09:50:14.547002 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"fc194f90-df3e-4808-a5a6-8cbff23f04a0","Type":"ContainerDied","Data":"4bf71143b0559213036068f1f3d659044bcc0bf54ddd158b895ec54f9bae65f0"} Mar 14 09:50:14 crc kubenswrapper[5129]: I0314 09:50:14.899361 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.011402 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v592m\" (UniqueName: \"kubernetes.io/projected/21a58896-9f4d-4489-9153-a3b8afe3cf4d-kube-api-access-v592m\") pod \"21a58896-9f4d-4489-9153-a3b8afe3cf4d\" (UID: \"21a58896-9f4d-4489-9153-a3b8afe3cf4d\") " Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.011972 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a58896-9f4d-4489-9153-a3b8afe3cf4d-combined-ca-bundle\") pod \"21a58896-9f4d-4489-9153-a3b8afe3cf4d\" (UID: \"21a58896-9f4d-4489-9153-a3b8afe3cf4d\") " Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.012103 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a58896-9f4d-4489-9153-a3b8afe3cf4d-config-data\") pod \"21a58896-9f4d-4489-9153-a3b8afe3cf4d\" (UID: \"21a58896-9f4d-4489-9153-a3b8afe3cf4d\") " Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.018908 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21a58896-9f4d-4489-9153-a3b8afe3cf4d-kube-api-access-v592m" (OuterVolumeSpecName: "kube-api-access-v592m") pod "21a58896-9f4d-4489-9153-a3b8afe3cf4d" (UID: "21a58896-9f4d-4489-9153-a3b8afe3cf4d"). InnerVolumeSpecName "kube-api-access-v592m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.048065 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a58896-9f4d-4489-9153-a3b8afe3cf4d-config-data" (OuterVolumeSpecName: "config-data") pod "21a58896-9f4d-4489-9153-a3b8afe3cf4d" (UID: "21a58896-9f4d-4489-9153-a3b8afe3cf4d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.055569 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a58896-9f4d-4489-9153-a3b8afe3cf4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21a58896-9f4d-4489-9153-a3b8afe3cf4d" (UID: "21a58896-9f4d-4489-9153-a3b8afe3cf4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.114477 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v592m\" (UniqueName: \"kubernetes.io/projected/21a58896-9f4d-4489-9153-a3b8afe3cf4d-kube-api-access-v592m\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.114518 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a58896-9f4d-4489-9153-a3b8afe3cf4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.114532 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a58896-9f4d-4489-9153-a3b8afe3cf4d-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.557715 5129 generic.go:334] "Generic (PLEG): container finished" podID="21a58896-9f4d-4489-9153-a3b8afe3cf4d" containerID="83c7ffc57bf3bf2e0bf3ecf8b5a8730366d28b41616007b4398844ae1165bbff" 
exitCode=0 Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.557767 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"21a58896-9f4d-4489-9153-a3b8afe3cf4d","Type":"ContainerDied","Data":"83c7ffc57bf3bf2e0bf3ecf8b5a8730366d28b41616007b4398844ae1165bbff"} Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.557780 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.557804 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"21a58896-9f4d-4489-9153-a3b8afe3cf4d","Type":"ContainerDied","Data":"903bb0fc346f245841ed75c5f73cadcd8387c83124e697ca8f88a5c6754e4418"} Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.557826 5129 scope.go:117] "RemoveContainer" containerID="83c7ffc57bf3bf2e0bf3ecf8b5a8730366d28b41616007b4398844ae1165bbff" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.595412 5129 scope.go:117] "RemoveContainer" containerID="83c7ffc57bf3bf2e0bf3ecf8b5a8730366d28b41616007b4398844ae1165bbff" Mar 14 09:50:15 crc kubenswrapper[5129]: E0314 09:50:15.598849 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83c7ffc57bf3bf2e0bf3ecf8b5a8730366d28b41616007b4398844ae1165bbff\": container with ID starting with 83c7ffc57bf3bf2e0bf3ecf8b5a8730366d28b41616007b4398844ae1165bbff not found: ID does not exist" containerID="83c7ffc57bf3bf2e0bf3ecf8b5a8730366d28b41616007b4398844ae1165bbff" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.598929 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83c7ffc57bf3bf2e0bf3ecf8b5a8730366d28b41616007b4398844ae1165bbff"} err="failed to get container status \"83c7ffc57bf3bf2e0bf3ecf8b5a8730366d28b41616007b4398844ae1165bbff\": rpc error: code = NotFound desc 
= could not find container \"83c7ffc57bf3bf2e0bf3ecf8b5a8730366d28b41616007b4398844ae1165bbff\": container with ID starting with 83c7ffc57bf3bf2e0bf3ecf8b5a8730366d28b41616007b4398844ae1165bbff not found: ID does not exist" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.617504 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.649687 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.679658 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 14 09:50:15 crc kubenswrapper[5129]: E0314 09:50:15.680204 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25" containerName="oc" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.680224 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25" containerName="oc" Mar 14 09:50:15 crc kubenswrapper[5129]: E0314 09:50:15.680245 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21a58896-9f4d-4489-9153-a3b8afe3cf4d" containerName="nova-cell1-conductor-conductor" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.680251 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="21a58896-9f4d-4489-9153-a3b8afe3cf4d" containerName="nova-cell1-conductor-conductor" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.680431 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25" containerName="oc" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.680455 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="21a58896-9f4d-4489-9153-a3b8afe3cf4d" containerName="nova-cell1-conductor-conductor" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.681328 5129 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.693315 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.699159 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.728141 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccmsk\" (UniqueName: \"kubernetes.io/projected/43a2480b-1483-449f-b24a-7d2213ad8a2f-kube-api-access-ccmsk\") pod \"nova-cell1-conductor-0\" (UID: \"43a2480b-1483-449f-b24a-7d2213ad8a2f\") " pod="openstack/nova-cell1-conductor-0" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.728202 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a2480b-1483-449f-b24a-7d2213ad8a2f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"43a2480b-1483-449f-b24a-7d2213ad8a2f\") " pod="openstack/nova-cell1-conductor-0" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.728346 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a2480b-1483-449f-b24a-7d2213ad8a2f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"43a2480b-1483-449f-b24a-7d2213ad8a2f\") " pod="openstack/nova-cell1-conductor-0" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.830841 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a2480b-1483-449f-b24a-7d2213ad8a2f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"43a2480b-1483-449f-b24a-7d2213ad8a2f\") " pod="openstack/nova-cell1-conductor-0" Mar 14 09:50:15 
crc kubenswrapper[5129]: I0314 09:50:15.830946 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccmsk\" (UniqueName: \"kubernetes.io/projected/43a2480b-1483-449f-b24a-7d2213ad8a2f-kube-api-access-ccmsk\") pod \"nova-cell1-conductor-0\" (UID: \"43a2480b-1483-449f-b24a-7d2213ad8a2f\") " pod="openstack/nova-cell1-conductor-0" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.830977 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a2480b-1483-449f-b24a-7d2213ad8a2f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"43a2480b-1483-449f-b24a-7d2213ad8a2f\") " pod="openstack/nova-cell1-conductor-0" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.843489 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a2480b-1483-449f-b24a-7d2213ad8a2f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"43a2480b-1483-449f-b24a-7d2213ad8a2f\") " pod="openstack/nova-cell1-conductor-0" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.847869 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a2480b-1483-449f-b24a-7d2213ad8a2f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"43a2480b-1483-449f-b24a-7d2213ad8a2f\") " pod="openstack/nova-cell1-conductor-0" Mar 14 09:50:15 crc kubenswrapper[5129]: I0314 09:50:15.880475 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccmsk\" (UniqueName: \"kubernetes.io/projected/43a2480b-1483-449f-b24a-7d2213ad8a2f-kube-api-access-ccmsk\") pod \"nova-cell1-conductor-0\" (UID: \"43a2480b-1483-449f-b24a-7d2213ad8a2f\") " pod="openstack/nova-cell1-conductor-0" Mar 14 09:50:16 crc kubenswrapper[5129]: I0314 09:50:16.034557 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 14 09:50:16 crc kubenswrapper[5129]: I0314 09:50:16.052840 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21a58896-9f4d-4489-9153-a3b8afe3cf4d" path="/var/lib/kubelet/pods/21a58896-9f4d-4489-9153-a3b8afe3cf4d/volumes" Mar 14 09:50:17 crc kubenswrapper[5129]: E0314 09:50:17.296920 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088 is running failed: container process not found" containerID="574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 14 09:50:17 crc kubenswrapper[5129]: E0314 09:50:17.298180 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088 is running failed: container process not found" containerID="574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 14 09:50:17 crc kubenswrapper[5129]: E0314 09:50:17.298639 5129 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088 is running failed: container process not found" containerID="574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 14 09:50:17 crc kubenswrapper[5129]: E0314 09:50:17.298676 5129 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088 is running failed: container process not 
found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="6f79de3c-8258-4bcf-a312-057813424e32" containerName="nova-cell0-conductor-conductor" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.366948 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 14 09:50:17 crc kubenswrapper[5129]: W0314 09:50:17.404840 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43a2480b_1483_449f_b24a_7d2213ad8a2f.slice/crio-f618b663ca0137586a42a8c742222953ec54fe9f497315f4c6b00919910d5922 WatchSource:0}: Error finding container f618b663ca0137586a42a8c742222953ec54fe9f497315f4c6b00919910d5922: Status 404 returned error can't find the container with id f618b663ca0137586a42a8c742222953ec54fe9f497315f4c6b00919910d5922 Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.481874 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.571142 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq52d\" (UniqueName: \"kubernetes.io/projected/6f79de3c-8258-4bcf-a312-057813424e32-kube-api-access-kq52d\") pod \"6f79de3c-8258-4bcf-a312-057813424e32\" (UID: \"6f79de3c-8258-4bcf-a312-057813424e32\") " Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.571306 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f79de3c-8258-4bcf-a312-057813424e32-combined-ca-bundle\") pod \"6f79de3c-8258-4bcf-a312-057813424e32\" (UID: \"6f79de3c-8258-4bcf-a312-057813424e32\") " Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.571447 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6f79de3c-8258-4bcf-a312-057813424e32-config-data\") pod \"6f79de3c-8258-4bcf-a312-057813424e32\" (UID: \"6f79de3c-8258-4bcf-a312-057813424e32\") " Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.578748 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f79de3c-8258-4bcf-a312-057813424e32-kube-api-access-kq52d" (OuterVolumeSpecName: "kube-api-access-kq52d") pod "6f79de3c-8258-4bcf-a312-057813424e32" (UID: "6f79de3c-8258-4bcf-a312-057813424e32"). InnerVolumeSpecName "kube-api-access-kq52d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.585318 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"43a2480b-1483-449f-b24a-7d2213ad8a2f","Type":"ContainerStarted","Data":"f618b663ca0137586a42a8c742222953ec54fe9f497315f4c6b00919910d5922"} Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.587083 5129 generic.go:334] "Generic (PLEG): container finished" podID="af1b5d6b-7c27-46cc-932c-cccec1fd8eca" containerID="cbdfb8b0ab3287febbf034c3a49f809061aa20d007aa0cba7d27620d21d4a9b7" exitCode=0 Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.587130 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af1b5d6b-7c27-46cc-932c-cccec1fd8eca","Type":"ContainerDied","Data":"cbdfb8b0ab3287febbf034c3a49f809061aa20d007aa0cba7d27620d21d4a9b7"} Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.589279 5129 generic.go:334] "Generic (PLEG): container finished" podID="03ba9afa-3daa-45ae-849c-ebfa4cbc08a5" containerID="472634747f52059da3d761b3a3bdd41094664ee0ace37406a9752c847b14d94e" exitCode=0 Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.589324 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5","Type":"ContainerDied","Data":"472634747f52059da3d761b3a3bdd41094664ee0ace37406a9752c847b14d94e"} Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.589340 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5","Type":"ContainerDied","Data":"6212e83dc95f430abcb73e89c4a4c82a0dd7dc29612f3c0a1ed0d1008b793d1f"} Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.589351 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6212e83dc95f430abcb73e89c4a4c82a0dd7dc29612f3c0a1ed0d1008b793d1f" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.590731 5129 generic.go:334] "Generic (PLEG): container finished" podID="6f79de3c-8258-4bcf-a312-057813424e32" containerID="574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088" exitCode=0 Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.590793 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6f79de3c-8258-4bcf-a312-057813424e32","Type":"ContainerDied","Data":"574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088"} Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.590825 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6f79de3c-8258-4bcf-a312-057813424e32","Type":"ContainerDied","Data":"49c3271b49cf9e621325f1d86c01adba6c37200663e868c3663ef37fe49a2e0d"} Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.590846 5129 scope.go:117] "RemoveContainer" containerID="574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.590896 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.590973 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.593649 5129 generic.go:334] "Generic (PLEG): container finished" podID="fc194f90-df3e-4808-a5a6-8cbff23f04a0" containerID="180360d91d5ab64db48f11db1ed5a475e0f0bcbb71ddadbcad80fb051c73ecb0" exitCode=0 Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.593673 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc194f90-df3e-4808-a5a6-8cbff23f04a0","Type":"ContainerDied","Data":"180360d91d5ab64db48f11db1ed5a475e0f0bcbb71ddadbcad80fb051c73ecb0"} Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.632919 5129 scope.go:117] "RemoveContainer" containerID="574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088" Mar 14 09:50:17 crc kubenswrapper[5129]: E0314 09:50:17.633301 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088\": container with ID starting with 574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088 not found: ID does not exist" containerID="574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.633308 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f79de3c-8258-4bcf-a312-057813424e32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f79de3c-8258-4bcf-a312-057813424e32" (UID: "6f79de3c-8258-4bcf-a312-057813424e32"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.633343 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088"} err="failed to get container status \"574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088\": rpc error: code = NotFound desc = could not find container \"574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088\": container with ID starting with 574ebf49806272ec598c228b7d81f86c67d26bffd88e31742277301873ef8088 not found: ID does not exist" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.652079 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f79de3c-8258-4bcf-a312-057813424e32-config-data" (OuterVolumeSpecName: "config-data") pod "6f79de3c-8258-4bcf-a312-057813424e32" (UID: "6f79de3c-8258-4bcf-a312-057813424e32"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.680267 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5-combined-ca-bundle\") pod \"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5\" (UID: \"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5\") " Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.680534 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc489\" (UniqueName: \"kubernetes.io/projected/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5-kube-api-access-mc489\") pod \"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5\" (UID: \"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5\") " Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.680743 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5-config-data\") pod \"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5\" (UID: \"03ba9afa-3daa-45ae-849c-ebfa4cbc08a5\") " Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.681858 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f79de3c-8258-4bcf-a312-057813424e32-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.681886 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq52d\" (UniqueName: \"kubernetes.io/projected/6f79de3c-8258-4bcf-a312-057813424e32-kube-api-access-kq52d\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.681900 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f79de3c-8258-4bcf-a312-057813424e32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 
09:50:17.722004 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5-kube-api-access-mc489" (OuterVolumeSpecName: "kube-api-access-mc489") pod "03ba9afa-3daa-45ae-849c-ebfa4cbc08a5" (UID: "03ba9afa-3daa-45ae-849c-ebfa4cbc08a5"). InnerVolumeSpecName "kube-api-access-mc489". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.741846 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5-config-data" (OuterVolumeSpecName: "config-data") pod "03ba9afa-3daa-45ae-849c-ebfa4cbc08a5" (UID: "03ba9afa-3daa-45ae-849c-ebfa4cbc08a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.748145 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.756638 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03ba9afa-3daa-45ae-849c-ebfa4cbc08a5" (UID: "03ba9afa-3daa-45ae-849c-ebfa4cbc08a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.762958 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.796395 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc194f90-df3e-4808-a5a6-8cbff23f04a0-config-data\") pod \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.796458 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc194f90-df3e-4808-a5a6-8cbff23f04a0-logs\") pod \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.796508 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc194f90-df3e-4808-a5a6-8cbff23f04a0-nova-metadata-tls-certs\") pod \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.796583 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-logs\") pod \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.796674 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-config-data\") pod \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.796752 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-combined-ca-bundle\") pod \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.796805 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dtbt\" (UniqueName: \"kubernetes.io/projected/fc194f90-df3e-4808-a5a6-8cbff23f04a0-kube-api-access-6dtbt\") pod \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.796838 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-internal-tls-certs\") pod \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.796896 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djc67\" (UniqueName: \"kubernetes.io/projected/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-kube-api-access-djc67\") pod \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.796988 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-public-tls-certs\") pod \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\" (UID: \"af1b5d6b-7c27-46cc-932c-cccec1fd8eca\") " Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.797015 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc194f90-df3e-4808-a5a6-8cbff23f04a0-combined-ca-bundle\") pod \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\" (UID: \"fc194f90-df3e-4808-a5a6-8cbff23f04a0\") " Mar 14 
09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.797325 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-logs" (OuterVolumeSpecName: "logs") pod "af1b5d6b-7c27-46cc-932c-cccec1fd8eca" (UID: "af1b5d6b-7c27-46cc-932c-cccec1fd8eca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.797489 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.797509 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.797520 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.797530 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc489\" (UniqueName: \"kubernetes.io/projected/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5-kube-api-access-mc489\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.798828 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc194f90-df3e-4808-a5a6-8cbff23f04a0-logs" (OuterVolumeSpecName: "logs") pod "fc194f90-df3e-4808-a5a6-8cbff23f04a0" (UID: "fc194f90-df3e-4808-a5a6-8cbff23f04a0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.833326 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-kube-api-access-djc67" (OuterVolumeSpecName: "kube-api-access-djc67") pod "af1b5d6b-7c27-46cc-932c-cccec1fd8eca" (UID: "af1b5d6b-7c27-46cc-932c-cccec1fd8eca"). InnerVolumeSpecName "kube-api-access-djc67". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.833858 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc194f90-df3e-4808-a5a6-8cbff23f04a0-kube-api-access-6dtbt" (OuterVolumeSpecName: "kube-api-access-6dtbt") pod "fc194f90-df3e-4808-a5a6-8cbff23f04a0" (UID: "fc194f90-df3e-4808-a5a6-8cbff23f04a0"). InnerVolumeSpecName "kube-api-access-6dtbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.882114 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc194f90-df3e-4808-a5a6-8cbff23f04a0-config-data" (OuterVolumeSpecName: "config-data") pod "fc194f90-df3e-4808-a5a6-8cbff23f04a0" (UID: "fc194f90-df3e-4808-a5a6-8cbff23f04a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.882229 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-config-data" (OuterVolumeSpecName: "config-data") pod "af1b5d6b-7c27-46cc-932c-cccec1fd8eca" (UID: "af1b5d6b-7c27-46cc-932c-cccec1fd8eca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.888248 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc194f90-df3e-4808-a5a6-8cbff23f04a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc194f90-df3e-4808-a5a6-8cbff23f04a0" (UID: "fc194f90-df3e-4808-a5a6-8cbff23f04a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.907863 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.907913 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dtbt\" (UniqueName: \"kubernetes.io/projected/fc194f90-df3e-4808-a5a6-8cbff23f04a0-kube-api-access-6dtbt\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.907928 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djc67\" (UniqueName: \"kubernetes.io/projected/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-kube-api-access-djc67\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.907940 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc194f90-df3e-4808-a5a6-8cbff23f04a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.907952 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc194f90-df3e-4808-a5a6-8cbff23f04a0-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.907966 5129 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fc194f90-df3e-4808-a5a6-8cbff23f04a0-logs\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.911297 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af1b5d6b-7c27-46cc-932c-cccec1fd8eca" (UID: "af1b5d6b-7c27-46cc-932c-cccec1fd8eca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.942749 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc194f90-df3e-4808-a5a6-8cbff23f04a0-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "fc194f90-df3e-4808-a5a6-8cbff23f04a0" (UID: "fc194f90-df3e-4808-a5a6-8cbff23f04a0"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.962019 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "af1b5d6b-7c27-46cc-932c-cccec1fd8eca" (UID: "af1b5d6b-7c27-46cc-932c-cccec1fd8eca"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:50:17 crc kubenswrapper[5129]: I0314 09:50:17.967011 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "af1b5d6b-7c27-46cc-932c-cccec1fd8eca" (UID: "af1b5d6b-7c27-46cc-932c-cccec1fd8eca"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.010442 5129 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc194f90-df3e-4808-a5a6-8cbff23f04a0-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.010507 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.010520 5129 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.010530 5129 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1b5d6b-7c27-46cc-932c-cccec1fd8eca-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.070192 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.122451 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.149306 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 09:50:18 crc kubenswrapper[5129]: E0314 09:50:18.149867 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc194f90-df3e-4808-a5a6-8cbff23f04a0" containerName="nova-metadata-metadata" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.149889 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc194f90-df3e-4808-a5a6-8cbff23f04a0" 
containerName="nova-metadata-metadata" Mar 14 09:50:18 crc kubenswrapper[5129]: E0314 09:50:18.149904 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1b5d6b-7c27-46cc-932c-cccec1fd8eca" containerName="nova-api-api" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.149912 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1b5d6b-7c27-46cc-932c-cccec1fd8eca" containerName="nova-api-api" Mar 14 09:50:18 crc kubenswrapper[5129]: E0314 09:50:18.149950 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc194f90-df3e-4808-a5a6-8cbff23f04a0" containerName="nova-metadata-log" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.149956 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc194f90-df3e-4808-a5a6-8cbff23f04a0" containerName="nova-metadata-log" Mar 14 09:50:18 crc kubenswrapper[5129]: E0314 09:50:18.149965 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1b5d6b-7c27-46cc-932c-cccec1fd8eca" containerName="nova-api-log" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.149971 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1b5d6b-7c27-46cc-932c-cccec1fd8eca" containerName="nova-api-log" Mar 14 09:50:18 crc kubenswrapper[5129]: E0314 09:50:18.149979 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03ba9afa-3daa-45ae-849c-ebfa4cbc08a5" containerName="nova-scheduler-scheduler" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.149985 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ba9afa-3daa-45ae-849c-ebfa4cbc08a5" containerName="nova-scheduler-scheduler" Mar 14 09:50:18 crc kubenswrapper[5129]: E0314 09:50:18.149993 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f79de3c-8258-4bcf-a312-057813424e32" containerName="nova-cell0-conductor-conductor" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.149999 5129 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6f79de3c-8258-4bcf-a312-057813424e32" containerName="nova-cell0-conductor-conductor" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.150197 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f79de3c-8258-4bcf-a312-057813424e32" containerName="nova-cell0-conductor-conductor" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.150213 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1b5d6b-7c27-46cc-932c-cccec1fd8eca" containerName="nova-api-api" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.150230 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc194f90-df3e-4808-a5a6-8cbff23f04a0" containerName="nova-metadata-metadata" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.150275 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1b5d6b-7c27-46cc-932c-cccec1fd8eca" containerName="nova-api-log" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.150284 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="03ba9afa-3daa-45ae-849c-ebfa4cbc08a5" containerName="nova-scheduler-scheduler" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.150293 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc194f90-df3e-4808-a5a6-8cbff23f04a0" containerName="nova-metadata-log" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.151104 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.153002 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.182107 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.217182 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1870a10-2b74-40a6-9906-b370d1a00d1b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b1870a10-2b74-40a6-9906-b370d1a00d1b\") " pod="openstack/nova-cell0-conductor-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.217323 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1870a10-2b74-40a6-9906-b370d1a00d1b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b1870a10-2b74-40a6-9906-b370d1a00d1b\") " pod="openstack/nova-cell0-conductor-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.217371 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrnff\" (UniqueName: \"kubernetes.io/projected/b1870a10-2b74-40a6-9906-b370d1a00d1b-kube-api-access-rrnff\") pod \"nova-cell0-conductor-0\" (UID: \"b1870a10-2b74-40a6-9906-b370d1a00d1b\") " pod="openstack/nova-cell0-conductor-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.319888 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1870a10-2b74-40a6-9906-b370d1a00d1b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b1870a10-2b74-40a6-9906-b370d1a00d1b\") " pod="openstack/nova-cell0-conductor-0" Mar 14 09:50:18 crc 
kubenswrapper[5129]: I0314 09:50:18.319982 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrnff\" (UniqueName: \"kubernetes.io/projected/b1870a10-2b74-40a6-9906-b370d1a00d1b-kube-api-access-rrnff\") pod \"nova-cell0-conductor-0\" (UID: \"b1870a10-2b74-40a6-9906-b370d1a00d1b\") " pod="openstack/nova-cell0-conductor-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.320089 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1870a10-2b74-40a6-9906-b370d1a00d1b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b1870a10-2b74-40a6-9906-b370d1a00d1b\") " pod="openstack/nova-cell0-conductor-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.329047 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1870a10-2b74-40a6-9906-b370d1a00d1b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b1870a10-2b74-40a6-9906-b370d1a00d1b\") " pod="openstack/nova-cell0-conductor-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.331562 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1870a10-2b74-40a6-9906-b370d1a00d1b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b1870a10-2b74-40a6-9906-b370d1a00d1b\") " pod="openstack/nova-cell0-conductor-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.339313 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrnff\" (UniqueName: \"kubernetes.io/projected/b1870a10-2b74-40a6-9906-b370d1a00d1b-kube-api-access-rrnff\") pod \"nova-cell0-conductor-0\" (UID: \"b1870a10-2b74-40a6-9906-b370d1a00d1b\") " pod="openstack/nova-cell0-conductor-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.475299 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.608708 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc194f90-df3e-4808-a5a6-8cbff23f04a0","Type":"ContainerDied","Data":"f8feb0337bea051e42aafdc8ef7645d42f7f95b44c47e1b553956263e88a3a80"} Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.609114 5129 scope.go:117] "RemoveContainer" containerID="180360d91d5ab64db48f11db1ed5a475e0f0bcbb71ddadbcad80fb051c73ecb0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.609272 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.620973 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"43a2480b-1483-449f-b24a-7d2213ad8a2f","Type":"ContainerStarted","Data":"d2b8a9e16a313dc1961cea0e320c292d9c1aca8e19fd755f5b8526f0c9d24867"} Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.621058 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.646650 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af1b5d6b-7c27-46cc-932c-cccec1fd8eca","Type":"ContainerDied","Data":"c433ca61254fc6f2e629a9a400f35f007ceab10a56bc9983afe3850841ddcdf0"} Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.646775 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.648702 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.674074 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.674199 5129 scope.go:117] "RemoveContainer" containerID="4bf71143b0559213036068f1f3d659044bcc0bf54ddd158b895ec54f9bae65f0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.695658 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.697592 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.697574635 podStartE2EDuration="3.697574635s" podCreationTimestamp="2026-03-14 09:50:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:50:18.678932749 +0000 UTC m=+10281.430847933" watchObservedRunningTime="2026-03-14 09:50:18.697574635 +0000 UTC m=+10281.449489819" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.809195 5129 scope.go:117] "RemoveContainer" containerID="cbdfb8b0ab3287febbf034c3a49f809061aa20d007aa0cba7d27620d21d4a9b7" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.819344 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.824860 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.837561 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.837852 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.855872 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.879928 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.885857 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363a41dc-efe8-4ae5-9939-e11a752eaa7f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"363a41dc-efe8-4ae5-9939-e11a752eaa7f\") " pod="openstack/nova-metadata-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.886065 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dngqx\" (UniqueName: \"kubernetes.io/projected/363a41dc-efe8-4ae5-9939-e11a752eaa7f-kube-api-access-dngqx\") pod \"nova-metadata-0\" (UID: \"363a41dc-efe8-4ae5-9939-e11a752eaa7f\") " pod="openstack/nova-metadata-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.886202 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/363a41dc-efe8-4ae5-9939-e11a752eaa7f-config-data\") pod \"nova-metadata-0\" (UID: \"363a41dc-efe8-4ae5-9939-e11a752eaa7f\") " pod="openstack/nova-metadata-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.886303 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/363a41dc-efe8-4ae5-9939-e11a752eaa7f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"363a41dc-efe8-4ae5-9939-e11a752eaa7f\") " pod="openstack/nova-metadata-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.886515 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/363a41dc-efe8-4ae5-9939-e11a752eaa7f-logs\") pod \"nova-metadata-0\" (UID: \"363a41dc-efe8-4ae5-9939-e11a752eaa7f\") " pod="openstack/nova-metadata-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.887009 5129 scope.go:117] "RemoveContainer" containerID="9e8a4770203afb39f79326205afcb095edbeea6008e03a7fe92d9458e7d894a6" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.898980 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.910786 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.913277 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.916756 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.930127 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.940668 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.950022 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.960860 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.963488 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.966166 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.966299 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.966560 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.976109 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.988427 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/363a41dc-efe8-4ae5-9939-e11a752eaa7f-config-data\") pod \"nova-metadata-0\" (UID: \"363a41dc-efe8-4ae5-9939-e11a752eaa7f\") " pod="openstack/nova-metadata-0" Mar 14 09:50:18 crc 
kubenswrapper[5129]: I0314 09:50:18.988788 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/363a41dc-efe8-4ae5-9939-e11a752eaa7f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"363a41dc-efe8-4ae5-9939-e11a752eaa7f\") " pod="openstack/nova-metadata-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.988943 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86be847c-e666-4563-86ae-9c3e5300fe45-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"86be847c-e666-4563-86ae-9c3e5300fe45\") " pod="openstack/nova-scheduler-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.988991 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/363a41dc-efe8-4ae5-9939-e11a752eaa7f-logs\") pod \"nova-metadata-0\" (UID: \"363a41dc-efe8-4ae5-9939-e11a752eaa7f\") " pod="openstack/nova-metadata-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.989022 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86be847c-e666-4563-86ae-9c3e5300fe45-config-data\") pod \"nova-scheduler-0\" (UID: \"86be847c-e666-4563-86ae-9c3e5300fe45\") " pod="openstack/nova-scheduler-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.989082 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363a41dc-efe8-4ae5-9939-e11a752eaa7f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"363a41dc-efe8-4ae5-9939-e11a752eaa7f\") " pod="openstack/nova-metadata-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.989109 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-22dx2\" (UniqueName: \"kubernetes.io/projected/86be847c-e666-4563-86ae-9c3e5300fe45-kube-api-access-22dx2\") pod \"nova-scheduler-0\" (UID: \"86be847c-e666-4563-86ae-9c3e5300fe45\") " pod="openstack/nova-scheduler-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.989135 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dngqx\" (UniqueName: \"kubernetes.io/projected/363a41dc-efe8-4ae5-9939-e11a752eaa7f-kube-api-access-dngqx\") pod \"nova-metadata-0\" (UID: \"363a41dc-efe8-4ae5-9939-e11a752eaa7f\") " pod="openstack/nova-metadata-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.989522 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/363a41dc-efe8-4ae5-9939-e11a752eaa7f-logs\") pod \"nova-metadata-0\" (UID: \"363a41dc-efe8-4ae5-9939-e11a752eaa7f\") " pod="openstack/nova-metadata-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.995891 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363a41dc-efe8-4ae5-9939-e11a752eaa7f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"363a41dc-efe8-4ae5-9939-e11a752eaa7f\") " pod="openstack/nova-metadata-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.996179 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/363a41dc-efe8-4ae5-9939-e11a752eaa7f-config-data\") pod \"nova-metadata-0\" (UID: \"363a41dc-efe8-4ae5-9939-e11a752eaa7f\") " pod="openstack/nova-metadata-0" Mar 14 09:50:18 crc kubenswrapper[5129]: I0314 09:50:18.996568 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/363a41dc-efe8-4ae5-9939-e11a752eaa7f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"363a41dc-efe8-4ae5-9939-e11a752eaa7f\") " 
pod="openstack/nova-metadata-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.005292 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dngqx\" (UniqueName: \"kubernetes.io/projected/363a41dc-efe8-4ae5-9939-e11a752eaa7f-kube-api-access-dngqx\") pod \"nova-metadata-0\" (UID: \"363a41dc-efe8-4ae5-9939-e11a752eaa7f\") " pod="openstack/nova-metadata-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.083426 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 09:50:19 crc kubenswrapper[5129]: W0314 09:50:19.084548 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1870a10_2b74_40a6_9906_b370d1a00d1b.slice/crio-a9bb54fc0e762904a678bc7b10aa251910d6239cdc191d398fcb29f6d2ea3aa0 WatchSource:0}: Error finding container a9bb54fc0e762904a678bc7b10aa251910d6239cdc191d398fcb29f6d2ea3aa0: Status 404 returned error can't find the container with id a9bb54fc0e762904a678bc7b10aa251910d6239cdc191d398fcb29f6d2ea3aa0 Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.093353 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f4f8c7-9270-42e8-aa7e-1ebe66a772e6-public-tls-certs\") pod \"nova-api-0\" (UID: \"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6\") " pod="openstack/nova-api-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.093404 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f4f8c7-9270-42e8-aa7e-1ebe66a772e6-config-data\") pod \"nova-api-0\" (UID: \"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6\") " pod="openstack/nova-api-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.093521 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/48f4f8c7-9270-42e8-aa7e-1ebe66a772e6-logs\") pod \"nova-api-0\" (UID: \"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6\") " pod="openstack/nova-api-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.093623 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f4f8c7-9270-42e8-aa7e-1ebe66a772e6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6\") " pod="openstack/nova-api-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.093681 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f4f8c7-9270-42e8-aa7e-1ebe66a772e6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6\") " pod="openstack/nova-api-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.093732 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86be847c-e666-4563-86ae-9c3e5300fe45-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"86be847c-e666-4563-86ae-9c3e5300fe45\") " pod="openstack/nova-scheduler-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.093806 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86be847c-e666-4563-86ae-9c3e5300fe45-config-data\") pod \"nova-scheduler-0\" (UID: \"86be847c-e666-4563-86ae-9c3e5300fe45\") " pod="openstack/nova-scheduler-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.093934 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rqtq\" (UniqueName: \"kubernetes.io/projected/48f4f8c7-9270-42e8-aa7e-1ebe66a772e6-kube-api-access-8rqtq\") pod \"nova-api-0\" (UID: 
\"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6\") " pod="openstack/nova-api-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.094188 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22dx2\" (UniqueName: \"kubernetes.io/projected/86be847c-e666-4563-86ae-9c3e5300fe45-kube-api-access-22dx2\") pod \"nova-scheduler-0\" (UID: \"86be847c-e666-4563-86ae-9c3e5300fe45\") " pod="openstack/nova-scheduler-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.098650 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86be847c-e666-4563-86ae-9c3e5300fe45-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"86be847c-e666-4563-86ae-9c3e5300fe45\") " pod="openstack/nova-scheduler-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.098757 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86be847c-e666-4563-86ae-9c3e5300fe45-config-data\") pod \"nova-scheduler-0\" (UID: \"86be847c-e666-4563-86ae-9c3e5300fe45\") " pod="openstack/nova-scheduler-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.117345 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22dx2\" (UniqueName: \"kubernetes.io/projected/86be847c-e666-4563-86ae-9c3e5300fe45-kube-api-access-22dx2\") pod \"nova-scheduler-0\" (UID: \"86be847c-e666-4563-86ae-9c3e5300fe45\") " pod="openstack/nova-scheduler-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.198576 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f4f8c7-9270-42e8-aa7e-1ebe66a772e6-public-tls-certs\") pod \"nova-api-0\" (UID: \"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6\") " pod="openstack/nova-api-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.198633 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f4f8c7-9270-42e8-aa7e-1ebe66a772e6-config-data\") pod \"nova-api-0\" (UID: \"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6\") " pod="openstack/nova-api-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.198674 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48f4f8c7-9270-42e8-aa7e-1ebe66a772e6-logs\") pod \"nova-api-0\" (UID: \"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6\") " pod="openstack/nova-api-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.198712 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f4f8c7-9270-42e8-aa7e-1ebe66a772e6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6\") " pod="openstack/nova-api-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.198736 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f4f8c7-9270-42e8-aa7e-1ebe66a772e6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6\") " pod="openstack/nova-api-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.198806 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rqtq\" (UniqueName: \"kubernetes.io/projected/48f4f8c7-9270-42e8-aa7e-1ebe66a772e6-kube-api-access-8rqtq\") pod \"nova-api-0\" (UID: \"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6\") " pod="openstack/nova-api-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.199144 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.203112 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f4f8c7-9270-42e8-aa7e-1ebe66a772e6-public-tls-certs\") pod \"nova-api-0\" (UID: \"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6\") " pod="openstack/nova-api-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.206185 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48f4f8c7-9270-42e8-aa7e-1ebe66a772e6-logs\") pod \"nova-api-0\" (UID: \"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6\") " pod="openstack/nova-api-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.208345 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f4f8c7-9270-42e8-aa7e-1ebe66a772e6-config-data\") pod \"nova-api-0\" (UID: \"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6\") " pod="openstack/nova-api-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.210459 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f4f8c7-9270-42e8-aa7e-1ebe66a772e6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6\") " pod="openstack/nova-api-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.210695 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f4f8c7-9270-42e8-aa7e-1ebe66a772e6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6\") " pod="openstack/nova-api-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.216260 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rqtq\" (UniqueName: 
\"kubernetes.io/projected/48f4f8c7-9270-42e8-aa7e-1ebe66a772e6-kube-api-access-8rqtq\") pod \"nova-api-0\" (UID: \"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6\") " pod="openstack/nova-api-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.789359 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 09:50:19 crc kubenswrapper[5129]: I0314 09:50:19.793407 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 09:50:20 crc kubenswrapper[5129]: I0314 09:50:19.868168 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b1870a10-2b74-40a6-9906-b370d1a00d1b","Type":"ContainerStarted","Data":"a9bb54fc0e762904a678bc7b10aa251910d6239cdc191d398fcb29f6d2ea3aa0"} Mar 14 09:50:20 crc kubenswrapper[5129]: I0314 09:50:20.098772 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03ba9afa-3daa-45ae-849c-ebfa4cbc08a5" path="/var/lib/kubelet/pods/03ba9afa-3daa-45ae-849c-ebfa4cbc08a5/volumes" Mar 14 09:50:20 crc kubenswrapper[5129]: I0314 09:50:20.099738 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f79de3c-8258-4bcf-a312-057813424e32" path="/var/lib/kubelet/pods/6f79de3c-8258-4bcf-a312-057813424e32/volumes" Mar 14 09:50:20 crc kubenswrapper[5129]: I0314 09:50:20.100344 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af1b5d6b-7c27-46cc-932c-cccec1fd8eca" path="/var/lib/kubelet/pods/af1b5d6b-7c27-46cc-932c-cccec1fd8eca/volumes" Mar 14 09:50:20 crc kubenswrapper[5129]: I0314 09:50:20.101512 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc194f90-df3e-4808-a5a6-8cbff23f04a0" path="/var/lib/kubelet/pods/fc194f90-df3e-4808-a5a6-8cbff23f04a0/volumes" Mar 14 09:50:20 crc kubenswrapper[5129]: W0314 09:50:20.824439 5129 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod363a41dc_efe8_4ae5_9939_e11a752eaa7f.slice/crio-f0ac77c4cea951e06bb36ac2873ccf83a19f8ae10c77133e438ab37962ab1c85 WatchSource:0}: Error finding container f0ac77c4cea951e06bb36ac2873ccf83a19f8ae10c77133e438ab37962ab1c85: Status 404 returned error can't find the container with id f0ac77c4cea951e06bb36ac2873ccf83a19f8ae10c77133e438ab37962ab1c85 Mar 14 09:50:20 crc kubenswrapper[5129]: I0314 09:50:20.827752 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 09:50:20 crc kubenswrapper[5129]: I0314 09:50:20.859681 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 09:50:20 crc kubenswrapper[5129]: I0314 09:50:20.871163 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 09:50:20 crc kubenswrapper[5129]: W0314 09:50:20.914409 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48f4f8c7_9270_42e8_aa7e_1ebe66a772e6.slice/crio-0e44b9e355bf5592536664ba975037504cd9222d1eedf7b0936210e9c85756c3 WatchSource:0}: Error finding container 0e44b9e355bf5592536664ba975037504cd9222d1eedf7b0936210e9c85756c3: Status 404 returned error can't find the container with id 0e44b9e355bf5592536664ba975037504cd9222d1eedf7b0936210e9c85756c3 Mar 14 09:50:20 crc kubenswrapper[5129]: I0314 09:50:20.919257 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"86be847c-e666-4563-86ae-9c3e5300fe45","Type":"ContainerStarted","Data":"34e59575fc05031669ba0f7f6d7d216ad4f51deee4a50d3cb9845305440fa2cd"} Mar 14 09:50:20 crc kubenswrapper[5129]: I0314 09:50:20.929271 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"363a41dc-efe8-4ae5-9939-e11a752eaa7f","Type":"ContainerStarted","Data":"f0ac77c4cea951e06bb36ac2873ccf83a19f8ae10c77133e438ab37962ab1c85"} Mar 14 09:50:20 crc kubenswrapper[5129]: I0314 09:50:20.936775 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b1870a10-2b74-40a6-9906-b370d1a00d1b","Type":"ContainerStarted","Data":"0c9b1fc87ec2d03b994186efe616eff2f6e974ec87a38663c0481b8e133c5530"} Mar 14 09:50:20 crc kubenswrapper[5129]: I0314 09:50:20.937032 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 14 09:50:20 crc kubenswrapper[5129]: I0314 09:50:20.956906 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.956884719 podStartE2EDuration="2.956884719s" podCreationTimestamp="2026-03-14 09:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:50:20.953912099 +0000 UTC m=+10283.705827283" watchObservedRunningTime="2026-03-14 09:50:20.956884719 +0000 UTC m=+10283.708799903" Mar 14 09:50:21 crc kubenswrapper[5129]: I0314 09:50:21.952395 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"86be847c-e666-4563-86ae-9c3e5300fe45","Type":"ContainerStarted","Data":"f185ab479c07c125ae0db60655b34ab9c5f1fa08fa14286282acc8811f0404d6"} Mar 14 09:50:21 crc kubenswrapper[5129]: I0314 09:50:21.957936 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6","Type":"ContainerStarted","Data":"e0c46683ab53d8ccbce75536efa3bb7a8f86f9ece2489878eaca355fbcd3142d"} Mar 14 09:50:21 crc kubenswrapper[5129]: I0314 09:50:21.958029 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6","Type":"ContainerStarted","Data":"cf736ae315ff0a149ea9b88db80cd122587245a39f00876eefebb5e1e728f1a8"} Mar 14 09:50:21 crc kubenswrapper[5129]: I0314 09:50:21.958046 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48f4f8c7-9270-42e8-aa7e-1ebe66a772e6","Type":"ContainerStarted","Data":"0e44b9e355bf5592536664ba975037504cd9222d1eedf7b0936210e9c85756c3"} Mar 14 09:50:21 crc kubenswrapper[5129]: I0314 09:50:21.961910 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"363a41dc-efe8-4ae5-9939-e11a752eaa7f","Type":"ContainerStarted","Data":"c10802424c2beb79a5f02be7efc53e55f03409db7ec9a13f988d34cf5ce6c285"} Mar 14 09:50:21 crc kubenswrapper[5129]: I0314 09:50:21.961970 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"363a41dc-efe8-4ae5-9939-e11a752eaa7f","Type":"ContainerStarted","Data":"280b3c3d81d04bf769980fe3ef2bb4f4a219e2caf64c655e1d0c1678d31e090d"} Mar 14 09:50:21 crc kubenswrapper[5129]: I0314 09:50:21.988143 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.988121042 podStartE2EDuration="3.988121042s" podCreationTimestamp="2026-03-14 09:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:50:21.983382893 +0000 UTC m=+10284.735298097" watchObservedRunningTime="2026-03-14 09:50:21.988121042 +0000 UTC m=+10284.740036226" Mar 14 09:50:22 crc kubenswrapper[5129]: I0314 09:50:22.051776 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.051760818 podStartE2EDuration="4.051760818s" podCreationTimestamp="2026-03-14 09:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-14 09:50:22.043242727 +0000 UTC m=+10284.795157921" watchObservedRunningTime="2026-03-14 09:50:22.051760818 +0000 UTC m=+10284.803676002" Mar 14 09:50:22 crc kubenswrapper[5129]: I0314 09:50:22.053016 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.053009273 podStartE2EDuration="4.053009273s" podCreationTimestamp="2026-03-14 09:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:50:22.017706584 +0000 UTC m=+10284.769621788" watchObservedRunningTime="2026-03-14 09:50:22.053009273 +0000 UTC m=+10284.804924457" Mar 14 09:50:24 crc kubenswrapper[5129]: I0314 09:50:24.795709 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 14 09:50:26 crc kubenswrapper[5129]: I0314 09:50:26.087352 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 14 09:50:28 crc kubenswrapper[5129]: I0314 09:50:28.507218 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 14 09:50:29 crc kubenswrapper[5129]: I0314 09:50:29.200575 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 14 09:50:29 crc kubenswrapper[5129]: I0314 09:50:29.205054 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 14 09:50:29 crc kubenswrapper[5129]: I0314 09:50:29.802879 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 14 09:50:29 crc kubenswrapper[5129]: I0314 09:50:29.802960 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 09:50:29 crc kubenswrapper[5129]: I0314 09:50:29.802974 5129 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 09:50:29 crc kubenswrapper[5129]: I0314 09:50:29.837797 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 14 09:50:30 crc kubenswrapper[5129]: I0314 09:50:30.114988 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 14 09:50:30 crc kubenswrapper[5129]: I0314 09:50:30.213790 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="363a41dc-efe8-4ae5-9939-e11a752eaa7f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.106:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 09:50:30 crc kubenswrapper[5129]: I0314 09:50:30.213796 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="363a41dc-efe8-4ae5-9939-e11a752eaa7f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.106:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 09:50:30 crc kubenswrapper[5129]: I0314 09:50:30.818797 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="48f4f8c7-9270-42e8-aa7e-1ebe66a772e6" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.108:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 09:50:30 crc kubenswrapper[5129]: I0314 09:50:30.818867 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="48f4f8c7-9270-42e8-aa7e-1ebe66a772e6" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.108:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 09:50:37 crc kubenswrapper[5129]: I0314 09:50:37.200591 5129 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 09:50:37 crc kubenswrapper[5129]: I0314 09:50:37.205795 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 09:50:37 crc kubenswrapper[5129]: I0314 09:50:37.753433 5129 scope.go:117] "RemoveContainer" containerID="f614b3538de930975380c263fa0bccecbf741da9aba92c25e74789d20933f386" Mar 14 09:50:37 crc kubenswrapper[5129]: I0314 09:50:37.795852 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 09:50:37 crc kubenswrapper[5129]: I0314 09:50:37.796194 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 09:50:37 crc kubenswrapper[5129]: I0314 09:50:37.836040 5129 scope.go:117] "RemoveContainer" containerID="472634747f52059da3d761b3a3bdd41094664ee0ace37406a9752c847b14d94e" Mar 14 09:50:39 crc kubenswrapper[5129]: I0314 09:50:39.207523 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 14 09:50:39 crc kubenswrapper[5129]: I0314 09:50:39.211019 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 14 09:50:39 crc kubenswrapper[5129]: I0314 09:50:39.213759 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 14 09:50:39 crc kubenswrapper[5129]: I0314 09:50:39.806045 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 09:50:39 crc kubenswrapper[5129]: I0314 09:50:39.807933 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 09:50:39 crc kubenswrapper[5129]: I0314 09:50:39.814226 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 09:50:40 crc kubenswrapper[5129]: I0314 09:50:40.223333 5129 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 14 09:50:40 crc kubenswrapper[5129]: I0314 09:50:40.228676 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.475802 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr"] Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.477970 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.480496 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.482757 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.483795 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t55bc" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.484023 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.484829 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.484953 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.485110 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.501141 5129 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr"] Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.568254 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.568300 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.568332 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.568358 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htgrn\" (UniqueName: \"kubernetes.io/projected/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-kube-api-access-htgrn\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.568389 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.568409 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.568454 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.568562 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: 
\"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.568588 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.568633 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.568666 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.672302 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: 
\"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.672550 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.672582 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.672645 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.672734 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.672758 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.672781 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.672824 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htgrn\" (UniqueName: \"kubernetes.io/projected/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-kube-api-access-htgrn\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.672850 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc 
kubenswrapper[5129]: I0314 09:50:41.672876 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.672903 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.674365 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.679947 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.680101 5129 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.680222 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.680494 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.680772 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.681189 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.681853 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.682944 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.685257 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.697781 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htgrn\" (UniqueName: \"kubernetes.io/projected/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-kube-api-access-htgrn\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:41 crc kubenswrapper[5129]: I0314 09:50:41.808412 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:50:42 crc kubenswrapper[5129]: I0314 09:50:42.424472 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr"] Mar 14 09:50:43 crc kubenswrapper[5129]: I0314 09:50:43.251933 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" event={"ID":"4dba6caf-580f-48ed-a541-cfa5b8d62d6d","Type":"ContainerStarted","Data":"cbea4dfb901a45c2a4d46f0a9a1e0f2646fad49a0962da69068077cb0dfe800c"} Mar 14 09:50:43 crc kubenswrapper[5129]: I0314 09:50:43.253306 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" event={"ID":"4dba6caf-580f-48ed-a541-cfa5b8d62d6d","Type":"ContainerStarted","Data":"5ad0cdea2564bba0827c5b3040b9c0a57409d7bcfd73b8e936db4f2ed3a50600"} Mar 14 09:51:19 crc kubenswrapper[5129]: I0314 09:51:19.574472 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:51:19 crc kubenswrapper[5129]: I0314 09:51:19.575046 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:51:49 crc kubenswrapper[5129]: I0314 09:51:49.574292 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:51:49 crc kubenswrapper[5129]: I0314 09:51:49.575155 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:51:50 crc kubenswrapper[5129]: I0314 09:51:50.822828 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" podStartSLOduration=69.399476382 podStartE2EDuration="1m9.822808336s" podCreationTimestamp="2026-03-14 09:50:41 +0000 UTC" firstStartedPulling="2026-03-14 09:50:42.428420199 +0000 UTC m=+10305.180335373" lastFinishedPulling="2026-03-14 09:50:42.851752143 +0000 UTC m=+10305.603667327" observedRunningTime="2026-03-14 09:50:43.273384172 +0000 UTC m=+10306.025299366" watchObservedRunningTime="2026-03-14 09:51:50.822808336 +0000 UTC m=+10373.574723520" Mar 14 09:51:50 crc kubenswrapper[5129]: I0314 09:51:50.829019 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wb7d8"] Mar 14 09:51:50 crc kubenswrapper[5129]: I0314 09:51:50.831404 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wb7d8" Mar 14 09:51:50 crc kubenswrapper[5129]: I0314 09:51:50.861251 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wb7d8"] Mar 14 09:51:50 crc kubenswrapper[5129]: I0314 09:51:50.902339 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09784c4d-fe01-4fc6-88c7-c91648818d4e-utilities\") pod \"community-operators-wb7d8\" (UID: \"09784c4d-fe01-4fc6-88c7-c91648818d4e\") " pod="openshift-marketplace/community-operators-wb7d8" Mar 14 09:51:50 crc kubenswrapper[5129]: I0314 09:51:50.902392 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09784c4d-fe01-4fc6-88c7-c91648818d4e-catalog-content\") pod \"community-operators-wb7d8\" (UID: \"09784c4d-fe01-4fc6-88c7-c91648818d4e\") " pod="openshift-marketplace/community-operators-wb7d8" Mar 14 09:51:50 crc kubenswrapper[5129]: I0314 09:51:50.902441 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwl7d\" (UniqueName: \"kubernetes.io/projected/09784c4d-fe01-4fc6-88c7-c91648818d4e-kube-api-access-bwl7d\") pod \"community-operators-wb7d8\" (UID: \"09784c4d-fe01-4fc6-88c7-c91648818d4e\") " pod="openshift-marketplace/community-operators-wb7d8" Mar 14 09:51:51 crc kubenswrapper[5129]: I0314 09:51:51.004092 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09784c4d-fe01-4fc6-88c7-c91648818d4e-utilities\") pod \"community-operators-wb7d8\" (UID: \"09784c4d-fe01-4fc6-88c7-c91648818d4e\") " pod="openshift-marketplace/community-operators-wb7d8" Mar 14 09:51:51 crc kubenswrapper[5129]: I0314 09:51:51.004141 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09784c4d-fe01-4fc6-88c7-c91648818d4e-catalog-content\") pod \"community-operators-wb7d8\" (UID: \"09784c4d-fe01-4fc6-88c7-c91648818d4e\") " pod="openshift-marketplace/community-operators-wb7d8" Mar 14 09:51:51 crc kubenswrapper[5129]: I0314 09:51:51.004191 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwl7d\" (UniqueName: \"kubernetes.io/projected/09784c4d-fe01-4fc6-88c7-c91648818d4e-kube-api-access-bwl7d\") pod \"community-operators-wb7d8\" (UID: \"09784c4d-fe01-4fc6-88c7-c91648818d4e\") " pod="openshift-marketplace/community-operators-wb7d8" Mar 14 09:51:51 crc kubenswrapper[5129]: I0314 09:51:51.005150 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09784c4d-fe01-4fc6-88c7-c91648818d4e-utilities\") pod \"community-operators-wb7d8\" (UID: \"09784c4d-fe01-4fc6-88c7-c91648818d4e\") " pod="openshift-marketplace/community-operators-wb7d8" Mar 14 09:51:51 crc kubenswrapper[5129]: I0314 09:51:51.005375 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09784c4d-fe01-4fc6-88c7-c91648818d4e-catalog-content\") pod \"community-operators-wb7d8\" (UID: \"09784c4d-fe01-4fc6-88c7-c91648818d4e\") " pod="openshift-marketplace/community-operators-wb7d8" Mar 14 09:51:51 crc kubenswrapper[5129]: I0314 09:51:51.028172 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwl7d\" (UniqueName: \"kubernetes.io/projected/09784c4d-fe01-4fc6-88c7-c91648818d4e-kube-api-access-bwl7d\") pod \"community-operators-wb7d8\" (UID: \"09784c4d-fe01-4fc6-88c7-c91648818d4e\") " pod="openshift-marketplace/community-operators-wb7d8" Mar 14 09:51:51 crc kubenswrapper[5129]: I0314 09:51:51.158733 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wb7d8" Mar 14 09:51:51 crc kubenswrapper[5129]: I0314 09:51:51.721805 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wb7d8"] Mar 14 09:51:51 crc kubenswrapper[5129]: W0314 09:51:51.723732 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09784c4d_fe01_4fc6_88c7_c91648818d4e.slice/crio-aca61283fba023fb2767bd351c6c8b4c025af312cc2a31f784d6fc6e4e05a36e WatchSource:0}: Error finding container aca61283fba023fb2767bd351c6c8b4c025af312cc2a31f784d6fc6e4e05a36e: Status 404 returned error can't find the container with id aca61283fba023fb2767bd351c6c8b4c025af312cc2a31f784d6fc6e4e05a36e Mar 14 09:51:52 crc kubenswrapper[5129]: I0314 09:51:52.242690 5129 generic.go:334] "Generic (PLEG): container finished" podID="09784c4d-fe01-4fc6-88c7-c91648818d4e" containerID="d4e5574962fa8b7c82e33b637bfc11c3189ffe53dcf2c873161d5e8f5ef6c8e0" exitCode=0 Mar 14 09:51:52 crc kubenswrapper[5129]: I0314 09:51:52.242768 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb7d8" event={"ID":"09784c4d-fe01-4fc6-88c7-c91648818d4e","Type":"ContainerDied","Data":"d4e5574962fa8b7c82e33b637bfc11c3189ffe53dcf2c873161d5e8f5ef6c8e0"} Mar 14 09:51:52 crc kubenswrapper[5129]: I0314 09:51:52.242813 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb7d8" event={"ID":"09784c4d-fe01-4fc6-88c7-c91648818d4e","Type":"ContainerStarted","Data":"aca61283fba023fb2767bd351c6c8b4c025af312cc2a31f784d6fc6e4e05a36e"} Mar 14 09:51:54 crc kubenswrapper[5129]: I0314 09:51:54.275048 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb7d8" 
event={"ID":"09784c4d-fe01-4fc6-88c7-c91648818d4e","Type":"ContainerStarted","Data":"8d3920aa03ffa332a23e891208dfb902ee341c45da82710592aa3f7cae34ec6e"} Mar 14 09:51:55 crc kubenswrapper[5129]: I0314 09:51:55.290743 5129 generic.go:334] "Generic (PLEG): container finished" podID="09784c4d-fe01-4fc6-88c7-c91648818d4e" containerID="8d3920aa03ffa332a23e891208dfb902ee341c45da82710592aa3f7cae34ec6e" exitCode=0 Mar 14 09:51:55 crc kubenswrapper[5129]: I0314 09:51:55.290888 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb7d8" event={"ID":"09784c4d-fe01-4fc6-88c7-c91648818d4e","Type":"ContainerDied","Data":"8d3920aa03ffa332a23e891208dfb902ee341c45da82710592aa3f7cae34ec6e"} Mar 14 09:51:56 crc kubenswrapper[5129]: I0314 09:51:56.305919 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb7d8" event={"ID":"09784c4d-fe01-4fc6-88c7-c91648818d4e","Type":"ContainerStarted","Data":"86f79b1a6ae1de1d901c75a2b3f168a25425a61b8d8d5bfad4ffa250ef5fe264"} Mar 14 09:51:56 crc kubenswrapper[5129]: I0314 09:51:56.331827 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wb7d8" podStartSLOduration=2.860312698 podStartE2EDuration="6.33180604s" podCreationTimestamp="2026-03-14 09:51:50 +0000 UTC" firstStartedPulling="2026-03-14 09:51:52.245927411 +0000 UTC m=+10374.997842635" lastFinishedPulling="2026-03-14 09:51:55.717420793 +0000 UTC m=+10378.469335977" observedRunningTime="2026-03-14 09:51:56.326112675 +0000 UTC m=+10379.078027879" watchObservedRunningTime="2026-03-14 09:51:56.33180604 +0000 UTC m=+10379.083721224" Mar 14 09:52:00 crc kubenswrapper[5129]: I0314 09:52:00.167785 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558032-j4lzr"] Mar 14 09:52:00 crc kubenswrapper[5129]: I0314 09:52:00.172076 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558032-j4lzr" Mar 14 09:52:00 crc kubenswrapper[5129]: I0314 09:52:00.175175 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:52:00 crc kubenswrapper[5129]: I0314 09:52:00.176056 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:52:00 crc kubenswrapper[5129]: I0314 09:52:00.179909 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:52:00 crc kubenswrapper[5129]: I0314 09:52:00.190030 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558032-j4lzr"] Mar 14 09:52:00 crc kubenswrapper[5129]: I0314 09:52:00.266953 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgzs9\" (UniqueName: \"kubernetes.io/projected/bdb6ec71-6b9f-452d-a8ab-634bb53bf916-kube-api-access-fgzs9\") pod \"auto-csr-approver-29558032-j4lzr\" (UID: \"bdb6ec71-6b9f-452d-a8ab-634bb53bf916\") " pod="openshift-infra/auto-csr-approver-29558032-j4lzr" Mar 14 09:52:00 crc kubenswrapper[5129]: I0314 09:52:00.369468 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgzs9\" (UniqueName: \"kubernetes.io/projected/bdb6ec71-6b9f-452d-a8ab-634bb53bf916-kube-api-access-fgzs9\") pod \"auto-csr-approver-29558032-j4lzr\" (UID: \"bdb6ec71-6b9f-452d-a8ab-634bb53bf916\") " pod="openshift-infra/auto-csr-approver-29558032-j4lzr" Mar 14 09:52:00 crc kubenswrapper[5129]: I0314 09:52:00.617885 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgzs9\" (UniqueName: \"kubernetes.io/projected/bdb6ec71-6b9f-452d-a8ab-634bb53bf916-kube-api-access-fgzs9\") pod \"auto-csr-approver-29558032-j4lzr\" (UID: \"bdb6ec71-6b9f-452d-a8ab-634bb53bf916\") " 
pod="openshift-infra/auto-csr-approver-29558032-j4lzr" Mar 14 09:52:00 crc kubenswrapper[5129]: I0314 09:52:00.808687 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558032-j4lzr" Mar 14 09:52:01 crc kubenswrapper[5129]: I0314 09:52:01.159307 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wb7d8" Mar 14 09:52:01 crc kubenswrapper[5129]: I0314 09:52:01.159788 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wb7d8" Mar 14 09:52:01 crc kubenswrapper[5129]: I0314 09:52:01.218500 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wb7d8" Mar 14 09:52:01 crc kubenswrapper[5129]: I0314 09:52:01.396404 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558032-j4lzr"] Mar 14 09:52:01 crc kubenswrapper[5129]: I0314 09:52:01.503292 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wb7d8" Mar 14 09:52:01 crc kubenswrapper[5129]: I0314 09:52:01.579486 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wb7d8"] Mar 14 09:52:02 crc kubenswrapper[5129]: I0314 09:52:02.402299 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558032-j4lzr" event={"ID":"bdb6ec71-6b9f-452d-a8ab-634bb53bf916","Type":"ContainerStarted","Data":"adecbad15f72006236c973cbb6501676b09048a736a1eb5dca0dd888b93fc0a1"} Mar 14 09:52:03 crc kubenswrapper[5129]: I0314 09:52:03.416419 5129 generic.go:334] "Generic (PLEG): container finished" podID="bdb6ec71-6b9f-452d-a8ab-634bb53bf916" containerID="66924eb9ecabc5f09f4a50773a0828a192d8213468b9c3288ca17170f5728c4d" exitCode=0 Mar 14 09:52:03 crc kubenswrapper[5129]: I0314 
09:52:03.416504 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558032-j4lzr" event={"ID":"bdb6ec71-6b9f-452d-a8ab-634bb53bf916","Type":"ContainerDied","Data":"66924eb9ecabc5f09f4a50773a0828a192d8213468b9c3288ca17170f5728c4d"} Mar 14 09:52:03 crc kubenswrapper[5129]: I0314 09:52:03.417333 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wb7d8" podUID="09784c4d-fe01-4fc6-88c7-c91648818d4e" containerName="registry-server" containerID="cri-o://86f79b1a6ae1de1d901c75a2b3f168a25425a61b8d8d5bfad4ffa250ef5fe264" gracePeriod=2 Mar 14 09:52:03 crc kubenswrapper[5129]: I0314 09:52:03.879493 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wb7d8" Mar 14 09:52:03 crc kubenswrapper[5129]: I0314 09:52:03.970838 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09784c4d-fe01-4fc6-88c7-c91648818d4e-catalog-content\") pod \"09784c4d-fe01-4fc6-88c7-c91648818d4e\" (UID: \"09784c4d-fe01-4fc6-88c7-c91648818d4e\") " Mar 14 09:52:03 crc kubenswrapper[5129]: I0314 09:52:03.971005 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09784c4d-fe01-4fc6-88c7-c91648818d4e-utilities\") pod \"09784c4d-fe01-4fc6-88c7-c91648818d4e\" (UID: \"09784c4d-fe01-4fc6-88c7-c91648818d4e\") " Mar 14 09:52:03 crc kubenswrapper[5129]: I0314 09:52:03.971036 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwl7d\" (UniqueName: \"kubernetes.io/projected/09784c4d-fe01-4fc6-88c7-c91648818d4e-kube-api-access-bwl7d\") pod \"09784c4d-fe01-4fc6-88c7-c91648818d4e\" (UID: \"09784c4d-fe01-4fc6-88c7-c91648818d4e\") " Mar 14 09:52:03 crc kubenswrapper[5129]: I0314 09:52:03.972686 5129 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09784c4d-fe01-4fc6-88c7-c91648818d4e-utilities" (OuterVolumeSpecName: "utilities") pod "09784c4d-fe01-4fc6-88c7-c91648818d4e" (UID: "09784c4d-fe01-4fc6-88c7-c91648818d4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:52:03 crc kubenswrapper[5129]: I0314 09:52:03.982981 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09784c4d-fe01-4fc6-88c7-c91648818d4e-kube-api-access-bwl7d" (OuterVolumeSpecName: "kube-api-access-bwl7d") pod "09784c4d-fe01-4fc6-88c7-c91648818d4e" (UID: "09784c4d-fe01-4fc6-88c7-c91648818d4e"). InnerVolumeSpecName "kube-api-access-bwl7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.055899 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09784c4d-fe01-4fc6-88c7-c91648818d4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09784c4d-fe01-4fc6-88c7-c91648818d4e" (UID: "09784c4d-fe01-4fc6-88c7-c91648818d4e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.074250 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09784c4d-fe01-4fc6-88c7-c91648818d4e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.074296 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09784c4d-fe01-4fc6-88c7-c91648818d4e-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.074309 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwl7d\" (UniqueName: \"kubernetes.io/projected/09784c4d-fe01-4fc6-88c7-c91648818d4e-kube-api-access-bwl7d\") on node \"crc\" DevicePath \"\"" Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.433854 5129 generic.go:334] "Generic (PLEG): container finished" podID="09784c4d-fe01-4fc6-88c7-c91648818d4e" containerID="86f79b1a6ae1de1d901c75a2b3f168a25425a61b8d8d5bfad4ffa250ef5fe264" exitCode=0 Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.434103 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wb7d8" Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.435072 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb7d8" event={"ID":"09784c4d-fe01-4fc6-88c7-c91648818d4e","Type":"ContainerDied","Data":"86f79b1a6ae1de1d901c75a2b3f168a25425a61b8d8d5bfad4ffa250ef5fe264"} Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.435123 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb7d8" event={"ID":"09784c4d-fe01-4fc6-88c7-c91648818d4e","Type":"ContainerDied","Data":"aca61283fba023fb2767bd351c6c8b4c025af312cc2a31f784d6fc6e4e05a36e"} Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.435142 5129 scope.go:117] "RemoveContainer" containerID="86f79b1a6ae1de1d901c75a2b3f168a25425a61b8d8d5bfad4ffa250ef5fe264" Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.484715 5129 scope.go:117] "RemoveContainer" containerID="8d3920aa03ffa332a23e891208dfb902ee341c45da82710592aa3f7cae34ec6e" Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.486673 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wb7d8"] Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.501893 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wb7d8"] Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.546335 5129 scope.go:117] "RemoveContainer" containerID="d4e5574962fa8b7c82e33b637bfc11c3189ffe53dcf2c873161d5e8f5ef6c8e0" Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.609766 5129 scope.go:117] "RemoveContainer" containerID="86f79b1a6ae1de1d901c75a2b3f168a25425a61b8d8d5bfad4ffa250ef5fe264" Mar 14 09:52:04 crc kubenswrapper[5129]: E0314 09:52:04.610549 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"86f79b1a6ae1de1d901c75a2b3f168a25425a61b8d8d5bfad4ffa250ef5fe264\": container with ID starting with 86f79b1a6ae1de1d901c75a2b3f168a25425a61b8d8d5bfad4ffa250ef5fe264 not found: ID does not exist" containerID="86f79b1a6ae1de1d901c75a2b3f168a25425a61b8d8d5bfad4ffa250ef5fe264" Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.610643 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86f79b1a6ae1de1d901c75a2b3f168a25425a61b8d8d5bfad4ffa250ef5fe264"} err="failed to get container status \"86f79b1a6ae1de1d901c75a2b3f168a25425a61b8d8d5bfad4ffa250ef5fe264\": rpc error: code = NotFound desc = could not find container \"86f79b1a6ae1de1d901c75a2b3f168a25425a61b8d8d5bfad4ffa250ef5fe264\": container with ID starting with 86f79b1a6ae1de1d901c75a2b3f168a25425a61b8d8d5bfad4ffa250ef5fe264 not found: ID does not exist" Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.610687 5129 scope.go:117] "RemoveContainer" containerID="8d3920aa03ffa332a23e891208dfb902ee341c45da82710592aa3f7cae34ec6e" Mar 14 09:52:04 crc kubenswrapper[5129]: E0314 09:52:04.611301 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d3920aa03ffa332a23e891208dfb902ee341c45da82710592aa3f7cae34ec6e\": container with ID starting with 8d3920aa03ffa332a23e891208dfb902ee341c45da82710592aa3f7cae34ec6e not found: ID does not exist" containerID="8d3920aa03ffa332a23e891208dfb902ee341c45da82710592aa3f7cae34ec6e" Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.611367 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d3920aa03ffa332a23e891208dfb902ee341c45da82710592aa3f7cae34ec6e"} err="failed to get container status \"8d3920aa03ffa332a23e891208dfb902ee341c45da82710592aa3f7cae34ec6e\": rpc error: code = NotFound desc = could not find container \"8d3920aa03ffa332a23e891208dfb902ee341c45da82710592aa3f7cae34ec6e\": container with ID 
starting with 8d3920aa03ffa332a23e891208dfb902ee341c45da82710592aa3f7cae34ec6e not found: ID does not exist" Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.611408 5129 scope.go:117] "RemoveContainer" containerID="d4e5574962fa8b7c82e33b637bfc11c3189ffe53dcf2c873161d5e8f5ef6c8e0" Mar 14 09:52:04 crc kubenswrapper[5129]: E0314 09:52:04.611849 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4e5574962fa8b7c82e33b637bfc11c3189ffe53dcf2c873161d5e8f5ef6c8e0\": container with ID starting with d4e5574962fa8b7c82e33b637bfc11c3189ffe53dcf2c873161d5e8f5ef6c8e0 not found: ID does not exist" containerID="d4e5574962fa8b7c82e33b637bfc11c3189ffe53dcf2c873161d5e8f5ef6c8e0" Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.611894 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4e5574962fa8b7c82e33b637bfc11c3189ffe53dcf2c873161d5e8f5ef6c8e0"} err="failed to get container status \"d4e5574962fa8b7c82e33b637bfc11c3189ffe53dcf2c873161d5e8f5ef6c8e0\": rpc error: code = NotFound desc = could not find container \"d4e5574962fa8b7c82e33b637bfc11c3189ffe53dcf2c873161d5e8f5ef6c8e0\": container with ID starting with d4e5574962fa8b7c82e33b637bfc11c3189ffe53dcf2c873161d5e8f5ef6c8e0 not found: ID does not exist" Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.916283 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558032-j4lzr" Mar 14 09:52:04 crc kubenswrapper[5129]: I0314 09:52:04.997285 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgzs9\" (UniqueName: \"kubernetes.io/projected/bdb6ec71-6b9f-452d-a8ab-634bb53bf916-kube-api-access-fgzs9\") pod \"bdb6ec71-6b9f-452d-a8ab-634bb53bf916\" (UID: \"bdb6ec71-6b9f-452d-a8ab-634bb53bf916\") " Mar 14 09:52:05 crc kubenswrapper[5129]: I0314 09:52:05.005160 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdb6ec71-6b9f-452d-a8ab-634bb53bf916-kube-api-access-fgzs9" (OuterVolumeSpecName: "kube-api-access-fgzs9") pod "bdb6ec71-6b9f-452d-a8ab-634bb53bf916" (UID: "bdb6ec71-6b9f-452d-a8ab-634bb53bf916"). InnerVolumeSpecName "kube-api-access-fgzs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:52:05 crc kubenswrapper[5129]: I0314 09:52:05.100862 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgzs9\" (UniqueName: \"kubernetes.io/projected/bdb6ec71-6b9f-452d-a8ab-634bb53bf916-kube-api-access-fgzs9\") on node \"crc\" DevicePath \"\"" Mar 14 09:52:05 crc kubenswrapper[5129]: I0314 09:52:05.449331 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558032-j4lzr" Mar 14 09:52:05 crc kubenswrapper[5129]: I0314 09:52:05.449380 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558032-j4lzr" event={"ID":"bdb6ec71-6b9f-452d-a8ab-634bb53bf916","Type":"ContainerDied","Data":"adecbad15f72006236c973cbb6501676b09048a736a1eb5dca0dd888b93fc0a1"} Mar 14 09:52:05 crc kubenswrapper[5129]: I0314 09:52:05.449467 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adecbad15f72006236c973cbb6501676b09048a736a1eb5dca0dd888b93fc0a1" Mar 14 09:52:06 crc kubenswrapper[5129]: I0314 09:52:06.012379 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558026-dsq9s"] Mar 14 09:52:06 crc kubenswrapper[5129]: I0314 09:52:06.026951 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558026-dsq9s"] Mar 14 09:52:06 crc kubenswrapper[5129]: I0314 09:52:06.055101 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09784c4d-fe01-4fc6-88c7-c91648818d4e" path="/var/lib/kubelet/pods/09784c4d-fe01-4fc6-88c7-c91648818d4e/volumes" Mar 14 09:52:06 crc kubenswrapper[5129]: I0314 09:52:06.056749 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c7bbc39-64bc-4d2f-b9c9-9a515ac64414" path="/var/lib/kubelet/pods/8c7bbc39-64bc-4d2f-b9c9-9a515ac64414/volumes" Mar 14 09:52:19 crc kubenswrapper[5129]: I0314 09:52:19.574196 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:52:19 crc kubenswrapper[5129]: I0314 09:52:19.575041 5129 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:52:19 crc kubenswrapper[5129]: I0314 09:52:19.575157 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 09:52:19 crc kubenswrapper[5129]: I0314 09:52:19.576414 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39ea69b02689a042635c743d7d2e440d716c42761508e3ed743fe8476218b55d"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:52:19 crc kubenswrapper[5129]: I0314 09:52:19.576506 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://39ea69b02689a042635c743d7d2e440d716c42761508e3ed743fe8476218b55d" gracePeriod=600 Mar 14 09:52:20 crc kubenswrapper[5129]: I0314 09:52:20.654887 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="39ea69b02689a042635c743d7d2e440d716c42761508e3ed743fe8476218b55d" exitCode=0 Mar 14 09:52:20 crc kubenswrapper[5129]: I0314 09:52:20.654985 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"39ea69b02689a042635c743d7d2e440d716c42761508e3ed743fe8476218b55d"} Mar 14 09:52:20 crc kubenswrapper[5129]: I0314 09:52:20.656060 5129 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd"} Mar 14 09:52:20 crc kubenswrapper[5129]: I0314 09:52:20.656112 5129 scope.go:117] "RemoveContainer" containerID="5ad265c7d3bf4de1ea8cb07ba8c6aa383d2b05f1e2d0443d21f387334f2214e6" Mar 14 09:52:38 crc kubenswrapper[5129]: I0314 09:52:38.094581 5129 scope.go:117] "RemoveContainer" containerID="726cbfa4d02721a7db3d794a673b33f7e05a56b25008eaa61cadbd47290ca253" Mar 14 09:53:42 crc kubenswrapper[5129]: I0314 09:53:42.780904 5129 generic.go:334] "Generic (PLEG): container finished" podID="4dba6caf-580f-48ed-a541-cfa5b8d62d6d" containerID="cbea4dfb901a45c2a4d46f0a9a1e0f2646fad49a0962da69068077cb0dfe800c" exitCode=0 Mar 14 09:53:42 crc kubenswrapper[5129]: I0314 09:53:42.780994 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" event={"ID":"4dba6caf-580f-48ed-a541-cfa5b8d62d6d","Type":"ContainerDied","Data":"cbea4dfb901a45c2a4d46f0a9a1e0f2646fad49a0962da69068077cb0dfe800c"} Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.386402 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.558054 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htgrn\" (UniqueName: \"kubernetes.io/projected/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-kube-api-access-htgrn\") pod \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.558135 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-3\") pod \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.558203 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-combined-ca-bundle\") pod \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.558254 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-migration-ssh-key-1\") pod \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.558305 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-ssh-key-openstack-cell1\") pod \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " Mar 14 09:53:44 crc 
kubenswrapper[5129]: I0314 09:53:44.558375 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-inventory\") pod \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.558452 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-1\") pod \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.558522 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-0\") pod \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.558576 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cells-global-config-0\") pod \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.558698 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-migration-ssh-key-0\") pod \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.558795 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-2\") pod \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\" (UID: \"4dba6caf-580f-48ed-a541-cfa5b8d62d6d\") " Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.565803 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "4dba6caf-580f-48ed-a541-cfa5b8d62d6d" (UID: "4dba6caf-580f-48ed-a541-cfa5b8d62d6d"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.565930 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-kube-api-access-htgrn" (OuterVolumeSpecName: "kube-api-access-htgrn") pod "4dba6caf-580f-48ed-a541-cfa5b8d62d6d" (UID: "4dba6caf-580f-48ed-a541-cfa5b8d62d6d"). InnerVolumeSpecName "kube-api-access-htgrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.594859 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "4dba6caf-580f-48ed-a541-cfa5b8d62d6d" (UID: "4dba6caf-580f-48ed-a541-cfa5b8d62d6d"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.598243 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "4dba6caf-580f-48ed-a541-cfa5b8d62d6d" (UID: "4dba6caf-580f-48ed-a541-cfa5b8d62d6d"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.602679 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "4dba6caf-580f-48ed-a541-cfa5b8d62d6d" (UID: "4dba6caf-580f-48ed-a541-cfa5b8d62d6d"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.608014 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "4dba6caf-580f-48ed-a541-cfa5b8d62d6d" (UID: "4dba6caf-580f-48ed-a541-cfa5b8d62d6d"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.613114 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "4dba6caf-580f-48ed-a541-cfa5b8d62d6d" (UID: "4dba6caf-580f-48ed-a541-cfa5b8d62d6d"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.613248 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-inventory" (OuterVolumeSpecName: "inventory") pod "4dba6caf-580f-48ed-a541-cfa5b8d62d6d" (UID: "4dba6caf-580f-48ed-a541-cfa5b8d62d6d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.615822 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "4dba6caf-580f-48ed-a541-cfa5b8d62d6d" (UID: "4dba6caf-580f-48ed-a541-cfa5b8d62d6d"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.626446 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "4dba6caf-580f-48ed-a541-cfa5b8d62d6d" (UID: "4dba6caf-580f-48ed-a541-cfa5b8d62d6d"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.631727 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "4dba6caf-580f-48ed-a541-cfa5b8d62d6d" (UID: "4dba6caf-580f-48ed-a541-cfa5b8d62d6d"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.661256 5129 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.661296 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htgrn\" (UniqueName: \"kubernetes.io/projected/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-kube-api-access-htgrn\") on node \"crc\" DevicePath \"\"" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.661308 5129 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.661318 5129 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.661327 5129 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.661336 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.661348 5129 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.661358 5129 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.661366 5129 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.661374 5129 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.661382 5129 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4dba6caf-580f-48ed-a541-cfa5b8d62d6d-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.830373 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" event={"ID":"4dba6caf-580f-48ed-a541-cfa5b8d62d6d","Type":"ContainerDied","Data":"5ad0cdea2564bba0827c5b3040b9c0a57409d7bcfd73b8e936db4f2ed3a50600"} Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.830435 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ad0cdea2564bba0827c5b3040b9c0a57409d7bcfd73b8e936db4f2ed3a50600" Mar 14 09:53:44 crc kubenswrapper[5129]: I0314 09:53:44.830586 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr" Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.152508 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558034-zbj58"] Mar 14 09:54:00 crc kubenswrapper[5129]: E0314 09:54:00.153823 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdb6ec71-6b9f-452d-a8ab-634bb53bf916" containerName="oc" Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.153838 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdb6ec71-6b9f-452d-a8ab-634bb53bf916" containerName="oc" Mar 14 09:54:00 crc kubenswrapper[5129]: E0314 09:54:00.153864 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dba6caf-580f-48ed-a541-cfa5b8d62d6d" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.153872 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dba6caf-580f-48ed-a541-cfa5b8d62d6d" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 14 09:54:00 crc kubenswrapper[5129]: E0314 09:54:00.153886 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09784c4d-fe01-4fc6-88c7-c91648818d4e" containerName="extract-content" Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.153893 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="09784c4d-fe01-4fc6-88c7-c91648818d4e" containerName="extract-content" Mar 14 09:54:00 crc kubenswrapper[5129]: E0314 09:54:00.153907 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09784c4d-fe01-4fc6-88c7-c91648818d4e" containerName="extract-utilities" Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.153916 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="09784c4d-fe01-4fc6-88c7-c91648818d4e" containerName="extract-utilities" Mar 14 09:54:00 crc kubenswrapper[5129]: E0314 09:54:00.153930 5129 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09784c4d-fe01-4fc6-88c7-c91648818d4e" containerName="registry-server" Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.153936 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="09784c4d-fe01-4fc6-88c7-c91648818d4e" containerName="registry-server" Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.154184 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dba6caf-580f-48ed-a541-cfa5b8d62d6d" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.154197 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdb6ec71-6b9f-452d-a8ab-634bb53bf916" containerName="oc" Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.154207 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="09784c4d-fe01-4fc6-88c7-c91648818d4e" containerName="registry-server" Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.155061 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558034-zbj58" Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.158891 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.160765 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.162780 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.164140 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558034-zbj58"] Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.292481 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69c7b\" (UniqueName: \"kubernetes.io/projected/f1f00c80-6977-43bf-9cb7-b96309f47f59-kube-api-access-69c7b\") pod \"auto-csr-approver-29558034-zbj58\" (UID: \"f1f00c80-6977-43bf-9cb7-b96309f47f59\") " pod="openshift-infra/auto-csr-approver-29558034-zbj58" Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.395667 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69c7b\" (UniqueName: \"kubernetes.io/projected/f1f00c80-6977-43bf-9cb7-b96309f47f59-kube-api-access-69c7b\") pod \"auto-csr-approver-29558034-zbj58\" (UID: \"f1f00c80-6977-43bf-9cb7-b96309f47f59\") " pod="openshift-infra/auto-csr-approver-29558034-zbj58" Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.428329 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69c7b\" (UniqueName: \"kubernetes.io/projected/f1f00c80-6977-43bf-9cb7-b96309f47f59-kube-api-access-69c7b\") pod \"auto-csr-approver-29558034-zbj58\" (UID: \"f1f00c80-6977-43bf-9cb7-b96309f47f59\") " 
pod="openshift-infra/auto-csr-approver-29558034-zbj58" Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.485247 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558034-zbj58" Mar 14 09:54:00 crc kubenswrapper[5129]: I0314 09:54:00.987716 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558034-zbj58"] Mar 14 09:54:01 crc kubenswrapper[5129]: I0314 09:54:01.000485 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:54:01 crc kubenswrapper[5129]: I0314 09:54:01.046319 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558034-zbj58" event={"ID":"f1f00c80-6977-43bf-9cb7-b96309f47f59","Type":"ContainerStarted","Data":"fc4cbfe7bd24351135f53f102961a87f0a8ac58aec14e4d5c0e3c12dda35564e"} Mar 14 09:54:03 crc kubenswrapper[5129]: I0314 09:54:03.079128 5129 generic.go:334] "Generic (PLEG): container finished" podID="f1f00c80-6977-43bf-9cb7-b96309f47f59" containerID="3abd1062ee36117426b22332f9496a68e9035f34af79d86307178259309c1032" exitCode=0 Mar 14 09:54:03 crc kubenswrapper[5129]: I0314 09:54:03.079214 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558034-zbj58" event={"ID":"f1f00c80-6977-43bf-9cb7-b96309f47f59","Type":"ContainerDied","Data":"3abd1062ee36117426b22332f9496a68e9035f34af79d86307178259309c1032"} Mar 14 09:54:05 crc kubenswrapper[5129]: I0314 09:54:05.134835 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558034-zbj58" event={"ID":"f1f00c80-6977-43bf-9cb7-b96309f47f59","Type":"ContainerDied","Data":"fc4cbfe7bd24351135f53f102961a87f0a8ac58aec14e4d5c0e3c12dda35564e"} Mar 14 09:54:05 crc kubenswrapper[5129]: I0314 09:54:05.136635 5129 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="fc4cbfe7bd24351135f53f102961a87f0a8ac58aec14e4d5c0e3c12dda35564e" Mar 14 09:54:05 crc kubenswrapper[5129]: I0314 09:54:05.169438 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558034-zbj58" Mar 14 09:54:05 crc kubenswrapper[5129]: I0314 09:54:05.338265 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69c7b\" (UniqueName: \"kubernetes.io/projected/f1f00c80-6977-43bf-9cb7-b96309f47f59-kube-api-access-69c7b\") pod \"f1f00c80-6977-43bf-9cb7-b96309f47f59\" (UID: \"f1f00c80-6977-43bf-9cb7-b96309f47f59\") " Mar 14 09:54:05 crc kubenswrapper[5129]: I0314 09:54:05.347219 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f00c80-6977-43bf-9cb7-b96309f47f59-kube-api-access-69c7b" (OuterVolumeSpecName: "kube-api-access-69c7b") pod "f1f00c80-6977-43bf-9cb7-b96309f47f59" (UID: "f1f00c80-6977-43bf-9cb7-b96309f47f59"). InnerVolumeSpecName "kube-api-access-69c7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:54:05 crc kubenswrapper[5129]: I0314 09:54:05.442135 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69c7b\" (UniqueName: \"kubernetes.io/projected/f1f00c80-6977-43bf-9cb7-b96309f47f59-kube-api-access-69c7b\") on node \"crc\" DevicePath \"\"" Mar 14 09:54:06 crc kubenswrapper[5129]: I0314 09:54:06.147576 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558034-zbj58" Mar 14 09:54:06 crc kubenswrapper[5129]: I0314 09:54:06.264919 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558028-9c48n"] Mar 14 09:54:06 crc kubenswrapper[5129]: I0314 09:54:06.282127 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558028-9c48n"] Mar 14 09:54:08 crc kubenswrapper[5129]: I0314 09:54:08.052285 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e3134ce-77da-48c0-a333-5ba15fc92075" path="/var/lib/kubelet/pods/7e3134ce-77da-48c0-a333-5ba15fc92075/volumes" Mar 14 09:54:19 crc kubenswrapper[5129]: I0314 09:54:19.574669 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:54:19 crc kubenswrapper[5129]: I0314 09:54:19.575351 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:54:38 crc kubenswrapper[5129]: I0314 09:54:38.258240 5129 scope.go:117] "RemoveContainer" containerID="6bcc3d092f44b453c23c9935b660f292c545da33bae0cf773f0de7c6e3cbfa65" Mar 14 09:54:49 crc kubenswrapper[5129]: I0314 09:54:49.574302 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:54:49 crc kubenswrapper[5129]: 
I0314 09:54:49.574851 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:55:19 crc kubenswrapper[5129]: I0314 09:55:19.574661 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:55:19 crc kubenswrapper[5129]: I0314 09:55:19.576357 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:55:19 crc kubenswrapper[5129]: I0314 09:55:19.576525 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 09:55:19 crc kubenswrapper[5129]: I0314 09:55:19.577728 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:55:19 crc kubenswrapper[5129]: I0314 09:55:19.577883 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" 
containerName="machine-config-daemon" containerID="cri-o://703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" gracePeriod=600 Mar 14 09:55:19 crc kubenswrapper[5129]: E0314 09:55:19.703282 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:55:20 crc kubenswrapper[5129]: I0314 09:55:20.011629 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" exitCode=0 Mar 14 09:55:20 crc kubenswrapper[5129]: I0314 09:55:20.011870 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd"} Mar 14 09:55:20 crc kubenswrapper[5129]: I0314 09:55:20.011992 5129 scope.go:117] "RemoveContainer" containerID="39ea69b02689a042635c743d7d2e440d716c42761508e3ed743fe8476218b55d" Mar 14 09:55:20 crc kubenswrapper[5129]: I0314 09:55:20.012825 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:55:20 crc kubenswrapper[5129]: E0314 09:55:20.013194 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:55:32 crc kubenswrapper[5129]: I0314 09:55:32.037096 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:55:32 crc kubenswrapper[5129]: E0314 09:55:32.037923 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:55:34 crc kubenswrapper[5129]: I0314 09:55:34.888352 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Mar 14 09:55:34 crc kubenswrapper[5129]: I0314 09:55:34.889501 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="e61aa93b-36ea-424e-b43d-ff07a45e91f5" containerName="adoption" containerID="cri-o://00969cec8b7b4cd1545e12370a7b226bcdc8afe3505183d3ca8595bf51c5b5e8" gracePeriod=30 Mar 14 09:55:44 crc kubenswrapper[5129]: I0314 09:55:44.037469 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:55:44 crc kubenswrapper[5129]: E0314 09:55:44.039072 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:55:55 crc kubenswrapper[5129]: 
I0314 09:55:55.036509 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:55:55 crc kubenswrapper[5129]: E0314 09:55:55.037505 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:56:00 crc kubenswrapper[5129]: I0314 09:56:00.184971 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558036-bvw5m"] Mar 14 09:56:00 crc kubenswrapper[5129]: E0314 09:56:00.186806 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f00c80-6977-43bf-9cb7-b96309f47f59" containerName="oc" Mar 14 09:56:00 crc kubenswrapper[5129]: I0314 09:56:00.186846 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f00c80-6977-43bf-9cb7-b96309f47f59" containerName="oc" Mar 14 09:56:00 crc kubenswrapper[5129]: I0314 09:56:00.187506 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f00c80-6977-43bf-9cb7-b96309f47f59" containerName="oc" Mar 14 09:56:00 crc kubenswrapper[5129]: I0314 09:56:00.188950 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558036-bvw5m" Mar 14 09:56:00 crc kubenswrapper[5129]: I0314 09:56:00.191776 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:56:00 crc kubenswrapper[5129]: I0314 09:56:00.192368 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:56:00 crc kubenswrapper[5129]: I0314 09:56:00.197683 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:56:00 crc kubenswrapper[5129]: I0314 09:56:00.207704 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558036-bvw5m"] Mar 14 09:56:00 crc kubenswrapper[5129]: I0314 09:56:00.334717 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wthch\" (UniqueName: \"kubernetes.io/projected/bb247d9c-ddba-4c65-9a64-cc5382227f14-kube-api-access-wthch\") pod \"auto-csr-approver-29558036-bvw5m\" (UID: \"bb247d9c-ddba-4c65-9a64-cc5382227f14\") " pod="openshift-infra/auto-csr-approver-29558036-bvw5m" Mar 14 09:56:00 crc kubenswrapper[5129]: I0314 09:56:00.437139 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wthch\" (UniqueName: \"kubernetes.io/projected/bb247d9c-ddba-4c65-9a64-cc5382227f14-kube-api-access-wthch\") pod \"auto-csr-approver-29558036-bvw5m\" (UID: \"bb247d9c-ddba-4c65-9a64-cc5382227f14\") " pod="openshift-infra/auto-csr-approver-29558036-bvw5m" Mar 14 09:56:00 crc kubenswrapper[5129]: I0314 09:56:00.470167 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wthch\" (UniqueName: \"kubernetes.io/projected/bb247d9c-ddba-4c65-9a64-cc5382227f14-kube-api-access-wthch\") pod \"auto-csr-approver-29558036-bvw5m\" (UID: \"bb247d9c-ddba-4c65-9a64-cc5382227f14\") " 
pod="openshift-infra/auto-csr-approver-29558036-bvw5m" Mar 14 09:56:00 crc kubenswrapper[5129]: I0314 09:56:00.539009 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558036-bvw5m" Mar 14 09:56:01 crc kubenswrapper[5129]: I0314 09:56:01.036112 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558036-bvw5m"] Mar 14 09:56:01 crc kubenswrapper[5129]: W0314 09:56:01.040439 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb247d9c_ddba_4c65_9a64_cc5382227f14.slice/crio-6bfd2b05a7311c16cbc264dd58fe0d2ca50ad11e026e8b57884b4949e4a95331 WatchSource:0}: Error finding container 6bfd2b05a7311c16cbc264dd58fe0d2ca50ad11e026e8b57884b4949e4a95331: Status 404 returned error can't find the container with id 6bfd2b05a7311c16cbc264dd58fe0d2ca50ad11e026e8b57884b4949e4a95331 Mar 14 09:56:01 crc kubenswrapper[5129]: I0314 09:56:01.536397 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558036-bvw5m" event={"ID":"bb247d9c-ddba-4c65-9a64-cc5382227f14","Type":"ContainerStarted","Data":"6bfd2b05a7311c16cbc264dd58fe0d2ca50ad11e026e8b57884b4949e4a95331"} Mar 14 09:56:02 crc kubenswrapper[5129]: I0314 09:56:02.549271 5129 generic.go:334] "Generic (PLEG): container finished" podID="bb247d9c-ddba-4c65-9a64-cc5382227f14" containerID="a97bd809eb10dc5907746765db0efedde2eb933e73ec0d722150b4ff9d080f21" exitCode=0 Mar 14 09:56:02 crc kubenswrapper[5129]: I0314 09:56:02.549367 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558036-bvw5m" event={"ID":"bb247d9c-ddba-4c65-9a64-cc5382227f14","Type":"ContainerDied","Data":"a97bd809eb10dc5907746765db0efedde2eb933e73ec0d722150b4ff9d080f21"} Mar 14 09:56:04 crc kubenswrapper[5129]: I0314 09:56:04.050488 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558036-bvw5m" Mar 14 09:56:04 crc kubenswrapper[5129]: I0314 09:56:04.174091 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wthch\" (UniqueName: \"kubernetes.io/projected/bb247d9c-ddba-4c65-9a64-cc5382227f14-kube-api-access-wthch\") pod \"bb247d9c-ddba-4c65-9a64-cc5382227f14\" (UID: \"bb247d9c-ddba-4c65-9a64-cc5382227f14\") " Mar 14 09:56:04 crc kubenswrapper[5129]: I0314 09:56:04.182707 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb247d9c-ddba-4c65-9a64-cc5382227f14-kube-api-access-wthch" (OuterVolumeSpecName: "kube-api-access-wthch") pod "bb247d9c-ddba-4c65-9a64-cc5382227f14" (UID: "bb247d9c-ddba-4c65-9a64-cc5382227f14"). InnerVolumeSpecName "kube-api-access-wthch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:56:04 crc kubenswrapper[5129]: I0314 09:56:04.276409 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wthch\" (UniqueName: \"kubernetes.io/projected/bb247d9c-ddba-4c65-9a64-cc5382227f14-kube-api-access-wthch\") on node \"crc\" DevicePath \"\"" Mar 14 09:56:04 crc kubenswrapper[5129]: I0314 09:56:04.584494 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558036-bvw5m" event={"ID":"bb247d9c-ddba-4c65-9a64-cc5382227f14","Type":"ContainerDied","Data":"6bfd2b05a7311c16cbc264dd58fe0d2ca50ad11e026e8b57884b4949e4a95331"} Mar 14 09:56:04 crc kubenswrapper[5129]: I0314 09:56:04.584575 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bfd2b05a7311c16cbc264dd58fe0d2ca50ad11e026e8b57884b4949e4a95331" Mar 14 09:56:04 crc kubenswrapper[5129]: I0314 09:56:04.584640 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558036-bvw5m" Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.163380 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558030-l827m"] Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.175985 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558030-l827m"] Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.537121 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.600038 5129 generic.go:334] "Generic (PLEG): container finished" podID="e61aa93b-36ea-424e-b43d-ff07a45e91f5" containerID="00969cec8b7b4cd1545e12370a7b226bcdc8afe3505183d3ca8595bf51c5b5e8" exitCode=137 Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.600119 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e61aa93b-36ea-424e-b43d-ff07a45e91f5","Type":"ContainerDied","Data":"00969cec8b7b4cd1545e12370a7b226bcdc8afe3505183d3ca8595bf51c5b5e8"} Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.600167 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.600204 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e61aa93b-36ea-424e-b43d-ff07a45e91f5","Type":"ContainerDied","Data":"a432328f72157ceffec8575fc3e687d500ae043a3ba78122ff461f8cd183a68b"} Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.600233 5129 scope.go:117] "RemoveContainer" containerID="00969cec8b7b4cd1545e12370a7b226bcdc8afe3505183d3ca8595bf51c5b5e8" Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.631698 5129 scope.go:117] "RemoveContainer" containerID="00969cec8b7b4cd1545e12370a7b226bcdc8afe3505183d3ca8595bf51c5b5e8" Mar 14 09:56:05 crc kubenswrapper[5129]: E0314 09:56:05.632368 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00969cec8b7b4cd1545e12370a7b226bcdc8afe3505183d3ca8595bf51c5b5e8\": container with ID starting with 00969cec8b7b4cd1545e12370a7b226bcdc8afe3505183d3ca8595bf51c5b5e8 not found: ID does not exist" containerID="00969cec8b7b4cd1545e12370a7b226bcdc8afe3505183d3ca8595bf51c5b5e8" Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.632406 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00969cec8b7b4cd1545e12370a7b226bcdc8afe3505183d3ca8595bf51c5b5e8"} err="failed to get container status \"00969cec8b7b4cd1545e12370a7b226bcdc8afe3505183d3ca8595bf51c5b5e8\": rpc error: code = NotFound desc = could not find container \"00969cec8b7b4cd1545e12370a7b226bcdc8afe3505183d3ca8595bf51c5b5e8\": container with ID starting with 00969cec8b7b4cd1545e12370a7b226bcdc8afe3505183d3ca8595bf51c5b5e8 not found: ID does not exist" Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.721716 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx2kx\" (UniqueName: 
\"kubernetes.io/projected/e61aa93b-36ea-424e-b43d-ff07a45e91f5-kube-api-access-tx2kx\") pod \"e61aa93b-36ea-424e-b43d-ff07a45e91f5\" (UID: \"e61aa93b-36ea-424e-b43d-ff07a45e91f5\") " Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.722708 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2c402f6-3b85-47df-ad54-4b5bcf5f6852\") pod \"e61aa93b-36ea-424e-b43d-ff07a45e91f5\" (UID: \"e61aa93b-36ea-424e-b43d-ff07a45e91f5\") " Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.730213 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e61aa93b-36ea-424e-b43d-ff07a45e91f5-kube-api-access-tx2kx" (OuterVolumeSpecName: "kube-api-access-tx2kx") pod "e61aa93b-36ea-424e-b43d-ff07a45e91f5" (UID: "e61aa93b-36ea-424e-b43d-ff07a45e91f5"). InnerVolumeSpecName "kube-api-access-tx2kx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.743216 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2c402f6-3b85-47df-ad54-4b5bcf5f6852" (OuterVolumeSpecName: "mariadb-data") pod "e61aa93b-36ea-424e-b43d-ff07a45e91f5" (UID: "e61aa93b-36ea-424e-b43d-ff07a45e91f5"). InnerVolumeSpecName "pvc-d2c402f6-3b85-47df-ad54-4b5bcf5f6852". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.825464 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx2kx\" (UniqueName: \"kubernetes.io/projected/e61aa93b-36ea-424e-b43d-ff07a45e91f5-kube-api-access-tx2kx\") on node \"crc\" DevicePath \"\"" Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.825553 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d2c402f6-3b85-47df-ad54-4b5bcf5f6852\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2c402f6-3b85-47df-ad54-4b5bcf5f6852\") on node \"crc\" " Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.851237 5129 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.851593 5129 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d2c402f6-3b85-47df-ad54-4b5bcf5f6852" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2c402f6-3b85-47df-ad54-4b5bcf5f6852") on node "crc" Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.930870 5129 reconciler_common.go:293] "Volume detached for volume \"pvc-d2c402f6-3b85-47df-ad54-4b5bcf5f6852\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2c402f6-3b85-47df-ad54-4b5bcf5f6852\") on node \"crc\" DevicePath \"\"" Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.959264 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Mar 14 09:56:05 crc kubenswrapper[5129]: I0314 09:56:05.972546 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Mar 14 09:56:06 crc kubenswrapper[5129]: I0314 09:56:06.036097 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:56:06 crc kubenswrapper[5129]: E0314 
09:56:06.036378 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:56:06 crc kubenswrapper[5129]: I0314 09:56:06.053878 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25" path="/var/lib/kubelet/pods/c4d00e8e-a3ff-464e-87bf-b5cfbd4d0c25/volumes" Mar 14 09:56:06 crc kubenswrapper[5129]: I0314 09:56:06.054665 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e61aa93b-36ea-424e-b43d-ff07a45e91f5" path="/var/lib/kubelet/pods/e61aa93b-36ea-424e-b43d-ff07a45e91f5/volumes" Mar 14 09:56:06 crc kubenswrapper[5129]: I0314 09:56:06.615761 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Mar 14 09:56:06 crc kubenswrapper[5129]: I0314 09:56:06.615944 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="25dd8792-1393-4803-9806-21f7292348fa" containerName="adoption" containerID="cri-o://3a0f42f7bf412981c76a99dff2bc54b03126680968e96ba6091f2fdc212216a1" gracePeriod=30 Mar 14 09:56:21 crc kubenswrapper[5129]: I0314 09:56:21.037964 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:56:21 crc kubenswrapper[5129]: E0314 09:56:21.039490 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:56:33 crc kubenswrapper[5129]: I0314 09:56:33.206713 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sbxh6"] Mar 14 09:56:33 crc kubenswrapper[5129]: E0314 09:56:33.208940 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e61aa93b-36ea-424e-b43d-ff07a45e91f5" containerName="adoption" Mar 14 09:56:33 crc kubenswrapper[5129]: I0314 09:56:33.208978 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e61aa93b-36ea-424e-b43d-ff07a45e91f5" containerName="adoption" Mar 14 09:56:33 crc kubenswrapper[5129]: E0314 09:56:33.208990 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb247d9c-ddba-4c65-9a64-cc5382227f14" containerName="oc" Mar 14 09:56:33 crc kubenswrapper[5129]: I0314 09:56:33.212575 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb247d9c-ddba-4c65-9a64-cc5382227f14" containerName="oc" Mar 14 09:56:33 crc kubenswrapper[5129]: I0314 09:56:33.213472 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e61aa93b-36ea-424e-b43d-ff07a45e91f5" containerName="adoption" Mar 14 09:56:33 crc kubenswrapper[5129]: I0314 09:56:33.213533 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb247d9c-ddba-4c65-9a64-cc5382227f14" containerName="oc" Mar 14 09:56:33 crc kubenswrapper[5129]: I0314 09:56:33.215913 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbxh6" Mar 14 09:56:33 crc kubenswrapper[5129]: I0314 09:56:33.238045 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbxh6"] Mar 14 09:56:33 crc kubenswrapper[5129]: I0314 09:56:33.327034 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e7ae695-5887-4261-9583-b03e376f8399-utilities\") pod \"redhat-marketplace-sbxh6\" (UID: \"0e7ae695-5887-4261-9583-b03e376f8399\") " pod="openshift-marketplace/redhat-marketplace-sbxh6" Mar 14 09:56:33 crc kubenswrapper[5129]: I0314 09:56:33.327184 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhjnn\" (UniqueName: \"kubernetes.io/projected/0e7ae695-5887-4261-9583-b03e376f8399-kube-api-access-dhjnn\") pod \"redhat-marketplace-sbxh6\" (UID: \"0e7ae695-5887-4261-9583-b03e376f8399\") " pod="openshift-marketplace/redhat-marketplace-sbxh6" Mar 14 09:56:33 crc kubenswrapper[5129]: I0314 09:56:33.327256 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e7ae695-5887-4261-9583-b03e376f8399-catalog-content\") pod \"redhat-marketplace-sbxh6\" (UID: \"0e7ae695-5887-4261-9583-b03e376f8399\") " pod="openshift-marketplace/redhat-marketplace-sbxh6" Mar 14 09:56:33 crc kubenswrapper[5129]: I0314 09:56:33.429861 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e7ae695-5887-4261-9583-b03e376f8399-utilities\") pod \"redhat-marketplace-sbxh6\" (UID: \"0e7ae695-5887-4261-9583-b03e376f8399\") " pod="openshift-marketplace/redhat-marketplace-sbxh6" Mar 14 09:56:33 crc kubenswrapper[5129]: I0314 09:56:33.431005 5129 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-dhjnn\" (UniqueName: \"kubernetes.io/projected/0e7ae695-5887-4261-9583-b03e376f8399-kube-api-access-dhjnn\") pod \"redhat-marketplace-sbxh6\" (UID: \"0e7ae695-5887-4261-9583-b03e376f8399\") " pod="openshift-marketplace/redhat-marketplace-sbxh6" Mar 14 09:56:33 crc kubenswrapper[5129]: I0314 09:56:33.431123 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e7ae695-5887-4261-9583-b03e376f8399-catalog-content\") pod \"redhat-marketplace-sbxh6\" (UID: \"0e7ae695-5887-4261-9583-b03e376f8399\") " pod="openshift-marketplace/redhat-marketplace-sbxh6" Mar 14 09:56:33 crc kubenswrapper[5129]: I0314 09:56:33.431463 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e7ae695-5887-4261-9583-b03e376f8399-catalog-content\") pod \"redhat-marketplace-sbxh6\" (UID: \"0e7ae695-5887-4261-9583-b03e376f8399\") " pod="openshift-marketplace/redhat-marketplace-sbxh6" Mar 14 09:56:33 crc kubenswrapper[5129]: I0314 09:56:33.430884 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e7ae695-5887-4261-9583-b03e376f8399-utilities\") pod \"redhat-marketplace-sbxh6\" (UID: \"0e7ae695-5887-4261-9583-b03e376f8399\") " pod="openshift-marketplace/redhat-marketplace-sbxh6" Mar 14 09:56:33 crc kubenswrapper[5129]: I0314 09:56:33.468024 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhjnn\" (UniqueName: \"kubernetes.io/projected/0e7ae695-5887-4261-9583-b03e376f8399-kube-api-access-dhjnn\") pod \"redhat-marketplace-sbxh6\" (UID: \"0e7ae695-5887-4261-9583-b03e376f8399\") " pod="openshift-marketplace/redhat-marketplace-sbxh6" Mar 14 09:56:33 crc kubenswrapper[5129]: I0314 09:56:33.540852 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbxh6" Mar 14 09:56:34 crc kubenswrapper[5129]: I0314 09:56:34.121591 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbxh6"] Mar 14 09:56:34 crc kubenswrapper[5129]: I0314 09:56:34.998866 5129 generic.go:334] "Generic (PLEG): container finished" podID="0e7ae695-5887-4261-9583-b03e376f8399" containerID="4fa6d337ef3a4e869525e759e4dd5735c0f5f14abed43340911e6a7ea3b0f61e" exitCode=0 Mar 14 09:56:34 crc kubenswrapper[5129]: I0314 09:56:34.998983 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbxh6" event={"ID":"0e7ae695-5887-4261-9583-b03e376f8399","Type":"ContainerDied","Data":"4fa6d337ef3a4e869525e759e4dd5735c0f5f14abed43340911e6a7ea3b0f61e"} Mar 14 09:56:34 crc kubenswrapper[5129]: I0314 09:56:34.999356 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbxh6" event={"ID":"0e7ae695-5887-4261-9583-b03e376f8399","Type":"ContainerStarted","Data":"61de10f78879b4de780c334b9aca95c70db9ed512ae7e18eb3f775fccf4d4eed"} Mar 14 09:56:35 crc kubenswrapper[5129]: I0314 09:56:35.036503 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:56:35 crc kubenswrapper[5129]: E0314 09:56:35.036883 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:56:36 crc kubenswrapper[5129]: I0314 09:56:36.010864 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbxh6" 
event={"ID":"0e7ae695-5887-4261-9583-b03e376f8399","Type":"ContainerStarted","Data":"37dae0f6d39e29a416dfbc177fb7dca84450b293ec5d60698480d306bf35cd34"} Mar 14 09:56:37 crc kubenswrapper[5129]: I0314 09:56:37.044324 5129 generic.go:334] "Generic (PLEG): container finished" podID="25dd8792-1393-4803-9806-21f7292348fa" containerID="3a0f42f7bf412981c76a99dff2bc54b03126680968e96ba6091f2fdc212216a1" exitCode=137 Mar 14 09:56:37 crc kubenswrapper[5129]: I0314 09:56:37.044441 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"25dd8792-1393-4803-9806-21f7292348fa","Type":"ContainerDied","Data":"3a0f42f7bf412981c76a99dff2bc54b03126680968e96ba6091f2fdc212216a1"} Mar 14 09:56:37 crc kubenswrapper[5129]: I0314 09:56:37.049055 5129 generic.go:334] "Generic (PLEG): container finished" podID="0e7ae695-5887-4261-9583-b03e376f8399" containerID="37dae0f6d39e29a416dfbc177fb7dca84450b293ec5d60698480d306bf35cd34" exitCode=0 Mar 14 09:56:37 crc kubenswrapper[5129]: I0314 09:56:37.049097 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbxh6" event={"ID":"0e7ae695-5887-4261-9583-b03e376f8399","Type":"ContainerDied","Data":"37dae0f6d39e29a416dfbc177fb7dca84450b293ec5d60698480d306bf35cd34"} Mar 14 09:56:37 crc kubenswrapper[5129]: I0314 09:56:37.253398 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 14 09:56:37 crc kubenswrapper[5129]: I0314 09:56:37.335900 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aee7aa5a-3c64-4b84-9c85-04f6d2c42e33\") pod \"25dd8792-1393-4803-9806-21f7292348fa\" (UID: \"25dd8792-1393-4803-9806-21f7292348fa\") " Mar 14 09:56:37 crc kubenswrapper[5129]: I0314 09:56:37.336084 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7clt6\" (UniqueName: \"kubernetes.io/projected/25dd8792-1393-4803-9806-21f7292348fa-kube-api-access-7clt6\") pod \"25dd8792-1393-4803-9806-21f7292348fa\" (UID: \"25dd8792-1393-4803-9806-21f7292348fa\") " Mar 14 09:56:37 crc kubenswrapper[5129]: I0314 09:56:37.336139 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/25dd8792-1393-4803-9806-21f7292348fa-ovn-data-cert\") pod \"25dd8792-1393-4803-9806-21f7292348fa\" (UID: \"25dd8792-1393-4803-9806-21f7292348fa\") " Mar 14 09:56:37 crc kubenswrapper[5129]: I0314 09:56:37.350015 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25dd8792-1393-4803-9806-21f7292348fa-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "25dd8792-1393-4803-9806-21f7292348fa" (UID: "25dd8792-1393-4803-9806-21f7292348fa"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:56:37 crc kubenswrapper[5129]: I0314 09:56:37.361348 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25dd8792-1393-4803-9806-21f7292348fa-kube-api-access-7clt6" (OuterVolumeSpecName: "kube-api-access-7clt6") pod "25dd8792-1393-4803-9806-21f7292348fa" (UID: "25dd8792-1393-4803-9806-21f7292348fa"). InnerVolumeSpecName "kube-api-access-7clt6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:56:37 crc kubenswrapper[5129]: I0314 09:56:37.367010 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aee7aa5a-3c64-4b84-9c85-04f6d2c42e33" (OuterVolumeSpecName: "ovn-data") pod "25dd8792-1393-4803-9806-21f7292348fa" (UID: "25dd8792-1393-4803-9806-21f7292348fa"). InnerVolumeSpecName "pvc-aee7aa5a-3c64-4b84-9c85-04f6d2c42e33". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 09:56:37 crc kubenswrapper[5129]: I0314 09:56:37.438642 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-aee7aa5a-3c64-4b84-9c85-04f6d2c42e33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aee7aa5a-3c64-4b84-9c85-04f6d2c42e33\") on node \"crc\" " Mar 14 09:56:37 crc kubenswrapper[5129]: I0314 09:56:37.438679 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7clt6\" (UniqueName: \"kubernetes.io/projected/25dd8792-1393-4803-9806-21f7292348fa-kube-api-access-7clt6\") on node \"crc\" DevicePath \"\"" Mar 14 09:56:37 crc kubenswrapper[5129]: I0314 09:56:37.438690 5129 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/25dd8792-1393-4803-9806-21f7292348fa-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Mar 14 09:56:37 crc kubenswrapper[5129]: I0314 09:56:37.465297 5129 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 14 09:56:37 crc kubenswrapper[5129]: I0314 09:56:37.465441 5129 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-aee7aa5a-3c64-4b84-9c85-04f6d2c42e33" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aee7aa5a-3c64-4b84-9c85-04f6d2c42e33") on node "crc" Mar 14 09:56:37 crc kubenswrapper[5129]: I0314 09:56:37.539998 5129 reconciler_common.go:293] "Volume detached for volume \"pvc-aee7aa5a-3c64-4b84-9c85-04f6d2c42e33\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aee7aa5a-3c64-4b84-9c85-04f6d2c42e33\") on node \"crc\" DevicePath \"\"" Mar 14 09:56:38 crc kubenswrapper[5129]: I0314 09:56:38.070855 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbxh6" event={"ID":"0e7ae695-5887-4261-9583-b03e376f8399","Type":"ContainerStarted","Data":"55854578b4a9a6eac0bf6202bed19380839fede465ed34edc6ec283d5b6dedc7"} Mar 14 09:56:38 crc kubenswrapper[5129]: I0314 09:56:38.078776 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"25dd8792-1393-4803-9806-21f7292348fa","Type":"ContainerDied","Data":"983c4e5e01f2bb37d3eccfc8df582b82b32421964016bcabfa9003fddad3fa81"} Mar 14 09:56:38 crc kubenswrapper[5129]: I0314 09:56:38.078873 5129 scope.go:117] "RemoveContainer" containerID="3a0f42f7bf412981c76a99dff2bc54b03126680968e96ba6091f2fdc212216a1" Mar 14 09:56:38 crc kubenswrapper[5129]: I0314 09:56:38.079046 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 14 09:56:38 crc kubenswrapper[5129]: I0314 09:56:38.111102 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sbxh6" podStartSLOduration=2.632479452 podStartE2EDuration="5.111075731s" podCreationTimestamp="2026-03-14 09:56:33 +0000 UTC" firstStartedPulling="2026-03-14 09:56:35.000776304 +0000 UTC m=+10657.752691488" lastFinishedPulling="2026-03-14 09:56:37.479372583 +0000 UTC m=+10660.231287767" observedRunningTime="2026-03-14 09:56:38.102645812 +0000 UTC m=+10660.854561006" watchObservedRunningTime="2026-03-14 09:56:38.111075731 +0000 UTC m=+10660.862990905" Mar 14 09:56:38 crc kubenswrapper[5129]: I0314 09:56:38.144724 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Mar 14 09:56:38 crc kubenswrapper[5129]: I0314 09:56:38.155122 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Mar 14 09:56:38 crc kubenswrapper[5129]: I0314 09:56:38.419710 5129 scope.go:117] "RemoveContainer" containerID="f97b2630266a6a99f241ea7371794aa2573a6f8651c7d6b6c5bf8d37f6ee3c5e" Mar 14 09:56:40 crc kubenswrapper[5129]: I0314 09:56:40.047387 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25dd8792-1393-4803-9806-21f7292348fa" path="/var/lib/kubelet/pods/25dd8792-1393-4803-9806-21f7292348fa/volumes" Mar 14 09:56:40 crc kubenswrapper[5129]: I0314 09:56:40.418753 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="8731d474-6329-4a56-be08-fe3d12bb33cd" containerName="galera" probeResult="failure" output="command timed out" Mar 14 09:56:40 crc kubenswrapper[5129]: I0314 09:56:40.419521 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="8731d474-6329-4a56-be08-fe3d12bb33cd" containerName="galera" probeResult="failure" output="command timed out" Mar 14 09:56:43 crc 
kubenswrapper[5129]: I0314 09:56:43.542032 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sbxh6" Mar 14 09:56:43 crc kubenswrapper[5129]: I0314 09:56:43.543167 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sbxh6" Mar 14 09:56:43 crc kubenswrapper[5129]: I0314 09:56:43.620537 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sbxh6" Mar 14 09:56:44 crc kubenswrapper[5129]: I0314 09:56:44.213741 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sbxh6" Mar 14 09:56:44 crc kubenswrapper[5129]: I0314 09:56:44.289333 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbxh6"] Mar 14 09:56:46 crc kubenswrapper[5129]: I0314 09:56:46.179361 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sbxh6" podUID="0e7ae695-5887-4261-9583-b03e376f8399" containerName="registry-server" containerID="cri-o://55854578b4a9a6eac0bf6202bed19380839fede465ed34edc6ec283d5b6dedc7" gracePeriod=2 Mar 14 09:56:46 crc kubenswrapper[5129]: I0314 09:56:46.705040 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbxh6" Mar 14 09:56:46 crc kubenswrapper[5129]: I0314 09:56:46.882218 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e7ae695-5887-4261-9583-b03e376f8399-catalog-content\") pod \"0e7ae695-5887-4261-9583-b03e376f8399\" (UID: \"0e7ae695-5887-4261-9583-b03e376f8399\") " Mar 14 09:56:46 crc kubenswrapper[5129]: I0314 09:56:46.882307 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e7ae695-5887-4261-9583-b03e376f8399-utilities\") pod \"0e7ae695-5887-4261-9583-b03e376f8399\" (UID: \"0e7ae695-5887-4261-9583-b03e376f8399\") " Mar 14 09:56:46 crc kubenswrapper[5129]: I0314 09:56:46.882346 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhjnn\" (UniqueName: \"kubernetes.io/projected/0e7ae695-5887-4261-9583-b03e376f8399-kube-api-access-dhjnn\") pod \"0e7ae695-5887-4261-9583-b03e376f8399\" (UID: \"0e7ae695-5887-4261-9583-b03e376f8399\") " Mar 14 09:56:46 crc kubenswrapper[5129]: I0314 09:56:46.884255 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e7ae695-5887-4261-9583-b03e376f8399-utilities" (OuterVolumeSpecName: "utilities") pod "0e7ae695-5887-4261-9583-b03e376f8399" (UID: "0e7ae695-5887-4261-9583-b03e376f8399"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:56:46 crc kubenswrapper[5129]: I0314 09:56:46.895116 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e7ae695-5887-4261-9583-b03e376f8399-kube-api-access-dhjnn" (OuterVolumeSpecName: "kube-api-access-dhjnn") pod "0e7ae695-5887-4261-9583-b03e376f8399" (UID: "0e7ae695-5887-4261-9583-b03e376f8399"). InnerVolumeSpecName "kube-api-access-dhjnn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:56:46 crc kubenswrapper[5129]: I0314 09:56:46.934505 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e7ae695-5887-4261-9583-b03e376f8399-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e7ae695-5887-4261-9583-b03e376f8399" (UID: "0e7ae695-5887-4261-9583-b03e376f8399"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:56:46 crc kubenswrapper[5129]: I0314 09:56:46.984862 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e7ae695-5887-4261-9583-b03e376f8399-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:56:46 crc kubenswrapper[5129]: I0314 09:56:46.984918 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e7ae695-5887-4261-9583-b03e376f8399-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:56:46 crc kubenswrapper[5129]: I0314 09:56:46.984932 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhjnn\" (UniqueName: \"kubernetes.io/projected/0e7ae695-5887-4261-9583-b03e376f8399-kube-api-access-dhjnn\") on node \"crc\" DevicePath \"\"" Mar 14 09:56:47 crc kubenswrapper[5129]: I0314 09:56:47.194351 5129 generic.go:334] "Generic (PLEG): container finished" podID="0e7ae695-5887-4261-9583-b03e376f8399" containerID="55854578b4a9a6eac0bf6202bed19380839fede465ed34edc6ec283d5b6dedc7" exitCode=0 Mar 14 09:56:47 crc kubenswrapper[5129]: I0314 09:56:47.194434 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbxh6" Mar 14 09:56:47 crc kubenswrapper[5129]: I0314 09:56:47.194455 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbxh6" event={"ID":"0e7ae695-5887-4261-9583-b03e376f8399","Type":"ContainerDied","Data":"55854578b4a9a6eac0bf6202bed19380839fede465ed34edc6ec283d5b6dedc7"} Mar 14 09:56:47 crc kubenswrapper[5129]: I0314 09:56:47.194905 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbxh6" event={"ID":"0e7ae695-5887-4261-9583-b03e376f8399","Type":"ContainerDied","Data":"61de10f78879b4de780c334b9aca95c70db9ed512ae7e18eb3f775fccf4d4eed"} Mar 14 09:56:47 crc kubenswrapper[5129]: I0314 09:56:47.194938 5129 scope.go:117] "RemoveContainer" containerID="55854578b4a9a6eac0bf6202bed19380839fede465ed34edc6ec283d5b6dedc7" Mar 14 09:56:47 crc kubenswrapper[5129]: I0314 09:56:47.228746 5129 scope.go:117] "RemoveContainer" containerID="37dae0f6d39e29a416dfbc177fb7dca84450b293ec5d60698480d306bf35cd34" Mar 14 09:56:47 crc kubenswrapper[5129]: I0314 09:56:47.238223 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbxh6"] Mar 14 09:56:47 crc kubenswrapper[5129]: I0314 09:56:47.264928 5129 scope.go:117] "RemoveContainer" containerID="4fa6d337ef3a4e869525e759e4dd5735c0f5f14abed43340911e6a7ea3b0f61e" Mar 14 09:56:47 crc kubenswrapper[5129]: I0314 09:56:47.273521 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbxh6"] Mar 14 09:56:47 crc kubenswrapper[5129]: I0314 09:56:47.329661 5129 scope.go:117] "RemoveContainer" containerID="55854578b4a9a6eac0bf6202bed19380839fede465ed34edc6ec283d5b6dedc7" Mar 14 09:56:47 crc kubenswrapper[5129]: E0314 09:56:47.330559 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"55854578b4a9a6eac0bf6202bed19380839fede465ed34edc6ec283d5b6dedc7\": container with ID starting with 55854578b4a9a6eac0bf6202bed19380839fede465ed34edc6ec283d5b6dedc7 not found: ID does not exist" containerID="55854578b4a9a6eac0bf6202bed19380839fede465ed34edc6ec283d5b6dedc7" Mar 14 09:56:47 crc kubenswrapper[5129]: I0314 09:56:47.330631 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55854578b4a9a6eac0bf6202bed19380839fede465ed34edc6ec283d5b6dedc7"} err="failed to get container status \"55854578b4a9a6eac0bf6202bed19380839fede465ed34edc6ec283d5b6dedc7\": rpc error: code = NotFound desc = could not find container \"55854578b4a9a6eac0bf6202bed19380839fede465ed34edc6ec283d5b6dedc7\": container with ID starting with 55854578b4a9a6eac0bf6202bed19380839fede465ed34edc6ec283d5b6dedc7 not found: ID does not exist" Mar 14 09:56:47 crc kubenswrapper[5129]: I0314 09:56:47.330675 5129 scope.go:117] "RemoveContainer" containerID="37dae0f6d39e29a416dfbc177fb7dca84450b293ec5d60698480d306bf35cd34" Mar 14 09:56:47 crc kubenswrapper[5129]: E0314 09:56:47.331303 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37dae0f6d39e29a416dfbc177fb7dca84450b293ec5d60698480d306bf35cd34\": container with ID starting with 37dae0f6d39e29a416dfbc177fb7dca84450b293ec5d60698480d306bf35cd34 not found: ID does not exist" containerID="37dae0f6d39e29a416dfbc177fb7dca84450b293ec5d60698480d306bf35cd34" Mar 14 09:56:47 crc kubenswrapper[5129]: I0314 09:56:47.331351 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37dae0f6d39e29a416dfbc177fb7dca84450b293ec5d60698480d306bf35cd34"} err="failed to get container status \"37dae0f6d39e29a416dfbc177fb7dca84450b293ec5d60698480d306bf35cd34\": rpc error: code = NotFound desc = could not find container \"37dae0f6d39e29a416dfbc177fb7dca84450b293ec5d60698480d306bf35cd34\": container with ID 
starting with 37dae0f6d39e29a416dfbc177fb7dca84450b293ec5d60698480d306bf35cd34 not found: ID does not exist" Mar 14 09:56:47 crc kubenswrapper[5129]: I0314 09:56:47.331377 5129 scope.go:117] "RemoveContainer" containerID="4fa6d337ef3a4e869525e759e4dd5735c0f5f14abed43340911e6a7ea3b0f61e" Mar 14 09:56:47 crc kubenswrapper[5129]: E0314 09:56:47.331795 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fa6d337ef3a4e869525e759e4dd5735c0f5f14abed43340911e6a7ea3b0f61e\": container with ID starting with 4fa6d337ef3a4e869525e759e4dd5735c0f5f14abed43340911e6a7ea3b0f61e not found: ID does not exist" containerID="4fa6d337ef3a4e869525e759e4dd5735c0f5f14abed43340911e6a7ea3b0f61e" Mar 14 09:56:47 crc kubenswrapper[5129]: I0314 09:56:47.331832 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fa6d337ef3a4e869525e759e4dd5735c0f5f14abed43340911e6a7ea3b0f61e"} err="failed to get container status \"4fa6d337ef3a4e869525e759e4dd5735c0f5f14abed43340911e6a7ea3b0f61e\": rpc error: code = NotFound desc = could not find container \"4fa6d337ef3a4e869525e759e4dd5735c0f5f14abed43340911e6a7ea3b0f61e\": container with ID starting with 4fa6d337ef3a4e869525e759e4dd5735c0f5f14abed43340911e6a7ea3b0f61e not found: ID does not exist" Mar 14 09:56:48 crc kubenswrapper[5129]: I0314 09:56:48.045956 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:56:48 crc kubenswrapper[5129]: E0314 09:56:48.046364 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:56:48 crc kubenswrapper[5129]: I0314 09:56:48.059429 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e7ae695-5887-4261-9583-b03e376f8399" path="/var/lib/kubelet/pods/0e7ae695-5887-4261-9583-b03e376f8399/volumes" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.429109 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-5d6j7"] Mar 14 09:56:49 crc kubenswrapper[5129]: E0314 09:56:49.430174 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25dd8792-1393-4803-9806-21f7292348fa" containerName="adoption" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.430198 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="25dd8792-1393-4803-9806-21f7292348fa" containerName="adoption" Mar 14 09:56:49 crc kubenswrapper[5129]: E0314 09:56:49.430213 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7ae695-5887-4261-9583-b03e376f8399" containerName="extract-utilities" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.430220 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7ae695-5887-4261-9583-b03e376f8399" containerName="extract-utilities" Mar 14 09:56:49 crc kubenswrapper[5129]: E0314 09:56:49.430233 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7ae695-5887-4261-9583-b03e376f8399" containerName="registry-server" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.430240 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7ae695-5887-4261-9583-b03e376f8399" containerName="registry-server" Mar 14 09:56:49 crc kubenswrapper[5129]: E0314 09:56:49.430260 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7ae695-5887-4261-9583-b03e376f8399" containerName="extract-content" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.430267 5129 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0e7ae695-5887-4261-9583-b03e376f8399" containerName="extract-content" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.430449 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="25dd8792-1393-4803-9806-21f7292348fa" containerName="adoption" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.430467 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7ae695-5887-4261-9583-b03e376f8399" containerName="registry-server" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.431273 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.448376 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.454166 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.463279 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-5d6j7"] Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.464263 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d74a8e38-8c5b-48c9-8e82-ac058852be9d-swiftconf\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.464387 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d74a8e38-8c5b-48c9-8e82-ac058852be9d-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 
09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.464526 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d74a8e38-8c5b-48c9-8e82-ac058852be9d-etc-swift\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.464802 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d74a8e38-8c5b-48c9-8e82-ac058852be9d-dispersionconf\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.466969 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d74a8e38-8c5b-48c9-8e82-ac058852be9d-scripts\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.467248 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d74a8e38-8c5b-48c9-8e82-ac058852be9d-ring-data-devices\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.467450 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z824f\" (UniqueName: \"kubernetes.io/projected/d74a8e38-8c5b-48c9-8e82-ac058852be9d-kube-api-access-z824f\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: 
\"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.569291 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d74a8e38-8c5b-48c9-8e82-ac058852be9d-scripts\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.569428 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d74a8e38-8c5b-48c9-8e82-ac058852be9d-ring-data-devices\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.569500 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z824f\" (UniqueName: \"kubernetes.io/projected/d74a8e38-8c5b-48c9-8e82-ac058852be9d-kube-api-access-z824f\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.569729 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d74a8e38-8c5b-48c9-8e82-ac058852be9d-swiftconf\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.569773 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d74a8e38-8c5b-48c9-8e82-ac058852be9d-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: 
\"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.569897 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d74a8e38-8c5b-48c9-8e82-ac058852be9d-etc-swift\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.570025 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d74a8e38-8c5b-48c9-8e82-ac058852be9d-dispersionconf\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.570991 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d74a8e38-8c5b-48c9-8e82-ac058852be9d-ring-data-devices\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.571226 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d74a8e38-8c5b-48c9-8e82-ac058852be9d-scripts\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.572026 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d74a8e38-8c5b-48c9-8e82-ac058852be9d-etc-swift\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " 
pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.579258 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d74a8e38-8c5b-48c9-8e82-ac058852be9d-swiftconf\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.580849 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d74a8e38-8c5b-48c9-8e82-ac058852be9d-dispersionconf\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.588887 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d74a8e38-8c5b-48c9-8e82-ac058852be9d-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.599014 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z824f\" (UniqueName: \"kubernetes.io/projected/d74a8e38-8c5b-48c9-8e82-ac058852be9d-kube-api-access-z824f\") pod \"swift-ring-rebalance-debug-5d6j7\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:49 crc kubenswrapper[5129]: I0314 09:56:49.779282 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:50 crc kubenswrapper[5129]: I0314 09:56:50.302355 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-5d6j7"] Mar 14 09:56:51 crc kubenswrapper[5129]: I0314 09:56:51.258965 5129 generic.go:334] "Generic (PLEG): container finished" podID="d74a8e38-8c5b-48c9-8e82-ac058852be9d" containerID="55330a24029e668a39ff6ceffecc3a851c5af715d6e5d9f9918c53706d5aab99" exitCode=0 Mar 14 09:56:51 crc kubenswrapper[5129]: I0314 09:56:51.259081 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-5d6j7" event={"ID":"d74a8e38-8c5b-48c9-8e82-ac058852be9d","Type":"ContainerDied","Data":"55330a24029e668a39ff6ceffecc3a851c5af715d6e5d9f9918c53706d5aab99"} Mar 14 09:56:51 crc kubenswrapper[5129]: I0314 09:56:51.260035 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-5d6j7" event={"ID":"d74a8e38-8c5b-48c9-8e82-ac058852be9d","Type":"ContainerStarted","Data":"bc25f8fec92beb8460cf5e25430e963e5813ad83c7bd42851518d65d997ced51"} Mar 14 09:56:51 crc kubenswrapper[5129]: I0314 09:56:51.318287 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-5d6j7"] Mar 14 09:56:51 crc kubenswrapper[5129]: I0314 09:56:51.331093 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-5d6j7"] Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.473286 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-x92mj"] Mar 14 09:56:52 crc kubenswrapper[5129]: E0314 09:56:52.474103 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d74a8e38-8c5b-48c9-8e82-ac058852be9d" containerName="swift-ring-rebalance" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.474126 5129 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d74a8e38-8c5b-48c9-8e82-ac058852be9d" containerName="swift-ring-rebalance" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.474412 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d74a8e38-8c5b-48c9-8e82-ac058852be9d" containerName="swift-ring-rebalance" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.475722 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.493169 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-x92mj"] Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.656272 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq8fn\" (UniqueName: \"kubernetes.io/projected/a71108e7-9074-4ce0-ba67-f3976bc2ef83-kube-api-access-vq8fn\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.656736 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a71108e7-9074-4ce0-ba67-f3976bc2ef83-dispersionconf\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.656799 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a71108e7-9074-4ce0-ba67-f3976bc2ef83-etc-swift\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.656904 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a71108e7-9074-4ce0-ba67-f3976bc2ef83-scripts\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.656936 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a71108e7-9074-4ce0-ba67-f3976bc2ef83-swiftconf\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.657017 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a71108e7-9074-4ce0-ba67-f3976bc2ef83-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.657045 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a71108e7-9074-4ce0-ba67-f3976bc2ef83-ring-data-devices\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.744936 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.758452 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq8fn\" (UniqueName: \"kubernetes.io/projected/a71108e7-9074-4ce0-ba67-f3976bc2ef83-kube-api-access-vq8fn\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.758530 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a71108e7-9074-4ce0-ba67-f3976bc2ef83-dispersionconf\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.758585 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a71108e7-9074-4ce0-ba67-f3976bc2ef83-etc-swift\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.758711 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a71108e7-9074-4ce0-ba67-f3976bc2ef83-swiftconf\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.758749 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a71108e7-9074-4ce0-ba67-f3976bc2ef83-scripts\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " 
pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.758799 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a71108e7-9074-4ce0-ba67-f3976bc2ef83-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.758824 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a71108e7-9074-4ce0-ba67-f3976bc2ef83-ring-data-devices\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.759660 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a71108e7-9074-4ce0-ba67-f3976bc2ef83-ring-data-devices\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.759750 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a71108e7-9074-4ce0-ba67-f3976bc2ef83-etc-swift\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.760529 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a71108e7-9074-4ce0-ba67-f3976bc2ef83-scripts\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " 
pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.765153 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a71108e7-9074-4ce0-ba67-f3976bc2ef83-swiftconf\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.765668 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a71108e7-9074-4ce0-ba67-f3976bc2ef83-dispersionconf\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.766027 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a71108e7-9074-4ce0-ba67-f3976bc2ef83-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.777768 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq8fn\" (UniqueName: \"kubernetes.io/projected/a71108e7-9074-4ce0-ba67-f3976bc2ef83-kube-api-access-vq8fn\") pod \"swift-ring-rebalance-debug-x92mj\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.815327 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.860664 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d74a8e38-8c5b-48c9-8e82-ac058852be9d-dispersionconf\") pod \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.860735 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d74a8e38-8c5b-48c9-8e82-ac058852be9d-combined-ca-bundle\") pod \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.860767 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d74a8e38-8c5b-48c9-8e82-ac058852be9d-scripts\") pod \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.860919 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z824f\" (UniqueName: \"kubernetes.io/projected/d74a8e38-8c5b-48c9-8e82-ac058852be9d-kube-api-access-z824f\") pod \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.861033 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d74a8e38-8c5b-48c9-8e82-ac058852be9d-etc-swift\") pod \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.861060 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d74a8e38-8c5b-48c9-8e82-ac058852be9d-ring-data-devices\") pod \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.861123 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d74a8e38-8c5b-48c9-8e82-ac058852be9d-swiftconf\") pod \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\" (UID: \"d74a8e38-8c5b-48c9-8e82-ac058852be9d\") " Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.864268 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d74a8e38-8c5b-48c9-8e82-ac058852be9d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d74a8e38-8c5b-48c9-8e82-ac058852be9d" (UID: "d74a8e38-8c5b-48c9-8e82-ac058852be9d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.867179 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d74a8e38-8c5b-48c9-8e82-ac058852be9d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d74a8e38-8c5b-48c9-8e82-ac058852be9d" (UID: "d74a8e38-8c5b-48c9-8e82-ac058852be9d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.867520 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d74a8e38-8c5b-48c9-8e82-ac058852be9d-kube-api-access-z824f" (OuterVolumeSpecName: "kube-api-access-z824f") pod "d74a8e38-8c5b-48c9-8e82-ac058852be9d" (UID: "d74a8e38-8c5b-48c9-8e82-ac058852be9d"). InnerVolumeSpecName "kube-api-access-z824f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.888121 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d74a8e38-8c5b-48c9-8e82-ac058852be9d-scripts" (OuterVolumeSpecName: "scripts") pod "d74a8e38-8c5b-48c9-8e82-ac058852be9d" (UID: "d74a8e38-8c5b-48c9-8e82-ac058852be9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.889267 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d74a8e38-8c5b-48c9-8e82-ac058852be9d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d74a8e38-8c5b-48c9-8e82-ac058852be9d" (UID: "d74a8e38-8c5b-48c9-8e82-ac058852be9d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.894913 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d74a8e38-8c5b-48c9-8e82-ac058852be9d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d74a8e38-8c5b-48c9-8e82-ac058852be9d" (UID: "d74a8e38-8c5b-48c9-8e82-ac058852be9d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.903130 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d74a8e38-8c5b-48c9-8e82-ac058852be9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d74a8e38-8c5b-48c9-8e82-ac058852be9d" (UID: "d74a8e38-8c5b-48c9-8e82-ac058852be9d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.963870 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z824f\" (UniqueName: \"kubernetes.io/projected/d74a8e38-8c5b-48c9-8e82-ac058852be9d-kube-api-access-z824f\") on node \"crc\" DevicePath \"\"" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.964238 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d74a8e38-8c5b-48c9-8e82-ac058852be9d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.964252 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d74a8e38-8c5b-48c9-8e82-ac058852be9d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.964263 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d74a8e38-8c5b-48c9-8e82-ac058852be9d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.964273 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d74a8e38-8c5b-48c9-8e82-ac058852be9d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.964286 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d74a8e38-8c5b-48c9-8e82-ac058852be9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:56:52 crc kubenswrapper[5129]: I0314 09:56:52.964299 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d74a8e38-8c5b-48c9-8e82-ac058852be9d-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:56:53 crc kubenswrapper[5129]: I0314 09:56:53.259859 5129 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-x92mj"] Mar 14 09:56:53 crc kubenswrapper[5129]: I0314 09:56:53.289169 5129 scope.go:117] "RemoveContainer" containerID="55330a24029e668a39ff6ceffecc3a851c5af715d6e5d9f9918c53706d5aab99" Mar 14 09:56:53 crc kubenswrapper[5129]: I0314 09:56:53.289403 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-5d6j7" Mar 14 09:56:53 crc kubenswrapper[5129]: I0314 09:56:53.298361 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-x92mj" event={"ID":"a71108e7-9074-4ce0-ba67-f3976bc2ef83","Type":"ContainerStarted","Data":"018a4420e382c43ec7759372986ed62673111988ae727124ae5312b86a8db187"} Mar 14 09:56:54 crc kubenswrapper[5129]: I0314 09:56:54.048584 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d74a8e38-8c5b-48c9-8e82-ac058852be9d" path="/var/lib/kubelet/pods/d74a8e38-8c5b-48c9-8e82-ac058852be9d/volumes" Mar 14 09:56:54 crc kubenswrapper[5129]: I0314 09:56:54.311504 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-x92mj" event={"ID":"a71108e7-9074-4ce0-ba67-f3976bc2ef83","Type":"ContainerStarted","Data":"e17aff6ac6c88e1a3eb5abc357e2dc0d191f481503c712ee39d489c265b9ea0d"} Mar 14 09:56:54 crc kubenswrapper[5129]: I0314 09:56:54.342384 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-x92mj" podStartSLOduration=2.3423630380000002 podStartE2EDuration="2.342363038s" podCreationTimestamp="2026-03-14 09:56:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:56:54.333938499 +0000 UTC m=+10677.085853723" watchObservedRunningTime="2026-03-14 09:56:54.342363038 +0000 UTC m=+10677.094278222" Mar 14 09:57:00 crc kubenswrapper[5129]: I0314 09:57:00.037065 5129 
scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:57:00 crc kubenswrapper[5129]: E0314 09:57:00.038209 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:57:13 crc kubenswrapper[5129]: I0314 09:57:13.036643 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:57:13 crc kubenswrapper[5129]: E0314 09:57:13.037664 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:57:23 crc kubenswrapper[5129]: I0314 09:57:23.650672 5129 generic.go:334] "Generic (PLEG): container finished" podID="a71108e7-9074-4ce0-ba67-f3976bc2ef83" containerID="e17aff6ac6c88e1a3eb5abc357e2dc0d191f481503c712ee39d489c265b9ea0d" exitCode=0 Mar 14 09:57:23 crc kubenswrapper[5129]: I0314 09:57:23.650762 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-x92mj" event={"ID":"a71108e7-9074-4ce0-ba67-f3976bc2ef83","Type":"ContainerDied","Data":"e17aff6ac6c88e1a3eb5abc357e2dc0d191f481503c712ee39d489c265b9ea0d"} Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.036793 5129 scope.go:117] "RemoveContainer" 
containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:57:25 crc kubenswrapper[5129]: E0314 09:57:25.037799 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.098759 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.152919 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-x92mj"] Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.170986 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-x92mj"] Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.230845 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq8fn\" (UniqueName: \"kubernetes.io/projected/a71108e7-9074-4ce0-ba67-f3976bc2ef83-kube-api-access-vq8fn\") pod \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.230917 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a71108e7-9074-4ce0-ba67-f3976bc2ef83-scripts\") pod \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.230960 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/a71108e7-9074-4ce0-ba67-f3976bc2ef83-swiftconf\") pod \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.231001 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a71108e7-9074-4ce0-ba67-f3976bc2ef83-ring-data-devices\") pod \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.231775 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a71108e7-9074-4ce0-ba67-f3976bc2ef83-combined-ca-bundle\") pod \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.231901 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a71108e7-9074-4ce0-ba67-f3976bc2ef83-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a71108e7-9074-4ce0-ba67-f3976bc2ef83" (UID: "a71108e7-9074-4ce0-ba67-f3976bc2ef83"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.231914 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a71108e7-9074-4ce0-ba67-f3976bc2ef83-dispersionconf\") pod \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.231998 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a71108e7-9074-4ce0-ba67-f3976bc2ef83-etc-swift\") pod \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\" (UID: \"a71108e7-9074-4ce0-ba67-f3976bc2ef83\") " Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.232656 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a71108e7-9074-4ce0-ba67-f3976bc2ef83-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.233380 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a71108e7-9074-4ce0-ba67-f3976bc2ef83-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a71108e7-9074-4ce0-ba67-f3976bc2ef83" (UID: "a71108e7-9074-4ce0-ba67-f3976bc2ef83"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.237253 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a71108e7-9074-4ce0-ba67-f3976bc2ef83-kube-api-access-vq8fn" (OuterVolumeSpecName: "kube-api-access-vq8fn") pod "a71108e7-9074-4ce0-ba67-f3976bc2ef83" (UID: "a71108e7-9074-4ce0-ba67-f3976bc2ef83"). InnerVolumeSpecName "kube-api-access-vq8fn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.264468 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a71108e7-9074-4ce0-ba67-f3976bc2ef83-scripts" (OuterVolumeSpecName: "scripts") pod "a71108e7-9074-4ce0-ba67-f3976bc2ef83" (UID: "a71108e7-9074-4ce0-ba67-f3976bc2ef83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.264479 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a71108e7-9074-4ce0-ba67-f3976bc2ef83-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a71108e7-9074-4ce0-ba67-f3976bc2ef83" (UID: "a71108e7-9074-4ce0-ba67-f3976bc2ef83"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.266389 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a71108e7-9074-4ce0-ba67-f3976bc2ef83-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a71108e7-9074-4ce0-ba67-f3976bc2ef83" (UID: "a71108e7-9074-4ce0-ba67-f3976bc2ef83"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.268015 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a71108e7-9074-4ce0-ba67-f3976bc2ef83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a71108e7-9074-4ce0-ba67-f3976bc2ef83" (UID: "a71108e7-9074-4ce0-ba67-f3976bc2ef83"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.335011 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq8fn\" (UniqueName: \"kubernetes.io/projected/a71108e7-9074-4ce0-ba67-f3976bc2ef83-kube-api-access-vq8fn\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.335051 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a71108e7-9074-4ce0-ba67-f3976bc2ef83-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.335065 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a71108e7-9074-4ce0-ba67-f3976bc2ef83-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.335081 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a71108e7-9074-4ce0-ba67-f3976bc2ef83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.335095 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a71108e7-9074-4ce0-ba67-f3976bc2ef83-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.335110 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a71108e7-9074-4ce0-ba67-f3976bc2ef83-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.677970 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="018a4420e382c43ec7759372986ed62673111988ae727124ae5312b86a8db187" Mar 14 09:57:25 crc kubenswrapper[5129]: I0314 09:57:25.678043 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-x92mj" Mar 14 09:57:26 crc kubenswrapper[5129]: I0314 09:57:26.052739 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a71108e7-9074-4ce0-ba67-f3976bc2ef83" path="/var/lib/kubelet/pods/a71108e7-9074-4ce0-ba67-f3976bc2ef83/volumes" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.471610 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 14 09:57:27 crc kubenswrapper[5129]: E0314 09:57:27.472729 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a71108e7-9074-4ce0-ba67-f3976bc2ef83" containerName="swift-ring-rebalance" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.472755 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="a71108e7-9074-4ce0-ba67-f3976bc2ef83" containerName="swift-ring-rebalance" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.473148 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="a71108e7-9074-4ce0-ba67-f3976bc2ef83" containerName="swift-ring-rebalance" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.480576 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.485014 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.507844 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.545298 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-2"] Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.554180 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-2" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.588978 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/74005c3b-1ed5-4e99-ae27-26b92fdee7a1-cache\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") " pod="openstack/swift-storage-0" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.589148 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/74005c3b-1ed5-4e99-ae27-26b92fdee7a1-lock\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") " pod="openstack/swift-storage-0" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.589198 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ab123352-28f6-4693-9e6b-0699f035df83\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab123352-28f6-4693-9e6b-0699f035df83\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") " pod="openstack/swift-storage-0" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.589222 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vmbq\" (UniqueName: \"kubernetes.io/projected/74005c3b-1ed5-4e99-ae27-26b92fdee7a1-kube-api-access-4vmbq\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") " pod="openstack/swift-storage-0" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.589314 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74005c3b-1ed5-4e99-ae27-26b92fdee7a1-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") " pod="openstack/swift-storage-0" Mar 14 
09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.589435 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/74005c3b-1ed5-4e99-ae27-26b92fdee7a1-etc-swift\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") " pod="openstack/swift-storage-0" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.597465 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-1"] Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.607352 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-1" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.612677 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-1"] Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.648689 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-2"] Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.691589 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrn47\" (UniqueName: \"kubernetes.io/projected/40db9059-374b-4b74-8e84-5c360deb5e34-kube-api-access-lrn47\") pod \"swift-storage-1\" (UID: \"40db9059-374b-4b74-8e84-5c360deb5e34\") " pod="openstack/swift-storage-1" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.691674 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ab123352-28f6-4693-9e6b-0699f035df83\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab123352-28f6-4693-9e6b-0699f035df83\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") " pod="openstack/swift-storage-0" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.691703 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vmbq\" (UniqueName: 
\"kubernetes.io/projected/74005c3b-1ed5-4e99-ae27-26b92fdee7a1-kube-api-access-4vmbq\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") " pod="openstack/swift-storage-0" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.691811 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8eb6fe64-5186-4890-9a89-78a495c7a291\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8eb6fe64-5186-4890-9a89-78a495c7a291\") pod \"swift-storage-1\" (UID: \"40db9059-374b-4b74-8e84-5c360deb5e34\") " pod="openstack/swift-storage-1" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.691851 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9149055f-2604-48d3-85d4-7e109aeb13db-combined-ca-bundle\") pod \"swift-storage-2\" (UID: \"9149055f-2604-48d3-85d4-7e109aeb13db\") " pod="openstack/swift-storage-2" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.691869 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/40db9059-374b-4b74-8e84-5c360deb5e34-etc-swift\") pod \"swift-storage-1\" (UID: \"40db9059-374b-4b74-8e84-5c360deb5e34\") " pod="openstack/swift-storage-1" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.691900 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74005c3b-1ed5-4e99-ae27-26b92fdee7a1-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") " pod="openstack/swift-storage-0" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.691927 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/40db9059-374b-4b74-8e84-5c360deb5e34-cache\") pod \"swift-storage-1\" (UID: \"40db9059-374b-4b74-8e84-5c360deb5e34\") " pod="openstack/swift-storage-1" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.691959 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40db9059-374b-4b74-8e84-5c360deb5e34-combined-ca-bundle\") pod \"swift-storage-1\" (UID: \"40db9059-374b-4b74-8e84-5c360deb5e34\") " pod="openstack/swift-storage-1" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.692011 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/40db9059-374b-4b74-8e84-5c360deb5e34-lock\") pod \"swift-storage-1\" (UID: \"40db9059-374b-4b74-8e84-5c360deb5e34\") " pod="openstack/swift-storage-1" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.692132 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/74005c3b-1ed5-4e99-ae27-26b92fdee7a1-etc-swift\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") " pod="openstack/swift-storage-0" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.692161 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9149055f-2604-48d3-85d4-7e109aeb13db-etc-swift\") pod \"swift-storage-2\" (UID: \"9149055f-2604-48d3-85d4-7e109aeb13db\") " pod="openstack/swift-storage-2" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.692181 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxfgc\" (UniqueName: \"kubernetes.io/projected/9149055f-2604-48d3-85d4-7e109aeb13db-kube-api-access-gxfgc\") pod \"swift-storage-2\" (UID: 
\"9149055f-2604-48d3-85d4-7e109aeb13db\") " pod="openstack/swift-storage-2" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.692635 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9149055f-2604-48d3-85d4-7e109aeb13db-cache\") pod \"swift-storage-2\" (UID: \"9149055f-2604-48d3-85d4-7e109aeb13db\") " pod="openstack/swift-storage-2" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.692725 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2ec3290e-5c4e-4f4f-9698-aad0603aca44\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ec3290e-5c4e-4f4f-9698-aad0603aca44\") pod \"swift-storage-2\" (UID: \"9149055f-2604-48d3-85d4-7e109aeb13db\") " pod="openstack/swift-storage-2" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.692751 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9149055f-2604-48d3-85d4-7e109aeb13db-lock\") pod \"swift-storage-2\" (UID: \"9149055f-2604-48d3-85d4-7e109aeb13db\") " pod="openstack/swift-storage-2" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.692777 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/74005c3b-1ed5-4e99-ae27-26b92fdee7a1-cache\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") " pod="openstack/swift-storage-0" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.692805 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/74005c3b-1ed5-4e99-ae27-26b92fdee7a1-lock\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") " pod="openstack/swift-storage-0" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.693790 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/74005c3b-1ed5-4e99-ae27-26b92fdee7a1-lock\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") " pod="openstack/swift-storage-0" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.695940 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/74005c3b-1ed5-4e99-ae27-26b92fdee7a1-cache\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") " pod="openstack/swift-storage-0" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.712048 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.712099 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ab123352-28f6-4693-9e6b-0699f035df83\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab123352-28f6-4693-9e6b-0699f035df83\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2881b9452f6a8b616e2589df6dc891d1573ae546c77e6e175a88b893714e441e/globalmount\"" pod="openstack/swift-storage-0" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.794775 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9149055f-2604-48d3-85d4-7e109aeb13db-etc-swift\") pod \"swift-storage-2\" (UID: \"9149055f-2604-48d3-85d4-7e109aeb13db\") " pod="openstack/swift-storage-2" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.794826 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxfgc\" (UniqueName: \"kubernetes.io/projected/9149055f-2604-48d3-85d4-7e109aeb13db-kube-api-access-gxfgc\") 
pod \"swift-storage-2\" (UID: \"9149055f-2604-48d3-85d4-7e109aeb13db\") " pod="openstack/swift-storage-2" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.794867 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9149055f-2604-48d3-85d4-7e109aeb13db-cache\") pod \"swift-storage-2\" (UID: \"9149055f-2604-48d3-85d4-7e109aeb13db\") " pod="openstack/swift-storage-2" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.794926 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2ec3290e-5c4e-4f4f-9698-aad0603aca44\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ec3290e-5c4e-4f4f-9698-aad0603aca44\") pod \"swift-storage-2\" (UID: \"9149055f-2604-48d3-85d4-7e109aeb13db\") " pod="openstack/swift-storage-2" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.794952 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9149055f-2604-48d3-85d4-7e109aeb13db-lock\") pod \"swift-storage-2\" (UID: \"9149055f-2604-48d3-85d4-7e109aeb13db\") " pod="openstack/swift-storage-2" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.794993 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrn47\" (UniqueName: \"kubernetes.io/projected/40db9059-374b-4b74-8e84-5c360deb5e34-kube-api-access-lrn47\") pod \"swift-storage-1\" (UID: \"40db9059-374b-4b74-8e84-5c360deb5e34\") " pod="openstack/swift-storage-1" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.795080 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8eb6fe64-5186-4890-9a89-78a495c7a291\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8eb6fe64-5186-4890-9a89-78a495c7a291\") pod \"swift-storage-1\" (UID: \"40db9059-374b-4b74-8e84-5c360deb5e34\") " pod="openstack/swift-storage-1" Mar 14 09:57:27 
crc kubenswrapper[5129]: I0314 09:57:27.795130 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9149055f-2604-48d3-85d4-7e109aeb13db-combined-ca-bundle\") pod \"swift-storage-2\" (UID: \"9149055f-2604-48d3-85d4-7e109aeb13db\") " pod="openstack/swift-storage-2" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.795153 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/40db9059-374b-4b74-8e84-5c360deb5e34-etc-swift\") pod \"swift-storage-1\" (UID: \"40db9059-374b-4b74-8e84-5c360deb5e34\") " pod="openstack/swift-storage-1" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.795190 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/40db9059-374b-4b74-8e84-5c360deb5e34-cache\") pod \"swift-storage-1\" (UID: \"40db9059-374b-4b74-8e84-5c360deb5e34\") " pod="openstack/swift-storage-1" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.795217 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40db9059-374b-4b74-8e84-5c360deb5e34-combined-ca-bundle\") pod \"swift-storage-1\" (UID: \"40db9059-374b-4b74-8e84-5c360deb5e34\") " pod="openstack/swift-storage-1" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.795253 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/40db9059-374b-4b74-8e84-5c360deb5e34-lock\") pod \"swift-storage-1\" (UID: \"40db9059-374b-4b74-8e84-5c360deb5e34\") " pod="openstack/swift-storage-1" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.795879 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/40db9059-374b-4b74-8e84-5c360deb5e34-lock\") pod 
\"swift-storage-1\" (UID: \"40db9059-374b-4b74-8e84-5c360deb5e34\") " pod="openstack/swift-storage-1" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.797482 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9149055f-2604-48d3-85d4-7e109aeb13db-cache\") pod \"swift-storage-2\" (UID: \"9149055f-2604-48d3-85d4-7e109aeb13db\") " pod="openstack/swift-storage-2" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.798004 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9149055f-2604-48d3-85d4-7e109aeb13db-lock\") pod \"swift-storage-2\" (UID: \"9149055f-2604-48d3-85d4-7e109aeb13db\") " pod="openstack/swift-storage-2" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.799277 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/40db9059-374b-4b74-8e84-5c360deb5e34-cache\") pod \"swift-storage-1\" (UID: \"40db9059-374b-4b74-8e84-5c360deb5e34\") " pod="openstack/swift-storage-1" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.799724 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.799756 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2ec3290e-5c4e-4f4f-9698-aad0603aca44\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ec3290e-5c4e-4f4f-9698-aad0603aca44\") pod \"swift-storage-2\" (UID: \"9149055f-2604-48d3-85d4-7e109aeb13db\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a803e0b35ac466896a32785ecf241ed82868683e436ad5ed460d787256046a80/globalmount\"" pod="openstack/swift-storage-2" Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.806387 5129 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 09:57:27 crc kubenswrapper[5129]: I0314 09:57:27.806460 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8eb6fe64-5186-4890-9a89-78a495c7a291\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8eb6fe64-5186-4890-9a89-78a495c7a291\") pod \"swift-storage-1\" (UID: \"40db9059-374b-4b74-8e84-5c360deb5e34\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fd7f048a67979e289d22256203d284f98bb97c5ff6fd0209df86863dc006903e/globalmount\"" pod="openstack/swift-storage-1" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.218470 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9149055f-2604-48d3-85d4-7e109aeb13db-etc-swift\") pod \"swift-storage-2\" (UID: \"9149055f-2604-48d3-85d4-7e109aeb13db\") " pod="openstack/swift-storage-2" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.219083 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/40db9059-374b-4b74-8e84-5c360deb5e34-etc-swift\") pod \"swift-storage-1\" (UID: 
\"40db9059-374b-4b74-8e84-5c360deb5e34\") " pod="openstack/swift-storage-1" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.219380 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vmbq\" (UniqueName: \"kubernetes.io/projected/74005c3b-1ed5-4e99-ae27-26b92fdee7a1-kube-api-access-4vmbq\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") " pod="openstack/swift-storage-0" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.220021 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/74005c3b-1ed5-4e99-ae27-26b92fdee7a1-etc-swift\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") " pod="openstack/swift-storage-0" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.220067 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74005c3b-1ed5-4e99-ae27-26b92fdee7a1-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") " pod="openstack/swift-storage-0" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.222142 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40db9059-374b-4b74-8e84-5c360deb5e34-combined-ca-bundle\") pod \"swift-storage-1\" (UID: \"40db9059-374b-4b74-8e84-5c360deb5e34\") " pod="openstack/swift-storage-1" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.223950 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9149055f-2604-48d3-85d4-7e109aeb13db-combined-ca-bundle\") pod \"swift-storage-2\" (UID: \"9149055f-2604-48d3-85d4-7e109aeb13db\") " pod="openstack/swift-storage-2" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.224940 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-gxfgc\" (UniqueName: \"kubernetes.io/projected/9149055f-2604-48d3-85d4-7e109aeb13db-kube-api-access-gxfgc\") pod \"swift-storage-2\" (UID: \"9149055f-2604-48d3-85d4-7e109aeb13db\") " pod="openstack/swift-storage-2" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.226522 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrn47\" (UniqueName: \"kubernetes.io/projected/40db9059-374b-4b74-8e84-5c360deb5e34-kube-api-access-lrn47\") pod \"swift-storage-1\" (UID: \"40db9059-374b-4b74-8e84-5c360deb5e34\") " pod="openstack/swift-storage-1" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.298164 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2ec3290e-5c4e-4f4f-9698-aad0603aca44\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ec3290e-5c4e-4f4f-9698-aad0603aca44\") pod \"swift-storage-2\" (UID: \"9149055f-2604-48d3-85d4-7e109aeb13db\") " pod="openstack/swift-storage-2" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.302242 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ab123352-28f6-4693-9e6b-0699f035df83\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab123352-28f6-4693-9e6b-0699f035df83\") pod \"swift-storage-0\" (UID: \"74005c3b-1ed5-4e99-ae27-26b92fdee7a1\") " pod="openstack/swift-storage-0" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.321205 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8eb6fe64-5186-4890-9a89-78a495c7a291\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8eb6fe64-5186-4890-9a89-78a495c7a291\") pod \"swift-storage-1\" (UID: \"40db9059-374b-4b74-8e84-5c360deb5e34\") " pod="openstack/swift-storage-1" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.427109 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.502015 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-2" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.542747 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-1" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.582776 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-kzhjd"] Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.602535 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-kzhjd"] Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.630191 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-tbxqz"] Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.631463 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.633367 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.633660 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.649899 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-tbxqz"] Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.720134 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b53ff87d-b7f7-4d68-834c-7fa95020d95a-scripts\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 
09:57:28.720196 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b53ff87d-b7f7-4d68-834c-7fa95020d95a-dispersionconf\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.720222 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b53ff87d-b7f7-4d68-834c-7fa95020d95a-ring-data-devices\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.720245 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b53ff87d-b7f7-4d68-834c-7fa95020d95a-combined-ca-bundle\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.720288 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b53ff87d-b7f7-4d68-834c-7fa95020d95a-etc-swift\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.720308 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b53ff87d-b7f7-4d68-834c-7fa95020d95a-swiftconf\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: 
I0314 09:57:28.720343 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl5xm\" (UniqueName: \"kubernetes.io/projected/b53ff87d-b7f7-4d68-834c-7fa95020d95a-kube-api-access-wl5xm\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.822210 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b53ff87d-b7f7-4d68-834c-7fa95020d95a-etc-swift\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.822663 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b53ff87d-b7f7-4d68-834c-7fa95020d95a-etc-swift\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.822699 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b53ff87d-b7f7-4d68-834c-7fa95020d95a-swiftconf\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.822773 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl5xm\" (UniqueName: \"kubernetes.io/projected/b53ff87d-b7f7-4d68-834c-7fa95020d95a-kube-api-access-wl5xm\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.823218 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b53ff87d-b7f7-4d68-834c-7fa95020d95a-scripts\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.823259 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b53ff87d-b7f7-4d68-834c-7fa95020d95a-dispersionconf\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.823285 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b53ff87d-b7f7-4d68-834c-7fa95020d95a-ring-data-devices\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.823310 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b53ff87d-b7f7-4d68-834c-7fa95020d95a-combined-ca-bundle\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.824170 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b53ff87d-b7f7-4d68-834c-7fa95020d95a-scripts\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.824577 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/b53ff87d-b7f7-4d68-834c-7fa95020d95a-ring-data-devices\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.830369 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b53ff87d-b7f7-4d68-834c-7fa95020d95a-combined-ca-bundle\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.831368 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b53ff87d-b7f7-4d68-834c-7fa95020d95a-swiftconf\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.832879 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b53ff87d-b7f7-4d68-834c-7fa95020d95a-dispersionconf\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.850540 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl5xm\" (UniqueName: \"kubernetes.io/projected/b53ff87d-b7f7-4d68-834c-7fa95020d95a-kube-api-access-wl5xm\") pod \"swift-ring-rebalance-tbxqz\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:28 crc kubenswrapper[5129]: I0314 09:57:28.962709 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:29 crc kubenswrapper[5129]: I0314 09:57:29.212210 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 14 09:57:29 crc kubenswrapper[5129]: I0314 09:57:29.408592 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-1"] Mar 14 09:57:29 crc kubenswrapper[5129]: I0314 09:57:29.476278 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-tbxqz"] Mar 14 09:57:29 crc kubenswrapper[5129]: I0314 09:57:29.726305 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tbxqz" event={"ID":"b53ff87d-b7f7-4d68-834c-7fa95020d95a","Type":"ContainerStarted","Data":"f1d2c80dde86ae40ea12c56e0c93c73da153b4821319ae4ca8e07a2c9d60c2bd"} Mar 14 09:57:29 crc kubenswrapper[5129]: I0314 09:57:29.727989 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74005c3b-1ed5-4e99-ae27-26b92fdee7a1","Type":"ContainerStarted","Data":"8484a28a367b5fc10f5c034587cd1e8cf19819e9a5042d0b5a48780ca3504bf4"} Mar 14 09:57:29 crc kubenswrapper[5129]: I0314 09:57:29.734673 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"40db9059-374b-4b74-8e84-5c360deb5e34","Type":"ContainerStarted","Data":"5d88320f9498a02d00575191de1f420e531150cef830ee62ee76a171c213a097"} Mar 14 09:57:29 crc kubenswrapper[5129]: I0314 09:57:29.877262 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-2"] Mar 14 09:57:29 crc kubenswrapper[5129]: W0314 09:57:29.887050 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9149055f_2604_48d3_85d4_7e109aeb13db.slice/crio-aa91338716de697ee43223421ac89c620974407ea042c99d6170b3a178dd95bf WatchSource:0}: Error finding container 
aa91338716de697ee43223421ac89c620974407ea042c99d6170b3a178dd95bf: Status 404 returned error can't find the container with id aa91338716de697ee43223421ac89c620974407ea042c99d6170b3a178dd95bf Mar 14 09:57:30 crc kubenswrapper[5129]: I0314 09:57:30.059373 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f98c195-d11b-4583-bdc6-5708e9e72c03" path="/var/lib/kubelet/pods/7f98c195-d11b-4583-bdc6-5708e9e72c03/volumes" Mar 14 09:57:30 crc kubenswrapper[5129]: I0314 09:57:30.750392 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tbxqz" event={"ID":"b53ff87d-b7f7-4d68-834c-7fa95020d95a","Type":"ContainerStarted","Data":"1b825bcb68cc7fca831caf8ecbbd4fa26870de4dbc06ce886ffaa3ace6f597a7"} Mar 14 09:57:30 crc kubenswrapper[5129]: I0314 09:57:30.759852 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74005c3b-1ed5-4e99-ae27-26b92fdee7a1","Type":"ContainerStarted","Data":"0c6f8cd0472fbb18d60c6f468a52b58564ef3b5e02f383a6cc4c423ea5c4af18"} Mar 14 09:57:30 crc kubenswrapper[5129]: I0314 09:57:30.762438 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"9149055f-2604-48d3-85d4-7e109aeb13db","Type":"ContainerStarted","Data":"0eb50472feaa8e17491fd0227d9ce1d76d80746ffc442644e68ac2fcf7ae9f7e"} Mar 14 09:57:30 crc kubenswrapper[5129]: I0314 09:57:30.762467 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"9149055f-2604-48d3-85d4-7e109aeb13db","Type":"ContainerStarted","Data":"aa91338716de697ee43223421ac89c620974407ea042c99d6170b3a178dd95bf"} Mar 14 09:57:30 crc kubenswrapper[5129]: I0314 09:57:30.770875 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"40db9059-374b-4b74-8e84-5c360deb5e34","Type":"ContainerStarted","Data":"496bec0583b9dd1f25e88439ca1fe9abfb6308e3cccf7934504328a0c2b872ef"} Mar 14 09:57:30 crc 
kubenswrapper[5129]: I0314 09:57:30.798816 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-tbxqz" podStartSLOduration=2.79879347 podStartE2EDuration="2.79879347s" podCreationTimestamp="2026-03-14 09:57:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:57:30.778180249 +0000 UTC m=+10713.530095433" watchObservedRunningTime="2026-03-14 09:57:30.79879347 +0000 UTC m=+10713.550708664" Mar 14 09:57:31 crc kubenswrapper[5129]: I0314 09:57:31.865973 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"9149055f-2604-48d3-85d4-7e109aeb13db","Type":"ContainerStarted","Data":"bb9196d8415b00a132a58e1e92e578bbeb450990a0147ac539e17a16d78385db"} Mar 14 09:57:31 crc kubenswrapper[5129]: I0314 09:57:31.866307 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"9149055f-2604-48d3-85d4-7e109aeb13db","Type":"ContainerStarted","Data":"8991c2ace0ce439e11a06ff5b9564a0f214fe93a92b046c7f35d986a4171518d"} Mar 14 09:57:31 crc kubenswrapper[5129]: I0314 09:57:31.866324 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"9149055f-2604-48d3-85d4-7e109aeb13db","Type":"ContainerStarted","Data":"24c4c4c2a8507721573a802c34afe69c33ffd9eabd2a49594c04ea49277323f8"} Mar 14 09:57:31 crc kubenswrapper[5129]: I0314 09:57:31.884097 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"40db9059-374b-4b74-8e84-5c360deb5e34","Type":"ContainerStarted","Data":"e6a896a692df65c28623032f139733a184dd13d6f2dbf7bbe97951a856b5a200"} Mar 14 09:57:31 crc kubenswrapper[5129]: I0314 09:57:31.884153 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" 
event={"ID":"40db9059-374b-4b74-8e84-5c360deb5e34","Type":"ContainerStarted","Data":"d03c0f7f3d380d740892aed83451214f93bd21a0162958f71e72ab231c3fbb0e"} Mar 14 09:57:31 crc kubenswrapper[5129]: I0314 09:57:31.884165 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"40db9059-374b-4b74-8e84-5c360deb5e34","Type":"ContainerStarted","Data":"d5646f694848796520399c87d0b985e741805ce25ebad6fbf418199eb053a1a6"} Mar 14 09:57:31 crc kubenswrapper[5129]: I0314 09:57:31.910388 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74005c3b-1ed5-4e99-ae27-26b92fdee7a1","Type":"ContainerStarted","Data":"f474ef54e572ec76193f75a335a839ea5e253051b703cfe3818ff60a3719812c"} Mar 14 09:57:31 crc kubenswrapper[5129]: I0314 09:57:31.910440 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74005c3b-1ed5-4e99-ae27-26b92fdee7a1","Type":"ContainerStarted","Data":"14f8918aa59ba207ae80d6b94cb4364c12c21c2b13c06b5491bd99af7beaf016"} Mar 14 09:57:31 crc kubenswrapper[5129]: I0314 09:57:31.910455 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74005c3b-1ed5-4e99-ae27-26b92fdee7a1","Type":"ContainerStarted","Data":"4f7b9202ab84829456abb3c3a98d0bac038e65d4ce74cf0dde18e6e49e092674"} Mar 14 09:57:33 crc kubenswrapper[5129]: I0314 09:57:33.948106 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74005c3b-1ed5-4e99-ae27-26b92fdee7a1","Type":"ContainerStarted","Data":"6b36ee9b6e167b598ae92cfd20c27febcf20e847255779b682d64f86ec40bf56"} Mar 14 09:57:33 crc kubenswrapper[5129]: I0314 09:57:33.948726 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74005c3b-1ed5-4e99-ae27-26b92fdee7a1","Type":"ContainerStarted","Data":"96369234e7fc4a80ddbb2f95bfbfd2ba421a119ec4f864122b3e156dd58a9368"} Mar 14 09:57:33 crc 
kubenswrapper[5129]: I0314 09:57:33.948740 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74005c3b-1ed5-4e99-ae27-26b92fdee7a1","Type":"ContainerStarted","Data":"d031ea78c859b1b1e114d97e87d6408d3c96d86232d68678a964cd27604779ee"} Mar 14 09:57:33 crc kubenswrapper[5129]: I0314 09:57:33.950486 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"9149055f-2604-48d3-85d4-7e109aeb13db","Type":"ContainerStarted","Data":"233039ea489d588f003c66de52bc60a6e9cb9bb5b9e5d3d7945dc86b560d85fa"} Mar 14 09:57:33 crc kubenswrapper[5129]: I0314 09:57:33.950515 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"9149055f-2604-48d3-85d4-7e109aeb13db","Type":"ContainerStarted","Data":"8e3f520b01ee9f38df6a64d92c5794cddc7bbd7e20cf68191c8a35e65cb49abe"} Mar 14 09:57:33 crc kubenswrapper[5129]: I0314 09:57:33.950528 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"9149055f-2604-48d3-85d4-7e109aeb13db","Type":"ContainerStarted","Data":"558ba0d084fbd8ff3152d374cd5a1df852a235b2ba02a8e5e827b2e5a7d3614c"} Mar 14 09:57:33 crc kubenswrapper[5129]: I0314 09:57:33.953959 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"40db9059-374b-4b74-8e84-5c360deb5e34","Type":"ContainerStarted","Data":"ba046939f41cdec8435207a39f61e1fae431718055a0dc90550e3449a40174c7"} Mar 14 09:57:33 crc kubenswrapper[5129]: I0314 09:57:33.953992 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"40db9059-374b-4b74-8e84-5c360deb5e34","Type":"ContainerStarted","Data":"5a1b7db95b6a3a8fdea702d6e351bde472abd7674bc6780b6d7c3732a15e113c"} Mar 14 09:57:33 crc kubenswrapper[5129]: I0314 09:57:33.954006 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" 
event={"ID":"40db9059-374b-4b74-8e84-5c360deb5e34","Type":"ContainerStarted","Data":"162928a3243f8e57b823f891c35eedb8a0b524d218805b71679a442906b7ad2a"} Mar 14 09:57:34 crc kubenswrapper[5129]: I0314 09:57:34.969212 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"9149055f-2604-48d3-85d4-7e109aeb13db","Type":"ContainerStarted","Data":"df6a3433fcd931bc0074a2d6280410fd3261f2013c53f38dae92bf86b6240abf"} Mar 14 09:57:34 crc kubenswrapper[5129]: I0314 09:57:34.975276 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"40db9059-374b-4b74-8e84-5c360deb5e34","Type":"ContainerStarted","Data":"df191907ab4654c32115e84e8b8f18b9a5bf0f4fef449277b5400cdbc58c08bf"} Mar 14 09:57:34 crc kubenswrapper[5129]: I0314 09:57:34.979286 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74005c3b-1ed5-4e99-ae27-26b92fdee7a1","Type":"ContainerStarted","Data":"07ec673e98179eb5fd375f2abb9b205cf31fc20b42213b076040110e3277de2b"} Mar 14 09:57:36 crc kubenswrapper[5129]: I0314 09:57:36.025962 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74005c3b-1ed5-4e99-ae27-26b92fdee7a1","Type":"ContainerStarted","Data":"7dc9aafb4829b2bb851a490fddb8d22ae31703529ac4527ae0b0d1f13b98a749"} Mar 14 09:57:36 crc kubenswrapper[5129]: I0314 09:57:36.026303 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74005c3b-1ed5-4e99-ae27-26b92fdee7a1","Type":"ContainerStarted","Data":"e520d54cfaa13cd37271d66d3c76fc1ecddb8596b297ac1d4aef310771abd271"} Mar 14 09:57:36 crc kubenswrapper[5129]: I0314 09:57:36.026314 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74005c3b-1ed5-4e99-ae27-26b92fdee7a1","Type":"ContainerStarted","Data":"2b44870a8e7b3daaac9b8261620db918a462fc5642d31d5931f8d29b966d5ce5"} Mar 14 09:57:36 crc 
kubenswrapper[5129]: I0314 09:57:36.033618 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"9149055f-2604-48d3-85d4-7e109aeb13db","Type":"ContainerStarted","Data":"fd5f2f636d1169ff5de346927e3ecb193dc34377933187c1d2862496d192badd"} Mar 14 09:57:36 crc kubenswrapper[5129]: I0314 09:57:36.033650 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"9149055f-2604-48d3-85d4-7e109aeb13db","Type":"ContainerStarted","Data":"a164006ea3fa7ee1d198a47626760a6bc3b0eabef199448a6188eb34b71ca02a"} Mar 14 09:57:36 crc kubenswrapper[5129]: I0314 09:57:36.124366 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"40db9059-374b-4b74-8e84-5c360deb5e34","Type":"ContainerStarted","Data":"5dd556720699b04bdfe129d505a37605f77aadec2691759648f6eccfa8b2dddd"} Mar 14 09:57:36 crc kubenswrapper[5129]: I0314 09:57:36.124416 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"40db9059-374b-4b74-8e84-5c360deb5e34","Type":"ContainerStarted","Data":"d8084bbd9e257285910c304b69710b4cc3dcaa973e79ede1e1ae39ec5af2f891"} Mar 14 09:57:37 crc kubenswrapper[5129]: I0314 09:57:37.037233 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:57:37 crc kubenswrapper[5129]: E0314 09:57:37.038283 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:57:37 crc kubenswrapper[5129]: I0314 09:57:37.130934 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-1" event={"ID":"40db9059-374b-4b74-8e84-5c360deb5e34","Type":"ContainerStarted","Data":"6452c4fe18daffbb8572e2b4f8624cb2bb2754a573b440b5df586aaa92777d69"} Mar 14 09:57:37 crc kubenswrapper[5129]: I0314 09:57:37.131027 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"40db9059-374b-4b74-8e84-5c360deb5e34","Type":"ContainerStarted","Data":"182dbcb1c299992d9c4d6b5fde079fcf014bcbd9c490a50fc8468efa0e6fa127"} Mar 14 09:57:37 crc kubenswrapper[5129]: I0314 09:57:37.141346 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74005c3b-1ed5-4e99-ae27-26b92fdee7a1","Type":"ContainerStarted","Data":"74114bb0ef07b7a313bffab26978b01a3a09efd3a70c5aa30b2213b2e78e4993"} Mar 14 09:57:37 crc kubenswrapper[5129]: I0314 09:57:37.152069 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"9149055f-2604-48d3-85d4-7e109aeb13db","Type":"ContainerStarted","Data":"4b02fab6f39f998aeb39f3b7c08dab741d68e9117a715391ded58d71354515c6"} Mar 14 09:57:37 crc kubenswrapper[5129]: I0314 09:57:37.152124 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"9149055f-2604-48d3-85d4-7e109aeb13db","Type":"ContainerStarted","Data":"e1333b3a12927955c64aaa6ee65755a70547e9cd84903ecec0dd3f37ff6ea676"} Mar 14 09:57:38 crc kubenswrapper[5129]: I0314 09:57:38.167998 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74005c3b-1ed5-4e99-ae27-26b92fdee7a1","Type":"ContainerStarted","Data":"b258bcdf93734662c18c1373b2ff98149faf9dccd5e36f8c50abd0d203176c1a"} Mar 14 09:57:38 crc kubenswrapper[5129]: I0314 09:57:38.168337 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74005c3b-1ed5-4e99-ae27-26b92fdee7a1","Type":"ContainerStarted","Data":"4b4ecc4c161c73a6f416c92b43949a986f02eb912305f13ce1f39bcd319a4c23"} Mar 
14 09:57:38 crc kubenswrapper[5129]: I0314 09:57:38.183999 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"9149055f-2604-48d3-85d4-7e109aeb13db","Type":"ContainerStarted","Data":"05263e7a0b8d1d68414d3597baa122d633356a503e148f677c9954bcad5dba95"} Mar 14 09:57:38 crc kubenswrapper[5129]: I0314 09:57:38.184069 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"9149055f-2604-48d3-85d4-7e109aeb13db","Type":"ContainerStarted","Data":"012a24ae9bfd926e7a78e0ec8c198946f7a4142be3aae816ecdb0e208ef581a8"} Mar 14 09:57:38 crc kubenswrapper[5129]: I0314 09:57:38.190667 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"40db9059-374b-4b74-8e84-5c360deb5e34","Type":"ContainerStarted","Data":"de9a1987b14b1a378e2d77c62cb4b24ead25a5da95d71d2779e4e41226181764"} Mar 14 09:57:38 crc kubenswrapper[5129]: I0314 09:57:38.190713 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"40db9059-374b-4b74-8e84-5c360deb5e34","Type":"ContainerStarted","Data":"93b6c3fe46942759cc01e8d07dc06ed2e337ab28c1adf413dee97f4118ae1b38"} Mar 14 09:57:38 crc kubenswrapper[5129]: I0314 09:57:38.563743 5129 scope.go:117] "RemoveContainer" containerID="0f35e53b5e6b5b25ca0e102dffddff07fca6d934bf209786fa74497323d689fc" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.216324 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-2" event={"ID":"9149055f-2604-48d3-85d4-7e109aeb13db","Type":"ContainerStarted","Data":"761098411bd7b4103d72070a070c13b1a5ba4e4808d767480bc296518f373b3f"} Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.228050 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-1" event={"ID":"40db9059-374b-4b74-8e84-5c360deb5e34","Type":"ContainerStarted","Data":"089d86b0193d26e0b8d3155833ec874c41168fae67685affaf04e26bcab115dd"} Mar 14 
09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.236966 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74005c3b-1ed5-4e99-ae27-26b92fdee7a1","Type":"ContainerStarted","Data":"1f330e3eca2fda277d4dc0afc764dd1f2eae4a0e8831472b1ad70b87f1b68d47"} Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.288580 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-2" podStartSLOduration=8.318230864 podStartE2EDuration="13.288553539s" podCreationTimestamp="2026-03-14 09:57:26 +0000 UTC" firstStartedPulling="2026-03-14 09:57:29.891695038 +0000 UTC m=+10712.643610222" lastFinishedPulling="2026-03-14 09:57:34.862017713 +0000 UTC m=+10717.613932897" observedRunningTime="2026-03-14 09:57:39.258760999 +0000 UTC m=+10722.010676213" watchObservedRunningTime="2026-03-14 09:57:39.288553539 +0000 UTC m=+10722.040468723" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.344632 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-1" podStartSLOduration=7.901063066 podStartE2EDuration="13.344591142s" podCreationTimestamp="2026-03-14 09:57:26 +0000 UTC" firstStartedPulling="2026-03-14 09:57:29.417577732 +0000 UTC m=+10712.169492916" lastFinishedPulling="2026-03-14 09:57:34.861105808 +0000 UTC m=+10717.613020992" observedRunningTime="2026-03-14 09:57:39.337451679 +0000 UTC m=+10722.089366863" watchObservedRunningTime="2026-03-14 09:57:39.344591142 +0000 UTC m=+10722.096506326" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.419688 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=7.766194243 podStartE2EDuration="13.419661083s" podCreationTimestamp="2026-03-14 09:57:26 +0000 UTC" firstStartedPulling="2026-03-14 09:57:29.212679044 +0000 UTC m=+10711.964594238" lastFinishedPulling="2026-03-14 09:57:34.866145894 +0000 UTC m=+10717.618061078" 
observedRunningTime="2026-03-14 09:57:39.402485746 +0000 UTC m=+10722.154400940" watchObservedRunningTime="2026-03-14 09:57:39.419661083 +0000 UTC m=+10722.171576267" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.658217 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d559866ff-tlhc4"] Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.660512 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.663278 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.677828 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d559866ff-tlhc4"] Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.709287 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d559866ff-tlhc4"] Mar 14 09:57:39 crc kubenswrapper[5129]: E0314 09:57:39.728527 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-9j7gh openstack-cell1 openstack-networker ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[config dns-svc dns-swift-storage-0 kube-api-access-9j7gh openstack-cell1 openstack-networker ovsdbserver-nb ovsdbserver-sb]: context canceled" pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" podUID="d6d7b531-694a-4388-b1f2-9c24a6049141" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.749023 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fc88b7669-hjn88"] Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.756316 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.756448 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-config\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.756531 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-dns-svc\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.756558 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-openstack-cell1\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.756655 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j7gh\" (UniqueName: \"kubernetes.io/projected/d6d7b531-694a-4388-b1f2-9c24a6049141-kube-api-access-9j7gh\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.756717 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-openstack-networker\") pod 
\"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.756799 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-ovsdbserver-sb\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.756846 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-dns-swift-storage-0\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.756881 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-ovsdbserver-nb\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.768014 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-1" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.768438 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-2" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.788323 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc88b7669-hjn88"] Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.859129 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-openstack-cell1\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.859201 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-dns-svc\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.859257 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j7gh\" (UniqueName: \"kubernetes.io/projected/d6d7b531-694a-4388-b1f2-9c24a6049141-kube-api-access-9j7gh\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.859294 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-openstack-cell1\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.859345 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-openstack-networker\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.859382 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.859409 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.859455 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-2\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-dns-swift-storage-2\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.859484 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn4kj\" (UniqueName: \"kubernetes.io/projected/4fb66716-a152-4be1-a683-93241f8397d9-kube-api-access-mn4kj\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.859503 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-1\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-dns-swift-storage-1\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 
09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.859520 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-openstack-networker\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.859543 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-config\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.859585 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-ovsdbserver-sb\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.859652 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.859686 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-dns-swift-storage-0\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" 
Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.859720 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-ovsdbserver-nb\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.859793 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-config\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.859848 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-dns-svc\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.860899 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-dns-svc\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.861743 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-openstack-cell1\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.862829 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-openstack-networker\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.863578 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-ovsdbserver-sb\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.864291 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-dns-swift-storage-0\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.865100 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-ovsdbserver-nb\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.865368 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-config\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.922423 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j7gh\" (UniqueName: 
\"kubernetes.io/projected/d6d7b531-694a-4388-b1f2-9c24a6049141-kube-api-access-9j7gh\") pod \"dnsmasq-dns-6d559866ff-tlhc4\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") " pod="openstack/dnsmasq-dns-6d559866ff-tlhc4" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.961234 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-2\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-dns-swift-storage-2\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.961286 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn4kj\" (UniqueName: \"kubernetes.io/projected/4fb66716-a152-4be1-a683-93241f8397d9-kube-api-access-mn4kj\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.961308 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-1\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-dns-swift-storage-1\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.961327 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-openstack-networker\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.961352 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-config\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.961391 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.961478 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-dns-svc\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.961523 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-openstack-cell1\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.961560 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.961577 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88"
Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.962407 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88"
Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.963292 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88"
Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.963550 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-1\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-dns-swift-storage-1\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88"
Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.963663 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-openstack-networker\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88"
Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.964129 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-2\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-dns-swift-storage-2\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88"
Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.965328 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-openstack-cell1\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88"
Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.965393 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-config\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88"
Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.966279 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88"
Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.967017 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fb66716-a152-4be1-a683-93241f8397d9-dns-svc\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88"
Mar 14 09:57:39 crc kubenswrapper[5129]: I0314 09:57:39.987661 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn4kj\" (UniqueName: \"kubernetes.io/projected/4fb66716-a152-4be1-a683-93241f8397d9-kube-api-access-mn4kj\") pod \"dnsmasq-dns-7fc88b7669-hjn88\" (UID: \"4fb66716-a152-4be1-a683-93241f8397d9\") " pod="openstack/dnsmasq-dns-7fc88b7669-hjn88"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.017355 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-2xsrq"]
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.018800 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.056319 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-2xsrq"]
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.063565 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d93deb70-fbfb-4a00-979a-de6056276a55-dispersionconf\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.063912 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d93deb70-fbfb-4a00-979a-de6056276a55-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.064017 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d93deb70-fbfb-4a00-979a-de6056276a55-ring-data-devices\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.064162 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx4zq\" (UniqueName: \"kubernetes.io/projected/d93deb70-fbfb-4a00-979a-de6056276a55-kube-api-access-bx4zq\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.064370 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d93deb70-fbfb-4a00-979a-de6056276a55-swiftconf\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.064476 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d93deb70-fbfb-4a00-979a-de6056276a55-etc-swift\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.064634 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d93deb70-fbfb-4a00-979a-de6056276a55-scripts\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.085932 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc88b7669-hjn88"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.167123 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d93deb70-fbfb-4a00-979a-de6056276a55-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.167167 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d93deb70-fbfb-4a00-979a-de6056276a55-ring-data-devices\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.167218 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx4zq\" (UniqueName: \"kubernetes.io/projected/d93deb70-fbfb-4a00-979a-de6056276a55-kube-api-access-bx4zq\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.167304 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d93deb70-fbfb-4a00-979a-de6056276a55-swiftconf\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.167323 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d93deb70-fbfb-4a00-979a-de6056276a55-etc-swift\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.168146 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d93deb70-fbfb-4a00-979a-de6056276a55-scripts\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.168254 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d93deb70-fbfb-4a00-979a-de6056276a55-dispersionconf\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.168034 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d93deb70-fbfb-4a00-979a-de6056276a55-ring-data-devices\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.168101 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d93deb70-fbfb-4a00-979a-de6056276a55-etc-swift\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.169350 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d93deb70-fbfb-4a00-979a-de6056276a55-scripts\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.172168 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d93deb70-fbfb-4a00-979a-de6056276a55-dispersionconf\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.174345 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d93deb70-fbfb-4a00-979a-de6056276a55-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.175619 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d93deb70-fbfb-4a00-979a-de6056276a55-swiftconf\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.186104 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx4zq\" (UniqueName: \"kubernetes.io/projected/d93deb70-fbfb-4a00-979a-de6056276a55-kube-api-access-bx4zq\") pod \"swift-ring-rebalance-debug-2xsrq\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") " pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.247464 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d559866ff-tlhc4"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.304923 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d559866ff-tlhc4"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.373729 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-openstack-networker\") pod \"d6d7b531-694a-4388-b1f2-9c24a6049141\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") "
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.373765 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-ovsdbserver-sb\") pod \"d6d7b531-694a-4388-b1f2-9c24a6049141\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") "
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.373906 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-ovsdbserver-nb\") pod \"d6d7b531-694a-4388-b1f2-9c24a6049141\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") "
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.373978 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j7gh\" (UniqueName: \"kubernetes.io/projected/d6d7b531-694a-4388-b1f2-9c24a6049141-kube-api-access-9j7gh\") pod \"d6d7b531-694a-4388-b1f2-9c24a6049141\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") "
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.374026 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-dns-svc\") pod \"d6d7b531-694a-4388-b1f2-9c24a6049141\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") "
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.374100 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-config\") pod \"d6d7b531-694a-4388-b1f2-9c24a6049141\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") "
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.374145 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-dns-swift-storage-0\") pod \"d6d7b531-694a-4388-b1f2-9c24a6049141\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") "
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.374174 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-openstack-cell1\") pod \"d6d7b531-694a-4388-b1f2-9c24a6049141\" (UID: \"d6d7b531-694a-4388-b1f2-9c24a6049141\") "
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.376694 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d6d7b531-694a-4388-b1f2-9c24a6049141" (UID: "d6d7b531-694a-4388-b1f2-9c24a6049141"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.377002 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-config" (OuterVolumeSpecName: "config") pod "d6d7b531-694a-4388-b1f2-9c24a6049141" (UID: "d6d7b531-694a-4388-b1f2-9c24a6049141"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.377049 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d6d7b531-694a-4388-b1f2-9c24a6049141" (UID: "d6d7b531-694a-4388-b1f2-9c24a6049141"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.377023 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "d6d7b531-694a-4388-b1f2-9c24a6049141" (UID: "d6d7b531-694a-4388-b1f2-9c24a6049141"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.377824 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-openstack-networker" (OuterVolumeSpecName: "openstack-networker") pod "d6d7b531-694a-4388-b1f2-9c24a6049141" (UID: "d6d7b531-694a-4388-b1f2-9c24a6049141"). InnerVolumeSpecName "openstack-networker". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.377914 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d6d7b531-694a-4388-b1f2-9c24a6049141" (UID: "d6d7b531-694a-4388-b1f2-9c24a6049141"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.377990 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.378090 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d6d7b531-694a-4388-b1f2-9c24a6049141" (UID: "d6d7b531-694a-4388-b1f2-9c24a6049141"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.388781 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6d7b531-694a-4388-b1f2-9c24a6049141-kube-api-access-9j7gh" (OuterVolumeSpecName: "kube-api-access-9j7gh") pod "d6d7b531-694a-4388-b1f2-9c24a6049141" (UID: "d6d7b531-694a-4388-b1f2-9c24a6049141"). InnerVolumeSpecName "kube-api-access-9j7gh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.480541 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-config\") on node \"crc\" DevicePath \"\""
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.480588 5129 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.480639 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-openstack-cell1\") on node \"crc\" DevicePath \"\""
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.480656 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-openstack-networker\") on node \"crc\" DevicePath \"\""
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.480673 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.480717 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.480730 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j7gh\" (UniqueName: \"kubernetes.io/projected/d6d7b531-694a-4388-b1f2-9c24a6049141-kube-api-access-9j7gh\") on node \"crc\" DevicePath \"\""
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.480742 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6d7b531-694a-4388-b1f2-9c24a6049141-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.706197 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc88b7669-hjn88"]
Mar 14 09:57:40 crc kubenswrapper[5129]: W0314 09:57:40.713781 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fb66716_a152_4be1_a683_93241f8397d9.slice/crio-7b9dfa563ca20b841bc203248df080bcda744adec1e96668d02227528126cb71 WatchSource:0}: Error finding container 7b9dfa563ca20b841bc203248df080bcda744adec1e96668d02227528126cb71: Status 404 returned error can't find the container with id 7b9dfa563ca20b841bc203248df080bcda744adec1e96668d02227528126cb71
Mar 14 09:57:40 crc kubenswrapper[5129]: W0314 09:57:40.965247 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd93deb70_fbfb_4a00_979a_de6056276a55.slice/crio-5ae6cbe6112a1536df29257bb2fe4695a9bb2d5ed04282be2152b609b6ea6fdc WatchSource:0}: Error finding container 5ae6cbe6112a1536df29257bb2fe4695a9bb2d5ed04282be2152b609b6ea6fdc: Status 404 returned error can't find the container with id 5ae6cbe6112a1536df29257bb2fe4695a9bb2d5ed04282be2152b609b6ea6fdc
Mar 14 09:57:40 crc kubenswrapper[5129]: I0314 09:57:40.970040 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-2xsrq"]
Mar 14 09:57:41 crc kubenswrapper[5129]: I0314 09:57:41.078135 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-2xsrq"]
Mar 14 09:57:41 crc kubenswrapper[5129]: I0314 09:57:41.260523 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-2xsrq" event={"ID":"d93deb70-fbfb-4a00-979a-de6056276a55","Type":"ContainerStarted","Data":"05ec201ac0bc772f2cc230173ca6640d09318f9f60f7a012fba3d5dcb5ca470d"}
Mar 14 09:57:41 crc kubenswrapper[5129]: I0314 09:57:41.261785 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-2xsrq" event={"ID":"d93deb70-fbfb-4a00-979a-de6056276a55","Type":"ContainerStarted","Data":"5ae6cbe6112a1536df29257bb2fe4695a9bb2d5ed04282be2152b609b6ea6fdc"}
Mar 14 09:57:41 crc kubenswrapper[5129]: I0314 09:57:41.262228 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-ring-rebalance-debug-2xsrq" podUID="d93deb70-fbfb-4a00-979a-de6056276a55" containerName="swift-ring-rebalance" containerID="cri-o://05ec201ac0bc772f2cc230173ca6640d09318f9f60f7a012fba3d5dcb5ca470d" gracePeriod=30
Mar 14 09:57:41 crc kubenswrapper[5129]: I0314 09:57:41.265746 5129 generic.go:334] "Generic (PLEG): container finished" podID="4fb66716-a152-4be1-a683-93241f8397d9" containerID="1e77b30c1389e96d5a16bd1084e2b6645d7915469643677ae7ae033b2bf2f68b" exitCode=0
Mar 14 09:57:41 crc kubenswrapper[5129]: I0314 09:57:41.267427 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d559866ff-tlhc4"
Mar 14 09:57:41 crc kubenswrapper[5129]: I0314 09:57:41.266440 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" event={"ID":"4fb66716-a152-4be1-a683-93241f8397d9","Type":"ContainerDied","Data":"1e77b30c1389e96d5a16bd1084e2b6645d7915469643677ae7ae033b2bf2f68b"}
Mar 14 09:57:41 crc kubenswrapper[5129]: I0314 09:57:41.269888 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" event={"ID":"4fb66716-a152-4be1-a683-93241f8397d9","Type":"ContainerStarted","Data":"7b9dfa563ca20b841bc203248df080bcda744adec1e96668d02227528126cb71"}
Mar 14 09:57:41 crc kubenswrapper[5129]: I0314 09:57:41.295296 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-2xsrq" podStartSLOduration=2.295261354 podStartE2EDuration="2.295261354s" podCreationTimestamp="2026-03-14 09:57:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:57:41.278640762 +0000 UTC m=+10724.030555956" watchObservedRunningTime="2026-03-14 09:57:41.295261354 +0000 UTC m=+10724.047176538"
Mar 14 09:57:41 crc kubenswrapper[5129]: I0314 09:57:41.498185 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d559866ff-tlhc4"]
Mar 14 09:57:41 crc kubenswrapper[5129]: I0314 09:57:41.503751 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d559866ff-tlhc4"]
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.053264 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6d7b531-694a-4388-b1f2-9c24a6049141" path="/var/lib/kubelet/pods/d6d7b531-694a-4388-b1f2-9c24a6049141/volumes"
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.280641 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" event={"ID":"4fb66716-a152-4be1-a683-93241f8397d9","Type":"ContainerStarted","Data":"c85b563f0ee7fa081cd1954c9afb24a77a2909fd12b1d52f84dd0011e87f3940"}
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.280777 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fc88b7669-hjn88"
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.284104 5129 generic.go:334] "Generic (PLEG): container finished" podID="b53ff87d-b7f7-4d68-834c-7fa95020d95a" containerID="1b825bcb68cc7fca831caf8ecbbd4fa26870de4dbc06ce886ffaa3ace6f597a7" exitCode=0
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.284176 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tbxqz" event={"ID":"b53ff87d-b7f7-4d68-834c-7fa95020d95a","Type":"ContainerDied","Data":"1b825bcb68cc7fca831caf8ecbbd4fa26870de4dbc06ce886ffaa3ace6f597a7"}
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.285787 5129 generic.go:334] "Generic (PLEG): container finished" podID="d93deb70-fbfb-4a00-979a-de6056276a55" containerID="05ec201ac0bc772f2cc230173ca6640d09318f9f60f7a012fba3d5dcb5ca470d" exitCode=2
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.285817 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-2xsrq" event={"ID":"d93deb70-fbfb-4a00-979a-de6056276a55","Type":"ContainerDied","Data":"05ec201ac0bc772f2cc230173ca6640d09318f9f60f7a012fba3d5dcb5ca470d"}
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.285833 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-2xsrq" event={"ID":"d93deb70-fbfb-4a00-979a-de6056276a55","Type":"ContainerDied","Data":"5ae6cbe6112a1536df29257bb2fe4695a9bb2d5ed04282be2152b609b6ea6fdc"}
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.285845 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ae6cbe6112a1536df29257bb2fe4695a9bb2d5ed04282be2152b609b6ea6fdc"
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.291303 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-2xsrq"
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.317678 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" podStartSLOduration=3.317649569 podStartE2EDuration="3.317649569s" podCreationTimestamp="2026-03-14 09:57:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:57:42.305086238 +0000 UTC m=+10725.057001432" watchObservedRunningTime="2026-03-14 09:57:42.317649569 +0000 UTC m=+10725.069564753"
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.324270 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d93deb70-fbfb-4a00-979a-de6056276a55-etc-swift\") pod \"d93deb70-fbfb-4a00-979a-de6056276a55\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") "
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.324367 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d93deb70-fbfb-4a00-979a-de6056276a55-dispersionconf\") pod \"d93deb70-fbfb-4a00-979a-de6056276a55\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") "
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.324461 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d93deb70-fbfb-4a00-979a-de6056276a55-combined-ca-bundle\") pod \"d93deb70-fbfb-4a00-979a-de6056276a55\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") "
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.324588 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d93deb70-fbfb-4a00-979a-de6056276a55-ring-data-devices\") pod \"d93deb70-fbfb-4a00-979a-de6056276a55\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") "
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.324699 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d93deb70-fbfb-4a00-979a-de6056276a55-swiftconf\") pod \"d93deb70-fbfb-4a00-979a-de6056276a55\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") "
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.325010 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d93deb70-fbfb-4a00-979a-de6056276a55-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d93deb70-fbfb-4a00-979a-de6056276a55" (UID: "d93deb70-fbfb-4a00-979a-de6056276a55"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.325282 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d93deb70-fbfb-4a00-979a-de6056276a55-scripts\") pod \"d93deb70-fbfb-4a00-979a-de6056276a55\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") "
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.325370 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx4zq\" (UniqueName: \"kubernetes.io/projected/d93deb70-fbfb-4a00-979a-de6056276a55-kube-api-access-bx4zq\") pod \"d93deb70-fbfb-4a00-979a-de6056276a55\" (UID: \"d93deb70-fbfb-4a00-979a-de6056276a55\") "
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.325569 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d93deb70-fbfb-4a00-979a-de6056276a55-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d93deb70-fbfb-4a00-979a-de6056276a55" (UID: "d93deb70-fbfb-4a00-979a-de6056276a55"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.326138 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d93deb70-fbfb-4a00-979a-de6056276a55-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.326177 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d93deb70-fbfb-4a00-979a-de6056276a55-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.332397 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d93deb70-fbfb-4a00-979a-de6056276a55-kube-api-access-bx4zq" (OuterVolumeSpecName: "kube-api-access-bx4zq") pod "d93deb70-fbfb-4a00-979a-de6056276a55" (UID: "d93deb70-fbfb-4a00-979a-de6056276a55"). InnerVolumeSpecName "kube-api-access-bx4zq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.363220 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d93deb70-fbfb-4a00-979a-de6056276a55-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d93deb70-fbfb-4a00-979a-de6056276a55" (UID: "d93deb70-fbfb-4a00-979a-de6056276a55"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.363403 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d93deb70-fbfb-4a00-979a-de6056276a55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d93deb70-fbfb-4a00-979a-de6056276a55" (UID: "d93deb70-fbfb-4a00-979a-de6056276a55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.375737 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d93deb70-fbfb-4a00-979a-de6056276a55-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d93deb70-fbfb-4a00-979a-de6056276a55" (UID: "d93deb70-fbfb-4a00-979a-de6056276a55"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.386790 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d93deb70-fbfb-4a00-979a-de6056276a55-scripts" (OuterVolumeSpecName: "scripts") pod "d93deb70-fbfb-4a00-979a-de6056276a55" (UID: "d93deb70-fbfb-4a00-979a-de6056276a55"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.428535 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d93deb70-fbfb-4a00-979a-de6056276a55-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.428568 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx4zq\" (UniqueName: \"kubernetes.io/projected/d93deb70-fbfb-4a00-979a-de6056276a55-kube-api-access-bx4zq\") on node \"crc\" DevicePath \"\""
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.428581 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d93deb70-fbfb-4a00-979a-de6056276a55-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.428590 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d93deb70-fbfb-4a00-979a-de6056276a55-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.428614 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d93deb70-fbfb-4a00-979a-de6056276a55-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.744265 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pj76t"]
Mar 14 09:57:42 crc kubenswrapper[5129]: E0314 09:57:42.744974 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d93deb70-fbfb-4a00-979a-de6056276a55" containerName="swift-ring-rebalance"
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.745011 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d93deb70-fbfb-4a00-979a-de6056276a55" containerName="swift-ring-rebalance"
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.745379 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d93deb70-fbfb-4a00-979a-de6056276a55" containerName="swift-ring-rebalance"
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.747918 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pj76t"
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.773964 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pj76t"]
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.807938 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-2xsrq"]
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.850999 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef8c633e-31da-486e-8e83-0c4a0c893b2a-utilities\") pod \"redhat-operators-pj76t\" (UID: \"ef8c633e-31da-486e-8e83-0c4a0c893b2a\") " pod="openshift-marketplace/redhat-operators-pj76t"
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.851163 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef8c633e-31da-486e-8e83-0c4a0c893b2a-catalog-content\") pod \"redhat-operators-pj76t\" (UID: \"ef8c633e-31da-486e-8e83-0c4a0c893b2a\") " pod="openshift-marketplace/redhat-operators-pj76t"
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.851231 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86gf2\" (UniqueName: \"kubernetes.io/projected/ef8c633e-31da-486e-8e83-0c4a0c893b2a-kube-api-access-86gf2\") pod \"redhat-operators-pj76t\" (UID: \"ef8c633e-31da-486e-8e83-0c4a0c893b2a\") " pod="openshift-marketplace/redhat-operators-pj76t"
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.865226 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-2xsrq"]
Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.954241 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName:
\"kubernetes.io/empty-dir/ef8c633e-31da-486e-8e83-0c4a0c893b2a-catalog-content\") pod \"redhat-operators-pj76t\" (UID: \"ef8c633e-31da-486e-8e83-0c4a0c893b2a\") " pod="openshift-marketplace/redhat-operators-pj76t" Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.954385 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86gf2\" (UniqueName: \"kubernetes.io/projected/ef8c633e-31da-486e-8e83-0c4a0c893b2a-kube-api-access-86gf2\") pod \"redhat-operators-pj76t\" (UID: \"ef8c633e-31da-486e-8e83-0c4a0c893b2a\") " pod="openshift-marketplace/redhat-operators-pj76t" Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.954672 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef8c633e-31da-486e-8e83-0c4a0c893b2a-utilities\") pod \"redhat-operators-pj76t\" (UID: \"ef8c633e-31da-486e-8e83-0c4a0c893b2a\") " pod="openshift-marketplace/redhat-operators-pj76t" Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.955041 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef8c633e-31da-486e-8e83-0c4a0c893b2a-catalog-content\") pod \"redhat-operators-pj76t\" (UID: \"ef8c633e-31da-486e-8e83-0c4a0c893b2a\") " pod="openshift-marketplace/redhat-operators-pj76t" Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.955199 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef8c633e-31da-486e-8e83-0c4a0c893b2a-utilities\") pod \"redhat-operators-pj76t\" (UID: \"ef8c633e-31da-486e-8e83-0c4a0c893b2a\") " pod="openshift-marketplace/redhat-operators-pj76t" Mar 14 09:57:42 crc kubenswrapper[5129]: I0314 09:57:42.971637 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86gf2\" (UniqueName: 
\"kubernetes.io/projected/ef8c633e-31da-486e-8e83-0c4a0c893b2a-kube-api-access-86gf2\") pod \"redhat-operators-pj76t\" (UID: \"ef8c633e-31da-486e-8e83-0c4a0c893b2a\") " pod="openshift-marketplace/redhat-operators-pj76t" Mar 14 09:57:43 crc kubenswrapper[5129]: I0314 09:57:43.082263 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pj76t" Mar 14 09:57:43 crc kubenswrapper[5129]: I0314 09:57:43.303825 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-2xsrq" Mar 14 09:57:43 crc kubenswrapper[5129]: I0314 09:57:43.804449 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pj76t"] Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.050832 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d93deb70-fbfb-4a00-979a-de6056276a55" path="/var/lib/kubelet/pods/d93deb70-fbfb-4a00-979a-de6056276a55/volumes" Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.055979 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.197680 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b53ff87d-b7f7-4d68-834c-7fa95020d95a-dispersionconf\") pod \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.197740 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b53ff87d-b7f7-4d68-834c-7fa95020d95a-combined-ca-bundle\") pod \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.197776 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b53ff87d-b7f7-4d68-834c-7fa95020d95a-swiftconf\") pod \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.197796 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl5xm\" (UniqueName: \"kubernetes.io/projected/b53ff87d-b7f7-4d68-834c-7fa95020d95a-kube-api-access-wl5xm\") pod \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.197829 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b53ff87d-b7f7-4d68-834c-7fa95020d95a-ring-data-devices\") pod \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.197937 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/b53ff87d-b7f7-4d68-834c-7fa95020d95a-scripts\") pod \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.197982 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b53ff87d-b7f7-4d68-834c-7fa95020d95a-etc-swift\") pod \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\" (UID: \"b53ff87d-b7f7-4d68-834c-7fa95020d95a\") " Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.200709 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b53ff87d-b7f7-4d68-834c-7fa95020d95a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b53ff87d-b7f7-4d68-834c-7fa95020d95a" (UID: "b53ff87d-b7f7-4d68-834c-7fa95020d95a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.202521 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b53ff87d-b7f7-4d68-834c-7fa95020d95a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b53ff87d-b7f7-4d68-834c-7fa95020d95a" (UID: "b53ff87d-b7f7-4d68-834c-7fa95020d95a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.223473 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b53ff87d-b7f7-4d68-834c-7fa95020d95a-kube-api-access-wl5xm" (OuterVolumeSpecName: "kube-api-access-wl5xm") pod "b53ff87d-b7f7-4d68-834c-7fa95020d95a" (UID: "b53ff87d-b7f7-4d68-834c-7fa95020d95a"). InnerVolumeSpecName "kube-api-access-wl5xm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.228904 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b53ff87d-b7f7-4d68-834c-7fa95020d95a-scripts" (OuterVolumeSpecName: "scripts") pod "b53ff87d-b7f7-4d68-834c-7fa95020d95a" (UID: "b53ff87d-b7f7-4d68-834c-7fa95020d95a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.238151 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b53ff87d-b7f7-4d68-834c-7fa95020d95a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b53ff87d-b7f7-4d68-834c-7fa95020d95a" (UID: "b53ff87d-b7f7-4d68-834c-7fa95020d95a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.267016 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b53ff87d-b7f7-4d68-834c-7fa95020d95a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b53ff87d-b7f7-4d68-834c-7fa95020d95a" (UID: "b53ff87d-b7f7-4d68-834c-7fa95020d95a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.273258 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b53ff87d-b7f7-4d68-834c-7fa95020d95a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b53ff87d-b7f7-4d68-834c-7fa95020d95a" (UID: "b53ff87d-b7f7-4d68-834c-7fa95020d95a"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.300961 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b53ff87d-b7f7-4d68-834c-7fa95020d95a-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.301000 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b53ff87d-b7f7-4d68-834c-7fa95020d95a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.301010 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b53ff87d-b7f7-4d68-834c-7fa95020d95a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.301019 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b53ff87d-b7f7-4d68-834c-7fa95020d95a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.301028 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl5xm\" (UniqueName: \"kubernetes.io/projected/b53ff87d-b7f7-4d68-834c-7fa95020d95a-kube-api-access-wl5xm\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.301041 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b53ff87d-b7f7-4d68-834c-7fa95020d95a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.301050 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b53ff87d-b7f7-4d68-834c-7fa95020d95a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.314087 5129 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tbxqz" Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.314102 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tbxqz" event={"ID":"b53ff87d-b7f7-4d68-834c-7fa95020d95a","Type":"ContainerDied","Data":"f1d2c80dde86ae40ea12c56e0c93c73da153b4821319ae4ca8e07a2c9d60c2bd"} Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.314381 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1d2c80dde86ae40ea12c56e0c93c73da153b4821319ae4ca8e07a2c9d60c2bd" Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.315901 5129 generic.go:334] "Generic (PLEG): container finished" podID="ef8c633e-31da-486e-8e83-0c4a0c893b2a" containerID="ade768eb02acdfb4a648e7f4bc7bd80108d356f3564378ffdb813f81b5f2ed32" exitCode=0 Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.315946 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pj76t" event={"ID":"ef8c633e-31da-486e-8e83-0c4a0c893b2a","Type":"ContainerDied","Data":"ade768eb02acdfb4a648e7f4bc7bd80108d356f3564378ffdb813f81b5f2ed32"} Mar 14 09:57:44 crc kubenswrapper[5129]: I0314 09:57:44.315976 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pj76t" event={"ID":"ef8c633e-31da-486e-8e83-0c4a0c893b2a","Type":"ContainerStarted","Data":"0e5c3f193747ae0c143bb01425bad5fea09bd893bb22e08b9e0e6f4941873191"} Mar 14 09:57:45 crc kubenswrapper[5129]: I0314 09:57:45.329065 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pj76t" event={"ID":"ef8c633e-31da-486e-8e83-0c4a0c893b2a","Type":"ContainerStarted","Data":"e82663e9568ff5edd1a4bd9e6b52ce90e83b23f73c2fd9b15bc4219bc2f00dc4"} Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.039869 5129 scope.go:117] "RemoveContainer" 
containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:57:50 crc kubenswrapper[5129]: E0314 09:57:50.042705 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.087914 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fc88b7669-hjn88" Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.187264 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bc4b66b87-lpkvz"] Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.187509 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" podUID="52f5f08c-6373-4ac4-8fd9-278bb200b1ef" containerName="dnsmasq-dns" containerID="cri-o://6d0016b4051a055bef1eedd7c861f429722b0d3c7a3ef2d21f8dd9c7bc64dd03" gracePeriod=10 Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.398264 5129 generic.go:334] "Generic (PLEG): container finished" podID="52f5f08c-6373-4ac4-8fd9-278bb200b1ef" containerID="6d0016b4051a055bef1eedd7c861f429722b0d3c7a3ef2d21f8dd9c7bc64dd03" exitCode=0 Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.398361 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" event={"ID":"52f5f08c-6373-4ac4-8fd9-278bb200b1ef","Type":"ContainerDied","Data":"6d0016b4051a055bef1eedd7c861f429722b0d3c7a3ef2d21f8dd9c7bc64dd03"} Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.400457 5129 generic.go:334] "Generic (PLEG): container finished" 
podID="ef8c633e-31da-486e-8e83-0c4a0c893b2a" containerID="e82663e9568ff5edd1a4bd9e6b52ce90e83b23f73c2fd9b15bc4219bc2f00dc4" exitCode=0 Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.400506 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pj76t" event={"ID":"ef8c633e-31da-486e-8e83-0c4a0c893b2a","Type":"ContainerDied","Data":"e82663e9568ff5edd1a4bd9e6b52ce90e83b23f73c2fd9b15bc4219bc2f00dc4"} Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.745984 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.868420 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-dns-svc\") pod \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.868873 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxp7b\" (UniqueName: \"kubernetes.io/projected/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-kube-api-access-vxp7b\") pod \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.869631 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-ovsdbserver-sb\") pod \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.869882 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-openstack-networker\") pod 
\"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.870074 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-openstack-cell1\") pod \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.870247 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-ovsdbserver-nb\") pod \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.870385 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-config\") pod \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\" (UID: \"52f5f08c-6373-4ac4-8fd9-278bb200b1ef\") " Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.880045 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-kube-api-access-vxp7b" (OuterVolumeSpecName: "kube-api-access-vxp7b") pod "52f5f08c-6373-4ac4-8fd9-278bb200b1ef" (UID: "52f5f08c-6373-4ac4-8fd9-278bb200b1ef"). InnerVolumeSpecName "kube-api-access-vxp7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.931226 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "52f5f08c-6373-4ac4-8fd9-278bb200b1ef" (UID: "52f5f08c-6373-4ac4-8fd9-278bb200b1ef"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.931839 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "52f5f08c-6373-4ac4-8fd9-278bb200b1ef" (UID: "52f5f08c-6373-4ac4-8fd9-278bb200b1ef"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.932368 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-openstack-networker" (OuterVolumeSpecName: "openstack-networker") pod "52f5f08c-6373-4ac4-8fd9-278bb200b1ef" (UID: "52f5f08c-6373-4ac4-8fd9-278bb200b1ef"). InnerVolumeSpecName "openstack-networker". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.938448 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "52f5f08c-6373-4ac4-8fd9-278bb200b1ef" (UID: "52f5f08c-6373-4ac4-8fd9-278bb200b1ef"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.943108 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-config" (OuterVolumeSpecName: "config") pod "52f5f08c-6373-4ac4-8fd9-278bb200b1ef" (UID: "52f5f08c-6373-4ac4-8fd9-278bb200b1ef"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.950333 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "52f5f08c-6373-4ac4-8fd9-278bb200b1ef" (UID: "52f5f08c-6373-4ac4-8fd9-278bb200b1ef"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.973671 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.973714 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.973723 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.973734 5129 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.973744 5129 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.973752 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxp7b\" (UniqueName: 
\"kubernetes.io/projected/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-kube-api-access-vxp7b\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:50 crc kubenswrapper[5129]: I0314 09:57:50.973760 5129 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52f5f08c-6373-4ac4-8fd9-278bb200b1ef-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:51 crc kubenswrapper[5129]: I0314 09:57:51.418710 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" event={"ID":"52f5f08c-6373-4ac4-8fd9-278bb200b1ef","Type":"ContainerDied","Data":"dd6e6a0244dfacf1495215e3c0a6d987426454575814991f5492564561c5ff48"} Mar 14 09:57:51 crc kubenswrapper[5129]: I0314 09:57:51.419225 5129 scope.go:117] "RemoveContainer" containerID="6d0016b4051a055bef1eedd7c861f429722b0d3c7a3ef2d21f8dd9c7bc64dd03" Mar 14 09:57:51 crc kubenswrapper[5129]: I0314 09:57:51.418782 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bc4b66b87-lpkvz" Mar 14 09:57:51 crc kubenswrapper[5129]: I0314 09:57:51.422288 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pj76t" event={"ID":"ef8c633e-31da-486e-8e83-0c4a0c893b2a","Type":"ContainerStarted","Data":"f4697643afb76b25fe255957b674585a3adce4ff449c0db79e43b73a6208bf36"} Mar 14 09:57:51 crc kubenswrapper[5129]: I0314 09:57:51.476901 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pj76t" podStartSLOduration=2.996656384 podStartE2EDuration="9.476870903s" podCreationTimestamp="2026-03-14 09:57:42 +0000 UTC" firstStartedPulling="2026-03-14 09:57:44.318109664 +0000 UTC m=+10727.070024838" lastFinishedPulling="2026-03-14 09:57:50.798324173 +0000 UTC m=+10733.550239357" observedRunningTime="2026-03-14 09:57:51.448952734 +0000 UTC m=+10734.200867918" watchObservedRunningTime="2026-03-14 09:57:51.476870903 
+0000 UTC m=+10734.228786127" Mar 14 09:57:51 crc kubenswrapper[5129]: I0314 09:57:51.487946 5129 scope.go:117] "RemoveContainer" containerID="3e953d7615c6c835cb5b26484f6ee8e9e3aef2506bbdd2dd58068567eb332220" Mar 14 09:57:51 crc kubenswrapper[5129]: I0314 09:57:51.506845 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bc4b66b87-lpkvz"] Mar 14 09:57:51 crc kubenswrapper[5129]: I0314 09:57:51.519539 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bc4b66b87-lpkvz"] Mar 14 09:57:52 crc kubenswrapper[5129]: I0314 09:57:52.050749 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52f5f08c-6373-4ac4-8fd9-278bb200b1ef" path="/var/lib/kubelet/pods/52f5f08c-6373-4ac4-8fd9-278bb200b1ef/volumes" Mar 14 09:57:53 crc kubenswrapper[5129]: I0314 09:57:53.083182 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pj76t" Mar 14 09:57:53 crc kubenswrapper[5129]: I0314 09:57:53.084539 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pj76t" Mar 14 09:57:54 crc kubenswrapper[5129]: I0314 09:57:54.151959 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pj76t" podUID="ef8c633e-31da-486e-8e83-0c4a0c893b2a" containerName="registry-server" probeResult="failure" output=< Mar 14 09:57:54 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 09:57:54 crc kubenswrapper[5129]: > Mar 14 09:58:00 crc kubenswrapper[5129]: I0314 09:58:00.154379 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558038-b2dhq"] Mar 14 09:58:00 crc kubenswrapper[5129]: E0314 09:58:00.156173 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f5f08c-6373-4ac4-8fd9-278bb200b1ef" containerName="init" Mar 14 09:58:00 crc kubenswrapper[5129]: I0314 
09:58:00.156208 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f5f08c-6373-4ac4-8fd9-278bb200b1ef" containerName="init" Mar 14 09:58:00 crc kubenswrapper[5129]: E0314 09:58:00.156265 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f5f08c-6373-4ac4-8fd9-278bb200b1ef" containerName="dnsmasq-dns" Mar 14 09:58:00 crc kubenswrapper[5129]: I0314 09:58:00.156283 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f5f08c-6373-4ac4-8fd9-278bb200b1ef" containerName="dnsmasq-dns" Mar 14 09:58:00 crc kubenswrapper[5129]: E0314 09:58:00.156360 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b53ff87d-b7f7-4d68-834c-7fa95020d95a" containerName="swift-ring-rebalance" Mar 14 09:58:00 crc kubenswrapper[5129]: I0314 09:58:00.156383 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b53ff87d-b7f7-4d68-834c-7fa95020d95a" containerName="swift-ring-rebalance" Mar 14 09:58:00 crc kubenswrapper[5129]: I0314 09:58:00.157046 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="52f5f08c-6373-4ac4-8fd9-278bb200b1ef" containerName="dnsmasq-dns" Mar 14 09:58:00 crc kubenswrapper[5129]: I0314 09:58:00.157090 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b53ff87d-b7f7-4d68-834c-7fa95020d95a" containerName="swift-ring-rebalance" Mar 14 09:58:00 crc kubenswrapper[5129]: I0314 09:58:00.158764 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558038-b2dhq" Mar 14 09:58:00 crc kubenswrapper[5129]: I0314 09:58:00.161116 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 09:58:00 crc kubenswrapper[5129]: I0314 09:58:00.161437 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:58:00 crc kubenswrapper[5129]: I0314 09:58:00.163046 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:58:00 crc kubenswrapper[5129]: I0314 09:58:00.165322 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558038-b2dhq"] Mar 14 09:58:00 crc kubenswrapper[5129]: I0314 09:58:00.234287 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25mdv\" (UniqueName: \"kubernetes.io/projected/757e9740-209f-4a0a-9fbd-e658341ba827-kube-api-access-25mdv\") pod \"auto-csr-approver-29558038-b2dhq\" (UID: \"757e9740-209f-4a0a-9fbd-e658341ba827\") " pod="openshift-infra/auto-csr-approver-29558038-b2dhq" Mar 14 09:58:00 crc kubenswrapper[5129]: I0314 09:58:00.337297 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25mdv\" (UniqueName: \"kubernetes.io/projected/757e9740-209f-4a0a-9fbd-e658341ba827-kube-api-access-25mdv\") pod \"auto-csr-approver-29558038-b2dhq\" (UID: \"757e9740-209f-4a0a-9fbd-e658341ba827\") " pod="openshift-infra/auto-csr-approver-29558038-b2dhq" Mar 14 09:58:00 crc kubenswrapper[5129]: I0314 09:58:00.358511 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25mdv\" (UniqueName: \"kubernetes.io/projected/757e9740-209f-4a0a-9fbd-e658341ba827-kube-api-access-25mdv\") pod \"auto-csr-approver-29558038-b2dhq\" (UID: \"757e9740-209f-4a0a-9fbd-e658341ba827\") " 
pod="openshift-infra/auto-csr-approver-29558038-b2dhq" Mar 14 09:58:00 crc kubenswrapper[5129]: I0314 09:58:00.519333 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558038-b2dhq" Mar 14 09:58:01 crc kubenswrapper[5129]: I0314 09:58:01.040476 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558038-b2dhq"] Mar 14 09:58:01 crc kubenswrapper[5129]: W0314 09:58:01.052521 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod757e9740_209f_4a0a_9fbd_e658341ba827.slice/crio-62977bfeafd41e2655edc08950021199655e6c282df5c4c8d1efa4dc62cd4a5b WatchSource:0}: Error finding container 62977bfeafd41e2655edc08950021199655e6c282df5c4c8d1efa4dc62cd4a5b: Status 404 returned error can't find the container with id 62977bfeafd41e2655edc08950021199655e6c282df5c4c8d1efa4dc62cd4a5b Mar 14 09:58:01 crc kubenswrapper[5129]: I0314 09:58:01.559453 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558038-b2dhq" event={"ID":"757e9740-209f-4a0a-9fbd-e658341ba827","Type":"ContainerStarted","Data":"62977bfeafd41e2655edc08950021199655e6c282df5c4c8d1efa4dc62cd4a5b"} Mar 14 09:58:02 crc kubenswrapper[5129]: I0314 09:58:02.037126 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:58:02 crc kubenswrapper[5129]: E0314 09:58:02.037581 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:58:02 crc 
kubenswrapper[5129]: I0314 09:58:02.580051 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558038-b2dhq" event={"ID":"757e9740-209f-4a0a-9fbd-e658341ba827","Type":"ContainerStarted","Data":"ca3d2b9c21ed43d9bca45926aa0338fa92281b455121d922b84ccf600f4931a6"} Mar 14 09:58:02 crc kubenswrapper[5129]: I0314 09:58:02.601211 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558038-b2dhq" podStartSLOduration=1.503941281 podStartE2EDuration="2.601190711s" podCreationTimestamp="2026-03-14 09:58:00 +0000 UTC" firstStartedPulling="2026-03-14 09:58:01.055623588 +0000 UTC m=+10743.807538812" lastFinishedPulling="2026-03-14 09:58:02.152873058 +0000 UTC m=+10744.904788242" observedRunningTime="2026-03-14 09:58:02.596878683 +0000 UTC m=+10745.348793887" watchObservedRunningTime="2026-03-14 09:58:02.601190711 +0000 UTC m=+10745.353105905" Mar 14 09:58:04 crc kubenswrapper[5129]: I0314 09:58:04.608833 5129 generic.go:334] "Generic (PLEG): container finished" podID="757e9740-209f-4a0a-9fbd-e658341ba827" containerID="ca3d2b9c21ed43d9bca45926aa0338fa92281b455121d922b84ccf600f4931a6" exitCode=0 Mar 14 09:58:04 crc kubenswrapper[5129]: I0314 09:58:04.608918 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558038-b2dhq" event={"ID":"757e9740-209f-4a0a-9fbd-e658341ba827","Type":"ContainerDied","Data":"ca3d2b9c21ed43d9bca45926aa0338fa92281b455121d922b84ccf600f4931a6"} Mar 14 09:58:04 crc kubenswrapper[5129]: I0314 09:58:04.795667 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pj76t" podUID="ef8c633e-31da-486e-8e83-0c4a0c893b2a" containerName="registry-server" probeResult="failure" output=< Mar 14 09:58:04 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 09:58:04 crc kubenswrapper[5129]: > Mar 14 09:58:06 crc kubenswrapper[5129]: I0314 
09:58:06.129947 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558038-b2dhq" Mar 14 09:58:06 crc kubenswrapper[5129]: I0314 09:58:06.301665 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25mdv\" (UniqueName: \"kubernetes.io/projected/757e9740-209f-4a0a-9fbd-e658341ba827-kube-api-access-25mdv\") pod \"757e9740-209f-4a0a-9fbd-e658341ba827\" (UID: \"757e9740-209f-4a0a-9fbd-e658341ba827\") " Mar 14 09:58:06 crc kubenswrapper[5129]: I0314 09:58:06.316833 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/757e9740-209f-4a0a-9fbd-e658341ba827-kube-api-access-25mdv" (OuterVolumeSpecName: "kube-api-access-25mdv") pod "757e9740-209f-4a0a-9fbd-e658341ba827" (UID: "757e9740-209f-4a0a-9fbd-e658341ba827"). InnerVolumeSpecName "kube-api-access-25mdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:58:06 crc kubenswrapper[5129]: I0314 09:58:06.404088 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25mdv\" (UniqueName: \"kubernetes.io/projected/757e9740-209f-4a0a-9fbd-e658341ba827-kube-api-access-25mdv\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:06 crc kubenswrapper[5129]: I0314 09:58:06.643906 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558038-b2dhq" event={"ID":"757e9740-209f-4a0a-9fbd-e658341ba827","Type":"ContainerDied","Data":"62977bfeafd41e2655edc08950021199655e6c282df5c4c8d1efa4dc62cd4a5b"} Mar 14 09:58:06 crc kubenswrapper[5129]: I0314 09:58:06.644219 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62977bfeafd41e2655edc08950021199655e6c282df5c4c8d1efa4dc62cd4a5b" Mar 14 09:58:06 crc kubenswrapper[5129]: I0314 09:58:06.644067 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558038-b2dhq" Mar 14 09:58:06 crc kubenswrapper[5129]: I0314 09:58:06.763683 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558032-j4lzr"] Mar 14 09:58:06 crc kubenswrapper[5129]: I0314 09:58:06.777295 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558032-j4lzr"] Mar 14 09:58:08 crc kubenswrapper[5129]: I0314 09:58:08.057989 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdb6ec71-6b9f-452d-a8ab-634bb53bf916" path="/var/lib/kubelet/pods/bdb6ec71-6b9f-452d-a8ab-634bb53bf916/volumes" Mar 14 09:58:14 crc kubenswrapper[5129]: I0314 09:58:14.168274 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pj76t" podUID="ef8c633e-31da-486e-8e83-0c4a0c893b2a" containerName="registry-server" probeResult="failure" output=< Mar 14 09:58:14 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 09:58:14 crc kubenswrapper[5129]: > Mar 14 09:58:15 crc kubenswrapper[5129]: I0314 09:58:15.036833 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:58:15 crc kubenswrapper[5129]: E0314 09:58:15.037161 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:58:23 crc kubenswrapper[5129]: I0314 09:58:23.161276 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pj76t" Mar 14 09:58:23 crc kubenswrapper[5129]: 
I0314 09:58:23.221000 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pj76t" Mar 14 09:58:23 crc kubenswrapper[5129]: I0314 09:58:23.419409 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pj76t"] Mar 14 09:58:24 crc kubenswrapper[5129]: I0314 09:58:24.947795 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pj76t" podUID="ef8c633e-31da-486e-8e83-0c4a0c893b2a" containerName="registry-server" containerID="cri-o://f4697643afb76b25fe255957b674585a3adce4ff449c0db79e43b73a6208bf36" gracePeriod=2 Mar 14 09:58:25 crc kubenswrapper[5129]: I0314 09:58:25.469149 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pj76t" Mar 14 09:58:25 crc kubenswrapper[5129]: I0314 09:58:25.588418 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef8c633e-31da-486e-8e83-0c4a0c893b2a-utilities\") pod \"ef8c633e-31da-486e-8e83-0c4a0c893b2a\" (UID: \"ef8c633e-31da-486e-8e83-0c4a0c893b2a\") " Mar 14 09:58:25 crc kubenswrapper[5129]: I0314 09:58:25.588582 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef8c633e-31da-486e-8e83-0c4a0c893b2a-catalog-content\") pod \"ef8c633e-31da-486e-8e83-0c4a0c893b2a\" (UID: \"ef8c633e-31da-486e-8e83-0c4a0c893b2a\") " Mar 14 09:58:25 crc kubenswrapper[5129]: I0314 09:58:25.588659 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86gf2\" (UniqueName: \"kubernetes.io/projected/ef8c633e-31da-486e-8e83-0c4a0c893b2a-kube-api-access-86gf2\") pod \"ef8c633e-31da-486e-8e83-0c4a0c893b2a\" (UID: \"ef8c633e-31da-486e-8e83-0c4a0c893b2a\") " Mar 14 09:58:25 crc kubenswrapper[5129]: I0314 
09:58:25.589370 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef8c633e-31da-486e-8e83-0c4a0c893b2a-utilities" (OuterVolumeSpecName: "utilities") pod "ef8c633e-31da-486e-8e83-0c4a0c893b2a" (UID: "ef8c633e-31da-486e-8e83-0c4a0c893b2a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:58:25 crc kubenswrapper[5129]: I0314 09:58:25.599854 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef8c633e-31da-486e-8e83-0c4a0c893b2a-kube-api-access-86gf2" (OuterVolumeSpecName: "kube-api-access-86gf2") pod "ef8c633e-31da-486e-8e83-0c4a0c893b2a" (UID: "ef8c633e-31da-486e-8e83-0c4a0c893b2a"). InnerVolumeSpecName "kube-api-access-86gf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:58:25 crc kubenswrapper[5129]: I0314 09:58:25.691955 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86gf2\" (UniqueName: \"kubernetes.io/projected/ef8c633e-31da-486e-8e83-0c4a0c893b2a-kube-api-access-86gf2\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:25 crc kubenswrapper[5129]: I0314 09:58:25.691998 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef8c633e-31da-486e-8e83-0c4a0c893b2a-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:25 crc kubenswrapper[5129]: I0314 09:58:25.742692 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef8c633e-31da-486e-8e83-0c4a0c893b2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef8c633e-31da-486e-8e83-0c4a0c893b2a" (UID: "ef8c633e-31da-486e-8e83-0c4a0c893b2a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:58:25 crc kubenswrapper[5129]: I0314 09:58:25.793454 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef8c633e-31da-486e-8e83-0c4a0c893b2a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:25 crc kubenswrapper[5129]: I0314 09:58:25.976731 5129 generic.go:334] "Generic (PLEG): container finished" podID="ef8c633e-31da-486e-8e83-0c4a0c893b2a" containerID="f4697643afb76b25fe255957b674585a3adce4ff449c0db79e43b73a6208bf36" exitCode=0 Mar 14 09:58:25 crc kubenswrapper[5129]: I0314 09:58:25.976809 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pj76t" event={"ID":"ef8c633e-31da-486e-8e83-0c4a0c893b2a","Type":"ContainerDied","Data":"f4697643afb76b25fe255957b674585a3adce4ff449c0db79e43b73a6208bf36"} Mar 14 09:58:25 crc kubenswrapper[5129]: I0314 09:58:25.976854 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pj76t" event={"ID":"ef8c633e-31da-486e-8e83-0c4a0c893b2a","Type":"ContainerDied","Data":"0e5c3f193747ae0c143bb01425bad5fea09bd893bb22e08b9e0e6f4941873191"} Mar 14 09:58:25 crc kubenswrapper[5129]: I0314 09:58:25.976909 5129 scope.go:117] "RemoveContainer" containerID="f4697643afb76b25fe255957b674585a3adce4ff449c0db79e43b73a6208bf36" Mar 14 09:58:25 crc kubenswrapper[5129]: I0314 09:58:25.977015 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pj76t" Mar 14 09:58:26 crc kubenswrapper[5129]: I0314 09:58:26.017886 5129 scope.go:117] "RemoveContainer" containerID="e82663e9568ff5edd1a4bd9e6b52ce90e83b23f73c2fd9b15bc4219bc2f00dc4" Mar 14 09:58:26 crc kubenswrapper[5129]: I0314 09:58:26.070157 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pj76t"] Mar 14 09:58:26 crc kubenswrapper[5129]: I0314 09:58:26.070219 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pj76t"] Mar 14 09:58:26 crc kubenswrapper[5129]: I0314 09:58:26.073993 5129 scope.go:117] "RemoveContainer" containerID="ade768eb02acdfb4a648e7f4bc7bd80108d356f3564378ffdb813f81b5f2ed32" Mar 14 09:58:26 crc kubenswrapper[5129]: I0314 09:58:26.128426 5129 scope.go:117] "RemoveContainer" containerID="f4697643afb76b25fe255957b674585a3adce4ff449c0db79e43b73a6208bf36" Mar 14 09:58:26 crc kubenswrapper[5129]: E0314 09:58:26.128913 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4697643afb76b25fe255957b674585a3adce4ff449c0db79e43b73a6208bf36\": container with ID starting with f4697643afb76b25fe255957b674585a3adce4ff449c0db79e43b73a6208bf36 not found: ID does not exist" containerID="f4697643afb76b25fe255957b674585a3adce4ff449c0db79e43b73a6208bf36" Mar 14 09:58:26 crc kubenswrapper[5129]: I0314 09:58:26.128966 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4697643afb76b25fe255957b674585a3adce4ff449c0db79e43b73a6208bf36"} err="failed to get container status \"f4697643afb76b25fe255957b674585a3adce4ff449c0db79e43b73a6208bf36\": rpc error: code = NotFound desc = could not find container \"f4697643afb76b25fe255957b674585a3adce4ff449c0db79e43b73a6208bf36\": container with ID starting with f4697643afb76b25fe255957b674585a3adce4ff449c0db79e43b73a6208bf36 not found: ID does 
not exist" Mar 14 09:58:26 crc kubenswrapper[5129]: I0314 09:58:26.128997 5129 scope.go:117] "RemoveContainer" containerID="e82663e9568ff5edd1a4bd9e6b52ce90e83b23f73c2fd9b15bc4219bc2f00dc4" Mar 14 09:58:26 crc kubenswrapper[5129]: E0314 09:58:26.129308 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e82663e9568ff5edd1a4bd9e6b52ce90e83b23f73c2fd9b15bc4219bc2f00dc4\": container with ID starting with e82663e9568ff5edd1a4bd9e6b52ce90e83b23f73c2fd9b15bc4219bc2f00dc4 not found: ID does not exist" containerID="e82663e9568ff5edd1a4bd9e6b52ce90e83b23f73c2fd9b15bc4219bc2f00dc4" Mar 14 09:58:26 crc kubenswrapper[5129]: I0314 09:58:26.129354 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82663e9568ff5edd1a4bd9e6b52ce90e83b23f73c2fd9b15bc4219bc2f00dc4"} err="failed to get container status \"e82663e9568ff5edd1a4bd9e6b52ce90e83b23f73c2fd9b15bc4219bc2f00dc4\": rpc error: code = NotFound desc = could not find container \"e82663e9568ff5edd1a4bd9e6b52ce90e83b23f73c2fd9b15bc4219bc2f00dc4\": container with ID starting with e82663e9568ff5edd1a4bd9e6b52ce90e83b23f73c2fd9b15bc4219bc2f00dc4 not found: ID does not exist" Mar 14 09:58:26 crc kubenswrapper[5129]: I0314 09:58:26.129380 5129 scope.go:117] "RemoveContainer" containerID="ade768eb02acdfb4a648e7f4bc7bd80108d356f3564378ffdb813f81b5f2ed32" Mar 14 09:58:26 crc kubenswrapper[5129]: E0314 09:58:26.130365 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ade768eb02acdfb4a648e7f4bc7bd80108d356f3564378ffdb813f81b5f2ed32\": container with ID starting with ade768eb02acdfb4a648e7f4bc7bd80108d356f3564378ffdb813f81b5f2ed32 not found: ID does not exist" containerID="ade768eb02acdfb4a648e7f4bc7bd80108d356f3564378ffdb813f81b5f2ed32" Mar 14 09:58:26 crc kubenswrapper[5129]: I0314 09:58:26.130411 5129 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade768eb02acdfb4a648e7f4bc7bd80108d356f3564378ffdb813f81b5f2ed32"} err="failed to get container status \"ade768eb02acdfb4a648e7f4bc7bd80108d356f3564378ffdb813f81b5f2ed32\": rpc error: code = NotFound desc = could not find container \"ade768eb02acdfb4a648e7f4bc7bd80108d356f3564378ffdb813f81b5f2ed32\": container with ID starting with ade768eb02acdfb4a648e7f4bc7bd80108d356f3564378ffdb813f81b5f2ed32 not found: ID does not exist" Mar 14 09:58:28 crc kubenswrapper[5129]: I0314 09:58:28.048928 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef8c633e-31da-486e-8e83-0c4a0c893b2a" path="/var/lib/kubelet/pods/ef8c633e-31da-486e-8e83-0c4a0c893b2a/volumes" Mar 14 09:58:30 crc kubenswrapper[5129]: I0314 09:58:30.037210 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:58:30 crc kubenswrapper[5129]: E0314 09:58:30.038084 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:58:38 crc kubenswrapper[5129]: I0314 09:58:38.686684 5129 scope.go:117] "RemoveContainer" containerID="66924eb9ecabc5f09f4a50773a0828a192d8213468b9c3288ca17170f5728c4d" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.062837 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-v8qzx"] Mar 14 09:58:43 crc kubenswrapper[5129]: E0314 09:58:43.064554 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757e9740-209f-4a0a-9fbd-e658341ba827" containerName="oc" Mar 14 09:58:43 crc kubenswrapper[5129]: 
I0314 09:58:43.064575 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="757e9740-209f-4a0a-9fbd-e658341ba827" containerName="oc" Mar 14 09:58:43 crc kubenswrapper[5129]: E0314 09:58:43.064629 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef8c633e-31da-486e-8e83-0c4a0c893b2a" containerName="extract-content" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.064639 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef8c633e-31da-486e-8e83-0c4a0c893b2a" containerName="extract-content" Mar 14 09:58:43 crc kubenswrapper[5129]: E0314 09:58:43.064681 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef8c633e-31da-486e-8e83-0c4a0c893b2a" containerName="registry-server" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.064690 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef8c633e-31da-486e-8e83-0c4a0c893b2a" containerName="registry-server" Mar 14 09:58:43 crc kubenswrapper[5129]: E0314 09:58:43.064714 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef8c633e-31da-486e-8e83-0c4a0c893b2a" containerName="extract-utilities" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.064723 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef8c633e-31da-486e-8e83-0c4a0c893b2a" containerName="extract-utilities" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.065067 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="757e9740-209f-4a0a-9fbd-e658341ba827" containerName="oc" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.065104 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef8c633e-31da-486e-8e83-0c4a0c893b2a" containerName="registry-server" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.066392 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.070831 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.071010 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.082040 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-v8qzx"] Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.114812 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d521c5eb-7149-4ebf-baec-358ff9b0a712-dispersionconf\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.114992 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d521c5eb-7149-4ebf-baec-358ff9b0a712-ring-data-devices\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.115320 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d521c5eb-7149-4ebf-baec-358ff9b0a712-swiftconf\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.115421 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d521c5eb-7149-4ebf-baec-358ff9b0a712-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.115548 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d521c5eb-7149-4ebf-baec-358ff9b0a712-etc-swift\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.115676 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w49jh\" (UniqueName: \"kubernetes.io/projected/d521c5eb-7149-4ebf-baec-358ff9b0a712-kube-api-access-w49jh\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.115772 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d521c5eb-7149-4ebf-baec-358ff9b0a712-scripts\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.218586 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d521c5eb-7149-4ebf-baec-358ff9b0a712-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.218691 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d521c5eb-7149-4ebf-baec-358ff9b0a712-etc-swift\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.218751 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w49jh\" (UniqueName: \"kubernetes.io/projected/d521c5eb-7149-4ebf-baec-358ff9b0a712-kube-api-access-w49jh\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.218786 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d521c5eb-7149-4ebf-baec-358ff9b0a712-scripts\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.218854 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d521c5eb-7149-4ebf-baec-358ff9b0a712-dispersionconf\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.218889 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d521c5eb-7149-4ebf-baec-358ff9b0a712-ring-data-devices\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.219060 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d521c5eb-7149-4ebf-baec-358ff9b0a712-swiftconf\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.219464 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d521c5eb-7149-4ebf-baec-358ff9b0a712-etc-swift\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.220578 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d521c5eb-7149-4ebf-baec-358ff9b0a712-ring-data-devices\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.220768 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d521c5eb-7149-4ebf-baec-358ff9b0a712-scripts\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.226536 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d521c5eb-7149-4ebf-baec-358ff9b0a712-swiftconf\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.227047 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d521c5eb-7149-4ebf-baec-358ff9b0a712-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.228286 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d521c5eb-7149-4ebf-baec-358ff9b0a712-dispersionconf\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.247509 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w49jh\" (UniqueName: \"kubernetes.io/projected/d521c5eb-7149-4ebf-baec-358ff9b0a712-kube-api-access-w49jh\") pod \"swift-ring-rebalance-debug-v8qzx\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:43 crc kubenswrapper[5129]: I0314 09:58:43.426477 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:44 crc kubenswrapper[5129]: I0314 09:58:44.040683 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:58:44 crc kubenswrapper[5129]: E0314 09:58:44.042683 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:58:44 crc kubenswrapper[5129]: I0314 09:58:44.063778 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-v8qzx"] Mar 14 09:58:44 crc kubenswrapper[5129]: I0314 09:58:44.266101 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-v8qzx" event={"ID":"d521c5eb-7149-4ebf-baec-358ff9b0a712","Type":"ContainerStarted","Data":"cbeb507c7af015ef181a05df22b5e34e817d8ad7988f0906b75b4e2394489dca"} Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.278488 5129 generic.go:334] "Generic (PLEG): container finished" podID="d521c5eb-7149-4ebf-baec-358ff9b0a712" containerID="e079b6adfb82a363e57ecc6c6ef05c80434f75b7cd7af5275a7013528b299977" exitCode=0 Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.278564 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-v8qzx" event={"ID":"d521c5eb-7149-4ebf-baec-358ff9b0a712","Type":"ContainerDied","Data":"e079b6adfb82a363e57ecc6c6ef05c80434f75b7cd7af5275a7013528b299977"} Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.331990 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-v8qzx"] Mar 14 09:58:45 crc 
kubenswrapper[5129]: I0314 09:58:45.355135 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-v8qzx"] Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.818145 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-4r4qm"] Mar 14 09:58:45 crc kubenswrapper[5129]: E0314 09:58:45.818974 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d521c5eb-7149-4ebf-baec-358ff9b0a712" containerName="swift-ring-rebalance" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.818999 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d521c5eb-7149-4ebf-baec-358ff9b0a712" containerName="swift-ring-rebalance" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.819291 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d521c5eb-7149-4ebf-baec-358ff9b0a712" containerName="swift-ring-rebalance" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.820185 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.847000 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-4r4qm"] Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.883815 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2x7p\" (UniqueName: \"kubernetes.io/projected/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-kube-api-access-m2x7p\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.883898 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-dispersionconf\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.884158 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-scripts\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.884339 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-ring-data-devices\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.884497 5129 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-etc-swift\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.884994 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-swiftconf\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.885096 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.988056 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-swiftconf\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.988135 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 
09:58:45.988217 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2x7p\" (UniqueName: \"kubernetes.io/projected/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-kube-api-access-m2x7p\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.988295 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-dispersionconf\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.988414 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-scripts\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.988515 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-ring-data-devices\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.988660 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-etc-swift\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.989258 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-etc-swift\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.989574 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-scripts\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:45 crc kubenswrapper[5129]: I0314 09:58:45.990124 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-ring-data-devices\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.020857 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.029011 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-swiftconf\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.031508 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-dispersionconf\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.031551 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2x7p\" (UniqueName: \"kubernetes.io/projected/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-kube-api-access-m2x7p\") pod \"swift-ring-rebalance-debug-4r4qm\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.174301 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.631194 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-4r4qm"] Mar 14 09:58:46 crc kubenswrapper[5129]: W0314 09:58:46.634221 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1626fc6_bbd1_4a55_9cc6_288b4af1616d.slice/crio-4f9cfbc106e7e6a6a08c1cc2ab6e7b766c870ff8f6fb4bfbcff586aa96ce4078 WatchSource:0}: Error finding container 4f9cfbc106e7e6a6a08c1cc2ab6e7b766c870ff8f6fb4bfbcff586aa96ce4078: Status 404 returned error can't find the container with id 4f9cfbc106e7e6a6a08c1cc2ab6e7b766c870ff8f6fb4bfbcff586aa96ce4078 Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.680874 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.810488 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d521c5eb-7149-4ebf-baec-358ff9b0a712-scripts\") pod \"d521c5eb-7149-4ebf-baec-358ff9b0a712\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.810553 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d521c5eb-7149-4ebf-baec-358ff9b0a712-swiftconf\") pod \"d521c5eb-7149-4ebf-baec-358ff9b0a712\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.810618 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d521c5eb-7149-4ebf-baec-358ff9b0a712-etc-swift\") pod \"d521c5eb-7149-4ebf-baec-358ff9b0a712\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.810711 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d521c5eb-7149-4ebf-baec-358ff9b0a712-combined-ca-bundle\") pod \"d521c5eb-7149-4ebf-baec-358ff9b0a712\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.810744 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d521c5eb-7149-4ebf-baec-358ff9b0a712-ring-data-devices\") pod \"d521c5eb-7149-4ebf-baec-358ff9b0a712\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.810899 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w49jh\" 
(UniqueName: \"kubernetes.io/projected/d521c5eb-7149-4ebf-baec-358ff9b0a712-kube-api-access-w49jh\") pod \"d521c5eb-7149-4ebf-baec-358ff9b0a712\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.810986 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d521c5eb-7149-4ebf-baec-358ff9b0a712-dispersionconf\") pod \"d521c5eb-7149-4ebf-baec-358ff9b0a712\" (UID: \"d521c5eb-7149-4ebf-baec-358ff9b0a712\") " Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.811260 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d521c5eb-7149-4ebf-baec-358ff9b0a712-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d521c5eb-7149-4ebf-baec-358ff9b0a712" (UID: "d521c5eb-7149-4ebf-baec-358ff9b0a712"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.811529 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d521c5eb-7149-4ebf-baec-358ff9b0a712-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d521c5eb-7149-4ebf-baec-358ff9b0a712" (UID: "d521c5eb-7149-4ebf-baec-358ff9b0a712"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.811904 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d521c5eb-7149-4ebf-baec-358ff9b0a712-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.811923 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d521c5eb-7149-4ebf-baec-358ff9b0a712-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.815194 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d521c5eb-7149-4ebf-baec-358ff9b0a712-kube-api-access-w49jh" (OuterVolumeSpecName: "kube-api-access-w49jh") pod "d521c5eb-7149-4ebf-baec-358ff9b0a712" (UID: "d521c5eb-7149-4ebf-baec-358ff9b0a712"). InnerVolumeSpecName "kube-api-access-w49jh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.839270 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d521c5eb-7149-4ebf-baec-358ff9b0a712-scripts" (OuterVolumeSpecName: "scripts") pod "d521c5eb-7149-4ebf-baec-358ff9b0a712" (UID: "d521c5eb-7149-4ebf-baec-358ff9b0a712"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.841382 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d521c5eb-7149-4ebf-baec-358ff9b0a712-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d521c5eb-7149-4ebf-baec-358ff9b0a712" (UID: "d521c5eb-7149-4ebf-baec-358ff9b0a712"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.842088 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d521c5eb-7149-4ebf-baec-358ff9b0a712-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d521c5eb-7149-4ebf-baec-358ff9b0a712" (UID: "d521c5eb-7149-4ebf-baec-358ff9b0a712"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.861279 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d521c5eb-7149-4ebf-baec-358ff9b0a712-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d521c5eb-7149-4ebf-baec-358ff9b0a712" (UID: "d521c5eb-7149-4ebf-baec-358ff9b0a712"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.913507 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d521c5eb-7149-4ebf-baec-358ff9b0a712-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.913792 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d521c5eb-7149-4ebf-baec-358ff9b0a712-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.913851 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d521c5eb-7149-4ebf-baec-358ff9b0a712-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:46 crc kubenswrapper[5129]: I0314 09:58:46.913924 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d521c5eb-7149-4ebf-baec-358ff9b0a712-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:46 crc 
kubenswrapper[5129]: I0314 09:58:46.913981 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w49jh\" (UniqueName: \"kubernetes.io/projected/d521c5eb-7149-4ebf-baec-358ff9b0a712-kube-api-access-w49jh\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:47 crc kubenswrapper[5129]: I0314 09:58:47.315502 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-v8qzx" Mar 14 09:58:47 crc kubenswrapper[5129]: I0314 09:58:47.315555 5129 scope.go:117] "RemoveContainer" containerID="e079b6adfb82a363e57ecc6c6ef05c80434f75b7cd7af5275a7013528b299977" Mar 14 09:58:47 crc kubenswrapper[5129]: I0314 09:58:47.320352 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-4r4qm" event={"ID":"b1626fc6-bbd1-4a55-9cc6-288b4af1616d","Type":"ContainerStarted","Data":"7b22a6b7c9caa347b790a13142460412e261c536f5f066154d22f712243ee599"} Mar 14 09:58:47 crc kubenswrapper[5129]: I0314 09:58:47.320399 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-4r4qm" event={"ID":"b1626fc6-bbd1-4a55-9cc6-288b4af1616d","Type":"ContainerStarted","Data":"4f9cfbc106e7e6a6a08c1cc2ab6e7b766c870ff8f6fb4bfbcff586aa96ce4078"} Mar 14 09:58:47 crc kubenswrapper[5129]: I0314 09:58:47.373630 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-4r4qm" podStartSLOduration=2.373590379 podStartE2EDuration="2.373590379s" podCreationTimestamp="2026-03-14 09:58:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:58:47.350235095 +0000 UTC m=+10790.102150289" watchObservedRunningTime="2026-03-14 09:58:47.373590379 +0000 UTC m=+10790.125505563" Mar 14 09:58:48 crc kubenswrapper[5129]: I0314 09:58:48.063795 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d521c5eb-7149-4ebf-baec-358ff9b0a712" path="/var/lib/kubelet/pods/d521c5eb-7149-4ebf-baec-358ff9b0a712/volumes" Mar 14 09:58:55 crc kubenswrapper[5129]: I0314 09:58:55.462165 5129 generic.go:334] "Generic (PLEG): container finished" podID="b1626fc6-bbd1-4a55-9cc6-288b4af1616d" containerID="7b22a6b7c9caa347b790a13142460412e261c536f5f066154d22f712243ee599" exitCode=0 Mar 14 09:58:55 crc kubenswrapper[5129]: I0314 09:58:55.462271 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-4r4qm" event={"ID":"b1626fc6-bbd1-4a55-9cc6-288b4af1616d","Type":"ContainerDied","Data":"7b22a6b7c9caa347b790a13142460412e261c536f5f066154d22f712243ee599"} Mar 14 09:58:57 crc kubenswrapper[5129]: I0314 09:58:57.964126 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.010442 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-4r4qm"] Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.022404 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-4r4qm"] Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.051559 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:58:58 crc kubenswrapper[5129]: E0314 09:58:58.052045 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.085718 5129 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-m2x7p\" (UniqueName: \"kubernetes.io/projected/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-kube-api-access-m2x7p\") pod \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.086089 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-dispersionconf\") pod \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.086144 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-combined-ca-bundle\") pod \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.086238 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-ring-data-devices\") pod \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.086259 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-etc-swift\") pod \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.086366 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-scripts\") pod \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\" (UID: 
\"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.086410 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-swiftconf\") pod \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\" (UID: \"b1626fc6-bbd1-4a55-9cc6-288b4af1616d\") " Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.086772 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b1626fc6-bbd1-4a55-9cc6-288b4af1616d" (UID: "b1626fc6-bbd1-4a55-9cc6-288b4af1616d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.087146 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.087634 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b1626fc6-bbd1-4a55-9cc6-288b4af1616d" (UID: "b1626fc6-bbd1-4a55-9cc6-288b4af1616d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.093239 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-kube-api-access-m2x7p" (OuterVolumeSpecName: "kube-api-access-m2x7p") pod "b1626fc6-bbd1-4a55-9cc6-288b4af1616d" (UID: "b1626fc6-bbd1-4a55-9cc6-288b4af1616d"). InnerVolumeSpecName "kube-api-access-m2x7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.117974 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b1626fc6-bbd1-4a55-9cc6-288b4af1616d" (UID: "b1626fc6-bbd1-4a55-9cc6-288b4af1616d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.125221 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-scripts" (OuterVolumeSpecName: "scripts") pod "b1626fc6-bbd1-4a55-9cc6-288b4af1616d" (UID: "b1626fc6-bbd1-4a55-9cc6-288b4af1616d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.128696 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1626fc6-bbd1-4a55-9cc6-288b4af1616d" (UID: "b1626fc6-bbd1-4a55-9cc6-288b4af1616d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.143764 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b1626fc6-bbd1-4a55-9cc6-288b4af1616d" (UID: "b1626fc6-bbd1-4a55-9cc6-288b4af1616d"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.189425 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.189671 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.189748 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2x7p\" (UniqueName: \"kubernetes.io/projected/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-kube-api-access-m2x7p\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.189830 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.189908 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.189966 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b1626fc6-bbd1-4a55-9cc6-288b4af1616d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.523798 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-4r4qm" Mar 14 09:58:58 crc kubenswrapper[5129]: I0314 09:58:58.523829 5129 scope.go:117] "RemoveContainer" containerID="7b22a6b7c9caa347b790a13142460412e261c536f5f066154d22f712243ee599" Mar 14 09:59:00 crc kubenswrapper[5129]: I0314 09:59:00.059670 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1626fc6-bbd1-4a55-9cc6-288b4af1616d" path="/var/lib/kubelet/pods/b1626fc6-bbd1-4a55-9cc6-288b4af1616d/volumes" Mar 14 09:59:01 crc kubenswrapper[5129]: I0314 09:59:01.846512 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-tpnq4"] Mar 14 09:59:01 crc kubenswrapper[5129]: E0314 09:59:01.847513 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1626fc6-bbd1-4a55-9cc6-288b4af1616d" containerName="swift-ring-rebalance" Mar 14 09:59:01 crc kubenswrapper[5129]: I0314 09:59:01.847532 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1626fc6-bbd1-4a55-9cc6-288b4af1616d" containerName="swift-ring-rebalance" Mar 14 09:59:01 crc kubenswrapper[5129]: I0314 09:59:01.847843 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1626fc6-bbd1-4a55-9cc6-288b4af1616d" containerName="swift-ring-rebalance" Mar 14 09:59:01 crc kubenswrapper[5129]: I0314 09:59:01.848839 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:01 crc kubenswrapper[5129]: I0314 09:59:01.852394 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 09:59:01 crc kubenswrapper[5129]: I0314 09:59:01.854539 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 09:59:01 crc kubenswrapper[5129]: I0314 09:59:01.864826 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-tpnq4"] Mar 14 09:59:01 crc kubenswrapper[5129]: I0314 09:59:01.982364 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9814064c-6693-488c-acd2-d40db3f541dc-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:01 crc kubenswrapper[5129]: I0314 09:59:01.982455 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmz5z\" (UniqueName: \"kubernetes.io/projected/9814064c-6693-488c-acd2-d40db3f541dc-kube-api-access-dmz5z\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:01 crc kubenswrapper[5129]: I0314 09:59:01.982660 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9814064c-6693-488c-acd2-d40db3f541dc-scripts\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:01 crc kubenswrapper[5129]: I0314 09:59:01.982737 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9814064c-6693-488c-acd2-d40db3f541dc-etc-swift\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:01 crc kubenswrapper[5129]: I0314 09:59:01.982789 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9814064c-6693-488c-acd2-d40db3f541dc-dispersionconf\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:01 crc kubenswrapper[5129]: I0314 09:59:01.982865 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9814064c-6693-488c-acd2-d40db3f541dc-ring-data-devices\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:01 crc kubenswrapper[5129]: I0314 09:59:01.982966 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9814064c-6693-488c-acd2-d40db3f541dc-swiftconf\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:02 crc kubenswrapper[5129]: I0314 09:59:02.085207 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9814064c-6693-488c-acd2-d40db3f541dc-scripts\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:02 crc kubenswrapper[5129]: I0314 09:59:02.085271 5129 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9814064c-6693-488c-acd2-d40db3f541dc-etc-swift\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:02 crc kubenswrapper[5129]: I0314 09:59:02.085301 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9814064c-6693-488c-acd2-d40db3f541dc-dispersionconf\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:02 crc kubenswrapper[5129]: I0314 09:59:02.085387 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9814064c-6693-488c-acd2-d40db3f541dc-ring-data-devices\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:02 crc kubenswrapper[5129]: I0314 09:59:02.085467 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9814064c-6693-488c-acd2-d40db3f541dc-swiftconf\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:02 crc kubenswrapper[5129]: I0314 09:59:02.085505 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9814064c-6693-488c-acd2-d40db3f541dc-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:02 crc kubenswrapper[5129]: I0314 09:59:02.085551 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dmz5z\" (UniqueName: \"kubernetes.io/projected/9814064c-6693-488c-acd2-d40db3f541dc-kube-api-access-dmz5z\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:02 crc kubenswrapper[5129]: I0314 09:59:02.086073 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9814064c-6693-488c-acd2-d40db3f541dc-etc-swift\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:02 crc kubenswrapper[5129]: I0314 09:59:02.086388 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9814064c-6693-488c-acd2-d40db3f541dc-scripts\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:02 crc kubenswrapper[5129]: I0314 09:59:02.091027 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9814064c-6693-488c-acd2-d40db3f541dc-ring-data-devices\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:02 crc kubenswrapper[5129]: I0314 09:59:02.092807 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9814064c-6693-488c-acd2-d40db3f541dc-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:02 crc kubenswrapper[5129]: I0314 09:59:02.093111 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/9814064c-6693-488c-acd2-d40db3f541dc-dispersionconf\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:02 crc kubenswrapper[5129]: I0314 09:59:02.093307 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9814064c-6693-488c-acd2-d40db3f541dc-swiftconf\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:02 crc kubenswrapper[5129]: I0314 09:59:02.115268 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmz5z\" (UniqueName: \"kubernetes.io/projected/9814064c-6693-488c-acd2-d40db3f541dc-kube-api-access-dmz5z\") pod \"swift-ring-rebalance-debug-tpnq4\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:02 crc kubenswrapper[5129]: I0314 09:59:02.187294 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:02 crc kubenswrapper[5129]: I0314 09:59:02.649306 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-tpnq4"] Mar 14 09:59:03 crc kubenswrapper[5129]: I0314 09:59:03.611022 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-tpnq4" event={"ID":"9814064c-6693-488c-acd2-d40db3f541dc","Type":"ContainerStarted","Data":"05f9f9b8c8374c2c5bd3b8b8756c9637e8531364b7d0cd91cc56643e040f7bbf"} Mar 14 09:59:03 crc kubenswrapper[5129]: I0314 09:59:03.611345 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-tpnq4" event={"ID":"9814064c-6693-488c-acd2-d40db3f541dc","Type":"ContainerStarted","Data":"da3f4f8a42b038f050c526a4025378f8c9fdc72924309e2832ffb4ed7f22ff98"} Mar 14 09:59:03 crc kubenswrapper[5129]: I0314 09:59:03.642727 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-tpnq4" podStartSLOduration=2.642706734 podStartE2EDuration="2.642706734s" podCreationTimestamp="2026-03-14 09:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:59:03.636498235 +0000 UTC m=+10806.388413429" watchObservedRunningTime="2026-03-14 09:59:03.642706734 +0000 UTC m=+10806.394621928" Mar 14 09:59:11 crc kubenswrapper[5129]: I0314 09:59:11.713414 5129 generic.go:334] "Generic (PLEG): container finished" podID="9814064c-6693-488c-acd2-d40db3f541dc" containerID="05f9f9b8c8374c2c5bd3b8b8756c9637e8531364b7d0cd91cc56643e040f7bbf" exitCode=0 Mar 14 09:59:11 crc kubenswrapper[5129]: I0314 09:59:11.713493 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-tpnq4" 
event={"ID":"9814064c-6693-488c-acd2-d40db3f541dc","Type":"ContainerDied","Data":"05f9f9b8c8374c2c5bd3b8b8756c9637e8531364b7d0cd91cc56643e040f7bbf"} Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.037493 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:59:13 crc kubenswrapper[5129]: E0314 09:59:13.038403 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.513563 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.568687 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-tpnq4"] Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.578146 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-tpnq4"] Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.660942 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9814064c-6693-488c-acd2-d40db3f541dc-scripts\") pod \"9814064c-6693-488c-acd2-d40db3f541dc\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.661096 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmz5z\" (UniqueName: \"kubernetes.io/projected/9814064c-6693-488c-acd2-d40db3f541dc-kube-api-access-dmz5z\") pod 
\"9814064c-6693-488c-acd2-d40db3f541dc\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.661177 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9814064c-6693-488c-acd2-d40db3f541dc-combined-ca-bundle\") pod \"9814064c-6693-488c-acd2-d40db3f541dc\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.661235 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9814064c-6693-488c-acd2-d40db3f541dc-etc-swift\") pod \"9814064c-6693-488c-acd2-d40db3f541dc\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.661262 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9814064c-6693-488c-acd2-d40db3f541dc-swiftconf\") pod \"9814064c-6693-488c-acd2-d40db3f541dc\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.661289 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9814064c-6693-488c-acd2-d40db3f541dc-dispersionconf\") pod \"9814064c-6693-488c-acd2-d40db3f541dc\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.661306 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9814064c-6693-488c-acd2-d40db3f541dc-ring-data-devices\") pod \"9814064c-6693-488c-acd2-d40db3f541dc\" (UID: \"9814064c-6693-488c-acd2-d40db3f541dc\") " Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.662374 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/9814064c-6693-488c-acd2-d40db3f541dc-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9814064c-6693-488c-acd2-d40db3f541dc" (UID: "9814064c-6693-488c-acd2-d40db3f541dc"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.662765 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9814064c-6693-488c-acd2-d40db3f541dc-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9814064c-6693-488c-acd2-d40db3f541dc" (UID: "9814064c-6693-488c-acd2-d40db3f541dc"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.667142 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9814064c-6693-488c-acd2-d40db3f541dc-kube-api-access-dmz5z" (OuterVolumeSpecName: "kube-api-access-dmz5z") pod "9814064c-6693-488c-acd2-d40db3f541dc" (UID: "9814064c-6693-488c-acd2-d40db3f541dc"). InnerVolumeSpecName "kube-api-access-dmz5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.693168 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9814064c-6693-488c-acd2-d40db3f541dc-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9814064c-6693-488c-acd2-d40db3f541dc" (UID: "9814064c-6693-488c-acd2-d40db3f541dc"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.709143 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9814064c-6693-488c-acd2-d40db3f541dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9814064c-6693-488c-acd2-d40db3f541dc" (UID: "9814064c-6693-488c-acd2-d40db3f541dc"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.712561 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9814064c-6693-488c-acd2-d40db3f541dc-scripts" (OuterVolumeSpecName: "scripts") pod "9814064c-6693-488c-acd2-d40db3f541dc" (UID: "9814064c-6693-488c-acd2-d40db3f541dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.719205 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9814064c-6693-488c-acd2-d40db3f541dc-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9814064c-6693-488c-acd2-d40db3f541dc" (UID: "9814064c-6693-488c-acd2-d40db3f541dc"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.740500 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da3f4f8a42b038f050c526a4025378f8c9fdc72924309e2832ffb4ed7f22ff98" Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.740567 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-tpnq4" Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.765288 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9814064c-6693-488c-acd2-d40db3f541dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.765334 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9814064c-6693-488c-acd2-d40db3f541dc-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.765352 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9814064c-6693-488c-acd2-d40db3f541dc-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.765369 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9814064c-6693-488c-acd2-d40db3f541dc-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.765386 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9814064c-6693-488c-acd2-d40db3f541dc-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.765402 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9814064c-6693-488c-acd2-d40db3f541dc-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 09:59:13 crc kubenswrapper[5129]: I0314 09:59:13.765420 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmz5z\" (UniqueName: \"kubernetes.io/projected/9814064c-6693-488c-acd2-d40db3f541dc-kube-api-access-dmz5z\") on node \"crc\" DevicePath \"\"" Mar 14 09:59:14 crc kubenswrapper[5129]: I0314 09:59:14.057165 
5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9814064c-6693-488c-acd2-d40db3f541dc" path="/var/lib/kubelet/pods/9814064c-6693-488c-acd2-d40db3f541dc/volumes" Mar 14 09:59:28 crc kubenswrapper[5129]: I0314 09:59:28.048284 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:59:28 crc kubenswrapper[5129]: E0314 09:59:28.049118 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:59:43 crc kubenswrapper[5129]: I0314 09:59:43.036275 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:59:43 crc kubenswrapper[5129]: E0314 09:59:43.037133 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 09:59:55 crc kubenswrapper[5129]: I0314 09:59:55.037574 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 09:59:55 crc kubenswrapper[5129]: E0314 09:59:55.039070 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.159969 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558040-6bqj2"] Mar 14 10:00:00 crc kubenswrapper[5129]: E0314 10:00:00.161375 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9814064c-6693-488c-acd2-d40db3f541dc" containerName="swift-ring-rebalance" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.161430 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="9814064c-6693-488c-acd2-d40db3f541dc" containerName="swift-ring-rebalance" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.161806 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="9814064c-6693-488c-acd2-d40db3f541dc" containerName="swift-ring-rebalance" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.162979 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558040-6bqj2" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.167491 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.167633 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.168208 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.176667 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs"] Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.178591 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.181538 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.181945 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.191524 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558040-6bqj2"] Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.197970 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm4nb\" (UniqueName: \"kubernetes.io/projected/2fbb2ad7-729a-418f-a7be-e8db9ab1bb15-kube-api-access-cm4nb\") pod \"auto-csr-approver-29558040-6bqj2\" (UID: \"2fbb2ad7-729a-418f-a7be-e8db9ab1bb15\") " 
pod="openshift-infra/auto-csr-approver-29558040-6bqj2" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.198094 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr2p5\" (UniqueName: \"kubernetes.io/projected/38749edd-4240-47f4-b744-227e1cfee8e4-kube-api-access-qr2p5\") pod \"collect-profiles-29558040-tnfxs\" (UID: \"38749edd-4240-47f4-b744-227e1cfee8e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.198239 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38749edd-4240-47f4-b744-227e1cfee8e4-secret-volume\") pod \"collect-profiles-29558040-tnfxs\" (UID: \"38749edd-4240-47f4-b744-227e1cfee8e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.198269 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38749edd-4240-47f4-b744-227e1cfee8e4-config-volume\") pod \"collect-profiles-29558040-tnfxs\" (UID: \"38749edd-4240-47f4-b744-227e1cfee8e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.205795 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs"] Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.300406 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38749edd-4240-47f4-b744-227e1cfee8e4-secret-volume\") pod \"collect-profiles-29558040-tnfxs\" (UID: \"38749edd-4240-47f4-b744-227e1cfee8e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs" Mar 
14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.300480 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38749edd-4240-47f4-b744-227e1cfee8e4-config-volume\") pod \"collect-profiles-29558040-tnfxs\" (UID: \"38749edd-4240-47f4-b744-227e1cfee8e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.300560 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm4nb\" (UniqueName: \"kubernetes.io/projected/2fbb2ad7-729a-418f-a7be-e8db9ab1bb15-kube-api-access-cm4nb\") pod \"auto-csr-approver-29558040-6bqj2\" (UID: \"2fbb2ad7-729a-418f-a7be-e8db9ab1bb15\") " pod="openshift-infra/auto-csr-approver-29558040-6bqj2" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.300674 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr2p5\" (UniqueName: \"kubernetes.io/projected/38749edd-4240-47f4-b744-227e1cfee8e4-kube-api-access-qr2p5\") pod \"collect-profiles-29558040-tnfxs\" (UID: \"38749edd-4240-47f4-b744-227e1cfee8e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.301674 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38749edd-4240-47f4-b744-227e1cfee8e4-config-volume\") pod \"collect-profiles-29558040-tnfxs\" (UID: \"38749edd-4240-47f4-b744-227e1cfee8e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.306671 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38749edd-4240-47f4-b744-227e1cfee8e4-secret-volume\") pod \"collect-profiles-29558040-tnfxs\" (UID: 
\"38749edd-4240-47f4-b744-227e1cfee8e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.317527 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm4nb\" (UniqueName: \"kubernetes.io/projected/2fbb2ad7-729a-418f-a7be-e8db9ab1bb15-kube-api-access-cm4nb\") pod \"auto-csr-approver-29558040-6bqj2\" (UID: \"2fbb2ad7-729a-418f-a7be-e8db9ab1bb15\") " pod="openshift-infra/auto-csr-approver-29558040-6bqj2" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.319461 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr2p5\" (UniqueName: \"kubernetes.io/projected/38749edd-4240-47f4-b744-227e1cfee8e4-kube-api-access-qr2p5\") pod \"collect-profiles-29558040-tnfxs\" (UID: \"38749edd-4240-47f4-b744-227e1cfee8e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.493687 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558040-6bqj2" Mar 14 10:00:00 crc kubenswrapper[5129]: I0314 10:00:00.503902 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs" Mar 14 10:00:01 crc kubenswrapper[5129]: I0314 10:00:01.033907 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558040-6bqj2"] Mar 14 10:00:01 crc kubenswrapper[5129]: I0314 10:00:01.040639 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 10:00:01 crc kubenswrapper[5129]: I0314 10:00:01.173129 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs"] Mar 14 10:00:01 crc kubenswrapper[5129]: W0314 10:00:01.174183 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38749edd_4240_47f4_b744_227e1cfee8e4.slice/crio-0a8015b7e5238c28f10574b7b421134c375a0bae2e4e98cfb828bc7efc8e54a2 WatchSource:0}: Error finding container 0a8015b7e5238c28f10574b7b421134c375a0bae2e4e98cfb828bc7efc8e54a2: Status 404 returned error can't find the container with id 0a8015b7e5238c28f10574b7b421134c375a0bae2e4e98cfb828bc7efc8e54a2 Mar 14 10:00:01 crc kubenswrapper[5129]: I0314 10:00:01.364555 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558040-6bqj2" event={"ID":"2fbb2ad7-729a-418f-a7be-e8db9ab1bb15","Type":"ContainerStarted","Data":"a0027f21d1929e2692670d4167822670e1bb7a38f2c0711ef20173d0e894063b"} Mar 14 10:00:01 crc kubenswrapper[5129]: I0314 10:00:01.367054 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs" event={"ID":"38749edd-4240-47f4-b744-227e1cfee8e4","Type":"ContainerStarted","Data":"1278758130bfc7c6566c3fcb95afbe01ccc6940e11bc660cee618d8eefb4e058"} Mar 14 10:00:01 crc kubenswrapper[5129]: I0314 10:00:01.367113 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs" event={"ID":"38749edd-4240-47f4-b744-227e1cfee8e4","Type":"ContainerStarted","Data":"0a8015b7e5238c28f10574b7b421134c375a0bae2e4e98cfb828bc7efc8e54a2"} Mar 14 10:00:01 crc kubenswrapper[5129]: I0314 10:00:01.395911 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs" podStartSLOduration=1.3958883229999999 podStartE2EDuration="1.395888323s" podCreationTimestamp="2026-03-14 10:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:00:01.387115554 +0000 UTC m=+10864.139030738" watchObservedRunningTime="2026-03-14 10:00:01.395888323 +0000 UTC m=+10864.147803507" Mar 14 10:00:02 crc kubenswrapper[5129]: I0314 10:00:02.377237 5129 generic.go:334] "Generic (PLEG): container finished" podID="38749edd-4240-47f4-b744-227e1cfee8e4" containerID="1278758130bfc7c6566c3fcb95afbe01ccc6940e11bc660cee618d8eefb4e058" exitCode=0 Mar 14 10:00:02 crc kubenswrapper[5129]: I0314 10:00:02.377358 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs" event={"ID":"38749edd-4240-47f4-b744-227e1cfee8e4","Type":"ContainerDied","Data":"1278758130bfc7c6566c3fcb95afbe01ccc6940e11bc660cee618d8eefb4e058"} Mar 14 10:00:03 crc kubenswrapper[5129]: I0314 10:00:03.989768 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs" Mar 14 10:00:04 crc kubenswrapper[5129]: I0314 10:00:04.091050 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38749edd-4240-47f4-b744-227e1cfee8e4-config-volume\") pod \"38749edd-4240-47f4-b744-227e1cfee8e4\" (UID: \"38749edd-4240-47f4-b744-227e1cfee8e4\") " Mar 14 10:00:04 crc kubenswrapper[5129]: I0314 10:00:04.091278 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr2p5\" (UniqueName: \"kubernetes.io/projected/38749edd-4240-47f4-b744-227e1cfee8e4-kube-api-access-qr2p5\") pod \"38749edd-4240-47f4-b744-227e1cfee8e4\" (UID: \"38749edd-4240-47f4-b744-227e1cfee8e4\") " Mar 14 10:00:04 crc kubenswrapper[5129]: I0314 10:00:04.091337 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38749edd-4240-47f4-b744-227e1cfee8e4-secret-volume\") pod \"38749edd-4240-47f4-b744-227e1cfee8e4\" (UID: \"38749edd-4240-47f4-b744-227e1cfee8e4\") " Mar 14 10:00:04 crc kubenswrapper[5129]: I0314 10:00:04.091798 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38749edd-4240-47f4-b744-227e1cfee8e4-config-volume" (OuterVolumeSpecName: "config-volume") pod "38749edd-4240-47f4-b744-227e1cfee8e4" (UID: "38749edd-4240-47f4-b744-227e1cfee8e4"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:00:04 crc kubenswrapper[5129]: I0314 10:00:04.093126 5129 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38749edd-4240-47f4-b744-227e1cfee8e4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 10:00:04 crc kubenswrapper[5129]: I0314 10:00:04.101407 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38749edd-4240-47f4-b744-227e1cfee8e4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "38749edd-4240-47f4-b744-227e1cfee8e4" (UID: "38749edd-4240-47f4-b744-227e1cfee8e4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:00:04 crc kubenswrapper[5129]: I0314 10:00:04.104272 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38749edd-4240-47f4-b744-227e1cfee8e4-kube-api-access-qr2p5" (OuterVolumeSpecName: "kube-api-access-qr2p5") pod "38749edd-4240-47f4-b744-227e1cfee8e4" (UID: "38749edd-4240-47f4-b744-227e1cfee8e4"). InnerVolumeSpecName "kube-api-access-qr2p5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:00:04 crc kubenswrapper[5129]: I0314 10:00:04.195036 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr2p5\" (UniqueName: \"kubernetes.io/projected/38749edd-4240-47f4-b744-227e1cfee8e4-kube-api-access-qr2p5\") on node \"crc\" DevicePath \"\"" Mar 14 10:00:04 crc kubenswrapper[5129]: I0314 10:00:04.195068 5129 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38749edd-4240-47f4-b744-227e1cfee8e4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 10:00:04 crc kubenswrapper[5129]: I0314 10:00:04.254148 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm"] Mar 14 10:00:04 crc kubenswrapper[5129]: I0314 10:00:04.265942 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557995-wlwrm"] Mar 14 10:00:04 crc kubenswrapper[5129]: I0314 10:00:04.410726 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs" event={"ID":"38749edd-4240-47f4-b744-227e1cfee8e4","Type":"ContainerDied","Data":"0a8015b7e5238c28f10574b7b421134c375a0bae2e4e98cfb828bc7efc8e54a2"} Mar 14 10:00:04 crc kubenswrapper[5129]: I0314 10:00:04.410783 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs" Mar 14 10:00:04 crc kubenswrapper[5129]: I0314 10:00:04.410808 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a8015b7e5238c28f10574b7b421134c375a0bae2e4e98cfb828bc7efc8e54a2" Mar 14 10:00:06 crc kubenswrapper[5129]: I0314 10:00:06.051482 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="631a6666-44c1-4fdf-8624-a24d98b17cd0" path="/var/lib/kubelet/pods/631a6666-44c1-4fdf-8624-a24d98b17cd0/volumes" Mar 14 10:00:06 crc kubenswrapper[5129]: I0314 10:00:06.442684 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558040-6bqj2" event={"ID":"2fbb2ad7-729a-418f-a7be-e8db9ab1bb15","Type":"ContainerStarted","Data":"665bee379c6ae7e4c2953e5bf670794a88b968eefe0f434b944fbff91557e6bd"} Mar 14 10:00:06 crc kubenswrapper[5129]: I0314 10:00:06.474171 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558040-6bqj2" podStartSLOduration=1.520695844 podStartE2EDuration="6.47414801s" podCreationTimestamp="2026-03-14 10:00:00 +0000 UTC" firstStartedPulling="2026-03-14 10:00:01.040402722 +0000 UTC m=+10863.792317906" lastFinishedPulling="2026-03-14 10:00:05.993854888 +0000 UTC m=+10868.745770072" observedRunningTime="2026-03-14 10:00:06.460319575 +0000 UTC m=+10869.212234769" watchObservedRunningTime="2026-03-14 10:00:06.47414801 +0000 UTC m=+10869.226063204" Mar 14 10:00:07 crc kubenswrapper[5129]: I0314 10:00:07.453739 5129 generic.go:334] "Generic (PLEG): container finished" podID="2fbb2ad7-729a-418f-a7be-e8db9ab1bb15" containerID="665bee379c6ae7e4c2953e5bf670794a88b968eefe0f434b944fbff91557e6bd" exitCode=0 Mar 14 10:00:07 crc kubenswrapper[5129]: I0314 10:00:07.453785 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558040-6bqj2" 
event={"ID":"2fbb2ad7-729a-418f-a7be-e8db9ab1bb15","Type":"ContainerDied","Data":"665bee379c6ae7e4c2953e5bf670794a88b968eefe0f434b944fbff91557e6bd"} Mar 14 10:00:08 crc kubenswrapper[5129]: I0314 10:00:08.053345 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 10:00:08 crc kubenswrapper[5129]: E0314 10:00:08.053735 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:00:09 crc kubenswrapper[5129]: I0314 10:00:09.019004 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558040-6bqj2" Mar 14 10:00:09 crc kubenswrapper[5129]: I0314 10:00:09.101971 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm4nb\" (UniqueName: \"kubernetes.io/projected/2fbb2ad7-729a-418f-a7be-e8db9ab1bb15-kube-api-access-cm4nb\") pod \"2fbb2ad7-729a-418f-a7be-e8db9ab1bb15\" (UID: \"2fbb2ad7-729a-418f-a7be-e8db9ab1bb15\") " Mar 14 10:00:09 crc kubenswrapper[5129]: I0314 10:00:09.112774 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fbb2ad7-729a-418f-a7be-e8db9ab1bb15-kube-api-access-cm4nb" (OuterVolumeSpecName: "kube-api-access-cm4nb") pod "2fbb2ad7-729a-418f-a7be-e8db9ab1bb15" (UID: "2fbb2ad7-729a-418f-a7be-e8db9ab1bb15"). InnerVolumeSpecName "kube-api-access-cm4nb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:00:09 crc kubenswrapper[5129]: I0314 10:00:09.205341 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm4nb\" (UniqueName: \"kubernetes.io/projected/2fbb2ad7-729a-418f-a7be-e8db9ab1bb15-kube-api-access-cm4nb\") on node \"crc\" DevicePath \"\"" Mar 14 10:00:09 crc kubenswrapper[5129]: I0314 10:00:09.484940 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558040-6bqj2" event={"ID":"2fbb2ad7-729a-418f-a7be-e8db9ab1bb15","Type":"ContainerDied","Data":"a0027f21d1929e2692670d4167822670e1bb7a38f2c0711ef20173d0e894063b"} Mar 14 10:00:09 crc kubenswrapper[5129]: I0314 10:00:09.485229 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0027f21d1929e2692670d4167822670e1bb7a38f2c0711ef20173d0e894063b" Mar 14 10:00:09 crc kubenswrapper[5129]: I0314 10:00:09.485336 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558040-6bqj2" Mar 14 10:00:09 crc kubenswrapper[5129]: I0314 10:00:09.535154 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558034-zbj58"] Mar 14 10:00:09 crc kubenswrapper[5129]: I0314 10:00:09.548250 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558034-zbj58"] Mar 14 10:00:10 crc kubenswrapper[5129]: I0314 10:00:10.050753 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1f00c80-6977-43bf-9cb7-b96309f47f59" path="/var/lib/kubelet/pods/f1f00c80-6977-43bf-9cb7-b96309f47f59/volumes" Mar 14 10:00:13 crc kubenswrapper[5129]: I0314 10:00:13.759905 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-2zhst"] Mar 14 10:00:13 crc kubenswrapper[5129]: E0314 10:00:13.760991 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2fbb2ad7-729a-418f-a7be-e8db9ab1bb15" containerName="oc" Mar 14 10:00:13 crc kubenswrapper[5129]: I0314 10:00:13.761017 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fbb2ad7-729a-418f-a7be-e8db9ab1bb15" containerName="oc" Mar 14 10:00:13 crc kubenswrapper[5129]: E0314 10:00:13.761053 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38749edd-4240-47f4-b744-227e1cfee8e4" containerName="collect-profiles" Mar 14 10:00:13 crc kubenswrapper[5129]: I0314 10:00:13.761064 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="38749edd-4240-47f4-b744-227e1cfee8e4" containerName="collect-profiles" Mar 14 10:00:13 crc kubenswrapper[5129]: I0314 10:00:13.761381 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fbb2ad7-729a-418f-a7be-e8db9ab1bb15" containerName="oc" Mar 14 10:00:13 crc kubenswrapper[5129]: I0314 10:00:13.761424 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="38749edd-4240-47f4-b744-227e1cfee8e4" containerName="collect-profiles" Mar 14 10:00:13 crc kubenswrapper[5129]: I0314 10:00:13.762501 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:13 crc kubenswrapper[5129]: I0314 10:00:13.766293 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 10:00:13 crc kubenswrapper[5129]: I0314 10:00:13.766317 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 10:00:13 crc kubenswrapper[5129]: I0314 10:00:13.776491 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-2zhst"] Mar 14 10:00:13 crc kubenswrapper[5129]: I0314 10:00:13.904651 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b8f797-6510-487a-ba50-ead7e8768dff-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:13 crc kubenswrapper[5129]: I0314 10:00:13.904846 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7b8f797-6510-487a-ba50-ead7e8768dff-scripts\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:13 crc kubenswrapper[5129]: I0314 10:00:13.904926 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a7b8f797-6510-487a-ba50-ead7e8768dff-swiftconf\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:13 crc kubenswrapper[5129]: I0314 10:00:13.905054 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/a7b8f797-6510-487a-ba50-ead7e8768dff-ring-data-devices\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:13 crc kubenswrapper[5129]: I0314 10:00:13.905153 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a7b8f797-6510-487a-ba50-ead7e8768dff-dispersionconf\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:13 crc kubenswrapper[5129]: I0314 10:00:13.905802 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a7b8f797-6510-487a-ba50-ead7e8768dff-etc-swift\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:13 crc kubenswrapper[5129]: I0314 10:00:13.906129 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwncx\" (UniqueName: \"kubernetes.io/projected/a7b8f797-6510-487a-ba50-ead7e8768dff-kube-api-access-lwncx\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:14 crc kubenswrapper[5129]: I0314 10:00:14.008869 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwncx\" (UniqueName: \"kubernetes.io/projected/a7b8f797-6510-487a-ba50-ead7e8768dff-kube-api-access-lwncx\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:14 crc kubenswrapper[5129]: I0314 10:00:14.009077 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b8f797-6510-487a-ba50-ead7e8768dff-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:14 crc kubenswrapper[5129]: I0314 10:00:14.009281 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7b8f797-6510-487a-ba50-ead7e8768dff-scripts\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:14 crc kubenswrapper[5129]: I0314 10:00:14.009393 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a7b8f797-6510-487a-ba50-ead7e8768dff-swiftconf\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:14 crc kubenswrapper[5129]: I0314 10:00:14.009577 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a7b8f797-6510-487a-ba50-ead7e8768dff-ring-data-devices\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:14 crc kubenswrapper[5129]: I0314 10:00:14.009637 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a7b8f797-6510-487a-ba50-ead7e8768dff-dispersionconf\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:14 crc kubenswrapper[5129]: I0314 10:00:14.009713 5129 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a7b8f797-6510-487a-ba50-ead7e8768dff-etc-swift\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:14 crc kubenswrapper[5129]: I0314 10:00:14.010045 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7b8f797-6510-487a-ba50-ead7e8768dff-scripts\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:14 crc kubenswrapper[5129]: I0314 10:00:14.010419 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a7b8f797-6510-487a-ba50-ead7e8768dff-etc-swift\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:14 crc kubenswrapper[5129]: I0314 10:00:14.011085 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a7b8f797-6510-487a-ba50-ead7e8768dff-ring-data-devices\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:14 crc kubenswrapper[5129]: I0314 10:00:14.018783 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b8f797-6510-487a-ba50-ead7e8768dff-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:14 crc kubenswrapper[5129]: I0314 10:00:14.021333 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/a7b8f797-6510-487a-ba50-ead7e8768dff-dispersionconf\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:14 crc kubenswrapper[5129]: I0314 10:00:14.024859 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a7b8f797-6510-487a-ba50-ead7e8768dff-swiftconf\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:14 crc kubenswrapper[5129]: I0314 10:00:14.029522 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwncx\" (UniqueName: \"kubernetes.io/projected/a7b8f797-6510-487a-ba50-ead7e8768dff-kube-api-access-lwncx\") pod \"swift-ring-rebalance-debug-2zhst\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:14 crc kubenswrapper[5129]: I0314 10:00:14.113925 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:14 crc kubenswrapper[5129]: I0314 10:00:14.635501 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-2zhst"] Mar 14 10:00:14 crc kubenswrapper[5129]: W0314 10:00:14.637734 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7b8f797_6510_487a_ba50_ead7e8768dff.slice/crio-d9dbced9120171edc7b926cf10816c2f40657bef05834fc5a1bf93c0d52d0140 WatchSource:0}: Error finding container d9dbced9120171edc7b926cf10816c2f40657bef05834fc5a1bf93c0d52d0140: Status 404 returned error can't find the container with id d9dbced9120171edc7b926cf10816c2f40657bef05834fc5a1bf93c0d52d0140 Mar 14 10:00:15 crc kubenswrapper[5129]: I0314 10:00:15.574660 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-2zhst" event={"ID":"a7b8f797-6510-487a-ba50-ead7e8768dff","Type":"ContainerStarted","Data":"03571b17cb0547261e15b98f55bae0f240fba5eaeb90ada894e4b42479455bc1"} Mar 14 10:00:15 crc kubenswrapper[5129]: I0314 10:00:15.575068 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-2zhst" event={"ID":"a7b8f797-6510-487a-ba50-ead7e8768dff","Type":"ContainerStarted","Data":"d9dbced9120171edc7b926cf10816c2f40657bef05834fc5a1bf93c0d52d0140"} Mar 14 10:00:15 crc kubenswrapper[5129]: I0314 10:00:15.608498 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-2zhst" podStartSLOduration=2.608478988 podStartE2EDuration="2.608478988s" podCreationTimestamp="2026-03-14 10:00:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:00:15.595924957 +0000 UTC m=+10878.347840171" watchObservedRunningTime="2026-03-14 10:00:15.608478988 +0000 UTC m=+10878.360394182" 
Mar 14 10:00:21 crc kubenswrapper[5129]: I0314 10:00:21.038993 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 10:00:21 crc kubenswrapper[5129]: I0314 10:00:21.656949 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"41973fc889c20df973d49b23f7c22998ca1537f65950641326a14bf146e131d6"} Mar 14 10:00:23 crc kubenswrapper[5129]: I0314 10:00:23.683501 5129 generic.go:334] "Generic (PLEG): container finished" podID="a7b8f797-6510-487a-ba50-ead7e8768dff" containerID="03571b17cb0547261e15b98f55bae0f240fba5eaeb90ada894e4b42479455bc1" exitCode=0 Mar 14 10:00:23 crc kubenswrapper[5129]: I0314 10:00:23.683663 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-2zhst" event={"ID":"a7b8f797-6510-487a-ba50-ead7e8768dff","Type":"ContainerDied","Data":"03571b17cb0547261e15b98f55bae0f240fba5eaeb90ada894e4b42479455bc1"} Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.594048 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.650660 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-2zhst"] Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.662053 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-2zhst"] Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.689728 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7b8f797-6510-487a-ba50-ead7e8768dff-scripts\") pod \"a7b8f797-6510-487a-ba50-ead7e8768dff\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.689816 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b8f797-6510-487a-ba50-ead7e8768dff-combined-ca-bundle\") pod \"a7b8f797-6510-487a-ba50-ead7e8768dff\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.689850 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a7b8f797-6510-487a-ba50-ead7e8768dff-swiftconf\") pod \"a7b8f797-6510-487a-ba50-ead7e8768dff\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.689913 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwncx\" (UniqueName: \"kubernetes.io/projected/a7b8f797-6510-487a-ba50-ead7e8768dff-kube-api-access-lwncx\") pod \"a7b8f797-6510-487a-ba50-ead7e8768dff\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.690877 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/a7b8f797-6510-487a-ba50-ead7e8768dff-ring-data-devices\") pod \"a7b8f797-6510-487a-ba50-ead7e8768dff\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.690971 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a7b8f797-6510-487a-ba50-ead7e8768dff-dispersionconf\") pod \"a7b8f797-6510-487a-ba50-ead7e8768dff\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.691145 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a7b8f797-6510-487a-ba50-ead7e8768dff-etc-swift\") pod \"a7b8f797-6510-487a-ba50-ead7e8768dff\" (UID: \"a7b8f797-6510-487a-ba50-ead7e8768dff\") " Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.691984 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7b8f797-6510-487a-ba50-ead7e8768dff-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a7b8f797-6510-487a-ba50-ead7e8768dff" (UID: "a7b8f797-6510-487a-ba50-ead7e8768dff"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.698023 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7b8f797-6510-487a-ba50-ead7e8768dff-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a7b8f797-6510-487a-ba50-ead7e8768dff" (UID: "a7b8f797-6510-487a-ba50-ead7e8768dff"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.698796 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7b8f797-6510-487a-ba50-ead7e8768dff-kube-api-access-lwncx" (OuterVolumeSpecName: "kube-api-access-lwncx") pod "a7b8f797-6510-487a-ba50-ead7e8768dff" (UID: "a7b8f797-6510-487a-ba50-ead7e8768dff"). InnerVolumeSpecName "kube-api-access-lwncx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.708401 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9dbced9120171edc7b926cf10816c2f40657bef05834fc5a1bf93c0d52d0140" Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.708479 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-2zhst" Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.730634 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7b8f797-6510-487a-ba50-ead7e8768dff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7b8f797-6510-487a-ba50-ead7e8768dff" (UID: "a7b8f797-6510-487a-ba50-ead7e8768dff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.731525 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7b8f797-6510-487a-ba50-ead7e8768dff-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a7b8f797-6510-487a-ba50-ead7e8768dff" (UID: "a7b8f797-6510-487a-ba50-ead7e8768dff"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.736056 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7b8f797-6510-487a-ba50-ead7e8768dff-scripts" (OuterVolumeSpecName: "scripts") pod "a7b8f797-6510-487a-ba50-ead7e8768dff" (UID: "a7b8f797-6510-487a-ba50-ead7e8768dff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.753747 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7b8f797-6510-487a-ba50-ead7e8768dff-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a7b8f797-6510-487a-ba50-ead7e8768dff" (UID: "a7b8f797-6510-487a-ba50-ead7e8768dff"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.793918 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7b8f797-6510-487a-ba50-ead7e8768dff-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.793972 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b8f797-6510-487a-ba50-ead7e8768dff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.794033 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a7b8f797-6510-487a-ba50-ead7e8768dff-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.794047 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwncx\" (UniqueName: \"kubernetes.io/projected/a7b8f797-6510-487a-ba50-ead7e8768dff-kube-api-access-lwncx\") on node \"crc\" DevicePath \"\"" Mar 14 10:00:25 crc 
kubenswrapper[5129]: I0314 10:00:25.794061 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a7b8f797-6510-487a-ba50-ead7e8768dff-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.794073 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a7b8f797-6510-487a-ba50-ead7e8768dff-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:00:25 crc kubenswrapper[5129]: I0314 10:00:25.794083 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a7b8f797-6510-487a-ba50-ead7e8768dff-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 10:00:26 crc kubenswrapper[5129]: I0314 10:00:26.054242 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7b8f797-6510-487a-ba50-ead7e8768dff" path="/var/lib/kubelet/pods/a7b8f797-6510-487a-ba50-ead7e8768dff/volumes" Mar 14 10:00:38 crc kubenswrapper[5129]: I0314 10:00:38.920056 5129 scope.go:117] "RemoveContainer" containerID="3abd1062ee36117426b22332f9496a68e9035f34af79d86307178259309c1032" Mar 14 10:00:38 crc kubenswrapper[5129]: I0314 10:00:38.994432 5129 scope.go:117] "RemoveContainer" containerID="f1793d1eb809b9a7d523ecbacb2b6d2a88459ad7ae4c6cb13cec735cff050bb2" Mar 14 10:01:00 crc kubenswrapper[5129]: I0314 10:01:00.183641 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29558041-fprmt"] Mar 14 10:01:00 crc kubenswrapper[5129]: E0314 10:01:00.188533 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7b8f797-6510-487a-ba50-ead7e8768dff" containerName="swift-ring-rebalance" Mar 14 10:01:00 crc kubenswrapper[5129]: I0314 10:01:00.188723 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7b8f797-6510-487a-ba50-ead7e8768dff" containerName="swift-ring-rebalance" Mar 14 10:01:00 crc kubenswrapper[5129]: I0314 
10:01:00.189134 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7b8f797-6510-487a-ba50-ead7e8768dff" containerName="swift-ring-rebalance" Mar 14 10:01:00 crc kubenswrapper[5129]: I0314 10:01:00.190155 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29558041-fprmt" Mar 14 10:01:00 crc kubenswrapper[5129]: I0314 10:01:00.206455 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29558041-fprmt"] Mar 14 10:01:00 crc kubenswrapper[5129]: I0314 10:01:00.384707 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2815225-50bd-4399-a38d-732afc1e06be-combined-ca-bundle\") pod \"keystone-cron-29558041-fprmt\" (UID: \"d2815225-50bd-4399-a38d-732afc1e06be\") " pod="openstack/keystone-cron-29558041-fprmt" Mar 14 10:01:00 crc kubenswrapper[5129]: I0314 10:01:00.384980 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2815225-50bd-4399-a38d-732afc1e06be-config-data\") pod \"keystone-cron-29558041-fprmt\" (UID: \"d2815225-50bd-4399-a38d-732afc1e06be\") " pod="openstack/keystone-cron-29558041-fprmt" Mar 14 10:01:00 crc kubenswrapper[5129]: I0314 10:01:00.385190 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74dvz\" (UniqueName: \"kubernetes.io/projected/d2815225-50bd-4399-a38d-732afc1e06be-kube-api-access-74dvz\") pod \"keystone-cron-29558041-fprmt\" (UID: \"d2815225-50bd-4399-a38d-732afc1e06be\") " pod="openstack/keystone-cron-29558041-fprmt" Mar 14 10:01:00 crc kubenswrapper[5129]: I0314 10:01:00.385419 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/d2815225-50bd-4399-a38d-732afc1e06be-fernet-keys\") pod \"keystone-cron-29558041-fprmt\" (UID: \"d2815225-50bd-4399-a38d-732afc1e06be\") " pod="openstack/keystone-cron-29558041-fprmt" Mar 14 10:01:00 crc kubenswrapper[5129]: I0314 10:01:00.487529 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2815225-50bd-4399-a38d-732afc1e06be-config-data\") pod \"keystone-cron-29558041-fprmt\" (UID: \"d2815225-50bd-4399-a38d-732afc1e06be\") " pod="openstack/keystone-cron-29558041-fprmt" Mar 14 10:01:00 crc kubenswrapper[5129]: I0314 10:01:00.487789 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74dvz\" (UniqueName: \"kubernetes.io/projected/d2815225-50bd-4399-a38d-732afc1e06be-kube-api-access-74dvz\") pod \"keystone-cron-29558041-fprmt\" (UID: \"d2815225-50bd-4399-a38d-732afc1e06be\") " pod="openstack/keystone-cron-29558041-fprmt" Mar 14 10:01:00 crc kubenswrapper[5129]: I0314 10:01:00.487940 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2815225-50bd-4399-a38d-732afc1e06be-fernet-keys\") pod \"keystone-cron-29558041-fprmt\" (UID: \"d2815225-50bd-4399-a38d-732afc1e06be\") " pod="openstack/keystone-cron-29558041-fprmt" Mar 14 10:01:00 crc kubenswrapper[5129]: I0314 10:01:00.488078 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2815225-50bd-4399-a38d-732afc1e06be-combined-ca-bundle\") pod \"keystone-cron-29558041-fprmt\" (UID: \"d2815225-50bd-4399-a38d-732afc1e06be\") " pod="openstack/keystone-cron-29558041-fprmt" Mar 14 10:01:00 crc kubenswrapper[5129]: I0314 10:01:00.496083 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d2815225-50bd-4399-a38d-732afc1e06be-combined-ca-bundle\") pod \"keystone-cron-29558041-fprmt\" (UID: \"d2815225-50bd-4399-a38d-732afc1e06be\") " pod="openstack/keystone-cron-29558041-fprmt" Mar 14 10:01:00 crc kubenswrapper[5129]: I0314 10:01:00.497264 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2815225-50bd-4399-a38d-732afc1e06be-fernet-keys\") pod \"keystone-cron-29558041-fprmt\" (UID: \"d2815225-50bd-4399-a38d-732afc1e06be\") " pod="openstack/keystone-cron-29558041-fprmt" Mar 14 10:01:00 crc kubenswrapper[5129]: I0314 10:01:00.497885 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2815225-50bd-4399-a38d-732afc1e06be-config-data\") pod \"keystone-cron-29558041-fprmt\" (UID: \"d2815225-50bd-4399-a38d-732afc1e06be\") " pod="openstack/keystone-cron-29558041-fprmt" Mar 14 10:01:00 crc kubenswrapper[5129]: I0314 10:01:00.517735 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74dvz\" (UniqueName: \"kubernetes.io/projected/d2815225-50bd-4399-a38d-732afc1e06be-kube-api-access-74dvz\") pod \"keystone-cron-29558041-fprmt\" (UID: \"d2815225-50bd-4399-a38d-732afc1e06be\") " pod="openstack/keystone-cron-29558041-fprmt" Mar 14 10:01:00 crc kubenswrapper[5129]: I0314 10:01:00.531997 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29558041-fprmt" Mar 14 10:01:01 crc kubenswrapper[5129]: I0314 10:01:01.021305 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29558041-fprmt"] Mar 14 10:01:01 crc kubenswrapper[5129]: I0314 10:01:01.189124 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29558041-fprmt" event={"ID":"d2815225-50bd-4399-a38d-732afc1e06be","Type":"ContainerStarted","Data":"fdd774444c6ebf72520be82d2e05ec7b00d828a019ec62d27c0ba3f17ec3c03d"} Mar 14 10:01:02 crc kubenswrapper[5129]: I0314 10:01:02.208868 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29558041-fprmt" event={"ID":"d2815225-50bd-4399-a38d-732afc1e06be","Type":"ContainerStarted","Data":"198ec7470b2863cbd81f2513e06e91d2799ba6ee82dba9428cfb6c2441ff1087"} Mar 14 10:01:02 crc kubenswrapper[5129]: I0314 10:01:02.247431 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29558041-fprmt" podStartSLOduration=2.247402302 podStartE2EDuration="2.247402302s" podCreationTimestamp="2026-03-14 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:01:02.235230971 +0000 UTC m=+10924.987146215" watchObservedRunningTime="2026-03-14 10:01:02.247402302 +0000 UTC m=+10924.999317516" Mar 14 10:01:05 crc kubenswrapper[5129]: I0314 10:01:05.249959 5129 generic.go:334] "Generic (PLEG): container finished" podID="d2815225-50bd-4399-a38d-732afc1e06be" containerID="198ec7470b2863cbd81f2513e06e91d2799ba6ee82dba9428cfb6c2441ff1087" exitCode=0 Mar 14 10:01:05 crc kubenswrapper[5129]: I0314 10:01:05.250051 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29558041-fprmt" 
event={"ID":"d2815225-50bd-4399-a38d-732afc1e06be","Type":"ContainerDied","Data":"198ec7470b2863cbd81f2513e06e91d2799ba6ee82dba9428cfb6c2441ff1087"} Mar 14 10:01:06 crc kubenswrapper[5129]: I0314 10:01:06.970223 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29558041-fprmt" Mar 14 10:01:07 crc kubenswrapper[5129]: I0314 10:01:07.054321 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2815225-50bd-4399-a38d-732afc1e06be-fernet-keys\") pod \"d2815225-50bd-4399-a38d-732afc1e06be\" (UID: \"d2815225-50bd-4399-a38d-732afc1e06be\") " Mar 14 10:01:07 crc kubenswrapper[5129]: I0314 10:01:07.054785 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74dvz\" (UniqueName: \"kubernetes.io/projected/d2815225-50bd-4399-a38d-732afc1e06be-kube-api-access-74dvz\") pod \"d2815225-50bd-4399-a38d-732afc1e06be\" (UID: \"d2815225-50bd-4399-a38d-732afc1e06be\") " Mar 14 10:01:07 crc kubenswrapper[5129]: I0314 10:01:07.054928 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2815225-50bd-4399-a38d-732afc1e06be-combined-ca-bundle\") pod \"d2815225-50bd-4399-a38d-732afc1e06be\" (UID: \"d2815225-50bd-4399-a38d-732afc1e06be\") " Mar 14 10:01:07 crc kubenswrapper[5129]: I0314 10:01:07.055026 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2815225-50bd-4399-a38d-732afc1e06be-config-data\") pod \"d2815225-50bd-4399-a38d-732afc1e06be\" (UID: \"d2815225-50bd-4399-a38d-732afc1e06be\") " Mar 14 10:01:07 crc kubenswrapper[5129]: I0314 10:01:07.063344 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2815225-50bd-4399-a38d-732afc1e06be-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "d2815225-50bd-4399-a38d-732afc1e06be" (UID: "d2815225-50bd-4399-a38d-732afc1e06be"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:01:07 crc kubenswrapper[5129]: I0314 10:01:07.069714 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2815225-50bd-4399-a38d-732afc1e06be-kube-api-access-74dvz" (OuterVolumeSpecName: "kube-api-access-74dvz") pod "d2815225-50bd-4399-a38d-732afc1e06be" (UID: "d2815225-50bd-4399-a38d-732afc1e06be"). InnerVolumeSpecName "kube-api-access-74dvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:01:07 crc kubenswrapper[5129]: I0314 10:01:07.105142 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2815225-50bd-4399-a38d-732afc1e06be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2815225-50bd-4399-a38d-732afc1e06be" (UID: "d2815225-50bd-4399-a38d-732afc1e06be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:01:07 crc kubenswrapper[5129]: I0314 10:01:07.142666 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2815225-50bd-4399-a38d-732afc1e06be-config-data" (OuterVolumeSpecName: "config-data") pod "d2815225-50bd-4399-a38d-732afc1e06be" (UID: "d2815225-50bd-4399-a38d-732afc1e06be"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:01:07 crc kubenswrapper[5129]: I0314 10:01:07.158746 5129 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2815225-50bd-4399-a38d-732afc1e06be-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 10:01:07 crc kubenswrapper[5129]: I0314 10:01:07.158785 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74dvz\" (UniqueName: \"kubernetes.io/projected/d2815225-50bd-4399-a38d-732afc1e06be-kube-api-access-74dvz\") on node \"crc\" DevicePath \"\"" Mar 14 10:01:07 crc kubenswrapper[5129]: I0314 10:01:07.158809 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2815225-50bd-4399-a38d-732afc1e06be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 10:01:07 crc kubenswrapper[5129]: I0314 10:01:07.158820 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2815225-50bd-4399-a38d-732afc1e06be-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 10:01:07 crc kubenswrapper[5129]: I0314 10:01:07.276236 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29558041-fprmt" event={"ID":"d2815225-50bd-4399-a38d-732afc1e06be","Type":"ContainerDied","Data":"fdd774444c6ebf72520be82d2e05ec7b00d828a019ec62d27c0ba3f17ec3c03d"} Mar 14 10:01:07 crc kubenswrapper[5129]: I0314 10:01:07.276301 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdd774444c6ebf72520be82d2e05ec7b00d828a019ec62d27c0ba3f17ec3c03d" Mar 14 10:01:07 crc kubenswrapper[5129]: I0314 10:01:07.276344 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29558041-fprmt" Mar 14 10:01:25 crc kubenswrapper[5129]: I0314 10:01:25.817932 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-xw8vf"] Mar 14 10:01:25 crc kubenswrapper[5129]: E0314 10:01:25.819176 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2815225-50bd-4399-a38d-732afc1e06be" containerName="keystone-cron" Mar 14 10:01:25 crc kubenswrapper[5129]: I0314 10:01:25.819194 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2815225-50bd-4399-a38d-732afc1e06be" containerName="keystone-cron" Mar 14 10:01:25 crc kubenswrapper[5129]: I0314 10:01:25.819465 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2815225-50bd-4399-a38d-732afc1e06be" containerName="keystone-cron" Mar 14 10:01:25 crc kubenswrapper[5129]: I0314 10:01:25.820440 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:25 crc kubenswrapper[5129]: I0314 10:01:25.824109 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 10:01:25 crc kubenswrapper[5129]: I0314 10:01:25.824302 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 10:01:25 crc kubenswrapper[5129]: I0314 10:01:25.837860 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-xw8vf"] Mar 14 10:01:25 crc kubenswrapper[5129]: I0314 10:01:25.991271 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1402858f-5d6c-44a8-942f-bf99991f0b9a-dispersionconf\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:25 crc kubenswrapper[5129]: I0314 10:01:25.991680 
5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1402858f-5d6c-44a8-942f-bf99991f0b9a-swiftconf\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:25 crc kubenswrapper[5129]: I0314 10:01:25.991720 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1402858f-5d6c-44a8-942f-bf99991f0b9a-ring-data-devices\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:25 crc kubenswrapper[5129]: I0314 10:01:25.991762 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdnvx\" (UniqueName: \"kubernetes.io/projected/1402858f-5d6c-44a8-942f-bf99991f0b9a-kube-api-access-cdnvx\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:25 crc kubenswrapper[5129]: I0314 10:01:25.991835 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1402858f-5d6c-44a8-942f-bf99991f0b9a-scripts\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:25 crc kubenswrapper[5129]: I0314 10:01:25.991902 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1402858f-5d6c-44a8-942f-bf99991f0b9a-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " 
pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:25 crc kubenswrapper[5129]: I0314 10:01:25.991931 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1402858f-5d6c-44a8-942f-bf99991f0b9a-etc-swift\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:26 crc kubenswrapper[5129]: I0314 10:01:26.095074 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1402858f-5d6c-44a8-942f-bf99991f0b9a-dispersionconf\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:26 crc kubenswrapper[5129]: I0314 10:01:26.095127 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1402858f-5d6c-44a8-942f-bf99991f0b9a-swiftconf\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:26 crc kubenswrapper[5129]: I0314 10:01:26.095162 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1402858f-5d6c-44a8-942f-bf99991f0b9a-ring-data-devices\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:26 crc kubenswrapper[5129]: I0314 10:01:26.095202 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdnvx\" (UniqueName: \"kubernetes.io/projected/1402858f-5d6c-44a8-942f-bf99991f0b9a-kube-api-access-cdnvx\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " 
pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:26 crc kubenswrapper[5129]: I0314 10:01:26.095277 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1402858f-5d6c-44a8-942f-bf99991f0b9a-scripts\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:26 crc kubenswrapper[5129]: I0314 10:01:26.095331 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1402858f-5d6c-44a8-942f-bf99991f0b9a-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:26 crc kubenswrapper[5129]: I0314 10:01:26.095358 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1402858f-5d6c-44a8-942f-bf99991f0b9a-etc-swift\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:26 crc kubenswrapper[5129]: I0314 10:01:26.096453 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1402858f-5d6c-44a8-942f-bf99991f0b9a-etc-swift\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:26 crc kubenswrapper[5129]: I0314 10:01:26.096729 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1402858f-5d6c-44a8-942f-bf99991f0b9a-scripts\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:26 crc 
kubenswrapper[5129]: I0314 10:01:26.100029 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1402858f-5d6c-44a8-942f-bf99991f0b9a-ring-data-devices\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:26 crc kubenswrapper[5129]: I0314 10:01:26.104652 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1402858f-5d6c-44a8-942f-bf99991f0b9a-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:26 crc kubenswrapper[5129]: I0314 10:01:26.106572 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1402858f-5d6c-44a8-942f-bf99991f0b9a-dispersionconf\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:26 crc kubenswrapper[5129]: I0314 10:01:26.112101 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1402858f-5d6c-44a8-942f-bf99991f0b9a-swiftconf\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:26 crc kubenswrapper[5129]: I0314 10:01:26.114935 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdnvx\" (UniqueName: \"kubernetes.io/projected/1402858f-5d6c-44a8-942f-bf99991f0b9a-kube-api-access-cdnvx\") pod \"swift-ring-rebalance-debug-xw8vf\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:26 crc kubenswrapper[5129]: I0314 10:01:26.199699 
5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:26 crc kubenswrapper[5129]: I0314 10:01:26.770477 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-xw8vf"] Mar 14 10:01:27 crc kubenswrapper[5129]: I0314 10:01:27.516778 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-xw8vf" event={"ID":"1402858f-5d6c-44a8-942f-bf99991f0b9a","Type":"ContainerStarted","Data":"78102a913892a792348c5e7c7b510175c62ebf6a9602d647c99c5c436e2315d7"} Mar 14 10:01:27 crc kubenswrapper[5129]: I0314 10:01:27.518314 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-xw8vf" event={"ID":"1402858f-5d6c-44a8-942f-bf99991f0b9a","Type":"ContainerStarted","Data":"ee5eb700d25b37d8a2b6b7cc599fea7a825a8be0357fa378b340ce1ba2af7ed1"} Mar 14 10:01:27 crc kubenswrapper[5129]: I0314 10:01:27.543578 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-xw8vf" podStartSLOduration=2.543554888 podStartE2EDuration="2.543554888s" podCreationTimestamp="2026-03-14 10:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:01:27.539539679 +0000 UTC m=+10950.291454863" watchObservedRunningTime="2026-03-14 10:01:27.543554888 +0000 UTC m=+10950.295470072" Mar 14 10:01:36 crc kubenswrapper[5129]: I0314 10:01:36.866252 5129 generic.go:334] "Generic (PLEG): container finished" podID="1402858f-5d6c-44a8-942f-bf99991f0b9a" containerID="78102a913892a792348c5e7c7b510175c62ebf6a9602d647c99c5c436e2315d7" exitCode=0 Mar 14 10:01:36 crc kubenswrapper[5129]: I0314 10:01:36.866297 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-xw8vf" 
event={"ID":"1402858f-5d6c-44a8-942f-bf99991f0b9a","Type":"ContainerDied","Data":"78102a913892a792348c5e7c7b510175c62ebf6a9602d647c99c5c436e2315d7"} Mar 14 10:01:38 crc kubenswrapper[5129]: I0314 10:01:38.893390 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-xw8vf" event={"ID":"1402858f-5d6c-44a8-942f-bf99991f0b9a","Type":"ContainerDied","Data":"ee5eb700d25b37d8a2b6b7cc599fea7a825a8be0357fa378b340ce1ba2af7ed1"} Mar 14 10:01:38 crc kubenswrapper[5129]: I0314 10:01:38.893931 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee5eb700d25b37d8a2b6b7cc599fea7a825a8be0357fa378b340ce1ba2af7ed1" Mar 14 10:01:38 crc kubenswrapper[5129]: I0314 10:01:38.940493 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:38 crc kubenswrapper[5129]: I0314 10:01:38.988659 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-xw8vf"] Mar 14 10:01:38 crc kubenswrapper[5129]: I0314 10:01:38.998477 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-xw8vf"] Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.129687 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdnvx\" (UniqueName: \"kubernetes.io/projected/1402858f-5d6c-44a8-942f-bf99991f0b9a-kube-api-access-cdnvx\") pod \"1402858f-5d6c-44a8-942f-bf99991f0b9a\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.129830 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1402858f-5d6c-44a8-942f-bf99991f0b9a-ring-data-devices\") pod \"1402858f-5d6c-44a8-942f-bf99991f0b9a\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 
10:01:39.129890 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1402858f-5d6c-44a8-942f-bf99991f0b9a-dispersionconf\") pod \"1402858f-5d6c-44a8-942f-bf99991f0b9a\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.129961 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1402858f-5d6c-44a8-942f-bf99991f0b9a-swiftconf\") pod \"1402858f-5d6c-44a8-942f-bf99991f0b9a\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.129997 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1402858f-5d6c-44a8-942f-bf99991f0b9a-combined-ca-bundle\") pod \"1402858f-5d6c-44a8-942f-bf99991f0b9a\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.130031 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1402858f-5d6c-44a8-942f-bf99991f0b9a-scripts\") pod \"1402858f-5d6c-44a8-942f-bf99991f0b9a\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.130053 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1402858f-5d6c-44a8-942f-bf99991f0b9a-etc-swift\") pod \"1402858f-5d6c-44a8-942f-bf99991f0b9a\" (UID: \"1402858f-5d6c-44a8-942f-bf99991f0b9a\") " Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.130459 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1402858f-5d6c-44a8-942f-bf99991f0b9a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1402858f-5d6c-44a8-942f-bf99991f0b9a" 
(UID: "1402858f-5d6c-44a8-942f-bf99991f0b9a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.131355 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1402858f-5d6c-44a8-942f-bf99991f0b9a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1402858f-5d6c-44a8-942f-bf99991f0b9a" (UID: "1402858f-5d6c-44a8-942f-bf99991f0b9a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.137829 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1402858f-5d6c-44a8-942f-bf99991f0b9a-kube-api-access-cdnvx" (OuterVolumeSpecName: "kube-api-access-cdnvx") pod "1402858f-5d6c-44a8-942f-bf99991f0b9a" (UID: "1402858f-5d6c-44a8-942f-bf99991f0b9a"). InnerVolumeSpecName "kube-api-access-cdnvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.162138 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1402858f-5d6c-44a8-942f-bf99991f0b9a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1402858f-5d6c-44a8-942f-bf99991f0b9a" (UID: "1402858f-5d6c-44a8-942f-bf99991f0b9a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.171866 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1402858f-5d6c-44a8-942f-bf99991f0b9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1402858f-5d6c-44a8-942f-bf99991f0b9a" (UID: "1402858f-5d6c-44a8-942f-bf99991f0b9a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.175145 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1402858f-5d6c-44a8-942f-bf99991f0b9a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1402858f-5d6c-44a8-942f-bf99991f0b9a" (UID: "1402858f-5d6c-44a8-942f-bf99991f0b9a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.194365 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1402858f-5d6c-44a8-942f-bf99991f0b9a-scripts" (OuterVolumeSpecName: "scripts") pod "1402858f-5d6c-44a8-942f-bf99991f0b9a" (UID: "1402858f-5d6c-44a8-942f-bf99991f0b9a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.232478 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1402858f-5d6c-44a8-942f-bf99991f0b9a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.232511 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1402858f-5d6c-44a8-942f-bf99991f0b9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.232521 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1402858f-5d6c-44a8-942f-bf99991f0b9a-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.232529 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1402858f-5d6c-44a8-942f-bf99991f0b9a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 
10:01:39.232538 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdnvx\" (UniqueName: \"kubernetes.io/projected/1402858f-5d6c-44a8-942f-bf99991f0b9a-kube-api-access-cdnvx\") on node \"crc\" DevicePath \"\"" Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.232546 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1402858f-5d6c-44a8-942f-bf99991f0b9a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.232553 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1402858f-5d6c-44a8-942f-bf99991f0b9a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:01:39 crc kubenswrapper[5129]: I0314 10:01:39.903775 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-xw8vf" Mar 14 10:01:40 crc kubenswrapper[5129]: I0314 10:01:40.070584 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1402858f-5d6c-44a8-942f-bf99991f0b9a" path="/var/lib/kubelet/pods/1402858f-5d6c-44a8-942f-bf99991f0b9a/volumes" Mar 14 10:02:00 crc kubenswrapper[5129]: I0314 10:02:00.169490 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558042-qdxv8"] Mar 14 10:02:00 crc kubenswrapper[5129]: E0314 10:02:00.170741 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1402858f-5d6c-44a8-942f-bf99991f0b9a" containerName="swift-ring-rebalance" Mar 14 10:02:00 crc kubenswrapper[5129]: I0314 10:02:00.170759 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="1402858f-5d6c-44a8-942f-bf99991f0b9a" containerName="swift-ring-rebalance" Mar 14 10:02:00 crc kubenswrapper[5129]: I0314 10:02:00.171122 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="1402858f-5d6c-44a8-942f-bf99991f0b9a" 
containerName="swift-ring-rebalance" Mar 14 10:02:00 crc kubenswrapper[5129]: I0314 10:02:00.172046 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558042-qdxv8" Mar 14 10:02:00 crc kubenswrapper[5129]: I0314 10:02:00.175407 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:02:00 crc kubenswrapper[5129]: I0314 10:02:00.175452 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:02:00 crc kubenswrapper[5129]: I0314 10:02:00.175485 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:02:00 crc kubenswrapper[5129]: I0314 10:02:00.186708 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558042-qdxv8"] Mar 14 10:02:00 crc kubenswrapper[5129]: I0314 10:02:00.317170 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqcgd\" (UniqueName: \"kubernetes.io/projected/b7d5758a-a0b4-4478-8a0a-2d4f9c42011e-kube-api-access-jqcgd\") pod \"auto-csr-approver-29558042-qdxv8\" (UID: \"b7d5758a-a0b4-4478-8a0a-2d4f9c42011e\") " pod="openshift-infra/auto-csr-approver-29558042-qdxv8" Mar 14 10:02:00 crc kubenswrapper[5129]: I0314 10:02:00.419983 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqcgd\" (UniqueName: \"kubernetes.io/projected/b7d5758a-a0b4-4478-8a0a-2d4f9c42011e-kube-api-access-jqcgd\") pod \"auto-csr-approver-29558042-qdxv8\" (UID: \"b7d5758a-a0b4-4478-8a0a-2d4f9c42011e\") " pod="openshift-infra/auto-csr-approver-29558042-qdxv8" Mar 14 10:02:00 crc kubenswrapper[5129]: I0314 10:02:00.446987 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqcgd\" (UniqueName: 
\"kubernetes.io/projected/b7d5758a-a0b4-4478-8a0a-2d4f9c42011e-kube-api-access-jqcgd\") pod \"auto-csr-approver-29558042-qdxv8\" (UID: \"b7d5758a-a0b4-4478-8a0a-2d4f9c42011e\") " pod="openshift-infra/auto-csr-approver-29558042-qdxv8" Mar 14 10:02:00 crc kubenswrapper[5129]: I0314 10:02:00.501987 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558042-qdxv8" Mar 14 10:02:01 crc kubenswrapper[5129]: I0314 10:02:01.035904 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558042-qdxv8"] Mar 14 10:02:01 crc kubenswrapper[5129]: I0314 10:02:01.224585 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558042-qdxv8" event={"ID":"b7d5758a-a0b4-4478-8a0a-2d4f9c42011e","Type":"ContainerStarted","Data":"ac2998edacfc7338d9ea1711907e9c5307ef6cc36d3e6a965f59015809c844e6"} Mar 14 10:02:03 crc kubenswrapper[5129]: I0314 10:02:03.250059 5129 generic.go:334] "Generic (PLEG): container finished" podID="b7d5758a-a0b4-4478-8a0a-2d4f9c42011e" containerID="873f274cf5699cee45598cd08591c3572beb8bc4b01681ffa23ec5a614a4b29f" exitCode=0 Mar 14 10:02:03 crc kubenswrapper[5129]: I0314 10:02:03.250166 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558042-qdxv8" event={"ID":"b7d5758a-a0b4-4478-8a0a-2d4f9c42011e","Type":"ContainerDied","Data":"873f274cf5699cee45598cd08591c3572beb8bc4b01681ffa23ec5a614a4b29f"} Mar 14 10:02:04 crc kubenswrapper[5129]: I0314 10:02:04.844118 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558042-qdxv8" Mar 14 10:02:04 crc kubenswrapper[5129]: I0314 10:02:04.935955 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqcgd\" (UniqueName: \"kubernetes.io/projected/b7d5758a-a0b4-4478-8a0a-2d4f9c42011e-kube-api-access-jqcgd\") pod \"b7d5758a-a0b4-4478-8a0a-2d4f9c42011e\" (UID: \"b7d5758a-a0b4-4478-8a0a-2d4f9c42011e\") " Mar 14 10:02:04 crc kubenswrapper[5129]: I0314 10:02:04.965264 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d5758a-a0b4-4478-8a0a-2d4f9c42011e-kube-api-access-jqcgd" (OuterVolumeSpecName: "kube-api-access-jqcgd") pod "b7d5758a-a0b4-4478-8a0a-2d4f9c42011e" (UID: "b7d5758a-a0b4-4478-8a0a-2d4f9c42011e"). InnerVolumeSpecName "kube-api-access-jqcgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:02:05 crc kubenswrapper[5129]: I0314 10:02:05.037785 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqcgd\" (UniqueName: \"kubernetes.io/projected/b7d5758a-a0b4-4478-8a0a-2d4f9c42011e-kube-api-access-jqcgd\") on node \"crc\" DevicePath \"\"" Mar 14 10:02:05 crc kubenswrapper[5129]: I0314 10:02:05.274567 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558042-qdxv8" event={"ID":"b7d5758a-a0b4-4478-8a0a-2d4f9c42011e","Type":"ContainerDied","Data":"ac2998edacfc7338d9ea1711907e9c5307ef6cc36d3e6a965f59015809c844e6"} Mar 14 10:02:05 crc kubenswrapper[5129]: I0314 10:02:05.274623 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac2998edacfc7338d9ea1711907e9c5307ef6cc36d3e6a965f59015809c844e6" Mar 14 10:02:05 crc kubenswrapper[5129]: I0314 10:02:05.274681 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558042-qdxv8" Mar 14 10:02:05 crc kubenswrapper[5129]: I0314 10:02:05.938194 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558036-bvw5m"] Mar 14 10:02:05 crc kubenswrapper[5129]: I0314 10:02:05.952645 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558036-bvw5m"] Mar 14 10:02:06 crc kubenswrapper[5129]: I0314 10:02:06.055363 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb247d9c-ddba-4c65-9a64-cc5382227f14" path="/var/lib/kubelet/pods/bb247d9c-ddba-4c65-9a64-cc5382227f14/volumes" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.136884 5129 scope.go:117] "RemoveContainer" containerID="a97bd809eb10dc5907746765db0efedde2eb933e73ec0d722150b4ff9d080f21" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.138859 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-pcbrf"] Mar 14 10:02:39 crc kubenswrapper[5129]: E0314 10:02:39.139445 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d5758a-a0b4-4478-8a0a-2d4f9c42011e" containerName="oc" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.139461 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d5758a-a0b4-4478-8a0a-2d4f9c42011e" containerName="oc" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.139660 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d5758a-a0b4-4478-8a0a-2d4f9c42011e" containerName="oc" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.140406 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.145996 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.146438 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.186677 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-pcbrf"] Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.210805 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3ab21984-c9d6-49d1-8717-65d1a4702a01-dispersionconf\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.211026 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ab21984-c9d6-49d1-8717-65d1a4702a01-scripts\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.211122 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab21984-c9d6-49d1-8717-65d1a4702a01-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.211210 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6xths\" (UniqueName: \"kubernetes.io/projected/3ab21984-c9d6-49d1-8717-65d1a4702a01-kube-api-access-6xths\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.211317 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3ab21984-c9d6-49d1-8717-65d1a4702a01-etc-swift\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.211415 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3ab21984-c9d6-49d1-8717-65d1a4702a01-ring-data-devices\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.211495 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3ab21984-c9d6-49d1-8717-65d1a4702a01-swiftconf\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.313390 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3ab21984-c9d6-49d1-8717-65d1a4702a01-dispersionconf\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.313446 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ab21984-c9d6-49d1-8717-65d1a4702a01-scripts\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.313481 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab21984-c9d6-49d1-8717-65d1a4702a01-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.313503 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xths\" (UniqueName: \"kubernetes.io/projected/3ab21984-c9d6-49d1-8717-65d1a4702a01-kube-api-access-6xths\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.313558 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3ab21984-c9d6-49d1-8717-65d1a4702a01-etc-swift\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.313624 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3ab21984-c9d6-49d1-8717-65d1a4702a01-ring-data-devices\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.313644 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3ab21984-c9d6-49d1-8717-65d1a4702a01-swiftconf\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.314335 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ab21984-c9d6-49d1-8717-65d1a4702a01-scripts\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.315527 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3ab21984-c9d6-49d1-8717-65d1a4702a01-etc-swift\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.315812 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3ab21984-c9d6-49d1-8717-65d1a4702a01-ring-data-devices\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.325642 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab21984-c9d6-49d1-8717-65d1a4702a01-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.327439 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/3ab21984-c9d6-49d1-8717-65d1a4702a01-dispersionconf\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.329109 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3ab21984-c9d6-49d1-8717-65d1a4702a01-swiftconf\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.331984 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xths\" (UniqueName: \"kubernetes.io/projected/3ab21984-c9d6-49d1-8717-65d1a4702a01-kube-api-access-6xths\") pod \"swift-ring-rebalance-debug-pcbrf\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:39 crc kubenswrapper[5129]: I0314 10:02:39.547487 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:40 crc kubenswrapper[5129]: I0314 10:02:40.140187 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-pcbrf"] Mar 14 10:02:40 crc kubenswrapper[5129]: I0314 10:02:40.792372 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-pcbrf" event={"ID":"3ab21984-c9d6-49d1-8717-65d1a4702a01","Type":"ContainerStarted","Data":"11b3a7ec9bc256970ca66b7c8f49c7450feffd51107b68213aad376ee135355d"} Mar 14 10:02:40 crc kubenswrapper[5129]: I0314 10:02:40.792828 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-pcbrf" event={"ID":"3ab21984-c9d6-49d1-8717-65d1a4702a01","Type":"ContainerStarted","Data":"5584e0072cecd6971605b596bdbfd3628eac03874f74705be07cbb671249544a"} Mar 14 10:02:40 crc kubenswrapper[5129]: I0314 10:02:40.816377 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-pcbrf" podStartSLOduration=1.816356503 podStartE2EDuration="1.816356503s" podCreationTimestamp="2026-03-14 10:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:02:40.814997246 +0000 UTC m=+11023.566912430" watchObservedRunningTime="2026-03-14 10:02:40.816356503 +0000 UTC m=+11023.568271687" Mar 14 10:02:47 crc kubenswrapper[5129]: I0314 10:02:47.809739 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-snrrx"] Mar 14 10:02:47 crc kubenswrapper[5129]: I0314 10:02:47.812366 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-snrrx" Mar 14 10:02:47 crc kubenswrapper[5129]: I0314 10:02:47.822179 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-snrrx"] Mar 14 10:02:47 crc kubenswrapper[5129]: I0314 10:02:47.911037 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d67684ac-1f0f-4360-b805-8da062d036e3-utilities\") pod \"community-operators-snrrx\" (UID: \"d67684ac-1f0f-4360-b805-8da062d036e3\") " pod="openshift-marketplace/community-operators-snrrx" Mar 14 10:02:47 crc kubenswrapper[5129]: I0314 10:02:47.911092 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2grk5\" (UniqueName: \"kubernetes.io/projected/d67684ac-1f0f-4360-b805-8da062d036e3-kube-api-access-2grk5\") pod \"community-operators-snrrx\" (UID: \"d67684ac-1f0f-4360-b805-8da062d036e3\") " pod="openshift-marketplace/community-operators-snrrx" Mar 14 10:02:47 crc kubenswrapper[5129]: I0314 10:02:47.911231 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d67684ac-1f0f-4360-b805-8da062d036e3-catalog-content\") pod \"community-operators-snrrx\" (UID: \"d67684ac-1f0f-4360-b805-8da062d036e3\") " pod="openshift-marketplace/community-operators-snrrx" Mar 14 10:02:48 crc kubenswrapper[5129]: I0314 10:02:48.015325 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d67684ac-1f0f-4360-b805-8da062d036e3-catalog-content\") pod \"community-operators-snrrx\" (UID: \"d67684ac-1f0f-4360-b805-8da062d036e3\") " pod="openshift-marketplace/community-operators-snrrx" Mar 14 10:02:48 crc kubenswrapper[5129]: I0314 10:02:48.015537 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d67684ac-1f0f-4360-b805-8da062d036e3-utilities\") pod \"community-operators-snrrx\" (UID: \"d67684ac-1f0f-4360-b805-8da062d036e3\") " pod="openshift-marketplace/community-operators-snrrx" Mar 14 10:02:48 crc kubenswrapper[5129]: I0314 10:02:48.015594 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2grk5\" (UniqueName: \"kubernetes.io/projected/d67684ac-1f0f-4360-b805-8da062d036e3-kube-api-access-2grk5\") pod \"community-operators-snrrx\" (UID: \"d67684ac-1f0f-4360-b805-8da062d036e3\") " pod="openshift-marketplace/community-operators-snrrx" Mar 14 10:02:48 crc kubenswrapper[5129]: I0314 10:02:48.015883 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d67684ac-1f0f-4360-b805-8da062d036e3-catalog-content\") pod \"community-operators-snrrx\" (UID: \"d67684ac-1f0f-4360-b805-8da062d036e3\") " pod="openshift-marketplace/community-operators-snrrx" Mar 14 10:02:48 crc kubenswrapper[5129]: I0314 10:02:48.016173 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d67684ac-1f0f-4360-b805-8da062d036e3-utilities\") pod \"community-operators-snrrx\" (UID: \"d67684ac-1f0f-4360-b805-8da062d036e3\") " pod="openshift-marketplace/community-operators-snrrx" Mar 14 10:02:48 crc kubenswrapper[5129]: I0314 10:02:48.049432 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2grk5\" (UniqueName: \"kubernetes.io/projected/d67684ac-1f0f-4360-b805-8da062d036e3-kube-api-access-2grk5\") pod \"community-operators-snrrx\" (UID: \"d67684ac-1f0f-4360-b805-8da062d036e3\") " pod="openshift-marketplace/community-operators-snrrx" Mar 14 10:02:48 crc kubenswrapper[5129]: I0314 10:02:48.212456 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-snrrx" Mar 14 10:02:48 crc kubenswrapper[5129]: I0314 10:02:48.827271 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-snrrx"] Mar 14 10:02:48 crc kubenswrapper[5129]: I0314 10:02:48.891821 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snrrx" event={"ID":"d67684ac-1f0f-4360-b805-8da062d036e3","Type":"ContainerStarted","Data":"72196d940ef4824b0491cbbf999f23f0a810a7d7f0c02b4fd8c92a5e82af89f6"} Mar 14 10:02:49 crc kubenswrapper[5129]: I0314 10:02:49.575970 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:02:49 crc kubenswrapper[5129]: I0314 10:02:49.576036 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:02:49 crc kubenswrapper[5129]: I0314 10:02:49.910744 5129 generic.go:334] "Generic (PLEG): container finished" podID="d67684ac-1f0f-4360-b805-8da062d036e3" containerID="c44f5f123f456c449cd9028e6bbc97621dd8c4333d035d37972b75558b62f7eb" exitCode=0 Mar 14 10:02:49 crc kubenswrapper[5129]: I0314 10:02:49.911008 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snrrx" event={"ID":"d67684ac-1f0f-4360-b805-8da062d036e3","Type":"ContainerDied","Data":"c44f5f123f456c449cd9028e6bbc97621dd8c4333d035d37972b75558b62f7eb"} Mar 14 10:02:49 crc kubenswrapper[5129]: I0314 10:02:49.918714 5129 generic.go:334] "Generic 
(PLEG): container finished" podID="3ab21984-c9d6-49d1-8717-65d1a4702a01" containerID="11b3a7ec9bc256970ca66b7c8f49c7450feffd51107b68213aad376ee135355d" exitCode=0 Mar 14 10:02:49 crc kubenswrapper[5129]: I0314 10:02:49.918770 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-pcbrf" event={"ID":"3ab21984-c9d6-49d1-8717-65d1a4702a01","Type":"ContainerDied","Data":"11b3a7ec9bc256970ca66b7c8f49c7450feffd51107b68213aad376ee135355d"} Mar 14 10:02:50 crc kubenswrapper[5129]: I0314 10:02:50.939538 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snrrx" event={"ID":"d67684ac-1f0f-4360-b805-8da062d036e3","Type":"ContainerStarted","Data":"8bb799bfae9fbef5837e5aa1f93ceca39afd7c1dee0567787f1ecfe6c4251f41"} Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.160135 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.202519 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-pcbrf"] Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.230426 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3ab21984-c9d6-49d1-8717-65d1a4702a01-swiftconf\") pod \"3ab21984-c9d6-49d1-8717-65d1a4702a01\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.230689 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ab21984-c9d6-49d1-8717-65d1a4702a01-scripts\") pod \"3ab21984-c9d6-49d1-8717-65d1a4702a01\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.230927 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3ab21984-c9d6-49d1-8717-65d1a4702a01-dispersionconf\") pod \"3ab21984-c9d6-49d1-8717-65d1a4702a01\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.231449 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3ab21984-c9d6-49d1-8717-65d1a4702a01-ring-data-devices\") pod \"3ab21984-c9d6-49d1-8717-65d1a4702a01\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.231571 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3ab21984-c9d6-49d1-8717-65d1a4702a01-etc-swift\") pod \"3ab21984-c9d6-49d1-8717-65d1a4702a01\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.231695 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xths\" (UniqueName: \"kubernetes.io/projected/3ab21984-c9d6-49d1-8717-65d1a4702a01-kube-api-access-6xths\") pod \"3ab21984-c9d6-49d1-8717-65d1a4702a01\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.231818 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab21984-c9d6-49d1-8717-65d1a4702a01-combined-ca-bundle\") pod \"3ab21984-c9d6-49d1-8717-65d1a4702a01\" (UID: \"3ab21984-c9d6-49d1-8717-65d1a4702a01\") " Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.233054 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ab21984-c9d6-49d1-8717-65d1a4702a01-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3ab21984-c9d6-49d1-8717-65d1a4702a01" (UID: "3ab21984-c9d6-49d1-8717-65d1a4702a01"). 
InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.233207 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ab21984-c9d6-49d1-8717-65d1a4702a01-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3ab21984-c9d6-49d1-8717-65d1a4702a01" (UID: "3ab21984-c9d6-49d1-8717-65d1a4702a01"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.233398 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3ab21984-c9d6-49d1-8717-65d1a4702a01-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.233463 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3ab21984-c9d6-49d1-8717-65d1a4702a01-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.238206 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab21984-c9d6-49d1-8717-65d1a4702a01-kube-api-access-6xths" (OuterVolumeSpecName: "kube-api-access-6xths") pod "3ab21984-c9d6-49d1-8717-65d1a4702a01" (UID: "3ab21984-c9d6-49d1-8717-65d1a4702a01"). InnerVolumeSpecName "kube-api-access-6xths". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.269885 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab21984-c9d6-49d1-8717-65d1a4702a01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ab21984-c9d6-49d1-8717-65d1a4702a01" (UID: "3ab21984-c9d6-49d1-8717-65d1a4702a01"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.287017 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-pcbrf"] Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.300917 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab21984-c9d6-49d1-8717-65d1a4702a01-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3ab21984-c9d6-49d1-8717-65d1a4702a01" (UID: "3ab21984-c9d6-49d1-8717-65d1a4702a01"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.302477 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ab21984-c9d6-49d1-8717-65d1a4702a01-scripts" (OuterVolumeSpecName: "scripts") pod "3ab21984-c9d6-49d1-8717-65d1a4702a01" (UID: "3ab21984-c9d6-49d1-8717-65d1a4702a01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.309572 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab21984-c9d6-49d1-8717-65d1a4702a01-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3ab21984-c9d6-49d1-8717-65d1a4702a01" (UID: "3ab21984-c9d6-49d1-8717-65d1a4702a01"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.335930 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xths\" (UniqueName: \"kubernetes.io/projected/3ab21984-c9d6-49d1-8717-65d1a4702a01-kube-api-access-6xths\") on node \"crc\" DevicePath \"\"" Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.335966 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab21984-c9d6-49d1-8717-65d1a4702a01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.335976 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3ab21984-c9d6-49d1-8717-65d1a4702a01-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.335988 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ab21984-c9d6-49d1-8717-65d1a4702a01-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.335997 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3ab21984-c9d6-49d1-8717-65d1a4702a01-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.965004 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5584e0072cecd6971605b596bdbfd3628eac03874f74705be07cbb671249544a" Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.965097 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-pcbrf" Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.968701 5129 generic.go:334] "Generic (PLEG): container finished" podID="d67684ac-1f0f-4360-b805-8da062d036e3" containerID="8bb799bfae9fbef5837e5aa1f93ceca39afd7c1dee0567787f1ecfe6c4251f41" exitCode=0 Mar 14 10:02:52 crc kubenswrapper[5129]: I0314 10:02:52.968901 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snrrx" event={"ID":"d67684ac-1f0f-4360-b805-8da062d036e3","Type":"ContainerDied","Data":"8bb799bfae9fbef5837e5aa1f93ceca39afd7c1dee0567787f1ecfe6c4251f41"} Mar 14 10:02:53 crc kubenswrapper[5129]: I0314 10:02:53.983578 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snrrx" event={"ID":"d67684ac-1f0f-4360-b805-8da062d036e3","Type":"ContainerStarted","Data":"13653154938c6b7ccdc8ae4a703e299842a185e1a5500f082afef09b1307797a"} Mar 14 10:02:54 crc kubenswrapper[5129]: I0314 10:02:54.027496 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-snrrx" podStartSLOduration=3.5673931100000003 podStartE2EDuration="7.027475452s" podCreationTimestamp="2026-03-14 10:02:47 +0000 UTC" firstStartedPulling="2026-03-14 10:02:49.914083516 +0000 UTC m=+11032.665998710" lastFinishedPulling="2026-03-14 10:02:53.374165868 +0000 UTC m=+11036.126081052" observedRunningTime="2026-03-14 10:02:54.011457098 +0000 UTC m=+11036.763372302" watchObservedRunningTime="2026-03-14 10:02:54.027475452 +0000 UTC m=+11036.779390636" Mar 14 10:02:54 crc kubenswrapper[5129]: I0314 10:02:54.056318 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab21984-c9d6-49d1-8717-65d1a4702a01" path="/var/lib/kubelet/pods/3ab21984-c9d6-49d1-8717-65d1a4702a01/volumes" Mar 14 10:02:58 crc kubenswrapper[5129]: I0314 10:02:58.213668 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-snrrx" Mar 14 10:02:58 crc kubenswrapper[5129]: I0314 10:02:58.214694 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-snrrx" Mar 14 10:02:58 crc kubenswrapper[5129]: I0314 10:02:58.304199 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-snrrx" Mar 14 10:02:59 crc kubenswrapper[5129]: I0314 10:02:59.135311 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-snrrx" Mar 14 10:02:59 crc kubenswrapper[5129]: I0314 10:02:59.196596 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-snrrx"] Mar 14 10:03:01 crc kubenswrapper[5129]: I0314 10:03:01.086833 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-snrrx" podUID="d67684ac-1f0f-4360-b805-8da062d036e3" containerName="registry-server" containerID="cri-o://13653154938c6b7ccdc8ae4a703e299842a185e1a5500f082afef09b1307797a" gracePeriod=2 Mar 14 10:03:01 crc kubenswrapper[5129]: I0314 10:03:01.970427 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-snrrx" Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.075552 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d67684ac-1f0f-4360-b805-8da062d036e3-utilities\") pod \"d67684ac-1f0f-4360-b805-8da062d036e3\" (UID: \"d67684ac-1f0f-4360-b805-8da062d036e3\") " Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.075679 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2grk5\" (UniqueName: \"kubernetes.io/projected/d67684ac-1f0f-4360-b805-8da062d036e3-kube-api-access-2grk5\") pod \"d67684ac-1f0f-4360-b805-8da062d036e3\" (UID: \"d67684ac-1f0f-4360-b805-8da062d036e3\") " Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.075734 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d67684ac-1f0f-4360-b805-8da062d036e3-catalog-content\") pod \"d67684ac-1f0f-4360-b805-8da062d036e3\" (UID: \"d67684ac-1f0f-4360-b805-8da062d036e3\") " Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.077351 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d67684ac-1f0f-4360-b805-8da062d036e3-utilities" (OuterVolumeSpecName: "utilities") pod "d67684ac-1f0f-4360-b805-8da062d036e3" (UID: "d67684ac-1f0f-4360-b805-8da062d036e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.087074 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d67684ac-1f0f-4360-b805-8da062d036e3-kube-api-access-2grk5" (OuterVolumeSpecName: "kube-api-access-2grk5") pod "d67684ac-1f0f-4360-b805-8da062d036e3" (UID: "d67684ac-1f0f-4360-b805-8da062d036e3"). InnerVolumeSpecName "kube-api-access-2grk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.109037 5129 generic.go:334] "Generic (PLEG): container finished" podID="d67684ac-1f0f-4360-b805-8da062d036e3" containerID="13653154938c6b7ccdc8ae4a703e299842a185e1a5500f082afef09b1307797a" exitCode=0 Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.109082 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snrrx" event={"ID":"d67684ac-1f0f-4360-b805-8da062d036e3","Type":"ContainerDied","Data":"13653154938c6b7ccdc8ae4a703e299842a185e1a5500f082afef09b1307797a"} Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.109110 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snrrx" event={"ID":"d67684ac-1f0f-4360-b805-8da062d036e3","Type":"ContainerDied","Data":"72196d940ef4824b0491cbbf999f23f0a810a7d7f0c02b4fd8c92a5e82af89f6"} Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.109127 5129 scope.go:117] "RemoveContainer" containerID="13653154938c6b7ccdc8ae4a703e299842a185e1a5500f082afef09b1307797a" Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.109304 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-snrrx" Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.141862 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d67684ac-1f0f-4360-b805-8da062d036e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d67684ac-1f0f-4360-b805-8da062d036e3" (UID: "d67684ac-1f0f-4360-b805-8da062d036e3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.168224 5129 scope.go:117] "RemoveContainer" containerID="8bb799bfae9fbef5837e5aa1f93ceca39afd7c1dee0567787f1ecfe6c4251f41" Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.179750 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d67684ac-1f0f-4360-b805-8da062d036e3-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.179793 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2grk5\" (UniqueName: \"kubernetes.io/projected/d67684ac-1f0f-4360-b805-8da062d036e3-kube-api-access-2grk5\") on node \"crc\" DevicePath \"\"" Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.179811 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d67684ac-1f0f-4360-b805-8da062d036e3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.194343 5129 scope.go:117] "RemoveContainer" containerID="c44f5f123f456c449cd9028e6bbc97621dd8c4333d035d37972b75558b62f7eb" Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.272944 5129 scope.go:117] "RemoveContainer" containerID="13653154938c6b7ccdc8ae4a703e299842a185e1a5500f082afef09b1307797a" Mar 14 10:03:02 crc kubenswrapper[5129]: E0314 10:03:02.273647 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13653154938c6b7ccdc8ae4a703e299842a185e1a5500f082afef09b1307797a\": container with ID starting with 13653154938c6b7ccdc8ae4a703e299842a185e1a5500f082afef09b1307797a not found: ID does not exist" containerID="13653154938c6b7ccdc8ae4a703e299842a185e1a5500f082afef09b1307797a" Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.273723 5129 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"13653154938c6b7ccdc8ae4a703e299842a185e1a5500f082afef09b1307797a"} err="failed to get container status \"13653154938c6b7ccdc8ae4a703e299842a185e1a5500f082afef09b1307797a\": rpc error: code = NotFound desc = could not find container \"13653154938c6b7ccdc8ae4a703e299842a185e1a5500f082afef09b1307797a\": container with ID starting with 13653154938c6b7ccdc8ae4a703e299842a185e1a5500f082afef09b1307797a not found: ID does not exist" Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.273772 5129 scope.go:117] "RemoveContainer" containerID="8bb799bfae9fbef5837e5aa1f93ceca39afd7c1dee0567787f1ecfe6c4251f41" Mar 14 10:03:02 crc kubenswrapper[5129]: E0314 10:03:02.274415 5129 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bb799bfae9fbef5837e5aa1f93ceca39afd7c1dee0567787f1ecfe6c4251f41\": container with ID starting with 8bb799bfae9fbef5837e5aa1f93ceca39afd7c1dee0567787f1ecfe6c4251f41 not found: ID does not exist" containerID="8bb799bfae9fbef5837e5aa1f93ceca39afd7c1dee0567787f1ecfe6c4251f41" Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.274475 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bb799bfae9fbef5837e5aa1f93ceca39afd7c1dee0567787f1ecfe6c4251f41"} err="failed to get container status \"8bb799bfae9fbef5837e5aa1f93ceca39afd7c1dee0567787f1ecfe6c4251f41\": rpc error: code = NotFound desc = could not find container \"8bb799bfae9fbef5837e5aa1f93ceca39afd7c1dee0567787f1ecfe6c4251f41\": container with ID starting with 8bb799bfae9fbef5837e5aa1f93ceca39afd7c1dee0567787f1ecfe6c4251f41 not found: ID does not exist" Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.274508 5129 scope.go:117] "RemoveContainer" containerID="c44f5f123f456c449cd9028e6bbc97621dd8c4333d035d37972b75558b62f7eb" Mar 14 10:03:02 crc kubenswrapper[5129]: E0314 10:03:02.274845 5129 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c44f5f123f456c449cd9028e6bbc97621dd8c4333d035d37972b75558b62f7eb\": container with ID starting with c44f5f123f456c449cd9028e6bbc97621dd8c4333d035d37972b75558b62f7eb not found: ID does not exist" containerID="c44f5f123f456c449cd9028e6bbc97621dd8c4333d035d37972b75558b62f7eb" Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.274877 5129 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c44f5f123f456c449cd9028e6bbc97621dd8c4333d035d37972b75558b62f7eb"} err="failed to get container status \"c44f5f123f456c449cd9028e6bbc97621dd8c4333d035d37972b75558b62f7eb\": rpc error: code = NotFound desc = could not find container \"c44f5f123f456c449cd9028e6bbc97621dd8c4333d035d37972b75558b62f7eb\": container with ID starting with c44f5f123f456c449cd9028e6bbc97621dd8c4333d035d37972b75558b62f7eb not found: ID does not exist" Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.448382 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-snrrx"] Mar 14 10:03:02 crc kubenswrapper[5129]: I0314 10:03:02.458325 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-snrrx"] Mar 14 10:03:04 crc kubenswrapper[5129]: I0314 10:03:04.056076 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d67684ac-1f0f-4360-b805-8da062d036e3" path="/var/lib/kubelet/pods/d67684ac-1f0f-4360-b805-8da062d036e3/volumes" Mar 14 10:03:19 crc kubenswrapper[5129]: I0314 10:03:19.574144 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:03:19 crc kubenswrapper[5129]: I0314 10:03:19.575109 5129 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:03:39 crc kubenswrapper[5129]: I0314 10:03:39.293236 5129 scope.go:117] "RemoveContainer" containerID="e17aff6ac6c88e1a3eb5abc357e2dc0d191f481503c712ee39d489c265b9ea0d" Mar 14 10:03:49 crc kubenswrapper[5129]: I0314 10:03:49.574706 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:03:49 crc kubenswrapper[5129]: I0314 10:03:49.576237 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:03:49 crc kubenswrapper[5129]: I0314 10:03:49.576368 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 10:03:49 crc kubenswrapper[5129]: I0314 10:03:49.577245 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"41973fc889c20df973d49b23f7c22998ca1537f65950641326a14bf146e131d6"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 10:03:49 crc kubenswrapper[5129]: I0314 10:03:49.577387 5129 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://41973fc889c20df973d49b23f7c22998ca1537f65950641326a14bf146e131d6" gracePeriod=600 Mar 14 10:03:49 crc kubenswrapper[5129]: I0314 10:03:49.725200 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="41973fc889c20df973d49b23f7c22998ca1537f65950641326a14bf146e131d6" exitCode=0 Mar 14 10:03:49 crc kubenswrapper[5129]: I0314 10:03:49.725269 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"41973fc889c20df973d49b23f7c22998ca1537f65950641326a14bf146e131d6"} Mar 14 10:03:49 crc kubenswrapper[5129]: I0314 10:03:49.725335 5129 scope.go:117] "RemoveContainer" containerID="703f0379322c5188631d53f609890ac97b900a87fefad7d1453207795aa0f3dd" Mar 14 10:03:50 crc kubenswrapper[5129]: I0314 10:03:50.745673 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311"} Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.357711 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-wbqp8"] Mar 14 10:03:52 crc kubenswrapper[5129]: E0314 10:03:52.359041 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67684ac-1f0f-4360-b805-8da062d036e3" containerName="extract-utilities" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.359070 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67684ac-1f0f-4360-b805-8da062d036e3" containerName="extract-utilities" Mar 14 10:03:52 crc kubenswrapper[5129]: E0314 10:03:52.359110 5129 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67684ac-1f0f-4360-b805-8da062d036e3" containerName="extract-content" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.359120 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67684ac-1f0f-4360-b805-8da062d036e3" containerName="extract-content" Mar 14 10:03:52 crc kubenswrapper[5129]: E0314 10:03:52.359140 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67684ac-1f0f-4360-b805-8da062d036e3" containerName="registry-server" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.359149 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67684ac-1f0f-4360-b805-8da062d036e3" containerName="registry-server" Mar 14 10:03:52 crc kubenswrapper[5129]: E0314 10:03:52.359180 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ab21984-c9d6-49d1-8717-65d1a4702a01" containerName="swift-ring-rebalance" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.359190 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ab21984-c9d6-49d1-8717-65d1a4702a01" containerName="swift-ring-rebalance" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.359504 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d67684ac-1f0f-4360-b805-8da062d036e3" containerName="registry-server" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.359574 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ab21984-c9d6-49d1-8717-65d1a4702a01" containerName="swift-ring-rebalance" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.360730 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.363084 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.364053 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.380527 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-wbqp8"] Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.453001 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq8xl\" (UniqueName: \"kubernetes.io/projected/cc6115ce-1b8a-422b-9031-e5c657ad218b-kube-api-access-zq8xl\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.453829 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cc6115ce-1b8a-422b-9031-e5c657ad218b-etc-swift\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.454051 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cc6115ce-1b8a-422b-9031-e5c657ad218b-dispersionconf\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.454173 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cc6115ce-1b8a-422b-9031-e5c657ad218b-ring-data-devices\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.454296 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc6115ce-1b8a-422b-9031-e5c657ad218b-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.454470 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cc6115ce-1b8a-422b-9031-e5c657ad218b-swiftconf\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.454661 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc6115ce-1b8a-422b-9031-e5c657ad218b-scripts\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.556785 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc6115ce-1b8a-422b-9031-e5c657ad218b-scripts\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.557043 5129 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zq8xl\" (UniqueName: \"kubernetes.io/projected/cc6115ce-1b8a-422b-9031-e5c657ad218b-kube-api-access-zq8xl\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.557185 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cc6115ce-1b8a-422b-9031-e5c657ad218b-etc-swift\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.557347 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cc6115ce-1b8a-422b-9031-e5c657ad218b-dispersionconf\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.557430 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cc6115ce-1b8a-422b-9031-e5c657ad218b-ring-data-devices\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.557511 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc6115ce-1b8a-422b-9031-e5c657ad218b-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.557641 5129 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cc6115ce-1b8a-422b-9031-e5c657ad218b-swiftconf\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.557630 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cc6115ce-1b8a-422b-9031-e5c657ad218b-etc-swift\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.557997 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc6115ce-1b8a-422b-9031-e5c657ad218b-scripts\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.558050 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cc6115ce-1b8a-422b-9031-e5c657ad218b-ring-data-devices\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.564079 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cc6115ce-1b8a-422b-9031-e5c657ad218b-swiftconf\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.565282 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cc6115ce-1b8a-422b-9031-e5c657ad218b-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.569929 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cc6115ce-1b8a-422b-9031-e5c657ad218b-dispersionconf\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.580636 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq8xl\" (UniqueName: \"kubernetes.io/projected/cc6115ce-1b8a-422b-9031-e5c657ad218b-kube-api-access-zq8xl\") pod \"swift-ring-rebalance-debug-wbqp8\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:52 crc kubenswrapper[5129]: I0314 10:03:52.695384 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:03:53 crc kubenswrapper[5129]: I0314 10:03:53.232874 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-wbqp8"] Mar 14 10:03:53 crc kubenswrapper[5129]: W0314 10:03:53.246234 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc6115ce_1b8a_422b_9031_e5c657ad218b.slice/crio-c48ca5edca08f1f25df1229548fe75b85386c18f792936cc5a78f3289f6e4718 WatchSource:0}: Error finding container c48ca5edca08f1f25df1229548fe75b85386c18f792936cc5a78f3289f6e4718: Status 404 returned error can't find the container with id c48ca5edca08f1f25df1229548fe75b85386c18f792936cc5a78f3289f6e4718 Mar 14 10:03:53 crc kubenswrapper[5129]: I0314 10:03:53.791978 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-wbqp8" event={"ID":"cc6115ce-1b8a-422b-9031-e5c657ad218b","Type":"ContainerStarted","Data":"b50e4ef910d3d5d3cb650131ba76ab11d718ff93f2fbf444bd126063d67f62cc"} Mar 14 10:03:53 crc kubenswrapper[5129]: I0314 10:03:53.792383 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-wbqp8" event={"ID":"cc6115ce-1b8a-422b-9031-e5c657ad218b","Type":"ContainerStarted","Data":"c48ca5edca08f1f25df1229548fe75b85386c18f792936cc5a78f3289f6e4718"} Mar 14 10:03:53 crc kubenswrapper[5129]: I0314 10:03:53.819947 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-wbqp8" podStartSLOduration=1.819920981 podStartE2EDuration="1.819920981s" podCreationTimestamp="2026-03-14 10:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:03:53.813330262 +0000 UTC m=+11096.565245446" watchObservedRunningTime="2026-03-14 10:03:53.819920981 +0000 UTC m=+11096.571836165" 
Mar 14 10:04:00 crc kubenswrapper[5129]: I0314 10:04:00.154590 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558044-s2xbf"] Mar 14 10:04:00 crc kubenswrapper[5129]: I0314 10:04:00.166534 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558044-s2xbf"] Mar 14 10:04:00 crc kubenswrapper[5129]: I0314 10:04:00.166664 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558044-s2xbf" Mar 14 10:04:00 crc kubenswrapper[5129]: I0314 10:04:00.170483 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:04:00 crc kubenswrapper[5129]: I0314 10:04:00.171188 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:04:00 crc kubenswrapper[5129]: I0314 10:04:00.171339 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:04:00 crc kubenswrapper[5129]: I0314 10:04:00.243201 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x55s\" (UniqueName: \"kubernetes.io/projected/ddd1c525-862c-40b6-b416-05f024a11987-kube-api-access-7x55s\") pod \"auto-csr-approver-29558044-s2xbf\" (UID: \"ddd1c525-862c-40b6-b416-05f024a11987\") " pod="openshift-infra/auto-csr-approver-29558044-s2xbf" Mar 14 10:04:00 crc kubenswrapper[5129]: I0314 10:04:00.347222 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x55s\" (UniqueName: \"kubernetes.io/projected/ddd1c525-862c-40b6-b416-05f024a11987-kube-api-access-7x55s\") pod \"auto-csr-approver-29558044-s2xbf\" (UID: \"ddd1c525-862c-40b6-b416-05f024a11987\") " pod="openshift-infra/auto-csr-approver-29558044-s2xbf" Mar 14 10:04:00 crc kubenswrapper[5129]: I0314 10:04:00.372868 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x55s\" (UniqueName: \"kubernetes.io/projected/ddd1c525-862c-40b6-b416-05f024a11987-kube-api-access-7x55s\") pod \"auto-csr-approver-29558044-s2xbf\" (UID: \"ddd1c525-862c-40b6-b416-05f024a11987\") " pod="openshift-infra/auto-csr-approver-29558044-s2xbf" Mar 14 10:04:00 crc kubenswrapper[5129]: I0314 10:04:00.498160 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558044-s2xbf" Mar 14 10:04:01 crc kubenswrapper[5129]: I0314 10:04:01.262648 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558044-s2xbf"] Mar 14 10:04:01 crc kubenswrapper[5129]: I0314 10:04:01.901458 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558044-s2xbf" event={"ID":"ddd1c525-862c-40b6-b416-05f024a11987","Type":"ContainerStarted","Data":"c9b77af6049ed245f882ef3c686b5a450d6bd90bec188c7b394a19b31d4a6165"} Mar 14 10:04:02 crc kubenswrapper[5129]: I0314 10:04:02.917406 5129 generic.go:334] "Generic (PLEG): container finished" podID="ddd1c525-862c-40b6-b416-05f024a11987" containerID="6b50cc3c07901197509f332e4cd4b875d1e08693077da75875452dd6a814833d" exitCode=0 Mar 14 10:04:02 crc kubenswrapper[5129]: I0314 10:04:02.917478 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558044-s2xbf" event={"ID":"ddd1c525-862c-40b6-b416-05f024a11987","Type":"ContainerDied","Data":"6b50cc3c07901197509f332e4cd4b875d1e08693077da75875452dd6a814833d"} Mar 14 10:04:02 crc kubenswrapper[5129]: I0314 10:04:02.921340 5129 generic.go:334] "Generic (PLEG): container finished" podID="cc6115ce-1b8a-422b-9031-e5c657ad218b" containerID="b50e4ef910d3d5d3cb650131ba76ab11d718ff93f2fbf444bd126063d67f62cc" exitCode=0 Mar 14 10:04:02 crc kubenswrapper[5129]: I0314 10:04:02.921409 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-ring-rebalance-debug-wbqp8" event={"ID":"cc6115ce-1b8a-422b-9031-e5c657ad218b","Type":"ContainerDied","Data":"b50e4ef910d3d5d3cb650131ba76ab11d718ff93f2fbf444bd126063d67f62cc"} Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.491031 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558044-s2xbf" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.499147 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.570802 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-wbqp8"] Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.583389 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-wbqp8"] Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.607803 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cc6115ce-1b8a-422b-9031-e5c657ad218b-ring-data-devices\") pod \"cc6115ce-1b8a-422b-9031-e5c657ad218b\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.607879 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cc6115ce-1b8a-422b-9031-e5c657ad218b-dispersionconf\") pod \"cc6115ce-1b8a-422b-9031-e5c657ad218b\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.607905 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x55s\" (UniqueName: \"kubernetes.io/projected/ddd1c525-862c-40b6-b416-05f024a11987-kube-api-access-7x55s\") pod \"ddd1c525-862c-40b6-b416-05f024a11987\" (UID: 
\"ddd1c525-862c-40b6-b416-05f024a11987\") " Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.607925 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cc6115ce-1b8a-422b-9031-e5c657ad218b-swiftconf\") pod \"cc6115ce-1b8a-422b-9031-e5c657ad218b\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.608091 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq8xl\" (UniqueName: \"kubernetes.io/projected/cc6115ce-1b8a-422b-9031-e5c657ad218b-kube-api-access-zq8xl\") pod \"cc6115ce-1b8a-422b-9031-e5c657ad218b\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.608805 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc6115ce-1b8a-422b-9031-e5c657ad218b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "cc6115ce-1b8a-422b-9031-e5c657ad218b" (UID: "cc6115ce-1b8a-422b-9031-e5c657ad218b"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.609369 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc6115ce-1b8a-422b-9031-e5c657ad218b-combined-ca-bundle\") pod \"cc6115ce-1b8a-422b-9031-e5c657ad218b\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.609460 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cc6115ce-1b8a-422b-9031-e5c657ad218b-etc-swift\") pod \"cc6115ce-1b8a-422b-9031-e5c657ad218b\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.609496 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc6115ce-1b8a-422b-9031-e5c657ad218b-scripts\") pod \"cc6115ce-1b8a-422b-9031-e5c657ad218b\" (UID: \"cc6115ce-1b8a-422b-9031-e5c657ad218b\") " Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.610089 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cc6115ce-1b8a-422b-9031-e5c657ad218b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.610472 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc6115ce-1b8a-422b-9031-e5c657ad218b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cc6115ce-1b8a-422b-9031-e5c657ad218b" (UID: "cc6115ce-1b8a-422b-9031-e5c657ad218b"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.652725 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc6115ce-1b8a-422b-9031-e5c657ad218b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc6115ce-1b8a-422b-9031-e5c657ad218b" (UID: "cc6115ce-1b8a-422b-9031-e5c657ad218b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.654831 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc6115ce-1b8a-422b-9031-e5c657ad218b-kube-api-access-zq8xl" (OuterVolumeSpecName: "kube-api-access-zq8xl") pod "cc6115ce-1b8a-422b-9031-e5c657ad218b" (UID: "cc6115ce-1b8a-422b-9031-e5c657ad218b"). InnerVolumeSpecName "kube-api-access-zq8xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.656561 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd1c525-862c-40b6-b416-05f024a11987-kube-api-access-7x55s" (OuterVolumeSpecName: "kube-api-access-7x55s") pod "ddd1c525-862c-40b6-b416-05f024a11987" (UID: "ddd1c525-862c-40b6-b416-05f024a11987"). InnerVolumeSpecName "kube-api-access-7x55s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.665631 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc6115ce-1b8a-422b-9031-e5c657ad218b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "cc6115ce-1b8a-422b-9031-e5c657ad218b" (UID: "cc6115ce-1b8a-422b-9031-e5c657ad218b"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.672913 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc6115ce-1b8a-422b-9031-e5c657ad218b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "cc6115ce-1b8a-422b-9031-e5c657ad218b" (UID: "cc6115ce-1b8a-422b-9031-e5c657ad218b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.686514 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc6115ce-1b8a-422b-9031-e5c657ad218b-scripts" (OuterVolumeSpecName: "scripts") pod "cc6115ce-1b8a-422b-9031-e5c657ad218b" (UID: "cc6115ce-1b8a-422b-9031-e5c657ad218b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.714090 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cc6115ce-1b8a-422b-9031-e5c657ad218b-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.714118 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x55s\" (UniqueName: \"kubernetes.io/projected/ddd1c525-862c-40b6-b416-05f024a11987-kube-api-access-7x55s\") on node \"crc\" DevicePath \"\"" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.714131 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cc6115ce-1b8a-422b-9031-e5c657ad218b-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.714140 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq8xl\" (UniqueName: \"kubernetes.io/projected/cc6115ce-1b8a-422b-9031-e5c657ad218b-kube-api-access-zq8xl\") on node \"crc\" DevicePath \"\"" Mar 14 10:04:05 crc 
kubenswrapper[5129]: I0314 10:04:05.714150 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc6115ce-1b8a-422b-9031-e5c657ad218b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.714159 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cc6115ce-1b8a-422b-9031-e5c657ad218b-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.714167 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc6115ce-1b8a-422b-9031-e5c657ad218b-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.966884 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558044-s2xbf" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.966917 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558044-s2xbf" event={"ID":"ddd1c525-862c-40b6-b416-05f024a11987","Type":"ContainerDied","Data":"c9b77af6049ed245f882ef3c686b5a450d6bd90bec188c7b394a19b31d4a6165"} Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.967024 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9b77af6049ed245f882ef3c686b5a450d6bd90bec188c7b394a19b31d4a6165" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.970642 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c48ca5edca08f1f25df1229548fe75b85386c18f792936cc5a78f3289f6e4718" Mar 14 10:04:05 crc kubenswrapper[5129]: I0314 10:04:05.970735 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-wbqp8" Mar 14 10:04:06 crc kubenswrapper[5129]: I0314 10:04:06.052321 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc6115ce-1b8a-422b-9031-e5c657ad218b" path="/var/lib/kubelet/pods/cc6115ce-1b8a-422b-9031-e5c657ad218b/volumes" Mar 14 10:04:06 crc kubenswrapper[5129]: I0314 10:04:06.577979 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558038-b2dhq"] Mar 14 10:04:06 crc kubenswrapper[5129]: I0314 10:04:06.590068 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558038-b2dhq"] Mar 14 10:04:08 crc kubenswrapper[5129]: I0314 10:04:08.107712 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="757e9740-209f-4a0a-9fbd-e658341ba827" path="/var/lib/kubelet/pods/757e9740-209f-4a0a-9fbd-e658341ba827/volumes" Mar 14 10:04:10 crc kubenswrapper[5129]: I0314 10:04:10.243547 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9f2b4"] Mar 14 10:04:10 crc kubenswrapper[5129]: E0314 10:04:10.244287 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc6115ce-1b8a-422b-9031-e5c657ad218b" containerName="swift-ring-rebalance" Mar 14 10:04:10 crc kubenswrapper[5129]: I0314 10:04:10.244300 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc6115ce-1b8a-422b-9031-e5c657ad218b" containerName="swift-ring-rebalance" Mar 14 10:04:10 crc kubenswrapper[5129]: E0314 10:04:10.244332 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd1c525-862c-40b6-b416-05f024a11987" containerName="oc" Mar 14 10:04:10 crc kubenswrapper[5129]: I0314 10:04:10.244340 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd1c525-862c-40b6-b416-05f024a11987" containerName="oc" Mar 14 10:04:10 crc kubenswrapper[5129]: I0314 10:04:10.244560 5129 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ddd1c525-862c-40b6-b416-05f024a11987" containerName="oc" Mar 14 10:04:10 crc kubenswrapper[5129]: I0314 10:04:10.244594 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc6115ce-1b8a-422b-9031-e5c657ad218b" containerName="swift-ring-rebalance" Mar 14 10:04:10 crc kubenswrapper[5129]: I0314 10:04:10.246043 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9f2b4" Mar 14 10:04:10 crc kubenswrapper[5129]: I0314 10:04:10.263879 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9f2b4"] Mar 14 10:04:10 crc kubenswrapper[5129]: I0314 10:04:10.326432 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc495069-8f9c-4441-b9b4-c6881bf934a2-utilities\") pod \"certified-operators-9f2b4\" (UID: \"cc495069-8f9c-4441-b9b4-c6881bf934a2\") " pod="openshift-marketplace/certified-operators-9f2b4" Mar 14 10:04:10 crc kubenswrapper[5129]: I0314 10:04:10.326929 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc495069-8f9c-4441-b9b4-c6881bf934a2-catalog-content\") pod \"certified-operators-9f2b4\" (UID: \"cc495069-8f9c-4441-b9b4-c6881bf934a2\") " pod="openshift-marketplace/certified-operators-9f2b4" Mar 14 10:04:10 crc kubenswrapper[5129]: I0314 10:04:10.327386 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6z94\" (UniqueName: \"kubernetes.io/projected/cc495069-8f9c-4441-b9b4-c6881bf934a2-kube-api-access-l6z94\") pod \"certified-operators-9f2b4\" (UID: \"cc495069-8f9c-4441-b9b4-c6881bf934a2\") " pod="openshift-marketplace/certified-operators-9f2b4" Mar 14 10:04:10 crc kubenswrapper[5129]: I0314 10:04:10.428757 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc495069-8f9c-4441-b9b4-c6881bf934a2-catalog-content\") pod \"certified-operators-9f2b4\" (UID: \"cc495069-8f9c-4441-b9b4-c6881bf934a2\") " pod="openshift-marketplace/certified-operators-9f2b4" Mar 14 10:04:10 crc kubenswrapper[5129]: I0314 10:04:10.428938 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6z94\" (UniqueName: \"kubernetes.io/projected/cc495069-8f9c-4441-b9b4-c6881bf934a2-kube-api-access-l6z94\") pod \"certified-operators-9f2b4\" (UID: \"cc495069-8f9c-4441-b9b4-c6881bf934a2\") " pod="openshift-marketplace/certified-operators-9f2b4" Mar 14 10:04:10 crc kubenswrapper[5129]: I0314 10:04:10.428975 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc495069-8f9c-4441-b9b4-c6881bf934a2-utilities\") pod \"certified-operators-9f2b4\" (UID: \"cc495069-8f9c-4441-b9b4-c6881bf934a2\") " pod="openshift-marketplace/certified-operators-9f2b4" Mar 14 10:04:10 crc kubenswrapper[5129]: I0314 10:04:10.429353 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc495069-8f9c-4441-b9b4-c6881bf934a2-utilities\") pod \"certified-operators-9f2b4\" (UID: \"cc495069-8f9c-4441-b9b4-c6881bf934a2\") " pod="openshift-marketplace/certified-operators-9f2b4" Mar 14 10:04:10 crc kubenswrapper[5129]: I0314 10:04:10.429374 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc495069-8f9c-4441-b9b4-c6881bf934a2-catalog-content\") pod \"certified-operators-9f2b4\" (UID: \"cc495069-8f9c-4441-b9b4-c6881bf934a2\") " pod="openshift-marketplace/certified-operators-9f2b4" Mar 14 10:04:10 crc kubenswrapper[5129]: I0314 10:04:10.448343 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-l6z94\" (UniqueName: \"kubernetes.io/projected/cc495069-8f9c-4441-b9b4-c6881bf934a2-kube-api-access-l6z94\") pod \"certified-operators-9f2b4\" (UID: \"cc495069-8f9c-4441-b9b4-c6881bf934a2\") " pod="openshift-marketplace/certified-operators-9f2b4" Mar 14 10:04:10 crc kubenswrapper[5129]: I0314 10:04:10.569191 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9f2b4" Mar 14 10:04:11 crc kubenswrapper[5129]: I0314 10:04:11.421093 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9f2b4"] Mar 14 10:04:12 crc kubenswrapper[5129]: I0314 10:04:12.083669 5129 generic.go:334] "Generic (PLEG): container finished" podID="cc495069-8f9c-4441-b9b4-c6881bf934a2" containerID="b2a36edd67bbc40216ddb6e31dcf21944bb3f3f296c0e4ccee0f74f21160c45b" exitCode=0 Mar 14 10:04:12 crc kubenswrapper[5129]: I0314 10:04:12.138932 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9f2b4" event={"ID":"cc495069-8f9c-4441-b9b4-c6881bf934a2","Type":"ContainerDied","Data":"b2a36edd67bbc40216ddb6e31dcf21944bb3f3f296c0e4ccee0f74f21160c45b"} Mar 14 10:04:12 crc kubenswrapper[5129]: I0314 10:04:12.139006 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9f2b4" event={"ID":"cc495069-8f9c-4441-b9b4-c6881bf934a2","Type":"ContainerStarted","Data":"d43dd37ad7c901bc30ffd8507e9ee9f61f615f67c0f47ec83ed3762149822d02"} Mar 14 10:04:13 crc kubenswrapper[5129]: I0314 10:04:13.096791 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9f2b4" event={"ID":"cc495069-8f9c-4441-b9b4-c6881bf934a2","Type":"ContainerStarted","Data":"f052e2537261fa368b3e37c03ab56cc863f2a022c6bf784ec3c833d111d35a14"} Mar 14 10:04:15 crc kubenswrapper[5129]: I0314 10:04:15.120921 5129 generic.go:334] "Generic (PLEG): container finished" 
podID="cc495069-8f9c-4441-b9b4-c6881bf934a2" containerID="f052e2537261fa368b3e37c03ab56cc863f2a022c6bf784ec3c833d111d35a14" exitCode=0 Mar 14 10:04:15 crc kubenswrapper[5129]: I0314 10:04:15.121499 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9f2b4" event={"ID":"cc495069-8f9c-4441-b9b4-c6881bf934a2","Type":"ContainerDied","Data":"f052e2537261fa368b3e37c03ab56cc863f2a022c6bf784ec3c833d111d35a14"} Mar 14 10:04:16 crc kubenswrapper[5129]: I0314 10:04:16.145938 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9f2b4" event={"ID":"cc495069-8f9c-4441-b9b4-c6881bf934a2","Type":"ContainerStarted","Data":"3b5a70f20e70f6513b38e895cd6ed3052877ed77174a1256917650d28c0a8ba8"} Mar 14 10:04:16 crc kubenswrapper[5129]: I0314 10:04:16.174484 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9f2b4" podStartSLOduration=2.725070873 podStartE2EDuration="6.174465275s" podCreationTimestamp="2026-03-14 10:04:10 +0000 UTC" firstStartedPulling="2026-03-14 10:04:12.106139293 +0000 UTC m=+11114.858054477" lastFinishedPulling="2026-03-14 10:04:15.555533675 +0000 UTC m=+11118.307448879" observedRunningTime="2026-03-14 10:04:16.166314824 +0000 UTC m=+11118.918230008" watchObservedRunningTime="2026-03-14 10:04:16.174465275 +0000 UTC m=+11118.926380459" Mar 14 10:04:20 crc kubenswrapper[5129]: I0314 10:04:20.570288 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9f2b4" Mar 14 10:04:20 crc kubenswrapper[5129]: I0314 10:04:20.570708 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9f2b4" Mar 14 10:04:20 crc kubenswrapper[5129]: I0314 10:04:20.626111 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9f2b4" Mar 14 
10:04:21 crc kubenswrapper[5129]: I0314 10:04:21.252888 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9f2b4" Mar 14 10:04:21 crc kubenswrapper[5129]: I0314 10:04:21.305568 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9f2b4"] Mar 14 10:04:23 crc kubenswrapper[5129]: I0314 10:04:23.229474 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9f2b4" podUID="cc495069-8f9c-4441-b9b4-c6881bf934a2" containerName="registry-server" containerID="cri-o://3b5a70f20e70f6513b38e895cd6ed3052877ed77174a1256917650d28c0a8ba8" gracePeriod=2 Mar 14 10:04:24 crc kubenswrapper[5129]: I0314 10:04:24.253314 5129 generic.go:334] "Generic (PLEG): container finished" podID="cc495069-8f9c-4441-b9b4-c6881bf934a2" containerID="3b5a70f20e70f6513b38e895cd6ed3052877ed77174a1256917650d28c0a8ba8" exitCode=0 Mar 14 10:04:24 crc kubenswrapper[5129]: I0314 10:04:24.253416 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9f2b4" event={"ID":"cc495069-8f9c-4441-b9b4-c6881bf934a2","Type":"ContainerDied","Data":"3b5a70f20e70f6513b38e895cd6ed3052877ed77174a1256917650d28c0a8ba8"} Mar 14 10:04:24 crc kubenswrapper[5129]: I0314 10:04:24.414336 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9f2b4" Mar 14 10:04:24 crc kubenswrapper[5129]: I0314 10:04:24.568361 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc495069-8f9c-4441-b9b4-c6881bf934a2-utilities\") pod \"cc495069-8f9c-4441-b9b4-c6881bf934a2\" (UID: \"cc495069-8f9c-4441-b9b4-c6881bf934a2\") " Mar 14 10:04:24 crc kubenswrapper[5129]: I0314 10:04:24.568509 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6z94\" (UniqueName: \"kubernetes.io/projected/cc495069-8f9c-4441-b9b4-c6881bf934a2-kube-api-access-l6z94\") pod \"cc495069-8f9c-4441-b9b4-c6881bf934a2\" (UID: \"cc495069-8f9c-4441-b9b4-c6881bf934a2\") " Mar 14 10:04:24 crc kubenswrapper[5129]: I0314 10:04:24.568668 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc495069-8f9c-4441-b9b4-c6881bf934a2-catalog-content\") pod \"cc495069-8f9c-4441-b9b4-c6881bf934a2\" (UID: \"cc495069-8f9c-4441-b9b4-c6881bf934a2\") " Mar 14 10:04:24 crc kubenswrapper[5129]: I0314 10:04:24.569737 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc495069-8f9c-4441-b9b4-c6881bf934a2-utilities" (OuterVolumeSpecName: "utilities") pod "cc495069-8f9c-4441-b9b4-c6881bf934a2" (UID: "cc495069-8f9c-4441-b9b4-c6881bf934a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:04:24 crc kubenswrapper[5129]: I0314 10:04:24.580705 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc495069-8f9c-4441-b9b4-c6881bf934a2-kube-api-access-l6z94" (OuterVolumeSpecName: "kube-api-access-l6z94") pod "cc495069-8f9c-4441-b9b4-c6881bf934a2" (UID: "cc495069-8f9c-4441-b9b4-c6881bf934a2"). InnerVolumeSpecName "kube-api-access-l6z94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:04:24 crc kubenswrapper[5129]: I0314 10:04:24.620993 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc495069-8f9c-4441-b9b4-c6881bf934a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc495069-8f9c-4441-b9b4-c6881bf934a2" (UID: "cc495069-8f9c-4441-b9b4-c6881bf934a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:04:24 crc kubenswrapper[5129]: I0314 10:04:24.687485 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc495069-8f9c-4441-b9b4-c6881bf934a2-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 10:04:24 crc kubenswrapper[5129]: I0314 10:04:24.687531 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6z94\" (UniqueName: \"kubernetes.io/projected/cc495069-8f9c-4441-b9b4-c6881bf934a2-kube-api-access-l6z94\") on node \"crc\" DevicePath \"\"" Mar 14 10:04:24 crc kubenswrapper[5129]: I0314 10:04:24.687544 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc495069-8f9c-4441-b9b4-c6881bf934a2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 10:04:25 crc kubenswrapper[5129]: I0314 10:04:25.271661 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9f2b4" event={"ID":"cc495069-8f9c-4441-b9b4-c6881bf934a2","Type":"ContainerDied","Data":"d43dd37ad7c901bc30ffd8507e9ee9f61f615f67c0f47ec83ed3762149822d02"} Mar 14 10:04:25 crc kubenswrapper[5129]: I0314 10:04:25.272198 5129 scope.go:117] "RemoveContainer" containerID="3b5a70f20e70f6513b38e895cd6ed3052877ed77174a1256917650d28c0a8ba8" Mar 14 10:04:25 crc kubenswrapper[5129]: I0314 10:04:25.271764 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9f2b4" Mar 14 10:04:25 crc kubenswrapper[5129]: I0314 10:04:25.304202 5129 scope.go:117] "RemoveContainer" containerID="f052e2537261fa368b3e37c03ab56cc863f2a022c6bf784ec3c833d111d35a14" Mar 14 10:04:25 crc kubenswrapper[5129]: I0314 10:04:25.347160 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9f2b4"] Mar 14 10:04:25 crc kubenswrapper[5129]: I0314 10:04:25.353028 5129 scope.go:117] "RemoveContainer" containerID="b2a36edd67bbc40216ddb6e31dcf21944bb3f3f296c0e4ccee0f74f21160c45b" Mar 14 10:04:25 crc kubenswrapper[5129]: I0314 10:04:25.358695 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9f2b4"] Mar 14 10:04:26 crc kubenswrapper[5129]: I0314 10:04:26.056988 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc495069-8f9c-4441-b9b4-c6881bf934a2" path="/var/lib/kubelet/pods/cc495069-8f9c-4441-b9b4-c6881bf934a2/volumes" Mar 14 10:04:39 crc kubenswrapper[5129]: I0314 10:04:39.446656 5129 scope.go:117] "RemoveContainer" containerID="ca3d2b9c21ed43d9bca45926aa0338fa92281b455121d922b84ccf600f4931a6" Mar 14 10:04:39 crc kubenswrapper[5129]: I0314 10:04:39.507177 5129 scope.go:117] "RemoveContainer" containerID="05ec201ac0bc772f2cc230173ca6640d09318f9f60f7a012fba3d5dcb5ca470d" Mar 14 10:05:05 crc kubenswrapper[5129]: I0314 10:05:05.765689 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-xrrhm"] Mar 14 10:05:05 crc kubenswrapper[5129]: E0314 10:05:05.766876 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc495069-8f9c-4441-b9b4-c6881bf934a2" containerName="extract-utilities" Mar 14 10:05:05 crc kubenswrapper[5129]: I0314 10:05:05.766895 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc495069-8f9c-4441-b9b4-c6881bf934a2" containerName="extract-utilities" Mar 14 10:05:05 crc kubenswrapper[5129]: 
E0314 10:05:05.766917 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc495069-8f9c-4441-b9b4-c6881bf934a2" containerName="extract-content" Mar 14 10:05:05 crc kubenswrapper[5129]: I0314 10:05:05.766926 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc495069-8f9c-4441-b9b4-c6881bf934a2" containerName="extract-content" Mar 14 10:05:05 crc kubenswrapper[5129]: E0314 10:05:05.766958 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc495069-8f9c-4441-b9b4-c6881bf934a2" containerName="registry-server" Mar 14 10:05:05 crc kubenswrapper[5129]: I0314 10:05:05.766964 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc495069-8f9c-4441-b9b4-c6881bf934a2" containerName="registry-server" Mar 14 10:05:05 crc kubenswrapper[5129]: I0314 10:05:05.767235 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc495069-8f9c-4441-b9b4-c6881bf934a2" containerName="registry-server" Mar 14 10:05:05 crc kubenswrapper[5129]: I0314 10:05:05.768345 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:05 crc kubenswrapper[5129]: I0314 10:05:05.774991 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 10:05:05 crc kubenswrapper[5129]: I0314 10:05:05.779586 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 10:05:05 crc kubenswrapper[5129]: I0314 10:05:05.791972 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-xrrhm"] Mar 14 10:05:05 crc kubenswrapper[5129]: I0314 10:05:05.905112 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c6ebf843-1cc8-411c-a978-c6967b2da699-ring-data-devices\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:05 crc kubenswrapper[5129]: I0314 10:05:05.905193 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c6ebf843-1cc8-411c-a978-c6967b2da699-dispersionconf\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:05 crc kubenswrapper[5129]: I0314 10:05:05.905264 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c6ebf843-1cc8-411c-a978-c6967b2da699-etc-swift\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:05 crc kubenswrapper[5129]: I0314 10:05:05.905303 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6ebf843-1cc8-411c-a978-c6967b2da699-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:05 crc kubenswrapper[5129]: I0314 10:05:05.905329 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c6ebf843-1cc8-411c-a978-c6967b2da699-swiftconf\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:05 crc kubenswrapper[5129]: I0314 10:05:05.905389 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qfkm\" (UniqueName: \"kubernetes.io/projected/c6ebf843-1cc8-411c-a978-c6967b2da699-kube-api-access-5qfkm\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:05 crc kubenswrapper[5129]: I0314 10:05:05.905443 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6ebf843-1cc8-411c-a978-c6967b2da699-scripts\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:06 crc kubenswrapper[5129]: I0314 10:05:06.007699 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6ebf843-1cc8-411c-a978-c6967b2da699-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:06 crc kubenswrapper[5129]: I0314 10:05:06.007772 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c6ebf843-1cc8-411c-a978-c6967b2da699-swiftconf\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:06 crc kubenswrapper[5129]: I0314 10:05:06.007863 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qfkm\" (UniqueName: \"kubernetes.io/projected/c6ebf843-1cc8-411c-a978-c6967b2da699-kube-api-access-5qfkm\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:06 crc kubenswrapper[5129]: I0314 10:05:06.007922 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6ebf843-1cc8-411c-a978-c6967b2da699-scripts\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:06 crc kubenswrapper[5129]: I0314 10:05:06.007970 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c6ebf843-1cc8-411c-a978-c6967b2da699-ring-data-devices\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:06 crc kubenswrapper[5129]: I0314 10:05:06.008014 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c6ebf843-1cc8-411c-a978-c6967b2da699-dispersionconf\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:06 crc kubenswrapper[5129]: I0314 10:05:06.008065 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c6ebf843-1cc8-411c-a978-c6967b2da699-etc-swift\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:06 crc kubenswrapper[5129]: I0314 10:05:06.008746 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c6ebf843-1cc8-411c-a978-c6967b2da699-etc-swift\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:06 crc kubenswrapper[5129]: I0314 10:05:06.009332 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6ebf843-1cc8-411c-a978-c6967b2da699-scripts\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:06 crc kubenswrapper[5129]: I0314 10:05:06.009336 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c6ebf843-1cc8-411c-a978-c6967b2da699-ring-data-devices\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:06 crc kubenswrapper[5129]: I0314 10:05:06.014113 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c6ebf843-1cc8-411c-a978-c6967b2da699-swiftconf\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:06 crc kubenswrapper[5129]: I0314 10:05:06.019912 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/c6ebf843-1cc8-411c-a978-c6967b2da699-dispersionconf\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:06 crc kubenswrapper[5129]: I0314 10:05:06.022564 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qfkm\" (UniqueName: \"kubernetes.io/projected/c6ebf843-1cc8-411c-a978-c6967b2da699-kube-api-access-5qfkm\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:06 crc kubenswrapper[5129]: I0314 10:05:06.039352 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6ebf843-1cc8-411c-a978-c6967b2da699-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-xrrhm\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:06 crc kubenswrapper[5129]: I0314 10:05:06.098438 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:06 crc kubenswrapper[5129]: I0314 10:05:06.663095 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-xrrhm"] Mar 14 10:05:06 crc kubenswrapper[5129]: I0314 10:05:06.778736 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-xrrhm" event={"ID":"c6ebf843-1cc8-411c-a978-c6967b2da699","Type":"ContainerStarted","Data":"d50af3e522c324cc586ce6e7d67d20b85beeda3caf4f3f363aeec923ebd7a36b"} Mar 14 10:05:07 crc kubenswrapper[5129]: I0314 10:05:07.791357 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-xrrhm" event={"ID":"c6ebf843-1cc8-411c-a978-c6967b2da699","Type":"ContainerStarted","Data":"d8fbd9a083bd0b175d34aa3116a38086fb5d28ff132cb61b8d7a4a9e32d35d9b"} Mar 14 10:05:07 crc kubenswrapper[5129]: I0314 10:05:07.859302 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-xrrhm" podStartSLOduration=2.859271158 podStartE2EDuration="2.859271158s" podCreationTimestamp="2026-03-14 10:05:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:05:07.822352164 +0000 UTC m=+11170.574267348" watchObservedRunningTime="2026-03-14 10:05:07.859271158 +0000 UTC m=+11170.611186342" Mar 14 10:05:18 crc kubenswrapper[5129]: I0314 10:05:18.918339 5129 generic.go:334] "Generic (PLEG): container finished" podID="c6ebf843-1cc8-411c-a978-c6967b2da699" containerID="d8fbd9a083bd0b175d34aa3116a38086fb5d28ff132cb61b8d7a4a9e32d35d9b" exitCode=0 Mar 14 10:05:18 crc kubenswrapper[5129]: I0314 10:05:18.919562 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-xrrhm" 
event={"ID":"c6ebf843-1cc8-411c-a978-c6967b2da699","Type":"ContainerDied","Data":"d8fbd9a083bd0b175d34aa3116a38086fb5d28ff132cb61b8d7a4a9e32d35d9b"} Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.205330 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.271924 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-xrrhm"] Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.289597 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-xrrhm"] Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.310203 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c6ebf843-1cc8-411c-a978-c6967b2da699-etc-swift\") pod \"c6ebf843-1cc8-411c-a978-c6967b2da699\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.310287 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c6ebf843-1cc8-411c-a978-c6967b2da699-ring-data-devices\") pod \"c6ebf843-1cc8-411c-a978-c6967b2da699\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.310371 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c6ebf843-1cc8-411c-a978-c6967b2da699-dispersionconf\") pod \"c6ebf843-1cc8-411c-a978-c6967b2da699\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.310419 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6ebf843-1cc8-411c-a978-c6967b2da699-combined-ca-bundle\") 
pod \"c6ebf843-1cc8-411c-a978-c6967b2da699\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.310521 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c6ebf843-1cc8-411c-a978-c6967b2da699-swiftconf\") pod \"c6ebf843-1cc8-411c-a978-c6967b2da699\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.310561 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qfkm\" (UniqueName: \"kubernetes.io/projected/c6ebf843-1cc8-411c-a978-c6967b2da699-kube-api-access-5qfkm\") pod \"c6ebf843-1cc8-411c-a978-c6967b2da699\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.310686 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6ebf843-1cc8-411c-a978-c6967b2da699-scripts\") pod \"c6ebf843-1cc8-411c-a978-c6967b2da699\" (UID: \"c6ebf843-1cc8-411c-a978-c6967b2da699\") " Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.313316 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6ebf843-1cc8-411c-a978-c6967b2da699-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c6ebf843-1cc8-411c-a978-c6967b2da699" (UID: "c6ebf843-1cc8-411c-a978-c6967b2da699"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.314381 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6ebf843-1cc8-411c-a978-c6967b2da699-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c6ebf843-1cc8-411c-a978-c6967b2da699" (UID: "c6ebf843-1cc8-411c-a978-c6967b2da699"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.318932 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ebf843-1cc8-411c-a978-c6967b2da699-kube-api-access-5qfkm" (OuterVolumeSpecName: "kube-api-access-5qfkm") pod "c6ebf843-1cc8-411c-a978-c6967b2da699" (UID: "c6ebf843-1cc8-411c-a978-c6967b2da699"). InnerVolumeSpecName "kube-api-access-5qfkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.347956 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6ebf843-1cc8-411c-a978-c6967b2da699-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c6ebf843-1cc8-411c-a978-c6967b2da699" (UID: "c6ebf843-1cc8-411c-a978-c6967b2da699"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.348759 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6ebf843-1cc8-411c-a978-c6967b2da699-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c6ebf843-1cc8-411c-a978-c6967b2da699" (UID: "c6ebf843-1cc8-411c-a978-c6967b2da699"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.357472 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6ebf843-1cc8-411c-a978-c6967b2da699-scripts" (OuterVolumeSpecName: "scripts") pod "c6ebf843-1cc8-411c-a978-c6967b2da699" (UID: "c6ebf843-1cc8-411c-a978-c6967b2da699"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.362073 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6ebf843-1cc8-411c-a978-c6967b2da699-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6ebf843-1cc8-411c-a978-c6967b2da699" (UID: "c6ebf843-1cc8-411c-a978-c6967b2da699"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.413421 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c6ebf843-1cc8-411c-a978-c6967b2da699-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.413469 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qfkm\" (UniqueName: \"kubernetes.io/projected/c6ebf843-1cc8-411c-a978-c6967b2da699-kube-api-access-5qfkm\") on node \"crc\" DevicePath \"\"" Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.413484 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6ebf843-1cc8-411c-a978-c6967b2da699-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.413495 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c6ebf843-1cc8-411c-a978-c6967b2da699-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.413508 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c6ebf843-1cc8-411c-a978-c6967b2da699-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.413519 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/c6ebf843-1cc8-411c-a978-c6967b2da699-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.413529 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6ebf843-1cc8-411c-a978-c6967b2da699-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.965799 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d50af3e522c324cc586ce6e7d67d20b85beeda3caf4f3f363aeec923ebd7a36b" Mar 14 10:05:21 crc kubenswrapper[5129]: I0314 10:05:21.965941 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-xrrhm" Mar 14 10:05:22 crc kubenswrapper[5129]: I0314 10:05:22.057961 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6ebf843-1cc8-411c-a978-c6967b2da699" path="/var/lib/kubelet/pods/c6ebf843-1cc8-411c-a978-c6967b2da699/volumes" Mar 14 10:05:39 crc kubenswrapper[5129]: I0314 10:05:39.660368 5129 scope.go:117] "RemoveContainer" containerID="05f9f9b8c8374c2c5bd3b8b8756c9637e8531364b7d0cd91cc56643e040f7bbf" Mar 14 10:05:49 crc kubenswrapper[5129]: I0314 10:05:49.574328 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:05:49 crc kubenswrapper[5129]: I0314 10:05:49.575098 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:06:00 crc 
kubenswrapper[5129]: I0314 10:06:00.145436 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558046-2hj7v"] Mar 14 10:06:00 crc kubenswrapper[5129]: E0314 10:06:00.147429 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ebf843-1cc8-411c-a978-c6967b2da699" containerName="swift-ring-rebalance" Mar 14 10:06:00 crc kubenswrapper[5129]: I0314 10:06:00.147586 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ebf843-1cc8-411c-a978-c6967b2da699" containerName="swift-ring-rebalance" Mar 14 10:06:00 crc kubenswrapper[5129]: I0314 10:06:00.148026 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ebf843-1cc8-411c-a978-c6967b2da699" containerName="swift-ring-rebalance" Mar 14 10:06:00 crc kubenswrapper[5129]: I0314 10:06:00.149136 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558046-2hj7v" Mar 14 10:06:00 crc kubenswrapper[5129]: I0314 10:06:00.154077 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:06:00 crc kubenswrapper[5129]: I0314 10:06:00.154383 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:06:00 crc kubenswrapper[5129]: I0314 10:06:00.154872 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:06:00 crc kubenswrapper[5129]: I0314 10:06:00.164274 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558046-2hj7v"] Mar 14 10:06:00 crc kubenswrapper[5129]: I0314 10:06:00.267865 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxcm2\" (UniqueName: \"kubernetes.io/projected/778526c8-262a-4a3f-a4bb-d895c2d576c3-kube-api-access-bxcm2\") pod \"auto-csr-approver-29558046-2hj7v\" 
(UID: \"778526c8-262a-4a3f-a4bb-d895c2d576c3\") " pod="openshift-infra/auto-csr-approver-29558046-2hj7v" Mar 14 10:06:00 crc kubenswrapper[5129]: I0314 10:06:00.370737 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxcm2\" (UniqueName: \"kubernetes.io/projected/778526c8-262a-4a3f-a4bb-d895c2d576c3-kube-api-access-bxcm2\") pod \"auto-csr-approver-29558046-2hj7v\" (UID: \"778526c8-262a-4a3f-a4bb-d895c2d576c3\") " pod="openshift-infra/auto-csr-approver-29558046-2hj7v" Mar 14 10:06:00 crc kubenswrapper[5129]: I0314 10:06:00.399948 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxcm2\" (UniqueName: \"kubernetes.io/projected/778526c8-262a-4a3f-a4bb-d895c2d576c3-kube-api-access-bxcm2\") pod \"auto-csr-approver-29558046-2hj7v\" (UID: \"778526c8-262a-4a3f-a4bb-d895c2d576c3\") " pod="openshift-infra/auto-csr-approver-29558046-2hj7v" Mar 14 10:06:00 crc kubenswrapper[5129]: I0314 10:06:00.472772 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558046-2hj7v" Mar 14 10:06:01 crc kubenswrapper[5129]: I0314 10:06:01.071265 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558046-2hj7v"] Mar 14 10:06:01 crc kubenswrapper[5129]: I0314 10:06:01.081372 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 10:06:01 crc kubenswrapper[5129]: I0314 10:06:01.441365 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558046-2hj7v" event={"ID":"778526c8-262a-4a3f-a4bb-d895c2d576c3","Type":"ContainerStarted","Data":"ea392f7481d3ebede2eb7a27ae0ab4dc284cd6cae7d4d42e8ce8940f51991df0"} Mar 14 10:06:03 crc kubenswrapper[5129]: I0314 10:06:03.463424 5129 generic.go:334] "Generic (PLEG): container finished" podID="778526c8-262a-4a3f-a4bb-d895c2d576c3" containerID="9440ff55d54b7ae5c3592cdf1c00e7b20a63e8834b43eafbd48007a6457afa69" exitCode=0 Mar 14 10:06:03 crc kubenswrapper[5129]: I0314 10:06:03.463489 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558046-2hj7v" event={"ID":"778526c8-262a-4a3f-a4bb-d895c2d576c3","Type":"ContainerDied","Data":"9440ff55d54b7ae5c3592cdf1c00e7b20a63e8834b43eafbd48007a6457afa69"} Mar 14 10:06:05 crc kubenswrapper[5129]: I0314 10:06:05.322510 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558046-2hj7v" Mar 14 10:06:05 crc kubenswrapper[5129]: I0314 10:06:05.487080 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558046-2hj7v" event={"ID":"778526c8-262a-4a3f-a4bb-d895c2d576c3","Type":"ContainerDied","Data":"ea392f7481d3ebede2eb7a27ae0ab4dc284cd6cae7d4d42e8ce8940f51991df0"} Mar 14 10:06:05 crc kubenswrapper[5129]: I0314 10:06:05.487125 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea392f7481d3ebede2eb7a27ae0ab4dc284cd6cae7d4d42e8ce8940f51991df0" Mar 14 10:06:05 crc kubenswrapper[5129]: I0314 10:06:05.487122 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558046-2hj7v" Mar 14 10:06:05 crc kubenswrapper[5129]: I0314 10:06:05.498069 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxcm2\" (UniqueName: \"kubernetes.io/projected/778526c8-262a-4a3f-a4bb-d895c2d576c3-kube-api-access-bxcm2\") pod \"778526c8-262a-4a3f-a4bb-d895c2d576c3\" (UID: \"778526c8-262a-4a3f-a4bb-d895c2d576c3\") " Mar 14 10:06:05 crc kubenswrapper[5129]: I0314 10:06:05.503768 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/778526c8-262a-4a3f-a4bb-d895c2d576c3-kube-api-access-bxcm2" (OuterVolumeSpecName: "kube-api-access-bxcm2") pod "778526c8-262a-4a3f-a4bb-d895c2d576c3" (UID: "778526c8-262a-4a3f-a4bb-d895c2d576c3"). InnerVolumeSpecName "kube-api-access-bxcm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:06:05 crc kubenswrapper[5129]: I0314 10:06:05.601230 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxcm2\" (UniqueName: \"kubernetes.io/projected/778526c8-262a-4a3f-a4bb-d895c2d576c3-kube-api-access-bxcm2\") on node \"crc\" DevicePath \"\"" Mar 14 10:06:06 crc kubenswrapper[5129]: I0314 10:06:06.403896 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558040-6bqj2"] Mar 14 10:06:06 crc kubenswrapper[5129]: I0314 10:06:06.415299 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558040-6bqj2"] Mar 14 10:06:08 crc kubenswrapper[5129]: I0314 10:06:08.051320 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fbb2ad7-729a-418f-a7be-e8db9ab1bb15" path="/var/lib/kubelet/pods/2fbb2ad7-729a-418f-a7be-e8db9ab1bb15/volumes" Mar 14 10:06:19 crc kubenswrapper[5129]: I0314 10:06:19.574211 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:06:19 crc kubenswrapper[5129]: I0314 10:06:19.575240 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.438855 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-4jbps"] Mar 14 10:06:21 crc kubenswrapper[5129]: E0314 10:06:21.439344 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="778526c8-262a-4a3f-a4bb-d895c2d576c3" containerName="oc" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.439356 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="778526c8-262a-4a3f-a4bb-d895c2d576c3" containerName="oc" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.439571 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="778526c8-262a-4a3f-a4bb-d895c2d576c3" containerName="oc" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.440297 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.442433 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.443051 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.463687 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-4jbps"] Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.621518 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/994d8b24-c304-464f-a65e-01be87335135-dispersionconf\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.621580 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7wfd\" (UniqueName: \"kubernetes.io/projected/994d8b24-c304-464f-a65e-01be87335135-kube-api-access-m7wfd\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc 
kubenswrapper[5129]: I0314 10:06:21.621627 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/994d8b24-c304-464f-a65e-01be87335135-swiftconf\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.621657 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/994d8b24-c304-464f-a65e-01be87335135-ring-data-devices\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.621764 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/994d8b24-c304-464f-a65e-01be87335135-etc-swift\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.621893 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/994d8b24-c304-464f-a65e-01be87335135-scripts\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.621974 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/994d8b24-c304-464f-a65e-01be87335135-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " 
pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.723923 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/994d8b24-c304-464f-a65e-01be87335135-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.724101 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/994d8b24-c304-464f-a65e-01be87335135-dispersionconf\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.724176 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/994d8b24-c304-464f-a65e-01be87335135-swiftconf\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.724203 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7wfd\" (UniqueName: \"kubernetes.io/projected/994d8b24-c304-464f-a65e-01be87335135-kube-api-access-m7wfd\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.724237 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/994d8b24-c304-464f-a65e-01be87335135-ring-data-devices\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " 
pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.724274 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/994d8b24-c304-464f-a65e-01be87335135-etc-swift\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.724317 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/994d8b24-c304-464f-a65e-01be87335135-scripts\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.725170 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/994d8b24-c304-464f-a65e-01be87335135-scripts\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.726683 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/994d8b24-c304-464f-a65e-01be87335135-etc-swift\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.726864 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/994d8b24-c304-464f-a65e-01be87335135-ring-data-devices\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc 
kubenswrapper[5129]: I0314 10:06:21.734449 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/994d8b24-c304-464f-a65e-01be87335135-swiftconf\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.743648 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/994d8b24-c304-464f-a65e-01be87335135-dispersionconf\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.743731 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/994d8b24-c304-464f-a65e-01be87335135-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.747886 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7wfd\" (UniqueName: \"kubernetes.io/projected/994d8b24-c304-464f-a65e-01be87335135-kube-api-access-m7wfd\") pod \"swift-ring-rebalance-debug-4jbps\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:21 crc kubenswrapper[5129]: I0314 10:06:21.769024 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:22 crc kubenswrapper[5129]: I0314 10:06:22.302874 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-4jbps"] Mar 14 10:06:22 crc kubenswrapper[5129]: I0314 10:06:22.743265 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-4jbps" event={"ID":"994d8b24-c304-464f-a65e-01be87335135","Type":"ContainerStarted","Data":"ca6f36738cd71117852950f72ffa14ffac3bbb350f7e6c2fccd083f54cd3cffb"} Mar 14 10:06:22 crc kubenswrapper[5129]: I0314 10:06:22.744623 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-4jbps" event={"ID":"994d8b24-c304-464f-a65e-01be87335135","Type":"ContainerStarted","Data":"22022034cc5c92999274c40dda76475d42b92470a2bb986f9eb28f29c7e42820"} Mar 14 10:06:22 crc kubenswrapper[5129]: I0314 10:06:22.795765 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-4jbps" podStartSLOduration=1.795734664 podStartE2EDuration="1.795734664s" podCreationTimestamp="2026-03-14 10:06:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:06:22.785281861 +0000 UTC m=+11245.537197055" watchObservedRunningTime="2026-03-14 10:06:22.795734664 +0000 UTC m=+11245.547649878" Mar 14 10:06:31 crc kubenswrapper[5129]: I0314 10:06:31.857210 5129 generic.go:334] "Generic (PLEG): container finished" podID="994d8b24-c304-464f-a65e-01be87335135" containerID="ca6f36738cd71117852950f72ffa14ffac3bbb350f7e6c2fccd083f54cd3cffb" exitCode=0 Mar 14 10:06:31 crc kubenswrapper[5129]: I0314 10:06:31.857318 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-4jbps" 
event={"ID":"994d8b24-c304-464f-a65e-01be87335135","Type":"ContainerDied","Data":"ca6f36738cd71117852950f72ffa14ffac3bbb350f7e6c2fccd083f54cd3cffb"} Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.247985 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.296789 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-4jbps"] Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.316528 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-4jbps"] Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.346062 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/994d8b24-c304-464f-a65e-01be87335135-swiftconf\") pod \"994d8b24-c304-464f-a65e-01be87335135\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.346314 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7wfd\" (UniqueName: \"kubernetes.io/projected/994d8b24-c304-464f-a65e-01be87335135-kube-api-access-m7wfd\") pod \"994d8b24-c304-464f-a65e-01be87335135\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.346482 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/994d8b24-c304-464f-a65e-01be87335135-combined-ca-bundle\") pod \"994d8b24-c304-464f-a65e-01be87335135\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.346569 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/994d8b24-c304-464f-a65e-01be87335135-scripts\") pod 
\"994d8b24-c304-464f-a65e-01be87335135\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.346705 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/994d8b24-c304-464f-a65e-01be87335135-dispersionconf\") pod \"994d8b24-c304-464f-a65e-01be87335135\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.346799 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/994d8b24-c304-464f-a65e-01be87335135-ring-data-devices\") pod \"994d8b24-c304-464f-a65e-01be87335135\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.347001 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/994d8b24-c304-464f-a65e-01be87335135-etc-swift\") pod \"994d8b24-c304-464f-a65e-01be87335135\" (UID: \"994d8b24-c304-464f-a65e-01be87335135\") " Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.348434 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/994d8b24-c304-464f-a65e-01be87335135-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "994d8b24-c304-464f-a65e-01be87335135" (UID: "994d8b24-c304-464f-a65e-01be87335135"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.349358 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/994d8b24-c304-464f-a65e-01be87335135-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "994d8b24-c304-464f-a65e-01be87335135" (UID: "994d8b24-c304-464f-a65e-01be87335135"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.369816 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/994d8b24-c304-464f-a65e-01be87335135-kube-api-access-m7wfd" (OuterVolumeSpecName: "kube-api-access-m7wfd") pod "994d8b24-c304-464f-a65e-01be87335135" (UID: "994d8b24-c304-464f-a65e-01be87335135"). InnerVolumeSpecName "kube-api-access-m7wfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.382193 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/994d8b24-c304-464f-a65e-01be87335135-scripts" (OuterVolumeSpecName: "scripts") pod "994d8b24-c304-464f-a65e-01be87335135" (UID: "994d8b24-c304-464f-a65e-01be87335135"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.388904 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/994d8b24-c304-464f-a65e-01be87335135-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "994d8b24-c304-464f-a65e-01be87335135" (UID: "994d8b24-c304-464f-a65e-01be87335135"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.405496 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/994d8b24-c304-464f-a65e-01be87335135-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "994d8b24-c304-464f-a65e-01be87335135" (UID: "994d8b24-c304-464f-a65e-01be87335135"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.421820 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/994d8b24-c304-464f-a65e-01be87335135-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "994d8b24-c304-464f-a65e-01be87335135" (UID: "994d8b24-c304-464f-a65e-01be87335135"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.449983 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/994d8b24-c304-464f-a65e-01be87335135-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.450021 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/994d8b24-c304-464f-a65e-01be87335135-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.450032 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/994d8b24-c304-464f-a65e-01be87335135-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.450040 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/994d8b24-c304-464f-a65e-01be87335135-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.450048 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/994d8b24-c304-464f-a65e-01be87335135-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.450056 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/994d8b24-c304-464f-a65e-01be87335135-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.450064 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7wfd\" (UniqueName: \"kubernetes.io/projected/994d8b24-c304-464f-a65e-01be87335135-kube-api-access-m7wfd\") on node \"crc\" DevicePath \"\"" Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.925250 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22022034cc5c92999274c40dda76475d42b92470a2bb986f9eb28f29c7e42820" Mar 14 10:06:34 crc kubenswrapper[5129]: I0314 10:06:34.925331 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-4jbps" Mar 14 10:06:36 crc kubenswrapper[5129]: I0314 10:06:36.060941 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="994d8b24-c304-464f-a65e-01be87335135" path="/var/lib/kubelet/pods/994d8b24-c304-464f-a65e-01be87335135/volumes" Mar 14 10:06:39 crc kubenswrapper[5129]: I0314 10:06:39.754009 5129 scope.go:117] "RemoveContainer" containerID="665bee379c6ae7e4c2953e5bf670794a88b968eefe0f434b944fbff91557e6bd" Mar 14 10:06:39 crc kubenswrapper[5129]: I0314 10:06:39.806944 5129 scope.go:117] "RemoveContainer" containerID="03571b17cb0547261e15b98f55bae0f240fba5eaeb90ada894e4b42479455bc1" Mar 14 10:06:49 crc kubenswrapper[5129]: I0314 10:06:49.574140 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:06:49 crc kubenswrapper[5129]: I0314 10:06:49.575135 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:06:49 crc kubenswrapper[5129]: I0314 10:06:49.575497 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 10:06:49 crc kubenswrapper[5129]: I0314 10:06:49.576321 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 10:06:49 crc kubenswrapper[5129]: I0314 10:06:49.576378 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" gracePeriod=600 Mar 14 10:06:49 crc kubenswrapper[5129]: E0314 10:06:49.699029 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:06:50 crc kubenswrapper[5129]: I0314 10:06:50.123965 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" exitCode=0 Mar 14 10:06:50 crc kubenswrapper[5129]: I0314 
10:06:50.124047 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311"} Mar 14 10:06:50 crc kubenswrapper[5129]: I0314 10:06:50.124143 5129 scope.go:117] "RemoveContainer" containerID="41973fc889c20df973d49b23f7c22998ca1537f65950641326a14bf146e131d6" Mar 14 10:06:50 crc kubenswrapper[5129]: I0314 10:06:50.125048 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:06:50 crc kubenswrapper[5129]: E0314 10:06:50.125520 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:07:03 crc kubenswrapper[5129]: I0314 10:07:03.037534 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:07:03 crc kubenswrapper[5129]: E0314 10:07:03.038393 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:07:16 crc kubenswrapper[5129]: I0314 10:07:16.859473 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-chtrc"] Mar 14 10:07:16 crc 
kubenswrapper[5129]: E0314 10:07:16.860675 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="994d8b24-c304-464f-a65e-01be87335135" containerName="swift-ring-rebalance" Mar 14 10:07:16 crc kubenswrapper[5129]: I0314 10:07:16.860695 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="994d8b24-c304-464f-a65e-01be87335135" containerName="swift-ring-rebalance" Mar 14 10:07:16 crc kubenswrapper[5129]: I0314 10:07:16.861032 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="994d8b24-c304-464f-a65e-01be87335135" containerName="swift-ring-rebalance" Mar 14 10:07:16 crc kubenswrapper[5129]: I0314 10:07:16.863107 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-chtrc" Mar 14 10:07:16 crc kubenswrapper[5129]: I0314 10:07:16.916419 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-chtrc"] Mar 14 10:07:16 crc kubenswrapper[5129]: I0314 10:07:16.971481 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e30f884f-9da4-4dd6-9498-9eba64de14d7-utilities\") pod \"redhat-marketplace-chtrc\" (UID: \"e30f884f-9da4-4dd6-9498-9eba64de14d7\") " pod="openshift-marketplace/redhat-marketplace-chtrc" Mar 14 10:07:16 crc kubenswrapper[5129]: I0314 10:07:16.971560 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e30f884f-9da4-4dd6-9498-9eba64de14d7-catalog-content\") pod \"redhat-marketplace-chtrc\" (UID: \"e30f884f-9da4-4dd6-9498-9eba64de14d7\") " pod="openshift-marketplace/redhat-marketplace-chtrc" Mar 14 10:07:16 crc kubenswrapper[5129]: I0314 10:07:16.971660 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8brt8\" (UniqueName: 
\"kubernetes.io/projected/e30f884f-9da4-4dd6-9498-9eba64de14d7-kube-api-access-8brt8\") pod \"redhat-marketplace-chtrc\" (UID: \"e30f884f-9da4-4dd6-9498-9eba64de14d7\") " pod="openshift-marketplace/redhat-marketplace-chtrc" Mar 14 10:07:17 crc kubenswrapper[5129]: I0314 10:07:17.045991 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:07:17 crc kubenswrapper[5129]: E0314 10:07:17.046218 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:07:17 crc kubenswrapper[5129]: I0314 10:07:17.075517 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8brt8\" (UniqueName: \"kubernetes.io/projected/e30f884f-9da4-4dd6-9498-9eba64de14d7-kube-api-access-8brt8\") pod \"redhat-marketplace-chtrc\" (UID: \"e30f884f-9da4-4dd6-9498-9eba64de14d7\") " pod="openshift-marketplace/redhat-marketplace-chtrc" Mar 14 10:07:17 crc kubenswrapper[5129]: I0314 10:07:17.075764 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e30f884f-9da4-4dd6-9498-9eba64de14d7-utilities\") pod \"redhat-marketplace-chtrc\" (UID: \"e30f884f-9da4-4dd6-9498-9eba64de14d7\") " pod="openshift-marketplace/redhat-marketplace-chtrc" Mar 14 10:07:17 crc kubenswrapper[5129]: I0314 10:07:17.075810 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e30f884f-9da4-4dd6-9498-9eba64de14d7-catalog-content\") pod \"redhat-marketplace-chtrc\" (UID: 
\"e30f884f-9da4-4dd6-9498-9eba64de14d7\") " pod="openshift-marketplace/redhat-marketplace-chtrc" Mar 14 10:07:17 crc kubenswrapper[5129]: I0314 10:07:17.076416 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e30f884f-9da4-4dd6-9498-9eba64de14d7-catalog-content\") pod \"redhat-marketplace-chtrc\" (UID: \"e30f884f-9da4-4dd6-9498-9eba64de14d7\") " pod="openshift-marketplace/redhat-marketplace-chtrc" Mar 14 10:07:17 crc kubenswrapper[5129]: I0314 10:07:17.076728 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e30f884f-9da4-4dd6-9498-9eba64de14d7-utilities\") pod \"redhat-marketplace-chtrc\" (UID: \"e30f884f-9da4-4dd6-9498-9eba64de14d7\") " pod="openshift-marketplace/redhat-marketplace-chtrc" Mar 14 10:07:17 crc kubenswrapper[5129]: I0314 10:07:17.116580 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8brt8\" (UniqueName: \"kubernetes.io/projected/e30f884f-9da4-4dd6-9498-9eba64de14d7-kube-api-access-8brt8\") pod \"redhat-marketplace-chtrc\" (UID: \"e30f884f-9da4-4dd6-9498-9eba64de14d7\") " pod="openshift-marketplace/redhat-marketplace-chtrc" Mar 14 10:07:17 crc kubenswrapper[5129]: I0314 10:07:17.187155 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-chtrc" Mar 14 10:07:18 crc kubenswrapper[5129]: I0314 10:07:18.064354 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-chtrc"] Mar 14 10:07:18 crc kubenswrapper[5129]: I0314 10:07:18.469766 5129 generic.go:334] "Generic (PLEG): container finished" podID="e30f884f-9da4-4dd6-9498-9eba64de14d7" containerID="43d28d29d846ab6674a06003177c0ceaf2326d8fcb4ee03630a6af8414188d7f" exitCode=0 Mar 14 10:07:18 crc kubenswrapper[5129]: I0314 10:07:18.469809 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chtrc" event={"ID":"e30f884f-9da4-4dd6-9498-9eba64de14d7","Type":"ContainerDied","Data":"43d28d29d846ab6674a06003177c0ceaf2326d8fcb4ee03630a6af8414188d7f"} Mar 14 10:07:18 crc kubenswrapper[5129]: I0314 10:07:18.470144 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chtrc" event={"ID":"e30f884f-9da4-4dd6-9498-9eba64de14d7","Type":"ContainerStarted","Data":"d78049b304cf6af9deefc71ada2ab605dd6bc98b0aa6c8390eeb45558c60000f"} Mar 14 10:07:19 crc kubenswrapper[5129]: I0314 10:07:19.491366 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chtrc" event={"ID":"e30f884f-9da4-4dd6-9498-9eba64de14d7","Type":"ContainerStarted","Data":"a43ede8ea4e21bbc02a73190a5dfeb9055dfafe7b80a25cb6b5e4368e02ba173"} Mar 14 10:07:20 crc kubenswrapper[5129]: I0314 10:07:20.505501 5129 generic.go:334] "Generic (PLEG): container finished" podID="e30f884f-9da4-4dd6-9498-9eba64de14d7" containerID="a43ede8ea4e21bbc02a73190a5dfeb9055dfafe7b80a25cb6b5e4368e02ba173" exitCode=0 Mar 14 10:07:20 crc kubenswrapper[5129]: I0314 10:07:20.505553 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chtrc" 
event={"ID":"e30f884f-9da4-4dd6-9498-9eba64de14d7","Type":"ContainerDied","Data":"a43ede8ea4e21bbc02a73190a5dfeb9055dfafe7b80a25cb6b5e4368e02ba173"} Mar 14 10:07:21 crc kubenswrapper[5129]: I0314 10:07:21.525039 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chtrc" event={"ID":"e30f884f-9da4-4dd6-9498-9eba64de14d7","Type":"ContainerStarted","Data":"2f1d008f525e3cf9074cbadf7e4fc205b3e54afabf0f229201f1d4b61e4e7d43"} Mar 14 10:07:21 crc kubenswrapper[5129]: I0314 10:07:21.565818 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-chtrc" podStartSLOduration=3.134304798 podStartE2EDuration="5.565794447s" podCreationTimestamp="2026-03-14 10:07:16 +0000 UTC" firstStartedPulling="2026-03-14 10:07:18.47163682 +0000 UTC m=+11301.223552004" lastFinishedPulling="2026-03-14 10:07:20.903126469 +0000 UTC m=+11303.655041653" observedRunningTime="2026-03-14 10:07:21.560298908 +0000 UTC m=+11304.312214082" watchObservedRunningTime="2026-03-14 10:07:21.565794447 +0000 UTC m=+11304.317709631" Mar 14 10:07:27 crc kubenswrapper[5129]: I0314 10:07:27.189101 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-chtrc" Mar 14 10:07:27 crc kubenswrapper[5129]: I0314 10:07:27.189758 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-chtrc" Mar 14 10:07:27 crc kubenswrapper[5129]: I0314 10:07:27.246956 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-chtrc" Mar 14 10:07:27 crc kubenswrapper[5129]: I0314 10:07:27.674754 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-chtrc" Mar 14 10:07:27 crc kubenswrapper[5129]: I0314 10:07:27.732761 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-chtrc"] Mar 14 10:07:29 crc kubenswrapper[5129]: I0314 10:07:29.036515 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:07:29 crc kubenswrapper[5129]: E0314 10:07:29.037338 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:07:29 crc kubenswrapper[5129]: I0314 10:07:29.644099 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-chtrc" podUID="e30f884f-9da4-4dd6-9498-9eba64de14d7" containerName="registry-server" containerID="cri-o://2f1d008f525e3cf9074cbadf7e4fc205b3e54afabf0f229201f1d4b61e4e7d43" gracePeriod=2 Mar 14 10:07:30 crc kubenswrapper[5129]: I0314 10:07:30.669662 5129 generic.go:334] "Generic (PLEG): container finished" podID="e30f884f-9da4-4dd6-9498-9eba64de14d7" containerID="2f1d008f525e3cf9074cbadf7e4fc205b3e54afabf0f229201f1d4b61e4e7d43" exitCode=0 Mar 14 10:07:30 crc kubenswrapper[5129]: I0314 10:07:30.670165 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chtrc" event={"ID":"e30f884f-9da4-4dd6-9498-9eba64de14d7","Type":"ContainerDied","Data":"2f1d008f525e3cf9074cbadf7e4fc205b3e54afabf0f229201f1d4b61e4e7d43"} Mar 14 10:07:30 crc kubenswrapper[5129]: I0314 10:07:30.784733 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-chtrc" Mar 14 10:07:30 crc kubenswrapper[5129]: I0314 10:07:30.910810 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e30f884f-9da4-4dd6-9498-9eba64de14d7-utilities\") pod \"e30f884f-9da4-4dd6-9498-9eba64de14d7\" (UID: \"e30f884f-9da4-4dd6-9498-9eba64de14d7\") " Mar 14 10:07:30 crc kubenswrapper[5129]: I0314 10:07:30.911536 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8brt8\" (UniqueName: \"kubernetes.io/projected/e30f884f-9da4-4dd6-9498-9eba64de14d7-kube-api-access-8brt8\") pod \"e30f884f-9da4-4dd6-9498-9eba64de14d7\" (UID: \"e30f884f-9da4-4dd6-9498-9eba64de14d7\") " Mar 14 10:07:30 crc kubenswrapper[5129]: I0314 10:07:30.911657 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e30f884f-9da4-4dd6-9498-9eba64de14d7-catalog-content\") pod \"e30f884f-9da4-4dd6-9498-9eba64de14d7\" (UID: \"e30f884f-9da4-4dd6-9498-9eba64de14d7\") " Mar 14 10:07:30 crc kubenswrapper[5129]: I0314 10:07:30.911991 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e30f884f-9da4-4dd6-9498-9eba64de14d7-utilities" (OuterVolumeSpecName: "utilities") pod "e30f884f-9da4-4dd6-9498-9eba64de14d7" (UID: "e30f884f-9da4-4dd6-9498-9eba64de14d7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:07:30 crc kubenswrapper[5129]: I0314 10:07:30.912412 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e30f884f-9da4-4dd6-9498-9eba64de14d7-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 10:07:30 crc kubenswrapper[5129]: I0314 10:07:30.922109 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e30f884f-9da4-4dd6-9498-9eba64de14d7-kube-api-access-8brt8" (OuterVolumeSpecName: "kube-api-access-8brt8") pod "e30f884f-9da4-4dd6-9498-9eba64de14d7" (UID: "e30f884f-9da4-4dd6-9498-9eba64de14d7"). InnerVolumeSpecName "kube-api-access-8brt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:07:30 crc kubenswrapper[5129]: I0314 10:07:30.943003 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e30f884f-9da4-4dd6-9498-9eba64de14d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e30f884f-9da4-4dd6-9498-9eba64de14d7" (UID: "e30f884f-9da4-4dd6-9498-9eba64de14d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:07:31 crc kubenswrapper[5129]: I0314 10:07:31.015508 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8brt8\" (UniqueName: \"kubernetes.io/projected/e30f884f-9da4-4dd6-9498-9eba64de14d7-kube-api-access-8brt8\") on node \"crc\" DevicePath \"\"" Mar 14 10:07:31 crc kubenswrapper[5129]: I0314 10:07:31.015564 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e30f884f-9da4-4dd6-9498-9eba64de14d7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 10:07:31 crc kubenswrapper[5129]: I0314 10:07:31.683752 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chtrc" event={"ID":"e30f884f-9da4-4dd6-9498-9eba64de14d7","Type":"ContainerDied","Data":"d78049b304cf6af9deefc71ada2ab605dd6bc98b0aa6c8390eeb45558c60000f"} Mar 14 10:07:31 crc kubenswrapper[5129]: I0314 10:07:31.683828 5129 scope.go:117] "RemoveContainer" containerID="2f1d008f525e3cf9074cbadf7e4fc205b3e54afabf0f229201f1d4b61e4e7d43" Mar 14 10:07:31 crc kubenswrapper[5129]: I0314 10:07:31.683849 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-chtrc" Mar 14 10:07:31 crc kubenswrapper[5129]: I0314 10:07:31.710357 5129 scope.go:117] "RemoveContainer" containerID="a43ede8ea4e21bbc02a73190a5dfeb9055dfafe7b80a25cb6b5e4368e02ba173" Mar 14 10:07:31 crc kubenswrapper[5129]: I0314 10:07:31.744576 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-chtrc"] Mar 14 10:07:31 crc kubenswrapper[5129]: I0314 10:07:31.777825 5129 scope.go:117] "RemoveContainer" containerID="43d28d29d846ab6674a06003177c0ceaf2326d8fcb4ee03630a6af8414188d7f" Mar 14 10:07:31 crc kubenswrapper[5129]: I0314 10:07:31.778824 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-chtrc"] Mar 14 10:07:32 crc kubenswrapper[5129]: I0314 10:07:32.050793 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e30f884f-9da4-4dd6-9498-9eba64de14d7" path="/var/lib/kubelet/pods/e30f884f-9da4-4dd6-9498-9eba64de14d7/volumes" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.555164 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-nt2mt"] Mar 14 10:07:34 crc kubenswrapper[5129]: E0314 10:07:34.556094 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e30f884f-9da4-4dd6-9498-9eba64de14d7" containerName="extract-content" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.556115 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e30f884f-9da4-4dd6-9498-9eba64de14d7" containerName="extract-content" Mar 14 10:07:34 crc kubenswrapper[5129]: E0314 10:07:34.556170 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e30f884f-9da4-4dd6-9498-9eba64de14d7" containerName="extract-utilities" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.556179 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e30f884f-9da4-4dd6-9498-9eba64de14d7" containerName="extract-utilities" Mar 14 
10:07:34 crc kubenswrapper[5129]: E0314 10:07:34.556192 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e30f884f-9da4-4dd6-9498-9eba64de14d7" containerName="registry-server" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.556201 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e30f884f-9da4-4dd6-9498-9eba64de14d7" containerName="registry-server" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.556465 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e30f884f-9da4-4dd6-9498-9eba64de14d7" containerName="registry-server" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.557448 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.561453 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.562133 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.598741 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-nt2mt"] Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.607595 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-scripts\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.607850 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-ring-data-devices\") pod 
\"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.607955 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.608076 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-swiftconf\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.608291 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-etc-swift\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.608333 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-dispersionconf\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.608428 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wvvl\" (UniqueName: 
\"kubernetes.io/projected/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-kube-api-access-9wvvl\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.710388 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-ring-data-devices\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.710459 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.710506 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-swiftconf\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.710610 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-etc-swift\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.710659 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-dispersionconf\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.710688 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wvvl\" (UniqueName: \"kubernetes.io/projected/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-kube-api-access-9wvvl\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.710762 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-scripts\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.711835 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-scripts\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.712129 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-etc-swift\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.718520 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-swiftconf\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.718852 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-ring-data-devices\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.719333 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-dispersionconf\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.731925 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wvvl\" (UniqueName: \"kubernetes.io/projected/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-kube-api-access-9wvvl\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.743824 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-nt2mt\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:34 crc kubenswrapper[5129]: I0314 10:07:34.910288 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:35 crc kubenswrapper[5129]: I0314 10:07:35.718979 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-nt2mt"] Mar 14 10:07:36 crc kubenswrapper[5129]: I0314 10:07:36.762451 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-nt2mt" event={"ID":"6c2941b4-9c4d-49f2-90f1-75dbfd22105d","Type":"ContainerStarted","Data":"84dbca154cffbb4ed440206d11a7f69472d37e88b1dc4cc716b3d0227e302322"} Mar 14 10:07:36 crc kubenswrapper[5129]: I0314 10:07:36.763045 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-nt2mt" event={"ID":"6c2941b4-9c4d-49f2-90f1-75dbfd22105d","Type":"ContainerStarted","Data":"e2231a6d4ee65cb6a615dc9eb4caeaa32b144c698f1150a213dd3c0652ad6e2f"} Mar 14 10:07:36 crc kubenswrapper[5129]: I0314 10:07:36.777476 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-nt2mt" podStartSLOduration=2.777450914 podStartE2EDuration="2.777450914s" podCreationTimestamp="2026-03-14 10:07:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:07:36.775228014 +0000 UTC m=+11319.527143208" watchObservedRunningTime="2026-03-14 10:07:36.777450914 +0000 UTC m=+11319.529366098" Mar 14 10:07:39 crc kubenswrapper[5129]: I0314 10:07:39.946282 5129 scope.go:117] "RemoveContainer" containerID="78102a913892a792348c5e7c7b510175c62ebf6a9602d647c99c5c436e2315d7" Mar 14 10:07:43 crc kubenswrapper[5129]: I0314 10:07:43.036933 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:07:43 crc kubenswrapper[5129]: E0314 10:07:43.037844 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:07:46 crc kubenswrapper[5129]: I0314 10:07:46.912739 5129 generic.go:334] "Generic (PLEG): container finished" podID="6c2941b4-9c4d-49f2-90f1-75dbfd22105d" containerID="84dbca154cffbb4ed440206d11a7f69472d37e88b1dc4cc716b3d0227e302322" exitCode=0 Mar 14 10:07:46 crc kubenswrapper[5129]: I0314 10:07:46.912832 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-nt2mt" event={"ID":"6c2941b4-9c4d-49f2-90f1-75dbfd22105d","Type":"ContainerDied","Data":"84dbca154cffbb4ed440206d11a7f69472d37e88b1dc4cc716b3d0227e302322"} Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.411819 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.438852 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-swiftconf\") pod \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.439217 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-scripts\") pod \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.439374 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wvvl\" (UniqueName: \"kubernetes.io/projected/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-kube-api-access-9wvvl\") pod \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.439495 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-etc-swift\") pod \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.439579 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-combined-ca-bundle\") pod \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.439632 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-ring-data-devices\") pod \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.439672 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-dispersionconf\") pod \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\" (UID: \"6c2941b4-9c4d-49f2-90f1-75dbfd22105d\") " Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.440159 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6c2941b4-9c4d-49f2-90f1-75dbfd22105d" (UID: "6c2941b4-9c4d-49f2-90f1-75dbfd22105d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.440299 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6c2941b4-9c4d-49f2-90f1-75dbfd22105d" (UID: "6c2941b4-9c4d-49f2-90f1-75dbfd22105d"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.440561 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.440579 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.468934 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-kube-api-access-9wvvl" (OuterVolumeSpecName: "kube-api-access-9wvvl") pod "6c2941b4-9c4d-49f2-90f1-75dbfd22105d" (UID: "6c2941b4-9c4d-49f2-90f1-75dbfd22105d"). InnerVolumeSpecName "kube-api-access-9wvvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.491833 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-nt2mt"] Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.504725 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-scripts" (OuterVolumeSpecName: "scripts") pod "6c2941b4-9c4d-49f2-90f1-75dbfd22105d" (UID: "6c2941b4-9c4d-49f2-90f1-75dbfd22105d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.507261 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c2941b4-9c4d-49f2-90f1-75dbfd22105d" (UID: "6c2941b4-9c4d-49f2-90f1-75dbfd22105d"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.507787 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6c2941b4-9c4d-49f2-90f1-75dbfd22105d" (UID: "6c2941b4-9c4d-49f2-90f1-75dbfd22105d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.523957 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-nt2mt"] Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.527622 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6c2941b4-9c4d-49f2-90f1-75dbfd22105d" (UID: "6c2941b4-9c4d-49f2-90f1-75dbfd22105d"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.542199 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.542239 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wvvl\" (UniqueName: \"kubernetes.io/projected/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-kube-api-access-9wvvl\") on node \"crc\" DevicePath \"\"" Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.542251 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.542261 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.542270 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c2941b4-9c4d-49f2-90f1-75dbfd22105d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.945276 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2231a6d4ee65cb6a615dc9eb4caeaa32b144c698f1150a213dd3c0652ad6e2f" Mar 14 10:07:49 crc kubenswrapper[5129]: I0314 10:07:49.945378 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-nt2mt" Mar 14 10:07:50 crc kubenswrapper[5129]: I0314 10:07:50.051091 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c2941b4-9c4d-49f2-90f1-75dbfd22105d" path="/var/lib/kubelet/pods/6c2941b4-9c4d-49f2-90f1-75dbfd22105d/volumes" Mar 14 10:07:58 crc kubenswrapper[5129]: I0314 10:07:58.045889 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:07:58 crc kubenswrapper[5129]: E0314 10:07:58.046841 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:08:00 crc kubenswrapper[5129]: I0314 10:08:00.162421 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558048-7cf44"] Mar 14 10:08:00 crc kubenswrapper[5129]: E0314 10:08:00.163339 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c2941b4-9c4d-49f2-90f1-75dbfd22105d" containerName="swift-ring-rebalance" Mar 14 10:08:00 crc kubenswrapper[5129]: I0314 10:08:00.163366 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c2941b4-9c4d-49f2-90f1-75dbfd22105d" containerName="swift-ring-rebalance" Mar 14 10:08:00 crc kubenswrapper[5129]: I0314 10:08:00.163844 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c2941b4-9c4d-49f2-90f1-75dbfd22105d" containerName="swift-ring-rebalance" Mar 14 10:08:00 crc kubenswrapper[5129]: I0314 10:08:00.164945 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558048-7cf44" Mar 14 10:08:00 crc kubenswrapper[5129]: I0314 10:08:00.168056 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:08:00 crc kubenswrapper[5129]: I0314 10:08:00.168063 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:08:00 crc kubenswrapper[5129]: I0314 10:08:00.170217 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:08:00 crc kubenswrapper[5129]: I0314 10:08:00.184462 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558048-7cf44"] Mar 14 10:08:00 crc kubenswrapper[5129]: I0314 10:08:00.315893 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knl7z\" (UniqueName: \"kubernetes.io/projected/d4e46763-0519-4c45-b8fe-cd6d0b069f13-kube-api-access-knl7z\") pod \"auto-csr-approver-29558048-7cf44\" (UID: \"d4e46763-0519-4c45-b8fe-cd6d0b069f13\") " pod="openshift-infra/auto-csr-approver-29558048-7cf44" Mar 14 10:08:00 crc kubenswrapper[5129]: I0314 10:08:00.419735 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knl7z\" (UniqueName: \"kubernetes.io/projected/d4e46763-0519-4c45-b8fe-cd6d0b069f13-kube-api-access-knl7z\") pod \"auto-csr-approver-29558048-7cf44\" (UID: \"d4e46763-0519-4c45-b8fe-cd6d0b069f13\") " pod="openshift-infra/auto-csr-approver-29558048-7cf44" Mar 14 10:08:00 crc kubenswrapper[5129]: I0314 10:08:00.444834 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knl7z\" (UniqueName: \"kubernetes.io/projected/d4e46763-0519-4c45-b8fe-cd6d0b069f13-kube-api-access-knl7z\") pod \"auto-csr-approver-29558048-7cf44\" (UID: \"d4e46763-0519-4c45-b8fe-cd6d0b069f13\") " 
pod="openshift-infra/auto-csr-approver-29558048-7cf44" Mar 14 10:08:00 crc kubenswrapper[5129]: I0314 10:08:00.493481 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558048-7cf44" Mar 14 10:08:01 crc kubenswrapper[5129]: I0314 10:08:01.269682 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558048-7cf44"] Mar 14 10:08:02 crc kubenswrapper[5129]: I0314 10:08:02.150932 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558048-7cf44" event={"ID":"d4e46763-0519-4c45-b8fe-cd6d0b069f13","Type":"ContainerStarted","Data":"fd0aecd32e5f2019da4dec896f2d45b1837d1b9e038e83c087f91b76374fceba"} Mar 14 10:08:03 crc kubenswrapper[5129]: I0314 10:08:03.167693 5129 generic.go:334] "Generic (PLEG): container finished" podID="d4e46763-0519-4c45-b8fe-cd6d0b069f13" containerID="95d2b0ab8c3accb25327cfed7c1e926019c039cac5bf681f655741fb46e0c4fb" exitCode=0 Mar 14 10:08:03 crc kubenswrapper[5129]: I0314 10:08:03.168187 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558048-7cf44" event={"ID":"d4e46763-0519-4c45-b8fe-cd6d0b069f13","Type":"ContainerDied","Data":"95d2b0ab8c3accb25327cfed7c1e926019c039cac5bf681f655741fb46e0c4fb"} Mar 14 10:08:05 crc kubenswrapper[5129]: I0314 10:08:05.204067 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558048-7cf44" event={"ID":"d4e46763-0519-4c45-b8fe-cd6d0b069f13","Type":"ContainerDied","Data":"fd0aecd32e5f2019da4dec896f2d45b1837d1b9e038e83c087f91b76374fceba"} Mar 14 10:08:05 crc kubenswrapper[5129]: I0314 10:08:05.204547 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd0aecd32e5f2019da4dec896f2d45b1837d1b9e038e83c087f91b76374fceba" Mar 14 10:08:05 crc kubenswrapper[5129]: I0314 10:08:05.287155 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558048-7cf44" Mar 14 10:08:05 crc kubenswrapper[5129]: I0314 10:08:05.449552 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knl7z\" (UniqueName: \"kubernetes.io/projected/d4e46763-0519-4c45-b8fe-cd6d0b069f13-kube-api-access-knl7z\") pod \"d4e46763-0519-4c45-b8fe-cd6d0b069f13\" (UID: \"d4e46763-0519-4c45-b8fe-cd6d0b069f13\") " Mar 14 10:08:05 crc kubenswrapper[5129]: I0314 10:08:05.455695 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4e46763-0519-4c45-b8fe-cd6d0b069f13-kube-api-access-knl7z" (OuterVolumeSpecName: "kube-api-access-knl7z") pod "d4e46763-0519-4c45-b8fe-cd6d0b069f13" (UID: "d4e46763-0519-4c45-b8fe-cd6d0b069f13"). InnerVolumeSpecName "kube-api-access-knl7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:08:05 crc kubenswrapper[5129]: I0314 10:08:05.554802 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knl7z\" (UniqueName: \"kubernetes.io/projected/d4e46763-0519-4c45-b8fe-cd6d0b069f13-kube-api-access-knl7z\") on node \"crc\" DevicePath \"\"" Mar 14 10:08:06 crc kubenswrapper[5129]: I0314 10:08:06.217524 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558048-7cf44" Mar 14 10:08:06 crc kubenswrapper[5129]: I0314 10:08:06.396583 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558042-qdxv8"] Mar 14 10:08:06 crc kubenswrapper[5129]: I0314 10:08:06.411321 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558042-qdxv8"] Mar 14 10:08:08 crc kubenswrapper[5129]: I0314 10:08:08.072353 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7d5758a-a0b4-4478-8a0a-2d4f9c42011e" path="/var/lib/kubelet/pods/b7d5758a-a0b4-4478-8a0a-2d4f9c42011e/volumes" Mar 14 10:08:12 crc kubenswrapper[5129]: I0314 10:08:12.037261 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:08:12 crc kubenswrapper[5129]: E0314 10:08:12.037995 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:08:19 crc kubenswrapper[5129]: I0314 10:08:19.297954 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fzv2j"] Mar 14 10:08:19 crc kubenswrapper[5129]: E0314 10:08:19.299375 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e46763-0519-4c45-b8fe-cd6d0b069f13" containerName="oc" Mar 14 10:08:19 crc kubenswrapper[5129]: I0314 10:08:19.299397 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e46763-0519-4c45-b8fe-cd6d0b069f13" containerName="oc" Mar 14 10:08:19 crc kubenswrapper[5129]: I0314 10:08:19.299846 5129 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d4e46763-0519-4c45-b8fe-cd6d0b069f13" containerName="oc" Mar 14 10:08:19 crc kubenswrapper[5129]: I0314 10:08:19.302683 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fzv2j" Mar 14 10:08:19 crc kubenswrapper[5129]: I0314 10:08:19.314039 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fzv2j"] Mar 14 10:08:19 crc kubenswrapper[5129]: I0314 10:08:19.402891 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a090521-4f5b-4a48-ad5c-9405b72e40b2-utilities\") pod \"redhat-operators-fzv2j\" (UID: \"3a090521-4f5b-4a48-ad5c-9405b72e40b2\") " pod="openshift-marketplace/redhat-operators-fzv2j" Mar 14 10:08:19 crc kubenswrapper[5129]: I0314 10:08:19.403258 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfrlp\" (UniqueName: \"kubernetes.io/projected/3a090521-4f5b-4a48-ad5c-9405b72e40b2-kube-api-access-zfrlp\") pod \"redhat-operators-fzv2j\" (UID: \"3a090521-4f5b-4a48-ad5c-9405b72e40b2\") " pod="openshift-marketplace/redhat-operators-fzv2j" Mar 14 10:08:19 crc kubenswrapper[5129]: I0314 10:08:19.403535 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a090521-4f5b-4a48-ad5c-9405b72e40b2-catalog-content\") pod \"redhat-operators-fzv2j\" (UID: \"3a090521-4f5b-4a48-ad5c-9405b72e40b2\") " pod="openshift-marketplace/redhat-operators-fzv2j" Mar 14 10:08:19 crc kubenswrapper[5129]: I0314 10:08:19.505252 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a090521-4f5b-4a48-ad5c-9405b72e40b2-catalog-content\") pod \"redhat-operators-fzv2j\" (UID: 
\"3a090521-4f5b-4a48-ad5c-9405b72e40b2\") " pod="openshift-marketplace/redhat-operators-fzv2j" Mar 14 10:08:19 crc kubenswrapper[5129]: I0314 10:08:19.505332 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a090521-4f5b-4a48-ad5c-9405b72e40b2-utilities\") pod \"redhat-operators-fzv2j\" (UID: \"3a090521-4f5b-4a48-ad5c-9405b72e40b2\") " pod="openshift-marketplace/redhat-operators-fzv2j" Mar 14 10:08:19 crc kubenswrapper[5129]: I0314 10:08:19.505399 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfrlp\" (UniqueName: \"kubernetes.io/projected/3a090521-4f5b-4a48-ad5c-9405b72e40b2-kube-api-access-zfrlp\") pod \"redhat-operators-fzv2j\" (UID: \"3a090521-4f5b-4a48-ad5c-9405b72e40b2\") " pod="openshift-marketplace/redhat-operators-fzv2j" Mar 14 10:08:19 crc kubenswrapper[5129]: I0314 10:08:19.506169 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a090521-4f5b-4a48-ad5c-9405b72e40b2-utilities\") pod \"redhat-operators-fzv2j\" (UID: \"3a090521-4f5b-4a48-ad5c-9405b72e40b2\") " pod="openshift-marketplace/redhat-operators-fzv2j" Mar 14 10:08:19 crc kubenswrapper[5129]: I0314 10:08:19.506519 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a090521-4f5b-4a48-ad5c-9405b72e40b2-catalog-content\") pod \"redhat-operators-fzv2j\" (UID: \"3a090521-4f5b-4a48-ad5c-9405b72e40b2\") " pod="openshift-marketplace/redhat-operators-fzv2j" Mar 14 10:08:19 crc kubenswrapper[5129]: I0314 10:08:19.528711 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfrlp\" (UniqueName: \"kubernetes.io/projected/3a090521-4f5b-4a48-ad5c-9405b72e40b2-kube-api-access-zfrlp\") pod \"redhat-operators-fzv2j\" (UID: \"3a090521-4f5b-4a48-ad5c-9405b72e40b2\") " 
pod="openshift-marketplace/redhat-operators-fzv2j" Mar 14 10:08:19 crc kubenswrapper[5129]: I0314 10:08:19.647289 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fzv2j" Mar 14 10:08:20 crc kubenswrapper[5129]: I0314 10:08:20.402691 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fzv2j"] Mar 14 10:08:21 crc kubenswrapper[5129]: I0314 10:08:21.422322 5129 generic.go:334] "Generic (PLEG): container finished" podID="3a090521-4f5b-4a48-ad5c-9405b72e40b2" containerID="1a06e0d420a74cb9ce56913007e0cea528c2012b492b73cc3dceafa3946a5a1c" exitCode=0 Mar 14 10:08:21 crc kubenswrapper[5129]: I0314 10:08:21.422410 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzv2j" event={"ID":"3a090521-4f5b-4a48-ad5c-9405b72e40b2","Type":"ContainerDied","Data":"1a06e0d420a74cb9ce56913007e0cea528c2012b492b73cc3dceafa3946a5a1c"} Mar 14 10:08:21 crc kubenswrapper[5129]: I0314 10:08:21.422725 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzv2j" event={"ID":"3a090521-4f5b-4a48-ad5c-9405b72e40b2","Type":"ContainerStarted","Data":"00b90fc7fd74562a626675079fc5c2bab952b5a5803dd3409a6c773586ab6a39"} Mar 14 10:08:22 crc kubenswrapper[5129]: I0314 10:08:22.437424 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzv2j" event={"ID":"3a090521-4f5b-4a48-ad5c-9405b72e40b2","Type":"ContainerStarted","Data":"4e6dbf98cb23824872c3a0dee3707ddf7161d06e9c768505e0027896d58a3f43"} Mar 14 10:08:26 crc kubenswrapper[5129]: I0314 10:08:26.036735 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:08:26 crc kubenswrapper[5129]: E0314 10:08:26.038382 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:08:27 crc kubenswrapper[5129]: I0314 10:08:27.506459 5129 generic.go:334] "Generic (PLEG): container finished" podID="3a090521-4f5b-4a48-ad5c-9405b72e40b2" containerID="4e6dbf98cb23824872c3a0dee3707ddf7161d06e9c768505e0027896d58a3f43" exitCode=0 Mar 14 10:08:27 crc kubenswrapper[5129]: I0314 10:08:27.506657 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzv2j" event={"ID":"3a090521-4f5b-4a48-ad5c-9405b72e40b2","Type":"ContainerDied","Data":"4e6dbf98cb23824872c3a0dee3707ddf7161d06e9c768505e0027896d58a3f43"} Mar 14 10:08:28 crc kubenswrapper[5129]: I0314 10:08:28.528050 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzv2j" event={"ID":"3a090521-4f5b-4a48-ad5c-9405b72e40b2","Type":"ContainerStarted","Data":"be9ecb5da8527ea87733f2ec6ac7cd755506e8d28ad85aaa1fbeec1cc54a8034"} Mar 14 10:08:28 crc kubenswrapper[5129]: I0314 10:08:28.560516 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fzv2j" podStartSLOduration=2.841456766 podStartE2EDuration="9.560467146s" podCreationTimestamp="2026-03-14 10:08:19 +0000 UTC" firstStartedPulling="2026-03-14 10:08:21.424001391 +0000 UTC m=+11364.175916575" lastFinishedPulling="2026-03-14 10:08:28.143011771 +0000 UTC m=+11370.894926955" observedRunningTime="2026-03-14 10:08:28.551113973 +0000 UTC m=+11371.303029177" watchObservedRunningTime="2026-03-14 10:08:28.560467146 +0000 UTC m=+11371.312382330" Mar 14 10:08:29 crc kubenswrapper[5129]: I0314 10:08:29.648088 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-fzv2j" Mar 14 10:08:29 crc kubenswrapper[5129]: I0314 10:08:29.648388 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fzv2j" Mar 14 10:08:30 crc kubenswrapper[5129]: I0314 10:08:30.716396 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fzv2j" podUID="3a090521-4f5b-4a48-ad5c-9405b72e40b2" containerName="registry-server" probeResult="failure" output=< Mar 14 10:08:30 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:08:30 crc kubenswrapper[5129]: > Mar 14 10:08:39 crc kubenswrapper[5129]: I0314 10:08:39.037040 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:08:39 crc kubenswrapper[5129]: E0314 10:08:39.037950 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:08:40 crc kubenswrapper[5129]: I0314 10:08:40.090944 5129 scope.go:117] "RemoveContainer" containerID="873f274cf5699cee45598cd08591c3572beb8bc4b01681ffa23ec5a614a4b29f" Mar 14 10:08:40 crc kubenswrapper[5129]: I0314 10:08:40.712320 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fzv2j" podUID="3a090521-4f5b-4a48-ad5c-9405b72e40b2" containerName="registry-server" probeResult="failure" output=< Mar 14 10:08:40 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:08:40 crc kubenswrapper[5129]: > Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.643624 5129 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-66kj7"] Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.646382 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.649882 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.650010 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.656921 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-66kj7"] Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.725532 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fzv2j" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.752587 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/baff5b24-d1a9-4826-805b-425c628d3d5d-dispersionconf\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.752717 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baff5b24-d1a9-4826-805b-425c628d3d5d-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.752755 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/baff5b24-d1a9-4826-805b-425c628d3d5d-ring-data-devices\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.752774 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/baff5b24-d1a9-4826-805b-425c628d3d5d-etc-swift\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.752837 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/baff5b24-d1a9-4826-805b-425c628d3d5d-swiftconf\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.752854 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/baff5b24-d1a9-4826-805b-425c628d3d5d-scripts\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.752872 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zgvd\" (UniqueName: \"kubernetes.io/projected/baff5b24-d1a9-4826-805b-425c628d3d5d-kube-api-access-4zgvd\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.795390 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-fzv2j" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.855274 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/baff5b24-d1a9-4826-805b-425c628d3d5d-dispersionconf\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.855412 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baff5b24-d1a9-4826-805b-425c628d3d5d-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.855456 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/baff5b24-d1a9-4826-805b-425c628d3d5d-ring-data-devices\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.855479 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/baff5b24-d1a9-4826-805b-425c628d3d5d-etc-swift\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.855535 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/baff5b24-d1a9-4826-805b-425c628d3d5d-swiftconf\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " 
pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.855557 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/baff5b24-d1a9-4826-805b-425c628d3d5d-scripts\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.855575 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zgvd\" (UniqueName: \"kubernetes.io/projected/baff5b24-d1a9-4826-805b-425c628d3d5d-kube-api-access-4zgvd\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.856669 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/baff5b24-d1a9-4826-805b-425c628d3d5d-ring-data-devices\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.856908 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/baff5b24-d1a9-4826-805b-425c628d3d5d-etc-swift\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.859532 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/baff5b24-d1a9-4826-805b-425c628d3d5d-scripts\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 
10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.862440 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baff5b24-d1a9-4826-805b-425c628d3d5d-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.862804 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/baff5b24-d1a9-4826-805b-425c628d3d5d-dispersionconf\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.868950 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/baff5b24-d1a9-4826-805b-425c628d3d5d-swiftconf\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.877230 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zgvd\" (UniqueName: \"kubernetes.io/projected/baff5b24-d1a9-4826-805b-425c628d3d5d-kube-api-access-4zgvd\") pod \"swift-ring-rebalance-debug-66kj7\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:49 crc kubenswrapper[5129]: I0314 10:08:49.972128 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:08:50 crc kubenswrapper[5129]: I0314 10:08:50.502827 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fzv2j"] Mar 14 10:08:50 crc kubenswrapper[5129]: I0314 10:08:50.696395 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-66kj7"] Mar 14 10:08:50 crc kubenswrapper[5129]: I0314 10:08:50.797529 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-66kj7" event={"ID":"baff5b24-d1a9-4826-805b-425c628d3d5d","Type":"ContainerStarted","Data":"8b64b31ae98821b5729072199f33db6796ddd11755a211d94afba6283a2f147e"} Mar 14 10:08:50 crc kubenswrapper[5129]: I0314 10:08:50.797753 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fzv2j" podUID="3a090521-4f5b-4a48-ad5c-9405b72e40b2" containerName="registry-server" containerID="cri-o://be9ecb5da8527ea87733f2ec6ac7cd755506e8d28ad85aaa1fbeec1cc54a8034" gracePeriod=2 Mar 14 10:08:51 crc kubenswrapper[5129]: I0314 10:08:51.040079 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:08:51 crc kubenswrapper[5129]: E0314 10:08:51.041092 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:08:51 crc kubenswrapper[5129]: I0314 10:08:51.814231 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-66kj7" 
event={"ID":"baff5b24-d1a9-4826-805b-425c628d3d5d","Type":"ContainerStarted","Data":"d16feee0a3e9cb213de96f7af24b0ac8037a4b9829f9ee9b41ff76d2bcaaca51"} Mar 14 10:08:51 crc kubenswrapper[5129]: I0314 10:08:51.817868 5129 generic.go:334] "Generic (PLEG): container finished" podID="3a090521-4f5b-4a48-ad5c-9405b72e40b2" containerID="be9ecb5da8527ea87733f2ec6ac7cd755506e8d28ad85aaa1fbeec1cc54a8034" exitCode=0 Mar 14 10:08:51 crc kubenswrapper[5129]: I0314 10:08:51.817941 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzv2j" event={"ID":"3a090521-4f5b-4a48-ad5c-9405b72e40b2","Type":"ContainerDied","Data":"be9ecb5da8527ea87733f2ec6ac7cd755506e8d28ad85aaa1fbeec1cc54a8034"} Mar 14 10:08:51 crc kubenswrapper[5129]: I0314 10:08:51.856181 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-66kj7" podStartSLOduration=2.856152641 podStartE2EDuration="2.856152641s" podCreationTimestamp="2026-03-14 10:08:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:08:51.842143281 +0000 UTC m=+11394.594058505" watchObservedRunningTime="2026-03-14 10:08:51.856152641 +0000 UTC m=+11394.608067855" Mar 14 10:08:52 crc kubenswrapper[5129]: I0314 10:08:52.115526 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fzv2j" Mar 14 10:08:52 crc kubenswrapper[5129]: I0314 10:08:52.213462 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfrlp\" (UniqueName: \"kubernetes.io/projected/3a090521-4f5b-4a48-ad5c-9405b72e40b2-kube-api-access-zfrlp\") pod \"3a090521-4f5b-4a48-ad5c-9405b72e40b2\" (UID: \"3a090521-4f5b-4a48-ad5c-9405b72e40b2\") " Mar 14 10:08:52 crc kubenswrapper[5129]: I0314 10:08:52.213547 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a090521-4f5b-4a48-ad5c-9405b72e40b2-catalog-content\") pod \"3a090521-4f5b-4a48-ad5c-9405b72e40b2\" (UID: \"3a090521-4f5b-4a48-ad5c-9405b72e40b2\") " Mar 14 10:08:52 crc kubenswrapper[5129]: I0314 10:08:52.213634 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a090521-4f5b-4a48-ad5c-9405b72e40b2-utilities\") pod \"3a090521-4f5b-4a48-ad5c-9405b72e40b2\" (UID: \"3a090521-4f5b-4a48-ad5c-9405b72e40b2\") " Mar 14 10:08:52 crc kubenswrapper[5129]: I0314 10:08:52.215141 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a090521-4f5b-4a48-ad5c-9405b72e40b2-utilities" (OuterVolumeSpecName: "utilities") pod "3a090521-4f5b-4a48-ad5c-9405b72e40b2" (UID: "3a090521-4f5b-4a48-ad5c-9405b72e40b2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:08:52 crc kubenswrapper[5129]: I0314 10:08:52.220560 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a090521-4f5b-4a48-ad5c-9405b72e40b2-kube-api-access-zfrlp" (OuterVolumeSpecName: "kube-api-access-zfrlp") pod "3a090521-4f5b-4a48-ad5c-9405b72e40b2" (UID: "3a090521-4f5b-4a48-ad5c-9405b72e40b2"). InnerVolumeSpecName "kube-api-access-zfrlp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:08:52 crc kubenswrapper[5129]: I0314 10:08:52.316266 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfrlp\" (UniqueName: \"kubernetes.io/projected/3a090521-4f5b-4a48-ad5c-9405b72e40b2-kube-api-access-zfrlp\") on node \"crc\" DevicePath \"\"" Mar 14 10:08:52 crc kubenswrapper[5129]: I0314 10:08:52.316301 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a090521-4f5b-4a48-ad5c-9405b72e40b2-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 10:08:52 crc kubenswrapper[5129]: I0314 10:08:52.362565 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a090521-4f5b-4a48-ad5c-9405b72e40b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a090521-4f5b-4a48-ad5c-9405b72e40b2" (UID: "3a090521-4f5b-4a48-ad5c-9405b72e40b2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:08:52 crc kubenswrapper[5129]: I0314 10:08:52.419275 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a090521-4f5b-4a48-ad5c-9405b72e40b2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 10:08:52 crc kubenswrapper[5129]: I0314 10:08:52.834455 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzv2j" event={"ID":"3a090521-4f5b-4a48-ad5c-9405b72e40b2","Type":"ContainerDied","Data":"00b90fc7fd74562a626675079fc5c2bab952b5a5803dd3409a6c773586ab6a39"} Mar 14 10:08:52 crc kubenswrapper[5129]: I0314 10:08:52.834568 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fzv2j" Mar 14 10:08:52 crc kubenswrapper[5129]: I0314 10:08:52.834776 5129 scope.go:117] "RemoveContainer" containerID="be9ecb5da8527ea87733f2ec6ac7cd755506e8d28ad85aaa1fbeec1cc54a8034" Mar 14 10:08:52 crc kubenswrapper[5129]: I0314 10:08:52.868788 5129 scope.go:117] "RemoveContainer" containerID="4e6dbf98cb23824872c3a0dee3707ddf7161d06e9c768505e0027896d58a3f43" Mar 14 10:08:52 crc kubenswrapper[5129]: I0314 10:08:52.896131 5129 scope.go:117] "RemoveContainer" containerID="1a06e0d420a74cb9ce56913007e0cea528c2012b492b73cc3dceafa3946a5a1c" Mar 14 10:08:52 crc kubenswrapper[5129]: I0314 10:08:52.896736 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fzv2j"] Mar 14 10:08:52 crc kubenswrapper[5129]: I0314 10:08:52.909522 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fzv2j"] Mar 14 10:08:54 crc kubenswrapper[5129]: I0314 10:08:54.067510 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a090521-4f5b-4a48-ad5c-9405b72e40b2" path="/var/lib/kubelet/pods/3a090521-4f5b-4a48-ad5c-9405b72e40b2/volumes" Mar 14 10:08:59 crc kubenswrapper[5129]: I0314 10:08:59.914766 5129 generic.go:334] "Generic (PLEG): container finished" podID="baff5b24-d1a9-4826-805b-425c628d3d5d" containerID="d16feee0a3e9cb213de96f7af24b0ac8037a4b9829f9ee9b41ff76d2bcaaca51" exitCode=0 Mar 14 10:08:59 crc kubenswrapper[5129]: I0314 10:08:59.914825 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-66kj7" event={"ID":"baff5b24-d1a9-4826-805b-425c628d3d5d","Type":"ContainerDied","Data":"d16feee0a3e9cb213de96f7af24b0ac8037a4b9829f9ee9b41ff76d2bcaaca51"} Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.249285 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.284323 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baff5b24-d1a9-4826-805b-425c628d3d5d-combined-ca-bundle\") pod \"baff5b24-d1a9-4826-805b-425c628d3d5d\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.284414 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/baff5b24-d1a9-4826-805b-425c628d3d5d-etc-swift\") pod \"baff5b24-d1a9-4826-805b-425c628d3d5d\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.284524 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/baff5b24-d1a9-4826-805b-425c628d3d5d-dispersionconf\") pod \"baff5b24-d1a9-4826-805b-425c628d3d5d\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.284554 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/baff5b24-d1a9-4826-805b-425c628d3d5d-ring-data-devices\") pod \"baff5b24-d1a9-4826-805b-425c628d3d5d\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.284581 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zgvd\" (UniqueName: \"kubernetes.io/projected/baff5b24-d1a9-4826-805b-425c628d3d5d-kube-api-access-4zgvd\") pod \"baff5b24-d1a9-4826-805b-425c628d3d5d\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.284667 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/baff5b24-d1a9-4826-805b-425c628d3d5d-swiftconf\") pod \"baff5b24-d1a9-4826-805b-425c628d3d5d\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.284713 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/baff5b24-d1a9-4826-805b-425c628d3d5d-scripts\") pod \"baff5b24-d1a9-4826-805b-425c628d3d5d\" (UID: \"baff5b24-d1a9-4826-805b-425c628d3d5d\") " Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.285513 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baff5b24-d1a9-4826-805b-425c628d3d5d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "baff5b24-d1a9-4826-805b-425c628d3d5d" (UID: "baff5b24-d1a9-4826-805b-425c628d3d5d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.286569 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/baff5b24-d1a9-4826-805b-425c628d3d5d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.287958 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baff5b24-d1a9-4826-805b-425c628d3d5d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "baff5b24-d1a9-4826-805b-425c628d3d5d" (UID: "baff5b24-d1a9-4826-805b-425c628d3d5d"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.301869 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-66kj7"] Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.302733 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baff5b24-d1a9-4826-805b-425c628d3d5d-kube-api-access-4zgvd" (OuterVolumeSpecName: "kube-api-access-4zgvd") pod "baff5b24-d1a9-4826-805b-425c628d3d5d" (UID: "baff5b24-d1a9-4826-805b-425c628d3d5d"). InnerVolumeSpecName "kube-api-access-4zgvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.316039 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-66kj7"] Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.321931 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baff5b24-d1a9-4826-805b-425c628d3d5d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "baff5b24-d1a9-4826-805b-425c628d3d5d" (UID: "baff5b24-d1a9-4826-805b-425c628d3d5d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.340161 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baff5b24-d1a9-4826-805b-425c628d3d5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "baff5b24-d1a9-4826-805b-425c628d3d5d" (UID: "baff5b24-d1a9-4826-805b-425c628d3d5d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.348805 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baff5b24-d1a9-4826-805b-425c628d3d5d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "baff5b24-d1a9-4826-805b-425c628d3d5d" (UID: "baff5b24-d1a9-4826-805b-425c628d3d5d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.352659 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baff5b24-d1a9-4826-805b-425c628d3d5d-scripts" (OuterVolumeSpecName: "scripts") pod "baff5b24-d1a9-4826-805b-425c628d3d5d" (UID: "baff5b24-d1a9-4826-805b-425c628d3d5d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.388482 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/baff5b24-d1a9-4826-805b-425c628d3d5d-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.388514 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baff5b24-d1a9-4826-805b-425c628d3d5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.388527 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/baff5b24-d1a9-4826-805b-425c628d3d5d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.388537 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/baff5b24-d1a9-4826-805b-425c628d3d5d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:02 crc 
kubenswrapper[5129]: I0314 10:09:02.388547 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zgvd\" (UniqueName: \"kubernetes.io/projected/baff5b24-d1a9-4826-805b-425c628d3d5d-kube-api-access-4zgvd\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.388558 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/baff5b24-d1a9-4826-805b-425c628d3d5d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.627138 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-zldr4"] Mar 14 10:09:02 crc kubenswrapper[5129]: E0314 10:09:02.628237 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a090521-4f5b-4a48-ad5c-9405b72e40b2" containerName="registry-server" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.628422 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a090521-4f5b-4a48-ad5c-9405b72e40b2" containerName="registry-server" Mar 14 10:09:02 crc kubenswrapper[5129]: E0314 10:09:02.628569 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a090521-4f5b-4a48-ad5c-9405b72e40b2" containerName="extract-utilities" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.628802 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a090521-4f5b-4a48-ad5c-9405b72e40b2" containerName="extract-utilities" Mar 14 10:09:02 crc kubenswrapper[5129]: E0314 10:09:02.629049 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a090521-4f5b-4a48-ad5c-9405b72e40b2" containerName="extract-content" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.629205 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a090521-4f5b-4a48-ad5c-9405b72e40b2" containerName="extract-content" Mar 14 10:09:02 crc kubenswrapper[5129]: E0314 10:09:02.629354 5129 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="baff5b24-d1a9-4826-805b-425c628d3d5d" containerName="swift-ring-rebalance" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.629456 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="baff5b24-d1a9-4826-805b-425c628d3d5d" containerName="swift-ring-rebalance" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.630003 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a090521-4f5b-4a48-ad5c-9405b72e40b2" containerName="registry-server" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.630190 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="baff5b24-d1a9-4826-805b-425c628d3d5d" containerName="swift-ring-rebalance" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.631514 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.637897 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-zldr4"] Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.696590 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06b07df5-169f-449f-907f-ce03ba398f6e-etc-swift\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.697408 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06b07df5-169f-449f-907f-ce03ba398f6e-ring-data-devices\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.697641 5129 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06b07df5-169f-449f-907f-ce03ba398f6e-swiftconf\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.697783 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06b07df5-169f-449f-907f-ce03ba398f6e-dispersionconf\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.697949 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b07df5-169f-449f-907f-ce03ba398f6e-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.698103 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06b07df5-169f-449f-907f-ce03ba398f6e-scripts\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.698255 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cd4k\" (UniqueName: \"kubernetes.io/projected/06b07df5-169f-449f-907f-ce03ba398f6e-kube-api-access-7cd4k\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc 
kubenswrapper[5129]: I0314 10:09:02.800969 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06b07df5-169f-449f-907f-ce03ba398f6e-swiftconf\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.801009 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06b07df5-169f-449f-907f-ce03ba398f6e-dispersionconf\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.801073 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b07df5-169f-449f-907f-ce03ba398f6e-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.801111 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06b07df5-169f-449f-907f-ce03ba398f6e-scripts\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.801141 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cd4k\" (UniqueName: \"kubernetes.io/projected/06b07df5-169f-449f-907f-ce03ba398f6e-kube-api-access-7cd4k\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: 
I0314 10:09:02.801277 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06b07df5-169f-449f-907f-ce03ba398f6e-etc-swift\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.801332 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06b07df5-169f-449f-907f-ce03ba398f6e-ring-data-devices\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.802049 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06b07df5-169f-449f-907f-ce03ba398f6e-etc-swift\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.802239 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06b07df5-169f-449f-907f-ce03ba398f6e-scripts\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.802318 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06b07df5-169f-449f-907f-ce03ba398f6e-ring-data-devices\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.805825 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06b07df5-169f-449f-907f-ce03ba398f6e-swiftconf\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.806314 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b07df5-169f-449f-907f-ce03ba398f6e-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.809410 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06b07df5-169f-449f-907f-ce03ba398f6e-dispersionconf\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.818865 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cd4k\" (UniqueName: \"kubernetes.io/projected/06b07df5-169f-449f-907f-ce03ba398f6e-kube-api-access-7cd4k\") pod \"swift-ring-rebalance-debug-zldr4\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.953395 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b64b31ae98821b5729072199f33db6796ddd11755a211d94afba6283a2f147e" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.953497 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-66kj7" Mar 14 10:09:02 crc kubenswrapper[5129]: I0314 10:09:02.963488 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:03 crc kubenswrapper[5129]: I0314 10:09:03.500039 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-zldr4"] Mar 14 10:09:03 crc kubenswrapper[5129]: W0314 10:09:03.509636 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06b07df5_169f_449f_907f_ce03ba398f6e.slice/crio-c439f0a83175a94370aa0c0efafaa64a467c403764539da4fd3fdb8fc8ef7695 WatchSource:0}: Error finding container c439f0a83175a94370aa0c0efafaa64a467c403764539da4fd3fdb8fc8ef7695: Status 404 returned error can't find the container with id c439f0a83175a94370aa0c0efafaa64a467c403764539da4fd3fdb8fc8ef7695 Mar 14 10:09:03 crc kubenswrapper[5129]: I0314 10:09:03.966865 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-zldr4" event={"ID":"06b07df5-169f-449f-907f-ce03ba398f6e","Type":"ContainerStarted","Data":"68f1ceaa86d325d6c13a84bf243388f5ae290a2dc6d0df234c45815bedfcaa5f"} Mar 14 10:09:03 crc kubenswrapper[5129]: I0314 10:09:03.967215 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-zldr4" event={"ID":"06b07df5-169f-449f-907f-ce03ba398f6e","Type":"ContainerStarted","Data":"c439f0a83175a94370aa0c0efafaa64a467c403764539da4fd3fdb8fc8ef7695"} Mar 14 10:09:03 crc kubenswrapper[5129]: I0314 10:09:03.996047 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-zldr4" podStartSLOduration=1.9960256429999998 podStartE2EDuration="1.996025643s" podCreationTimestamp="2026-03-14 10:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:09:03.989146405 +0000 UTC m=+11406.741061620" watchObservedRunningTime="2026-03-14 10:09:03.996025643 +0000 UTC 
m=+11406.747940827" Mar 14 10:09:04 crc kubenswrapper[5129]: I0314 10:09:04.055172 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baff5b24-d1a9-4826-805b-425c628d3d5d" path="/var/lib/kubelet/pods/baff5b24-d1a9-4826-805b-425c628d3d5d/volumes" Mar 14 10:09:06 crc kubenswrapper[5129]: I0314 10:09:06.036231 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:09:06 crc kubenswrapper[5129]: E0314 10:09:06.036813 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:09:09 crc kubenswrapper[5129]: I0314 10:09:09.032166 5129 generic.go:334] "Generic (PLEG): container finished" podID="06b07df5-169f-449f-907f-ce03ba398f6e" containerID="68f1ceaa86d325d6c13a84bf243388f5ae290a2dc6d0df234c45815bedfcaa5f" exitCode=0 Mar 14 10:09:09 crc kubenswrapper[5129]: I0314 10:09:09.032758 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-zldr4" event={"ID":"06b07df5-169f-449f-907f-ce03ba398f6e","Type":"ContainerDied","Data":"68f1ceaa86d325d6c13a84bf243388f5ae290a2dc6d0df234c45815bedfcaa5f"} Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.036197 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.091170 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-zldr4" event={"ID":"06b07df5-169f-449f-907f-ce03ba398f6e","Type":"ContainerDied","Data":"c439f0a83175a94370aa0c0efafaa64a467c403764539da4fd3fdb8fc8ef7695"} Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.091214 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c439f0a83175a94370aa0c0efafaa64a467c403764539da4fd3fdb8fc8ef7695" Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.091271 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-zldr4" Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.103856 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06b07df5-169f-449f-907f-ce03ba398f6e-swiftconf\") pod \"06b07df5-169f-449f-907f-ce03ba398f6e\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.103939 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b07df5-169f-449f-907f-ce03ba398f6e-combined-ca-bundle\") pod \"06b07df5-169f-449f-907f-ce03ba398f6e\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.103984 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06b07df5-169f-449f-907f-ce03ba398f6e-ring-data-devices\") pod \"06b07df5-169f-449f-907f-ce03ba398f6e\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.104058 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06b07df5-169f-449f-907f-ce03ba398f6e-etc-swift\") pod \"06b07df5-169f-449f-907f-ce03ba398f6e\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.104104 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06b07df5-169f-449f-907f-ce03ba398f6e-dispersionconf\") pod \"06b07df5-169f-449f-907f-ce03ba398f6e\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.104146 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cd4k\" (UniqueName: \"kubernetes.io/projected/06b07df5-169f-449f-907f-ce03ba398f6e-kube-api-access-7cd4k\") pod \"06b07df5-169f-449f-907f-ce03ba398f6e\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.104193 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06b07df5-169f-449f-907f-ce03ba398f6e-scripts\") pod \"06b07df5-169f-449f-907f-ce03ba398f6e\" (UID: \"06b07df5-169f-449f-907f-ce03ba398f6e\") " Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.111696 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06b07df5-169f-449f-907f-ce03ba398f6e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "06b07df5-169f-449f-907f-ce03ba398f6e" (UID: "06b07df5-169f-449f-907f-ce03ba398f6e"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.112027 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06b07df5-169f-449f-907f-ce03ba398f6e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "06b07df5-169f-449f-907f-ce03ba398f6e" (UID: "06b07df5-169f-449f-907f-ce03ba398f6e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.127177 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06b07df5-169f-449f-907f-ce03ba398f6e-kube-api-access-7cd4k" (OuterVolumeSpecName: "kube-api-access-7cd4k") pod "06b07df5-169f-449f-907f-ce03ba398f6e" (UID: "06b07df5-169f-449f-907f-ce03ba398f6e"). InnerVolumeSpecName "kube-api-access-7cd4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.134052 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-zldr4"] Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.163176 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-zldr4"] Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.168440 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b07df5-169f-449f-907f-ce03ba398f6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06b07df5-169f-449f-907f-ce03ba398f6e" (UID: "06b07df5-169f-449f-907f-ce03ba398f6e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.185739 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b07df5-169f-449f-907f-ce03ba398f6e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "06b07df5-169f-449f-907f-ce03ba398f6e" (UID: "06b07df5-169f-449f-907f-ce03ba398f6e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.207223 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06b07df5-169f-449f-907f-ce03ba398f6e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.207255 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b07df5-169f-449f-907f-ce03ba398f6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.207264 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06b07df5-169f-449f-907f-ce03ba398f6e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.207273 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06b07df5-169f-449f-907f-ce03ba398f6e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.207285 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cd4k\" (UniqueName: \"kubernetes.io/projected/06b07df5-169f-449f-907f-ce03ba398f6e-kube-api-access-7cd4k\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.208271 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/06b07df5-169f-449f-907f-ce03ba398f6e-scripts" (OuterVolumeSpecName: "scripts") pod "06b07df5-169f-449f-907f-ce03ba398f6e" (UID: "06b07df5-169f-449f-907f-ce03ba398f6e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.212754 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b07df5-169f-449f-907f-ce03ba398f6e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "06b07df5-169f-449f-907f-ce03ba398f6e" (UID: "06b07df5-169f-449f-907f-ce03ba398f6e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.310364 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06b07df5-169f-449f-907f-ce03ba398f6e-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:11 crc kubenswrapper[5129]: I0314 10:09:11.310439 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06b07df5-169f-449f-907f-ce03ba398f6e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:12 crc kubenswrapper[5129]: I0314 10:09:12.049384 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06b07df5-169f-449f-907f-ce03ba398f6e" path="/var/lib/kubelet/pods/06b07df5-169f-449f-907f-ce03ba398f6e/volumes" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.547545 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-f8jrh"] Mar 14 10:09:14 crc kubenswrapper[5129]: E0314 10:09:14.548689 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b07df5-169f-449f-907f-ce03ba398f6e" containerName="swift-ring-rebalance" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.548708 5129 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="06b07df5-169f-449f-907f-ce03ba398f6e" containerName="swift-ring-rebalance" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.549034 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b07df5-169f-449f-907f-ce03ba398f6e" containerName="swift-ring-rebalance" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.550201 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.558112 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.558394 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.569706 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-f8jrh"] Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.596016 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/793d783e-a78f-406e-ad00-dae5fb46994f-etc-swift\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.596101 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/793d783e-a78f-406e-ad00-dae5fb46994f-swiftconf\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.596186 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/793d783e-a78f-406e-ad00-dae5fb46994f-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.596309 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/793d783e-a78f-406e-ad00-dae5fb46994f-scripts\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.596348 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/793d783e-a78f-406e-ad00-dae5fb46994f-dispersionconf\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.596378 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/793d783e-a78f-406e-ad00-dae5fb46994f-ring-data-devices\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.596528 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42dss\" (UniqueName: \"kubernetes.io/projected/793d783e-a78f-406e-ad00-dae5fb46994f-kube-api-access-42dss\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.698859 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-42dss\" (UniqueName: \"kubernetes.io/projected/793d783e-a78f-406e-ad00-dae5fb46994f-kube-api-access-42dss\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.699001 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/793d783e-a78f-406e-ad00-dae5fb46994f-etc-swift\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.699057 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/793d783e-a78f-406e-ad00-dae5fb46994f-swiftconf\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.699092 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793d783e-a78f-406e-ad00-dae5fb46994f-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.699164 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/793d783e-a78f-406e-ad00-dae5fb46994f-scripts\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.699205 5129 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/793d783e-a78f-406e-ad00-dae5fb46994f-dispersionconf\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.699238 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/793d783e-a78f-406e-ad00-dae5fb46994f-ring-data-devices\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.699573 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/793d783e-a78f-406e-ad00-dae5fb46994f-etc-swift\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.701247 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/793d783e-a78f-406e-ad00-dae5fb46994f-ring-data-devices\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.702316 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/793d783e-a78f-406e-ad00-dae5fb46994f-scripts\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.705129 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/793d783e-a78f-406e-ad00-dae5fb46994f-swiftconf\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.705477 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/793d783e-a78f-406e-ad00-dae5fb46994f-dispersionconf\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.708214 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793d783e-a78f-406e-ad00-dae5fb46994f-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.719883 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42dss\" (UniqueName: \"kubernetes.io/projected/793d783e-a78f-406e-ad00-dae5fb46994f-kube-api-access-42dss\") pod \"swift-ring-rebalance-debug-f8jrh\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:14 crc kubenswrapper[5129]: I0314 10:09:14.895077 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:15 crc kubenswrapper[5129]: I0314 10:09:15.446944 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-f8jrh"] Mar 14 10:09:16 crc kubenswrapper[5129]: I0314 10:09:16.176863 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-f8jrh" event={"ID":"793d783e-a78f-406e-ad00-dae5fb46994f","Type":"ContainerStarted","Data":"be36b34f644b29c4aac73d6c163cf3d2c25f82d96952f83febab75c1b01c4b8e"} Mar 14 10:09:16 crc kubenswrapper[5129]: I0314 10:09:16.177173 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-f8jrh" event={"ID":"793d783e-a78f-406e-ad00-dae5fb46994f","Type":"ContainerStarted","Data":"7ff0bfdcdd20cd2127bc5d3a8c79fbd402c2632ae7dc7cb80b0e355d1d58ae9b"} Mar 14 10:09:16 crc kubenswrapper[5129]: I0314 10:09:16.203230 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-f8jrh" podStartSLOduration=2.203187862 podStartE2EDuration="2.203187862s" podCreationTimestamp="2026-03-14 10:09:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:09:16.197080996 +0000 UTC m=+11418.948996190" watchObservedRunningTime="2026-03-14 10:09:16.203187862 +0000 UTC m=+11418.955103046" Mar 14 10:09:21 crc kubenswrapper[5129]: I0314 10:09:21.037169 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:09:21 crc kubenswrapper[5129]: E0314 10:09:21.037890 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:09:26 crc kubenswrapper[5129]: I0314 10:09:26.283812 5129 generic.go:334] "Generic (PLEG): container finished" podID="793d783e-a78f-406e-ad00-dae5fb46994f" containerID="be36b34f644b29c4aac73d6c163cf3d2c25f82d96952f83febab75c1b01c4b8e" exitCode=0 Mar 14 10:09:26 crc kubenswrapper[5129]: I0314 10:09:26.283889 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-f8jrh" event={"ID":"793d783e-a78f-406e-ad00-dae5fb46994f","Type":"ContainerDied","Data":"be36b34f644b29c4aac73d6c163cf3d2c25f82d96952f83febab75c1b01c4b8e"} Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.311546 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-f8jrh" event={"ID":"793d783e-a78f-406e-ad00-dae5fb46994f","Type":"ContainerDied","Data":"7ff0bfdcdd20cd2127bc5d3a8c79fbd402c2632ae7dc7cb80b0e355d1d58ae9b"} Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.312491 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ff0bfdcdd20cd2127bc5d3a8c79fbd402c2632ae7dc7cb80b0e355d1d58ae9b" Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.372994 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.437429 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-f8jrh"] Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.452467 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-f8jrh"] Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.465239 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/793d783e-a78f-406e-ad00-dae5fb46994f-swiftconf\") pod \"793d783e-a78f-406e-ad00-dae5fb46994f\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.465337 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/793d783e-a78f-406e-ad00-dae5fb46994f-ring-data-devices\") pod \"793d783e-a78f-406e-ad00-dae5fb46994f\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.465493 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793d783e-a78f-406e-ad00-dae5fb46994f-combined-ca-bundle\") pod \"793d783e-a78f-406e-ad00-dae5fb46994f\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.465523 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42dss\" (UniqueName: \"kubernetes.io/projected/793d783e-a78f-406e-ad00-dae5fb46994f-kube-api-access-42dss\") pod \"793d783e-a78f-406e-ad00-dae5fb46994f\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.465541 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/793d783e-a78f-406e-ad00-dae5fb46994f-dispersionconf\") pod \"793d783e-a78f-406e-ad00-dae5fb46994f\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.465677 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/793d783e-a78f-406e-ad00-dae5fb46994f-scripts\") pod \"793d783e-a78f-406e-ad00-dae5fb46994f\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.465707 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/793d783e-a78f-406e-ad00-dae5fb46994f-etc-swift\") pod \"793d783e-a78f-406e-ad00-dae5fb46994f\" (UID: \"793d783e-a78f-406e-ad00-dae5fb46994f\") " Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.467463 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/793d783e-a78f-406e-ad00-dae5fb46994f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "793d783e-a78f-406e-ad00-dae5fb46994f" (UID: "793d783e-a78f-406e-ad00-dae5fb46994f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.468358 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/793d783e-a78f-406e-ad00-dae5fb46994f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "793d783e-a78f-406e-ad00-dae5fb46994f" (UID: "793d783e-a78f-406e-ad00-dae5fb46994f"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.495707 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/793d783e-a78f-406e-ad00-dae5fb46994f-kube-api-access-42dss" (OuterVolumeSpecName: "kube-api-access-42dss") pod "793d783e-a78f-406e-ad00-dae5fb46994f" (UID: "793d783e-a78f-406e-ad00-dae5fb46994f"). InnerVolumeSpecName "kube-api-access-42dss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.508021 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/793d783e-a78f-406e-ad00-dae5fb46994f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "793d783e-a78f-406e-ad00-dae5fb46994f" (UID: "793d783e-a78f-406e-ad00-dae5fb46994f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.513125 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/793d783e-a78f-406e-ad00-dae5fb46994f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "793d783e-a78f-406e-ad00-dae5fb46994f" (UID: "793d783e-a78f-406e-ad00-dae5fb46994f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.532967 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/793d783e-a78f-406e-ad00-dae5fb46994f-scripts" (OuterVolumeSpecName: "scripts") pod "793d783e-a78f-406e-ad00-dae5fb46994f" (UID: "793d783e-a78f-406e-ad00-dae5fb46994f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.541552 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/793d783e-a78f-406e-ad00-dae5fb46994f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "793d783e-a78f-406e-ad00-dae5fb46994f" (UID: "793d783e-a78f-406e-ad00-dae5fb46994f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.568425 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/793d783e-a78f-406e-ad00-dae5fb46994f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.568480 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/793d783e-a78f-406e-ad00-dae5fb46994f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.568495 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793d783e-a78f-406e-ad00-dae5fb46994f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.568509 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42dss\" (UniqueName: \"kubernetes.io/projected/793d783e-a78f-406e-ad00-dae5fb46994f-kube-api-access-42dss\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.568525 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/793d783e-a78f-406e-ad00-dae5fb46994f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.568536 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/793d783e-a78f-406e-ad00-dae5fb46994f-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:28 crc kubenswrapper[5129]: I0314 10:09:28.568548 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/793d783e-a78f-406e-ad00-dae5fb46994f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 10:09:29 crc kubenswrapper[5129]: I0314 10:09:29.321415 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-f8jrh" Mar 14 10:09:30 crc kubenswrapper[5129]: I0314 10:09:30.074466 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="793d783e-a78f-406e-ad00-dae5fb46994f" path="/var/lib/kubelet/pods/793d783e-a78f-406e-ad00-dae5fb46994f/volumes" Mar 14 10:09:35 crc kubenswrapper[5129]: I0314 10:09:35.038466 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:09:35 crc kubenswrapper[5129]: E0314 10:09:35.039382 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:09:40 crc kubenswrapper[5129]: I0314 10:09:40.203184 5129 scope.go:117] "RemoveContainer" containerID="11b3a7ec9bc256970ca66b7c8f49c7450feffd51107b68213aad376ee135355d" Mar 14 10:09:49 crc kubenswrapper[5129]: I0314 10:09:49.037018 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:09:49 crc kubenswrapper[5129]: E0314 10:09:49.038266 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:10:00 crc kubenswrapper[5129]: I0314 10:10:00.160891 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558050-nqkz4"] Mar 14 10:10:00 crc kubenswrapper[5129]: E0314 10:10:00.162416 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="793d783e-a78f-406e-ad00-dae5fb46994f" containerName="swift-ring-rebalance" Mar 14 10:10:00 crc kubenswrapper[5129]: I0314 10:10:00.162435 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="793d783e-a78f-406e-ad00-dae5fb46994f" containerName="swift-ring-rebalance" Mar 14 10:10:00 crc kubenswrapper[5129]: I0314 10:10:00.162797 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="793d783e-a78f-406e-ad00-dae5fb46994f" containerName="swift-ring-rebalance" Mar 14 10:10:00 crc kubenswrapper[5129]: I0314 10:10:00.163781 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558050-nqkz4" Mar 14 10:10:00 crc kubenswrapper[5129]: I0314 10:10:00.166557 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:10:00 crc kubenswrapper[5129]: I0314 10:10:00.167010 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:10:00 crc kubenswrapper[5129]: I0314 10:10:00.168886 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:10:00 crc kubenswrapper[5129]: I0314 10:10:00.179527 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558050-nqkz4"] Mar 14 10:10:00 crc kubenswrapper[5129]: I0314 10:10:00.227281 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcll8\" (UniqueName: \"kubernetes.io/projected/80013541-f81c-4b8f-af6b-ee9e5110aa29-kube-api-access-pcll8\") pod \"auto-csr-approver-29558050-nqkz4\" (UID: \"80013541-f81c-4b8f-af6b-ee9e5110aa29\") " pod="openshift-infra/auto-csr-approver-29558050-nqkz4" Mar 14 10:10:00 crc kubenswrapper[5129]: I0314 10:10:00.330333 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcll8\" (UniqueName: \"kubernetes.io/projected/80013541-f81c-4b8f-af6b-ee9e5110aa29-kube-api-access-pcll8\") pod \"auto-csr-approver-29558050-nqkz4\" (UID: \"80013541-f81c-4b8f-af6b-ee9e5110aa29\") " pod="openshift-infra/auto-csr-approver-29558050-nqkz4" Mar 14 10:10:00 crc kubenswrapper[5129]: I0314 10:10:00.355996 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcll8\" (UniqueName: \"kubernetes.io/projected/80013541-f81c-4b8f-af6b-ee9e5110aa29-kube-api-access-pcll8\") pod \"auto-csr-approver-29558050-nqkz4\" (UID: \"80013541-f81c-4b8f-af6b-ee9e5110aa29\") " 
pod="openshift-infra/auto-csr-approver-29558050-nqkz4" Mar 14 10:10:00 crc kubenswrapper[5129]: I0314 10:10:00.488391 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558050-nqkz4" Mar 14 10:10:01 crc kubenswrapper[5129]: I0314 10:10:01.242552 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558050-nqkz4"] Mar 14 10:10:01 crc kubenswrapper[5129]: I0314 10:10:01.720141 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558050-nqkz4" event={"ID":"80013541-f81c-4b8f-af6b-ee9e5110aa29","Type":"ContainerStarted","Data":"2dd10bcf422022bc44ea0566d05a7c822450ab98b9b5ea3068fe57899da601ba"} Mar 14 10:10:03 crc kubenswrapper[5129]: I0314 10:10:03.037097 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:10:03 crc kubenswrapper[5129]: E0314 10:10:03.037801 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:10:03 crc kubenswrapper[5129]: I0314 10:10:03.781678 5129 generic.go:334] "Generic (PLEG): container finished" podID="80013541-f81c-4b8f-af6b-ee9e5110aa29" containerID="2d91210a70935a3feaaab09f92765c43629624e839ec5a3205ee784dcdf833b2" exitCode=0 Mar 14 10:10:03 crc kubenswrapper[5129]: I0314 10:10:03.781742 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558050-nqkz4" event={"ID":"80013541-f81c-4b8f-af6b-ee9e5110aa29","Type":"ContainerDied","Data":"2d91210a70935a3feaaab09f92765c43629624e839ec5a3205ee784dcdf833b2"} 
Mar 14 10:10:06 crc kubenswrapper[5129]: I0314 10:10:06.042583 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558050-nqkz4" Mar 14 10:10:06 crc kubenswrapper[5129]: I0314 10:10:06.177089 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcll8\" (UniqueName: \"kubernetes.io/projected/80013541-f81c-4b8f-af6b-ee9e5110aa29-kube-api-access-pcll8\") pod \"80013541-f81c-4b8f-af6b-ee9e5110aa29\" (UID: \"80013541-f81c-4b8f-af6b-ee9e5110aa29\") " Mar 14 10:10:06 crc kubenswrapper[5129]: I0314 10:10:06.184931 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80013541-f81c-4b8f-af6b-ee9e5110aa29-kube-api-access-pcll8" (OuterVolumeSpecName: "kube-api-access-pcll8") pod "80013541-f81c-4b8f-af6b-ee9e5110aa29" (UID: "80013541-f81c-4b8f-af6b-ee9e5110aa29"). InnerVolumeSpecName "kube-api-access-pcll8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:10:06 crc kubenswrapper[5129]: I0314 10:10:06.280548 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcll8\" (UniqueName: \"kubernetes.io/projected/80013541-f81c-4b8f-af6b-ee9e5110aa29-kube-api-access-pcll8\") on node \"crc\" DevicePath \"\"" Mar 14 10:10:06 crc kubenswrapper[5129]: I0314 10:10:06.820151 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558050-nqkz4" event={"ID":"80013541-f81c-4b8f-af6b-ee9e5110aa29","Type":"ContainerDied","Data":"2dd10bcf422022bc44ea0566d05a7c822450ab98b9b5ea3068fe57899da601ba"} Mar 14 10:10:06 crc kubenswrapper[5129]: I0314 10:10:06.820546 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dd10bcf422022bc44ea0566d05a7c822450ab98b9b5ea3068fe57899da601ba" Mar 14 10:10:06 crc kubenswrapper[5129]: I0314 10:10:06.820236 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558050-nqkz4" Mar 14 10:10:07 crc kubenswrapper[5129]: I0314 10:10:07.150673 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558044-s2xbf"] Mar 14 10:10:07 crc kubenswrapper[5129]: I0314 10:10:07.163178 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558044-s2xbf"] Mar 14 10:10:08 crc kubenswrapper[5129]: I0314 10:10:08.073594 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd1c525-862c-40b6-b416-05f024a11987" path="/var/lib/kubelet/pods/ddd1c525-862c-40b6-b416-05f024a11987/volumes" Mar 14 10:10:17 crc kubenswrapper[5129]: I0314 10:10:17.041590 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:10:17 crc kubenswrapper[5129]: E0314 10:10:17.042479 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.593478 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-zs9tk"] Mar 14 10:10:28 crc kubenswrapper[5129]: E0314 10:10:28.594459 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80013541-f81c-4b8f-af6b-ee9e5110aa29" containerName="oc" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.594474 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="80013541-f81c-4b8f-af6b-ee9e5110aa29" containerName="oc" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.594723 5129 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="80013541-f81c-4b8f-af6b-ee9e5110aa29" containerName="oc" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.595456 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.600532 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.601817 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.605818 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-zs9tk"] Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.726568 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/12f7c85b-d150-4599-8346-3a3a7efd6b35-swiftconf\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.726759 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12f7c85b-d150-4599-8346-3a3a7efd6b35-scripts\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.726806 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b5tp\" (UniqueName: \"kubernetes.io/projected/12f7c85b-d150-4599-8346-3a3a7efd6b35-kube-api-access-4b5tp\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " 
pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.726842 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/12f7c85b-d150-4599-8346-3a3a7efd6b35-etc-swift\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.727062 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f7c85b-d150-4599-8346-3a3a7efd6b35-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.727166 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/12f7c85b-d150-4599-8346-3a3a7efd6b35-dispersionconf\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.727218 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/12f7c85b-d150-4599-8346-3a3a7efd6b35-ring-data-devices\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.828943 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/12f7c85b-d150-4599-8346-3a3a7efd6b35-swiftconf\") pod \"swift-ring-rebalance-debug-zs9tk\" 
(UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.829018 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12f7c85b-d150-4599-8346-3a3a7efd6b35-scripts\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.829079 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b5tp\" (UniqueName: \"kubernetes.io/projected/12f7c85b-d150-4599-8346-3a3a7efd6b35-kube-api-access-4b5tp\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.829123 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/12f7c85b-d150-4599-8346-3a3a7efd6b35-etc-swift\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.829228 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f7c85b-d150-4599-8346-3a3a7efd6b35-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.829285 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/12f7c85b-d150-4599-8346-3a3a7efd6b35-dispersionconf\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: 
\"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.829331 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/12f7c85b-d150-4599-8346-3a3a7efd6b35-ring-data-devices\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.829948 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/12f7c85b-d150-4599-8346-3a3a7efd6b35-etc-swift\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.830206 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12f7c85b-d150-4599-8346-3a3a7efd6b35-scripts\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.830594 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/12f7c85b-d150-4599-8346-3a3a7efd6b35-ring-data-devices\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.844296 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/12f7c85b-d150-4599-8346-3a3a7efd6b35-swiftconf\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " 
pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.844296 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/12f7c85b-d150-4599-8346-3a3a7efd6b35-dispersionconf\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.844702 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f7c85b-d150-4599-8346-3a3a7efd6b35-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.848726 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b5tp\" (UniqueName: \"kubernetes.io/projected/12f7c85b-d150-4599-8346-3a3a7efd6b35-kube-api-access-4b5tp\") pod \"swift-ring-rebalance-debug-zs9tk\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:28 crc kubenswrapper[5129]: I0314 10:10:28.928805 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:29 crc kubenswrapper[5129]: I0314 10:10:29.489670 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-zs9tk"] Mar 14 10:10:30 crc kubenswrapper[5129]: I0314 10:10:30.114525 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-zs9tk" event={"ID":"12f7c85b-d150-4599-8346-3a3a7efd6b35","Type":"ContainerStarted","Data":"b044f47d2387b618a42c7ad5e8997ffeaf2c4159cdb85603af763ebe20be9461"} Mar 14 10:10:30 crc kubenswrapper[5129]: I0314 10:10:30.115041 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-zs9tk" event={"ID":"12f7c85b-d150-4599-8346-3a3a7efd6b35","Type":"ContainerStarted","Data":"ea486363f369ae24e7bf65a3f02499073fea7fbbadecb900d081ff151c6454de"} Mar 14 10:10:30 crc kubenswrapper[5129]: I0314 10:10:30.144502 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-zs9tk" podStartSLOduration=2.144469457 podStartE2EDuration="2.144469457s" podCreationTimestamp="2026-03-14 10:10:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:10:30.135435711 +0000 UTC m=+11492.887350925" watchObservedRunningTime="2026-03-14 10:10:30.144469457 +0000 UTC m=+11492.896384661" Mar 14 10:10:31 crc kubenswrapper[5129]: I0314 10:10:31.037285 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:10:31 crc kubenswrapper[5129]: E0314 10:10:31.038076 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:10:39 crc kubenswrapper[5129]: I0314 10:10:39.244879 5129 generic.go:334] "Generic (PLEG): container finished" podID="12f7c85b-d150-4599-8346-3a3a7efd6b35" containerID="b044f47d2387b618a42c7ad5e8997ffeaf2c4159cdb85603af763ebe20be9461" exitCode=0 Mar 14 10:10:39 crc kubenswrapper[5129]: I0314 10:10:39.244981 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-zs9tk" event={"ID":"12f7c85b-d150-4599-8346-3a3a7efd6b35","Type":"ContainerDied","Data":"b044f47d2387b618a42c7ad5e8997ffeaf2c4159cdb85603af763ebe20be9461"} Mar 14 10:10:40 crc kubenswrapper[5129]: I0314 10:10:40.358624 5129 scope.go:117] "RemoveContainer" containerID="b50e4ef910d3d5d3cb650131ba76ab11d718ff93f2fbf444bd126063d67f62cc" Mar 14 10:10:40 crc kubenswrapper[5129]: I0314 10:10:40.408451 5129 scope.go:117] "RemoveContainer" containerID="6b50cc3c07901197509f332e4cd4b875d1e08693077da75875452dd6a814833d" Mar 14 10:10:41 crc kubenswrapper[5129]: I0314 10:10:41.881761 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:41 crc kubenswrapper[5129]: I0314 10:10:41.930927 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-zs9tk"] Mar 14 10:10:41 crc kubenswrapper[5129]: I0314 10:10:41.942894 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-zs9tk"] Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.027504 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12f7c85b-d150-4599-8346-3a3a7efd6b35-scripts\") pod \"12f7c85b-d150-4599-8346-3a3a7efd6b35\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.027584 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f7c85b-d150-4599-8346-3a3a7efd6b35-combined-ca-bundle\") pod \"12f7c85b-d150-4599-8346-3a3a7efd6b35\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.027621 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/12f7c85b-d150-4599-8346-3a3a7efd6b35-swiftconf\") pod \"12f7c85b-d150-4599-8346-3a3a7efd6b35\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.027686 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/12f7c85b-d150-4599-8346-3a3a7efd6b35-dispersionconf\") pod \"12f7c85b-d150-4599-8346-3a3a7efd6b35\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.027713 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/12f7c85b-d150-4599-8346-3a3a7efd6b35-ring-data-devices\") pod \"12f7c85b-d150-4599-8346-3a3a7efd6b35\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.027818 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b5tp\" (UniqueName: \"kubernetes.io/projected/12f7c85b-d150-4599-8346-3a3a7efd6b35-kube-api-access-4b5tp\") pod \"12f7c85b-d150-4599-8346-3a3a7efd6b35\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.027933 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/12f7c85b-d150-4599-8346-3a3a7efd6b35-etc-swift\") pod \"12f7c85b-d150-4599-8346-3a3a7efd6b35\" (UID: \"12f7c85b-d150-4599-8346-3a3a7efd6b35\") " Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.029063 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12f7c85b-d150-4599-8346-3a3a7efd6b35-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "12f7c85b-d150-4599-8346-3a3a7efd6b35" (UID: "12f7c85b-d150-4599-8346-3a3a7efd6b35"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.029302 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12f7c85b-d150-4599-8346-3a3a7efd6b35-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "12f7c85b-d150-4599-8346-3a3a7efd6b35" (UID: "12f7c85b-d150-4599-8346-3a3a7efd6b35"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.033430 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12f7c85b-d150-4599-8346-3a3a7efd6b35-kube-api-access-4b5tp" (OuterVolumeSpecName: "kube-api-access-4b5tp") pod "12f7c85b-d150-4599-8346-3a3a7efd6b35" (UID: "12f7c85b-d150-4599-8346-3a3a7efd6b35"). InnerVolumeSpecName "kube-api-access-4b5tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.058498 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12f7c85b-d150-4599-8346-3a3a7efd6b35-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "12f7c85b-d150-4599-8346-3a3a7efd6b35" (UID: "12f7c85b-d150-4599-8346-3a3a7efd6b35"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.067526 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12f7c85b-d150-4599-8346-3a3a7efd6b35-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "12f7c85b-d150-4599-8346-3a3a7efd6b35" (UID: "12f7c85b-d150-4599-8346-3a3a7efd6b35"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.070341 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12f7c85b-d150-4599-8346-3a3a7efd6b35-scripts" (OuterVolumeSpecName: "scripts") pod "12f7c85b-d150-4599-8346-3a3a7efd6b35" (UID: "12f7c85b-d150-4599-8346-3a3a7efd6b35"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.070560 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12f7c85b-d150-4599-8346-3a3a7efd6b35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12f7c85b-d150-4599-8346-3a3a7efd6b35" (UID: "12f7c85b-d150-4599-8346-3a3a7efd6b35"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.130388 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f7c85b-d150-4599-8346-3a3a7efd6b35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.130754 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/12f7c85b-d150-4599-8346-3a3a7efd6b35-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.130779 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/12f7c85b-d150-4599-8346-3a3a7efd6b35-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.130788 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/12f7c85b-d150-4599-8346-3a3a7efd6b35-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.130798 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b5tp\" (UniqueName: \"kubernetes.io/projected/12f7c85b-d150-4599-8346-3a3a7efd6b35-kube-api-access-4b5tp\") on node \"crc\" DevicePath \"\"" Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.130809 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/12f7c85b-d150-4599-8346-3a3a7efd6b35-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.130818 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12f7c85b-d150-4599-8346-3a3a7efd6b35-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.284657 5129 scope.go:117] "RemoveContainer" containerID="b044f47d2387b618a42c7ad5e8997ffeaf2c4159cdb85603af763ebe20be9461" Mar 14 10:10:42 crc kubenswrapper[5129]: I0314 10:10:42.284738 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-zs9tk" Mar 14 10:10:44 crc kubenswrapper[5129]: I0314 10:10:44.050000 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12f7c85b-d150-4599-8346-3a3a7efd6b35" path="/var/lib/kubelet/pods/12f7c85b-d150-4599-8346-3a3a7efd6b35/volumes" Mar 14 10:10:45 crc kubenswrapper[5129]: I0314 10:10:45.037127 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:10:45 crc kubenswrapper[5129]: E0314 10:10:45.037459 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:10:59 crc kubenswrapper[5129]: I0314 10:10:59.037640 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:10:59 crc kubenswrapper[5129]: E0314 10:10:59.038820 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:11:11 crc kubenswrapper[5129]: I0314 10:11:11.036647 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:11:11 crc kubenswrapper[5129]: E0314 10:11:11.037899 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:11:25 crc kubenswrapper[5129]: I0314 10:11:25.036554 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:11:25 crc kubenswrapper[5129]: E0314 10:11:25.037436 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:11:39 crc kubenswrapper[5129]: I0314 10:11:39.036999 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:11:39 crc kubenswrapper[5129]: E0314 10:11:39.037909 5129 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:11:40 crc kubenswrapper[5129]: I0314 10:11:40.529336 5129 scope.go:117] "RemoveContainer" containerID="d8fbd9a083bd0b175d34aa3116a38086fb5d28ff132cb61b8d7a4a9e32d35d9b" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.103202 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-fz7gj"] Mar 14 10:11:42 crc kubenswrapper[5129]: E0314 10:11:42.104101 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12f7c85b-d150-4599-8346-3a3a7efd6b35" containerName="swift-ring-rebalance" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.104116 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="12f7c85b-d150-4599-8346-3a3a7efd6b35" containerName="swift-ring-rebalance" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.104382 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="12f7c85b-d150-4599-8346-3a3a7efd6b35" containerName="swift-ring-rebalance" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.105206 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.107294 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.110288 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.117396 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-fz7gj"] Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.237059 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e55d039f-ac21-462f-8b8a-e718c7944a50-dispersionconf\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.237431 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e55d039f-ac21-462f-8b8a-e718c7944a50-ring-data-devices\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.237455 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e55d039f-ac21-462f-8b8a-e718c7944a50-scripts\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.237497 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/e55d039f-ac21-462f-8b8a-e718c7944a50-swiftconf\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.237529 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e55d039f-ac21-462f-8b8a-e718c7944a50-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.237592 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m8zk\" (UniqueName: \"kubernetes.io/projected/e55d039f-ac21-462f-8b8a-e718c7944a50-kube-api-access-5m8zk\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.237639 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e55d039f-ac21-462f-8b8a-e718c7944a50-etc-swift\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.339486 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m8zk\" (UniqueName: \"kubernetes.io/projected/e55d039f-ac21-462f-8b8a-e718c7944a50-kube-api-access-5m8zk\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.339559 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e55d039f-ac21-462f-8b8a-e718c7944a50-etc-swift\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.339658 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e55d039f-ac21-462f-8b8a-e718c7944a50-dispersionconf\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.339687 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e55d039f-ac21-462f-8b8a-e718c7944a50-ring-data-devices\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.341247 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e55d039f-ac21-462f-8b8a-e718c7944a50-scripts\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.340254 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e55d039f-ac21-462f-8b8a-e718c7944a50-etc-swift\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.341184 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e55d039f-ac21-462f-8b8a-e718c7944a50-ring-data-devices\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.341409 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e55d039f-ac21-462f-8b8a-e718c7944a50-swiftconf\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.341447 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e55d039f-ac21-462f-8b8a-e718c7944a50-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.341950 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e55d039f-ac21-462f-8b8a-e718c7944a50-scripts\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.345749 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e55d039f-ac21-462f-8b8a-e718c7944a50-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.346026 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/e55d039f-ac21-462f-8b8a-e718c7944a50-dispersionconf\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.354435 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e55d039f-ac21-462f-8b8a-e718c7944a50-swiftconf\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.364220 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m8zk\" (UniqueName: \"kubernetes.io/projected/e55d039f-ac21-462f-8b8a-e718c7944a50-kube-api-access-5m8zk\") pod \"swift-ring-rebalance-debug-fz7gj\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:42 crc kubenswrapper[5129]: I0314 10:11:42.436083 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:43 crc kubenswrapper[5129]: I0314 10:11:43.183095 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-fz7gj"] Mar 14 10:11:44 crc kubenswrapper[5129]: I0314 10:11:44.075937 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-fz7gj" event={"ID":"e55d039f-ac21-462f-8b8a-e718c7944a50","Type":"ContainerStarted","Data":"de4ae6f322f49ca3bcd9cdcdf988f0a2c653f8161c92110cd8d8cafc8ff1617f"} Mar 14 10:11:44 crc kubenswrapper[5129]: I0314 10:11:44.076548 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-fz7gj" event={"ID":"e55d039f-ac21-462f-8b8a-e718c7944a50","Type":"ContainerStarted","Data":"859252ec549322616746a160123554458fe6a527b629d50d2985b58954e68d7c"} Mar 14 10:11:44 crc kubenswrapper[5129]: I0314 10:11:44.103571 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-fz7gj" podStartSLOduration=2.103553076 podStartE2EDuration="2.103553076s" podCreationTimestamp="2026-03-14 10:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:11:44.098403727 +0000 UTC m=+11566.850318911" watchObservedRunningTime="2026-03-14 10:11:44.103553076 +0000 UTC m=+11566.855468260" Mar 14 10:11:53 crc kubenswrapper[5129]: I0314 10:11:53.036661 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:11:54 crc kubenswrapper[5129]: I0314 10:11:54.226937 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"ea55a9c69c43b228fc76f8d80751a65180ad34d13576370de81b7c849d750b69"} Mar 14 10:11:54 crc 
kubenswrapper[5129]: I0314 10:11:54.231977 5129 generic.go:334] "Generic (PLEG): container finished" podID="e55d039f-ac21-462f-8b8a-e718c7944a50" containerID="de4ae6f322f49ca3bcd9cdcdf988f0a2c653f8161c92110cd8d8cafc8ff1617f" exitCode=0 Mar 14 10:11:54 crc kubenswrapper[5129]: I0314 10:11:54.232159 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-fz7gj" event={"ID":"e55d039f-ac21-462f-8b8a-e718c7944a50","Type":"ContainerDied","Data":"de4ae6f322f49ca3bcd9cdcdf988f0a2c653f8161c92110cd8d8cafc8ff1617f"} Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.108105 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.154086 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e55d039f-ac21-462f-8b8a-e718c7944a50-etc-swift\") pod \"e55d039f-ac21-462f-8b8a-e718c7944a50\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.154228 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e55d039f-ac21-462f-8b8a-e718c7944a50-scripts\") pod \"e55d039f-ac21-462f-8b8a-e718c7944a50\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.154272 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m8zk\" (UniqueName: \"kubernetes.io/projected/e55d039f-ac21-462f-8b8a-e718c7944a50-kube-api-access-5m8zk\") pod \"e55d039f-ac21-462f-8b8a-e718c7944a50\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.154312 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/e55d039f-ac21-462f-8b8a-e718c7944a50-swiftconf\") pod \"e55d039f-ac21-462f-8b8a-e718c7944a50\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.154362 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e55d039f-ac21-462f-8b8a-e718c7944a50-ring-data-devices\") pod \"e55d039f-ac21-462f-8b8a-e718c7944a50\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.154474 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e55d039f-ac21-462f-8b8a-e718c7944a50-dispersionconf\") pod \"e55d039f-ac21-462f-8b8a-e718c7944a50\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.154568 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e55d039f-ac21-462f-8b8a-e718c7944a50-combined-ca-bundle\") pod \"e55d039f-ac21-462f-8b8a-e718c7944a50\" (UID: \"e55d039f-ac21-462f-8b8a-e718c7944a50\") " Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.155450 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e55d039f-ac21-462f-8b8a-e718c7944a50-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e55d039f-ac21-462f-8b8a-e718c7944a50" (UID: "e55d039f-ac21-462f-8b8a-e718c7944a50"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.159043 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e55d039f-ac21-462f-8b8a-e718c7944a50-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e55d039f-ac21-462f-8b8a-e718c7944a50" (UID: "e55d039f-ac21-462f-8b8a-e718c7944a50"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.162694 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e55d039f-ac21-462f-8b8a-e718c7944a50-kube-api-access-5m8zk" (OuterVolumeSpecName: "kube-api-access-5m8zk") pod "e55d039f-ac21-462f-8b8a-e718c7944a50" (UID: "e55d039f-ac21-462f-8b8a-e718c7944a50"). InnerVolumeSpecName "kube-api-access-5m8zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.170995 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-fz7gj"] Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.185824 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-fz7gj"] Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.194158 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e55d039f-ac21-462f-8b8a-e718c7944a50-scripts" (OuterVolumeSpecName: "scripts") pod "e55d039f-ac21-462f-8b8a-e718c7944a50" (UID: "e55d039f-ac21-462f-8b8a-e718c7944a50"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.201029 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e55d039f-ac21-462f-8b8a-e718c7944a50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e55d039f-ac21-462f-8b8a-e718c7944a50" (UID: "e55d039f-ac21-462f-8b8a-e718c7944a50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.221670 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e55d039f-ac21-462f-8b8a-e718c7944a50-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e55d039f-ac21-462f-8b8a-e718c7944a50" (UID: "e55d039f-ac21-462f-8b8a-e718c7944a50"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.226747 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e55d039f-ac21-462f-8b8a-e718c7944a50-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e55d039f-ac21-462f-8b8a-e718c7944a50" (UID: "e55d039f-ac21-462f-8b8a-e718c7944a50"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.256888 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e55d039f-ac21-462f-8b8a-e718c7944a50-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.256925 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e55d039f-ac21-462f-8b8a-e718c7944a50-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.256937 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m8zk\" (UniqueName: \"kubernetes.io/projected/e55d039f-ac21-462f-8b8a-e718c7944a50-kube-api-access-5m8zk\") on node \"crc\" DevicePath \"\"" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.256949 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e55d039f-ac21-462f-8b8a-e718c7944a50-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.256958 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e55d039f-ac21-462f-8b8a-e718c7944a50-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.256965 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e55d039f-ac21-462f-8b8a-e718c7944a50-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.256973 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e55d039f-ac21-462f-8b8a-e718c7944a50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.272028 5129 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="859252ec549322616746a160123554458fe6a527b629d50d2985b58954e68d7c" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.272114 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-fz7gj" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.584189 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-859f8"] Mar 14 10:11:57 crc kubenswrapper[5129]: E0314 10:11:57.584870 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e55d039f-ac21-462f-8b8a-e718c7944a50" containerName="swift-ring-rebalance" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.584895 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e55d039f-ac21-462f-8b8a-e718c7944a50" containerName="swift-ring-rebalance" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.585203 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e55d039f-ac21-462f-8b8a-e718c7944a50" containerName="swift-ring-rebalance" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.586213 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.588351 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.588878 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.595655 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-859f8"] Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.665234 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37a34f9d-2f69-42df-afd9-a50a10689291-scripts\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.665412 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/37a34f9d-2f69-42df-afd9-a50a10689291-dispersionconf\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.665554 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/37a34f9d-2f69-42df-afd9-a50a10689291-etc-swift\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.665655 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/37a34f9d-2f69-42df-afd9-a50a10689291-swiftconf\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.665747 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlp7r\" (UniqueName: \"kubernetes.io/projected/37a34f9d-2f69-42df-afd9-a50a10689291-kube-api-access-vlp7r\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.665825 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/37a34f9d-2f69-42df-afd9-a50a10689291-ring-data-devices\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.665915 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a34f9d-2f69-42df-afd9-a50a10689291-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.767909 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlp7r\" (UniqueName: \"kubernetes.io/projected/37a34f9d-2f69-42df-afd9-a50a10689291-kube-api-access-vlp7r\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.768277 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/37a34f9d-2f69-42df-afd9-a50a10689291-ring-data-devices\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.768397 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a34f9d-2f69-42df-afd9-a50a10689291-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.768616 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37a34f9d-2f69-42df-afd9-a50a10689291-scripts\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.768757 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/37a34f9d-2f69-42df-afd9-a50a10689291-dispersionconf\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.769795 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/37a34f9d-2f69-42df-afd9-a50a10689291-ring-data-devices\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.770091 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/37a34f9d-2f69-42df-afd9-a50a10689291-etc-swift\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.770233 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/37a34f9d-2f69-42df-afd9-a50a10689291-swiftconf\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.770304 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37a34f9d-2f69-42df-afd9-a50a10689291-scripts\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.770505 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/37a34f9d-2f69-42df-afd9-a50a10689291-etc-swift\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.773960 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/37a34f9d-2f69-42df-afd9-a50a10689291-swiftconf\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.774321 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/37a34f9d-2f69-42df-afd9-a50a10689291-dispersionconf\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.775074 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a34f9d-2f69-42df-afd9-a50a10689291-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.799285 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlp7r\" (UniqueName: \"kubernetes.io/projected/37a34f9d-2f69-42df-afd9-a50a10689291-kube-api-access-vlp7r\") pod \"swift-ring-rebalance-debug-859f8\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:57 crc kubenswrapper[5129]: I0314 10:11:57.906629 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:11:58 crc kubenswrapper[5129]: I0314 10:11:58.063278 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e55d039f-ac21-462f-8b8a-e718c7944a50" path="/var/lib/kubelet/pods/e55d039f-ac21-462f-8b8a-e718c7944a50/volumes" Mar 14 10:11:58 crc kubenswrapper[5129]: I0314 10:11:58.606624 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-859f8"] Mar 14 10:11:59 crc kubenswrapper[5129]: I0314 10:11:59.302117 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-859f8" event={"ID":"37a34f9d-2f69-42df-afd9-a50a10689291","Type":"ContainerStarted","Data":"417cb6e18b7733b2b622ddfd5456bc11968427d100e4cd1b243940fbca03946b"} Mar 14 10:11:59 crc kubenswrapper[5129]: I0314 10:11:59.302645 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-859f8" event={"ID":"37a34f9d-2f69-42df-afd9-a50a10689291","Type":"ContainerStarted","Data":"0ac6111c9631c72576421d987d4473b78d9822080f25b238178342782fb4e1f4"} Mar 14 10:11:59 crc kubenswrapper[5129]: I0314 10:11:59.325369 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-859f8" podStartSLOduration=2.32534895 podStartE2EDuration="2.32534895s" podCreationTimestamp="2026-03-14 10:11:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:11:59.3223733 +0000 UTC m=+11582.074288494" watchObservedRunningTime="2026-03-14 10:11:59.32534895 +0000 UTC m=+11582.077264134" Mar 14 10:12:00 crc kubenswrapper[5129]: I0314 10:12:00.154943 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558052-nm8bn"] Mar 14 10:12:00 crc kubenswrapper[5129]: I0314 10:12:00.156433 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558052-nm8bn" Mar 14 10:12:00 crc kubenswrapper[5129]: I0314 10:12:00.159933 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:12:00 crc kubenswrapper[5129]: I0314 10:12:00.160453 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:12:00 crc kubenswrapper[5129]: I0314 10:12:00.160795 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:12:00 crc kubenswrapper[5129]: I0314 10:12:00.168377 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558052-nm8bn"] Mar 14 10:12:00 crc kubenswrapper[5129]: I0314 10:12:00.236351 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9zhp\" (UniqueName: \"kubernetes.io/projected/7e76b0f3-4a63-41f8-bf7d-41057276f813-kube-api-access-s9zhp\") pod \"auto-csr-approver-29558052-nm8bn\" (UID: \"7e76b0f3-4a63-41f8-bf7d-41057276f813\") " pod="openshift-infra/auto-csr-approver-29558052-nm8bn" Mar 14 10:12:00 crc kubenswrapper[5129]: I0314 10:12:00.338106 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9zhp\" (UniqueName: \"kubernetes.io/projected/7e76b0f3-4a63-41f8-bf7d-41057276f813-kube-api-access-s9zhp\") pod \"auto-csr-approver-29558052-nm8bn\" (UID: \"7e76b0f3-4a63-41f8-bf7d-41057276f813\") " pod="openshift-infra/auto-csr-approver-29558052-nm8bn" Mar 14 10:12:00 crc kubenswrapper[5129]: I0314 10:12:00.355398 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9zhp\" (UniqueName: \"kubernetes.io/projected/7e76b0f3-4a63-41f8-bf7d-41057276f813-kube-api-access-s9zhp\") pod \"auto-csr-approver-29558052-nm8bn\" (UID: \"7e76b0f3-4a63-41f8-bf7d-41057276f813\") " 
pod="openshift-infra/auto-csr-approver-29558052-nm8bn" Mar 14 10:12:00 crc kubenswrapper[5129]: I0314 10:12:00.476203 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558052-nm8bn" Mar 14 10:12:01 crc kubenswrapper[5129]: I0314 10:12:01.146011 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558052-nm8bn"] Mar 14 10:12:01 crc kubenswrapper[5129]: W0314 10:12:01.152249 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e76b0f3_4a63_41f8_bf7d_41057276f813.slice/crio-7eac82cf68e43363fd7800685c5129650550dc2415abf1c8f1b16d476a294252 WatchSource:0}: Error finding container 7eac82cf68e43363fd7800685c5129650550dc2415abf1c8f1b16d476a294252: Status 404 returned error can't find the container with id 7eac82cf68e43363fd7800685c5129650550dc2415abf1c8f1b16d476a294252 Mar 14 10:12:01 crc kubenswrapper[5129]: I0314 10:12:01.159393 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 10:12:01 crc kubenswrapper[5129]: I0314 10:12:01.325317 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558052-nm8bn" event={"ID":"7e76b0f3-4a63-41f8-bf7d-41057276f813","Type":"ContainerStarted","Data":"7eac82cf68e43363fd7800685c5129650550dc2415abf1c8f1b16d476a294252"} Mar 14 10:12:02 crc kubenswrapper[5129]: E0314 10:12:02.758530 5129 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e76b0f3_4a63_41f8_bf7d_41057276f813.slice/crio-d2f42ded749c2e5ef1844064e827669f0255b18aa3fe34b11bbe4451c192758c.scope\": RecentStats: unable to find data in memory cache]" Mar 14 10:12:03 crc kubenswrapper[5129]: I0314 10:12:03.353015 5129 generic.go:334] "Generic (PLEG): container finished" 
podID="7e76b0f3-4a63-41f8-bf7d-41057276f813" containerID="d2f42ded749c2e5ef1844064e827669f0255b18aa3fe34b11bbe4451c192758c" exitCode=0 Mar 14 10:12:03 crc kubenswrapper[5129]: I0314 10:12:03.353084 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558052-nm8bn" event={"ID":"7e76b0f3-4a63-41f8-bf7d-41057276f813","Type":"ContainerDied","Data":"d2f42ded749c2e5ef1844064e827669f0255b18aa3fe34b11bbe4451c192758c"} Mar 14 10:12:04 crc kubenswrapper[5129]: I0314 10:12:04.367763 5129 generic.go:334] "Generic (PLEG): container finished" podID="37a34f9d-2f69-42df-afd9-a50a10689291" containerID="417cb6e18b7733b2b622ddfd5456bc11968427d100e4cd1b243940fbca03946b" exitCode=0 Mar 14 10:12:04 crc kubenswrapper[5129]: I0314 10:12:04.367843 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-859f8" event={"ID":"37a34f9d-2f69-42df-afd9-a50a10689291","Type":"ContainerDied","Data":"417cb6e18b7733b2b622ddfd5456bc11968427d100e4cd1b243940fbca03946b"} Mar 14 10:12:05 crc kubenswrapper[5129]: I0314 10:12:05.912325 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558052-nm8bn" Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.090539 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9zhp\" (UniqueName: \"kubernetes.io/projected/7e76b0f3-4a63-41f8-bf7d-41057276f813-kube-api-access-s9zhp\") pod \"7e76b0f3-4a63-41f8-bf7d-41057276f813\" (UID: \"7e76b0f3-4a63-41f8-bf7d-41057276f813\") " Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.099119 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e76b0f3-4a63-41f8-bf7d-41057276f813-kube-api-access-s9zhp" (OuterVolumeSpecName: "kube-api-access-s9zhp") pod "7e76b0f3-4a63-41f8-bf7d-41057276f813" (UID: "7e76b0f3-4a63-41f8-bf7d-41057276f813"). 
InnerVolumeSpecName "kube-api-access-s9zhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.193678 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9zhp\" (UniqueName: \"kubernetes.io/projected/7e76b0f3-4a63-41f8-bf7d-41057276f813-kube-api-access-s9zhp\") on node \"crc\" DevicePath \"\"" Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.391902 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558052-nm8bn" event={"ID":"7e76b0f3-4a63-41f8-bf7d-41057276f813","Type":"ContainerDied","Data":"7eac82cf68e43363fd7800685c5129650550dc2415abf1c8f1b16d476a294252"} Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.391948 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7eac82cf68e43363fd7800685c5129650550dc2415abf1c8f1b16d476a294252" Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.392014 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558052-nm8bn" Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.724589 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-859f8" Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.784374 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-859f8"] Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.796910 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-859f8"] Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.907476 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a34f9d-2f69-42df-afd9-a50a10689291-combined-ca-bundle\") pod \"37a34f9d-2f69-42df-afd9-a50a10689291\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.907544 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/37a34f9d-2f69-42df-afd9-a50a10689291-etc-swift\") pod \"37a34f9d-2f69-42df-afd9-a50a10689291\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.907569 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/37a34f9d-2f69-42df-afd9-a50a10689291-dispersionconf\") pod \"37a34f9d-2f69-42df-afd9-a50a10689291\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.908567 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37a34f9d-2f69-42df-afd9-a50a10689291-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "37a34f9d-2f69-42df-afd9-a50a10689291" (UID: "37a34f9d-2f69-42df-afd9-a50a10689291"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.908777 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/37a34f9d-2f69-42df-afd9-a50a10689291-ring-data-devices\") pod \"37a34f9d-2f69-42df-afd9-a50a10689291\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.909250 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37a34f9d-2f69-42df-afd9-a50a10689291-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "37a34f9d-2f69-42df-afd9-a50a10689291" (UID: "37a34f9d-2f69-42df-afd9-a50a10689291"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.909397 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlp7r\" (UniqueName: \"kubernetes.io/projected/37a34f9d-2f69-42df-afd9-a50a10689291-kube-api-access-vlp7r\") pod \"37a34f9d-2f69-42df-afd9-a50a10689291\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.909437 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/37a34f9d-2f69-42df-afd9-a50a10689291-swiftconf\") pod \"37a34f9d-2f69-42df-afd9-a50a10689291\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.911026 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37a34f9d-2f69-42df-afd9-a50a10689291-scripts\") pod \"37a34f9d-2f69-42df-afd9-a50a10689291\" (UID: \"37a34f9d-2f69-42df-afd9-a50a10689291\") " Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.911787 5129 
reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/37a34f9d-2f69-42df-afd9-a50a10689291-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.911800 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/37a34f9d-2f69-42df-afd9-a50a10689291-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.913460 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37a34f9d-2f69-42df-afd9-a50a10689291-kube-api-access-vlp7r" (OuterVolumeSpecName: "kube-api-access-vlp7r") pod "37a34f9d-2f69-42df-afd9-a50a10689291" (UID: "37a34f9d-2f69-42df-afd9-a50a10689291"). InnerVolumeSpecName "kube-api-access-vlp7r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.941578 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a34f9d-2f69-42df-afd9-a50a10689291-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "37a34f9d-2f69-42df-afd9-a50a10689291" (UID: "37a34f9d-2f69-42df-afd9-a50a10689291"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.946877 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37a34f9d-2f69-42df-afd9-a50a10689291-scripts" (OuterVolumeSpecName: "scripts") pod "37a34f9d-2f69-42df-afd9-a50a10689291" (UID: "37a34f9d-2f69-42df-afd9-a50a10689291"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.950149 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a34f9d-2f69-42df-afd9-a50a10689291-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "37a34f9d-2f69-42df-afd9-a50a10689291" (UID: "37a34f9d-2f69-42df-afd9-a50a10689291"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 10:12:06 crc kubenswrapper[5129]: I0314 10:12:06.976793 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a34f9d-2f69-42df-afd9-a50a10689291-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37a34f9d-2f69-42df-afd9-a50a10689291" (UID: "37a34f9d-2f69-42df-afd9-a50a10689291"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 10:12:07 crc kubenswrapper[5129]: I0314 10:12:07.000594 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558046-2hj7v"]
Mar 14 10:12:07 crc kubenswrapper[5129]: I0314 10:12:07.010186 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558046-2hj7v"]
Mar 14 10:12:07 crc kubenswrapper[5129]: I0314 10:12:07.013843 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlp7r\" (UniqueName: \"kubernetes.io/projected/37a34f9d-2f69-42df-afd9-a50a10689291-kube-api-access-vlp7r\") on node \"crc\" DevicePath \"\""
Mar 14 10:12:07 crc kubenswrapper[5129]: I0314 10:12:07.013873 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/37a34f9d-2f69-42df-afd9-a50a10689291-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 14 10:12:07 crc kubenswrapper[5129]: I0314 10:12:07.013885 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37a34f9d-2f69-42df-afd9-a50a10689291-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 10:12:07 crc kubenswrapper[5129]: I0314 10:12:07.013912 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a34f9d-2f69-42df-afd9-a50a10689291-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 10:12:07 crc kubenswrapper[5129]: I0314 10:12:07.013921 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/37a34f9d-2f69-42df-afd9-a50a10689291-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 14 10:12:07 crc kubenswrapper[5129]: I0314 10:12:07.404938 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ac6111c9631c72576421d987d4473b78d9822080f25b238178342782fb4e1f4"
Mar 14 10:12:07 crc kubenswrapper[5129]: I0314 10:12:07.404966 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-859f8"
Mar 14 10:12:08 crc kubenswrapper[5129]: I0314 10:12:08.051909 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37a34f9d-2f69-42df-afd9-a50a10689291" path="/var/lib/kubelet/pods/37a34f9d-2f69-42df-afd9-a50a10689291/volumes"
Mar 14 10:12:08 crc kubenswrapper[5129]: I0314 10:12:08.052436 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="778526c8-262a-4a3f-a4bb-d895c2d576c3" path="/var/lib/kubelet/pods/778526c8-262a-4a3f-a4bb-d895c2d576c3/volumes"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.385337 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-7lcr7"]
Mar 14 10:12:10 crc kubenswrapper[5129]: E0314 10:12:10.386253 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e76b0f3-4a63-41f8-bf7d-41057276f813" containerName="oc"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.386270 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e76b0f3-4a63-41f8-bf7d-41057276f813" containerName="oc"
Mar 14 10:12:10 crc kubenswrapper[5129]: E0314 10:12:10.386308 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a34f9d-2f69-42df-afd9-a50a10689291" containerName="swift-ring-rebalance"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.386316 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a34f9d-2f69-42df-afd9-a50a10689291" containerName="swift-ring-rebalance"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.386531 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e76b0f3-4a63-41f8-bf7d-41057276f813" containerName="oc"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.386555 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="37a34f9d-2f69-42df-afd9-a50a10689291" containerName="swift-ring-rebalance"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.387412 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.389841 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.390170 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.398963 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-7lcr7"]
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.494532 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nf7n\" (UniqueName: \"kubernetes.io/projected/77fd2137-737d-438e-9598-25940c8ea3bc-kube-api-access-5nf7n\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.494603 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fd2137-737d-438e-9598-25940c8ea3bc-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.494832 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/77fd2137-737d-438e-9598-25940c8ea3bc-dispersionconf\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.494962 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/77fd2137-737d-438e-9598-25940c8ea3bc-ring-data-devices\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.495040 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77fd2137-737d-438e-9598-25940c8ea3bc-scripts\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.495094 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/77fd2137-737d-438e-9598-25940c8ea3bc-swiftconf\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.495130 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/77fd2137-737d-438e-9598-25940c8ea3bc-etc-swift\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.597809 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/77fd2137-737d-438e-9598-25940c8ea3bc-ring-data-devices\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.597894 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77fd2137-737d-438e-9598-25940c8ea3bc-scripts\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.597942 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/77fd2137-737d-438e-9598-25940c8ea3bc-swiftconf\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.597970 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/77fd2137-737d-438e-9598-25940c8ea3bc-etc-swift\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.598018 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nf7n\" (UniqueName: \"kubernetes.io/projected/77fd2137-737d-438e-9598-25940c8ea3bc-kube-api-access-5nf7n\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.598129 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fd2137-737d-438e-9598-25940c8ea3bc-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.598636 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/77fd2137-737d-438e-9598-25940c8ea3bc-etc-swift\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.598731 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/77fd2137-737d-438e-9598-25940c8ea3bc-dispersionconf\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.599002 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/77fd2137-737d-438e-9598-25940c8ea3bc-ring-data-devices\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.599067 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77fd2137-737d-438e-9598-25940c8ea3bc-scripts\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.603812 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/77fd2137-737d-438e-9598-25940c8ea3bc-swiftconf\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.610130 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/77fd2137-737d-438e-9598-25940c8ea3bc-dispersionconf\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.610339 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fd2137-737d-438e-9598-25940c8ea3bc-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.649184 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nf7n\" (UniqueName: \"kubernetes.io/projected/77fd2137-737d-438e-9598-25940c8ea3bc-kube-api-access-5nf7n\") pod \"swift-ring-rebalance-debug-7lcr7\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") " pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:10 crc kubenswrapper[5129]: I0314 10:12:10.715872 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:11 crc kubenswrapper[5129]: I0314 10:12:11.464968 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-7lcr7"]
Mar 14 10:12:12 crc kubenswrapper[5129]: I0314 10:12:12.473984 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-7lcr7" event={"ID":"77fd2137-737d-438e-9598-25940c8ea3bc","Type":"ContainerStarted","Data":"5662ae3ed896bb098aa6e771af3e13e66763a24f51d21092270e116913faf162"}
Mar 14 10:12:12 crc kubenswrapper[5129]: I0314 10:12:12.474702 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-7lcr7" event={"ID":"77fd2137-737d-438e-9598-25940c8ea3bc","Type":"ContainerStarted","Data":"9798b29a1ec81fa439489105f06a5427fbb81751fe829911f426015c9074f93d"}
Mar 14 10:12:12 crc kubenswrapper[5129]: I0314 10:12:12.506234 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-7lcr7" podStartSLOduration=2.506208353 podStartE2EDuration="2.506208353s" podCreationTimestamp="2026-03-14 10:12:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:12:12.487903719 +0000 UTC m=+11595.239818903" watchObservedRunningTime="2026-03-14 10:12:12.506208353 +0000 UTC m=+11595.258123547"
Mar 14 10:12:22 crc kubenswrapper[5129]: I0314 10:12:22.602975 5129 generic.go:334] "Generic (PLEG): container finished" podID="77fd2137-737d-438e-9598-25940c8ea3bc" containerID="5662ae3ed896bb098aa6e771af3e13e66763a24f51d21092270e116913faf162" exitCode=0
Mar 14 10:12:22 crc kubenswrapper[5129]: I0314 10:12:22.603050 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-7lcr7" event={"ID":"77fd2137-737d-438e-9598-25940c8ea3bc","Type":"ContainerDied","Data":"5662ae3ed896bb098aa6e771af3e13e66763a24f51d21092270e116913faf162"}
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.554143 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.598053 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-7lcr7"]
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.608355 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-7lcr7"]
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.631729 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fd2137-737d-438e-9598-25940c8ea3bc-combined-ca-bundle\") pod \"77fd2137-737d-438e-9598-25940c8ea3bc\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") "
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.631806 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/77fd2137-737d-438e-9598-25940c8ea3bc-dispersionconf\") pod \"77fd2137-737d-438e-9598-25940c8ea3bc\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") "
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.631848 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/77fd2137-737d-438e-9598-25940c8ea3bc-swiftconf\") pod \"77fd2137-737d-438e-9598-25940c8ea3bc\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") "
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.631875 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/77fd2137-737d-438e-9598-25940c8ea3bc-etc-swift\") pod \"77fd2137-737d-438e-9598-25940c8ea3bc\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") "
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.631948 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/77fd2137-737d-438e-9598-25940c8ea3bc-ring-data-devices\") pod \"77fd2137-737d-438e-9598-25940c8ea3bc\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") "
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.632102 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nf7n\" (UniqueName: \"kubernetes.io/projected/77fd2137-737d-438e-9598-25940c8ea3bc-kube-api-access-5nf7n\") pod \"77fd2137-737d-438e-9598-25940c8ea3bc\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") "
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.632140 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77fd2137-737d-438e-9598-25940c8ea3bc-scripts\") pod \"77fd2137-737d-438e-9598-25940c8ea3bc\" (UID: \"77fd2137-737d-438e-9598-25940c8ea3bc\") "
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.635212 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77fd2137-737d-438e-9598-25940c8ea3bc-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "77fd2137-737d-438e-9598-25940c8ea3bc" (UID: "77fd2137-737d-438e-9598-25940c8ea3bc"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.636510 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77fd2137-737d-438e-9598-25940c8ea3bc-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "77fd2137-737d-438e-9598-25940c8ea3bc" (UID: "77fd2137-737d-438e-9598-25940c8ea3bc"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.653432 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9798b29a1ec81fa439489105f06a5427fbb81751fe829911f426015c9074f93d"
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.653472 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-7lcr7"
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.661338 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77fd2137-737d-438e-9598-25940c8ea3bc-kube-api-access-5nf7n" (OuterVolumeSpecName: "kube-api-access-5nf7n") pod "77fd2137-737d-438e-9598-25940c8ea3bc" (UID: "77fd2137-737d-438e-9598-25940c8ea3bc"). InnerVolumeSpecName "kube-api-access-5nf7n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.667046 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77fd2137-737d-438e-9598-25940c8ea3bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77fd2137-737d-438e-9598-25940c8ea3bc" (UID: "77fd2137-737d-438e-9598-25940c8ea3bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.667139 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77fd2137-737d-438e-9598-25940c8ea3bc-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "77fd2137-737d-438e-9598-25940c8ea3bc" (UID: "77fd2137-737d-438e-9598-25940c8ea3bc"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.684578 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77fd2137-737d-438e-9598-25940c8ea3bc-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "77fd2137-737d-438e-9598-25940c8ea3bc" (UID: "77fd2137-737d-438e-9598-25940c8ea3bc"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.686220 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77fd2137-737d-438e-9598-25940c8ea3bc-scripts" (OuterVolumeSpecName: "scripts") pod "77fd2137-737d-438e-9598-25940c8ea3bc" (UID: "77fd2137-737d-438e-9598-25940c8ea3bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.735527 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fd2137-737d-438e-9598-25940c8ea3bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.735560 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/77fd2137-737d-438e-9598-25940c8ea3bc-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.735569 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/77fd2137-737d-438e-9598-25940c8ea3bc-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.735578 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/77fd2137-737d-438e-9598-25940c8ea3bc-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.735587 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/77fd2137-737d-438e-9598-25940c8ea3bc-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.735613 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nf7n\" (UniqueName: \"kubernetes.io/projected/77fd2137-737d-438e-9598-25940c8ea3bc-kube-api-access-5nf7n\") on node \"crc\" DevicePath \"\""
Mar 14 10:12:25 crc kubenswrapper[5129]: I0314 10:12:25.735623 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77fd2137-737d-438e-9598-25940c8ea3bc-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 10:12:26 crc kubenswrapper[5129]: I0314 10:12:26.065217 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77fd2137-737d-438e-9598-25940c8ea3bc" path="/var/lib/kubelet/pods/77fd2137-737d-438e-9598-25940c8ea3bc/volumes"
Mar 14 10:12:40 crc kubenswrapper[5129]: I0314 10:12:40.617788 5129 scope.go:117] "RemoveContainer" containerID="ca6f36738cd71117852950f72ffa14ffac3bbb350f7e6c2fccd083f54cd3cffb"
Mar 14 10:12:40 crc kubenswrapper[5129]: I0314 10:12:40.659212 5129 scope.go:117] "RemoveContainer" containerID="9440ff55d54b7ae5c3592cdf1c00e7b20a63e8834b43eafbd48007a6457afa69"
Mar 14 10:12:56 crc kubenswrapper[5129]: I0314 10:12:56.198314 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-85qdq"]
Mar 14 10:12:56 crc kubenswrapper[5129]: E0314 10:12:56.199233 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77fd2137-737d-438e-9598-25940c8ea3bc" containerName="swift-ring-rebalance"
Mar 14 10:12:56 crc kubenswrapper[5129]: I0314 10:12:56.199246 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="77fd2137-737d-438e-9598-25940c8ea3bc" containerName="swift-ring-rebalance"
Mar 14 10:12:56 crc kubenswrapper[5129]: I0314 10:12:56.199477 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="77fd2137-737d-438e-9598-25940c8ea3bc" containerName="swift-ring-rebalance"
Mar 14 10:12:56 crc kubenswrapper[5129]: I0314 10:12:56.201070 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85qdq"
Mar 14 10:12:56 crc kubenswrapper[5129]: I0314 10:12:56.209577 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85qdq"]
Mar 14 10:12:56 crc kubenswrapper[5129]: I0314 10:12:56.381961 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97869495-4dcb-466b-9089-1d82e571a419-catalog-content\") pod \"community-operators-85qdq\" (UID: \"97869495-4dcb-466b-9089-1d82e571a419\") " pod="openshift-marketplace/community-operators-85qdq"
Mar 14 10:12:56 crc kubenswrapper[5129]: I0314 10:12:56.382045 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkmlr\" (UniqueName: \"kubernetes.io/projected/97869495-4dcb-466b-9089-1d82e571a419-kube-api-access-qkmlr\") pod \"community-operators-85qdq\" (UID: \"97869495-4dcb-466b-9089-1d82e571a419\") " pod="openshift-marketplace/community-operators-85qdq"
Mar 14 10:12:56 crc kubenswrapper[5129]: I0314 10:12:56.382282 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97869495-4dcb-466b-9089-1d82e571a419-utilities\") pod \"community-operators-85qdq\" (UID: \"97869495-4dcb-466b-9089-1d82e571a419\") " pod="openshift-marketplace/community-operators-85qdq"
Mar 14 10:12:56 crc kubenswrapper[5129]: I0314 10:12:56.484540 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97869495-4dcb-466b-9089-1d82e571a419-utilities\") pod \"community-operators-85qdq\" (UID: \"97869495-4dcb-466b-9089-1d82e571a419\") " pod="openshift-marketplace/community-operators-85qdq"
Mar 14 10:12:56 crc kubenswrapper[5129]: I0314 10:12:56.484765 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97869495-4dcb-466b-9089-1d82e571a419-catalog-content\") pod \"community-operators-85qdq\" (UID: \"97869495-4dcb-466b-9089-1d82e571a419\") " pod="openshift-marketplace/community-operators-85qdq"
Mar 14 10:12:56 crc kubenswrapper[5129]: I0314 10:12:56.485097 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97869495-4dcb-466b-9089-1d82e571a419-utilities\") pod \"community-operators-85qdq\" (UID: \"97869495-4dcb-466b-9089-1d82e571a419\") " pod="openshift-marketplace/community-operators-85qdq"
Mar 14 10:12:56 crc kubenswrapper[5129]: I0314 10:12:56.485226 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97869495-4dcb-466b-9089-1d82e571a419-catalog-content\") pod \"community-operators-85qdq\" (UID: \"97869495-4dcb-466b-9089-1d82e571a419\") " pod="openshift-marketplace/community-operators-85qdq"
Mar 14 10:12:56 crc kubenswrapper[5129]: I0314 10:12:56.485382 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkmlr\" (UniqueName: \"kubernetes.io/projected/97869495-4dcb-466b-9089-1d82e571a419-kube-api-access-qkmlr\") pod \"community-operators-85qdq\" (UID: \"97869495-4dcb-466b-9089-1d82e571a419\") " pod="openshift-marketplace/community-operators-85qdq"
Mar 14 10:12:56 crc kubenswrapper[5129]: I0314 10:12:56.512899 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkmlr\" (UniqueName: \"kubernetes.io/projected/97869495-4dcb-466b-9089-1d82e571a419-kube-api-access-qkmlr\") pod \"community-operators-85qdq\" (UID: \"97869495-4dcb-466b-9089-1d82e571a419\") " pod="openshift-marketplace/community-operators-85qdq"
Mar 14 10:12:56 crc kubenswrapper[5129]: I0314 10:12:56.519875 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85qdq"
Mar 14 10:12:57 crc kubenswrapper[5129]: I0314 10:12:57.430485 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85qdq"]
Mar 14 10:12:58 crc kubenswrapper[5129]: I0314 10:12:58.116824 5129 generic.go:334] "Generic (PLEG): container finished" podID="97869495-4dcb-466b-9089-1d82e571a419" containerID="ba5e6bc7adc6aef28b427fc346188d914d57d21a73699a97a517e8c34feffcf5" exitCode=0
Mar 14 10:12:58 crc kubenswrapper[5129]: I0314 10:12:58.118706 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85qdq" event={"ID":"97869495-4dcb-466b-9089-1d82e571a419","Type":"ContainerDied","Data":"ba5e6bc7adc6aef28b427fc346188d914d57d21a73699a97a517e8c34feffcf5"}
Mar 14 10:12:58 crc kubenswrapper[5129]: I0314 10:12:58.118796 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85qdq" event={"ID":"97869495-4dcb-466b-9089-1d82e571a419","Type":"ContainerStarted","Data":"d6e0d3521f1565266874be4007769c172b7e5c3714b00e492eca9c36f37ea8b9"}
Mar 14 10:12:59 crc kubenswrapper[5129]: I0314 10:12:59.128624 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85qdq" event={"ID":"97869495-4dcb-466b-9089-1d82e571a419","Type":"ContainerStarted","Data":"cfcc4ea3ef9ae4cd114ea317b315c89e6fb1d1e9f23e5053cbb9df6172b8a1af"}
Mar 14 10:13:01 crc kubenswrapper[5129]: I0314 10:13:01.153967 5129 generic.go:334] "Generic (PLEG): container finished" podID="97869495-4dcb-466b-9089-1d82e571a419" containerID="cfcc4ea3ef9ae4cd114ea317b315c89e6fb1d1e9f23e5053cbb9df6172b8a1af" exitCode=0
Mar 14 10:13:01 crc kubenswrapper[5129]: I0314 10:13:01.154082 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85qdq" event={"ID":"97869495-4dcb-466b-9089-1d82e571a419","Type":"ContainerDied","Data":"cfcc4ea3ef9ae4cd114ea317b315c89e6fb1d1e9f23e5053cbb9df6172b8a1af"}
Mar 14 10:13:02 crc kubenswrapper[5129]: I0314 10:13:02.167942 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85qdq" event={"ID":"97869495-4dcb-466b-9089-1d82e571a419","Type":"ContainerStarted","Data":"67a21d1187e121f727115b80d15e3e42a152c89ed1f9a00519c486ad9b0cc904"}
Mar 14 10:13:02 crc kubenswrapper[5129]: I0314 10:13:02.202333 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-85qdq" podStartSLOduration=2.40819919 podStartE2EDuration="6.202307041s" podCreationTimestamp="2026-03-14 10:12:56 +0000 UTC" firstStartedPulling="2026-03-14 10:12:58.122682556 +0000 UTC m=+11640.874597740" lastFinishedPulling="2026-03-14 10:13:01.916790407 +0000 UTC m=+11644.668705591" observedRunningTime="2026-03-14 10:13:02.187710227 +0000 UTC m=+11644.939625411" watchObservedRunningTime="2026-03-14 10:13:02.202307041 +0000 UTC m=+11644.954222225"
Mar 14 10:13:06 crc kubenswrapper[5129]: I0314 10:13:06.521009 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-85qdq"
Mar 14 10:13:06 crc kubenswrapper[5129]: I0314 10:13:06.521712 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-85qdq"
Mar 14 10:13:06 crc kubenswrapper[5129]: I0314 10:13:06.577405 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-85qdq"
Mar 14 10:13:07 crc kubenswrapper[5129]: I0314 10:13:07.288349 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-85qdq"
Mar 14 10:13:07 crc kubenswrapper[5129]: I0314 10:13:07.357003 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-85qdq"]
Mar 14 10:13:09 crc kubenswrapper[5129]: I0314 10:13:09.250904 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-85qdq" podUID="97869495-4dcb-466b-9089-1d82e571a419" containerName="registry-server" containerID="cri-o://67a21d1187e121f727115b80d15e3e42a152c89ed1f9a00519c486ad9b0cc904" gracePeriod=2
Mar 14 10:13:10 crc kubenswrapper[5129]: I0314 10:13:10.263738 5129 generic.go:334] "Generic (PLEG): container finished" podID="97869495-4dcb-466b-9089-1d82e571a419" containerID="67a21d1187e121f727115b80d15e3e42a152c89ed1f9a00519c486ad9b0cc904" exitCode=0
Mar 14 10:13:10 crc kubenswrapper[5129]: I0314 10:13:10.263817 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85qdq" event={"ID":"97869495-4dcb-466b-9089-1d82e571a419","Type":"ContainerDied","Data":"67a21d1187e121f727115b80d15e3e42a152c89ed1f9a00519c486ad9b0cc904"}
Mar 14 10:13:11 crc kubenswrapper[5129]: I0314 10:13:11.034634 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85qdq"
Mar 14 10:13:11 crc kubenswrapper[5129]: I0314 10:13:11.135283 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97869495-4dcb-466b-9089-1d82e571a419-utilities\") pod \"97869495-4dcb-466b-9089-1d82e571a419\" (UID: \"97869495-4dcb-466b-9089-1d82e571a419\") "
Mar 14 10:13:11 crc kubenswrapper[5129]: I0314 10:13:11.135364 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97869495-4dcb-466b-9089-1d82e571a419-catalog-content\") pod \"97869495-4dcb-466b-9089-1d82e571a419\" (UID: \"97869495-4dcb-466b-9089-1d82e571a419\") "
Mar 14 10:13:11 crc kubenswrapper[5129]: I0314 10:13:11.135568 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkmlr\" (UniqueName: \"kubernetes.io/projected/97869495-4dcb-466b-9089-1d82e571a419-kube-api-access-qkmlr\") pod \"97869495-4dcb-466b-9089-1d82e571a419\" (UID: \"97869495-4dcb-466b-9089-1d82e571a419\") "
Mar 14 10:13:11 crc kubenswrapper[5129]: I0314 10:13:11.137064 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97869495-4dcb-466b-9089-1d82e571a419-utilities" (OuterVolumeSpecName: "utilities") pod "97869495-4dcb-466b-9089-1d82e571a419" (UID: "97869495-4dcb-466b-9089-1d82e571a419"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 10:13:11 crc kubenswrapper[5129]: I0314 10:13:11.143313 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97869495-4dcb-466b-9089-1d82e571a419-kube-api-access-qkmlr" (OuterVolumeSpecName: "kube-api-access-qkmlr") pod "97869495-4dcb-466b-9089-1d82e571a419" (UID: "97869495-4dcb-466b-9089-1d82e571a419"). InnerVolumeSpecName "kube-api-access-qkmlr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 10:13:11 crc kubenswrapper[5129]: I0314 10:13:11.212961 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97869495-4dcb-466b-9089-1d82e571a419-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97869495-4dcb-466b-9089-1d82e571a419" (UID: "97869495-4dcb-466b-9089-1d82e571a419"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 10:13:11 crc kubenswrapper[5129]: I0314 10:13:11.238625 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkmlr\" (UniqueName: \"kubernetes.io/projected/97869495-4dcb-466b-9089-1d82e571a419-kube-api-access-qkmlr\") on node \"crc\" DevicePath \"\""
Mar 14 10:13:11 crc kubenswrapper[5129]: I0314 10:13:11.238686 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97869495-4dcb-466b-9089-1d82e571a419-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 10:13:11 crc kubenswrapper[5129]: I0314 10:13:11.238708 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97869495-4dcb-466b-9089-1d82e571a419-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 10:13:11 crc kubenswrapper[5129]: I0314 10:13:11.277968 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85qdq" event={"ID":"97869495-4dcb-466b-9089-1d82e571a419","Type":"ContainerDied","Data":"d6e0d3521f1565266874be4007769c172b7e5c3714b00e492eca9c36f37ea8b9"}
Mar 14 10:13:11 crc kubenswrapper[5129]: I0314 10:13:11.278034 5129 scope.go:117] "RemoveContainer" containerID="67a21d1187e121f727115b80d15e3e42a152c89ed1f9a00519c486ad9b0cc904"
Mar 14 10:13:11 crc kubenswrapper[5129]: I0314 10:13:11.278176 5129 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-85qdq" Mar 14 10:13:11 crc kubenswrapper[5129]: I0314 10:13:11.325569 5129 scope.go:117] "RemoveContainer" containerID="cfcc4ea3ef9ae4cd114ea317b315c89e6fb1d1e9f23e5053cbb9df6172b8a1af" Mar 14 10:13:11 crc kubenswrapper[5129]: I0314 10:13:11.360700 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-85qdq"] Mar 14 10:13:11 crc kubenswrapper[5129]: I0314 10:13:11.372901 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-85qdq"] Mar 14 10:13:11 crc kubenswrapper[5129]: I0314 10:13:11.380798 5129 scope.go:117] "RemoveContainer" containerID="ba5e6bc7adc6aef28b427fc346188d914d57d21a73699a97a517e8c34feffcf5" Mar 14 10:13:12 crc kubenswrapper[5129]: I0314 10:13:12.054921 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97869495-4dcb-466b-9089-1d82e571a419" path="/var/lib/kubelet/pods/97869495-4dcb-466b-9089-1d82e571a419/volumes" Mar 14 10:13:40 crc kubenswrapper[5129]: I0314 10:13:40.805370 5129 scope.go:117] "RemoveContainer" containerID="84dbca154cffbb4ed440206d11a7f69472d37e88b1dc4cc716b3d0227e302322" Mar 14 10:14:00 crc kubenswrapper[5129]: I0314 10:14:00.167221 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558054-cqqrl"] Mar 14 10:14:00 crc kubenswrapper[5129]: E0314 10:14:00.168353 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97869495-4dcb-466b-9089-1d82e571a419" containerName="extract-utilities" Mar 14 10:14:00 crc kubenswrapper[5129]: I0314 10:14:00.168369 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="97869495-4dcb-466b-9089-1d82e571a419" containerName="extract-utilities" Mar 14 10:14:00 crc kubenswrapper[5129]: E0314 10:14:00.168384 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97869495-4dcb-466b-9089-1d82e571a419" containerName="registry-server" Mar 14 
10:14:00 crc kubenswrapper[5129]: I0314 10:14:00.168390 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="97869495-4dcb-466b-9089-1d82e571a419" containerName="registry-server" Mar 14 10:14:00 crc kubenswrapper[5129]: E0314 10:14:00.168402 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97869495-4dcb-466b-9089-1d82e571a419" containerName="extract-content" Mar 14 10:14:00 crc kubenswrapper[5129]: I0314 10:14:00.168411 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="97869495-4dcb-466b-9089-1d82e571a419" containerName="extract-content" Mar 14 10:14:00 crc kubenswrapper[5129]: I0314 10:14:00.168680 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="97869495-4dcb-466b-9089-1d82e571a419" containerName="registry-server" Mar 14 10:14:00 crc kubenswrapper[5129]: I0314 10:14:00.169472 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558054-cqqrl" Mar 14 10:14:00 crc kubenswrapper[5129]: I0314 10:14:00.176696 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:14:00 crc kubenswrapper[5129]: I0314 10:14:00.177188 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:14:00 crc kubenswrapper[5129]: I0314 10:14:00.177278 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:14:00 crc kubenswrapper[5129]: I0314 10:14:00.189141 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558054-cqqrl"] Mar 14 10:14:00 crc kubenswrapper[5129]: I0314 10:14:00.300971 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmvwc\" (UniqueName: \"kubernetes.io/projected/01532751-0be7-49a7-95d2-d5251776e030-kube-api-access-lmvwc\") pod 
\"auto-csr-approver-29558054-cqqrl\" (UID: \"01532751-0be7-49a7-95d2-d5251776e030\") " pod="openshift-infra/auto-csr-approver-29558054-cqqrl" Mar 14 10:14:00 crc kubenswrapper[5129]: I0314 10:14:00.403575 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmvwc\" (UniqueName: \"kubernetes.io/projected/01532751-0be7-49a7-95d2-d5251776e030-kube-api-access-lmvwc\") pod \"auto-csr-approver-29558054-cqqrl\" (UID: \"01532751-0be7-49a7-95d2-d5251776e030\") " pod="openshift-infra/auto-csr-approver-29558054-cqqrl" Mar 14 10:14:00 crc kubenswrapper[5129]: I0314 10:14:00.422454 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmvwc\" (UniqueName: \"kubernetes.io/projected/01532751-0be7-49a7-95d2-d5251776e030-kube-api-access-lmvwc\") pod \"auto-csr-approver-29558054-cqqrl\" (UID: \"01532751-0be7-49a7-95d2-d5251776e030\") " pod="openshift-infra/auto-csr-approver-29558054-cqqrl" Mar 14 10:14:00 crc kubenswrapper[5129]: I0314 10:14:00.523274 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558054-cqqrl" Mar 14 10:14:01 crc kubenswrapper[5129]: I0314 10:14:01.461643 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558054-cqqrl"] Mar 14 10:14:01 crc kubenswrapper[5129]: I0314 10:14:01.903125 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558054-cqqrl" event={"ID":"01532751-0be7-49a7-95d2-d5251776e030","Type":"ContainerStarted","Data":"94fd0ae5235320b999a669af8dfdde0197f4ae510a5a82e1d0e4035a20e559f2"} Mar 14 10:14:02 crc kubenswrapper[5129]: I0314 10:14:02.916358 5129 generic.go:334] "Generic (PLEG): container finished" podID="01532751-0be7-49a7-95d2-d5251776e030" containerID="276c5ab932191ba7a0966403f5b2ea7aa3a97c558235e3074dc4d1385d4ac574" exitCode=0 Mar 14 10:14:02 crc kubenswrapper[5129]: I0314 10:14:02.916485 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558054-cqqrl" event={"ID":"01532751-0be7-49a7-95d2-d5251776e030","Type":"ContainerDied","Data":"276c5ab932191ba7a0966403f5b2ea7aa3a97c558235e3074dc4d1385d4ac574"} Mar 14 10:14:05 crc kubenswrapper[5129]: I0314 10:14:05.549389 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558054-cqqrl" Mar 14 10:14:05 crc kubenswrapper[5129]: I0314 10:14:05.738910 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmvwc\" (UniqueName: \"kubernetes.io/projected/01532751-0be7-49a7-95d2-d5251776e030-kube-api-access-lmvwc\") pod \"01532751-0be7-49a7-95d2-d5251776e030\" (UID: \"01532751-0be7-49a7-95d2-d5251776e030\") " Mar 14 10:14:05 crc kubenswrapper[5129]: I0314 10:14:05.744893 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01532751-0be7-49a7-95d2-d5251776e030-kube-api-access-lmvwc" (OuterVolumeSpecName: "kube-api-access-lmvwc") pod "01532751-0be7-49a7-95d2-d5251776e030" (UID: "01532751-0be7-49a7-95d2-d5251776e030"). InnerVolumeSpecName "kube-api-access-lmvwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:14:05 crc kubenswrapper[5129]: I0314 10:14:05.842122 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmvwc\" (UniqueName: \"kubernetes.io/projected/01532751-0be7-49a7-95d2-d5251776e030-kube-api-access-lmvwc\") on node \"crc\" DevicePath \"\"" Mar 14 10:14:05 crc kubenswrapper[5129]: I0314 10:14:05.948845 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558054-cqqrl" event={"ID":"01532751-0be7-49a7-95d2-d5251776e030","Type":"ContainerDied","Data":"94fd0ae5235320b999a669af8dfdde0197f4ae510a5a82e1d0e4035a20e559f2"} Mar 14 10:14:05 crc kubenswrapper[5129]: I0314 10:14:05.949074 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94fd0ae5235320b999a669af8dfdde0197f4ae510a5a82e1d0e4035a20e559f2" Mar 14 10:14:05 crc kubenswrapper[5129]: I0314 10:14:05.948922 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558054-cqqrl" Mar 14 10:14:06 crc kubenswrapper[5129]: I0314 10:14:06.640147 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558048-7cf44"] Mar 14 10:14:06 crc kubenswrapper[5129]: I0314 10:14:06.653040 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558048-7cf44"] Mar 14 10:14:08 crc kubenswrapper[5129]: I0314 10:14:08.057346 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4e46763-0519-4c45-b8fe-cd6d0b069f13" path="/var/lib/kubelet/pods/d4e46763-0519-4c45-b8fe-cd6d0b069f13/volumes" Mar 14 10:14:19 crc kubenswrapper[5129]: I0314 10:14:19.574279 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:14:19 crc kubenswrapper[5129]: I0314 10:14:19.574782 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:14:40 crc kubenswrapper[5129]: I0314 10:14:40.904929 5129 scope.go:117] "RemoveContainer" containerID="95d2b0ab8c3accb25327cfed7c1e926019c039cac5bf681f655741fb46e0c4fb" Mar 14 10:14:49 crc kubenswrapper[5129]: I0314 10:14:49.574019 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:14:49 crc kubenswrapper[5129]: 
I0314 10:14:49.574730 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:15:00 crc kubenswrapper[5129]: I0314 10:15:00.173012 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx"] Mar 14 10:15:00 crc kubenswrapper[5129]: E0314 10:15:00.174287 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01532751-0be7-49a7-95d2-d5251776e030" containerName="oc" Mar 14 10:15:00 crc kubenswrapper[5129]: I0314 10:15:00.174311 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="01532751-0be7-49a7-95d2-d5251776e030" containerName="oc" Mar 14 10:15:00 crc kubenswrapper[5129]: I0314 10:15:00.174619 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="01532751-0be7-49a7-95d2-d5251776e030" containerName="oc" Mar 14 10:15:00 crc kubenswrapper[5129]: I0314 10:15:00.175476 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx" Mar 14 10:15:00 crc kubenswrapper[5129]: I0314 10:15:00.180935 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 10:15:00 crc kubenswrapper[5129]: I0314 10:15:00.181749 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 10:15:00 crc kubenswrapper[5129]: I0314 10:15:00.188998 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx"] Mar 14 10:15:00 crc kubenswrapper[5129]: I0314 10:15:00.192414 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54240eb6-3bf8-4917-b4be-5bcc86d45244-config-volume\") pod \"collect-profiles-29558055-nhdrx\" (UID: \"54240eb6-3bf8-4917-b4be-5bcc86d45244\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx" Mar 14 10:15:00 crc kubenswrapper[5129]: I0314 10:15:00.192984 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6ww6\" (UniqueName: \"kubernetes.io/projected/54240eb6-3bf8-4917-b4be-5bcc86d45244-kube-api-access-n6ww6\") pod \"collect-profiles-29558055-nhdrx\" (UID: \"54240eb6-3bf8-4917-b4be-5bcc86d45244\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx" Mar 14 10:15:00 crc kubenswrapper[5129]: I0314 10:15:00.193099 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54240eb6-3bf8-4917-b4be-5bcc86d45244-secret-volume\") pod \"collect-profiles-29558055-nhdrx\" (UID: \"54240eb6-3bf8-4917-b4be-5bcc86d45244\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx" Mar 14 10:15:00 crc kubenswrapper[5129]: I0314 10:15:00.297467 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6ww6\" (UniqueName: \"kubernetes.io/projected/54240eb6-3bf8-4917-b4be-5bcc86d45244-kube-api-access-n6ww6\") pod \"collect-profiles-29558055-nhdrx\" (UID: \"54240eb6-3bf8-4917-b4be-5bcc86d45244\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx" Mar 14 10:15:00 crc kubenswrapper[5129]: I0314 10:15:00.297531 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54240eb6-3bf8-4917-b4be-5bcc86d45244-secret-volume\") pod \"collect-profiles-29558055-nhdrx\" (UID: \"54240eb6-3bf8-4917-b4be-5bcc86d45244\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx" Mar 14 10:15:00 crc kubenswrapper[5129]: I0314 10:15:00.297618 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54240eb6-3bf8-4917-b4be-5bcc86d45244-config-volume\") pod \"collect-profiles-29558055-nhdrx\" (UID: \"54240eb6-3bf8-4917-b4be-5bcc86d45244\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx" Mar 14 10:15:00 crc kubenswrapper[5129]: I0314 10:15:00.298504 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54240eb6-3bf8-4917-b4be-5bcc86d45244-config-volume\") pod \"collect-profiles-29558055-nhdrx\" (UID: \"54240eb6-3bf8-4917-b4be-5bcc86d45244\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx" Mar 14 10:15:00 crc kubenswrapper[5129]: I0314 10:15:00.309677 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/54240eb6-3bf8-4917-b4be-5bcc86d45244-secret-volume\") pod \"collect-profiles-29558055-nhdrx\" (UID: \"54240eb6-3bf8-4917-b4be-5bcc86d45244\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx" Mar 14 10:15:00 crc kubenswrapper[5129]: I0314 10:15:00.316636 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6ww6\" (UniqueName: \"kubernetes.io/projected/54240eb6-3bf8-4917-b4be-5bcc86d45244-kube-api-access-n6ww6\") pod \"collect-profiles-29558055-nhdrx\" (UID: \"54240eb6-3bf8-4917-b4be-5bcc86d45244\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx" Mar 14 10:15:00 crc kubenswrapper[5129]: I0314 10:15:00.503194 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx" Mar 14 10:15:01 crc kubenswrapper[5129]: I0314 10:15:01.221679 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx"] Mar 14 10:15:01 crc kubenswrapper[5129]: I0314 10:15:01.696829 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx" event={"ID":"54240eb6-3bf8-4917-b4be-5bcc86d45244","Type":"ContainerStarted","Data":"926f3d7736f770c3e445cdd266f8337d247603ed664f8ae0da66273938bfe993"} Mar 14 10:15:01 crc kubenswrapper[5129]: I0314 10:15:01.697199 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx" event={"ID":"54240eb6-3bf8-4917-b4be-5bcc86d45244","Type":"ContainerStarted","Data":"6dc46d570381d99b51495160520bc106a303c8d03eef3d5aa36d5747bc52ab84"} Mar 14 10:15:01 crc kubenswrapper[5129]: I0314 10:15:01.734811 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx" 
podStartSLOduration=1.7347830210000001 podStartE2EDuration="1.734783021s" podCreationTimestamp="2026-03-14 10:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:15:01.712026656 +0000 UTC m=+11764.463941850" watchObservedRunningTime="2026-03-14 10:15:01.734783021 +0000 UTC m=+11764.486698215" Mar 14 10:15:02 crc kubenswrapper[5129]: I0314 10:15:02.710911 5129 generic.go:334] "Generic (PLEG): container finished" podID="54240eb6-3bf8-4917-b4be-5bcc86d45244" containerID="926f3d7736f770c3e445cdd266f8337d247603ed664f8ae0da66273938bfe993" exitCode=0 Mar 14 10:15:02 crc kubenswrapper[5129]: I0314 10:15:02.710964 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx" event={"ID":"54240eb6-3bf8-4917-b4be-5bcc86d45244","Type":"ContainerDied","Data":"926f3d7736f770c3e445cdd266f8337d247603ed664f8ae0da66273938bfe993"} Mar 14 10:15:05 crc kubenswrapper[5129]: I0314 10:15:05.258891 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx" Mar 14 10:15:05 crc kubenswrapper[5129]: I0314 10:15:05.436938 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6ww6\" (UniqueName: \"kubernetes.io/projected/54240eb6-3bf8-4917-b4be-5bcc86d45244-kube-api-access-n6ww6\") pod \"54240eb6-3bf8-4917-b4be-5bcc86d45244\" (UID: \"54240eb6-3bf8-4917-b4be-5bcc86d45244\") " Mar 14 10:15:05 crc kubenswrapper[5129]: I0314 10:15:05.437146 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54240eb6-3bf8-4917-b4be-5bcc86d45244-config-volume\") pod \"54240eb6-3bf8-4917-b4be-5bcc86d45244\" (UID: \"54240eb6-3bf8-4917-b4be-5bcc86d45244\") " Mar 14 10:15:05 crc kubenswrapper[5129]: I0314 10:15:05.437237 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54240eb6-3bf8-4917-b4be-5bcc86d45244-secret-volume\") pod \"54240eb6-3bf8-4917-b4be-5bcc86d45244\" (UID: \"54240eb6-3bf8-4917-b4be-5bcc86d45244\") " Mar 14 10:15:05 crc kubenswrapper[5129]: I0314 10:15:05.437952 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54240eb6-3bf8-4917-b4be-5bcc86d45244-config-volume" (OuterVolumeSpecName: "config-volume") pod "54240eb6-3bf8-4917-b4be-5bcc86d45244" (UID: "54240eb6-3bf8-4917-b4be-5bcc86d45244"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:15:05 crc kubenswrapper[5129]: I0314 10:15:05.438312 5129 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54240eb6-3bf8-4917-b4be-5bcc86d45244-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 10:15:05 crc kubenswrapper[5129]: I0314 10:15:05.446386 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54240eb6-3bf8-4917-b4be-5bcc86d45244-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "54240eb6-3bf8-4917-b4be-5bcc86d45244" (UID: "54240eb6-3bf8-4917-b4be-5bcc86d45244"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:15:05 crc kubenswrapper[5129]: I0314 10:15:05.450986 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54240eb6-3bf8-4917-b4be-5bcc86d45244-kube-api-access-n6ww6" (OuterVolumeSpecName: "kube-api-access-n6ww6") pod "54240eb6-3bf8-4917-b4be-5bcc86d45244" (UID: "54240eb6-3bf8-4917-b4be-5bcc86d45244"). InnerVolumeSpecName "kube-api-access-n6ww6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:15:05 crc kubenswrapper[5129]: I0314 10:15:05.540478 5129 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54240eb6-3bf8-4917-b4be-5bcc86d45244-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 10:15:05 crc kubenswrapper[5129]: I0314 10:15:05.540525 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6ww6\" (UniqueName: \"kubernetes.io/projected/54240eb6-3bf8-4917-b4be-5bcc86d45244-kube-api-access-n6ww6\") on node \"crc\" DevicePath \"\"" Mar 14 10:15:05 crc kubenswrapper[5129]: I0314 10:15:05.755695 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx" event={"ID":"54240eb6-3bf8-4917-b4be-5bcc86d45244","Type":"ContainerDied","Data":"6dc46d570381d99b51495160520bc106a303c8d03eef3d5aa36d5747bc52ab84"} Mar 14 10:15:05 crc kubenswrapper[5129]: I0314 10:15:05.755752 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dc46d570381d99b51495160520bc106a303c8d03eef3d5aa36d5747bc52ab84" Mar 14 10:15:05 crc kubenswrapper[5129]: I0314 10:15:05.755829 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558055-nhdrx" Mar 14 10:15:06 crc kubenswrapper[5129]: I0314 10:15:06.360251 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922"] Mar 14 10:15:06 crc kubenswrapper[5129]: I0314 10:15:06.373267 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558010-wv922"] Mar 14 10:15:08 crc kubenswrapper[5129]: I0314 10:15:08.057996 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="736ecdc4-7395-4bb0-b3e5-fc6556e7c8db" path="/var/lib/kubelet/pods/736ecdc4-7395-4bb0-b3e5-fc6556e7c8db/volumes" Mar 14 10:15:19 crc kubenswrapper[5129]: I0314 10:15:19.574684 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:15:19 crc kubenswrapper[5129]: I0314 10:15:19.576024 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:15:19 crc kubenswrapper[5129]: I0314 10:15:19.576136 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 10:15:19 crc kubenswrapper[5129]: I0314 10:15:19.577029 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea55a9c69c43b228fc76f8d80751a65180ad34d13576370de81b7c849d750b69"} 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 10:15:19 crc kubenswrapper[5129]: I0314 10:15:19.577153 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://ea55a9c69c43b228fc76f8d80751a65180ad34d13576370de81b7c849d750b69" gracePeriod=600 Mar 14 10:15:19 crc kubenswrapper[5129]: I0314 10:15:19.937185 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="ea55a9c69c43b228fc76f8d80751a65180ad34d13576370de81b7c849d750b69" exitCode=0 Mar 14 10:15:19 crc kubenswrapper[5129]: I0314 10:15:19.937263 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"ea55a9c69c43b228fc76f8d80751a65180ad34d13576370de81b7c849d750b69"} Mar 14 10:15:19 crc kubenswrapper[5129]: I0314 10:15:19.937530 5129 scope.go:117] "RemoveContainer" containerID="eac6d20213372d682150825b71067186412bcfbcafd79291bf8ee9948fb77311" Mar 14 10:15:20 crc kubenswrapper[5129]: I0314 10:15:20.952426 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1"} Mar 14 10:15:34 crc kubenswrapper[5129]: I0314 10:15:34.659512 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hngs4"] Mar 14 10:15:34 crc kubenswrapper[5129]: E0314 10:15:34.660842 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="54240eb6-3bf8-4917-b4be-5bcc86d45244" containerName="collect-profiles" Mar 14 10:15:34 crc kubenswrapper[5129]: I0314 10:15:34.660860 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="54240eb6-3bf8-4917-b4be-5bcc86d45244" containerName="collect-profiles" Mar 14 10:15:34 crc kubenswrapper[5129]: I0314 10:15:34.661170 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="54240eb6-3bf8-4917-b4be-5bcc86d45244" containerName="collect-profiles" Mar 14 10:15:34 crc kubenswrapper[5129]: I0314 10:15:34.663219 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hngs4" Mar 14 10:15:34 crc kubenswrapper[5129]: I0314 10:15:34.729444 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hngs4"] Mar 14 10:15:34 crc kubenswrapper[5129]: I0314 10:15:34.769578 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b83712a6-1c0a-4179-b51f-2f718d5bbb24-catalog-content\") pod \"certified-operators-hngs4\" (UID: \"b83712a6-1c0a-4179-b51f-2f718d5bbb24\") " pod="openshift-marketplace/certified-operators-hngs4" Mar 14 10:15:34 crc kubenswrapper[5129]: I0314 10:15:34.769826 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b83712a6-1c0a-4179-b51f-2f718d5bbb24-utilities\") pod \"certified-operators-hngs4\" (UID: \"b83712a6-1c0a-4179-b51f-2f718d5bbb24\") " pod="openshift-marketplace/certified-operators-hngs4" Mar 14 10:15:34 crc kubenswrapper[5129]: I0314 10:15:34.769974 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbcl6\" (UniqueName: \"kubernetes.io/projected/b83712a6-1c0a-4179-b51f-2f718d5bbb24-kube-api-access-dbcl6\") pod \"certified-operators-hngs4\" (UID: 
\"b83712a6-1c0a-4179-b51f-2f718d5bbb24\") " pod="openshift-marketplace/certified-operators-hngs4" Mar 14 10:15:34 crc kubenswrapper[5129]: I0314 10:15:34.871874 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b83712a6-1c0a-4179-b51f-2f718d5bbb24-utilities\") pod \"certified-operators-hngs4\" (UID: \"b83712a6-1c0a-4179-b51f-2f718d5bbb24\") " pod="openshift-marketplace/certified-operators-hngs4" Mar 14 10:15:34 crc kubenswrapper[5129]: I0314 10:15:34.871971 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbcl6\" (UniqueName: \"kubernetes.io/projected/b83712a6-1c0a-4179-b51f-2f718d5bbb24-kube-api-access-dbcl6\") pod \"certified-operators-hngs4\" (UID: \"b83712a6-1c0a-4179-b51f-2f718d5bbb24\") " pod="openshift-marketplace/certified-operators-hngs4" Mar 14 10:15:34 crc kubenswrapper[5129]: I0314 10:15:34.872144 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b83712a6-1c0a-4179-b51f-2f718d5bbb24-catalog-content\") pod \"certified-operators-hngs4\" (UID: \"b83712a6-1c0a-4179-b51f-2f718d5bbb24\") " pod="openshift-marketplace/certified-operators-hngs4" Mar 14 10:15:34 crc kubenswrapper[5129]: I0314 10:15:34.872372 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b83712a6-1c0a-4179-b51f-2f718d5bbb24-utilities\") pod \"certified-operators-hngs4\" (UID: \"b83712a6-1c0a-4179-b51f-2f718d5bbb24\") " pod="openshift-marketplace/certified-operators-hngs4" Mar 14 10:15:34 crc kubenswrapper[5129]: I0314 10:15:34.872789 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b83712a6-1c0a-4179-b51f-2f718d5bbb24-catalog-content\") pod \"certified-operators-hngs4\" (UID: \"b83712a6-1c0a-4179-b51f-2f718d5bbb24\") 
" pod="openshift-marketplace/certified-operators-hngs4" Mar 14 10:15:34 crc kubenswrapper[5129]: I0314 10:15:34.896506 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbcl6\" (UniqueName: \"kubernetes.io/projected/b83712a6-1c0a-4179-b51f-2f718d5bbb24-kube-api-access-dbcl6\") pod \"certified-operators-hngs4\" (UID: \"b83712a6-1c0a-4179-b51f-2f718d5bbb24\") " pod="openshift-marketplace/certified-operators-hngs4" Mar 14 10:15:34 crc kubenswrapper[5129]: I0314 10:15:34.997673 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hngs4" Mar 14 10:15:35 crc kubenswrapper[5129]: I0314 10:15:35.840844 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hngs4"] Mar 14 10:15:36 crc kubenswrapper[5129]: I0314 10:15:36.146493 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hngs4" event={"ID":"b83712a6-1c0a-4179-b51f-2f718d5bbb24","Type":"ContainerDied","Data":"0a6a7c3a474ef4e1d2e7dab9bffc1bd0d83767bac6b4de57b7ac8324182dee39"} Mar 14 10:15:36 crc kubenswrapper[5129]: I0314 10:15:36.146453 5129 generic.go:334] "Generic (PLEG): container finished" podID="b83712a6-1c0a-4179-b51f-2f718d5bbb24" containerID="0a6a7c3a474ef4e1d2e7dab9bffc1bd0d83767bac6b4de57b7ac8324182dee39" exitCode=0 Mar 14 10:15:36 crc kubenswrapper[5129]: I0314 10:15:36.146811 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hngs4" event={"ID":"b83712a6-1c0a-4179-b51f-2f718d5bbb24","Type":"ContainerStarted","Data":"c16d2c5297a1542d391cb9b593281063e8833f9a36a8c6dc97d28c883c250dd9"} Mar 14 10:15:37 crc kubenswrapper[5129]: I0314 10:15:37.160644 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hngs4" 
event={"ID":"b83712a6-1c0a-4179-b51f-2f718d5bbb24","Type":"ContainerStarted","Data":"3a8d6936041c0bc0ec60ef5dc6740634198eb7e3245f3313bf51508b7f843604"} Mar 14 10:15:38 crc kubenswrapper[5129]: E0314 10:15:38.976516 5129 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb83712a6_1c0a_4179_b51f_2f718d5bbb24.slice/crio-conmon-3a8d6936041c0bc0ec60ef5dc6740634198eb7e3245f3313bf51508b7f843604.scope\": RecentStats: unable to find data in memory cache]" Mar 14 10:15:39 crc kubenswrapper[5129]: I0314 10:15:39.189950 5129 generic.go:334] "Generic (PLEG): container finished" podID="b83712a6-1c0a-4179-b51f-2f718d5bbb24" containerID="3a8d6936041c0bc0ec60ef5dc6740634198eb7e3245f3313bf51508b7f843604" exitCode=0 Mar 14 10:15:39 crc kubenswrapper[5129]: I0314 10:15:39.190018 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hngs4" event={"ID":"b83712a6-1c0a-4179-b51f-2f718d5bbb24","Type":"ContainerDied","Data":"3a8d6936041c0bc0ec60ef5dc6740634198eb7e3245f3313bf51508b7f843604"} Mar 14 10:15:40 crc kubenswrapper[5129]: I0314 10:15:40.204482 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hngs4" event={"ID":"b83712a6-1c0a-4179-b51f-2f718d5bbb24","Type":"ContainerStarted","Data":"d080eeef82e9f164b80aa299542fa268eaf5596ecbcac0a56bc5b09b5f865c07"} Mar 14 10:15:40 crc kubenswrapper[5129]: I0314 10:15:40.231567 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hngs4" podStartSLOduration=2.719710483 podStartE2EDuration="6.231547037s" podCreationTimestamp="2026-03-14 10:15:34 +0000 UTC" firstStartedPulling="2026-03-14 10:15:36.148731895 +0000 UTC m=+11798.900647079" lastFinishedPulling="2026-03-14 10:15:39.660568449 +0000 UTC m=+11802.412483633" observedRunningTime="2026-03-14 10:15:40.219732858 
+0000 UTC m=+11802.971648042" watchObservedRunningTime="2026-03-14 10:15:40.231547037 +0000 UTC m=+11802.983462221" Mar 14 10:15:41 crc kubenswrapper[5129]: I0314 10:15:41.028816 5129 scope.go:117] "RemoveContainer" containerID="be36b34f644b29c4aac73d6c163cf3d2c25f82d96952f83febab75c1b01c4b8e" Mar 14 10:15:41 crc kubenswrapper[5129]: I0314 10:15:41.079441 5129 scope.go:117] "RemoveContainer" containerID="d16feee0a3e9cb213de96f7af24b0ac8037a4b9829f9ee9b41ff76d2bcaaca51" Mar 14 10:15:41 crc kubenswrapper[5129]: I0314 10:15:41.189728 5129 scope.go:117] "RemoveContainer" containerID="145a96c7a443b4a53dd82374ba4ed34c2101a2144a6533cd8c8736d71661ac69" Mar 14 10:15:41 crc kubenswrapper[5129]: I0314 10:15:41.237523 5129 scope.go:117] "RemoveContainer" containerID="68f1ceaa86d325d6c13a84bf243388f5ae290a2dc6d0df234c45815bedfcaa5f" Mar 14 10:15:44 crc kubenswrapper[5129]: I0314 10:15:44.997800 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hngs4" Mar 14 10:15:44 crc kubenswrapper[5129]: I0314 10:15:44.999312 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hngs4" Mar 14 10:15:45 crc kubenswrapper[5129]: I0314 10:15:45.045910 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hngs4" Mar 14 10:15:45 crc kubenswrapper[5129]: I0314 10:15:45.322712 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hngs4" Mar 14 10:15:45 crc kubenswrapper[5129]: I0314 10:15:45.385018 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hngs4"] Mar 14 10:15:47 crc kubenswrapper[5129]: I0314 10:15:47.290767 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hngs4" 
podUID="b83712a6-1c0a-4179-b51f-2f718d5bbb24" containerName="registry-server" containerID="cri-o://d080eeef82e9f164b80aa299542fa268eaf5596ecbcac0a56bc5b09b5f865c07" gracePeriod=2 Mar 14 10:15:48 crc kubenswrapper[5129]: I0314 10:15:48.303860 5129 generic.go:334] "Generic (PLEG): container finished" podID="b83712a6-1c0a-4179-b51f-2f718d5bbb24" containerID="d080eeef82e9f164b80aa299542fa268eaf5596ecbcac0a56bc5b09b5f865c07" exitCode=0 Mar 14 10:15:48 crc kubenswrapper[5129]: I0314 10:15:48.303912 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hngs4" event={"ID":"b83712a6-1c0a-4179-b51f-2f718d5bbb24","Type":"ContainerDied","Data":"d080eeef82e9f164b80aa299542fa268eaf5596ecbcac0a56bc5b09b5f865c07"} Mar 14 10:15:48 crc kubenswrapper[5129]: I0314 10:15:48.904723 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hngs4" Mar 14 10:15:49 crc kubenswrapper[5129]: I0314 10:15:49.037528 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b83712a6-1c0a-4179-b51f-2f718d5bbb24-catalog-content\") pod \"b83712a6-1c0a-4179-b51f-2f718d5bbb24\" (UID: \"b83712a6-1c0a-4179-b51f-2f718d5bbb24\") " Mar 14 10:15:49 crc kubenswrapper[5129]: I0314 10:15:49.037674 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b83712a6-1c0a-4179-b51f-2f718d5bbb24-utilities\") pod \"b83712a6-1c0a-4179-b51f-2f718d5bbb24\" (UID: \"b83712a6-1c0a-4179-b51f-2f718d5bbb24\") " Mar 14 10:15:49 crc kubenswrapper[5129]: I0314 10:15:49.038040 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbcl6\" (UniqueName: \"kubernetes.io/projected/b83712a6-1c0a-4179-b51f-2f718d5bbb24-kube-api-access-dbcl6\") pod \"b83712a6-1c0a-4179-b51f-2f718d5bbb24\" (UID: 
\"b83712a6-1c0a-4179-b51f-2f718d5bbb24\") " Mar 14 10:15:49 crc kubenswrapper[5129]: I0314 10:15:49.038466 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b83712a6-1c0a-4179-b51f-2f718d5bbb24-utilities" (OuterVolumeSpecName: "utilities") pod "b83712a6-1c0a-4179-b51f-2f718d5bbb24" (UID: "b83712a6-1c0a-4179-b51f-2f718d5bbb24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:15:49 crc kubenswrapper[5129]: I0314 10:15:49.038624 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b83712a6-1c0a-4179-b51f-2f718d5bbb24-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 10:15:49 crc kubenswrapper[5129]: I0314 10:15:49.048374 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b83712a6-1c0a-4179-b51f-2f718d5bbb24-kube-api-access-dbcl6" (OuterVolumeSpecName: "kube-api-access-dbcl6") pod "b83712a6-1c0a-4179-b51f-2f718d5bbb24" (UID: "b83712a6-1c0a-4179-b51f-2f718d5bbb24"). InnerVolumeSpecName "kube-api-access-dbcl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:15:49 crc kubenswrapper[5129]: I0314 10:15:49.099104 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b83712a6-1c0a-4179-b51f-2f718d5bbb24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b83712a6-1c0a-4179-b51f-2f718d5bbb24" (UID: "b83712a6-1c0a-4179-b51f-2f718d5bbb24"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:15:49 crc kubenswrapper[5129]: I0314 10:15:49.140331 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b83712a6-1c0a-4179-b51f-2f718d5bbb24-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 10:15:49 crc kubenswrapper[5129]: I0314 10:15:49.140369 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbcl6\" (UniqueName: \"kubernetes.io/projected/b83712a6-1c0a-4179-b51f-2f718d5bbb24-kube-api-access-dbcl6\") on node \"crc\" DevicePath \"\"" Mar 14 10:15:49 crc kubenswrapper[5129]: I0314 10:15:49.318105 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hngs4" event={"ID":"b83712a6-1c0a-4179-b51f-2f718d5bbb24","Type":"ContainerDied","Data":"c16d2c5297a1542d391cb9b593281063e8833f9a36a8c6dc97d28c883c250dd9"} Mar 14 10:15:49 crc kubenswrapper[5129]: I0314 10:15:49.318171 5129 scope.go:117] "RemoveContainer" containerID="d080eeef82e9f164b80aa299542fa268eaf5596ecbcac0a56bc5b09b5f865c07" Mar 14 10:15:49 crc kubenswrapper[5129]: I0314 10:15:49.318208 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hngs4" Mar 14 10:15:49 crc kubenswrapper[5129]: I0314 10:15:49.353994 5129 scope.go:117] "RemoveContainer" containerID="3a8d6936041c0bc0ec60ef5dc6740634198eb7e3245f3313bf51508b7f843604" Mar 14 10:15:49 crc kubenswrapper[5129]: I0314 10:15:49.369805 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hngs4"] Mar 14 10:15:49 crc kubenswrapper[5129]: I0314 10:15:49.383917 5129 scope.go:117] "RemoveContainer" containerID="0a6a7c3a474ef4e1d2e7dab9bffc1bd0d83767bac6b4de57b7ac8324182dee39" Mar 14 10:15:49 crc kubenswrapper[5129]: I0314 10:15:49.390751 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hngs4"] Mar 14 10:15:50 crc kubenswrapper[5129]: I0314 10:15:50.054486 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b83712a6-1c0a-4179-b51f-2f718d5bbb24" path="/var/lib/kubelet/pods/b83712a6-1c0a-4179-b51f-2f718d5bbb24/volumes" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.587191 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-debug-6z8gr"] Mar 14 10:15:52 crc kubenswrapper[5129]: E0314 10:15:52.588167 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83712a6-1c0a-4179-b51f-2f718d5bbb24" containerName="extract-utilities" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.588185 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83712a6-1c0a-4179-b51f-2f718d5bbb24" containerName="extract-utilities" Mar 14 10:15:52 crc kubenswrapper[5129]: E0314 10:15:52.588204 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83712a6-1c0a-4179-b51f-2f718d5bbb24" containerName="registry-server" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.588212 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83712a6-1c0a-4179-b51f-2f718d5bbb24" containerName="registry-server" Mar 
14 10:15:52 crc kubenswrapper[5129]: E0314 10:15:52.588263 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83712a6-1c0a-4179-b51f-2f718d5bbb24" containerName="extract-content" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.588272 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83712a6-1c0a-4179-b51f-2f718d5bbb24" containerName="extract-content" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.588532 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b83712a6-1c0a-4179-b51f-2f718d5bbb24" containerName="registry-server" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.590161 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.596879 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.597203 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.613655 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-6z8gr"] Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.720689 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.720771 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-etc-swift\") pod 
\"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.720809 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-scripts\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.720988 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-swiftconf\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.721093 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9t7b\" (UniqueName: \"kubernetes.io/projected/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-kube-api-access-f9t7b\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.721399 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-dispersionconf\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.721545 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-ring-data-devices\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.823363 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.823455 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-etc-swift\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.823500 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-scripts\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.823550 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-swiftconf\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.823573 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9t7b\" (UniqueName: 
\"kubernetes.io/projected/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-kube-api-access-f9t7b\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.823672 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-dispersionconf\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.823713 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-ring-data-devices\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.824429 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-etc-swift\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.824710 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-scripts\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.824747 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-ring-data-devices\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.830735 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-swiftconf\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.831402 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-combined-ca-bundle\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.842029 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-dispersionconf\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.848149 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9t7b\" (UniqueName: \"kubernetes.io/projected/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-kube-api-access-f9t7b\") pod \"swift-ring-rebalance-debug-6z8gr\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:52 crc kubenswrapper[5129]: I0314 10:15:52.911632 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:15:53 crc kubenswrapper[5129]: I0314 10:15:53.647245 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-debug-6z8gr"] Mar 14 10:15:54 crc kubenswrapper[5129]: I0314 10:15:54.418156 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-6z8gr" event={"ID":"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79","Type":"ContainerStarted","Data":"dd10baca5a9e8f9c7459a3e668f50ded1c65ba9635575d9ff9ffd157dab8f356"} Mar 14 10:15:54 crc kubenswrapper[5129]: I0314 10:15:54.418533 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-6z8gr" event={"ID":"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79","Type":"ContainerStarted","Data":"c8946b759bae21f85d5c8c37415fa1f6666a37996834260e44ceca777b9afea1"} Mar 14 10:15:54 crc kubenswrapper[5129]: I0314 10:15:54.450783 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-debug-6z8gr" podStartSLOduration=2.450759434 podStartE2EDuration="2.450759434s" podCreationTimestamp="2026-03-14 10:15:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:15:54.441755251 +0000 UTC m=+11817.193670455" watchObservedRunningTime="2026-03-14 10:15:54.450759434 +0000 UTC m=+11817.202674618" Mar 14 10:16:00 crc kubenswrapper[5129]: I0314 10:16:00.155741 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558056-ljc6z"] Mar 14 10:16:00 crc kubenswrapper[5129]: I0314 10:16:00.158952 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558056-ljc6z" Mar 14 10:16:00 crc kubenswrapper[5129]: I0314 10:16:00.161481 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:16:00 crc kubenswrapper[5129]: I0314 10:16:00.162338 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:16:00 crc kubenswrapper[5129]: I0314 10:16:00.165077 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:16:00 crc kubenswrapper[5129]: I0314 10:16:00.169421 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558056-ljc6z"] Mar 14 10:16:00 crc kubenswrapper[5129]: I0314 10:16:00.240863 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdd8b\" (UniqueName: \"kubernetes.io/projected/e6ed34c3-be44-4144-9157-21e052ca247f-kube-api-access-jdd8b\") pod \"auto-csr-approver-29558056-ljc6z\" (UID: \"e6ed34c3-be44-4144-9157-21e052ca247f\") " pod="openshift-infra/auto-csr-approver-29558056-ljc6z" Mar 14 10:16:00 crc kubenswrapper[5129]: I0314 10:16:00.343059 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdd8b\" (UniqueName: \"kubernetes.io/projected/e6ed34c3-be44-4144-9157-21e052ca247f-kube-api-access-jdd8b\") pod \"auto-csr-approver-29558056-ljc6z\" (UID: \"e6ed34c3-be44-4144-9157-21e052ca247f\") " pod="openshift-infra/auto-csr-approver-29558056-ljc6z" Mar 14 10:16:00 crc kubenswrapper[5129]: I0314 10:16:00.363587 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdd8b\" (UniqueName: \"kubernetes.io/projected/e6ed34c3-be44-4144-9157-21e052ca247f-kube-api-access-jdd8b\") pod \"auto-csr-approver-29558056-ljc6z\" (UID: \"e6ed34c3-be44-4144-9157-21e052ca247f\") " 
pod="openshift-infra/auto-csr-approver-29558056-ljc6z" Mar 14 10:16:00 crc kubenswrapper[5129]: I0314 10:16:00.489544 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558056-ljc6z" Mar 14 10:16:01 crc kubenswrapper[5129]: I0314 10:16:01.269555 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558056-ljc6z"] Mar 14 10:16:01 crc kubenswrapper[5129]: I0314 10:16:01.527987 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558056-ljc6z" event={"ID":"e6ed34c3-be44-4144-9157-21e052ca247f","Type":"ContainerStarted","Data":"9a80bee7c2effd5b962b1227c3ef13fdd79041bec75141ff1c38e401f36f3f90"} Mar 14 10:16:01 crc kubenswrapper[5129]: I0314 10:16:01.531153 5129 generic.go:334] "Generic (PLEG): container finished" podID="22cb726e-d0f5-4eac-9a7b-0ea5f4595c79" containerID="dd10baca5a9e8f9c7459a3e668f50ded1c65ba9635575d9ff9ffd157dab8f356" exitCode=0 Mar 14 10:16:01 crc kubenswrapper[5129]: I0314 10:16:01.531208 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-debug-6z8gr" event={"ID":"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79","Type":"ContainerDied","Data":"dd10baca5a9e8f9c7459a3e668f50ded1c65ba9635575d9ff9ffd157dab8f356"} Mar 14 10:16:02 crc kubenswrapper[5129]: I0314 10:16:02.544686 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558056-ljc6z" event={"ID":"e6ed34c3-be44-4144-9157-21e052ca247f","Type":"ContainerStarted","Data":"5843014066fe8848ce87502046929dda076525bb462301cce89271a23eca35ce"} Mar 14 10:16:02 crc kubenswrapper[5129]: I0314 10:16:02.558775 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558056-ljc6z" podStartSLOduration=1.765101107 podStartE2EDuration="2.558753019s" podCreationTimestamp="2026-03-14 10:16:00 +0000 UTC" firstStartedPulling="2026-03-14 
10:16:01.276703063 +0000 UTC m=+11824.028618237" lastFinishedPulling="2026-03-14 10:16:02.070354955 +0000 UTC m=+11824.822270149" observedRunningTime="2026-03-14 10:16:02.557951687 +0000 UTC m=+11825.309866871" watchObservedRunningTime="2026-03-14 10:16:02.558753019 +0000 UTC m=+11825.310668203" Mar 14 10:16:03 crc kubenswrapper[5129]: I0314 10:16:03.561822 5129 generic.go:334] "Generic (PLEG): container finished" podID="e6ed34c3-be44-4144-9157-21e052ca247f" containerID="5843014066fe8848ce87502046929dda076525bb462301cce89271a23eca35ce" exitCode=0 Mar 14 10:16:03 crc kubenswrapper[5129]: I0314 10:16:03.561875 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558056-ljc6z" event={"ID":"e6ed34c3-be44-4144-9157-21e052ca247f","Type":"ContainerDied","Data":"5843014066fe8848ce87502046929dda076525bb462301cce89271a23eca35ce"} Mar 14 10:16:03 crc kubenswrapper[5129]: I0314 10:16:03.908084 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:16:03 crc kubenswrapper[5129]: I0314 10:16:03.961388 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-debug-6z8gr"] Mar 14 10:16:03 crc kubenswrapper[5129]: I0314 10:16:03.970399 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-debug-6z8gr"] Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.085364 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9t7b\" (UniqueName: \"kubernetes.io/projected/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-kube-api-access-f9t7b\") pod \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.085453 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-ring-data-devices\") pod \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.085589 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-combined-ca-bundle\") pod \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.085635 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-scripts\") pod \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.085728 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-swiftconf\") pod \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.085783 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-dispersionconf\") pod \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.085828 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-etc-swift\") pod \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\" (UID: \"22cb726e-d0f5-4eac-9a7b-0ea5f4595c79\") " Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.087338 5129 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "22cb726e-d0f5-4eac-9a7b-0ea5f4595c79" (UID: "22cb726e-d0f5-4eac-9a7b-0ea5f4595c79"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.087353 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "22cb726e-d0f5-4eac-9a7b-0ea5f4595c79" (UID: "22cb726e-d0f5-4eac-9a7b-0ea5f4595c79"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.106869 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-kube-api-access-f9t7b" (OuterVolumeSpecName: "kube-api-access-f9t7b") pod "22cb726e-d0f5-4eac-9a7b-0ea5f4595c79" (UID: "22cb726e-d0f5-4eac-9a7b-0ea5f4595c79"). InnerVolumeSpecName "kube-api-access-f9t7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.151771 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "22cb726e-d0f5-4eac-9a7b-0ea5f4595c79" (UID: "22cb726e-d0f5-4eac-9a7b-0ea5f4595c79"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.171782 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22cb726e-d0f5-4eac-9a7b-0ea5f4595c79" (UID: "22cb726e-d0f5-4eac-9a7b-0ea5f4595c79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.191697 5129 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.191731 5129 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.191741 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9t7b\" (UniqueName: \"kubernetes.io/projected/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-kube-api-access-f9t7b\") on node \"crc\" DevicePath \"\"" Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.191772 5129 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.191782 5129 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.216385 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "22cb726e-d0f5-4eac-9a7b-0ea5f4595c79" (UID: "22cb726e-d0f5-4eac-9a7b-0ea5f4595c79"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.246082 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-scripts" (OuterVolumeSpecName: "scripts") pod "22cb726e-d0f5-4eac-9a7b-0ea5f4595c79" (UID: "22cb726e-d0f5-4eac-9a7b-0ea5f4595c79"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.295291 5129 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.295325 5129 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.576274 5129 scope.go:117] "RemoveContainer" containerID="dd10baca5a9e8f9c7459a3e668f50ded1c65ba9635575d9ff9ffd157dab8f356" Mar 14 10:16:04 crc kubenswrapper[5129]: I0314 10:16:04.576314 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-debug-6z8gr" Mar 14 10:16:05 crc kubenswrapper[5129]: I0314 10:16:05.967850 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558056-ljc6z" Mar 14 10:16:06 crc kubenswrapper[5129]: I0314 10:16:06.055427 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22cb726e-d0f5-4eac-9a7b-0ea5f4595c79" path="/var/lib/kubelet/pods/22cb726e-d0f5-4eac-9a7b-0ea5f4595c79/volumes" Mar 14 10:16:06 crc kubenswrapper[5129]: I0314 10:16:06.135936 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdd8b\" (UniqueName: \"kubernetes.io/projected/e6ed34c3-be44-4144-9157-21e052ca247f-kube-api-access-jdd8b\") pod \"e6ed34c3-be44-4144-9157-21e052ca247f\" (UID: \"e6ed34c3-be44-4144-9157-21e052ca247f\") " Mar 14 10:16:06 crc kubenswrapper[5129]: I0314 10:16:06.148655 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6ed34c3-be44-4144-9157-21e052ca247f-kube-api-access-jdd8b" (OuterVolumeSpecName: "kube-api-access-jdd8b") pod "e6ed34c3-be44-4144-9157-21e052ca247f" (UID: "e6ed34c3-be44-4144-9157-21e052ca247f"). InnerVolumeSpecName "kube-api-access-jdd8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:16:06 crc kubenswrapper[5129]: I0314 10:16:06.239611 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdd8b\" (UniqueName: \"kubernetes.io/projected/e6ed34c3-be44-4144-9157-21e052ca247f-kube-api-access-jdd8b\") on node \"crc\" DevicePath \"\"" Mar 14 10:16:06 crc kubenswrapper[5129]: I0314 10:16:06.602212 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558056-ljc6z" event={"ID":"e6ed34c3-be44-4144-9157-21e052ca247f","Type":"ContainerDied","Data":"9a80bee7c2effd5b962b1227c3ef13fdd79041bec75141ff1c38e401f36f3f90"} Mar 14 10:16:06 crc kubenswrapper[5129]: I0314 10:16:06.602575 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a80bee7c2effd5b962b1227c3ef13fdd79041bec75141ff1c38e401f36f3f90" Mar 14 10:16:06 crc kubenswrapper[5129]: I0314 10:16:06.602255 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558056-ljc6z" Mar 14 10:16:07 crc kubenswrapper[5129]: I0314 10:16:07.047265 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558050-nqkz4"] Mar 14 10:16:07 crc kubenswrapper[5129]: I0314 10:16:07.065344 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558050-nqkz4"] Mar 14 10:16:08 crc kubenswrapper[5129]: I0314 10:16:08.076536 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80013541-f81c-4b8f-af6b-ee9e5110aa29" path="/var/lib/kubelet/pods/80013541-f81c-4b8f-af6b-ee9e5110aa29/volumes" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.259909 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 14 10:16:18 crc kubenswrapper[5129]: E0314 10:16:18.262227 5129 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="22cb726e-d0f5-4eac-9a7b-0ea5f4595c79" containerName="swift-ring-rebalance" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.262321 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="22cb726e-d0f5-4eac-9a7b-0ea5f4595c79" containerName="swift-ring-rebalance" Mar 14 10:16:18 crc kubenswrapper[5129]: E0314 10:16:18.262408 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ed34c3-be44-4144-9157-21e052ca247f" containerName="oc" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.262464 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ed34c3-be44-4144-9157-21e052ca247f" containerName="oc" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.263072 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6ed34c3-be44-4144-9157-21e052ca247f" containerName="oc" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.263176 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="22cb726e-d0f5-4eac-9a7b-0ea5f4595c79" containerName="swift-ring-rebalance" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.264198 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.268626 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jslnd" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.269823 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.271369 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.273497 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.280962 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.315218 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4883291a-4eb0-4a7b-82e9-f0caa4f72148-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.315296 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4883291a-4eb0-4a7b-82e9-f0caa4f72148-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.315329 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4883291a-4eb0-4a7b-82e9-f0caa4f72148-openstack-config\") pod \"tempest-tests-tempest\" 
(UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.315353 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4883291a-4eb0-4a7b-82e9-f0caa4f72148-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.315529 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4883291a-4eb0-4a7b-82e9-f0caa4f72148-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.315681 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4883291a-4eb0-4a7b-82e9-f0caa4f72148-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.315730 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.315755 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-464rc\" (UniqueName: 
\"kubernetes.io/projected/4883291a-4eb0-4a7b-82e9-f0caa4f72148-kube-api-access-464rc\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.315837 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4883291a-4eb0-4a7b-82e9-f0caa4f72148-config-data\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.418092 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4883291a-4eb0-4a7b-82e9-f0caa4f72148-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.418166 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4883291a-4eb0-4a7b-82e9-f0caa4f72148-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.418213 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.418243 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-464rc\" (UniqueName: 
\"kubernetes.io/projected/4883291a-4eb0-4a7b-82e9-f0caa4f72148-kube-api-access-464rc\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.418273 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4883291a-4eb0-4a7b-82e9-f0caa4f72148-config-data\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.418331 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4883291a-4eb0-4a7b-82e9-f0caa4f72148-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.418373 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4883291a-4eb0-4a7b-82e9-f0caa4f72148-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.418396 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4883291a-4eb0-4a7b-82e9-f0caa4f72148-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.418419 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4883291a-4eb0-4a7b-82e9-f0caa4f72148-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: 
\"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.419076 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4883291a-4eb0-4a7b-82e9-f0caa4f72148-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.419133 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4883291a-4eb0-4a7b-82e9-f0caa4f72148-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.419361 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.419658 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4883291a-4eb0-4a7b-82e9-f0caa4f72148-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.419847 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4883291a-4eb0-4a7b-82e9-f0caa4f72148-config-data\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " 
pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.426686 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4883291a-4eb0-4a7b-82e9-f0caa4f72148-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.429895 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4883291a-4eb0-4a7b-82e9-f0caa4f72148-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.430141 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4883291a-4eb0-4a7b-82e9-f0caa4f72148-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.442436 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-464rc\" (UniqueName: \"kubernetes.io/projected/4883291a-4eb0-4a7b-82e9-f0caa4f72148-kube-api-access-464rc\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.456797 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " pod="openstack/tempest-tests-tempest" Mar 14 10:16:18 crc kubenswrapper[5129]: I0314 10:16:18.597311 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 14 10:16:19 crc kubenswrapper[5129]: I0314 10:16:19.338095 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 14 10:16:19 crc kubenswrapper[5129]: I0314 10:16:19.780274 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4883291a-4eb0-4a7b-82e9-f0caa4f72148","Type":"ContainerStarted","Data":"493c3ea7f3908f977af9b1bad95e78ece254007f410ba2364bfa2f6d78576088"} Mar 14 10:16:41 crc kubenswrapper[5129]: I0314 10:16:41.364508 5129 scope.go:117] "RemoveContainer" containerID="2d91210a70935a3feaaab09f92765c43629624e839ec5a3205ee784dcdf833b2" Mar 14 10:17:09 crc kubenswrapper[5129]: E0314 10:17:09.580536 5129 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:7002c9136b77c6990bfebf085d6871b3" Mar 14 10:17:09 crc kubenswrapper[5129]: E0314 10:17:09.581144 5129 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:7002c9136b77c6990bfebf085d6871b3" Mar 14 10:17:09 crc kubenswrapper[5129]: E0314 10:17:09.581331 5129 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:7002c9136b77c6990bfebf085d6871b3,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-464rc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Livene
ssProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(4883291a-4eb0-4a7b-82e9-f0caa4f72148): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 10:17:09 crc kubenswrapper[5129]: E0314 10:17:09.582637 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="4883291a-4eb0-4a7b-82e9-f0caa4f72148" Mar 14 10:17:10 crc kubenswrapper[5129]: E0314 10:17:10.495186 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:7002c9136b77c6990bfebf085d6871b3\\\"\"" pod="openstack/tempest-tests-tempest" 
podUID="4883291a-4eb0-4a7b-82e9-f0caa4f72148" Mar 14 10:17:19 crc kubenswrapper[5129]: I0314 10:17:19.574985 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:17:19 crc kubenswrapper[5129]: I0314 10:17:19.575638 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:17:23 crc kubenswrapper[5129]: I0314 10:17:23.039010 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 10:17:23 crc kubenswrapper[5129]: I0314 10:17:23.275536 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 14 10:17:25 crc kubenswrapper[5129]: I0314 10:17:25.691402 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4883291a-4eb0-4a7b-82e9-f0caa4f72148","Type":"ContainerStarted","Data":"8acf1a76318c854ec6ad2f10ea67f397fdf6fcae07ad90086f5e92f525cf5e33"} Mar 14 10:17:25 crc kubenswrapper[5129]: I0314 10:17:25.717870 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.787007827 podStartE2EDuration="1m8.717846882s" podCreationTimestamp="2026-03-14 10:16:17 +0000 UTC" firstStartedPulling="2026-03-14 10:16:19.341416981 +0000 UTC m=+11842.093332165" lastFinishedPulling="2026-03-14 10:17:23.272256036 +0000 UTC m=+11906.024171220" observedRunningTime="2026-03-14 10:17:25.714825601 +0000 UTC m=+11908.466740795" 
watchObservedRunningTime="2026-03-14 10:17:25.717846882 +0000 UTC m=+11908.469762066" Mar 14 10:17:37 crc kubenswrapper[5129]: I0314 10:17:37.322277 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cv8k9"] Mar 14 10:17:37 crc kubenswrapper[5129]: I0314 10:17:37.325494 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cv8k9" Mar 14 10:17:37 crc kubenswrapper[5129]: I0314 10:17:37.335714 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cv8k9"] Mar 14 10:17:37 crc kubenswrapper[5129]: I0314 10:17:37.476498 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jmsx\" (UniqueName: \"kubernetes.io/projected/f042adb5-2c9d-48bc-b416-73cbd806dbde-kube-api-access-6jmsx\") pod \"redhat-marketplace-cv8k9\" (UID: \"f042adb5-2c9d-48bc-b416-73cbd806dbde\") " pod="openshift-marketplace/redhat-marketplace-cv8k9" Mar 14 10:17:37 crc kubenswrapper[5129]: I0314 10:17:37.476552 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f042adb5-2c9d-48bc-b416-73cbd806dbde-utilities\") pod \"redhat-marketplace-cv8k9\" (UID: \"f042adb5-2c9d-48bc-b416-73cbd806dbde\") " pod="openshift-marketplace/redhat-marketplace-cv8k9" Mar 14 10:17:37 crc kubenswrapper[5129]: I0314 10:17:37.477651 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f042adb5-2c9d-48bc-b416-73cbd806dbde-catalog-content\") pod \"redhat-marketplace-cv8k9\" (UID: \"f042adb5-2c9d-48bc-b416-73cbd806dbde\") " pod="openshift-marketplace/redhat-marketplace-cv8k9" Mar 14 10:17:37 crc kubenswrapper[5129]: I0314 10:17:37.580482 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f042adb5-2c9d-48bc-b416-73cbd806dbde-catalog-content\") pod \"redhat-marketplace-cv8k9\" (UID: \"f042adb5-2c9d-48bc-b416-73cbd806dbde\") " pod="openshift-marketplace/redhat-marketplace-cv8k9" Mar 14 10:17:37 crc kubenswrapper[5129]: I0314 10:17:37.580618 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jmsx\" (UniqueName: \"kubernetes.io/projected/f042adb5-2c9d-48bc-b416-73cbd806dbde-kube-api-access-6jmsx\") pod \"redhat-marketplace-cv8k9\" (UID: \"f042adb5-2c9d-48bc-b416-73cbd806dbde\") " pod="openshift-marketplace/redhat-marketplace-cv8k9" Mar 14 10:17:37 crc kubenswrapper[5129]: I0314 10:17:37.580652 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f042adb5-2c9d-48bc-b416-73cbd806dbde-utilities\") pod \"redhat-marketplace-cv8k9\" (UID: \"f042adb5-2c9d-48bc-b416-73cbd806dbde\") " pod="openshift-marketplace/redhat-marketplace-cv8k9" Mar 14 10:17:37 crc kubenswrapper[5129]: I0314 10:17:37.581033 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f042adb5-2c9d-48bc-b416-73cbd806dbde-catalog-content\") pod \"redhat-marketplace-cv8k9\" (UID: \"f042adb5-2c9d-48bc-b416-73cbd806dbde\") " pod="openshift-marketplace/redhat-marketplace-cv8k9" Mar 14 10:17:37 crc kubenswrapper[5129]: I0314 10:17:37.581310 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f042adb5-2c9d-48bc-b416-73cbd806dbde-utilities\") pod \"redhat-marketplace-cv8k9\" (UID: \"f042adb5-2c9d-48bc-b416-73cbd806dbde\") " pod="openshift-marketplace/redhat-marketplace-cv8k9" Mar 14 10:17:37 crc kubenswrapper[5129]: I0314 10:17:37.606093 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jmsx\" (UniqueName: 
\"kubernetes.io/projected/f042adb5-2c9d-48bc-b416-73cbd806dbde-kube-api-access-6jmsx\") pod \"redhat-marketplace-cv8k9\" (UID: \"f042adb5-2c9d-48bc-b416-73cbd806dbde\") " pod="openshift-marketplace/redhat-marketplace-cv8k9" Mar 14 10:17:37 crc kubenswrapper[5129]: I0314 10:17:37.647738 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cv8k9" Mar 14 10:17:38 crc kubenswrapper[5129]: I0314 10:17:38.457464 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cv8k9"] Mar 14 10:17:38 crc kubenswrapper[5129]: I0314 10:17:38.876310 5129 generic.go:334] "Generic (PLEG): container finished" podID="f042adb5-2c9d-48bc-b416-73cbd806dbde" containerID="80bb4d5df43ef20f80bc16eaf67e4ef0131ed9ae1a5e27d93c426c9ca5fc8ded" exitCode=0 Mar 14 10:17:38 crc kubenswrapper[5129]: I0314 10:17:38.876459 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cv8k9" event={"ID":"f042adb5-2c9d-48bc-b416-73cbd806dbde","Type":"ContainerDied","Data":"80bb4d5df43ef20f80bc16eaf67e4ef0131ed9ae1a5e27d93c426c9ca5fc8ded"} Mar 14 10:17:38 crc kubenswrapper[5129]: I0314 10:17:38.876723 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cv8k9" event={"ID":"f042adb5-2c9d-48bc-b416-73cbd806dbde","Type":"ContainerStarted","Data":"dc753664134fdcf1c1419d1cb3bf62dd561d7d86617aedeeebf7ed605eacb957"} Mar 14 10:17:39 crc kubenswrapper[5129]: I0314 10:17:39.891214 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cv8k9" event={"ID":"f042adb5-2c9d-48bc-b416-73cbd806dbde","Type":"ContainerStarted","Data":"7ce61dc7691fa8395b6789b553a0c1029780933947bde998c80bdff943dfd509"} Mar 14 10:17:41 crc kubenswrapper[5129]: I0314 10:17:41.917241 5129 generic.go:334] "Generic (PLEG): container finished" podID="f042adb5-2c9d-48bc-b416-73cbd806dbde" 
containerID="7ce61dc7691fa8395b6789b553a0c1029780933947bde998c80bdff943dfd509" exitCode=0 Mar 14 10:17:41 crc kubenswrapper[5129]: I0314 10:17:41.917327 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cv8k9" event={"ID":"f042adb5-2c9d-48bc-b416-73cbd806dbde","Type":"ContainerDied","Data":"7ce61dc7691fa8395b6789b553a0c1029780933947bde998c80bdff943dfd509"} Mar 14 10:17:42 crc kubenswrapper[5129]: I0314 10:17:42.941265 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cv8k9" event={"ID":"f042adb5-2c9d-48bc-b416-73cbd806dbde","Type":"ContainerStarted","Data":"98c4e911eb315b379df42b88dc81f38f66668861f480d287bf959bd3ee9f95bb"} Mar 14 10:17:42 crc kubenswrapper[5129]: I0314 10:17:42.972246 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cv8k9" podStartSLOduration=2.519184984 podStartE2EDuration="5.972221327s" podCreationTimestamp="2026-03-14 10:17:37 +0000 UTC" firstStartedPulling="2026-03-14 10:17:38.878426717 +0000 UTC m=+11921.630341911" lastFinishedPulling="2026-03-14 10:17:42.33146307 +0000 UTC m=+11925.083378254" observedRunningTime="2026-03-14 10:17:42.958829874 +0000 UTC m=+11925.710745078" watchObservedRunningTime="2026-03-14 10:17:42.972221327 +0000 UTC m=+11925.724136511" Mar 14 10:17:47 crc kubenswrapper[5129]: I0314 10:17:47.648349 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cv8k9" Mar 14 10:17:47 crc kubenswrapper[5129]: I0314 10:17:47.649010 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cv8k9" Mar 14 10:17:47 crc kubenswrapper[5129]: I0314 10:17:47.723874 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cv8k9" Mar 14 10:17:48 crc kubenswrapper[5129]: I0314 10:17:48.056990 
5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cv8k9" Mar 14 10:17:48 crc kubenswrapper[5129]: I0314 10:17:48.124564 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cv8k9"] Mar 14 10:17:49 crc kubenswrapper[5129]: I0314 10:17:49.574750 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:17:49 crc kubenswrapper[5129]: I0314 10:17:49.575219 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:17:50 crc kubenswrapper[5129]: I0314 10:17:50.009504 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cv8k9" podUID="f042adb5-2c9d-48bc-b416-73cbd806dbde" containerName="registry-server" containerID="cri-o://98c4e911eb315b379df42b88dc81f38f66668861f480d287bf959bd3ee9f95bb" gracePeriod=2 Mar 14 10:17:51 crc kubenswrapper[5129]: I0314 10:17:51.024019 5129 generic.go:334] "Generic (PLEG): container finished" podID="f042adb5-2c9d-48bc-b416-73cbd806dbde" containerID="98c4e911eb315b379df42b88dc81f38f66668861f480d287bf959bd3ee9f95bb" exitCode=0 Mar 14 10:17:51 crc kubenswrapper[5129]: I0314 10:17:51.024368 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cv8k9" event={"ID":"f042adb5-2c9d-48bc-b416-73cbd806dbde","Type":"ContainerDied","Data":"98c4e911eb315b379df42b88dc81f38f66668861f480d287bf959bd3ee9f95bb"} Mar 14 
10:17:51 crc kubenswrapper[5129]: I0314 10:17:51.589149 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cv8k9" Mar 14 10:17:51 crc kubenswrapper[5129]: I0314 10:17:51.789884 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jmsx\" (UniqueName: \"kubernetes.io/projected/f042adb5-2c9d-48bc-b416-73cbd806dbde-kube-api-access-6jmsx\") pod \"f042adb5-2c9d-48bc-b416-73cbd806dbde\" (UID: \"f042adb5-2c9d-48bc-b416-73cbd806dbde\") " Mar 14 10:17:51 crc kubenswrapper[5129]: I0314 10:17:51.790112 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f042adb5-2c9d-48bc-b416-73cbd806dbde-catalog-content\") pod \"f042adb5-2c9d-48bc-b416-73cbd806dbde\" (UID: \"f042adb5-2c9d-48bc-b416-73cbd806dbde\") " Mar 14 10:17:51 crc kubenswrapper[5129]: I0314 10:17:51.790310 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f042adb5-2c9d-48bc-b416-73cbd806dbde-utilities\") pod \"f042adb5-2c9d-48bc-b416-73cbd806dbde\" (UID: \"f042adb5-2c9d-48bc-b416-73cbd806dbde\") " Mar 14 10:17:51 crc kubenswrapper[5129]: I0314 10:17:51.791978 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f042adb5-2c9d-48bc-b416-73cbd806dbde-utilities" (OuterVolumeSpecName: "utilities") pod "f042adb5-2c9d-48bc-b416-73cbd806dbde" (UID: "f042adb5-2c9d-48bc-b416-73cbd806dbde"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:17:51 crc kubenswrapper[5129]: I0314 10:17:51.799272 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f042adb5-2c9d-48bc-b416-73cbd806dbde-kube-api-access-6jmsx" (OuterVolumeSpecName: "kube-api-access-6jmsx") pod "f042adb5-2c9d-48bc-b416-73cbd806dbde" (UID: "f042adb5-2c9d-48bc-b416-73cbd806dbde"). InnerVolumeSpecName "kube-api-access-6jmsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:17:51 crc kubenswrapper[5129]: I0314 10:17:51.814843 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f042adb5-2c9d-48bc-b416-73cbd806dbde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f042adb5-2c9d-48bc-b416-73cbd806dbde" (UID: "f042adb5-2c9d-48bc-b416-73cbd806dbde"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:17:51 crc kubenswrapper[5129]: I0314 10:17:51.893010 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f042adb5-2c9d-48bc-b416-73cbd806dbde-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 10:17:51 crc kubenswrapper[5129]: I0314 10:17:51.893061 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jmsx\" (UniqueName: \"kubernetes.io/projected/f042adb5-2c9d-48bc-b416-73cbd806dbde-kube-api-access-6jmsx\") on node \"crc\" DevicePath \"\"" Mar 14 10:17:51 crc kubenswrapper[5129]: I0314 10:17:51.893080 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f042adb5-2c9d-48bc-b416-73cbd806dbde-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 10:17:52 crc kubenswrapper[5129]: I0314 10:17:52.052332 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cv8k9" Mar 14 10:17:52 crc kubenswrapper[5129]: I0314 10:17:52.054855 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cv8k9" event={"ID":"f042adb5-2c9d-48bc-b416-73cbd806dbde","Type":"ContainerDied","Data":"dc753664134fdcf1c1419d1cb3bf62dd561d7d86617aedeeebf7ed605eacb957"} Mar 14 10:17:52 crc kubenswrapper[5129]: I0314 10:17:52.054923 5129 scope.go:117] "RemoveContainer" containerID="98c4e911eb315b379df42b88dc81f38f66668861f480d287bf959bd3ee9f95bb" Mar 14 10:17:52 crc kubenswrapper[5129]: I0314 10:17:52.088812 5129 scope.go:117] "RemoveContainer" containerID="7ce61dc7691fa8395b6789b553a0c1029780933947bde998c80bdff943dfd509" Mar 14 10:17:52 crc kubenswrapper[5129]: I0314 10:17:52.121414 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cv8k9"] Mar 14 10:17:52 crc kubenswrapper[5129]: I0314 10:17:52.125913 5129 scope.go:117] "RemoveContainer" containerID="80bb4d5df43ef20f80bc16eaf67e4ef0131ed9ae1a5e27d93c426c9ca5fc8ded" Mar 14 10:17:52 crc kubenswrapper[5129]: I0314 10:17:52.142986 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cv8k9"] Mar 14 10:17:54 crc kubenswrapper[5129]: I0314 10:17:54.048463 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f042adb5-2c9d-48bc-b416-73cbd806dbde" path="/var/lib/kubelet/pods/f042adb5-2c9d-48bc-b416-73cbd806dbde/volumes" Mar 14 10:18:00 crc kubenswrapper[5129]: I0314 10:18:00.199712 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558058-qrwrn"] Mar 14 10:18:00 crc kubenswrapper[5129]: E0314 10:18:00.200719 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f042adb5-2c9d-48bc-b416-73cbd806dbde" containerName="extract-utilities" Mar 14 10:18:00 crc kubenswrapper[5129]: I0314 10:18:00.200741 5129 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f042adb5-2c9d-48bc-b416-73cbd806dbde" containerName="extract-utilities" Mar 14 10:18:00 crc kubenswrapper[5129]: E0314 10:18:00.200766 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f042adb5-2c9d-48bc-b416-73cbd806dbde" containerName="registry-server" Mar 14 10:18:00 crc kubenswrapper[5129]: I0314 10:18:00.200772 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f042adb5-2c9d-48bc-b416-73cbd806dbde" containerName="registry-server" Mar 14 10:18:00 crc kubenswrapper[5129]: E0314 10:18:00.200789 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f042adb5-2c9d-48bc-b416-73cbd806dbde" containerName="extract-content" Mar 14 10:18:00 crc kubenswrapper[5129]: I0314 10:18:00.200795 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f042adb5-2c9d-48bc-b416-73cbd806dbde" containerName="extract-content" Mar 14 10:18:00 crc kubenswrapper[5129]: I0314 10:18:00.201005 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f042adb5-2c9d-48bc-b416-73cbd806dbde" containerName="registry-server" Mar 14 10:18:00 crc kubenswrapper[5129]: I0314 10:18:00.201787 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558058-qrwrn" Mar 14 10:18:00 crc kubenswrapper[5129]: I0314 10:18:00.205010 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:18:00 crc kubenswrapper[5129]: I0314 10:18:00.205035 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:18:00 crc kubenswrapper[5129]: I0314 10:18:00.209134 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:18:00 crc kubenswrapper[5129]: I0314 10:18:00.212639 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558058-qrwrn"] Mar 14 10:18:00 crc kubenswrapper[5129]: I0314 10:18:00.312073 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccwh8\" (UniqueName: \"kubernetes.io/projected/82d36578-de52-49e4-b9b5-594afd90d22e-kube-api-access-ccwh8\") pod \"auto-csr-approver-29558058-qrwrn\" (UID: \"82d36578-de52-49e4-b9b5-594afd90d22e\") " pod="openshift-infra/auto-csr-approver-29558058-qrwrn" Mar 14 10:18:00 crc kubenswrapper[5129]: I0314 10:18:00.415412 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccwh8\" (UniqueName: \"kubernetes.io/projected/82d36578-de52-49e4-b9b5-594afd90d22e-kube-api-access-ccwh8\") pod \"auto-csr-approver-29558058-qrwrn\" (UID: \"82d36578-de52-49e4-b9b5-594afd90d22e\") " pod="openshift-infra/auto-csr-approver-29558058-qrwrn" Mar 14 10:18:00 crc kubenswrapper[5129]: I0314 10:18:00.437589 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccwh8\" (UniqueName: \"kubernetes.io/projected/82d36578-de52-49e4-b9b5-594afd90d22e-kube-api-access-ccwh8\") pod \"auto-csr-approver-29558058-qrwrn\" (UID: \"82d36578-de52-49e4-b9b5-594afd90d22e\") " 
pod="openshift-infra/auto-csr-approver-29558058-qrwrn" Mar 14 10:18:00 crc kubenswrapper[5129]: I0314 10:18:00.518731 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558058-qrwrn" Mar 14 10:18:01 crc kubenswrapper[5129]: I0314 10:18:01.294383 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558058-qrwrn"] Mar 14 10:18:02 crc kubenswrapper[5129]: I0314 10:18:02.183733 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558058-qrwrn" event={"ID":"82d36578-de52-49e4-b9b5-594afd90d22e","Type":"ContainerStarted","Data":"0caf90c4ed77798d007afba51daccbf8e0a46994ab8f7dbc0a7f063a059ca588"} Mar 14 10:18:03 crc kubenswrapper[5129]: I0314 10:18:03.197430 5129 generic.go:334] "Generic (PLEG): container finished" podID="82d36578-de52-49e4-b9b5-594afd90d22e" containerID="eb8c2bc4cf39f8bb5969396e344980942c23a6aa66805c7604cf5a5a97c6a4e0" exitCode=0 Mar 14 10:18:03 crc kubenswrapper[5129]: I0314 10:18:03.197842 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558058-qrwrn" event={"ID":"82d36578-de52-49e4-b9b5-594afd90d22e","Type":"ContainerDied","Data":"eb8c2bc4cf39f8bb5969396e344980942c23a6aa66805c7604cf5a5a97c6a4e0"} Mar 14 10:18:05 crc kubenswrapper[5129]: I0314 10:18:05.794948 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558058-qrwrn" Mar 14 10:18:05 crc kubenswrapper[5129]: I0314 10:18:05.850963 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccwh8\" (UniqueName: \"kubernetes.io/projected/82d36578-de52-49e4-b9b5-594afd90d22e-kube-api-access-ccwh8\") pod \"82d36578-de52-49e4-b9b5-594afd90d22e\" (UID: \"82d36578-de52-49e4-b9b5-594afd90d22e\") " Mar 14 10:18:05 crc kubenswrapper[5129]: I0314 10:18:05.876899 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82d36578-de52-49e4-b9b5-594afd90d22e-kube-api-access-ccwh8" (OuterVolumeSpecName: "kube-api-access-ccwh8") pod "82d36578-de52-49e4-b9b5-594afd90d22e" (UID: "82d36578-de52-49e4-b9b5-594afd90d22e"). InnerVolumeSpecName "kube-api-access-ccwh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:18:05 crc kubenswrapper[5129]: I0314 10:18:05.955341 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccwh8\" (UniqueName: \"kubernetes.io/projected/82d36578-de52-49e4-b9b5-594afd90d22e-kube-api-access-ccwh8\") on node \"crc\" DevicePath \"\"" Mar 14 10:18:06 crc kubenswrapper[5129]: I0314 10:18:06.234628 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558058-qrwrn" event={"ID":"82d36578-de52-49e4-b9b5-594afd90d22e","Type":"ContainerDied","Data":"0caf90c4ed77798d007afba51daccbf8e0a46994ab8f7dbc0a7f063a059ca588"} Mar 14 10:18:06 crc kubenswrapper[5129]: I0314 10:18:06.234687 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0caf90c4ed77798d007afba51daccbf8e0a46994ab8f7dbc0a7f063a059ca588" Mar 14 10:18:06 crc kubenswrapper[5129]: I0314 10:18:06.234747 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558058-qrwrn" Mar 14 10:18:06 crc kubenswrapper[5129]: I0314 10:18:06.901522 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558052-nm8bn"] Mar 14 10:18:06 crc kubenswrapper[5129]: I0314 10:18:06.917006 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558052-nm8bn"] Mar 14 10:18:08 crc kubenswrapper[5129]: I0314 10:18:08.048925 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e76b0f3-4a63-41f8-bf7d-41057276f813" path="/var/lib/kubelet/pods/7e76b0f3-4a63-41f8-bf7d-41057276f813/volumes" Mar 14 10:18:19 crc kubenswrapper[5129]: I0314 10:18:19.574617 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:18:19 crc kubenswrapper[5129]: I0314 10:18:19.575187 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:18:19 crc kubenswrapper[5129]: I0314 10:18:19.575242 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 10:18:19 crc kubenswrapper[5129]: I0314 10:18:19.576096 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 10:18:19 crc kubenswrapper[5129]: I0314 10:18:19.576149 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" gracePeriod=600 Mar 14 10:18:19 crc kubenswrapper[5129]: E0314 10:18:19.702372 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:18:20 crc kubenswrapper[5129]: I0314 10:18:20.390332 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" exitCode=0 Mar 14 10:18:20 crc kubenswrapper[5129]: I0314 10:18:20.390376 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1"} Mar 14 10:18:20 crc kubenswrapper[5129]: I0314 10:18:20.390795 5129 scope.go:117] "RemoveContainer" containerID="ea55a9c69c43b228fc76f8d80751a65180ad34d13576370de81b7c849d750b69" Mar 14 10:18:20 crc kubenswrapper[5129]: I0314 10:18:20.391761 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:18:20 crc kubenswrapper[5129]: E0314 10:18:20.392148 5129 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:18:23 crc kubenswrapper[5129]: I0314 10:18:23.801557 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8547d"] Mar 14 10:18:23 crc kubenswrapper[5129]: E0314 10:18:23.802478 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d36578-de52-49e4-b9b5-594afd90d22e" containerName="oc" Mar 14 10:18:23 crc kubenswrapper[5129]: I0314 10:18:23.802492 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d36578-de52-49e4-b9b5-594afd90d22e" containerName="oc" Mar 14 10:18:23 crc kubenswrapper[5129]: I0314 10:18:23.802722 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d36578-de52-49e4-b9b5-594afd90d22e" containerName="oc" Mar 14 10:18:23 crc kubenswrapper[5129]: I0314 10:18:23.804396 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8547d" Mar 14 10:18:23 crc kubenswrapper[5129]: I0314 10:18:23.822256 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8547d"] Mar 14 10:18:23 crc kubenswrapper[5129]: I0314 10:18:23.961256 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drp69\" (UniqueName: \"kubernetes.io/projected/ee645c0c-d59a-4f60-b9af-ba28d8efb25a-kube-api-access-drp69\") pod \"redhat-operators-8547d\" (UID: \"ee645c0c-d59a-4f60-b9af-ba28d8efb25a\") " pod="openshift-marketplace/redhat-operators-8547d" Mar 14 10:18:23 crc kubenswrapper[5129]: I0314 10:18:23.961767 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee645c0c-d59a-4f60-b9af-ba28d8efb25a-utilities\") pod \"redhat-operators-8547d\" (UID: \"ee645c0c-d59a-4f60-b9af-ba28d8efb25a\") " pod="openshift-marketplace/redhat-operators-8547d" Mar 14 10:18:23 crc kubenswrapper[5129]: I0314 10:18:23.961906 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee645c0c-d59a-4f60-b9af-ba28d8efb25a-catalog-content\") pod \"redhat-operators-8547d\" (UID: \"ee645c0c-d59a-4f60-b9af-ba28d8efb25a\") " pod="openshift-marketplace/redhat-operators-8547d" Mar 14 10:18:24 crc kubenswrapper[5129]: I0314 10:18:24.063647 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee645c0c-d59a-4f60-b9af-ba28d8efb25a-utilities\") pod \"redhat-operators-8547d\" (UID: \"ee645c0c-d59a-4f60-b9af-ba28d8efb25a\") " pod="openshift-marketplace/redhat-operators-8547d" Mar 14 10:18:24 crc kubenswrapper[5129]: I0314 10:18:24.064131 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee645c0c-d59a-4f60-b9af-ba28d8efb25a-catalog-content\") pod \"redhat-operators-8547d\" (UID: \"ee645c0c-d59a-4f60-b9af-ba28d8efb25a\") " pod="openshift-marketplace/redhat-operators-8547d" Mar 14 10:18:24 crc kubenswrapper[5129]: I0314 10:18:24.064336 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drp69\" (UniqueName: \"kubernetes.io/projected/ee645c0c-d59a-4f60-b9af-ba28d8efb25a-kube-api-access-drp69\") pod \"redhat-operators-8547d\" (UID: \"ee645c0c-d59a-4f60-b9af-ba28d8efb25a\") " pod="openshift-marketplace/redhat-operators-8547d" Mar 14 10:18:24 crc kubenswrapper[5129]: I0314 10:18:24.065210 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee645c0c-d59a-4f60-b9af-ba28d8efb25a-utilities\") pod \"redhat-operators-8547d\" (UID: \"ee645c0c-d59a-4f60-b9af-ba28d8efb25a\") " pod="openshift-marketplace/redhat-operators-8547d" Mar 14 10:18:24 crc kubenswrapper[5129]: I0314 10:18:24.065532 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee645c0c-d59a-4f60-b9af-ba28d8efb25a-catalog-content\") pod \"redhat-operators-8547d\" (UID: \"ee645c0c-d59a-4f60-b9af-ba28d8efb25a\") " pod="openshift-marketplace/redhat-operators-8547d" Mar 14 10:18:24 crc kubenswrapper[5129]: I0314 10:18:24.083660 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drp69\" (UniqueName: \"kubernetes.io/projected/ee645c0c-d59a-4f60-b9af-ba28d8efb25a-kube-api-access-drp69\") pod \"redhat-operators-8547d\" (UID: \"ee645c0c-d59a-4f60-b9af-ba28d8efb25a\") " pod="openshift-marketplace/redhat-operators-8547d" Mar 14 10:18:24 crc kubenswrapper[5129]: I0314 10:18:24.147957 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8547d" Mar 14 10:18:24 crc kubenswrapper[5129]: I0314 10:18:24.915682 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8547d"] Mar 14 10:18:25 crc kubenswrapper[5129]: I0314 10:18:25.453488 5129 generic.go:334] "Generic (PLEG): container finished" podID="ee645c0c-d59a-4f60-b9af-ba28d8efb25a" containerID="cbc08846dae5e4b5ef58abde09888d9081aa833eef8d9ba2ed4acd2c8befa064" exitCode=0 Mar 14 10:18:25 crc kubenswrapper[5129]: I0314 10:18:25.453764 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8547d" event={"ID":"ee645c0c-d59a-4f60-b9af-ba28d8efb25a","Type":"ContainerDied","Data":"cbc08846dae5e4b5ef58abde09888d9081aa833eef8d9ba2ed4acd2c8befa064"} Mar 14 10:18:25 crc kubenswrapper[5129]: I0314 10:18:25.454069 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8547d" event={"ID":"ee645c0c-d59a-4f60-b9af-ba28d8efb25a","Type":"ContainerStarted","Data":"4c55f9beb778b860fa8ee79eb0bc163727d142d0fc557b5db13f452d37b02a57"} Mar 14 10:18:26 crc kubenswrapper[5129]: I0314 10:18:26.474427 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8547d" event={"ID":"ee645c0c-d59a-4f60-b9af-ba28d8efb25a","Type":"ContainerStarted","Data":"823b50c664ca3bb33bec8e8b07d414b35ace7de40ecf9d648a9a1c2cab0fd25f"} Mar 14 10:18:31 crc kubenswrapper[5129]: I0314 10:18:31.564079 5129 generic.go:334] "Generic (PLEG): container finished" podID="ee645c0c-d59a-4f60-b9af-ba28d8efb25a" containerID="823b50c664ca3bb33bec8e8b07d414b35ace7de40ecf9d648a9a1c2cab0fd25f" exitCode=0 Mar 14 10:18:31 crc kubenswrapper[5129]: I0314 10:18:31.564386 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8547d" 
event={"ID":"ee645c0c-d59a-4f60-b9af-ba28d8efb25a","Type":"ContainerDied","Data":"823b50c664ca3bb33bec8e8b07d414b35ace7de40ecf9d648a9a1c2cab0fd25f"} Mar 14 10:18:32 crc kubenswrapper[5129]: I0314 10:18:32.037051 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:18:32 crc kubenswrapper[5129]: E0314 10:18:32.037343 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:18:32 crc kubenswrapper[5129]: I0314 10:18:32.577486 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8547d" event={"ID":"ee645c0c-d59a-4f60-b9af-ba28d8efb25a","Type":"ContainerStarted","Data":"acaeffb21dfe277bba2c28af06a1e8a936675cd2452ebe3d17974d2c6437ac55"} Mar 14 10:18:32 crc kubenswrapper[5129]: I0314 10:18:32.603211 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8547d" podStartSLOduration=3.079763408 podStartE2EDuration="9.603188982s" podCreationTimestamp="2026-03-14 10:18:23 +0000 UTC" firstStartedPulling="2026-03-14 10:18:25.45557594 +0000 UTC m=+11968.207491124" lastFinishedPulling="2026-03-14 10:18:31.979001514 +0000 UTC m=+11974.730916698" observedRunningTime="2026-03-14 10:18:32.597188439 +0000 UTC m=+11975.349103623" watchObservedRunningTime="2026-03-14 10:18:32.603188982 +0000 UTC m=+11975.355104176" Mar 14 10:18:34 crc kubenswrapper[5129]: I0314 10:18:34.148058 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8547d" Mar 14 10:18:34 crc 
kubenswrapper[5129]: I0314 10:18:34.148338 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8547d" Mar 14 10:18:35 crc kubenswrapper[5129]: I0314 10:18:35.205461 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8547d" podUID="ee645c0c-d59a-4f60-b9af-ba28d8efb25a" containerName="registry-server" probeResult="failure" output=< Mar 14 10:18:35 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:18:35 crc kubenswrapper[5129]: > Mar 14 10:18:41 crc kubenswrapper[5129]: I0314 10:18:41.516030 5129 scope.go:117] "RemoveContainer" containerID="417cb6e18b7733b2b622ddfd5456bc11968427d100e4cd1b243940fbca03946b" Mar 14 10:18:41 crc kubenswrapper[5129]: I0314 10:18:41.543564 5129 scope.go:117] "RemoveContainer" containerID="d2f42ded749c2e5ef1844064e827669f0255b18aa3fe34b11bbe4451c192758c" Mar 14 10:18:41 crc kubenswrapper[5129]: I0314 10:18:41.620848 5129 scope.go:117] "RemoveContainer" containerID="de4ae6f322f49ca3bcd9cdcdf988f0a2c653f8161c92110cd8d8cafc8ff1617f" Mar 14 10:18:41 crc kubenswrapper[5129]: I0314 10:18:41.656400 5129 scope.go:117] "RemoveContainer" containerID="5662ae3ed896bb098aa6e771af3e13e66763a24f51d21092270e116913faf162" Mar 14 10:18:45 crc kubenswrapper[5129]: I0314 10:18:45.207563 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8547d" podUID="ee645c0c-d59a-4f60-b9af-ba28d8efb25a" containerName="registry-server" probeResult="failure" output=< Mar 14 10:18:45 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:18:45 crc kubenswrapper[5129]: > Mar 14 10:18:47 crc kubenswrapper[5129]: I0314 10:18:47.036579 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:18:47 crc kubenswrapper[5129]: E0314 10:18:47.037961 5129 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:18:55 crc kubenswrapper[5129]: I0314 10:18:55.199219 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8547d" podUID="ee645c0c-d59a-4f60-b9af-ba28d8efb25a" containerName="registry-server" probeResult="failure" output=< Mar 14 10:18:55 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:18:55 crc kubenswrapper[5129]: > Mar 14 10:19:00 crc kubenswrapper[5129]: I0314 10:19:00.037385 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:19:00 crc kubenswrapper[5129]: E0314 10:19:00.038349 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:19:05 crc kubenswrapper[5129]: I0314 10:19:05.205491 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8547d" podUID="ee645c0c-d59a-4f60-b9af-ba28d8efb25a" containerName="registry-server" probeResult="failure" output=< Mar 14 10:19:05 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:19:05 crc kubenswrapper[5129]: > Mar 14 10:19:12 crc kubenswrapper[5129]: I0314 10:19:12.037285 5129 scope.go:117] 
"RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:19:12 crc kubenswrapper[5129]: E0314 10:19:12.038011 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:19:14 crc kubenswrapper[5129]: I0314 10:19:14.196930 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8547d" Mar 14 10:19:14 crc kubenswrapper[5129]: I0314 10:19:14.269429 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8547d" Mar 14 10:19:14 crc kubenswrapper[5129]: I0314 10:19:14.432980 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8547d"] Mar 14 10:19:16 crc kubenswrapper[5129]: I0314 10:19:16.033058 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8547d" podUID="ee645c0c-d59a-4f60-b9af-ba28d8efb25a" containerName="registry-server" containerID="cri-o://acaeffb21dfe277bba2c28af06a1e8a936675cd2452ebe3d17974d2c6437ac55" gracePeriod=2 Mar 14 10:19:17 crc kubenswrapper[5129]: I0314 10:19:17.057594 5129 generic.go:334] "Generic (PLEG): container finished" podID="ee645c0c-d59a-4f60-b9af-ba28d8efb25a" containerID="acaeffb21dfe277bba2c28af06a1e8a936675cd2452ebe3d17974d2c6437ac55" exitCode=0 Mar 14 10:19:17 crc kubenswrapper[5129]: I0314 10:19:17.058045 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8547d" 
event={"ID":"ee645c0c-d59a-4f60-b9af-ba28d8efb25a","Type":"ContainerDied","Data":"acaeffb21dfe277bba2c28af06a1e8a936675cd2452ebe3d17974d2c6437ac55"} Mar 14 10:19:17 crc kubenswrapper[5129]: I0314 10:19:17.791009 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8547d" Mar 14 10:19:17 crc kubenswrapper[5129]: I0314 10:19:17.898901 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drp69\" (UniqueName: \"kubernetes.io/projected/ee645c0c-d59a-4f60-b9af-ba28d8efb25a-kube-api-access-drp69\") pod \"ee645c0c-d59a-4f60-b9af-ba28d8efb25a\" (UID: \"ee645c0c-d59a-4f60-b9af-ba28d8efb25a\") " Mar 14 10:19:17 crc kubenswrapper[5129]: I0314 10:19:17.898990 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee645c0c-d59a-4f60-b9af-ba28d8efb25a-utilities\") pod \"ee645c0c-d59a-4f60-b9af-ba28d8efb25a\" (UID: \"ee645c0c-d59a-4f60-b9af-ba28d8efb25a\") " Mar 14 10:19:17 crc kubenswrapper[5129]: I0314 10:19:17.899047 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee645c0c-d59a-4f60-b9af-ba28d8efb25a-catalog-content\") pod \"ee645c0c-d59a-4f60-b9af-ba28d8efb25a\" (UID: \"ee645c0c-d59a-4f60-b9af-ba28d8efb25a\") " Mar 14 10:19:17 crc kubenswrapper[5129]: I0314 10:19:17.903914 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee645c0c-d59a-4f60-b9af-ba28d8efb25a-utilities" (OuterVolumeSpecName: "utilities") pod "ee645c0c-d59a-4f60-b9af-ba28d8efb25a" (UID: "ee645c0c-d59a-4f60-b9af-ba28d8efb25a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:19:17 crc kubenswrapper[5129]: I0314 10:19:17.937868 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee645c0c-d59a-4f60-b9af-ba28d8efb25a-kube-api-access-drp69" (OuterVolumeSpecName: "kube-api-access-drp69") pod "ee645c0c-d59a-4f60-b9af-ba28d8efb25a" (UID: "ee645c0c-d59a-4f60-b9af-ba28d8efb25a"). InnerVolumeSpecName "kube-api-access-drp69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:19:18 crc kubenswrapper[5129]: I0314 10:19:18.006750 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drp69\" (UniqueName: \"kubernetes.io/projected/ee645c0c-d59a-4f60-b9af-ba28d8efb25a-kube-api-access-drp69\") on node \"crc\" DevicePath \"\"" Mar 14 10:19:18 crc kubenswrapper[5129]: I0314 10:19:18.006784 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee645c0c-d59a-4f60-b9af-ba28d8efb25a-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 10:19:18 crc kubenswrapper[5129]: I0314 10:19:18.070097 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8547d" event={"ID":"ee645c0c-d59a-4f60-b9af-ba28d8efb25a","Type":"ContainerDied","Data":"4c55f9beb778b860fa8ee79eb0bc163727d142d0fc557b5db13f452d37b02a57"} Mar 14 10:19:18 crc kubenswrapper[5129]: I0314 10:19:18.070147 5129 scope.go:117] "RemoveContainer" containerID="acaeffb21dfe277bba2c28af06a1e8a936675cd2452ebe3d17974d2c6437ac55" Mar 14 10:19:18 crc kubenswrapper[5129]: I0314 10:19:18.070290 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8547d" Mar 14 10:19:18 crc kubenswrapper[5129]: I0314 10:19:18.079205 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee645c0c-d59a-4f60-b9af-ba28d8efb25a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee645c0c-d59a-4f60-b9af-ba28d8efb25a" (UID: "ee645c0c-d59a-4f60-b9af-ba28d8efb25a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:19:18 crc kubenswrapper[5129]: I0314 10:19:18.109771 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee645c0c-d59a-4f60-b9af-ba28d8efb25a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 10:19:18 crc kubenswrapper[5129]: I0314 10:19:18.148161 5129 scope.go:117] "RemoveContainer" containerID="823b50c664ca3bb33bec8e8b07d414b35ace7de40ecf9d648a9a1c2cab0fd25f" Mar 14 10:19:18 crc kubenswrapper[5129]: I0314 10:19:18.177818 5129 scope.go:117] "RemoveContainer" containerID="cbc08846dae5e4b5ef58abde09888d9081aa833eef8d9ba2ed4acd2c8befa064" Mar 14 10:19:18 crc kubenswrapper[5129]: I0314 10:19:18.414028 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8547d"] Mar 14 10:19:18 crc kubenswrapper[5129]: I0314 10:19:18.433142 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8547d"] Mar 14 10:19:20 crc kubenswrapper[5129]: I0314 10:19:20.055107 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee645c0c-d59a-4f60-b9af-ba28d8efb25a" path="/var/lib/kubelet/pods/ee645c0c-d59a-4f60-b9af-ba28d8efb25a/volumes" Mar 14 10:19:24 crc kubenswrapper[5129]: I0314 10:19:24.036912 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:19:24 crc kubenswrapper[5129]: E0314 10:19:24.037677 5129 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:19:37 crc kubenswrapper[5129]: I0314 10:19:37.036725 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:19:37 crc kubenswrapper[5129]: E0314 10:19:37.037630 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:19:48 crc kubenswrapper[5129]: I0314 10:19:48.044347 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:19:48 crc kubenswrapper[5129]: E0314 10:19:48.045228 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:20:00 crc kubenswrapper[5129]: I0314 10:20:00.167753 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558060-rv492"] Mar 14 10:20:00 crc kubenswrapper[5129]: E0314 
10:20:00.168910 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee645c0c-d59a-4f60-b9af-ba28d8efb25a" containerName="extract-utilities" Mar 14 10:20:00 crc kubenswrapper[5129]: I0314 10:20:00.168926 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee645c0c-d59a-4f60-b9af-ba28d8efb25a" containerName="extract-utilities" Mar 14 10:20:00 crc kubenswrapper[5129]: E0314 10:20:00.168956 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee645c0c-d59a-4f60-b9af-ba28d8efb25a" containerName="registry-server" Mar 14 10:20:00 crc kubenswrapper[5129]: I0314 10:20:00.168963 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee645c0c-d59a-4f60-b9af-ba28d8efb25a" containerName="registry-server" Mar 14 10:20:00 crc kubenswrapper[5129]: E0314 10:20:00.168981 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee645c0c-d59a-4f60-b9af-ba28d8efb25a" containerName="extract-content" Mar 14 10:20:00 crc kubenswrapper[5129]: I0314 10:20:00.168990 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee645c0c-d59a-4f60-b9af-ba28d8efb25a" containerName="extract-content" Mar 14 10:20:00 crc kubenswrapper[5129]: I0314 10:20:00.169295 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee645c0c-d59a-4f60-b9af-ba28d8efb25a" containerName="registry-server" Mar 14 10:20:00 crc kubenswrapper[5129]: I0314 10:20:00.170087 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558060-rv492" Mar 14 10:20:00 crc kubenswrapper[5129]: I0314 10:20:00.172665 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:20:00 crc kubenswrapper[5129]: I0314 10:20:00.172685 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:20:00 crc kubenswrapper[5129]: I0314 10:20:00.172738 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:20:00 crc kubenswrapper[5129]: I0314 10:20:00.190619 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558060-rv492"] Mar 14 10:20:00 crc kubenswrapper[5129]: I0314 10:20:00.316238 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhnqn\" (UniqueName: \"kubernetes.io/projected/f9f66657-de9c-46ad-b19e-eec843ebc32e-kube-api-access-mhnqn\") pod \"auto-csr-approver-29558060-rv492\" (UID: \"f9f66657-de9c-46ad-b19e-eec843ebc32e\") " pod="openshift-infra/auto-csr-approver-29558060-rv492" Mar 14 10:20:00 crc kubenswrapper[5129]: I0314 10:20:00.418215 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhnqn\" (UniqueName: \"kubernetes.io/projected/f9f66657-de9c-46ad-b19e-eec843ebc32e-kube-api-access-mhnqn\") pod \"auto-csr-approver-29558060-rv492\" (UID: \"f9f66657-de9c-46ad-b19e-eec843ebc32e\") " pod="openshift-infra/auto-csr-approver-29558060-rv492" Mar 14 10:20:00 crc kubenswrapper[5129]: I0314 10:20:00.445674 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhnqn\" (UniqueName: \"kubernetes.io/projected/f9f66657-de9c-46ad-b19e-eec843ebc32e-kube-api-access-mhnqn\") pod \"auto-csr-approver-29558060-rv492\" (UID: \"f9f66657-de9c-46ad-b19e-eec843ebc32e\") " 
pod="openshift-infra/auto-csr-approver-29558060-rv492" Mar 14 10:20:00 crc kubenswrapper[5129]: I0314 10:20:00.505301 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558060-rv492" Mar 14 10:20:01 crc kubenswrapper[5129]: I0314 10:20:01.499646 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558060-rv492"] Mar 14 10:20:01 crc kubenswrapper[5129]: W0314 10:20:01.503483 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9f66657_de9c_46ad_b19e_eec843ebc32e.slice/crio-390264d18a864edf672b8cb8d4bbfa24d80e177d1cae384890b8485e04f2fbf6 WatchSource:0}: Error finding container 390264d18a864edf672b8cb8d4bbfa24d80e177d1cae384890b8485e04f2fbf6: Status 404 returned error can't find the container with id 390264d18a864edf672b8cb8d4bbfa24d80e177d1cae384890b8485e04f2fbf6 Mar 14 10:20:01 crc kubenswrapper[5129]: I0314 10:20:01.545295 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558060-rv492" event={"ID":"f9f66657-de9c-46ad-b19e-eec843ebc32e","Type":"ContainerStarted","Data":"390264d18a864edf672b8cb8d4bbfa24d80e177d1cae384890b8485e04f2fbf6"} Mar 14 10:20:03 crc kubenswrapper[5129]: I0314 10:20:03.036448 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:20:03 crc kubenswrapper[5129]: E0314 10:20:03.037098 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:20:03 crc 
kubenswrapper[5129]: I0314 10:20:03.565240 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558060-rv492" event={"ID":"f9f66657-de9c-46ad-b19e-eec843ebc32e","Type":"ContainerStarted","Data":"1098cddd9be3044bc08c5433eedcf1d91c4a8a1077ea4eb2ad490816f1981323"} Mar 14 10:20:04 crc kubenswrapper[5129]: I0314 10:20:04.577121 5129 generic.go:334] "Generic (PLEG): container finished" podID="f9f66657-de9c-46ad-b19e-eec843ebc32e" containerID="1098cddd9be3044bc08c5433eedcf1d91c4a8a1077ea4eb2ad490816f1981323" exitCode=0 Mar 14 10:20:04 crc kubenswrapper[5129]: I0314 10:20:04.577216 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558060-rv492" event={"ID":"f9f66657-de9c-46ad-b19e-eec843ebc32e","Type":"ContainerDied","Data":"1098cddd9be3044bc08c5433eedcf1d91c4a8a1077ea4eb2ad490816f1981323"} Mar 14 10:20:06 crc kubenswrapper[5129]: I0314 10:20:06.597971 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558060-rv492" event={"ID":"f9f66657-de9c-46ad-b19e-eec843ebc32e","Type":"ContainerDied","Data":"390264d18a864edf672b8cb8d4bbfa24d80e177d1cae384890b8485e04f2fbf6"} Mar 14 10:20:06 crc kubenswrapper[5129]: I0314 10:20:06.598402 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="390264d18a864edf672b8cb8d4bbfa24d80e177d1cae384890b8485e04f2fbf6" Mar 14 10:20:06 crc kubenswrapper[5129]: I0314 10:20:06.624400 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558060-rv492" Mar 14 10:20:06 crc kubenswrapper[5129]: I0314 10:20:06.668965 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhnqn\" (UniqueName: \"kubernetes.io/projected/f9f66657-de9c-46ad-b19e-eec843ebc32e-kube-api-access-mhnqn\") pod \"f9f66657-de9c-46ad-b19e-eec843ebc32e\" (UID: \"f9f66657-de9c-46ad-b19e-eec843ebc32e\") " Mar 14 10:20:06 crc kubenswrapper[5129]: I0314 10:20:06.686177 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9f66657-de9c-46ad-b19e-eec843ebc32e-kube-api-access-mhnqn" (OuterVolumeSpecName: "kube-api-access-mhnqn") pod "f9f66657-de9c-46ad-b19e-eec843ebc32e" (UID: "f9f66657-de9c-46ad-b19e-eec843ebc32e"). InnerVolumeSpecName "kube-api-access-mhnqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:20:06 crc kubenswrapper[5129]: I0314 10:20:06.773592 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhnqn\" (UniqueName: \"kubernetes.io/projected/f9f66657-de9c-46ad-b19e-eec843ebc32e-kube-api-access-mhnqn\") on node \"crc\" DevicePath \"\"" Mar 14 10:20:07 crc kubenswrapper[5129]: I0314 10:20:07.606985 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558060-rv492" Mar 14 10:20:07 crc kubenswrapper[5129]: I0314 10:20:07.695578 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558054-cqqrl"] Mar 14 10:20:07 crc kubenswrapper[5129]: I0314 10:20:07.707064 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558054-cqqrl"] Mar 14 10:20:08 crc kubenswrapper[5129]: I0314 10:20:08.050160 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01532751-0be7-49a7-95d2-d5251776e030" path="/var/lib/kubelet/pods/01532751-0be7-49a7-95d2-d5251776e030/volumes" Mar 14 10:20:14 crc kubenswrapper[5129]: I0314 10:20:14.036717 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:20:14 crc kubenswrapper[5129]: E0314 10:20:14.037544 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:20:28 crc kubenswrapper[5129]: I0314 10:20:28.048438 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:20:28 crc kubenswrapper[5129]: E0314 10:20:28.050181 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:20:39 crc kubenswrapper[5129]: I0314 10:20:39.036766 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:20:39 crc kubenswrapper[5129]: E0314 10:20:39.037751 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:20:41 crc kubenswrapper[5129]: I0314 10:20:41.952118 5129 scope.go:117] "RemoveContainer" containerID="276c5ab932191ba7a0966403f5b2ea7aa3a97c558235e3074dc4d1385d4ac574" Mar 14 10:20:51 crc kubenswrapper[5129]: I0314 10:20:51.036201 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:20:51 crc kubenswrapper[5129]: E0314 10:20:51.037103 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:21:03 crc kubenswrapper[5129]: I0314 10:21:03.036382 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:21:03 crc kubenswrapper[5129]: E0314 10:21:03.037323 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:21:18 crc kubenswrapper[5129]: I0314 10:21:18.043641 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:21:18 crc kubenswrapper[5129]: E0314 10:21:18.044340 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:21:30 crc kubenswrapper[5129]: I0314 10:21:30.037536 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:21:30 crc kubenswrapper[5129]: E0314 10:21:30.038537 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:21:45 crc kubenswrapper[5129]: I0314 10:21:45.039182 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:21:45 crc kubenswrapper[5129]: E0314 10:21:45.040085 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:21:58 crc kubenswrapper[5129]: I0314 10:21:58.047929 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:21:58 crc kubenswrapper[5129]: E0314 10:21:58.049518 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:22:00 crc kubenswrapper[5129]: I0314 10:22:00.148822 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558062-9lnqq"] Mar 14 10:22:00 crc kubenswrapper[5129]: E0314 10:22:00.149753 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f66657-de9c-46ad-b19e-eec843ebc32e" containerName="oc" Mar 14 10:22:00 crc kubenswrapper[5129]: I0314 10:22:00.149769 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f66657-de9c-46ad-b19e-eec843ebc32e" containerName="oc" Mar 14 10:22:00 crc kubenswrapper[5129]: I0314 10:22:00.150071 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9f66657-de9c-46ad-b19e-eec843ebc32e" containerName="oc" Mar 14 10:22:00 crc kubenswrapper[5129]: I0314 10:22:00.150992 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558062-9lnqq" Mar 14 10:22:00 crc kubenswrapper[5129]: I0314 10:22:00.152995 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:22:00 crc kubenswrapper[5129]: I0314 10:22:00.153726 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:22:00 crc kubenswrapper[5129]: I0314 10:22:00.159263 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:22:00 crc kubenswrapper[5129]: I0314 10:22:00.164064 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558062-9lnqq"] Mar 14 10:22:00 crc kubenswrapper[5129]: I0314 10:22:00.279882 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-469px\" (UniqueName: \"kubernetes.io/projected/84efb040-65b9-4606-baf4-01a89cc560fa-kube-api-access-469px\") pod \"auto-csr-approver-29558062-9lnqq\" (UID: \"84efb040-65b9-4606-baf4-01a89cc560fa\") " pod="openshift-infra/auto-csr-approver-29558062-9lnqq" Mar 14 10:22:00 crc kubenswrapper[5129]: I0314 10:22:00.381906 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-469px\" (UniqueName: \"kubernetes.io/projected/84efb040-65b9-4606-baf4-01a89cc560fa-kube-api-access-469px\") pod \"auto-csr-approver-29558062-9lnqq\" (UID: \"84efb040-65b9-4606-baf4-01a89cc560fa\") " pod="openshift-infra/auto-csr-approver-29558062-9lnqq" Mar 14 10:22:00 crc kubenswrapper[5129]: I0314 10:22:00.401134 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-469px\" (UniqueName: \"kubernetes.io/projected/84efb040-65b9-4606-baf4-01a89cc560fa-kube-api-access-469px\") pod \"auto-csr-approver-29558062-9lnqq\" (UID: \"84efb040-65b9-4606-baf4-01a89cc560fa\") " 
pod="openshift-infra/auto-csr-approver-29558062-9lnqq" Mar 14 10:22:00 crc kubenswrapper[5129]: I0314 10:22:00.473957 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558062-9lnqq" Mar 14 10:22:01 crc kubenswrapper[5129]: I0314 10:22:01.207441 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558062-9lnqq"] Mar 14 10:22:01 crc kubenswrapper[5129]: I0314 10:22:01.806508 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558062-9lnqq" event={"ID":"84efb040-65b9-4606-baf4-01a89cc560fa","Type":"ContainerStarted","Data":"c81fa00fbbe23e46be86a1ab1f559fdd9a1aab840a9e51bba32b8a854a406376"} Mar 14 10:22:02 crc kubenswrapper[5129]: I0314 10:22:02.818694 5129 generic.go:334] "Generic (PLEG): container finished" podID="84efb040-65b9-4606-baf4-01a89cc560fa" containerID="3329970bd44721c6aad890786ec71ddb48a7e2b25283bc9ab788f7fed7e05f22" exitCode=0 Mar 14 10:22:02 crc kubenswrapper[5129]: I0314 10:22:02.818787 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558062-9lnqq" event={"ID":"84efb040-65b9-4606-baf4-01a89cc560fa","Type":"ContainerDied","Data":"3329970bd44721c6aad890786ec71ddb48a7e2b25283bc9ab788f7fed7e05f22"} Mar 14 10:22:05 crc kubenswrapper[5129]: I0314 10:22:05.403090 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558062-9lnqq" Mar 14 10:22:05 crc kubenswrapper[5129]: I0314 10:22:05.497076 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-469px\" (UniqueName: \"kubernetes.io/projected/84efb040-65b9-4606-baf4-01a89cc560fa-kube-api-access-469px\") pod \"84efb040-65b9-4606-baf4-01a89cc560fa\" (UID: \"84efb040-65b9-4606-baf4-01a89cc560fa\") " Mar 14 10:22:05 crc kubenswrapper[5129]: I0314 10:22:05.510963 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84efb040-65b9-4606-baf4-01a89cc560fa-kube-api-access-469px" (OuterVolumeSpecName: "kube-api-access-469px") pod "84efb040-65b9-4606-baf4-01a89cc560fa" (UID: "84efb040-65b9-4606-baf4-01a89cc560fa"). InnerVolumeSpecName "kube-api-access-469px". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:22:05 crc kubenswrapper[5129]: I0314 10:22:05.599698 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-469px\" (UniqueName: \"kubernetes.io/projected/84efb040-65b9-4606-baf4-01a89cc560fa-kube-api-access-469px\") on node \"crc\" DevicePath \"\"" Mar 14 10:22:05 crc kubenswrapper[5129]: I0314 10:22:05.854173 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558062-9lnqq" event={"ID":"84efb040-65b9-4606-baf4-01a89cc560fa","Type":"ContainerDied","Data":"c81fa00fbbe23e46be86a1ab1f559fdd9a1aab840a9e51bba32b8a854a406376"} Mar 14 10:22:05 crc kubenswrapper[5129]: I0314 10:22:05.854218 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c81fa00fbbe23e46be86a1ab1f559fdd9a1aab840a9e51bba32b8a854a406376" Mar 14 10:22:05 crc kubenswrapper[5129]: I0314 10:22:05.854288 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558062-9lnqq" Mar 14 10:22:06 crc kubenswrapper[5129]: I0314 10:22:06.479088 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558056-ljc6z"] Mar 14 10:22:06 crc kubenswrapper[5129]: I0314 10:22:06.492041 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558056-ljc6z"] Mar 14 10:22:08 crc kubenswrapper[5129]: I0314 10:22:08.053377 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6ed34c3-be44-4144-9157-21e052ca247f" path="/var/lib/kubelet/pods/e6ed34c3-be44-4144-9157-21e052ca247f/volumes" Mar 14 10:22:11 crc kubenswrapper[5129]: I0314 10:22:11.037665 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:22:11 crc kubenswrapper[5129]: E0314 10:22:11.038505 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:22:26 crc kubenswrapper[5129]: I0314 10:22:26.036505 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:22:26 crc kubenswrapper[5129]: E0314 10:22:26.037252 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:22:39 crc kubenswrapper[5129]: I0314 10:22:39.036826 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:22:39 crc kubenswrapper[5129]: E0314 10:22:39.037471 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:22:42 crc kubenswrapper[5129]: I0314 10:22:42.057120 5129 scope.go:117] "RemoveContainer" containerID="5843014066fe8848ce87502046929dda076525bb462301cce89271a23eca35ce" Mar 14 10:22:50 crc kubenswrapper[5129]: I0314 10:22:50.039076 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:22:50 crc kubenswrapper[5129]: E0314 10:22:50.039852 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:23:05 crc kubenswrapper[5129]: I0314 10:23:05.037161 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:23:05 crc kubenswrapper[5129]: E0314 10:23:05.037996 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:23:20 crc kubenswrapper[5129]: I0314 10:23:20.036942 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:23:20 crc kubenswrapper[5129]: I0314 10:23:20.587569 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"52581f64e7a71a41ddbfabf547cebc9449637f9a4f1d89909d5ad911bf52828b"} Mar 14 10:23:48 crc kubenswrapper[5129]: I0314 10:23:48.251402 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wvbzv"] Mar 14 10:23:48 crc kubenswrapper[5129]: E0314 10:23:48.252363 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84efb040-65b9-4606-baf4-01a89cc560fa" containerName="oc" Mar 14 10:23:48 crc kubenswrapper[5129]: I0314 10:23:48.252374 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="84efb040-65b9-4606-baf4-01a89cc560fa" containerName="oc" Mar 14 10:23:48 crc kubenswrapper[5129]: I0314 10:23:48.252690 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="84efb040-65b9-4606-baf4-01a89cc560fa" containerName="oc" Mar 14 10:23:48 crc kubenswrapper[5129]: I0314 10:23:48.254290 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wvbzv" Mar 14 10:23:48 crc kubenswrapper[5129]: I0314 10:23:48.281538 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wvbzv"] Mar 14 10:23:48 crc kubenswrapper[5129]: I0314 10:23:48.330363 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81454d8-5034-45a0-8050-2b3042c2817f-utilities\") pod \"community-operators-wvbzv\" (UID: \"f81454d8-5034-45a0-8050-2b3042c2817f\") " pod="openshift-marketplace/community-operators-wvbzv" Mar 14 10:23:48 crc kubenswrapper[5129]: I0314 10:23:48.330445 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdcdt\" (UniqueName: \"kubernetes.io/projected/f81454d8-5034-45a0-8050-2b3042c2817f-kube-api-access-vdcdt\") pod \"community-operators-wvbzv\" (UID: \"f81454d8-5034-45a0-8050-2b3042c2817f\") " pod="openshift-marketplace/community-operators-wvbzv" Mar 14 10:23:48 crc kubenswrapper[5129]: I0314 10:23:48.330479 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81454d8-5034-45a0-8050-2b3042c2817f-catalog-content\") pod \"community-operators-wvbzv\" (UID: \"f81454d8-5034-45a0-8050-2b3042c2817f\") " pod="openshift-marketplace/community-operators-wvbzv" Mar 14 10:23:48 crc kubenswrapper[5129]: I0314 10:23:48.433453 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdcdt\" (UniqueName: \"kubernetes.io/projected/f81454d8-5034-45a0-8050-2b3042c2817f-kube-api-access-vdcdt\") pod \"community-operators-wvbzv\" (UID: \"f81454d8-5034-45a0-8050-2b3042c2817f\") " pod="openshift-marketplace/community-operators-wvbzv" Mar 14 10:23:48 crc kubenswrapper[5129]: I0314 10:23:48.433557 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81454d8-5034-45a0-8050-2b3042c2817f-catalog-content\") pod \"community-operators-wvbzv\" (UID: \"f81454d8-5034-45a0-8050-2b3042c2817f\") " pod="openshift-marketplace/community-operators-wvbzv" Mar 14 10:23:48 crc kubenswrapper[5129]: I0314 10:23:48.433787 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81454d8-5034-45a0-8050-2b3042c2817f-utilities\") pod \"community-operators-wvbzv\" (UID: \"f81454d8-5034-45a0-8050-2b3042c2817f\") " pod="openshift-marketplace/community-operators-wvbzv" Mar 14 10:23:48 crc kubenswrapper[5129]: I0314 10:23:48.434076 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81454d8-5034-45a0-8050-2b3042c2817f-catalog-content\") pod \"community-operators-wvbzv\" (UID: \"f81454d8-5034-45a0-8050-2b3042c2817f\") " pod="openshift-marketplace/community-operators-wvbzv" Mar 14 10:23:48 crc kubenswrapper[5129]: I0314 10:23:48.434288 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81454d8-5034-45a0-8050-2b3042c2817f-utilities\") pod \"community-operators-wvbzv\" (UID: \"f81454d8-5034-45a0-8050-2b3042c2817f\") " pod="openshift-marketplace/community-operators-wvbzv" Mar 14 10:23:48 crc kubenswrapper[5129]: I0314 10:23:48.457449 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdcdt\" (UniqueName: \"kubernetes.io/projected/f81454d8-5034-45a0-8050-2b3042c2817f-kube-api-access-vdcdt\") pod \"community-operators-wvbzv\" (UID: \"f81454d8-5034-45a0-8050-2b3042c2817f\") " pod="openshift-marketplace/community-operators-wvbzv" Mar 14 10:23:48 crc kubenswrapper[5129]: I0314 10:23:48.623861 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wvbzv" Mar 14 10:23:49 crc kubenswrapper[5129]: I0314 10:23:49.349728 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wvbzv"] Mar 14 10:23:49 crc kubenswrapper[5129]: I0314 10:23:49.897936 5129 generic.go:334] "Generic (PLEG): container finished" podID="f81454d8-5034-45a0-8050-2b3042c2817f" containerID="8ab848b18d315d8363bf26ceb9e13a8b22a038557fa3728194e94a3e0b68a903" exitCode=0 Mar 14 10:23:49 crc kubenswrapper[5129]: I0314 10:23:49.898145 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvbzv" event={"ID":"f81454d8-5034-45a0-8050-2b3042c2817f","Type":"ContainerDied","Data":"8ab848b18d315d8363bf26ceb9e13a8b22a038557fa3728194e94a3e0b68a903"} Mar 14 10:23:49 crc kubenswrapper[5129]: I0314 10:23:49.898299 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvbzv" event={"ID":"f81454d8-5034-45a0-8050-2b3042c2817f","Type":"ContainerStarted","Data":"113d6b27f5756f2977c91236a65513f2c1d20ddfb7aab86b502cc7236d3562d8"} Mar 14 10:23:49 crc kubenswrapper[5129]: I0314 10:23:49.899897 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 10:23:50 crc kubenswrapper[5129]: I0314 10:23:50.924504 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvbzv" event={"ID":"f81454d8-5034-45a0-8050-2b3042c2817f","Type":"ContainerStarted","Data":"ffebe2b218465da6463912c27cfd9c85815fe14416dac44517eb5ddc0b3ceb09"} Mar 14 10:23:52 crc kubenswrapper[5129]: I0314 10:23:52.947354 5129 generic.go:334] "Generic (PLEG): container finished" podID="f81454d8-5034-45a0-8050-2b3042c2817f" containerID="ffebe2b218465da6463912c27cfd9c85815fe14416dac44517eb5ddc0b3ceb09" exitCode=0 Mar 14 10:23:52 crc kubenswrapper[5129]: I0314 10:23:52.947396 5129 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-wvbzv" event={"ID":"f81454d8-5034-45a0-8050-2b3042c2817f","Type":"ContainerDied","Data":"ffebe2b218465da6463912c27cfd9c85815fe14416dac44517eb5ddc0b3ceb09"} Mar 14 10:23:53 crc kubenswrapper[5129]: I0314 10:23:53.959142 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvbzv" event={"ID":"f81454d8-5034-45a0-8050-2b3042c2817f","Type":"ContainerStarted","Data":"706cada2749d14aa45ffe66c6c915d4c2fa225e966524a3c9c0ff40a76139435"} Mar 14 10:23:53 crc kubenswrapper[5129]: I0314 10:23:53.986746 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wvbzv" podStartSLOduration=2.537236545 podStartE2EDuration="5.986724903s" podCreationTimestamp="2026-03-14 10:23:48 +0000 UTC" firstStartedPulling="2026-03-14 10:23:49.899667956 +0000 UTC m=+12292.651583140" lastFinishedPulling="2026-03-14 10:23:53.349156314 +0000 UTC m=+12296.101071498" observedRunningTime="2026-03-14 10:23:53.977645168 +0000 UTC m=+12296.729560342" watchObservedRunningTime="2026-03-14 10:23:53.986724903 +0000 UTC m=+12296.738640097" Mar 14 10:23:58 crc kubenswrapper[5129]: I0314 10:23:58.624784 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wvbzv" Mar 14 10:23:58 crc kubenswrapper[5129]: I0314 10:23:58.625728 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wvbzv" Mar 14 10:23:59 crc kubenswrapper[5129]: I0314 10:23:59.677356 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-wvbzv" podUID="f81454d8-5034-45a0-8050-2b3042c2817f" containerName="registry-server" probeResult="failure" output=< Mar 14 10:23:59 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:23:59 crc kubenswrapper[5129]: > Mar 14 10:24:00 crc 
kubenswrapper[5129]: I0314 10:24:00.166516 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558064-csc68"] Mar 14 10:24:00 crc kubenswrapper[5129]: I0314 10:24:00.167867 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558064-csc68" Mar 14 10:24:00 crc kubenswrapper[5129]: I0314 10:24:00.173116 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:24:00 crc kubenswrapper[5129]: I0314 10:24:00.173263 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:24:00 crc kubenswrapper[5129]: I0314 10:24:00.174781 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:24:00 crc kubenswrapper[5129]: I0314 10:24:00.179313 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558064-csc68"] Mar 14 10:24:00 crc kubenswrapper[5129]: I0314 10:24:00.298142 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cvnh\" (UniqueName: \"kubernetes.io/projected/8ab2d378-6795-4c41-a142-5c26ec16bc1b-kube-api-access-5cvnh\") pod \"auto-csr-approver-29558064-csc68\" (UID: \"8ab2d378-6795-4c41-a142-5c26ec16bc1b\") " pod="openshift-infra/auto-csr-approver-29558064-csc68" Mar 14 10:24:00 crc kubenswrapper[5129]: I0314 10:24:00.400243 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cvnh\" (UniqueName: \"kubernetes.io/projected/8ab2d378-6795-4c41-a142-5c26ec16bc1b-kube-api-access-5cvnh\") pod \"auto-csr-approver-29558064-csc68\" (UID: \"8ab2d378-6795-4c41-a142-5c26ec16bc1b\") " pod="openshift-infra/auto-csr-approver-29558064-csc68" Mar 14 10:24:00 crc kubenswrapper[5129]: I0314 10:24:00.422696 5129 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5cvnh\" (UniqueName: \"kubernetes.io/projected/8ab2d378-6795-4c41-a142-5c26ec16bc1b-kube-api-access-5cvnh\") pod \"auto-csr-approver-29558064-csc68\" (UID: \"8ab2d378-6795-4c41-a142-5c26ec16bc1b\") " pod="openshift-infra/auto-csr-approver-29558064-csc68" Mar 14 10:24:00 crc kubenswrapper[5129]: I0314 10:24:00.485408 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558064-csc68" Mar 14 10:24:01 crc kubenswrapper[5129]: I0314 10:24:01.263238 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558064-csc68"] Mar 14 10:24:02 crc kubenswrapper[5129]: I0314 10:24:02.030326 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558064-csc68" event={"ID":"8ab2d378-6795-4c41-a142-5c26ec16bc1b","Type":"ContainerStarted","Data":"b56fa36572a1bc791423e037c5cb247a55bf83d2a560b82941c7ce9427ea1031"} Mar 14 10:24:03 crc kubenswrapper[5129]: I0314 10:24:03.040890 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558064-csc68" event={"ID":"8ab2d378-6795-4c41-a142-5c26ec16bc1b","Type":"ContainerStarted","Data":"eeb51d7a699c802757a16ca7d250f7cd7bccf63628d56a8f33712d20e505bf34"} Mar 14 10:24:03 crc kubenswrapper[5129]: I0314 10:24:03.055667 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558064-csc68" podStartSLOduration=2.140816795 podStartE2EDuration="3.055649096s" podCreationTimestamp="2026-03-14 10:24:00 +0000 UTC" firstStartedPulling="2026-03-14 10:24:01.262558284 +0000 UTC m=+12304.014473468" lastFinishedPulling="2026-03-14 10:24:02.177390585 +0000 UTC m=+12304.929305769" observedRunningTime="2026-03-14 10:24:03.0546629 +0000 UTC m=+12305.806578094" watchObservedRunningTime="2026-03-14 10:24:03.055649096 +0000 UTC m=+12305.807564280" Mar 14 10:24:04 crc 
kubenswrapper[5129]: I0314 10:24:04.059690 5129 generic.go:334] "Generic (PLEG): container finished" podID="8ab2d378-6795-4c41-a142-5c26ec16bc1b" containerID="eeb51d7a699c802757a16ca7d250f7cd7bccf63628d56a8f33712d20e505bf34" exitCode=0 Mar 14 10:24:04 crc kubenswrapper[5129]: I0314 10:24:04.060069 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558064-csc68" event={"ID":"8ab2d378-6795-4c41-a142-5c26ec16bc1b","Type":"ContainerDied","Data":"eeb51d7a699c802757a16ca7d250f7cd7bccf63628d56a8f33712d20e505bf34"} Mar 14 10:24:06 crc kubenswrapper[5129]: I0314 10:24:06.815003 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558064-csc68" Mar 14 10:24:06 crc kubenswrapper[5129]: I0314 10:24:06.926751 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cvnh\" (UniqueName: \"kubernetes.io/projected/8ab2d378-6795-4c41-a142-5c26ec16bc1b-kube-api-access-5cvnh\") pod \"8ab2d378-6795-4c41-a142-5c26ec16bc1b\" (UID: \"8ab2d378-6795-4c41-a142-5c26ec16bc1b\") " Mar 14 10:24:06 crc kubenswrapper[5129]: I0314 10:24:06.932685 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ab2d378-6795-4c41-a142-5c26ec16bc1b-kube-api-access-5cvnh" (OuterVolumeSpecName: "kube-api-access-5cvnh") pod "8ab2d378-6795-4c41-a142-5c26ec16bc1b" (UID: "8ab2d378-6795-4c41-a142-5c26ec16bc1b"). InnerVolumeSpecName "kube-api-access-5cvnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:24:07 crc kubenswrapper[5129]: I0314 10:24:07.029777 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cvnh\" (UniqueName: \"kubernetes.io/projected/8ab2d378-6795-4c41-a142-5c26ec16bc1b-kube-api-access-5cvnh\") on node \"crc\" DevicePath \"\"" Mar 14 10:24:07 crc kubenswrapper[5129]: I0314 10:24:07.103304 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558064-csc68" event={"ID":"8ab2d378-6795-4c41-a142-5c26ec16bc1b","Type":"ContainerDied","Data":"b56fa36572a1bc791423e037c5cb247a55bf83d2a560b82941c7ce9427ea1031"} Mar 14 10:24:07 crc kubenswrapper[5129]: I0314 10:24:07.103339 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b56fa36572a1bc791423e037c5cb247a55bf83d2a560b82941c7ce9427ea1031" Mar 14 10:24:07 crc kubenswrapper[5129]: I0314 10:24:07.103388 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558064-csc68" Mar 14 10:24:07 crc kubenswrapper[5129]: I0314 10:24:07.932202 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558058-qrwrn"] Mar 14 10:24:07 crc kubenswrapper[5129]: I0314 10:24:07.948541 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558058-qrwrn"] Mar 14 10:24:08 crc kubenswrapper[5129]: I0314 10:24:08.062210 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82d36578-de52-49e4-b9b5-594afd90d22e" path="/var/lib/kubelet/pods/82d36578-de52-49e4-b9b5-594afd90d22e/volumes" Mar 14 10:24:09 crc kubenswrapper[5129]: I0314 10:24:09.673053 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-wvbzv" podUID="f81454d8-5034-45a0-8050-2b3042c2817f" containerName="registry-server" probeResult="failure" output=< Mar 14 10:24:09 crc kubenswrapper[5129]: 
timeout: failed to connect service ":50051" within 1s Mar 14 10:24:09 crc kubenswrapper[5129]: > Mar 14 10:24:18 crc kubenswrapper[5129]: I0314 10:24:18.687991 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wvbzv" Mar 14 10:24:18 crc kubenswrapper[5129]: I0314 10:24:18.743239 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wvbzv" Mar 14 10:24:19 crc kubenswrapper[5129]: I0314 10:24:19.458158 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wvbzv"] Mar 14 10:24:20 crc kubenswrapper[5129]: I0314 10:24:20.214494 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wvbzv" podUID="f81454d8-5034-45a0-8050-2b3042c2817f" containerName="registry-server" containerID="cri-o://706cada2749d14aa45ffe66c6c915d4c2fa225e966524a3c9c0ff40a76139435" gracePeriod=2 Mar 14 10:24:21 crc kubenswrapper[5129]: I0314 10:24:21.225354 5129 generic.go:334] "Generic (PLEG): container finished" podID="f81454d8-5034-45a0-8050-2b3042c2817f" containerID="706cada2749d14aa45ffe66c6c915d4c2fa225e966524a3c9c0ff40a76139435" exitCode=0 Mar 14 10:24:21 crc kubenswrapper[5129]: I0314 10:24:21.225430 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvbzv" event={"ID":"f81454d8-5034-45a0-8050-2b3042c2817f","Type":"ContainerDied","Data":"706cada2749d14aa45ffe66c6c915d4c2fa225e966524a3c9c0ff40a76139435"} Mar 14 10:24:21 crc kubenswrapper[5129]: I0314 10:24:21.642320 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wvbzv" Mar 14 10:24:21 crc kubenswrapper[5129]: I0314 10:24:21.744142 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81454d8-5034-45a0-8050-2b3042c2817f-catalog-content\") pod \"f81454d8-5034-45a0-8050-2b3042c2817f\" (UID: \"f81454d8-5034-45a0-8050-2b3042c2817f\") " Mar 14 10:24:21 crc kubenswrapper[5129]: I0314 10:24:21.744367 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81454d8-5034-45a0-8050-2b3042c2817f-utilities\") pod \"f81454d8-5034-45a0-8050-2b3042c2817f\" (UID: \"f81454d8-5034-45a0-8050-2b3042c2817f\") " Mar 14 10:24:21 crc kubenswrapper[5129]: I0314 10:24:21.744652 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdcdt\" (UniqueName: \"kubernetes.io/projected/f81454d8-5034-45a0-8050-2b3042c2817f-kube-api-access-vdcdt\") pod \"f81454d8-5034-45a0-8050-2b3042c2817f\" (UID: \"f81454d8-5034-45a0-8050-2b3042c2817f\") " Mar 14 10:24:21 crc kubenswrapper[5129]: I0314 10:24:21.747096 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f81454d8-5034-45a0-8050-2b3042c2817f-utilities" (OuterVolumeSpecName: "utilities") pod "f81454d8-5034-45a0-8050-2b3042c2817f" (UID: "f81454d8-5034-45a0-8050-2b3042c2817f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:24:21 crc kubenswrapper[5129]: I0314 10:24:21.763837 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f81454d8-5034-45a0-8050-2b3042c2817f-kube-api-access-vdcdt" (OuterVolumeSpecName: "kube-api-access-vdcdt") pod "f81454d8-5034-45a0-8050-2b3042c2817f" (UID: "f81454d8-5034-45a0-8050-2b3042c2817f"). InnerVolumeSpecName "kube-api-access-vdcdt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:24:21 crc kubenswrapper[5129]: I0314 10:24:21.802316 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f81454d8-5034-45a0-8050-2b3042c2817f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f81454d8-5034-45a0-8050-2b3042c2817f" (UID: "f81454d8-5034-45a0-8050-2b3042c2817f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:24:21 crc kubenswrapper[5129]: I0314 10:24:21.846899 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81454d8-5034-45a0-8050-2b3042c2817f-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 10:24:21 crc kubenswrapper[5129]: I0314 10:24:21.846935 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdcdt\" (UniqueName: \"kubernetes.io/projected/f81454d8-5034-45a0-8050-2b3042c2817f-kube-api-access-vdcdt\") on node \"crc\" DevicePath \"\"" Mar 14 10:24:21 crc kubenswrapper[5129]: I0314 10:24:21.846944 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81454d8-5034-45a0-8050-2b3042c2817f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 10:24:22 crc kubenswrapper[5129]: I0314 10:24:22.241248 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvbzv" event={"ID":"f81454d8-5034-45a0-8050-2b3042c2817f","Type":"ContainerDied","Data":"113d6b27f5756f2977c91236a65513f2c1d20ddfb7aab86b502cc7236d3562d8"} Mar 14 10:24:22 crc kubenswrapper[5129]: I0314 10:24:22.241297 5129 scope.go:117] "RemoveContainer" containerID="706cada2749d14aa45ffe66c6c915d4c2fa225e966524a3c9c0ff40a76139435" Mar 14 10:24:22 crc kubenswrapper[5129]: I0314 10:24:22.241434 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wvbzv" Mar 14 10:24:22 crc kubenswrapper[5129]: I0314 10:24:22.268751 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wvbzv"] Mar 14 10:24:22 crc kubenswrapper[5129]: I0314 10:24:22.280560 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wvbzv"] Mar 14 10:24:22 crc kubenswrapper[5129]: I0314 10:24:22.291276 5129 scope.go:117] "RemoveContainer" containerID="ffebe2b218465da6463912c27cfd9c85815fe14416dac44517eb5ddc0b3ceb09" Mar 14 10:24:22 crc kubenswrapper[5129]: I0314 10:24:22.328153 5129 scope.go:117] "RemoveContainer" containerID="8ab848b18d315d8363bf26ceb9e13a8b22a038557fa3728194e94a3e0b68a903" Mar 14 10:24:24 crc kubenswrapper[5129]: I0314 10:24:24.045958 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f81454d8-5034-45a0-8050-2b3042c2817f" path="/var/lib/kubelet/pods/f81454d8-5034-45a0-8050-2b3042c2817f/volumes" Mar 14 10:24:42 crc kubenswrapper[5129]: I0314 10:24:42.199760 5129 scope.go:117] "RemoveContainer" containerID="eb8c2bc4cf39f8bb5969396e344980942c23a6aa66805c7604cf5a5a97c6a4e0" Mar 14 10:25:49 crc kubenswrapper[5129]: I0314 10:25:49.574177 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:25:49 crc kubenswrapper[5129]: I0314 10:25:49.574828 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:26:00 crc kubenswrapper[5129]: 
I0314 10:26:00.143540 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558066-ggh66"] Mar 14 10:26:00 crc kubenswrapper[5129]: E0314 10:26:00.144519 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81454d8-5034-45a0-8050-2b3042c2817f" containerName="extract-content" Mar 14 10:26:00 crc kubenswrapper[5129]: I0314 10:26:00.144538 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81454d8-5034-45a0-8050-2b3042c2817f" containerName="extract-content" Mar 14 10:26:00 crc kubenswrapper[5129]: E0314 10:26:00.144593 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab2d378-6795-4c41-a142-5c26ec16bc1b" containerName="oc" Mar 14 10:26:00 crc kubenswrapper[5129]: I0314 10:26:00.144618 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab2d378-6795-4c41-a142-5c26ec16bc1b" containerName="oc" Mar 14 10:26:00 crc kubenswrapper[5129]: E0314 10:26:00.144636 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81454d8-5034-45a0-8050-2b3042c2817f" containerName="extract-utilities" Mar 14 10:26:00 crc kubenswrapper[5129]: I0314 10:26:00.144644 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81454d8-5034-45a0-8050-2b3042c2817f" containerName="extract-utilities" Mar 14 10:26:00 crc kubenswrapper[5129]: E0314 10:26:00.144684 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81454d8-5034-45a0-8050-2b3042c2817f" containerName="registry-server" Mar 14 10:26:00 crc kubenswrapper[5129]: I0314 10:26:00.144692 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81454d8-5034-45a0-8050-2b3042c2817f" containerName="registry-server" Mar 14 10:26:00 crc kubenswrapper[5129]: I0314 10:26:00.144913 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ab2d378-6795-4c41-a142-5c26ec16bc1b" containerName="oc" Mar 14 10:26:00 crc kubenswrapper[5129]: I0314 10:26:00.144944 5129 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f81454d8-5034-45a0-8050-2b3042c2817f" containerName="registry-server" Mar 14 10:26:00 crc kubenswrapper[5129]: I0314 10:26:00.145574 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558066-ggh66" Mar 14 10:26:00 crc kubenswrapper[5129]: I0314 10:26:00.148845 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:26:00 crc kubenswrapper[5129]: I0314 10:26:00.149055 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:26:00 crc kubenswrapper[5129]: I0314 10:26:00.149293 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:26:00 crc kubenswrapper[5129]: I0314 10:26:00.161039 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558066-ggh66"] Mar 14 10:26:00 crc kubenswrapper[5129]: I0314 10:26:00.206588 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llrk9\" (UniqueName: \"kubernetes.io/projected/d2917c36-d4d9-41b9-975b-76f8389b51a5-kube-api-access-llrk9\") pod \"auto-csr-approver-29558066-ggh66\" (UID: \"d2917c36-d4d9-41b9-975b-76f8389b51a5\") " pod="openshift-infra/auto-csr-approver-29558066-ggh66" Mar 14 10:26:00 crc kubenswrapper[5129]: I0314 10:26:00.308700 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llrk9\" (UniqueName: \"kubernetes.io/projected/d2917c36-d4d9-41b9-975b-76f8389b51a5-kube-api-access-llrk9\") pod \"auto-csr-approver-29558066-ggh66\" (UID: \"d2917c36-d4d9-41b9-975b-76f8389b51a5\") " pod="openshift-infra/auto-csr-approver-29558066-ggh66" Mar 14 10:26:00 crc kubenswrapper[5129]: I0314 10:26:00.349702 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-llrk9\" (UniqueName: \"kubernetes.io/projected/d2917c36-d4d9-41b9-975b-76f8389b51a5-kube-api-access-llrk9\") pod \"auto-csr-approver-29558066-ggh66\" (UID: \"d2917c36-d4d9-41b9-975b-76f8389b51a5\") " pod="openshift-infra/auto-csr-approver-29558066-ggh66" Mar 14 10:26:00 crc kubenswrapper[5129]: I0314 10:26:00.509920 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558066-ggh66" Mar 14 10:26:01 crc kubenswrapper[5129]: I0314 10:26:01.387001 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558066-ggh66"] Mar 14 10:26:02 crc kubenswrapper[5129]: I0314 10:26:02.233458 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558066-ggh66" event={"ID":"d2917c36-d4d9-41b9-975b-76f8389b51a5","Type":"ContainerStarted","Data":"51a21e141e8304f0ca5e7fa08b2748942a21883845e4018b547f4ff5487d483c"} Mar 14 10:26:03 crc kubenswrapper[5129]: I0314 10:26:03.243458 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558066-ggh66" event={"ID":"d2917c36-d4d9-41b9-975b-76f8389b51a5","Type":"ContainerStarted","Data":"1675f046a4e1f003ca77b96269cbb3f1050eee1dd6e3afac6f5fc4b6b5cca0ad"} Mar 14 10:26:03 crc kubenswrapper[5129]: I0314 10:26:03.263560 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558066-ggh66" podStartSLOduration=2.234335444 podStartE2EDuration="3.263542474s" podCreationTimestamp="2026-03-14 10:26:00 +0000 UTC" firstStartedPulling="2026-03-14 10:26:01.393316603 +0000 UTC m=+12424.145231787" lastFinishedPulling="2026-03-14 10:26:02.422523633 +0000 UTC m=+12425.174438817" observedRunningTime="2026-03-14 10:26:03.257763667 +0000 UTC m=+12426.009678851" watchObservedRunningTime="2026-03-14 10:26:03.263542474 +0000 UTC m=+12426.015457648" Mar 14 10:26:04 crc kubenswrapper[5129]: I0314 10:26:04.253993 5129 
generic.go:334] "Generic (PLEG): container finished" podID="d2917c36-d4d9-41b9-975b-76f8389b51a5" containerID="1675f046a4e1f003ca77b96269cbb3f1050eee1dd6e3afac6f5fc4b6b5cca0ad" exitCode=0 Mar 14 10:26:04 crc kubenswrapper[5129]: I0314 10:26:04.254060 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558066-ggh66" event={"ID":"d2917c36-d4d9-41b9-975b-76f8389b51a5","Type":"ContainerDied","Data":"1675f046a4e1f003ca77b96269cbb3f1050eee1dd6e3afac6f5fc4b6b5cca0ad"} Mar 14 10:26:06 crc kubenswrapper[5129]: I0314 10:26:06.833129 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558066-ggh66" Mar 14 10:26:06 crc kubenswrapper[5129]: I0314 10:26:06.977569 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llrk9\" (UniqueName: \"kubernetes.io/projected/d2917c36-d4d9-41b9-975b-76f8389b51a5-kube-api-access-llrk9\") pod \"d2917c36-d4d9-41b9-975b-76f8389b51a5\" (UID: \"d2917c36-d4d9-41b9-975b-76f8389b51a5\") " Mar 14 10:26:06 crc kubenswrapper[5129]: I0314 10:26:06.992411 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2917c36-d4d9-41b9-975b-76f8389b51a5-kube-api-access-llrk9" (OuterVolumeSpecName: "kube-api-access-llrk9") pod "d2917c36-d4d9-41b9-975b-76f8389b51a5" (UID: "d2917c36-d4d9-41b9-975b-76f8389b51a5"). InnerVolumeSpecName "kube-api-access-llrk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:26:07 crc kubenswrapper[5129]: I0314 10:26:07.079887 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llrk9\" (UniqueName: \"kubernetes.io/projected/d2917c36-d4d9-41b9-975b-76f8389b51a5-kube-api-access-llrk9\") on node \"crc\" DevicePath \"\"" Mar 14 10:26:07 crc kubenswrapper[5129]: I0314 10:26:07.290094 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558066-ggh66" event={"ID":"d2917c36-d4d9-41b9-975b-76f8389b51a5","Type":"ContainerDied","Data":"51a21e141e8304f0ca5e7fa08b2748942a21883845e4018b547f4ff5487d483c"} Mar 14 10:26:07 crc kubenswrapper[5129]: I0314 10:26:07.290138 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51a21e141e8304f0ca5e7fa08b2748942a21883845e4018b547f4ff5487d483c" Mar 14 10:26:07 crc kubenswrapper[5129]: I0314 10:26:07.290153 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558066-ggh66" Mar 14 10:26:07 crc kubenswrapper[5129]: I0314 10:26:07.914094 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558060-rv492"] Mar 14 10:26:07 crc kubenswrapper[5129]: I0314 10:26:07.922626 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558060-rv492"] Mar 14 10:26:08 crc kubenswrapper[5129]: I0314 10:26:08.104049 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9f66657-de9c-46ad-b19e-eec843ebc32e" path="/var/lib/kubelet/pods/f9f66657-de9c-46ad-b19e-eec843ebc32e/volumes" Mar 14 10:26:19 crc kubenswrapper[5129]: I0314 10:26:19.574123 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 14 10:26:19 crc kubenswrapper[5129]: I0314 10:26:19.574740 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:26:23 crc kubenswrapper[5129]: I0314 10:26:23.724359 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7gqb8"] Mar 14 10:26:23 crc kubenswrapper[5129]: E0314 10:26:23.725249 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2917c36-d4d9-41b9-975b-76f8389b51a5" containerName="oc" Mar 14 10:26:23 crc kubenswrapper[5129]: I0314 10:26:23.725261 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2917c36-d4d9-41b9-975b-76f8389b51a5" containerName="oc" Mar 14 10:26:23 crc kubenswrapper[5129]: I0314 10:26:23.725485 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2917c36-d4d9-41b9-975b-76f8389b51a5" containerName="oc" Mar 14 10:26:23 crc kubenswrapper[5129]: I0314 10:26:23.731189 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7gqb8" Mar 14 10:26:23 crc kubenswrapper[5129]: I0314 10:26:23.767996 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7gqb8"] Mar 14 10:26:23 crc kubenswrapper[5129]: I0314 10:26:23.830368 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6q5s\" (UniqueName: \"kubernetes.io/projected/f291d60b-b793-4acb-8418-dd0bf4088bc9-kube-api-access-x6q5s\") pod \"certified-operators-7gqb8\" (UID: \"f291d60b-b793-4acb-8418-dd0bf4088bc9\") " pod="openshift-marketplace/certified-operators-7gqb8" Mar 14 10:26:23 crc kubenswrapper[5129]: I0314 10:26:23.830444 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f291d60b-b793-4acb-8418-dd0bf4088bc9-catalog-content\") pod \"certified-operators-7gqb8\" (UID: \"f291d60b-b793-4acb-8418-dd0bf4088bc9\") " pod="openshift-marketplace/certified-operators-7gqb8" Mar 14 10:26:23 crc kubenswrapper[5129]: I0314 10:26:23.830471 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f291d60b-b793-4acb-8418-dd0bf4088bc9-utilities\") pod \"certified-operators-7gqb8\" (UID: \"f291d60b-b793-4acb-8418-dd0bf4088bc9\") " pod="openshift-marketplace/certified-operators-7gqb8" Mar 14 10:26:23 crc kubenswrapper[5129]: I0314 10:26:23.931954 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6q5s\" (UniqueName: \"kubernetes.io/projected/f291d60b-b793-4acb-8418-dd0bf4088bc9-kube-api-access-x6q5s\") pod \"certified-operators-7gqb8\" (UID: \"f291d60b-b793-4acb-8418-dd0bf4088bc9\") " pod="openshift-marketplace/certified-operators-7gqb8" Mar 14 10:26:23 crc kubenswrapper[5129]: I0314 10:26:23.932020 5129 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f291d60b-b793-4acb-8418-dd0bf4088bc9-catalog-content\") pod \"certified-operators-7gqb8\" (UID: \"f291d60b-b793-4acb-8418-dd0bf4088bc9\") " pod="openshift-marketplace/certified-operators-7gqb8" Mar 14 10:26:23 crc kubenswrapper[5129]: I0314 10:26:23.932042 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f291d60b-b793-4acb-8418-dd0bf4088bc9-utilities\") pod \"certified-operators-7gqb8\" (UID: \"f291d60b-b793-4acb-8418-dd0bf4088bc9\") " pod="openshift-marketplace/certified-operators-7gqb8" Mar 14 10:26:23 crc kubenswrapper[5129]: I0314 10:26:23.932665 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f291d60b-b793-4acb-8418-dd0bf4088bc9-utilities\") pod \"certified-operators-7gqb8\" (UID: \"f291d60b-b793-4acb-8418-dd0bf4088bc9\") " pod="openshift-marketplace/certified-operators-7gqb8" Mar 14 10:26:23 crc kubenswrapper[5129]: I0314 10:26:23.932813 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f291d60b-b793-4acb-8418-dd0bf4088bc9-catalog-content\") pod \"certified-operators-7gqb8\" (UID: \"f291d60b-b793-4acb-8418-dd0bf4088bc9\") " pod="openshift-marketplace/certified-operators-7gqb8" Mar 14 10:26:23 crc kubenswrapper[5129]: I0314 10:26:23.963111 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6q5s\" (UniqueName: \"kubernetes.io/projected/f291d60b-b793-4acb-8418-dd0bf4088bc9-kube-api-access-x6q5s\") pod \"certified-operators-7gqb8\" (UID: \"f291d60b-b793-4acb-8418-dd0bf4088bc9\") " pod="openshift-marketplace/certified-operators-7gqb8" Mar 14 10:26:24 crc kubenswrapper[5129]: I0314 10:26:24.075060 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7gqb8" Mar 14 10:26:24 crc kubenswrapper[5129]: I0314 10:26:24.856234 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7gqb8"] Mar 14 10:26:25 crc kubenswrapper[5129]: I0314 10:26:25.467540 5129 generic.go:334] "Generic (PLEG): container finished" podID="f291d60b-b793-4acb-8418-dd0bf4088bc9" containerID="952dca3b0f99772bbbcd3add96c1a8a80db4a7d8a00607888029d7ff1c91123f" exitCode=0 Mar 14 10:26:25 crc kubenswrapper[5129]: I0314 10:26:25.468669 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7gqb8" event={"ID":"f291d60b-b793-4acb-8418-dd0bf4088bc9","Type":"ContainerDied","Data":"952dca3b0f99772bbbcd3add96c1a8a80db4a7d8a00607888029d7ff1c91123f"} Mar 14 10:26:25 crc kubenswrapper[5129]: I0314 10:26:25.468697 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7gqb8" event={"ID":"f291d60b-b793-4acb-8418-dd0bf4088bc9","Type":"ContainerStarted","Data":"806af0077ce4a5970c1f406b4607505ebd60b591cc02063078dc6d73e2ae8500"} Mar 14 10:26:26 crc kubenswrapper[5129]: I0314 10:26:26.479754 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7gqb8" event={"ID":"f291d60b-b793-4acb-8418-dd0bf4088bc9","Type":"ContainerStarted","Data":"0cd545bc0c59d7fee0eae317375ebd9e257db3aee6ba02f3db8b5391d1c03785"} Mar 14 10:26:28 crc kubenswrapper[5129]: I0314 10:26:28.501298 5129 generic.go:334] "Generic (PLEG): container finished" podID="f291d60b-b793-4acb-8418-dd0bf4088bc9" containerID="0cd545bc0c59d7fee0eae317375ebd9e257db3aee6ba02f3db8b5391d1c03785" exitCode=0 Mar 14 10:26:28 crc kubenswrapper[5129]: I0314 10:26:28.501621 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7gqb8" 
event={"ID":"f291d60b-b793-4acb-8418-dd0bf4088bc9","Type":"ContainerDied","Data":"0cd545bc0c59d7fee0eae317375ebd9e257db3aee6ba02f3db8b5391d1c03785"} Mar 14 10:26:29 crc kubenswrapper[5129]: I0314 10:26:29.513329 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7gqb8" event={"ID":"f291d60b-b793-4acb-8418-dd0bf4088bc9","Type":"ContainerStarted","Data":"e7989c427f7358cb8b2fdb3f89e2bea5af7e55cccf4b1547307d7a55b349b5e2"} Mar 14 10:26:29 crc kubenswrapper[5129]: I0314 10:26:29.539173 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7gqb8" podStartSLOduration=3.094853638 podStartE2EDuration="6.539156096s" podCreationTimestamp="2026-03-14 10:26:23 +0000 UTC" firstStartedPulling="2026-03-14 10:26:25.46927904 +0000 UTC m=+12448.221194224" lastFinishedPulling="2026-03-14 10:26:28.913581498 +0000 UTC m=+12451.665496682" observedRunningTime="2026-03-14 10:26:29.529272029 +0000 UTC m=+12452.281187213" watchObservedRunningTime="2026-03-14 10:26:29.539156096 +0000 UTC m=+12452.291071280" Mar 14 10:26:34 crc kubenswrapper[5129]: I0314 10:26:34.075581 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7gqb8" Mar 14 10:26:34 crc kubenswrapper[5129]: I0314 10:26:34.076158 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7gqb8" Mar 14 10:26:35 crc kubenswrapper[5129]: I0314 10:26:35.131361 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7gqb8" podUID="f291d60b-b793-4acb-8418-dd0bf4088bc9" containerName="registry-server" probeResult="failure" output=< Mar 14 10:26:35 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:26:35 crc kubenswrapper[5129]: > Mar 14 10:26:42 crc kubenswrapper[5129]: I0314 10:26:42.335812 5129 scope.go:117] 
"RemoveContainer" containerID="1098cddd9be3044bc08c5433eedcf1d91c4a8a1077ea4eb2ad490816f1981323" Mar 14 10:26:44 crc kubenswrapper[5129]: I0314 10:26:44.131575 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7gqb8" Mar 14 10:26:44 crc kubenswrapper[5129]: I0314 10:26:44.189808 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7gqb8" Mar 14 10:26:44 crc kubenswrapper[5129]: I0314 10:26:44.381094 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7gqb8"] Mar 14 10:26:45 crc kubenswrapper[5129]: I0314 10:26:45.659307 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7gqb8" podUID="f291d60b-b793-4acb-8418-dd0bf4088bc9" containerName="registry-server" containerID="cri-o://e7989c427f7358cb8b2fdb3f89e2bea5af7e55cccf4b1547307d7a55b349b5e2" gracePeriod=2 Mar 14 10:26:46 crc kubenswrapper[5129]: I0314 10:26:46.672018 5129 generic.go:334] "Generic (PLEG): container finished" podID="f291d60b-b793-4acb-8418-dd0bf4088bc9" containerID="e7989c427f7358cb8b2fdb3f89e2bea5af7e55cccf4b1547307d7a55b349b5e2" exitCode=0 Mar 14 10:26:46 crc kubenswrapper[5129]: I0314 10:26:46.672310 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7gqb8" event={"ID":"f291d60b-b793-4acb-8418-dd0bf4088bc9","Type":"ContainerDied","Data":"e7989c427f7358cb8b2fdb3f89e2bea5af7e55cccf4b1547307d7a55b349b5e2"} Mar 14 10:26:47 crc kubenswrapper[5129]: I0314 10:26:47.321345 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7gqb8" Mar 14 10:26:47 crc kubenswrapper[5129]: I0314 10:26:47.429762 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f291d60b-b793-4acb-8418-dd0bf4088bc9-catalog-content\") pod \"f291d60b-b793-4acb-8418-dd0bf4088bc9\" (UID: \"f291d60b-b793-4acb-8418-dd0bf4088bc9\") " Mar 14 10:26:47 crc kubenswrapper[5129]: I0314 10:26:47.429840 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f291d60b-b793-4acb-8418-dd0bf4088bc9-utilities\") pod \"f291d60b-b793-4acb-8418-dd0bf4088bc9\" (UID: \"f291d60b-b793-4acb-8418-dd0bf4088bc9\") " Mar 14 10:26:47 crc kubenswrapper[5129]: I0314 10:26:47.429967 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6q5s\" (UniqueName: \"kubernetes.io/projected/f291d60b-b793-4acb-8418-dd0bf4088bc9-kube-api-access-x6q5s\") pod \"f291d60b-b793-4acb-8418-dd0bf4088bc9\" (UID: \"f291d60b-b793-4acb-8418-dd0bf4088bc9\") " Mar 14 10:26:47 crc kubenswrapper[5129]: I0314 10:26:47.432233 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f291d60b-b793-4acb-8418-dd0bf4088bc9-utilities" (OuterVolumeSpecName: "utilities") pod "f291d60b-b793-4acb-8418-dd0bf4088bc9" (UID: "f291d60b-b793-4acb-8418-dd0bf4088bc9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:26:47 crc kubenswrapper[5129]: I0314 10:26:47.469062 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f291d60b-b793-4acb-8418-dd0bf4088bc9-kube-api-access-x6q5s" (OuterVolumeSpecName: "kube-api-access-x6q5s") pod "f291d60b-b793-4acb-8418-dd0bf4088bc9" (UID: "f291d60b-b793-4acb-8418-dd0bf4088bc9"). InnerVolumeSpecName "kube-api-access-x6q5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:26:47 crc kubenswrapper[5129]: I0314 10:26:47.494208 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f291d60b-b793-4acb-8418-dd0bf4088bc9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f291d60b-b793-4acb-8418-dd0bf4088bc9" (UID: "f291d60b-b793-4acb-8418-dd0bf4088bc9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:26:47 crc kubenswrapper[5129]: I0314 10:26:47.532046 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f291d60b-b793-4acb-8418-dd0bf4088bc9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 10:26:47 crc kubenswrapper[5129]: I0314 10:26:47.532084 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f291d60b-b793-4acb-8418-dd0bf4088bc9-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 10:26:47 crc kubenswrapper[5129]: I0314 10:26:47.532095 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6q5s\" (UniqueName: \"kubernetes.io/projected/f291d60b-b793-4acb-8418-dd0bf4088bc9-kube-api-access-x6q5s\") on node \"crc\" DevicePath \"\"" Mar 14 10:26:47 crc kubenswrapper[5129]: I0314 10:26:47.682035 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7gqb8" event={"ID":"f291d60b-b793-4acb-8418-dd0bf4088bc9","Type":"ContainerDied","Data":"806af0077ce4a5970c1f406b4607505ebd60b591cc02063078dc6d73e2ae8500"} Mar 14 10:26:47 crc kubenswrapper[5129]: I0314 10:26:47.682087 5129 scope.go:117] "RemoveContainer" containerID="e7989c427f7358cb8b2fdb3f89e2bea5af7e55cccf4b1547307d7a55b349b5e2" Mar 14 10:26:47 crc kubenswrapper[5129]: I0314 10:26:47.682214 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7gqb8" Mar 14 10:26:47 crc kubenswrapper[5129]: I0314 10:26:47.728275 5129 scope.go:117] "RemoveContainer" containerID="0cd545bc0c59d7fee0eae317375ebd9e257db3aee6ba02f3db8b5391d1c03785" Mar 14 10:26:47 crc kubenswrapper[5129]: I0314 10:26:47.734285 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7gqb8"] Mar 14 10:26:47 crc kubenswrapper[5129]: I0314 10:26:47.746714 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7gqb8"] Mar 14 10:26:47 crc kubenswrapper[5129]: I0314 10:26:47.783756 5129 scope.go:117] "RemoveContainer" containerID="952dca3b0f99772bbbcd3add96c1a8a80db4a7d8a00607888029d7ff1c91123f" Mar 14 10:26:48 crc kubenswrapper[5129]: I0314 10:26:48.047465 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f291d60b-b793-4acb-8418-dd0bf4088bc9" path="/var/lib/kubelet/pods/f291d60b-b793-4acb-8418-dd0bf4088bc9/volumes" Mar 14 10:26:49 crc kubenswrapper[5129]: I0314 10:26:49.576295 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:26:49 crc kubenswrapper[5129]: I0314 10:26:49.576373 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:26:49 crc kubenswrapper[5129]: I0314 10:26:49.576430 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 
10:26:49 crc kubenswrapper[5129]: I0314 10:26:49.577450 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"52581f64e7a71a41ddbfabf547cebc9449637f9a4f1d89909d5ad911bf52828b"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 10:26:49 crc kubenswrapper[5129]: I0314 10:26:49.577793 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://52581f64e7a71a41ddbfabf547cebc9449637f9a4f1d89909d5ad911bf52828b" gracePeriod=600 Mar 14 10:26:49 crc kubenswrapper[5129]: I0314 10:26:49.710971 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="52581f64e7a71a41ddbfabf547cebc9449637f9a4f1d89909d5ad911bf52828b" exitCode=0 Mar 14 10:26:49 crc kubenswrapper[5129]: I0314 10:26:49.711003 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"52581f64e7a71a41ddbfabf547cebc9449637f9a4f1d89909d5ad911bf52828b"} Mar 14 10:26:49 crc kubenswrapper[5129]: I0314 10:26:49.711075 5129 scope.go:117] "RemoveContainer" containerID="f20d7dd819a36033df461b2ca82f92cb412dd3533a59feaba4047b13c68572e1" Mar 14 10:26:50 crc kubenswrapper[5129]: I0314 10:26:50.726880 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194"} Mar 14 10:28:00 crc kubenswrapper[5129]: I0314 10:28:00.144291 5129 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558068-64mg7"] Mar 14 10:28:00 crc kubenswrapper[5129]: E0314 10:28:00.145120 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f291d60b-b793-4acb-8418-dd0bf4088bc9" containerName="extract-utilities" Mar 14 10:28:00 crc kubenswrapper[5129]: I0314 10:28:00.145132 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f291d60b-b793-4acb-8418-dd0bf4088bc9" containerName="extract-utilities" Mar 14 10:28:00 crc kubenswrapper[5129]: E0314 10:28:00.145159 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f291d60b-b793-4acb-8418-dd0bf4088bc9" containerName="extract-content" Mar 14 10:28:00 crc kubenswrapper[5129]: I0314 10:28:00.145165 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f291d60b-b793-4acb-8418-dd0bf4088bc9" containerName="extract-content" Mar 14 10:28:00 crc kubenswrapper[5129]: E0314 10:28:00.145188 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f291d60b-b793-4acb-8418-dd0bf4088bc9" containerName="registry-server" Mar 14 10:28:00 crc kubenswrapper[5129]: I0314 10:28:00.145195 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f291d60b-b793-4acb-8418-dd0bf4088bc9" containerName="registry-server" Mar 14 10:28:00 crc kubenswrapper[5129]: I0314 10:28:00.145397 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f291d60b-b793-4acb-8418-dd0bf4088bc9" containerName="registry-server" Mar 14 10:28:00 crc kubenswrapper[5129]: I0314 10:28:00.146020 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558068-64mg7" Mar 14 10:28:00 crc kubenswrapper[5129]: I0314 10:28:00.148628 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:28:00 crc kubenswrapper[5129]: I0314 10:28:00.150887 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:28:00 crc kubenswrapper[5129]: I0314 10:28:00.162283 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:28:00 crc kubenswrapper[5129]: I0314 10:28:00.165946 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558068-64mg7"] Mar 14 10:28:00 crc kubenswrapper[5129]: I0314 10:28:00.240656 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8657\" (UniqueName: \"kubernetes.io/projected/5378d988-f252-4f3d-865e-12b49ba22b2e-kube-api-access-v8657\") pod \"auto-csr-approver-29558068-64mg7\" (UID: \"5378d988-f252-4f3d-865e-12b49ba22b2e\") " pod="openshift-infra/auto-csr-approver-29558068-64mg7" Mar 14 10:28:00 crc kubenswrapper[5129]: I0314 10:28:00.342182 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8657\" (UniqueName: \"kubernetes.io/projected/5378d988-f252-4f3d-865e-12b49ba22b2e-kube-api-access-v8657\") pod \"auto-csr-approver-29558068-64mg7\" (UID: \"5378d988-f252-4f3d-865e-12b49ba22b2e\") " pod="openshift-infra/auto-csr-approver-29558068-64mg7" Mar 14 10:28:00 crc kubenswrapper[5129]: I0314 10:28:00.368169 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8657\" (UniqueName: \"kubernetes.io/projected/5378d988-f252-4f3d-865e-12b49ba22b2e-kube-api-access-v8657\") pod \"auto-csr-approver-29558068-64mg7\" (UID: \"5378d988-f252-4f3d-865e-12b49ba22b2e\") " 
pod="openshift-infra/auto-csr-approver-29558068-64mg7" Mar 14 10:28:00 crc kubenswrapper[5129]: I0314 10:28:00.476270 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558068-64mg7" Mar 14 10:28:01 crc kubenswrapper[5129]: I0314 10:28:01.619172 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558068-64mg7"] Mar 14 10:28:02 crc kubenswrapper[5129]: I0314 10:28:02.491379 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558068-64mg7" event={"ID":"5378d988-f252-4f3d-865e-12b49ba22b2e","Type":"ContainerStarted","Data":"99a7a87ed5613e92a17afa99164113f708b82b9d16235ff1986ba2e8593d5ba9"} Mar 14 10:28:03 crc kubenswrapper[5129]: I0314 10:28:03.514054 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558068-64mg7" event={"ID":"5378d988-f252-4f3d-865e-12b49ba22b2e","Type":"ContainerStarted","Data":"eb0589908b4abd2e57cc29cf8dbfe42d1c9b85953ae554d3e7e07fb9fc2c7850"} Mar 14 10:28:04 crc kubenswrapper[5129]: I0314 10:28:04.524132 5129 generic.go:334] "Generic (PLEG): container finished" podID="5378d988-f252-4f3d-865e-12b49ba22b2e" containerID="eb0589908b4abd2e57cc29cf8dbfe42d1c9b85953ae554d3e7e07fb9fc2c7850" exitCode=0 Mar 14 10:28:04 crc kubenswrapper[5129]: I0314 10:28:04.524216 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558068-64mg7" event={"ID":"5378d988-f252-4f3d-865e-12b49ba22b2e","Type":"ContainerDied","Data":"eb0589908b4abd2e57cc29cf8dbfe42d1c9b85953ae554d3e7e07fb9fc2c7850"} Mar 14 10:28:07 crc kubenswrapper[5129]: I0314 10:28:07.237437 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558068-64mg7" Mar 14 10:28:07 crc kubenswrapper[5129]: I0314 10:28:07.381324 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8657\" (UniqueName: \"kubernetes.io/projected/5378d988-f252-4f3d-865e-12b49ba22b2e-kube-api-access-v8657\") pod \"5378d988-f252-4f3d-865e-12b49ba22b2e\" (UID: \"5378d988-f252-4f3d-865e-12b49ba22b2e\") " Mar 14 10:28:07 crc kubenswrapper[5129]: I0314 10:28:07.392483 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5378d988-f252-4f3d-865e-12b49ba22b2e-kube-api-access-v8657" (OuterVolumeSpecName: "kube-api-access-v8657") pod "5378d988-f252-4f3d-865e-12b49ba22b2e" (UID: "5378d988-f252-4f3d-865e-12b49ba22b2e"). InnerVolumeSpecName "kube-api-access-v8657". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:28:07 crc kubenswrapper[5129]: I0314 10:28:07.484303 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8657\" (UniqueName: \"kubernetes.io/projected/5378d988-f252-4f3d-865e-12b49ba22b2e-kube-api-access-v8657\") on node \"crc\" DevicePath \"\"" Mar 14 10:28:07 crc kubenswrapper[5129]: I0314 10:28:07.553402 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558068-64mg7" event={"ID":"5378d988-f252-4f3d-865e-12b49ba22b2e","Type":"ContainerDied","Data":"99a7a87ed5613e92a17afa99164113f708b82b9d16235ff1986ba2e8593d5ba9"} Mar 14 10:28:07 crc kubenswrapper[5129]: I0314 10:28:07.553444 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99a7a87ed5613e92a17afa99164113f708b82b9d16235ff1986ba2e8593d5ba9" Mar 14 10:28:07 crc kubenswrapper[5129]: I0314 10:28:07.553484 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558068-64mg7" Mar 14 10:28:08 crc kubenswrapper[5129]: I0314 10:28:08.331918 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558062-9lnqq"] Mar 14 10:28:08 crc kubenswrapper[5129]: I0314 10:28:08.343841 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558062-9lnqq"] Mar 14 10:28:10 crc kubenswrapper[5129]: I0314 10:28:10.048414 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84efb040-65b9-4606-baf4-01a89cc560fa" path="/var/lib/kubelet/pods/84efb040-65b9-4606-baf4-01a89cc560fa/volumes" Mar 14 10:28:37 crc kubenswrapper[5129]: I0314 10:28:37.157652 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qzxcb"] Mar 14 10:28:37 crc kubenswrapper[5129]: E0314 10:28:37.158624 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5378d988-f252-4f3d-865e-12b49ba22b2e" containerName="oc" Mar 14 10:28:37 crc kubenswrapper[5129]: I0314 10:28:37.158637 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="5378d988-f252-4f3d-865e-12b49ba22b2e" containerName="oc" Mar 14 10:28:37 crc kubenswrapper[5129]: I0314 10:28:37.158848 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="5378d988-f252-4f3d-865e-12b49ba22b2e" containerName="oc" Mar 14 10:28:37 crc kubenswrapper[5129]: I0314 10:28:37.160334 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qzxcb" Mar 14 10:28:37 crc kubenswrapper[5129]: I0314 10:28:37.171787 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qzxcb"] Mar 14 10:28:37 crc kubenswrapper[5129]: I0314 10:28:37.294295 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b080d0a3-85d6-478f-85ff-2fe82d422b47-utilities\") pod \"redhat-operators-qzxcb\" (UID: \"b080d0a3-85d6-478f-85ff-2fe82d422b47\") " pod="openshift-marketplace/redhat-operators-qzxcb" Mar 14 10:28:37 crc kubenswrapper[5129]: I0314 10:28:37.294351 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b080d0a3-85d6-478f-85ff-2fe82d422b47-catalog-content\") pod \"redhat-operators-qzxcb\" (UID: \"b080d0a3-85d6-478f-85ff-2fe82d422b47\") " pod="openshift-marketplace/redhat-operators-qzxcb" Mar 14 10:28:37 crc kubenswrapper[5129]: I0314 10:28:37.294504 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp76d\" (UniqueName: \"kubernetes.io/projected/b080d0a3-85d6-478f-85ff-2fe82d422b47-kube-api-access-fp76d\") pod \"redhat-operators-qzxcb\" (UID: \"b080d0a3-85d6-478f-85ff-2fe82d422b47\") " pod="openshift-marketplace/redhat-operators-qzxcb" Mar 14 10:28:37 crc kubenswrapper[5129]: I0314 10:28:37.396529 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b080d0a3-85d6-478f-85ff-2fe82d422b47-utilities\") pod \"redhat-operators-qzxcb\" (UID: \"b080d0a3-85d6-478f-85ff-2fe82d422b47\") " pod="openshift-marketplace/redhat-operators-qzxcb" Mar 14 10:28:37 crc kubenswrapper[5129]: I0314 10:28:37.396575 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b080d0a3-85d6-478f-85ff-2fe82d422b47-catalog-content\") pod \"redhat-operators-qzxcb\" (UID: \"b080d0a3-85d6-478f-85ff-2fe82d422b47\") " pod="openshift-marketplace/redhat-operators-qzxcb" Mar 14 10:28:37 crc kubenswrapper[5129]: I0314 10:28:37.396676 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp76d\" (UniqueName: \"kubernetes.io/projected/b080d0a3-85d6-478f-85ff-2fe82d422b47-kube-api-access-fp76d\") pod \"redhat-operators-qzxcb\" (UID: \"b080d0a3-85d6-478f-85ff-2fe82d422b47\") " pod="openshift-marketplace/redhat-operators-qzxcb" Mar 14 10:28:37 crc kubenswrapper[5129]: I0314 10:28:37.397114 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b080d0a3-85d6-478f-85ff-2fe82d422b47-utilities\") pod \"redhat-operators-qzxcb\" (UID: \"b080d0a3-85d6-478f-85ff-2fe82d422b47\") " pod="openshift-marketplace/redhat-operators-qzxcb" Mar 14 10:28:37 crc kubenswrapper[5129]: I0314 10:28:37.397195 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b080d0a3-85d6-478f-85ff-2fe82d422b47-catalog-content\") pod \"redhat-operators-qzxcb\" (UID: \"b080d0a3-85d6-478f-85ff-2fe82d422b47\") " pod="openshift-marketplace/redhat-operators-qzxcb" Mar 14 10:28:37 crc kubenswrapper[5129]: I0314 10:28:37.415824 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp76d\" (UniqueName: \"kubernetes.io/projected/b080d0a3-85d6-478f-85ff-2fe82d422b47-kube-api-access-fp76d\") pod \"redhat-operators-qzxcb\" (UID: \"b080d0a3-85d6-478f-85ff-2fe82d422b47\") " pod="openshift-marketplace/redhat-operators-qzxcb" Mar 14 10:28:37 crc kubenswrapper[5129]: I0314 10:28:37.491959 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qzxcb" Mar 14 10:28:38 crc kubenswrapper[5129]: I0314 10:28:38.268924 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qzxcb"] Mar 14 10:28:38 crc kubenswrapper[5129]: I0314 10:28:38.870037 5129 generic.go:334] "Generic (PLEG): container finished" podID="b080d0a3-85d6-478f-85ff-2fe82d422b47" containerID="b172b0d466320d148909fc0daced3135ff16d6a5e73a3b0188f78a19ec107321" exitCode=0 Mar 14 10:28:38 crc kubenswrapper[5129]: I0314 10:28:38.870444 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzxcb" event={"ID":"b080d0a3-85d6-478f-85ff-2fe82d422b47","Type":"ContainerDied","Data":"b172b0d466320d148909fc0daced3135ff16d6a5e73a3b0188f78a19ec107321"} Mar 14 10:28:38 crc kubenswrapper[5129]: I0314 10:28:38.870467 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzxcb" event={"ID":"b080d0a3-85d6-478f-85ff-2fe82d422b47","Type":"ContainerStarted","Data":"1d0584785b5c2806008e9d3990252505cfb70591246b7e46ad6d1e646dae330c"} Mar 14 10:28:39 crc kubenswrapper[5129]: I0314 10:28:39.901467 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzxcb" event={"ID":"b080d0a3-85d6-478f-85ff-2fe82d422b47","Type":"ContainerStarted","Data":"ff94b0770c128004e1db38d505c32fa8461dc5132ba9c4d9874e68c9f73a7ae9"} Mar 14 10:28:42 crc kubenswrapper[5129]: I0314 10:28:42.471011 5129 scope.go:117] "RemoveContainer" containerID="3329970bd44721c6aad890786ec71ddb48a7e2b25283bc9ab788f7fed7e05f22" Mar 14 10:28:45 crc kubenswrapper[5129]: I0314 10:28:45.957850 5129 generic.go:334] "Generic (PLEG): container finished" podID="b080d0a3-85d6-478f-85ff-2fe82d422b47" containerID="ff94b0770c128004e1db38d505c32fa8461dc5132ba9c4d9874e68c9f73a7ae9" exitCode=0 Mar 14 10:28:45 crc kubenswrapper[5129]: I0314 10:28:45.957915 5129 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-qzxcb" event={"ID":"b080d0a3-85d6-478f-85ff-2fe82d422b47","Type":"ContainerDied","Data":"ff94b0770c128004e1db38d505c32fa8461dc5132ba9c4d9874e68c9f73a7ae9"} Mar 14 10:28:46 crc kubenswrapper[5129]: I0314 10:28:46.969732 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzxcb" event={"ID":"b080d0a3-85d6-478f-85ff-2fe82d422b47","Type":"ContainerStarted","Data":"3f1cc0520f4c3a8a88ee0d0b7e937a03e830dbbfde68f6de3932e7679c7106b5"} Mar 14 10:28:47 crc kubenswrapper[5129]: I0314 10:28:47.493546 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qzxcb" Mar 14 10:28:47 crc kubenswrapper[5129]: I0314 10:28:47.494115 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qzxcb" Mar 14 10:28:48 crc kubenswrapper[5129]: I0314 10:28:48.550846 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qzxcb" podUID="b080d0a3-85d6-478f-85ff-2fe82d422b47" containerName="registry-server" probeResult="failure" output=< Mar 14 10:28:48 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:28:48 crc kubenswrapper[5129]: > Mar 14 10:28:49 crc kubenswrapper[5129]: I0314 10:28:49.574376 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:28:49 crc kubenswrapper[5129]: I0314 10:28:49.574763 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:28:58 crc kubenswrapper[5129]: I0314 10:28:58.572277 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qzxcb" podUID="b080d0a3-85d6-478f-85ff-2fe82d422b47" containerName="registry-server" probeResult="failure" output=< Mar 14 10:28:58 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:28:58 crc kubenswrapper[5129]: > Mar 14 10:29:08 crc kubenswrapper[5129]: I0314 10:29:08.540320 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qzxcb" podUID="b080d0a3-85d6-478f-85ff-2fe82d422b47" containerName="registry-server" probeResult="failure" output=< Mar 14 10:29:08 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:29:08 crc kubenswrapper[5129]: > Mar 14 10:29:18 crc kubenswrapper[5129]: I0314 10:29:18.548757 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qzxcb" podUID="b080d0a3-85d6-478f-85ff-2fe82d422b47" containerName="registry-server" probeResult="failure" output=< Mar 14 10:29:18 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:29:18 crc kubenswrapper[5129]: > Mar 14 10:29:19 crc kubenswrapper[5129]: I0314 10:29:19.574318 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:29:19 crc kubenswrapper[5129]: I0314 10:29:19.574381 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:29:28 crc kubenswrapper[5129]: I0314 10:29:28.549784 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qzxcb" podUID="b080d0a3-85d6-478f-85ff-2fe82d422b47" containerName="registry-server" probeResult="failure" output=< Mar 14 10:29:28 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:29:28 crc kubenswrapper[5129]: > Mar 14 10:29:37 crc kubenswrapper[5129]: I0314 10:29:37.542456 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qzxcb" Mar 14 10:29:37 crc kubenswrapper[5129]: I0314 10:29:37.569977 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qzxcb" podStartSLOduration=53.062274171 podStartE2EDuration="1m0.56996322s" podCreationTimestamp="2026-03-14 10:28:37 +0000 UTC" firstStartedPulling="2026-03-14 10:28:38.872433149 +0000 UTC m=+12581.624348333" lastFinishedPulling="2026-03-14 10:28:46.380122198 +0000 UTC m=+12589.132037382" observedRunningTime="2026-03-14 10:28:46.994190874 +0000 UTC m=+12589.746106048" watchObservedRunningTime="2026-03-14 10:29:37.56996322 +0000 UTC m=+12640.321878404" Mar 14 10:29:37 crc kubenswrapper[5129]: I0314 10:29:37.600571 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qzxcb" Mar 14 10:29:38 crc kubenswrapper[5129]: I0314 10:29:38.430727 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qzxcb"] Mar 14 10:29:39 crc kubenswrapper[5129]: I0314 10:29:39.465302 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qzxcb" podUID="b080d0a3-85d6-478f-85ff-2fe82d422b47" containerName="registry-server" 
containerID="cri-o://3f1cc0520f4c3a8a88ee0d0b7e937a03e830dbbfde68f6de3932e7679c7106b5" gracePeriod=2 Mar 14 10:29:40 crc kubenswrapper[5129]: I0314 10:29:40.477254 5129 generic.go:334] "Generic (PLEG): container finished" podID="b080d0a3-85d6-478f-85ff-2fe82d422b47" containerID="3f1cc0520f4c3a8a88ee0d0b7e937a03e830dbbfde68f6de3932e7679c7106b5" exitCode=0 Mar 14 10:29:40 crc kubenswrapper[5129]: I0314 10:29:40.477324 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzxcb" event={"ID":"b080d0a3-85d6-478f-85ff-2fe82d422b47","Type":"ContainerDied","Data":"3f1cc0520f4c3a8a88ee0d0b7e937a03e830dbbfde68f6de3932e7679c7106b5"} Mar 14 10:29:40 crc kubenswrapper[5129]: I0314 10:29:40.998584 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qzxcb" Mar 14 10:29:41 crc kubenswrapper[5129]: I0314 10:29:41.113704 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b080d0a3-85d6-478f-85ff-2fe82d422b47-catalog-content\") pod \"b080d0a3-85d6-478f-85ff-2fe82d422b47\" (UID: \"b080d0a3-85d6-478f-85ff-2fe82d422b47\") " Mar 14 10:29:41 crc kubenswrapper[5129]: I0314 10:29:41.113847 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp76d\" (UniqueName: \"kubernetes.io/projected/b080d0a3-85d6-478f-85ff-2fe82d422b47-kube-api-access-fp76d\") pod \"b080d0a3-85d6-478f-85ff-2fe82d422b47\" (UID: \"b080d0a3-85d6-478f-85ff-2fe82d422b47\") " Mar 14 10:29:41 crc kubenswrapper[5129]: I0314 10:29:41.113887 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b080d0a3-85d6-478f-85ff-2fe82d422b47-utilities\") pod \"b080d0a3-85d6-478f-85ff-2fe82d422b47\" (UID: \"b080d0a3-85d6-478f-85ff-2fe82d422b47\") " Mar 14 10:29:41 crc kubenswrapper[5129]: I0314 
10:29:41.114903 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b080d0a3-85d6-478f-85ff-2fe82d422b47-utilities" (OuterVolumeSpecName: "utilities") pod "b080d0a3-85d6-478f-85ff-2fe82d422b47" (UID: "b080d0a3-85d6-478f-85ff-2fe82d422b47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:29:41 crc kubenswrapper[5129]: I0314 10:29:41.136885 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b080d0a3-85d6-478f-85ff-2fe82d422b47-kube-api-access-fp76d" (OuterVolumeSpecName: "kube-api-access-fp76d") pod "b080d0a3-85d6-478f-85ff-2fe82d422b47" (UID: "b080d0a3-85d6-478f-85ff-2fe82d422b47"). InnerVolumeSpecName "kube-api-access-fp76d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:29:41 crc kubenswrapper[5129]: I0314 10:29:41.224209 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b080d0a3-85d6-478f-85ff-2fe82d422b47-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 10:29:41 crc kubenswrapper[5129]: I0314 10:29:41.224241 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp76d\" (UniqueName: \"kubernetes.io/projected/b080d0a3-85d6-478f-85ff-2fe82d422b47-kube-api-access-fp76d\") on node \"crc\" DevicePath \"\"" Mar 14 10:29:41 crc kubenswrapper[5129]: I0314 10:29:41.311987 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b080d0a3-85d6-478f-85ff-2fe82d422b47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b080d0a3-85d6-478f-85ff-2fe82d422b47" (UID: "b080d0a3-85d6-478f-85ff-2fe82d422b47"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:29:41 crc kubenswrapper[5129]: I0314 10:29:41.325925 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b080d0a3-85d6-478f-85ff-2fe82d422b47-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 10:29:41 crc kubenswrapper[5129]: I0314 10:29:41.487462 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzxcb" event={"ID":"b080d0a3-85d6-478f-85ff-2fe82d422b47","Type":"ContainerDied","Data":"1d0584785b5c2806008e9d3990252505cfb70591246b7e46ad6d1e646dae330c"} Mar 14 10:29:41 crc kubenswrapper[5129]: I0314 10:29:41.487512 5129 scope.go:117] "RemoveContainer" containerID="3f1cc0520f4c3a8a88ee0d0b7e937a03e830dbbfde68f6de3932e7679c7106b5" Mar 14 10:29:41 crc kubenswrapper[5129]: I0314 10:29:41.487652 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qzxcb" Mar 14 10:29:41 crc kubenswrapper[5129]: I0314 10:29:41.526249 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qzxcb"] Mar 14 10:29:41 crc kubenswrapper[5129]: I0314 10:29:41.533878 5129 scope.go:117] "RemoveContainer" containerID="ff94b0770c128004e1db38d505c32fa8461dc5132ba9c4d9874e68c9f73a7ae9" Mar 14 10:29:41 crc kubenswrapper[5129]: I0314 10:29:41.537691 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qzxcb"] Mar 14 10:29:41 crc kubenswrapper[5129]: I0314 10:29:41.575151 5129 scope.go:117] "RemoveContainer" containerID="b172b0d466320d148909fc0daced3135ff16d6a5e73a3b0188f78a19ec107321" Mar 14 10:29:42 crc kubenswrapper[5129]: I0314 10:29:42.052376 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b080d0a3-85d6-478f-85ff-2fe82d422b47" path="/var/lib/kubelet/pods/b080d0a3-85d6-478f-85ff-2fe82d422b47/volumes" Mar 14 10:29:49 crc 
kubenswrapper[5129]: I0314 10:29:49.573876 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:29:49 crc kubenswrapper[5129]: I0314 10:29:49.574550 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:29:49 crc kubenswrapper[5129]: I0314 10:29:49.574641 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 10:29:49 crc kubenswrapper[5129]: I0314 10:29:49.575521 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 10:29:49 crc kubenswrapper[5129]: I0314 10:29:49.575583 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" gracePeriod=600 Mar 14 10:29:49 crc kubenswrapper[5129]: E0314 10:29:49.700267 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:29:50 crc kubenswrapper[5129]: I0314 10:29:50.591172 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" exitCode=0 Mar 14 10:29:50 crc kubenswrapper[5129]: I0314 10:29:50.591220 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194"} Mar 14 10:29:50 crc kubenswrapper[5129]: I0314 10:29:50.591256 5129 scope.go:117] "RemoveContainer" containerID="52581f64e7a71a41ddbfabf547cebc9449637f9a4f1d89909d5ad911bf52828b" Mar 14 10:29:50 crc kubenswrapper[5129]: I0314 10:29:50.592039 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:29:50 crc kubenswrapper[5129]: E0314 10:29:50.592386 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.197640 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r"] Mar 14 10:30:00 crc kubenswrapper[5129]: E0314 10:30:00.198641 5129 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b080d0a3-85d6-478f-85ff-2fe82d422b47" containerName="extract-utilities" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.198653 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b080d0a3-85d6-478f-85ff-2fe82d422b47" containerName="extract-utilities" Mar 14 10:30:00 crc kubenswrapper[5129]: E0314 10:30:00.198679 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b080d0a3-85d6-478f-85ff-2fe82d422b47" containerName="extract-content" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.198685 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b080d0a3-85d6-478f-85ff-2fe82d422b47" containerName="extract-content" Mar 14 10:30:00 crc kubenswrapper[5129]: E0314 10:30:00.198715 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b080d0a3-85d6-478f-85ff-2fe82d422b47" containerName="registry-server" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.198722 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b080d0a3-85d6-478f-85ff-2fe82d422b47" containerName="registry-server" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.198933 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b080d0a3-85d6-478f-85ff-2fe82d422b47" containerName="registry-server" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.199664 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.201404 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.201676 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.210667 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558070-gpx85"] Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.213359 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558070-gpx85" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.223537 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r"] Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.232660 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558070-gpx85"] Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.255995 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.256001 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.256208 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.318507 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/150c4edd-4813-4712-9f17-e67ddda8dc23-config-volume\") pod \"collect-profiles-29558070-5rc4r\" (UID: \"150c4edd-4813-4712-9f17-e67ddda8dc23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.318593 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prl4k\" (UniqueName: \"kubernetes.io/projected/3089d2a9-975b-4b8c-b3a4-6e1079308c9a-kube-api-access-prl4k\") pod \"auto-csr-approver-29558070-gpx85\" (UID: \"3089d2a9-975b-4b8c-b3a4-6e1079308c9a\") " pod="openshift-infra/auto-csr-approver-29558070-gpx85" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.318668 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65qlz\" (UniqueName: \"kubernetes.io/projected/150c4edd-4813-4712-9f17-e67ddda8dc23-kube-api-access-65qlz\") pod \"collect-profiles-29558070-5rc4r\" (UID: \"150c4edd-4813-4712-9f17-e67ddda8dc23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.318714 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/150c4edd-4813-4712-9f17-e67ddda8dc23-secret-volume\") pod \"collect-profiles-29558070-5rc4r\" (UID: \"150c4edd-4813-4712-9f17-e67ddda8dc23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.420763 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/150c4edd-4813-4712-9f17-e67ddda8dc23-secret-volume\") pod \"collect-profiles-29558070-5rc4r\" (UID: \"150c4edd-4813-4712-9f17-e67ddda8dc23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r" Mar 14 
10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.420859 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/150c4edd-4813-4712-9f17-e67ddda8dc23-config-volume\") pod \"collect-profiles-29558070-5rc4r\" (UID: \"150c4edd-4813-4712-9f17-e67ddda8dc23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.420921 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prl4k\" (UniqueName: \"kubernetes.io/projected/3089d2a9-975b-4b8c-b3a4-6e1079308c9a-kube-api-access-prl4k\") pod \"auto-csr-approver-29558070-gpx85\" (UID: \"3089d2a9-975b-4b8c-b3a4-6e1079308c9a\") " pod="openshift-infra/auto-csr-approver-29558070-gpx85" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.420980 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65qlz\" (UniqueName: \"kubernetes.io/projected/150c4edd-4813-4712-9f17-e67ddda8dc23-kube-api-access-65qlz\") pod \"collect-profiles-29558070-5rc4r\" (UID: \"150c4edd-4813-4712-9f17-e67ddda8dc23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.422027 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/150c4edd-4813-4712-9f17-e67ddda8dc23-config-volume\") pod \"collect-profiles-29558070-5rc4r\" (UID: \"150c4edd-4813-4712-9f17-e67ddda8dc23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.429702 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/150c4edd-4813-4712-9f17-e67ddda8dc23-secret-volume\") pod \"collect-profiles-29558070-5rc4r\" (UID: 
\"150c4edd-4813-4712-9f17-e67ddda8dc23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.436857 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prl4k\" (UniqueName: \"kubernetes.io/projected/3089d2a9-975b-4b8c-b3a4-6e1079308c9a-kube-api-access-prl4k\") pod \"auto-csr-approver-29558070-gpx85\" (UID: \"3089d2a9-975b-4b8c-b3a4-6e1079308c9a\") " pod="openshift-infra/auto-csr-approver-29558070-gpx85" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.440199 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65qlz\" (UniqueName: \"kubernetes.io/projected/150c4edd-4813-4712-9f17-e67ddda8dc23-kube-api-access-65qlz\") pod \"collect-profiles-29558070-5rc4r\" (UID: \"150c4edd-4813-4712-9f17-e67ddda8dc23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.577646 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r" Mar 14 10:30:00 crc kubenswrapper[5129]: I0314 10:30:00.601734 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558070-gpx85" Mar 14 10:30:01 crc kubenswrapper[5129]: I0314 10:30:01.399425 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r"] Mar 14 10:30:01 crc kubenswrapper[5129]: I0314 10:30:01.697578 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r" event={"ID":"150c4edd-4813-4712-9f17-e67ddda8dc23","Type":"ContainerStarted","Data":"1c1f5d9fc584d20284d74484c6f3073d1a8f5ef6a57f86fd82de4b82a4f802e5"} Mar 14 10:30:01 crc kubenswrapper[5129]: I0314 10:30:01.698304 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r" event={"ID":"150c4edd-4813-4712-9f17-e67ddda8dc23","Type":"ContainerStarted","Data":"1ae791a9604c42ed3eaa6b39edf0a79949382626e948612439f609129859cb0b"} Mar 14 10:30:01 crc kubenswrapper[5129]: I0314 10:30:01.728235 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r" podStartSLOduration=1.728213531 podStartE2EDuration="1.728213531s" podCreationTimestamp="2026-03-14 10:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:30:01.721102129 +0000 UTC m=+12664.473017313" watchObservedRunningTime="2026-03-14 10:30:01.728213531 +0000 UTC m=+12664.480128715" Mar 14 10:30:01 crc kubenswrapper[5129]: I0314 10:30:01.754823 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558070-gpx85"] Mar 14 10:30:01 crc kubenswrapper[5129]: I0314 10:30:01.798955 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 10:30:02 crc kubenswrapper[5129]: I0314 10:30:02.716350 5129 generic.go:334] "Generic 
(PLEG): container finished" podID="150c4edd-4813-4712-9f17-e67ddda8dc23" containerID="1c1f5d9fc584d20284d74484c6f3073d1a8f5ef6a57f86fd82de4b82a4f802e5" exitCode=0 Mar 14 10:30:02 crc kubenswrapper[5129]: I0314 10:30:02.716406 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r" event={"ID":"150c4edd-4813-4712-9f17-e67ddda8dc23","Type":"ContainerDied","Data":"1c1f5d9fc584d20284d74484c6f3073d1a8f5ef6a57f86fd82de4b82a4f802e5"} Mar 14 10:30:02 crc kubenswrapper[5129]: I0314 10:30:02.720284 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558070-gpx85" event={"ID":"3089d2a9-975b-4b8c-b3a4-6e1079308c9a","Type":"ContainerStarted","Data":"1cd2ca89607c584ea284fa3470cab1bced274f269fba8f12c6e78a138bfb71e4"} Mar 14 10:30:05 crc kubenswrapper[5129]: I0314 10:30:05.245110 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r" Mar 14 10:30:05 crc kubenswrapper[5129]: I0314 10:30:05.319843 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65qlz\" (UniqueName: \"kubernetes.io/projected/150c4edd-4813-4712-9f17-e67ddda8dc23-kube-api-access-65qlz\") pod \"150c4edd-4813-4712-9f17-e67ddda8dc23\" (UID: \"150c4edd-4813-4712-9f17-e67ddda8dc23\") " Mar 14 10:30:05 crc kubenswrapper[5129]: I0314 10:30:05.319886 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/150c4edd-4813-4712-9f17-e67ddda8dc23-secret-volume\") pod \"150c4edd-4813-4712-9f17-e67ddda8dc23\" (UID: \"150c4edd-4813-4712-9f17-e67ddda8dc23\") " Mar 14 10:30:05 crc kubenswrapper[5129]: I0314 10:30:05.320081 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/150c4edd-4813-4712-9f17-e67ddda8dc23-config-volume\") pod \"150c4edd-4813-4712-9f17-e67ddda8dc23\" (UID: \"150c4edd-4813-4712-9f17-e67ddda8dc23\") " Mar 14 10:30:05 crc kubenswrapper[5129]: I0314 10:30:05.320576 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/150c4edd-4813-4712-9f17-e67ddda8dc23-config-volume" (OuterVolumeSpecName: "config-volume") pod "150c4edd-4813-4712-9f17-e67ddda8dc23" (UID: "150c4edd-4813-4712-9f17-e67ddda8dc23"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:30:05 crc kubenswrapper[5129]: I0314 10:30:05.327472 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/150c4edd-4813-4712-9f17-e67ddda8dc23-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "150c4edd-4813-4712-9f17-e67ddda8dc23" (UID: "150c4edd-4813-4712-9f17-e67ddda8dc23"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:30:05 crc kubenswrapper[5129]: I0314 10:30:05.347943 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/150c4edd-4813-4712-9f17-e67ddda8dc23-kube-api-access-65qlz" (OuterVolumeSpecName: "kube-api-access-65qlz") pod "150c4edd-4813-4712-9f17-e67ddda8dc23" (UID: "150c4edd-4813-4712-9f17-e67ddda8dc23"). InnerVolumeSpecName "kube-api-access-65qlz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:30:05 crc kubenswrapper[5129]: I0314 10:30:05.422205 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65qlz\" (UniqueName: \"kubernetes.io/projected/150c4edd-4813-4712-9f17-e67ddda8dc23-kube-api-access-65qlz\") on node \"crc\" DevicePath \"\"" Mar 14 10:30:05 crc kubenswrapper[5129]: I0314 10:30:05.422238 5129 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/150c4edd-4813-4712-9f17-e67ddda8dc23-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 10:30:05 crc kubenswrapper[5129]: I0314 10:30:05.422247 5129 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/150c4edd-4813-4712-9f17-e67ddda8dc23-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 10:30:05 crc kubenswrapper[5129]: I0314 10:30:05.752929 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558070-gpx85" event={"ID":"3089d2a9-975b-4b8c-b3a4-6e1079308c9a","Type":"ContainerStarted","Data":"6dfe3b8530daf8a6d288b018230d7fbcfca9fb4811b8fff7751d190a8c89b322"} Mar 14 10:30:05 crc kubenswrapper[5129]: I0314 10:30:05.761792 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r" event={"ID":"150c4edd-4813-4712-9f17-e67ddda8dc23","Type":"ContainerDied","Data":"1ae791a9604c42ed3eaa6b39edf0a79949382626e948612439f609129859cb0b"} Mar 14 10:30:05 crc kubenswrapper[5129]: I0314 10:30:05.761836 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ae791a9604c42ed3eaa6b39edf0a79949382626e948612439f609129859cb0b" Mar 14 10:30:05 crc kubenswrapper[5129]: I0314 10:30:05.761898 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558070-5rc4r" Mar 14 10:30:05 crc kubenswrapper[5129]: I0314 10:30:05.789636 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558070-gpx85" podStartSLOduration=3.359827336 podStartE2EDuration="5.789593517s" podCreationTimestamp="2026-03-14 10:30:00 +0000 UTC" firstStartedPulling="2026-03-14 10:30:01.79191856 +0000 UTC m=+12664.543833754" lastFinishedPulling="2026-03-14 10:30:04.221684761 +0000 UTC m=+12666.973599935" observedRunningTime="2026-03-14 10:30:05.775545125 +0000 UTC m=+12668.527460309" watchObservedRunningTime="2026-03-14 10:30:05.789593517 +0000 UTC m=+12668.541508711" Mar 14 10:30:06 crc kubenswrapper[5129]: I0314 10:30:06.036910 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:30:06 crc kubenswrapper[5129]: E0314 10:30:06.037255 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:30:06 crc kubenswrapper[5129]: I0314 10:30:06.319538 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp"] Mar 14 10:30:06 crc kubenswrapper[5129]: I0314 10:30:06.331215 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558025-fjsgp"] Mar 14 10:30:06 crc kubenswrapper[5129]: I0314 10:30:06.781540 5129 generic.go:334] "Generic (PLEG): container finished" podID="3089d2a9-975b-4b8c-b3a4-6e1079308c9a" 
containerID="6dfe3b8530daf8a6d288b018230d7fbcfca9fb4811b8fff7751d190a8c89b322" exitCode=0 Mar 14 10:30:06 crc kubenswrapper[5129]: I0314 10:30:06.781587 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558070-gpx85" event={"ID":"3089d2a9-975b-4b8c-b3a4-6e1079308c9a","Type":"ContainerDied","Data":"6dfe3b8530daf8a6d288b018230d7fbcfca9fb4811b8fff7751d190a8c89b322"} Mar 14 10:30:08 crc kubenswrapper[5129]: I0314 10:30:08.062136 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ed61f96-26b6-4993-a226-76f32ee3e8fe" path="/var/lib/kubelet/pods/0ed61f96-26b6-4993-a226-76f32ee3e8fe/volumes" Mar 14 10:30:09 crc kubenswrapper[5129]: I0314 10:30:09.242781 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558070-gpx85" Mar 14 10:30:09 crc kubenswrapper[5129]: I0314 10:30:09.331363 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prl4k\" (UniqueName: \"kubernetes.io/projected/3089d2a9-975b-4b8c-b3a4-6e1079308c9a-kube-api-access-prl4k\") pod \"3089d2a9-975b-4b8c-b3a4-6e1079308c9a\" (UID: \"3089d2a9-975b-4b8c-b3a4-6e1079308c9a\") " Mar 14 10:30:09 crc kubenswrapper[5129]: I0314 10:30:09.339898 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3089d2a9-975b-4b8c-b3a4-6e1079308c9a-kube-api-access-prl4k" (OuterVolumeSpecName: "kube-api-access-prl4k") pod "3089d2a9-975b-4b8c-b3a4-6e1079308c9a" (UID: "3089d2a9-975b-4b8c-b3a4-6e1079308c9a"). InnerVolumeSpecName "kube-api-access-prl4k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:30:09 crc kubenswrapper[5129]: I0314 10:30:09.433707 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prl4k\" (UniqueName: \"kubernetes.io/projected/3089d2a9-975b-4b8c-b3a4-6e1079308c9a-kube-api-access-prl4k\") on node \"crc\" DevicePath \"\"" Mar 14 10:30:09 crc kubenswrapper[5129]: I0314 10:30:09.812513 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558070-gpx85" event={"ID":"3089d2a9-975b-4b8c-b3a4-6e1079308c9a","Type":"ContainerDied","Data":"1cd2ca89607c584ea284fa3470cab1bced274f269fba8f12c6e78a138bfb71e4"} Mar 14 10:30:09 crc kubenswrapper[5129]: I0314 10:30:09.812660 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cd2ca89607c584ea284fa3470cab1bced274f269fba8f12c6e78a138bfb71e4" Mar 14 10:30:09 crc kubenswrapper[5129]: I0314 10:30:09.812715 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558070-gpx85" Mar 14 10:30:10 crc kubenswrapper[5129]: I0314 10:30:10.309297 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558064-csc68"] Mar 14 10:30:10 crc kubenswrapper[5129]: I0314 10:30:10.336021 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558064-csc68"] Mar 14 10:30:12 crc kubenswrapper[5129]: I0314 10:30:12.048102 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ab2d378-6795-4c41-a142-5c26ec16bc1b" path="/var/lib/kubelet/pods/8ab2d378-6795-4c41-a142-5c26ec16bc1b/volumes" Mar 14 10:30:17 crc kubenswrapper[5129]: I0314 10:30:17.036908 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:30:17 crc kubenswrapper[5129]: E0314 10:30:17.037615 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:30:29 crc kubenswrapper[5129]: I0314 10:30:29.036989 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:30:29 crc kubenswrapper[5129]: E0314 10:30:29.038820 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:30:42 crc kubenswrapper[5129]: I0314 10:30:42.037380 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:30:42 crc kubenswrapper[5129]: E0314 10:30:42.038171 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:30:42 crc kubenswrapper[5129]: I0314 10:30:42.786888 5129 scope.go:117] "RemoveContainer" containerID="eeb51d7a699c802757a16ca7d250f7cd7bccf63628d56a8f33712d20e505bf34" Mar 14 10:30:42 crc kubenswrapper[5129]: I0314 10:30:42.834593 5129 scope.go:117] "RemoveContainer" 
containerID="2fa27fa7e32cf9d821cb253ec4e112efb5f921164ab10d2042408a7b48721cbd" Mar 14 10:30:57 crc kubenswrapper[5129]: I0314 10:30:57.037351 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:30:57 crc kubenswrapper[5129]: E0314 10:30:57.038116 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:31:08 crc kubenswrapper[5129]: I0314 10:31:08.047426 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:31:08 crc kubenswrapper[5129]: E0314 10:31:08.048385 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:31:19 crc kubenswrapper[5129]: I0314 10:31:19.036911 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:31:19 crc kubenswrapper[5129]: E0314 10:31:19.039003 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:31:32 crc kubenswrapper[5129]: I0314 10:31:32.036618 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:31:32 crc kubenswrapper[5129]: E0314 10:31:32.037451 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:31:44 crc kubenswrapper[5129]: I0314 10:31:44.037549 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:31:44 crc kubenswrapper[5129]: E0314 10:31:44.039237 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:31:57 crc kubenswrapper[5129]: I0314 10:31:57.965174 5129 generic.go:334] "Generic (PLEG): container finished" podID="4883291a-4eb0-4a7b-82e9-f0caa4f72148" containerID="8acf1a76318c854ec6ad2f10ea67f397fdf6fcae07ad90086f5e92f525cf5e33" exitCode=0 Mar 14 10:31:57 crc kubenswrapper[5129]: I0314 10:31:57.965282 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"4883291a-4eb0-4a7b-82e9-f0caa4f72148","Type":"ContainerDied","Data":"8acf1a76318c854ec6ad2f10ea67f397fdf6fcae07ad90086f5e92f525cf5e33"} Mar 14 10:31:58 crc kubenswrapper[5129]: I0314 10:31:58.066024 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:31:58 crc kubenswrapper[5129]: E0314 10:31:58.066420 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.166461 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558072-bch5s"] Mar 14 10:32:00 crc kubenswrapper[5129]: E0314 10:32:00.167843 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3089d2a9-975b-4b8c-b3a4-6e1079308c9a" containerName="oc" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.167868 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="3089d2a9-975b-4b8c-b3a4-6e1079308c9a" containerName="oc" Mar 14 10:32:00 crc kubenswrapper[5129]: E0314 10:32:00.167960 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="150c4edd-4813-4712-9f17-e67ddda8dc23" containerName="collect-profiles" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.167975 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="150c4edd-4813-4712-9f17-e67ddda8dc23" containerName="collect-profiles" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.168373 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="3089d2a9-975b-4b8c-b3a4-6e1079308c9a" containerName="oc" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 
10:32:00.168422 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="150c4edd-4813-4712-9f17-e67ddda8dc23" containerName="collect-profiles" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.169679 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558072-bch5s" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.173901 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.174472 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.182438 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.192465 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558072-bch5s"] Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.292634 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnllv\" (UniqueName: \"kubernetes.io/projected/1bb00404-30aa-431f-a70b-6d62565f1526-kube-api-access-lnllv\") pod \"auto-csr-approver-29558072-bch5s\" (UID: \"1bb00404-30aa-431f-a70b-6d62565f1526\") " pod="openshift-infra/auto-csr-approver-29558072-bch5s" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.394695 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnllv\" (UniqueName: \"kubernetes.io/projected/1bb00404-30aa-431f-a70b-6d62565f1526-kube-api-access-lnllv\") pod \"auto-csr-approver-29558072-bch5s\" (UID: \"1bb00404-30aa-431f-a70b-6d62565f1526\") " pod="openshift-infra/auto-csr-approver-29558072-bch5s" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.432473 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnllv\" (UniqueName: \"kubernetes.io/projected/1bb00404-30aa-431f-a70b-6d62565f1526-kube-api-access-lnllv\") pod \"auto-csr-approver-29558072-bch5s\" (UID: \"1bb00404-30aa-431f-a70b-6d62565f1526\") " pod="openshift-infra/auto-csr-approver-29558072-bch5s" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.497581 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558072-bch5s" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.631214 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.704057 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-464rc\" (UniqueName: \"kubernetes.io/projected/4883291a-4eb0-4a7b-82e9-f0caa4f72148-kube-api-access-464rc\") pod \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.704175 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.704221 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4883291a-4eb0-4a7b-82e9-f0caa4f72148-test-operator-ephemeral-temporary\") pod \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.704294 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/4883291a-4eb0-4a7b-82e9-f0caa4f72148-openstack-config-secret\") pod \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.704328 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4883291a-4eb0-4a7b-82e9-f0caa4f72148-ca-certs\") pod \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.704522 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4883291a-4eb0-4a7b-82e9-f0caa4f72148-ssh-key\") pod \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.704728 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4883291a-4eb0-4a7b-82e9-f0caa4f72148-config-data\") pod \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.704787 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4883291a-4eb0-4a7b-82e9-f0caa4f72148-test-operator-ephemeral-workdir\") pod \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.704850 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4883291a-4eb0-4a7b-82e9-f0caa4f72148-openstack-config\") pod \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\" (UID: \"4883291a-4eb0-4a7b-82e9-f0caa4f72148\") " Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 
10:32:00.705358 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4883291a-4eb0-4a7b-82e9-f0caa4f72148-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "4883291a-4eb0-4a7b-82e9-f0caa4f72148" (UID: "4883291a-4eb0-4a7b-82e9-f0caa4f72148"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.705770 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4883291a-4eb0-4a7b-82e9-f0caa4f72148-config-data" (OuterVolumeSpecName: "config-data") pod "4883291a-4eb0-4a7b-82e9-f0caa4f72148" (UID: "4883291a-4eb0-4a7b-82e9-f0caa4f72148"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.706690 5129 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4883291a-4eb0-4a7b-82e9-f0caa4f72148-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.706722 5129 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4883291a-4eb0-4a7b-82e9-f0caa4f72148-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.711814 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4883291a-4eb0-4a7b-82e9-f0caa4f72148-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "4883291a-4eb0-4a7b-82e9-f0caa4f72148" (UID: "4883291a-4eb0-4a7b-82e9-f0caa4f72148"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.717501 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "4883291a-4eb0-4a7b-82e9-f0caa4f72148" (UID: "4883291a-4eb0-4a7b-82e9-f0caa4f72148"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.719015 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4883291a-4eb0-4a7b-82e9-f0caa4f72148-kube-api-access-464rc" (OuterVolumeSpecName: "kube-api-access-464rc") pod "4883291a-4eb0-4a7b-82e9-f0caa4f72148" (UID: "4883291a-4eb0-4a7b-82e9-f0caa4f72148"). InnerVolumeSpecName "kube-api-access-464rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.748806 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4883291a-4eb0-4a7b-82e9-f0caa4f72148-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4883291a-4eb0-4a7b-82e9-f0caa4f72148" (UID: "4883291a-4eb0-4a7b-82e9-f0caa4f72148"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.762946 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4883291a-4eb0-4a7b-82e9-f0caa4f72148-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4883291a-4eb0-4a7b-82e9-f0caa4f72148" (UID: "4883291a-4eb0-4a7b-82e9-f0caa4f72148"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.768266 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4883291a-4eb0-4a7b-82e9-f0caa4f72148-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "4883291a-4eb0-4a7b-82e9-f0caa4f72148" (UID: "4883291a-4eb0-4a7b-82e9-f0caa4f72148"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.772200 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4883291a-4eb0-4a7b-82e9-f0caa4f72148-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4883291a-4eb0-4a7b-82e9-f0caa4f72148" (UID: "4883291a-4eb0-4a7b-82e9-f0caa4f72148"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.808335 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4883291a-4eb0-4a7b-82e9-f0caa4f72148-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.808375 5129 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4883291a-4eb0-4a7b-82e9-f0caa4f72148-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.808387 5129 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4883291a-4eb0-4a7b-82e9-f0caa4f72148-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.808398 5129 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4883291a-4eb0-4a7b-82e9-f0caa4f72148-test-operator-ephemeral-workdir\") on node \"crc\" 
DevicePath \"\"" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.808410 5129 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4883291a-4eb0-4a7b-82e9-f0caa4f72148-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.808423 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-464rc\" (UniqueName: \"kubernetes.io/projected/4883291a-4eb0-4a7b-82e9-f0caa4f72148-kube-api-access-464rc\") on node \"crc\" DevicePath \"\"" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.810181 5129 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.836792 5129 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.911802 5129 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.999079 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4883291a-4eb0-4a7b-82e9-f0caa4f72148","Type":"ContainerDied","Data":"493c3ea7f3908f977af9b1bad95e78ece254007f410ba2364bfa2f6d78576088"} Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.999124 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="493c3ea7f3908f977af9b1bad95e78ece254007f410ba2364bfa2f6d78576088" Mar 14 10:32:00 crc kubenswrapper[5129]: I0314 10:32:00.999197 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 14 10:32:01 crc kubenswrapper[5129]: I0314 10:32:01.206046 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558072-bch5s"] Mar 14 10:32:02 crc kubenswrapper[5129]: I0314 10:32:02.033739 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558072-bch5s" event={"ID":"1bb00404-30aa-431f-a70b-6d62565f1526","Type":"ContainerStarted","Data":"5ea214f842bfc587fe824c5c27bdce289bb987ef8c0f02fdc65b59235fd8ab9e"} Mar 14 10:32:03 crc kubenswrapper[5129]: I0314 10:32:03.047104 5129 generic.go:334] "Generic (PLEG): container finished" podID="1bb00404-30aa-431f-a70b-6d62565f1526" containerID="abde1e518bdb5bf06ae20d23cf74e90b5663ce2a8a8061252994f2b5d5201fbb" exitCode=0 Mar 14 10:32:03 crc kubenswrapper[5129]: I0314 10:32:03.047163 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558072-bch5s" event={"ID":"1bb00404-30aa-431f-a70b-6d62565f1526","Type":"ContainerDied","Data":"abde1e518bdb5bf06ae20d23cf74e90b5663ce2a8a8061252994f2b5d5201fbb"} Mar 14 10:32:05 crc kubenswrapper[5129]: I0314 10:32:05.590798 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558072-bch5s" Mar 14 10:32:05 crc kubenswrapper[5129]: I0314 10:32:05.754283 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnllv\" (UniqueName: \"kubernetes.io/projected/1bb00404-30aa-431f-a70b-6d62565f1526-kube-api-access-lnllv\") pod \"1bb00404-30aa-431f-a70b-6d62565f1526\" (UID: \"1bb00404-30aa-431f-a70b-6d62565f1526\") " Mar 14 10:32:05 crc kubenswrapper[5129]: I0314 10:32:05.760527 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bb00404-30aa-431f-a70b-6d62565f1526-kube-api-access-lnllv" (OuterVolumeSpecName: "kube-api-access-lnllv") pod "1bb00404-30aa-431f-a70b-6d62565f1526" (UID: "1bb00404-30aa-431f-a70b-6d62565f1526"). InnerVolumeSpecName "kube-api-access-lnllv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:32:05 crc kubenswrapper[5129]: I0314 10:32:05.856401 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnllv\" (UniqueName: \"kubernetes.io/projected/1bb00404-30aa-431f-a70b-6d62565f1526-kube-api-access-lnllv\") on node \"crc\" DevicePath \"\"" Mar 14 10:32:06 crc kubenswrapper[5129]: I0314 10:32:06.088221 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558072-bch5s" event={"ID":"1bb00404-30aa-431f-a70b-6d62565f1526","Type":"ContainerDied","Data":"5ea214f842bfc587fe824c5c27bdce289bb987ef8c0f02fdc65b59235fd8ab9e"} Mar 14 10:32:06 crc kubenswrapper[5129]: I0314 10:32:06.088685 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ea214f842bfc587fe824c5c27bdce289bb987ef8c0f02fdc65b59235fd8ab9e" Mar 14 10:32:06 crc kubenswrapper[5129]: I0314 10:32:06.088476 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558072-bch5s" Mar 14 10:32:06 crc kubenswrapper[5129]: I0314 10:32:06.659195 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558066-ggh66"] Mar 14 10:32:06 crc kubenswrapper[5129]: I0314 10:32:06.695041 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558066-ggh66"] Mar 14 10:32:08 crc kubenswrapper[5129]: I0314 10:32:08.054056 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2917c36-d4d9-41b9-975b-76f8389b51a5" path="/var/lib/kubelet/pods/d2917c36-d4d9-41b9-975b-76f8389b51a5/volumes" Mar 14 10:32:10 crc kubenswrapper[5129]: I0314 10:32:10.280902 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 14 10:32:10 crc kubenswrapper[5129]: E0314 10:32:10.281543 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4883291a-4eb0-4a7b-82e9-f0caa4f72148" containerName="tempest-tests-tempest-tests-runner" Mar 14 10:32:10 crc kubenswrapper[5129]: I0314 10:32:10.281572 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4883291a-4eb0-4a7b-82e9-f0caa4f72148" containerName="tempest-tests-tempest-tests-runner" Mar 14 10:32:10 crc kubenswrapper[5129]: E0314 10:32:10.281660 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bb00404-30aa-431f-a70b-6d62565f1526" containerName="oc" Mar 14 10:32:10 crc kubenswrapper[5129]: I0314 10:32:10.281674 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb00404-30aa-431f-a70b-6d62565f1526" containerName="oc" Mar 14 10:32:10 crc kubenswrapper[5129]: I0314 10:32:10.282034 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bb00404-30aa-431f-a70b-6d62565f1526" containerName="oc" Mar 14 10:32:10 crc kubenswrapper[5129]: I0314 10:32:10.282099 5129 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4883291a-4eb0-4a7b-82e9-f0caa4f72148" containerName="tempest-tests-tempest-tests-runner" Mar 14 10:32:10 crc kubenswrapper[5129]: I0314 10:32:10.283191 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 10:32:10 crc kubenswrapper[5129]: I0314 10:32:10.287914 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jslnd" Mar 14 10:32:10 crc kubenswrapper[5129]: I0314 10:32:10.298369 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 14 10:32:10 crc kubenswrapper[5129]: I0314 10:32:10.454162 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c44fw\" (UniqueName: \"kubernetes.io/projected/912cd5e3-94b8-4559-a20e-470b124e747a-kube-api-access-c44fw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"912cd5e3-94b8-4559-a20e-470b124e747a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 10:32:10 crc kubenswrapper[5129]: I0314 10:32:10.454226 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"912cd5e3-94b8-4559-a20e-470b124e747a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 10:32:10 crc kubenswrapper[5129]: I0314 10:32:10.555729 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c44fw\" (UniqueName: \"kubernetes.io/projected/912cd5e3-94b8-4559-a20e-470b124e747a-kube-api-access-c44fw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"912cd5e3-94b8-4559-a20e-470b124e747a\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 10:32:10 crc kubenswrapper[5129]: I0314 10:32:10.555781 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"912cd5e3-94b8-4559-a20e-470b124e747a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 10:32:10 crc kubenswrapper[5129]: I0314 10:32:10.558365 5129 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"912cd5e3-94b8-4559-a20e-470b124e747a\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 10:32:10 crc kubenswrapper[5129]: I0314 10:32:10.589331 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c44fw\" (UniqueName: \"kubernetes.io/projected/912cd5e3-94b8-4559-a20e-470b124e747a-kube-api-access-c44fw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"912cd5e3-94b8-4559-a20e-470b124e747a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 10:32:10 crc kubenswrapper[5129]: I0314 10:32:10.592853 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"912cd5e3-94b8-4559-a20e-470b124e747a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 10:32:10 crc kubenswrapper[5129]: I0314 10:32:10.660878 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 10:32:11 crc kubenswrapper[5129]: I0314 10:32:11.038620 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:32:11 crc kubenswrapper[5129]: E0314 10:32:11.039112 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:32:11 crc kubenswrapper[5129]: I0314 10:32:11.364615 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 14 10:32:12 crc kubenswrapper[5129]: I0314 10:32:12.164062 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"912cd5e3-94b8-4559-a20e-470b124e747a","Type":"ContainerStarted","Data":"f76a93a9d7ea7af8489caca28ef04e5c6e1ef0bc932a6825a23f559e169b9027"} Mar 14 10:32:13 crc kubenswrapper[5129]: I0314 10:32:13.177986 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"912cd5e3-94b8-4559-a20e-470b124e747a","Type":"ContainerStarted","Data":"9ead54f0c3b37d88eb20196f2215f89eaefe1c896868a1a7d728e12205106b66"} Mar 14 10:32:13 crc kubenswrapper[5129]: I0314 10:32:13.193097 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.8729390989999999 podStartE2EDuration="3.1930825s" podCreationTimestamp="2026-03-14 10:32:10 +0000 UTC" firstStartedPulling="2026-03-14 
10:32:11.369867334 +0000 UTC m=+12794.121782518" lastFinishedPulling="2026-03-14 10:32:12.690010695 +0000 UTC m=+12795.441925919" observedRunningTime="2026-03-14 10:32:13.191298061 +0000 UTC m=+12795.943213245" watchObservedRunningTime="2026-03-14 10:32:13.1930825 +0000 UTC m=+12795.944997684" Mar 14 10:32:25 crc kubenswrapper[5129]: I0314 10:32:25.036253 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:32:25 crc kubenswrapper[5129]: E0314 10:32:25.037048 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:32:38 crc kubenswrapper[5129]: I0314 10:32:38.044129 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:32:38 crc kubenswrapper[5129]: E0314 10:32:38.044977 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:32:42 crc kubenswrapper[5129]: I0314 10:32:42.972810 5129 scope.go:117] "RemoveContainer" containerID="1675f046a4e1f003ca77b96269cbb3f1050eee1dd6e3afac6f5fc4b6b5cca0ad" Mar 14 10:32:49 crc kubenswrapper[5129]: I0314 10:32:49.035965 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" 
Mar 14 10:32:49 crc kubenswrapper[5129]: E0314 10:32:49.036709 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:33:00 crc kubenswrapper[5129]: I0314 10:33:00.036686 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:33:00 crc kubenswrapper[5129]: E0314 10:33:00.038240 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:33:14 crc kubenswrapper[5129]: I0314 10:33:14.036661 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:33:14 crc kubenswrapper[5129]: E0314 10:33:14.038467 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:33:27 crc kubenswrapper[5129]: I0314 10:33:27.912164 5129 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-kjdpq"] Mar 14 10:33:27 crc kubenswrapper[5129]: I0314 10:33:27.916933 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjdpq" Mar 14 10:33:27 crc kubenswrapper[5129]: I0314 10:33:27.929557 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjdpq"] Mar 14 10:33:28 crc kubenswrapper[5129]: I0314 10:33:28.043750 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:33:28 crc kubenswrapper[5129]: E0314 10:33:28.043995 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:33:28 crc kubenswrapper[5129]: I0314 10:33:28.061519 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b775e98-0df5-49bc-b929-61bd0c676412-catalog-content\") pod \"redhat-marketplace-kjdpq\" (UID: \"2b775e98-0df5-49bc-b929-61bd0c676412\") " pod="openshift-marketplace/redhat-marketplace-kjdpq" Mar 14 10:33:28 crc kubenswrapper[5129]: I0314 10:33:28.061575 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5jrp\" (UniqueName: \"kubernetes.io/projected/2b775e98-0df5-49bc-b929-61bd0c676412-kube-api-access-z5jrp\") pod \"redhat-marketplace-kjdpq\" (UID: \"2b775e98-0df5-49bc-b929-61bd0c676412\") " pod="openshift-marketplace/redhat-marketplace-kjdpq" Mar 14 10:33:28 crc kubenswrapper[5129]: I0314 10:33:28.061673 5129 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b775e98-0df5-49bc-b929-61bd0c676412-utilities\") pod \"redhat-marketplace-kjdpq\" (UID: \"2b775e98-0df5-49bc-b929-61bd0c676412\") " pod="openshift-marketplace/redhat-marketplace-kjdpq" Mar 14 10:33:28 crc kubenswrapper[5129]: I0314 10:33:28.163058 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b775e98-0df5-49bc-b929-61bd0c676412-catalog-content\") pod \"redhat-marketplace-kjdpq\" (UID: \"2b775e98-0df5-49bc-b929-61bd0c676412\") " pod="openshift-marketplace/redhat-marketplace-kjdpq" Mar 14 10:33:28 crc kubenswrapper[5129]: I0314 10:33:28.163110 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5jrp\" (UniqueName: \"kubernetes.io/projected/2b775e98-0df5-49bc-b929-61bd0c676412-kube-api-access-z5jrp\") pod \"redhat-marketplace-kjdpq\" (UID: \"2b775e98-0df5-49bc-b929-61bd0c676412\") " pod="openshift-marketplace/redhat-marketplace-kjdpq" Mar 14 10:33:28 crc kubenswrapper[5129]: I0314 10:33:28.163210 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b775e98-0df5-49bc-b929-61bd0c676412-utilities\") pod \"redhat-marketplace-kjdpq\" (UID: \"2b775e98-0df5-49bc-b929-61bd0c676412\") " pod="openshift-marketplace/redhat-marketplace-kjdpq" Mar 14 10:33:28 crc kubenswrapper[5129]: I0314 10:33:28.164144 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b775e98-0df5-49bc-b929-61bd0c676412-catalog-content\") pod \"redhat-marketplace-kjdpq\" (UID: \"2b775e98-0df5-49bc-b929-61bd0c676412\") " pod="openshift-marketplace/redhat-marketplace-kjdpq" Mar 14 10:33:28 crc kubenswrapper[5129]: I0314 10:33:28.164425 5129 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b775e98-0df5-49bc-b929-61bd0c676412-utilities\") pod \"redhat-marketplace-kjdpq\" (UID: \"2b775e98-0df5-49bc-b929-61bd0c676412\") " pod="openshift-marketplace/redhat-marketplace-kjdpq" Mar 14 10:33:28 crc kubenswrapper[5129]: I0314 10:33:28.183864 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5jrp\" (UniqueName: \"kubernetes.io/projected/2b775e98-0df5-49bc-b929-61bd0c676412-kube-api-access-z5jrp\") pod \"redhat-marketplace-kjdpq\" (UID: \"2b775e98-0df5-49bc-b929-61bd0c676412\") " pod="openshift-marketplace/redhat-marketplace-kjdpq" Mar 14 10:33:28 crc kubenswrapper[5129]: I0314 10:33:28.241978 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjdpq" Mar 14 10:33:29 crc kubenswrapper[5129]: I0314 10:33:29.181954 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjdpq"] Mar 14 10:33:30 crc kubenswrapper[5129]: I0314 10:33:30.023053 5129 generic.go:334] "Generic (PLEG): container finished" podID="2b775e98-0df5-49bc-b929-61bd0c676412" containerID="93f36916471098e4a90711f030dc4b595a66762d75a4add6de58663383a84e53" exitCode=0 Mar 14 10:33:30 crc kubenswrapper[5129]: I0314 10:33:30.023128 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjdpq" event={"ID":"2b775e98-0df5-49bc-b929-61bd0c676412","Type":"ContainerDied","Data":"93f36916471098e4a90711f030dc4b595a66762d75a4add6de58663383a84e53"} Mar 14 10:33:30 crc kubenswrapper[5129]: I0314 10:33:30.023305 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjdpq" event={"ID":"2b775e98-0df5-49bc-b929-61bd0c676412","Type":"ContainerStarted","Data":"e794a6548526f7963129cb17d6890b2029bc550d67b12178f058d7caf68a0a8a"} Mar 14 10:33:31 crc 
kubenswrapper[5129]: I0314 10:33:31.040179 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjdpq" event={"ID":"2b775e98-0df5-49bc-b929-61bd0c676412","Type":"ContainerStarted","Data":"db2f9177cd3c539e7abf3f86ea8faa84678c13eed5803c7d67d7af2681b93a52"} Mar 14 10:33:31 crc kubenswrapper[5129]: I0314 10:33:31.060382 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g4f4t/must-gather-zcqk2"] Mar 14 10:33:31 crc kubenswrapper[5129]: I0314 10:33:31.062987 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g4f4t/must-gather-zcqk2" Mar 14 10:33:31 crc kubenswrapper[5129]: I0314 10:33:31.073532 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-g4f4t"/"openshift-service-ca.crt" Mar 14 10:33:31 crc kubenswrapper[5129]: I0314 10:33:31.073801 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-g4f4t"/"default-dockercfg-xx9pj" Mar 14 10:33:31 crc kubenswrapper[5129]: I0314 10:33:31.073951 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-g4f4t"/"kube-root-ca.crt" Mar 14 10:33:31 crc kubenswrapper[5129]: I0314 10:33:31.091468 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g4f4t/must-gather-zcqk2"] Mar 14 10:33:31 crc kubenswrapper[5129]: I0314 10:33:31.233424 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f8038849-2f50-4aa8-a28e-9a6b346f12e9-must-gather-output\") pod \"must-gather-zcqk2\" (UID: \"f8038849-2f50-4aa8-a28e-9a6b346f12e9\") " pod="openshift-must-gather-g4f4t/must-gather-zcqk2" Mar 14 10:33:31 crc kubenswrapper[5129]: I0314 10:33:31.233548 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k24t9\" 
(UniqueName: \"kubernetes.io/projected/f8038849-2f50-4aa8-a28e-9a6b346f12e9-kube-api-access-k24t9\") pod \"must-gather-zcqk2\" (UID: \"f8038849-2f50-4aa8-a28e-9a6b346f12e9\") " pod="openshift-must-gather-g4f4t/must-gather-zcqk2" Mar 14 10:33:31 crc kubenswrapper[5129]: I0314 10:33:31.336575 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f8038849-2f50-4aa8-a28e-9a6b346f12e9-must-gather-output\") pod \"must-gather-zcqk2\" (UID: \"f8038849-2f50-4aa8-a28e-9a6b346f12e9\") " pod="openshift-must-gather-g4f4t/must-gather-zcqk2" Mar 14 10:33:31 crc kubenswrapper[5129]: I0314 10:33:31.336672 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k24t9\" (UniqueName: \"kubernetes.io/projected/f8038849-2f50-4aa8-a28e-9a6b346f12e9-kube-api-access-k24t9\") pod \"must-gather-zcqk2\" (UID: \"f8038849-2f50-4aa8-a28e-9a6b346f12e9\") " pod="openshift-must-gather-g4f4t/must-gather-zcqk2" Mar 14 10:33:31 crc kubenswrapper[5129]: I0314 10:33:31.337161 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f8038849-2f50-4aa8-a28e-9a6b346f12e9-must-gather-output\") pod \"must-gather-zcqk2\" (UID: \"f8038849-2f50-4aa8-a28e-9a6b346f12e9\") " pod="openshift-must-gather-g4f4t/must-gather-zcqk2" Mar 14 10:33:31 crc kubenswrapper[5129]: I0314 10:33:31.355114 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k24t9\" (UniqueName: \"kubernetes.io/projected/f8038849-2f50-4aa8-a28e-9a6b346f12e9-kube-api-access-k24t9\") pod \"must-gather-zcqk2\" (UID: \"f8038849-2f50-4aa8-a28e-9a6b346f12e9\") " pod="openshift-must-gather-g4f4t/must-gather-zcqk2" Mar 14 10:33:31 crc kubenswrapper[5129]: I0314 10:33:31.398307 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g4f4t/must-gather-zcqk2" Mar 14 10:33:32 crc kubenswrapper[5129]: I0314 10:33:32.058911 5129 generic.go:334] "Generic (PLEG): container finished" podID="2b775e98-0df5-49bc-b929-61bd0c676412" containerID="db2f9177cd3c539e7abf3f86ea8faa84678c13eed5803c7d67d7af2681b93a52" exitCode=0 Mar 14 10:33:32 crc kubenswrapper[5129]: I0314 10:33:32.059954 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjdpq" event={"ID":"2b775e98-0df5-49bc-b929-61bd0c676412","Type":"ContainerDied","Data":"db2f9177cd3c539e7abf3f86ea8faa84678c13eed5803c7d67d7af2681b93a52"} Mar 14 10:33:32 crc kubenswrapper[5129]: I0314 10:33:32.270525 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g4f4t/must-gather-zcqk2"] Mar 14 10:33:33 crc kubenswrapper[5129]: I0314 10:33:33.078024 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjdpq" event={"ID":"2b775e98-0df5-49bc-b929-61bd0c676412","Type":"ContainerStarted","Data":"6067a51bbb9b2d4d8570a51049f43e9045ede48934626998d379a03445fc3fe7"} Mar 14 10:33:33 crc kubenswrapper[5129]: I0314 10:33:33.082557 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4f4t/must-gather-zcqk2" event={"ID":"f8038849-2f50-4aa8-a28e-9a6b346f12e9","Type":"ContainerStarted","Data":"de5286213fea0aa562fd5a005a9992c7e5a68c4fb7ecff43e317cd2ab27ee691"} Mar 14 10:33:33 crc kubenswrapper[5129]: I0314 10:33:33.104419 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kjdpq" podStartSLOduration=3.66276621 podStartE2EDuration="6.104398631s" podCreationTimestamp="2026-03-14 10:33:27 +0000 UTC" firstStartedPulling="2026-03-14 10:33:30.025664227 +0000 UTC m=+12872.777579491" lastFinishedPulling="2026-03-14 10:33:32.467296728 +0000 UTC m=+12875.219211912" observedRunningTime="2026-03-14 10:33:33.094530012 +0000 UTC 
m=+12875.846445216" watchObservedRunningTime="2026-03-14 10:33:33.104398631 +0000 UTC m=+12875.856313825" Mar 14 10:33:38 crc kubenswrapper[5129]: I0314 10:33:38.242315 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kjdpq" Mar 14 10:33:38 crc kubenswrapper[5129]: I0314 10:33:38.242545 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kjdpq" Mar 14 10:33:38 crc kubenswrapper[5129]: I0314 10:33:38.311076 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kjdpq" Mar 14 10:33:39 crc kubenswrapper[5129]: I0314 10:33:39.212039 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kjdpq" Mar 14 10:33:39 crc kubenswrapper[5129]: I0314 10:33:39.289334 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjdpq"] Mar 14 10:33:40 crc kubenswrapper[5129]: I0314 10:33:40.177740 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4f4t/must-gather-zcqk2" event={"ID":"f8038849-2f50-4aa8-a28e-9a6b346f12e9","Type":"ContainerStarted","Data":"ae27cc552a65078897ff027698bfea5c4d2bd188b407d2bfaed3967ba151d4ab"} Mar 14 10:33:41 crc kubenswrapper[5129]: I0314 10:33:41.037037 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:33:41 crc kubenswrapper[5129]: E0314 10:33:41.037776 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:33:41 crc kubenswrapper[5129]: I0314 10:33:41.191395 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4f4t/must-gather-zcqk2" event={"ID":"f8038849-2f50-4aa8-a28e-9a6b346f12e9","Type":"ContainerStarted","Data":"e06f467eafe274ec8e5cc7181f490aebb536e0b420b9a00c0bd8e2f08bc15c48"} Mar 14 10:33:41 crc kubenswrapper[5129]: I0314 10:33:41.191551 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kjdpq" podUID="2b775e98-0df5-49bc-b929-61bd0c676412" containerName="registry-server" containerID="cri-o://6067a51bbb9b2d4d8570a51049f43e9045ede48934626998d379a03445fc3fe7" gracePeriod=2 Mar 14 10:33:41 crc kubenswrapper[5129]: I0314 10:33:41.221659 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g4f4t/must-gather-zcqk2" podStartSLOduration=2.611211966 podStartE2EDuration="10.221557809s" podCreationTimestamp="2026-03-14 10:33:31 +0000 UTC" firstStartedPulling="2026-03-14 10:33:32.274498704 +0000 UTC m=+12875.026413888" lastFinishedPulling="2026-03-14 10:33:39.884844537 +0000 UTC m=+12882.636759731" observedRunningTime="2026-03-14 10:33:41.208117164 +0000 UTC m=+12883.960032348" watchObservedRunningTime="2026-03-14 10:33:41.221557809 +0000 UTC m=+12883.973472993" Mar 14 10:33:42 crc kubenswrapper[5129]: I0314 10:33:42.202377 5129 generic.go:334] "Generic (PLEG): container finished" podID="2b775e98-0df5-49bc-b929-61bd0c676412" containerID="6067a51bbb9b2d4d8570a51049f43e9045ede48934626998d379a03445fc3fe7" exitCode=0 Mar 14 10:33:42 crc kubenswrapper[5129]: I0314 10:33:42.202453 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjdpq" event={"ID":"2b775e98-0df5-49bc-b929-61bd0c676412","Type":"ContainerDied","Data":"6067a51bbb9b2d4d8570a51049f43e9045ede48934626998d379a03445fc3fe7"} Mar 14 10:33:42 crc kubenswrapper[5129]: I0314 
10:33:42.823176 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjdpq" Mar 14 10:33:42 crc kubenswrapper[5129]: I0314 10:33:42.906317 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b775e98-0df5-49bc-b929-61bd0c676412-catalog-content\") pod \"2b775e98-0df5-49bc-b929-61bd0c676412\" (UID: \"2b775e98-0df5-49bc-b929-61bd0c676412\") " Mar 14 10:33:42 crc kubenswrapper[5129]: I0314 10:33:42.906398 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5jrp\" (UniqueName: \"kubernetes.io/projected/2b775e98-0df5-49bc-b929-61bd0c676412-kube-api-access-z5jrp\") pod \"2b775e98-0df5-49bc-b929-61bd0c676412\" (UID: \"2b775e98-0df5-49bc-b929-61bd0c676412\") " Mar 14 10:33:42 crc kubenswrapper[5129]: I0314 10:33:42.906466 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b775e98-0df5-49bc-b929-61bd0c676412-utilities\") pod \"2b775e98-0df5-49bc-b929-61bd0c676412\" (UID: \"2b775e98-0df5-49bc-b929-61bd0c676412\") " Mar 14 10:33:42 crc kubenswrapper[5129]: I0314 10:33:42.911749 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b775e98-0df5-49bc-b929-61bd0c676412-utilities" (OuterVolumeSpecName: "utilities") pod "2b775e98-0df5-49bc-b929-61bd0c676412" (UID: "2b775e98-0df5-49bc-b929-61bd0c676412"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:33:42 crc kubenswrapper[5129]: I0314 10:33:42.915831 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b775e98-0df5-49bc-b929-61bd0c676412-kube-api-access-z5jrp" (OuterVolumeSpecName: "kube-api-access-z5jrp") pod "2b775e98-0df5-49bc-b929-61bd0c676412" (UID: "2b775e98-0df5-49bc-b929-61bd0c676412"). InnerVolumeSpecName "kube-api-access-z5jrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:33:42 crc kubenswrapper[5129]: I0314 10:33:42.932745 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b775e98-0df5-49bc-b929-61bd0c676412-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b775e98-0df5-49bc-b929-61bd0c676412" (UID: "2b775e98-0df5-49bc-b929-61bd0c676412"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:33:43 crc kubenswrapper[5129]: I0314 10:33:43.008732 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b775e98-0df5-49bc-b929-61bd0c676412-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 10:33:43 crc kubenswrapper[5129]: I0314 10:33:43.008766 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5jrp\" (UniqueName: \"kubernetes.io/projected/2b775e98-0df5-49bc-b929-61bd0c676412-kube-api-access-z5jrp\") on node \"crc\" DevicePath \"\"" Mar 14 10:33:43 crc kubenswrapper[5129]: I0314 10:33:43.008778 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b775e98-0df5-49bc-b929-61bd0c676412-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 10:33:43 crc kubenswrapper[5129]: I0314 10:33:43.213759 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjdpq" 
event={"ID":"2b775e98-0df5-49bc-b929-61bd0c676412","Type":"ContainerDied","Data":"e794a6548526f7963129cb17d6890b2029bc550d67b12178f058d7caf68a0a8a"} Mar 14 10:33:43 crc kubenswrapper[5129]: I0314 10:33:43.214363 5129 scope.go:117] "RemoveContainer" containerID="6067a51bbb9b2d4d8570a51049f43e9045ede48934626998d379a03445fc3fe7" Mar 14 10:33:43 crc kubenswrapper[5129]: I0314 10:33:43.213870 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjdpq" Mar 14 10:33:43 crc kubenswrapper[5129]: I0314 10:33:43.233551 5129 scope.go:117] "RemoveContainer" containerID="db2f9177cd3c539e7abf3f86ea8faa84678c13eed5803c7d67d7af2681b93a52" Mar 14 10:33:43 crc kubenswrapper[5129]: I0314 10:33:43.262398 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjdpq"] Mar 14 10:33:43 crc kubenswrapper[5129]: I0314 10:33:43.262811 5129 scope.go:117] "RemoveContainer" containerID="93f36916471098e4a90711f030dc4b595a66762d75a4add6de58663383a84e53" Mar 14 10:33:43 crc kubenswrapper[5129]: I0314 10:33:43.272157 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjdpq"] Mar 14 10:33:44 crc kubenswrapper[5129]: I0314 10:33:44.052577 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b775e98-0df5-49bc-b929-61bd0c676412" path="/var/lib/kubelet/pods/2b775e98-0df5-49bc-b929-61bd0c676412/volumes" Mar 14 10:33:46 crc kubenswrapper[5129]: I0314 10:33:46.064195 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g4f4t/crc-debug-ws7fp"] Mar 14 10:33:46 crc kubenswrapper[5129]: E0314 10:33:46.064829 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b775e98-0df5-49bc-b929-61bd0c676412" containerName="extract-utilities" Mar 14 10:33:46 crc kubenswrapper[5129]: I0314 10:33:46.064841 5129 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2b775e98-0df5-49bc-b929-61bd0c676412" containerName="extract-utilities" Mar 14 10:33:46 crc kubenswrapper[5129]: E0314 10:33:46.064863 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b775e98-0df5-49bc-b929-61bd0c676412" containerName="extract-content" Mar 14 10:33:46 crc kubenswrapper[5129]: I0314 10:33:46.064870 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b775e98-0df5-49bc-b929-61bd0c676412" containerName="extract-content" Mar 14 10:33:46 crc kubenswrapper[5129]: E0314 10:33:46.064894 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b775e98-0df5-49bc-b929-61bd0c676412" containerName="registry-server" Mar 14 10:33:46 crc kubenswrapper[5129]: I0314 10:33:46.064899 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b775e98-0df5-49bc-b929-61bd0c676412" containerName="registry-server" Mar 14 10:33:46 crc kubenswrapper[5129]: I0314 10:33:46.065108 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b775e98-0df5-49bc-b929-61bd0c676412" containerName="registry-server" Mar 14 10:33:46 crc kubenswrapper[5129]: I0314 10:33:46.066121 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g4f4t/crc-debug-ws7fp" Mar 14 10:33:46 crc kubenswrapper[5129]: I0314 10:33:46.177843 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89a9bb44-9832-46e0-a812-51ab20ad7a85-host\") pod \"crc-debug-ws7fp\" (UID: \"89a9bb44-9832-46e0-a812-51ab20ad7a85\") " pod="openshift-must-gather-g4f4t/crc-debug-ws7fp" Mar 14 10:33:46 crc kubenswrapper[5129]: I0314 10:33:46.177888 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjx2b\" (UniqueName: \"kubernetes.io/projected/89a9bb44-9832-46e0-a812-51ab20ad7a85-kube-api-access-gjx2b\") pod \"crc-debug-ws7fp\" (UID: \"89a9bb44-9832-46e0-a812-51ab20ad7a85\") " pod="openshift-must-gather-g4f4t/crc-debug-ws7fp" Mar 14 10:33:46 crc kubenswrapper[5129]: I0314 10:33:46.279524 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89a9bb44-9832-46e0-a812-51ab20ad7a85-host\") pod \"crc-debug-ws7fp\" (UID: \"89a9bb44-9832-46e0-a812-51ab20ad7a85\") " pod="openshift-must-gather-g4f4t/crc-debug-ws7fp" Mar 14 10:33:46 crc kubenswrapper[5129]: I0314 10:33:46.279566 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjx2b\" (UniqueName: \"kubernetes.io/projected/89a9bb44-9832-46e0-a812-51ab20ad7a85-kube-api-access-gjx2b\") pod \"crc-debug-ws7fp\" (UID: \"89a9bb44-9832-46e0-a812-51ab20ad7a85\") " pod="openshift-must-gather-g4f4t/crc-debug-ws7fp" Mar 14 10:33:46 crc kubenswrapper[5129]: I0314 10:33:46.281549 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89a9bb44-9832-46e0-a812-51ab20ad7a85-host\") pod \"crc-debug-ws7fp\" (UID: \"89a9bb44-9832-46e0-a812-51ab20ad7a85\") " pod="openshift-must-gather-g4f4t/crc-debug-ws7fp" Mar 14 10:33:46 crc 
kubenswrapper[5129]: I0314 10:33:46.309963 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjx2b\" (UniqueName: \"kubernetes.io/projected/89a9bb44-9832-46e0-a812-51ab20ad7a85-kube-api-access-gjx2b\") pod \"crc-debug-ws7fp\" (UID: \"89a9bb44-9832-46e0-a812-51ab20ad7a85\") " pod="openshift-must-gather-g4f4t/crc-debug-ws7fp" Mar 14 10:33:46 crc kubenswrapper[5129]: I0314 10:33:46.386669 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g4f4t/crc-debug-ws7fp" Mar 14 10:33:46 crc kubenswrapper[5129]: W0314 10:33:46.423442 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89a9bb44_9832_46e0_a812_51ab20ad7a85.slice/crio-0f9886aa3a0e34edeed96702907f560453a2720029deb38cff5ae1cfc84a1a95 WatchSource:0}: Error finding container 0f9886aa3a0e34edeed96702907f560453a2720029deb38cff5ae1cfc84a1a95: Status 404 returned error can't find the container with id 0f9886aa3a0e34edeed96702907f560453a2720029deb38cff5ae1cfc84a1a95 Mar 14 10:33:47 crc kubenswrapper[5129]: I0314 10:33:47.285721 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4f4t/crc-debug-ws7fp" event={"ID":"89a9bb44-9832-46e0-a812-51ab20ad7a85","Type":"ContainerStarted","Data":"0f9886aa3a0e34edeed96702907f560453a2720029deb38cff5ae1cfc84a1a95"} Mar 14 10:33:55 crc kubenswrapper[5129]: I0314 10:33:55.036162 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:33:55 crc kubenswrapper[5129]: E0314 10:33:55.036861 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:33:57 crc kubenswrapper[5129]: I0314 10:33:57.406892 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4f4t/crc-debug-ws7fp" event={"ID":"89a9bb44-9832-46e0-a812-51ab20ad7a85","Type":"ContainerStarted","Data":"458eb38c5bcb0112756de448bf0a6f01e1580a74b67c04bedc417134aaeb934b"} Mar 14 10:33:57 crc kubenswrapper[5129]: I0314 10:33:57.422706 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g4f4t/crc-debug-ws7fp" podStartSLOduration=0.689342598 podStartE2EDuration="11.422690352s" podCreationTimestamp="2026-03-14 10:33:46 +0000 UTC" firstStartedPulling="2026-03-14 10:33:46.424795248 +0000 UTC m=+12889.176710422" lastFinishedPulling="2026-03-14 10:33:57.158142992 +0000 UTC m=+12899.910058176" observedRunningTime="2026-03-14 10:33:57.41820082 +0000 UTC m=+12900.170116004" watchObservedRunningTime="2026-03-14 10:33:57.422690352 +0000 UTC m=+12900.174605536" Mar 14 10:34:00 crc kubenswrapper[5129]: I0314 10:34:00.156760 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558074-jhqvj"] Mar 14 10:34:00 crc kubenswrapper[5129]: I0314 10:34:00.164412 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558074-jhqvj" Mar 14 10:34:00 crc kubenswrapper[5129]: I0314 10:34:00.167981 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:34:00 crc kubenswrapper[5129]: I0314 10:34:00.168188 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:34:00 crc kubenswrapper[5129]: I0314 10:34:00.168340 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:34:00 crc kubenswrapper[5129]: I0314 10:34:00.202479 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558074-jhqvj"] Mar 14 10:34:00 crc kubenswrapper[5129]: I0314 10:34:00.309534 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzrj2\" (UniqueName: \"kubernetes.io/projected/598c8171-4e5b-4d59-a1bb-ace53444a2a5-kube-api-access-wzrj2\") pod \"auto-csr-approver-29558074-jhqvj\" (UID: \"598c8171-4e5b-4d59-a1bb-ace53444a2a5\") " pod="openshift-infra/auto-csr-approver-29558074-jhqvj" Mar 14 10:34:00 crc kubenswrapper[5129]: I0314 10:34:00.412292 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzrj2\" (UniqueName: \"kubernetes.io/projected/598c8171-4e5b-4d59-a1bb-ace53444a2a5-kube-api-access-wzrj2\") pod \"auto-csr-approver-29558074-jhqvj\" (UID: \"598c8171-4e5b-4d59-a1bb-ace53444a2a5\") " pod="openshift-infra/auto-csr-approver-29558074-jhqvj" Mar 14 10:34:00 crc kubenswrapper[5129]: I0314 10:34:00.451902 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzrj2\" (UniqueName: \"kubernetes.io/projected/598c8171-4e5b-4d59-a1bb-ace53444a2a5-kube-api-access-wzrj2\") pod \"auto-csr-approver-29558074-jhqvj\" (UID: \"598c8171-4e5b-4d59-a1bb-ace53444a2a5\") " 
pod="openshift-infra/auto-csr-approver-29558074-jhqvj" Mar 14 10:34:00 crc kubenswrapper[5129]: I0314 10:34:00.486392 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558074-jhqvj" Mar 14 10:34:01 crc kubenswrapper[5129]: I0314 10:34:01.241022 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558074-jhqvj"] Mar 14 10:34:01 crc kubenswrapper[5129]: W0314 10:34:01.253341 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod598c8171_4e5b_4d59_a1bb_ace53444a2a5.slice/crio-f6539c62d3af98fd8d6910192bf51dba2b3b4d6402683539f70c21a5fb30b5ab WatchSource:0}: Error finding container f6539c62d3af98fd8d6910192bf51dba2b3b4d6402683539f70c21a5fb30b5ab: Status 404 returned error can't find the container with id f6539c62d3af98fd8d6910192bf51dba2b3b4d6402683539f70c21a5fb30b5ab Mar 14 10:34:01 crc kubenswrapper[5129]: I0314 10:34:01.456045 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558074-jhqvj" event={"ID":"598c8171-4e5b-4d59-a1bb-ace53444a2a5","Type":"ContainerStarted","Data":"f6539c62d3af98fd8d6910192bf51dba2b3b4d6402683539f70c21a5fb30b5ab"} Mar 14 10:34:05 crc kubenswrapper[5129]: I0314 10:34:05.494759 5129 generic.go:334] "Generic (PLEG): container finished" podID="598c8171-4e5b-4d59-a1bb-ace53444a2a5" containerID="0569d9f35d84e09f7f0162296da1f20d5ba425c78509af7a6942bef423bd0d7b" exitCode=0 Mar 14 10:34:05 crc kubenswrapper[5129]: I0314 10:34:05.494821 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558074-jhqvj" event={"ID":"598c8171-4e5b-4d59-a1bb-ace53444a2a5","Type":"ContainerDied","Data":"0569d9f35d84e09f7f0162296da1f20d5ba425c78509af7a6942bef423bd0d7b"} Mar 14 10:34:06 crc kubenswrapper[5129]: I0314 10:34:06.130509 5129 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-mm9jw"] Mar 14 10:34:06 crc kubenswrapper[5129]: I0314 10:34:06.137531 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mm9jw" Mar 14 10:34:06 crc kubenswrapper[5129]: I0314 10:34:06.145496 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mm9jw"] Mar 14 10:34:06 crc kubenswrapper[5129]: I0314 10:34:06.258924 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3742942-3be8-427d-9c75-30d1ade4167c-utilities\") pod \"community-operators-mm9jw\" (UID: \"a3742942-3be8-427d-9c75-30d1ade4167c\") " pod="openshift-marketplace/community-operators-mm9jw" Mar 14 10:34:06 crc kubenswrapper[5129]: I0314 10:34:06.259100 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3742942-3be8-427d-9c75-30d1ade4167c-catalog-content\") pod \"community-operators-mm9jw\" (UID: \"a3742942-3be8-427d-9c75-30d1ade4167c\") " pod="openshift-marketplace/community-operators-mm9jw" Mar 14 10:34:06 crc kubenswrapper[5129]: I0314 10:34:06.259131 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5ss4\" (UniqueName: \"kubernetes.io/projected/a3742942-3be8-427d-9c75-30d1ade4167c-kube-api-access-h5ss4\") pod \"community-operators-mm9jw\" (UID: \"a3742942-3be8-427d-9c75-30d1ade4167c\") " pod="openshift-marketplace/community-operators-mm9jw" Mar 14 10:34:06 crc kubenswrapper[5129]: I0314 10:34:06.360654 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3742942-3be8-427d-9c75-30d1ade4167c-catalog-content\") pod \"community-operators-mm9jw\" (UID: 
\"a3742942-3be8-427d-9c75-30d1ade4167c\") " pod="openshift-marketplace/community-operators-mm9jw" Mar 14 10:34:06 crc kubenswrapper[5129]: I0314 10:34:06.360953 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5ss4\" (UniqueName: \"kubernetes.io/projected/a3742942-3be8-427d-9c75-30d1ade4167c-kube-api-access-h5ss4\") pod \"community-operators-mm9jw\" (UID: \"a3742942-3be8-427d-9c75-30d1ade4167c\") " pod="openshift-marketplace/community-operators-mm9jw" Mar 14 10:34:06 crc kubenswrapper[5129]: I0314 10:34:06.361029 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3742942-3be8-427d-9c75-30d1ade4167c-utilities\") pod \"community-operators-mm9jw\" (UID: \"a3742942-3be8-427d-9c75-30d1ade4167c\") " pod="openshift-marketplace/community-operators-mm9jw" Mar 14 10:34:06 crc kubenswrapper[5129]: I0314 10:34:06.361631 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3742942-3be8-427d-9c75-30d1ade4167c-catalog-content\") pod \"community-operators-mm9jw\" (UID: \"a3742942-3be8-427d-9c75-30d1ade4167c\") " pod="openshift-marketplace/community-operators-mm9jw" Mar 14 10:34:06 crc kubenswrapper[5129]: I0314 10:34:06.361775 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3742942-3be8-427d-9c75-30d1ade4167c-utilities\") pod \"community-operators-mm9jw\" (UID: \"a3742942-3be8-427d-9c75-30d1ade4167c\") " pod="openshift-marketplace/community-operators-mm9jw" Mar 14 10:34:06 crc kubenswrapper[5129]: I0314 10:34:06.385047 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5ss4\" (UniqueName: \"kubernetes.io/projected/a3742942-3be8-427d-9c75-30d1ade4167c-kube-api-access-h5ss4\") pod \"community-operators-mm9jw\" (UID: 
\"a3742942-3be8-427d-9c75-30d1ade4167c\") " pod="openshift-marketplace/community-operators-mm9jw" Mar 14 10:34:06 crc kubenswrapper[5129]: I0314 10:34:06.465507 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mm9jw" Mar 14 10:34:07 crc kubenswrapper[5129]: I0314 10:34:07.628520 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mm9jw"] Mar 14 10:34:08 crc kubenswrapper[5129]: I0314 10:34:08.055143 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558074-jhqvj" Mar 14 10:34:08 crc kubenswrapper[5129]: I0314 10:34:08.199436 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzrj2\" (UniqueName: \"kubernetes.io/projected/598c8171-4e5b-4d59-a1bb-ace53444a2a5-kube-api-access-wzrj2\") pod \"598c8171-4e5b-4d59-a1bb-ace53444a2a5\" (UID: \"598c8171-4e5b-4d59-a1bb-ace53444a2a5\") " Mar 14 10:34:08 crc kubenswrapper[5129]: I0314 10:34:08.211861 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598c8171-4e5b-4d59-a1bb-ace53444a2a5-kube-api-access-wzrj2" (OuterVolumeSpecName: "kube-api-access-wzrj2") pod "598c8171-4e5b-4d59-a1bb-ace53444a2a5" (UID: "598c8171-4e5b-4d59-a1bb-ace53444a2a5"). InnerVolumeSpecName "kube-api-access-wzrj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:34:08 crc kubenswrapper[5129]: I0314 10:34:08.302348 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzrj2\" (UniqueName: \"kubernetes.io/projected/598c8171-4e5b-4d59-a1bb-ace53444a2a5-kube-api-access-wzrj2\") on node \"crc\" DevicePath \"\"" Mar 14 10:34:08 crc kubenswrapper[5129]: I0314 10:34:08.542152 5129 generic.go:334] "Generic (PLEG): container finished" podID="a3742942-3be8-427d-9c75-30d1ade4167c" containerID="948fe32bb7af7d00fdd16710f51de663ed6b2eb928d58e7bd1a06ca2976d3938" exitCode=0 Mar 14 10:34:08 crc kubenswrapper[5129]: I0314 10:34:08.542635 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mm9jw" event={"ID":"a3742942-3be8-427d-9c75-30d1ade4167c","Type":"ContainerDied","Data":"948fe32bb7af7d00fdd16710f51de663ed6b2eb928d58e7bd1a06ca2976d3938"} Mar 14 10:34:08 crc kubenswrapper[5129]: I0314 10:34:08.542689 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mm9jw" event={"ID":"a3742942-3be8-427d-9c75-30d1ade4167c","Type":"ContainerStarted","Data":"918880a40597e8a062f9e8a36182b48dbf0f2a8ab8aa6cd843f9a33b31db1934"} Mar 14 10:34:08 crc kubenswrapper[5129]: I0314 10:34:08.547143 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558074-jhqvj" event={"ID":"598c8171-4e5b-4d59-a1bb-ace53444a2a5","Type":"ContainerDied","Data":"f6539c62d3af98fd8d6910192bf51dba2b3b4d6402683539f70c21a5fb30b5ab"} Mar 14 10:34:08 crc kubenswrapper[5129]: I0314 10:34:08.547175 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6539c62d3af98fd8d6910192bf51dba2b3b4d6402683539f70c21a5fb30b5ab" Mar 14 10:34:08 crc kubenswrapper[5129]: I0314 10:34:08.547246 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558074-jhqvj" Mar 14 10:34:09 crc kubenswrapper[5129]: I0314 10:34:09.157304 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558068-64mg7"] Mar 14 10:34:09 crc kubenswrapper[5129]: I0314 10:34:09.167410 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558068-64mg7"] Mar 14 10:34:10 crc kubenswrapper[5129]: I0314 10:34:10.038552 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:34:10 crc kubenswrapper[5129]: E0314 10:34:10.039338 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:34:10 crc kubenswrapper[5129]: I0314 10:34:10.051344 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5378d988-f252-4f3d-865e-12b49ba22b2e" path="/var/lib/kubelet/pods/5378d988-f252-4f3d-865e-12b49ba22b2e/volumes" Mar 14 10:34:11 crc kubenswrapper[5129]: I0314 10:34:11.586463 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mm9jw" event={"ID":"a3742942-3be8-427d-9c75-30d1ade4167c","Type":"ContainerStarted","Data":"13a98eab04f676a5c74136f92454a9e7fbea72702e0b03c43eba159bb75fed0a"} Mar 14 10:34:13 crc kubenswrapper[5129]: I0314 10:34:13.606509 5129 generic.go:334] "Generic (PLEG): container finished" podID="a3742942-3be8-427d-9c75-30d1ade4167c" containerID="13a98eab04f676a5c74136f92454a9e7fbea72702e0b03c43eba159bb75fed0a" exitCode=0 Mar 14 10:34:13 crc kubenswrapper[5129]: I0314 10:34:13.606581 5129 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mm9jw" event={"ID":"a3742942-3be8-427d-9c75-30d1ade4167c","Type":"ContainerDied","Data":"13a98eab04f676a5c74136f92454a9e7fbea72702e0b03c43eba159bb75fed0a"} Mar 14 10:34:15 crc kubenswrapper[5129]: I0314 10:34:15.634660 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mm9jw" event={"ID":"a3742942-3be8-427d-9c75-30d1ade4167c","Type":"ContainerStarted","Data":"407002b185a6bd69c0d3460cc99ae616f8c753f9e3f7e02dbef373eda9309964"} Mar 14 10:34:15 crc kubenswrapper[5129]: I0314 10:34:15.658163 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mm9jw" podStartSLOduration=3.460671251 podStartE2EDuration="9.658146376s" podCreationTimestamp="2026-03-14 10:34:06 +0000 UTC" firstStartedPulling="2026-03-14 10:34:08.552380668 +0000 UTC m=+12911.304295842" lastFinishedPulling="2026-03-14 10:34:14.749855783 +0000 UTC m=+12917.501770967" observedRunningTime="2026-03-14 10:34:15.654906588 +0000 UTC m=+12918.406821782" watchObservedRunningTime="2026-03-14 10:34:15.658146376 +0000 UTC m=+12918.410061560" Mar 14 10:34:16 crc kubenswrapper[5129]: I0314 10:34:16.466219 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mm9jw" Mar 14 10:34:16 crc kubenswrapper[5129]: I0314 10:34:16.466376 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mm9jw" Mar 14 10:34:17 crc kubenswrapper[5129]: I0314 10:34:17.526052 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-mm9jw" podUID="a3742942-3be8-427d-9c75-30d1ade4167c" containerName="registry-server" probeResult="failure" output=< Mar 14 10:34:17 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:34:17 crc 
kubenswrapper[5129]: > Mar 14 10:34:25 crc kubenswrapper[5129]: I0314 10:34:25.036362 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:34:25 crc kubenswrapper[5129]: E0314 10:34:25.037030 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:34:26 crc kubenswrapper[5129]: I0314 10:34:26.514531 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mm9jw" Mar 14 10:34:26 crc kubenswrapper[5129]: I0314 10:34:26.568218 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mm9jw" Mar 14 10:34:26 crc kubenswrapper[5129]: I0314 10:34:26.754171 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mm9jw"] Mar 14 10:34:27 crc kubenswrapper[5129]: I0314 10:34:27.774107 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mm9jw" podUID="a3742942-3be8-427d-9c75-30d1ade4167c" containerName="registry-server" containerID="cri-o://407002b185a6bd69c0d3460cc99ae616f8c753f9e3f7e02dbef373eda9309964" gracePeriod=2 Mar 14 10:34:28 crc kubenswrapper[5129]: I0314 10:34:28.787932 5129 generic.go:334] "Generic (PLEG): container finished" podID="a3742942-3be8-427d-9c75-30d1ade4167c" containerID="407002b185a6bd69c0d3460cc99ae616f8c753f9e3f7e02dbef373eda9309964" exitCode=0 Mar 14 10:34:28 crc kubenswrapper[5129]: I0314 10:34:28.787986 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-mm9jw" event={"ID":"a3742942-3be8-427d-9c75-30d1ade4167c","Type":"ContainerDied","Data":"407002b185a6bd69c0d3460cc99ae616f8c753f9e3f7e02dbef373eda9309964"} Mar 14 10:34:29 crc kubenswrapper[5129]: I0314 10:34:29.263141 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mm9jw" Mar 14 10:34:29 crc kubenswrapper[5129]: I0314 10:34:29.324483 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3742942-3be8-427d-9c75-30d1ade4167c-catalog-content\") pod \"a3742942-3be8-427d-9c75-30d1ade4167c\" (UID: \"a3742942-3be8-427d-9c75-30d1ade4167c\") " Mar 14 10:34:29 crc kubenswrapper[5129]: I0314 10:34:29.324548 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3742942-3be8-427d-9c75-30d1ade4167c-utilities\") pod \"a3742942-3be8-427d-9c75-30d1ade4167c\" (UID: \"a3742942-3be8-427d-9c75-30d1ade4167c\") " Mar 14 10:34:29 crc kubenswrapper[5129]: I0314 10:34:29.324612 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5ss4\" (UniqueName: \"kubernetes.io/projected/a3742942-3be8-427d-9c75-30d1ade4167c-kube-api-access-h5ss4\") pod \"a3742942-3be8-427d-9c75-30d1ade4167c\" (UID: \"a3742942-3be8-427d-9c75-30d1ade4167c\") " Mar 14 10:34:29 crc kubenswrapper[5129]: I0314 10:34:29.325957 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3742942-3be8-427d-9c75-30d1ade4167c-utilities" (OuterVolumeSpecName: "utilities") pod "a3742942-3be8-427d-9c75-30d1ade4167c" (UID: "a3742942-3be8-427d-9c75-30d1ade4167c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:34:29 crc kubenswrapper[5129]: I0314 10:34:29.336019 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3742942-3be8-427d-9c75-30d1ade4167c-kube-api-access-h5ss4" (OuterVolumeSpecName: "kube-api-access-h5ss4") pod "a3742942-3be8-427d-9c75-30d1ade4167c" (UID: "a3742942-3be8-427d-9c75-30d1ade4167c"). InnerVolumeSpecName "kube-api-access-h5ss4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:34:29 crc kubenswrapper[5129]: I0314 10:34:29.371212 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3742942-3be8-427d-9c75-30d1ade4167c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3742942-3be8-427d-9c75-30d1ade4167c" (UID: "a3742942-3be8-427d-9c75-30d1ade4167c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:34:29 crc kubenswrapper[5129]: I0314 10:34:29.427471 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3742942-3be8-427d-9c75-30d1ade4167c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 10:34:29 crc kubenswrapper[5129]: I0314 10:34:29.427500 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3742942-3be8-427d-9c75-30d1ade4167c-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 10:34:29 crc kubenswrapper[5129]: I0314 10:34:29.427509 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5ss4\" (UniqueName: \"kubernetes.io/projected/a3742942-3be8-427d-9c75-30d1ade4167c-kube-api-access-h5ss4\") on node \"crc\" DevicePath \"\"" Mar 14 10:34:29 crc kubenswrapper[5129]: I0314 10:34:29.804356 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mm9jw" 
event={"ID":"a3742942-3be8-427d-9c75-30d1ade4167c","Type":"ContainerDied","Data":"918880a40597e8a062f9e8a36182b48dbf0f2a8ab8aa6cd843f9a33b31db1934"} Mar 14 10:34:29 crc kubenswrapper[5129]: I0314 10:34:29.804428 5129 scope.go:117] "RemoveContainer" containerID="407002b185a6bd69c0d3460cc99ae616f8c753f9e3f7e02dbef373eda9309964" Mar 14 10:34:29 crc kubenswrapper[5129]: I0314 10:34:29.804535 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mm9jw" Mar 14 10:34:29 crc kubenswrapper[5129]: I0314 10:34:29.837639 5129 scope.go:117] "RemoveContainer" containerID="13a98eab04f676a5c74136f92454a9e7fbea72702e0b03c43eba159bb75fed0a" Mar 14 10:34:29 crc kubenswrapper[5129]: I0314 10:34:29.841443 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mm9jw"] Mar 14 10:34:29 crc kubenswrapper[5129]: I0314 10:34:29.851977 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mm9jw"] Mar 14 10:34:29 crc kubenswrapper[5129]: I0314 10:34:29.885330 5129 scope.go:117] "RemoveContainer" containerID="948fe32bb7af7d00fdd16710f51de663ed6b2eb928d58e7bd1a06ca2976d3938" Mar 14 10:34:30 crc kubenswrapper[5129]: I0314 10:34:30.051038 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3742942-3be8-427d-9c75-30d1ade4167c" path="/var/lib/kubelet/pods/a3742942-3be8-427d-9c75-30d1ade4167c/volumes" Mar 14 10:34:37 crc kubenswrapper[5129]: I0314 10:34:37.038139 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:34:37 crc kubenswrapper[5129]: E0314 10:34:37.038927 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:34:41 crc kubenswrapper[5129]: I0314 10:34:41.918523 5129 generic.go:334] "Generic (PLEG): container finished" podID="89a9bb44-9832-46e0-a812-51ab20ad7a85" containerID="458eb38c5bcb0112756de448bf0a6f01e1580a74b67c04bedc417134aaeb934b" exitCode=0 Mar 14 10:34:41 crc kubenswrapper[5129]: I0314 10:34:41.918590 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4f4t/crc-debug-ws7fp" event={"ID":"89a9bb44-9832-46e0-a812-51ab20ad7a85","Type":"ContainerDied","Data":"458eb38c5bcb0112756de448bf0a6f01e1580a74b67c04bedc417134aaeb934b"} Mar 14 10:34:43 crc kubenswrapper[5129]: I0314 10:34:43.054497 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g4f4t/crc-debug-ws7fp" Mar 14 10:34:43 crc kubenswrapper[5129]: I0314 10:34:43.098872 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-g4f4t/crc-debug-ws7fp"] Mar 14 10:34:43 crc kubenswrapper[5129]: I0314 10:34:43.109759 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-g4f4t/crc-debug-ws7fp"] Mar 14 10:34:43 crc kubenswrapper[5129]: I0314 10:34:43.112697 5129 scope.go:117] "RemoveContainer" containerID="eb0589908b4abd2e57cc29cf8dbfe42d1c9b85953ae554d3e7e07fb9fc2c7850" Mar 14 10:34:43 crc kubenswrapper[5129]: I0314 10:34:43.126424 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89a9bb44-9832-46e0-a812-51ab20ad7a85-host\") pod \"89a9bb44-9832-46e0-a812-51ab20ad7a85\" (UID: \"89a9bb44-9832-46e0-a812-51ab20ad7a85\") " Mar 14 10:34:43 crc kubenswrapper[5129]: I0314 10:34:43.126513 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-gjx2b\" (UniqueName: \"kubernetes.io/projected/89a9bb44-9832-46e0-a812-51ab20ad7a85-kube-api-access-gjx2b\") pod \"89a9bb44-9832-46e0-a812-51ab20ad7a85\" (UID: \"89a9bb44-9832-46e0-a812-51ab20ad7a85\") " Mar 14 10:34:43 crc kubenswrapper[5129]: I0314 10:34:43.128299 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89a9bb44-9832-46e0-a812-51ab20ad7a85-host" (OuterVolumeSpecName: "host") pod "89a9bb44-9832-46e0-a812-51ab20ad7a85" (UID: "89a9bb44-9832-46e0-a812-51ab20ad7a85"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 10:34:43 crc kubenswrapper[5129]: I0314 10:34:43.144848 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89a9bb44-9832-46e0-a812-51ab20ad7a85-kube-api-access-gjx2b" (OuterVolumeSpecName: "kube-api-access-gjx2b") pod "89a9bb44-9832-46e0-a812-51ab20ad7a85" (UID: "89a9bb44-9832-46e0-a812-51ab20ad7a85"). InnerVolumeSpecName "kube-api-access-gjx2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:34:43 crc kubenswrapper[5129]: I0314 10:34:43.229660 5129 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89a9bb44-9832-46e0-a812-51ab20ad7a85-host\") on node \"crc\" DevicePath \"\"" Mar 14 10:34:43 crc kubenswrapper[5129]: I0314 10:34:43.229694 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjx2b\" (UniqueName: \"kubernetes.io/projected/89a9bb44-9832-46e0-a812-51ab20ad7a85-kube-api-access-gjx2b\") on node \"crc\" DevicePath \"\"" Mar 14 10:34:43 crc kubenswrapper[5129]: I0314 10:34:43.942730 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f9886aa3a0e34edeed96702907f560453a2720029deb38cff5ae1cfc84a1a95" Mar 14 10:34:43 crc kubenswrapper[5129]: I0314 10:34:43.942779 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g4f4t/crc-debug-ws7fp" Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.053577 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89a9bb44-9832-46e0-a812-51ab20ad7a85" path="/var/lib/kubelet/pods/89a9bb44-9832-46e0-a812-51ab20ad7a85/volumes" Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.267208 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g4f4t/crc-debug-r2fkj"] Mar 14 10:34:44 crc kubenswrapper[5129]: E0314 10:34:44.267937 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3742942-3be8-427d-9c75-30d1ade4167c" containerName="registry-server" Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.267953 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3742942-3be8-427d-9c75-30d1ade4167c" containerName="registry-server" Mar 14 10:34:44 crc kubenswrapper[5129]: E0314 10:34:44.267980 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3742942-3be8-427d-9c75-30d1ade4167c" containerName="extract-content" Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.267988 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3742942-3be8-427d-9c75-30d1ade4167c" containerName="extract-content" Mar 14 10:34:44 crc kubenswrapper[5129]: E0314 10:34:44.268013 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598c8171-4e5b-4d59-a1bb-ace53444a2a5" containerName="oc" Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.268022 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="598c8171-4e5b-4d59-a1bb-ace53444a2a5" containerName="oc" Mar 14 10:34:44 crc kubenswrapper[5129]: E0314 10:34:44.268042 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a9bb44-9832-46e0-a812-51ab20ad7a85" containerName="container-00" Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.268050 5129 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="89a9bb44-9832-46e0-a812-51ab20ad7a85" containerName="container-00" Mar 14 10:34:44 crc kubenswrapper[5129]: E0314 10:34:44.268073 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3742942-3be8-427d-9c75-30d1ade4167c" containerName="extract-utilities" Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.268081 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3742942-3be8-427d-9c75-30d1ade4167c" containerName="extract-utilities" Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.268296 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3742942-3be8-427d-9c75-30d1ade4167c" containerName="registry-server" Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.268332 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a9bb44-9832-46e0-a812-51ab20ad7a85" containerName="container-00" Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.268348 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="598c8171-4e5b-4d59-a1bb-ace53444a2a5" containerName="oc" Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.269137 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g4f4t/crc-debug-r2fkj" Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.353851 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e171a5e-96ad-4333-a40a-88f8bcbad645-host\") pod \"crc-debug-r2fkj\" (UID: \"5e171a5e-96ad-4333-a40a-88f8bcbad645\") " pod="openshift-must-gather-g4f4t/crc-debug-r2fkj" Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.353985 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25wpq\" (UniqueName: \"kubernetes.io/projected/5e171a5e-96ad-4333-a40a-88f8bcbad645-kube-api-access-25wpq\") pod \"crc-debug-r2fkj\" (UID: \"5e171a5e-96ad-4333-a40a-88f8bcbad645\") " pod="openshift-must-gather-g4f4t/crc-debug-r2fkj" Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.454964 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25wpq\" (UniqueName: \"kubernetes.io/projected/5e171a5e-96ad-4333-a40a-88f8bcbad645-kube-api-access-25wpq\") pod \"crc-debug-r2fkj\" (UID: \"5e171a5e-96ad-4333-a40a-88f8bcbad645\") " pod="openshift-must-gather-g4f4t/crc-debug-r2fkj" Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.455105 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e171a5e-96ad-4333-a40a-88f8bcbad645-host\") pod \"crc-debug-r2fkj\" (UID: \"5e171a5e-96ad-4333-a40a-88f8bcbad645\") " pod="openshift-must-gather-g4f4t/crc-debug-r2fkj" Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.455204 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e171a5e-96ad-4333-a40a-88f8bcbad645-host\") pod \"crc-debug-r2fkj\" (UID: \"5e171a5e-96ad-4333-a40a-88f8bcbad645\") " pod="openshift-must-gather-g4f4t/crc-debug-r2fkj" Mar 14 10:34:44 crc 
kubenswrapper[5129]: I0314 10:34:44.472684 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25wpq\" (UniqueName: \"kubernetes.io/projected/5e171a5e-96ad-4333-a40a-88f8bcbad645-kube-api-access-25wpq\") pod \"crc-debug-r2fkj\" (UID: \"5e171a5e-96ad-4333-a40a-88f8bcbad645\") " pod="openshift-must-gather-g4f4t/crc-debug-r2fkj" Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.598303 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g4f4t/crc-debug-r2fkj" Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.962212 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4f4t/crc-debug-r2fkj" event={"ID":"5e171a5e-96ad-4333-a40a-88f8bcbad645","Type":"ContainerStarted","Data":"741e86bae75bc824cb15782135de32af33240d667bbcba25bb193c5dda262d14"} Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.962626 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4f4t/crc-debug-r2fkj" event={"ID":"5e171a5e-96ad-4333-a40a-88f8bcbad645","Type":"ContainerStarted","Data":"a4b4083b68640e3b0fe8ea479fddb56fc400ded77f0493726c55aa9fa5c91829"} Mar 14 10:34:44 crc kubenswrapper[5129]: I0314 10:34:44.989329 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g4f4t/crc-debug-r2fkj" podStartSLOduration=0.989312722 podStartE2EDuration="989.312722ms" podCreationTimestamp="2026-03-14 10:34:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:34:44.980004119 +0000 UTC m=+12947.731919313" watchObservedRunningTime="2026-03-14 10:34:44.989312722 +0000 UTC m=+12947.741227906" Mar 14 10:34:45 crc kubenswrapper[5129]: I0314 10:34:45.975213 5129 generic.go:334] "Generic (PLEG): container finished" podID="5e171a5e-96ad-4333-a40a-88f8bcbad645" 
containerID="741e86bae75bc824cb15782135de32af33240d667bbcba25bb193c5dda262d14" exitCode=0 Mar 14 10:34:45 crc kubenswrapper[5129]: I0314 10:34:45.975249 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4f4t/crc-debug-r2fkj" event={"ID":"5e171a5e-96ad-4333-a40a-88f8bcbad645","Type":"ContainerDied","Data":"741e86bae75bc824cb15782135de32af33240d667bbcba25bb193c5dda262d14"} Mar 14 10:34:47 crc kubenswrapper[5129]: I0314 10:34:47.084289 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g4f4t/crc-debug-r2fkj" Mar 14 10:34:47 crc kubenswrapper[5129]: I0314 10:34:47.148872 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-g4f4t/crc-debug-r2fkj"] Mar 14 10:34:47 crc kubenswrapper[5129]: I0314 10:34:47.159385 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-g4f4t/crc-debug-r2fkj"] Mar 14 10:34:47 crc kubenswrapper[5129]: I0314 10:34:47.209753 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25wpq\" (UniqueName: \"kubernetes.io/projected/5e171a5e-96ad-4333-a40a-88f8bcbad645-kube-api-access-25wpq\") pod \"5e171a5e-96ad-4333-a40a-88f8bcbad645\" (UID: \"5e171a5e-96ad-4333-a40a-88f8bcbad645\") " Mar 14 10:34:47 crc kubenswrapper[5129]: I0314 10:34:47.209830 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e171a5e-96ad-4333-a40a-88f8bcbad645-host\") pod \"5e171a5e-96ad-4333-a40a-88f8bcbad645\" (UID: \"5e171a5e-96ad-4333-a40a-88f8bcbad645\") " Mar 14 10:34:47 crc kubenswrapper[5129]: I0314 10:34:47.209981 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e171a5e-96ad-4333-a40a-88f8bcbad645-host" (OuterVolumeSpecName: "host") pod "5e171a5e-96ad-4333-a40a-88f8bcbad645" (UID: "5e171a5e-96ad-4333-a40a-88f8bcbad645"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 10:34:47 crc kubenswrapper[5129]: I0314 10:34:47.210387 5129 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e171a5e-96ad-4333-a40a-88f8bcbad645-host\") on node \"crc\" DevicePath \"\"" Mar 14 10:34:47 crc kubenswrapper[5129]: I0314 10:34:47.218864 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e171a5e-96ad-4333-a40a-88f8bcbad645-kube-api-access-25wpq" (OuterVolumeSpecName: "kube-api-access-25wpq") pod "5e171a5e-96ad-4333-a40a-88f8bcbad645" (UID: "5e171a5e-96ad-4333-a40a-88f8bcbad645"). InnerVolumeSpecName "kube-api-access-25wpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:34:47 crc kubenswrapper[5129]: I0314 10:34:47.312533 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25wpq\" (UniqueName: \"kubernetes.io/projected/5e171a5e-96ad-4333-a40a-88f8bcbad645-kube-api-access-25wpq\") on node \"crc\" DevicePath \"\"" Mar 14 10:34:47 crc kubenswrapper[5129]: I0314 10:34:47.997152 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4b4083b68640e3b0fe8ea479fddb56fc400ded77f0493726c55aa9fa5c91829" Mar 14 10:34:47 crc kubenswrapper[5129]: I0314 10:34:47.997279 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g4f4t/crc-debug-r2fkj" Mar 14 10:34:48 crc kubenswrapper[5129]: I0314 10:34:48.044771 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:34:48 crc kubenswrapper[5129]: E0314 10:34:48.045255 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:34:48 crc kubenswrapper[5129]: I0314 10:34:48.051328 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e171a5e-96ad-4333-a40a-88f8bcbad645" path="/var/lib/kubelet/pods/5e171a5e-96ad-4333-a40a-88f8bcbad645/volumes" Mar 14 10:34:48 crc kubenswrapper[5129]: I0314 10:34:48.353025 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g4f4t/crc-debug-9k9sj"] Mar 14 10:34:48 crc kubenswrapper[5129]: E0314 10:34:48.354031 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e171a5e-96ad-4333-a40a-88f8bcbad645" containerName="container-00" Mar 14 10:34:48 crc kubenswrapper[5129]: I0314 10:34:48.354056 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e171a5e-96ad-4333-a40a-88f8bcbad645" containerName="container-00" Mar 14 10:34:48 crc kubenswrapper[5129]: I0314 10:34:48.354431 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e171a5e-96ad-4333-a40a-88f8bcbad645" containerName="container-00" Mar 14 10:34:48 crc kubenswrapper[5129]: I0314 10:34:48.355438 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g4f4t/crc-debug-9k9sj" Mar 14 10:34:48 crc kubenswrapper[5129]: I0314 10:34:48.431789 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2876449-a906-4a08-8209-82421d460b29-host\") pod \"crc-debug-9k9sj\" (UID: \"d2876449-a906-4a08-8209-82421d460b29\") " pod="openshift-must-gather-g4f4t/crc-debug-9k9sj" Mar 14 10:34:48 crc kubenswrapper[5129]: I0314 10:34:48.432025 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvcjc\" (UniqueName: \"kubernetes.io/projected/d2876449-a906-4a08-8209-82421d460b29-kube-api-access-rvcjc\") pod \"crc-debug-9k9sj\" (UID: \"d2876449-a906-4a08-8209-82421d460b29\") " pod="openshift-must-gather-g4f4t/crc-debug-9k9sj" Mar 14 10:34:48 crc kubenswrapper[5129]: I0314 10:34:48.533431 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2876449-a906-4a08-8209-82421d460b29-host\") pod \"crc-debug-9k9sj\" (UID: \"d2876449-a906-4a08-8209-82421d460b29\") " pod="openshift-must-gather-g4f4t/crc-debug-9k9sj" Mar 14 10:34:48 crc kubenswrapper[5129]: I0314 10:34:48.533559 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2876449-a906-4a08-8209-82421d460b29-host\") pod \"crc-debug-9k9sj\" (UID: \"d2876449-a906-4a08-8209-82421d460b29\") " pod="openshift-must-gather-g4f4t/crc-debug-9k9sj" Mar 14 10:34:48 crc kubenswrapper[5129]: I0314 10:34:48.533577 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvcjc\" (UniqueName: \"kubernetes.io/projected/d2876449-a906-4a08-8209-82421d460b29-kube-api-access-rvcjc\") pod \"crc-debug-9k9sj\" (UID: \"d2876449-a906-4a08-8209-82421d460b29\") " pod="openshift-must-gather-g4f4t/crc-debug-9k9sj" Mar 14 10:34:48 crc 
kubenswrapper[5129]: I0314 10:34:48.549452 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvcjc\" (UniqueName: \"kubernetes.io/projected/d2876449-a906-4a08-8209-82421d460b29-kube-api-access-rvcjc\") pod \"crc-debug-9k9sj\" (UID: \"d2876449-a906-4a08-8209-82421d460b29\") " pod="openshift-must-gather-g4f4t/crc-debug-9k9sj" Mar 14 10:34:48 crc kubenswrapper[5129]: I0314 10:34:48.679254 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g4f4t/crc-debug-9k9sj" Mar 14 10:34:48 crc kubenswrapper[5129]: W0314 10:34:48.720882 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2876449_a906_4a08_8209_82421d460b29.slice/crio-07ca1a895583c3541f4d62f2fbd2da5b84f05a3112a0ff540a0deb7dda8ef80c WatchSource:0}: Error finding container 07ca1a895583c3541f4d62f2fbd2da5b84f05a3112a0ff540a0deb7dda8ef80c: Status 404 returned error can't find the container with id 07ca1a895583c3541f4d62f2fbd2da5b84f05a3112a0ff540a0deb7dda8ef80c Mar 14 10:34:49 crc kubenswrapper[5129]: I0314 10:34:49.012120 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4f4t/crc-debug-9k9sj" event={"ID":"d2876449-a906-4a08-8209-82421d460b29","Type":"ContainerStarted","Data":"60aa9b66c316f1543d1f463f3d54c84b50ed6b4e58b0c2ed62169343489b7d6b"} Mar 14 10:34:49 crc kubenswrapper[5129]: I0314 10:34:49.012512 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4f4t/crc-debug-9k9sj" event={"ID":"d2876449-a906-4a08-8209-82421d460b29","Type":"ContainerStarted","Data":"07ca1a895583c3541f4d62f2fbd2da5b84f05a3112a0ff540a0deb7dda8ef80c"} Mar 14 10:34:49 crc kubenswrapper[5129]: I0314 10:34:49.051361 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-g4f4t/crc-debug-9k9sj"] Mar 14 10:34:49 crc kubenswrapper[5129]: I0314 10:34:49.060144 5129 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-must-gather-g4f4t/crc-debug-9k9sj"] Mar 14 10:34:50 crc kubenswrapper[5129]: I0314 10:34:50.023353 5129 generic.go:334] "Generic (PLEG): container finished" podID="d2876449-a906-4a08-8209-82421d460b29" containerID="60aa9b66c316f1543d1f463f3d54c84b50ed6b4e58b0c2ed62169343489b7d6b" exitCode=0 Mar 14 10:34:50 crc kubenswrapper[5129]: I0314 10:34:50.145438 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g4f4t/crc-debug-9k9sj" Mar 14 10:34:50 crc kubenswrapper[5129]: I0314 10:34:50.269299 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvcjc\" (UniqueName: \"kubernetes.io/projected/d2876449-a906-4a08-8209-82421d460b29-kube-api-access-rvcjc\") pod \"d2876449-a906-4a08-8209-82421d460b29\" (UID: \"d2876449-a906-4a08-8209-82421d460b29\") " Mar 14 10:34:50 crc kubenswrapper[5129]: I0314 10:34:50.269473 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2876449-a906-4a08-8209-82421d460b29-host\") pod \"d2876449-a906-4a08-8209-82421d460b29\" (UID: \"d2876449-a906-4a08-8209-82421d460b29\") " Mar 14 10:34:50 crc kubenswrapper[5129]: I0314 10:34:50.269522 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2876449-a906-4a08-8209-82421d460b29-host" (OuterVolumeSpecName: "host") pod "d2876449-a906-4a08-8209-82421d460b29" (UID: "d2876449-a906-4a08-8209-82421d460b29"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 10:34:50 crc kubenswrapper[5129]: I0314 10:34:50.269986 5129 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2876449-a906-4a08-8209-82421d460b29-host\") on node \"crc\" DevicePath \"\"" Mar 14 10:34:50 crc kubenswrapper[5129]: I0314 10:34:50.277220 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2876449-a906-4a08-8209-82421d460b29-kube-api-access-rvcjc" (OuterVolumeSpecName: "kube-api-access-rvcjc") pod "d2876449-a906-4a08-8209-82421d460b29" (UID: "d2876449-a906-4a08-8209-82421d460b29"). InnerVolumeSpecName "kube-api-access-rvcjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:34:50 crc kubenswrapper[5129]: I0314 10:34:50.371321 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvcjc\" (UniqueName: \"kubernetes.io/projected/d2876449-a906-4a08-8209-82421d460b29-kube-api-access-rvcjc\") on node \"crc\" DevicePath \"\"" Mar 14 10:34:51 crc kubenswrapper[5129]: I0314 10:34:51.132441 5129 scope.go:117] "RemoveContainer" containerID="60aa9b66c316f1543d1f463f3d54c84b50ed6b4e58b0c2ed62169343489b7d6b" Mar 14 10:34:51 crc kubenswrapper[5129]: I0314 10:34:51.132796 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g4f4t/crc-debug-9k9sj" Mar 14 10:34:52 crc kubenswrapper[5129]: I0314 10:34:52.049880 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2876449-a906-4a08-8209-82421d460b29" path="/var/lib/kubelet/pods/d2876449-a906-4a08-8209-82421d460b29/volumes" Mar 14 10:35:02 crc kubenswrapper[5129]: I0314 10:35:02.036471 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:35:03 crc kubenswrapper[5129]: I0314 10:35:03.257283 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"d5a7a7f050827fb797d412717f100d7b334050f76e813160cf2256327e999ae1"} Mar 14 10:36:00 crc kubenswrapper[5129]: I0314 10:36:00.142193 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558076-p87t5"] Mar 14 10:36:00 crc kubenswrapper[5129]: E0314 10:36:00.143158 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2876449-a906-4a08-8209-82421d460b29" containerName="container-00" Mar 14 10:36:00 crc kubenswrapper[5129]: I0314 10:36:00.143170 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2876449-a906-4a08-8209-82421d460b29" containerName="container-00" Mar 14 10:36:00 crc kubenswrapper[5129]: I0314 10:36:00.143388 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2876449-a906-4a08-8209-82421d460b29" containerName="container-00" Mar 14 10:36:00 crc kubenswrapper[5129]: I0314 10:36:00.144044 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558076-p87t5" Mar 14 10:36:00 crc kubenswrapper[5129]: I0314 10:36:00.154297 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558076-p87t5"] Mar 14 10:36:00 crc kubenswrapper[5129]: I0314 10:36:00.181220 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:36:00 crc kubenswrapper[5129]: I0314 10:36:00.181396 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:36:00 crc kubenswrapper[5129]: I0314 10:36:00.182328 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:36:00 crc kubenswrapper[5129]: I0314 10:36:00.285560 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8gzs\" (UniqueName: \"kubernetes.io/projected/39c7a1a6-bfca-4c41-a034-80f600602fc5-kube-api-access-j8gzs\") pod \"auto-csr-approver-29558076-p87t5\" (UID: \"39c7a1a6-bfca-4c41-a034-80f600602fc5\") " pod="openshift-infra/auto-csr-approver-29558076-p87t5" Mar 14 10:36:00 crc kubenswrapper[5129]: I0314 10:36:00.387741 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8gzs\" (UniqueName: \"kubernetes.io/projected/39c7a1a6-bfca-4c41-a034-80f600602fc5-kube-api-access-j8gzs\") pod \"auto-csr-approver-29558076-p87t5\" (UID: \"39c7a1a6-bfca-4c41-a034-80f600602fc5\") " pod="openshift-infra/auto-csr-approver-29558076-p87t5" Mar 14 10:36:00 crc kubenswrapper[5129]: I0314 10:36:00.406571 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8gzs\" (UniqueName: \"kubernetes.io/projected/39c7a1a6-bfca-4c41-a034-80f600602fc5-kube-api-access-j8gzs\") pod \"auto-csr-approver-29558076-p87t5\" (UID: \"39c7a1a6-bfca-4c41-a034-80f600602fc5\") " 
pod="openshift-infra/auto-csr-approver-29558076-p87t5" Mar 14 10:36:00 crc kubenswrapper[5129]: I0314 10:36:00.493546 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558076-p87t5" Mar 14 10:36:01 crc kubenswrapper[5129]: W0314 10:36:01.205151 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39c7a1a6_bfca_4c41_a034_80f600602fc5.slice/crio-4407f42205b1f350948694088b9f06be7fdaa25c472e896f944d079fdcfff9d3 WatchSource:0}: Error finding container 4407f42205b1f350948694088b9f06be7fdaa25c472e896f944d079fdcfff9d3: Status 404 returned error can't find the container with id 4407f42205b1f350948694088b9f06be7fdaa25c472e896f944d079fdcfff9d3 Mar 14 10:36:01 crc kubenswrapper[5129]: I0314 10:36:01.210356 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 10:36:01 crc kubenswrapper[5129]: I0314 10:36:01.223787 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558076-p87t5"] Mar 14 10:36:01 crc kubenswrapper[5129]: I0314 10:36:01.799484 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558076-p87t5" event={"ID":"39c7a1a6-bfca-4c41-a034-80f600602fc5","Type":"ContainerStarted","Data":"4407f42205b1f350948694088b9f06be7fdaa25c472e896f944d079fdcfff9d3"} Mar 14 10:36:02 crc kubenswrapper[5129]: I0314 10:36:02.812699 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558076-p87t5" event={"ID":"39c7a1a6-bfca-4c41-a034-80f600602fc5","Type":"ContainerStarted","Data":"65176f0fbd0d0a6f9fb995aaaed4e12bb118cc5ac76f519c304ce529524710ec"} Mar 14 10:36:02 crc kubenswrapper[5129]: I0314 10:36:02.837716 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558076-p87t5" 
podStartSLOduration=1.770652633 podStartE2EDuration="2.837695975s" podCreationTimestamp="2026-03-14 10:36:00 +0000 UTC" firstStartedPulling="2026-03-14 10:36:01.21013848 +0000 UTC m=+13023.962053664" lastFinishedPulling="2026-03-14 10:36:02.277181822 +0000 UTC m=+13025.029097006" observedRunningTime="2026-03-14 10:36:02.826924964 +0000 UTC m=+13025.578840158" watchObservedRunningTime="2026-03-14 10:36:02.837695975 +0000 UTC m=+13025.589611179" Mar 14 10:36:03 crc kubenswrapper[5129]: I0314 10:36:03.826113 5129 generic.go:334] "Generic (PLEG): container finished" podID="39c7a1a6-bfca-4c41-a034-80f600602fc5" containerID="65176f0fbd0d0a6f9fb995aaaed4e12bb118cc5ac76f519c304ce529524710ec" exitCode=0 Mar 14 10:36:03 crc kubenswrapper[5129]: I0314 10:36:03.826222 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558076-p87t5" event={"ID":"39c7a1a6-bfca-4c41-a034-80f600602fc5","Type":"ContainerDied","Data":"65176f0fbd0d0a6f9fb995aaaed4e12bb118cc5ac76f519c304ce529524710ec"} Mar 14 10:36:06 crc kubenswrapper[5129]: I0314 10:36:06.237339 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558076-p87t5" Mar 14 10:36:06 crc kubenswrapper[5129]: I0314 10:36:06.407496 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8gzs\" (UniqueName: \"kubernetes.io/projected/39c7a1a6-bfca-4c41-a034-80f600602fc5-kube-api-access-j8gzs\") pod \"39c7a1a6-bfca-4c41-a034-80f600602fc5\" (UID: \"39c7a1a6-bfca-4c41-a034-80f600602fc5\") " Mar 14 10:36:06 crc kubenswrapper[5129]: I0314 10:36:06.414871 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39c7a1a6-bfca-4c41-a034-80f600602fc5-kube-api-access-j8gzs" (OuterVolumeSpecName: "kube-api-access-j8gzs") pod "39c7a1a6-bfca-4c41-a034-80f600602fc5" (UID: "39c7a1a6-bfca-4c41-a034-80f600602fc5"). 
InnerVolumeSpecName "kube-api-access-j8gzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:36:06 crc kubenswrapper[5129]: I0314 10:36:06.510175 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8gzs\" (UniqueName: \"kubernetes.io/projected/39c7a1a6-bfca-4c41-a034-80f600602fc5-kube-api-access-j8gzs\") on node \"crc\" DevicePath \"\"" Mar 14 10:36:06 crc kubenswrapper[5129]: I0314 10:36:06.858428 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558076-p87t5" event={"ID":"39c7a1a6-bfca-4c41-a034-80f600602fc5","Type":"ContainerDied","Data":"4407f42205b1f350948694088b9f06be7fdaa25c472e896f944d079fdcfff9d3"} Mar 14 10:36:06 crc kubenswrapper[5129]: I0314 10:36:06.858976 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4407f42205b1f350948694088b9f06be7fdaa25c472e896f944d079fdcfff9d3" Mar 14 10:36:06 crc kubenswrapper[5129]: I0314 10:36:06.858750 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558076-p87t5" Mar 14 10:36:07 crc kubenswrapper[5129]: I0314 10:36:07.307541 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558070-gpx85"] Mar 14 10:36:07 crc kubenswrapper[5129]: I0314 10:36:07.321514 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558070-gpx85"] Mar 14 10:36:08 crc kubenswrapper[5129]: I0314 10:36:08.050162 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3089d2a9-975b-4b8c-b3a4-6e1079308c9a" path="/var/lib/kubelet/pods/3089d2a9-975b-4b8c-b3a4-6e1079308c9a/volumes" Mar 14 10:36:43 crc kubenswrapper[5129]: I0314 10:36:43.270613 5129 scope.go:117] "RemoveContainer" containerID="6dfe3b8530daf8a6d288b018230d7fbcfca9fb4811b8fff7751d190a8c89b322" Mar 14 10:37:19 crc kubenswrapper[5129]: I0314 10:37:19.574664 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:37:19 crc kubenswrapper[5129]: I0314 10:37:19.575140 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:37:31 crc kubenswrapper[5129]: I0314 10:37:31.128730 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-49h95"] Mar 14 10:37:31 crc kubenswrapper[5129]: E0314 10:37:31.129991 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c7a1a6-bfca-4c41-a034-80f600602fc5" containerName="oc" Mar 14 10:37:31 crc 
kubenswrapper[5129]: I0314 10:37:31.130010 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c7a1a6-bfca-4c41-a034-80f600602fc5" containerName="oc" Mar 14 10:37:31 crc kubenswrapper[5129]: I0314 10:37:31.130330 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="39c7a1a6-bfca-4c41-a034-80f600602fc5" containerName="oc" Mar 14 10:37:31 crc kubenswrapper[5129]: I0314 10:37:31.135844 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49h95" Mar 14 10:37:31 crc kubenswrapper[5129]: I0314 10:37:31.140577 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-49h95"] Mar 14 10:37:31 crc kubenswrapper[5129]: I0314 10:37:31.222227 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2lj5\" (UniqueName: \"kubernetes.io/projected/4bacfacc-5c03-4550-a863-dfb174bb7049-kube-api-access-z2lj5\") pod \"certified-operators-49h95\" (UID: \"4bacfacc-5c03-4550-a863-dfb174bb7049\") " pod="openshift-marketplace/certified-operators-49h95" Mar 14 10:37:31 crc kubenswrapper[5129]: I0314 10:37:31.222299 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bacfacc-5c03-4550-a863-dfb174bb7049-utilities\") pod \"certified-operators-49h95\" (UID: \"4bacfacc-5c03-4550-a863-dfb174bb7049\") " pod="openshift-marketplace/certified-operators-49h95" Mar 14 10:37:31 crc kubenswrapper[5129]: I0314 10:37:31.222441 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bacfacc-5c03-4550-a863-dfb174bb7049-catalog-content\") pod \"certified-operators-49h95\" (UID: \"4bacfacc-5c03-4550-a863-dfb174bb7049\") " pod="openshift-marketplace/certified-operators-49h95" Mar 14 10:37:31 crc 
kubenswrapper[5129]: I0314 10:37:31.324742 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2lj5\" (UniqueName: \"kubernetes.io/projected/4bacfacc-5c03-4550-a863-dfb174bb7049-kube-api-access-z2lj5\") pod \"certified-operators-49h95\" (UID: \"4bacfacc-5c03-4550-a863-dfb174bb7049\") " pod="openshift-marketplace/certified-operators-49h95" Mar 14 10:37:31 crc kubenswrapper[5129]: I0314 10:37:31.324811 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bacfacc-5c03-4550-a863-dfb174bb7049-utilities\") pod \"certified-operators-49h95\" (UID: \"4bacfacc-5c03-4550-a863-dfb174bb7049\") " pod="openshift-marketplace/certified-operators-49h95" Mar 14 10:37:31 crc kubenswrapper[5129]: I0314 10:37:31.324966 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bacfacc-5c03-4550-a863-dfb174bb7049-catalog-content\") pod \"certified-operators-49h95\" (UID: \"4bacfacc-5c03-4550-a863-dfb174bb7049\") " pod="openshift-marketplace/certified-operators-49h95" Mar 14 10:37:31 crc kubenswrapper[5129]: I0314 10:37:31.325391 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bacfacc-5c03-4550-a863-dfb174bb7049-catalog-content\") pod \"certified-operators-49h95\" (UID: \"4bacfacc-5c03-4550-a863-dfb174bb7049\") " pod="openshift-marketplace/certified-operators-49h95" Mar 14 10:37:31 crc kubenswrapper[5129]: I0314 10:37:31.325533 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bacfacc-5c03-4550-a863-dfb174bb7049-utilities\") pod \"certified-operators-49h95\" (UID: \"4bacfacc-5c03-4550-a863-dfb174bb7049\") " pod="openshift-marketplace/certified-operators-49h95" Mar 14 10:37:31 crc kubenswrapper[5129]: I0314 10:37:31.342507 
5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2lj5\" (UniqueName: \"kubernetes.io/projected/4bacfacc-5c03-4550-a863-dfb174bb7049-kube-api-access-z2lj5\") pod \"certified-operators-49h95\" (UID: \"4bacfacc-5c03-4550-a863-dfb174bb7049\") " pod="openshift-marketplace/certified-operators-49h95" Mar 14 10:37:31 crc kubenswrapper[5129]: I0314 10:37:31.463482 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49h95" Mar 14 10:37:32 crc kubenswrapper[5129]: I0314 10:37:32.235405 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-49h95"] Mar 14 10:37:32 crc kubenswrapper[5129]: I0314 10:37:32.846354 5129 generic.go:334] "Generic (PLEG): container finished" podID="4bacfacc-5c03-4550-a863-dfb174bb7049" containerID="b7a30d1a6b4568625150b4facbe998ca91c9a42d1a6ea6b41cdb4fdf479901bb" exitCode=0 Mar 14 10:37:32 crc kubenswrapper[5129]: I0314 10:37:32.846461 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49h95" event={"ID":"4bacfacc-5c03-4550-a863-dfb174bb7049","Type":"ContainerDied","Data":"b7a30d1a6b4568625150b4facbe998ca91c9a42d1a6ea6b41cdb4fdf479901bb"} Mar 14 10:37:32 crc kubenswrapper[5129]: I0314 10:37:32.846641 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49h95" event={"ID":"4bacfacc-5c03-4550-a863-dfb174bb7049","Type":"ContainerStarted","Data":"1ac48a918d18eea6a3ac88f07cba403fb0b7f3a006e6e2252c37385167d89299"} Mar 14 10:37:33 crc kubenswrapper[5129]: I0314 10:37:33.857004 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49h95" event={"ID":"4bacfacc-5c03-4550-a863-dfb174bb7049","Type":"ContainerStarted","Data":"43272bacaa40835667f47b4cbeb31c31dd3df5e1c4a164117c3d570be43e1acc"} Mar 14 10:37:35 crc kubenswrapper[5129]: E0314 10:37:35.586147 5129 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bacfacc_5c03_4550_a863_dfb174bb7049.slice/crio-conmon-43272bacaa40835667f47b4cbeb31c31dd3df5e1c4a164117c3d570be43e1acc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bacfacc_5c03_4550_a863_dfb174bb7049.slice/crio-43272bacaa40835667f47b4cbeb31c31dd3df5e1c4a164117c3d570be43e1acc.scope\": RecentStats: unable to find data in memory cache]" Mar 14 10:37:35 crc kubenswrapper[5129]: I0314 10:37:35.879201 5129 generic.go:334] "Generic (PLEG): container finished" podID="4bacfacc-5c03-4550-a863-dfb174bb7049" containerID="43272bacaa40835667f47b4cbeb31c31dd3df5e1c4a164117c3d570be43e1acc" exitCode=0 Mar 14 10:37:35 crc kubenswrapper[5129]: I0314 10:37:35.879255 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49h95" event={"ID":"4bacfacc-5c03-4550-a863-dfb174bb7049","Type":"ContainerDied","Data":"43272bacaa40835667f47b4cbeb31c31dd3df5e1c4a164117c3d570be43e1acc"} Mar 14 10:37:36 crc kubenswrapper[5129]: I0314 10:37:36.893224 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49h95" event={"ID":"4bacfacc-5c03-4550-a863-dfb174bb7049","Type":"ContainerStarted","Data":"1238462a4ca94fae6666f84ee4ce689536ac3c58ce80e7c2017a8003179bb271"} Mar 14 10:37:36 crc kubenswrapper[5129]: I0314 10:37:36.912204 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-49h95" podStartSLOduration=2.445893241 podStartE2EDuration="5.912185853s" podCreationTimestamp="2026-03-14 10:37:31 +0000 UTC" firstStartedPulling="2026-03-14 10:37:32.847942721 +0000 UTC m=+13115.599857905" lastFinishedPulling="2026-03-14 10:37:36.314235313 +0000 UTC m=+13119.066150517" observedRunningTime="2026-03-14 
10:37:36.910171788 +0000 UTC m=+13119.662086982" watchObservedRunningTime="2026-03-14 10:37:36.912185853 +0000 UTC m=+13119.664101037" Mar 14 10:37:41 crc kubenswrapper[5129]: I0314 10:37:41.464364 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-49h95" Mar 14 10:37:41 crc kubenswrapper[5129]: I0314 10:37:41.464828 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-49h95" Mar 14 10:37:41 crc kubenswrapper[5129]: I0314 10:37:41.526271 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-49h95" Mar 14 10:37:41 crc kubenswrapper[5129]: I0314 10:37:41.991158 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-49h95" Mar 14 10:37:42 crc kubenswrapper[5129]: I0314 10:37:42.034826 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-49h95"] Mar 14 10:37:43 crc kubenswrapper[5129]: I0314 10:37:43.226043 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_f08af818-92ad-48e1-b7f4-7e02562a816a/init-config-reloader/0.log" Mar 14 10:37:43 crc kubenswrapper[5129]: I0314 10:37:43.665729 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_f08af818-92ad-48e1-b7f4-7e02562a816a/init-config-reloader/0.log" Mar 14 10:37:43 crc kubenswrapper[5129]: I0314 10:37:43.695192 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_f08af818-92ad-48e1-b7f4-7e02562a816a/config-reloader/0.log" Mar 14 10:37:43 crc kubenswrapper[5129]: I0314 10:37:43.705780 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_f08af818-92ad-48e1-b7f4-7e02562a816a/alertmanager/0.log" Mar 14 10:37:43 
crc kubenswrapper[5129]: I0314 10:37:43.954160 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-49h95" podUID="4bacfacc-5c03-4550-a863-dfb174bb7049" containerName="registry-server" containerID="cri-o://1238462a4ca94fae6666f84ee4ce689536ac3c58ce80e7c2017a8003179bb271" gracePeriod=2 Mar 14 10:37:43 crc kubenswrapper[5129]: I0314 10:37:43.972258 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_7a72b7ce-3f7c-46f2-b282-8bf3268588ca/aodh-listener/0.log" Mar 14 10:37:43 crc kubenswrapper[5129]: I0314 10:37:43.974479 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_7a72b7ce-3f7c-46f2-b282-8bf3268588ca/aodh-evaluator/0.log" Mar 14 10:37:44 crc kubenswrapper[5129]: I0314 10:37:44.041347 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_7a72b7ce-3f7c-46f2-b282-8bf3268588ca/aodh-api/0.log" Mar 14 10:37:44 crc kubenswrapper[5129]: I0314 10:37:44.103984 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_7a72b7ce-3f7c-46f2-b282-8bf3268588ca/aodh-notifier/0.log" Mar 14 10:37:44 crc kubenswrapper[5129]: I0314 10:37:44.345981 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-m6vlz_35d9420c-6bf5-4e2b-a80c-9e2b364f6a64/bootstrap-openstack-openstack-cell1/0.log" Mar 14 10:37:44 crc kubenswrapper[5129]: I0314 10:37:44.436491 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-networker-6dnqp_5e53276d-c8cb-4fb1-aea6-d436e50e4490/bootstrap-openstack-openstack-networker/0.log" Mar 14 10:37:44 crc kubenswrapper[5129]: I0314 10:37:44.674425 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1c530f52-0988-468f-95d9-c45c3550a14c/ceilometer-central-agent/0.log" Mar 14 10:37:44 crc kubenswrapper[5129]: I0314 10:37:44.971741 5129 generic.go:334] "Generic 
(PLEG): container finished" podID="4bacfacc-5c03-4550-a863-dfb174bb7049" containerID="1238462a4ca94fae6666f84ee4ce689536ac3c58ce80e7c2017a8003179bb271" exitCode=0 Mar 14 10:37:44 crc kubenswrapper[5129]: I0314 10:37:44.971785 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49h95" event={"ID":"4bacfacc-5c03-4550-a863-dfb174bb7049","Type":"ContainerDied","Data":"1238462a4ca94fae6666f84ee4ce689536ac3c58ce80e7c2017a8003179bb271"} Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.155863 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1c530f52-0988-468f-95d9-c45c3550a14c/sg-core/0.log" Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.248938 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1c530f52-0988-468f-95d9-c45c3550a14c/proxy-httpd/0.log" Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.284701 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1c530f52-0988-468f-95d9-c45c3550a14c/ceilometer-notification-agent/0.log" Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.574427 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-49h95" Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.580927 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_49ef9df1-2d5c-406a-9adf-26fd7bd95731/cinder-api/0.log" Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.668281 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2lj5\" (UniqueName: \"kubernetes.io/projected/4bacfacc-5c03-4550-a863-dfb174bb7049-kube-api-access-z2lj5\") pod \"4bacfacc-5c03-4550-a863-dfb174bb7049\" (UID: \"4bacfacc-5c03-4550-a863-dfb174bb7049\") " Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.668388 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bacfacc-5c03-4550-a863-dfb174bb7049-catalog-content\") pod \"4bacfacc-5c03-4550-a863-dfb174bb7049\" (UID: \"4bacfacc-5c03-4550-a863-dfb174bb7049\") " Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.668442 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bacfacc-5c03-4550-a863-dfb174bb7049-utilities\") pod \"4bacfacc-5c03-4550-a863-dfb174bb7049\" (UID: \"4bacfacc-5c03-4550-a863-dfb174bb7049\") " Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.676415 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_346b9da9-f8da-416b-aa79-e42409eb111a/cinder-scheduler/0.log" Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.682072 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bacfacc-5c03-4550-a863-dfb174bb7049-utilities" (OuterVolumeSpecName: "utilities") pod "4bacfacc-5c03-4550-a863-dfb174bb7049" (UID: "4bacfacc-5c03-4550-a863-dfb174bb7049"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.692539 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bacfacc-5c03-4550-a863-dfb174bb7049-kube-api-access-z2lj5" (OuterVolumeSpecName: "kube-api-access-z2lj5") pod "4bacfacc-5c03-4550-a863-dfb174bb7049" (UID: "4bacfacc-5c03-4550-a863-dfb174bb7049"). InnerVolumeSpecName "kube-api-access-z2lj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.770247 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bacfacc-5c03-4550-a863-dfb174bb7049-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4bacfacc-5c03-4550-a863-dfb174bb7049" (UID: "4bacfacc-5c03-4550-a863-dfb174bb7049"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.771899 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bacfacc-5c03-4550-a863-dfb174bb7049-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.771927 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2lj5\" (UniqueName: \"kubernetes.io/projected/4bacfacc-5c03-4550-a863-dfb174bb7049-kube-api-access-z2lj5\") on node \"crc\" DevicePath \"\"" Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.771952 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bacfacc-5c03-4550-a863-dfb174bb7049-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.798957 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_49ef9df1-2d5c-406a-9adf-26fd7bd95731/cinder-api-log/0.log" Mar 14 10:37:45 
crc kubenswrapper[5129]: I0314 10:37:45.964050 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_346b9da9-f8da-416b-aa79-e42409eb111a/probe/0.log" Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.984626 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49h95" event={"ID":"4bacfacc-5c03-4550-a863-dfb174bb7049","Type":"ContainerDied","Data":"1ac48a918d18eea6a3ac88f07cba403fb0b7f3a006e6e2252c37385167d89299"} Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.984678 5129 scope.go:117] "RemoveContainer" containerID="1238462a4ca94fae6666f84ee4ce689536ac3c58ce80e7c2017a8003179bb271" Mar 14 10:37:45 crc kubenswrapper[5129]: I0314 10:37:45.984822 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49h95" Mar 14 10:37:46 crc kubenswrapper[5129]: I0314 10:37:46.007138 5129 scope.go:117] "RemoveContainer" containerID="43272bacaa40835667f47b4cbeb31c31dd3df5e1c4a164117c3d570be43e1acc" Mar 14 10:37:46 crc kubenswrapper[5129]: I0314 10:37:46.036877 5129 scope.go:117] "RemoveContainer" containerID="b7a30d1a6b4568625150b4facbe998ca91c9a42d1a6ea6b41cdb4fdf479901bb" Mar 14 10:37:46 crc kubenswrapper[5129]: I0314 10:37:46.065914 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-49h95"] Mar 14 10:37:46 crc kubenswrapper[5129]: I0314 10:37:46.069207 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-49h95"] Mar 14 10:37:46 crc kubenswrapper[5129]: I0314 10:37:46.203845 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-m6xpr_1c2b2580-47f6-4635-8e4d-d8e6dbfb82f3/configure-network-openstack-openstack-cell1/0.log" Mar 14 10:37:46 crc kubenswrapper[5129]: I0314 10:37:46.332181 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-openstack-openstack-networker-64ksz_40be015e-e0ec-4758-ab71-d20347e005c2/configure-network-openstack-openstack-networker/0.log" Mar 14 10:37:46 crc kubenswrapper[5129]: I0314 10:37:46.581574 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-wdjj9_c0e67d08-ffac-4bd2-ad70-190d2a1808df/configure-os-openstack-openstack-cell1/0.log" Mar 14 10:37:46 crc kubenswrapper[5129]: I0314 10:37:46.634917 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-networker-p48r8_9cc591e9-0dd5-47e9-b60e-fd1476f5a130/configure-os-openstack-openstack-networker/0.log" Mar 14 10:37:46 crc kubenswrapper[5129]: I0314 10:37:46.884068 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7fc88b7669-hjn88_4fb66716-a152-4be1-a683-93241f8397d9/init/0.log" Mar 14 10:37:47 crc kubenswrapper[5129]: I0314 10:37:47.064439 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7fc88b7669-hjn88_4fb66716-a152-4be1-a683-93241f8397d9/init/0.log" Mar 14 10:37:47 crc kubenswrapper[5129]: I0314 10:37:47.173802 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-2vl8w_123dc78e-9c5e-4d0b-9b8a-16d2217f9f72/download-cache-openstack-openstack-cell1/0.log" Mar 14 10:37:47 crc kubenswrapper[5129]: I0314 10:37:47.569919 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-networker-wdpqk_f314e854-c21b-4a43-aa81-87d2ff072c30/download-cache-openstack-openstack-networker/0.log" Mar 14 10:37:47 crc kubenswrapper[5129]: I0314 10:37:47.861434 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_12904d5e-9819-4d47-b8bd-94b42f8b12d2/glance-log/0.log" Mar 14 10:37:47 crc kubenswrapper[5129]: I0314 10:37:47.985403 5129 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_12904d5e-9819-4d47-b8bd-94b42f8b12d2/glance-httpd/0.log" Mar 14 10:37:48 crc kubenswrapper[5129]: I0314 10:37:48.046103 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bacfacc-5c03-4550-a863-dfb174bb7049" path="/var/lib/kubelet/pods/4bacfacc-5c03-4550-a863-dfb174bb7049/volumes" Mar 14 10:37:48 crc kubenswrapper[5129]: I0314 10:37:48.358868 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9958e447-a9f1-4513-bd78-f13804f89650/glance-httpd/0.log" Mar 14 10:37:48 crc kubenswrapper[5129]: I0314 10:37:48.390032 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9958e447-a9f1-4513-bd78-f13804f89650/glance-log/0.log" Mar 14 10:37:49 crc kubenswrapper[5129]: I0314 10:37:49.160163 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-85c4d5f8c-p2pkn_6c6f4af4-bff1-4dd0-9ed8-3fb0d246acfc/heat-engine/0.log" Mar 14 10:37:49 crc kubenswrapper[5129]: I0314 10:37:49.578522 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:37:49 crc kubenswrapper[5129]: I0314 10:37:49.578588 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:37:49 crc kubenswrapper[5129]: I0314 10:37:49.690738 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-589757bd64-ph527_ccf2191a-b66b-4e06-a811-d38b6b465662/horizon/0.log" Mar 14 10:37:49 crc kubenswrapper[5129]: I0314 10:37:49.983970 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-76b5f79468-xdxdp_66f931d0-1eee-42cf-8ac3-998559b831ae/heat-api/0.log" Mar 14 10:37:50 crc kubenswrapper[5129]: I0314 10:37:50.051353 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-66dc9496c6-nvsvn_1b4b6a8a-c1a0-4104-aa02-b24f51abec71/heat-cfnapi/0.log" Mar 14 10:37:50 crc kubenswrapper[5129]: I0314 10:37:50.259635 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-pvr9b_7829da62-c2ff-4358-9ed3-147321c2292c/install-certs-openstack-openstack-cell1/0.log" Mar 14 10:37:50 crc kubenswrapper[5129]: I0314 10:37:50.478308 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-networker-74szz_330ed053-e545-4c73-987a-0289f46396ff/install-certs-openstack-openstack-networker/0.log" Mar 14 10:37:50 crc kubenswrapper[5129]: I0314 10:37:50.773272 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-fdhr7_abf4eec2-f79d-4f05-af43-da60a28bde7e/install-os-openstack-openstack-cell1/0.log" Mar 14 10:37:50 crc kubenswrapper[5129]: I0314 10:37:50.867244 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-589757bd64-ph527_ccf2191a-b66b-4e06-a811-d38b6b465662/horizon-log/0.log" Mar 14 10:37:51 crc kubenswrapper[5129]: I0314 10:37:51.076625 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-networker-m2ssc_9a5b9f85-5af8-4cde-9ab7-8e0e37ec7512/install-os-openstack-openstack-networker/0.log" Mar 14 10:37:51 crc kubenswrapper[5129]: I0314 10:37:51.444123 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-7fc88b7669-hjn88_4fb66716-a152-4be1-a683-93241f8397d9/dnsmasq-dns/0.log" Mar 14 10:37:51 crc kubenswrapper[5129]: I0314 10:37:51.539210 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29558041-fprmt_d2815225-50bd-4399-a38d-732afc1e06be/keystone-cron/0.log" Mar 14 10:37:51 crc kubenswrapper[5129]: I0314 10:37:51.546902 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29557981-mvkq7_06f8fa26-0897-4a17-a055-86534de558f7/keystone-cron/0.log" Mar 14 10:37:51 crc kubenswrapper[5129]: I0314 10:37:51.772136 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_f66a75c1-a404-4f9f-9017-3bb734aee917/kube-state-metrics/0.log" Mar 14 10:37:52 crc kubenswrapper[5129]: I0314 10:37:52.217301 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-p8qzl_623aba17-af6f-4ec2-8d79-1d71984816d2/libvirt-openstack-openstack-cell1/0.log" Mar 14 10:37:52 crc kubenswrapper[5129]: I0314 10:37:52.404573 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6b78d4f6dc-hl89h_59ccb607-3439-4284-b171-93d68e3ee432/keystone-api/0.log" Mar 14 10:37:52 crc kubenswrapper[5129]: I0314 10:37:52.947288 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-tg2d5_0d6afb2b-f47f-41f1-90f4-cfdd1eef8c4a/neutron-dhcp-openstack-openstack-cell1/0.log" Mar 14 10:37:53 crc kubenswrapper[5129]: I0314 10:37:53.165647 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-76f68cc57f-g29rv_764337bd-5689-4897-82aa-cc5d7e55e39f/neutron-httpd/0.log" Mar 14 10:37:53 crc kubenswrapper[5129]: I0314 10:37:53.202076 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-76f68cc57f-g29rv_764337bd-5689-4897-82aa-cc5d7e55e39f/neutron-api/0.log" Mar 14 10:37:53 crc 
kubenswrapper[5129]: I0314 10:37:53.235750 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-t694n_cad1b172-56bf-4ac5-b262-f336f90e825c/neutron-metadata-openstack-openstack-cell1/0.log" Mar 14 10:37:53 crc kubenswrapper[5129]: I0314 10:37:53.575961 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-networker-hpflm_62dd10c7-1cda-42f7-891e-6ec9740de425/neutron-metadata-openstack-openstack-networker/0.log" Mar 14 10:37:53 crc kubenswrapper[5129]: I0314 10:37:53.772403 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-56qrq_1b7e8f66-393d-491c-b548-2eb8e08b7b1c/neutron-sriov-openstack-openstack-cell1/0.log" Mar 14 10:37:54 crc kubenswrapper[5129]: I0314 10:37:54.397469 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_2ff2b01c-c6d6-4b6c-b50f-a3fd15cd2429/memcached/0.log" Mar 14 10:37:54 crc kubenswrapper[5129]: I0314 10:37:54.549166 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_b1870a10-2b74-40a6-9906-b370d1a00d1b/nova-cell0-conductor-conductor/0.log" Mar 14 10:37:54 crc kubenswrapper[5129]: I0314 10:37:54.585921 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_48f4f8c7-9270-42e8-aa7e-1ebe66a772e6/nova-api-log/0.log" Mar 14 10:37:54 crc kubenswrapper[5129]: I0314 10:37:54.772063 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_48f4f8c7-9270-42e8-aa7e-1ebe66a772e6/nova-api-api/0.log" Mar 14 10:37:54 crc kubenswrapper[5129]: I0314 10:37:54.844494 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_43a2480b-1483-449f-b24a-7d2213ad8a2f/nova-cell1-conductor-conductor/0.log" Mar 14 10:37:54 crc kubenswrapper[5129]: I0314 10:37:54.992447 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_407815b2-d340-427a-8b62-b87fab475772/nova-cell1-novncproxy-novncproxy/0.log" Mar 14 10:37:55 crc kubenswrapper[5129]: I0314 10:37:55.122838 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellzjkxr_4dba6caf-580f-48ed-a541-cfa5b8d62d6d/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Mar 14 10:37:55 crc kubenswrapper[5129]: I0314 10:37:55.325900 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-86mwr_ed1b368f-4cc5-4887-90ba-59b7ea16c6e9/nova-cell1-openstack-openstack-cell1/0.log" Mar 14 10:37:55 crc kubenswrapper[5129]: I0314 10:37:55.658066 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_363a41dc-efe8-4ae5-9939-e11a752eaa7f/nova-metadata-log/0.log" Mar 14 10:37:55 crc kubenswrapper[5129]: I0314 10:37:55.844270 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_86be847c-e666-4563-86ae-9c3e5300fe45/nova-scheduler-scheduler/0.log" Mar 14 10:37:55 crc kubenswrapper[5129]: I0314 10:37:55.846071 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8731d474-6329-4a56-be08-fe3d12bb33cd/mysql-bootstrap/0.log" Mar 14 10:37:55 crc kubenswrapper[5129]: I0314 10:37:55.897516 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_363a41dc-efe8-4ae5-9939-e11a752eaa7f/nova-metadata-metadata/0.log" Mar 14 10:37:56 crc kubenswrapper[5129]: I0314 10:37:56.208143 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8731d474-6329-4a56-be08-fe3d12bb33cd/galera/0.log" Mar 14 10:37:56 crc kubenswrapper[5129]: I0314 10:37:56.326388 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_8731d474-6329-4a56-be08-fe3d12bb33cd/mysql-bootstrap/0.log" Mar 14 10:37:56 crc kubenswrapper[5129]: I0314 10:37:56.361445 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_069305d5-891e-4158-ba2e-9fc30afeadcb/mysql-bootstrap/0.log" Mar 14 10:37:56 crc kubenswrapper[5129]: I0314 10:37:56.473530 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_069305d5-891e-4158-ba2e-9fc30afeadcb/galera/0.log" Mar 14 10:37:56 crc kubenswrapper[5129]: I0314 10:37:56.527855 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_069305d5-891e-4158-ba2e-9fc30afeadcb/mysql-bootstrap/0.log" Mar 14 10:37:56 crc kubenswrapper[5129]: I0314 10:37:56.690899 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ad913420-c19c-4d99-8d9c-b854a4a605d6/openstackclient/0.log" Mar 14 10:37:56 crc kubenswrapper[5129]: I0314 10:37:56.777978 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_00027bfa-9ff4-4472-8f9d-3763d68530d3/ovn-northd/0.log" Mar 14 10:37:56 crc kubenswrapper[5129]: I0314 10:37:56.817478 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_00027bfa-9ff4-4472-8f9d-3763d68530d3/openstack-network-exporter/0.log" Mar 14 10:37:56 crc kubenswrapper[5129]: I0314 10:37:56.997643 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-9g2lm_70bcb47f-9330-44d8-8527-f73c8066eac0/ovn-openstack-openstack-cell1/0.log" Mar 14 10:37:57 crc kubenswrapper[5129]: I0314 10:37:57.098967 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-networker-2fl4w_98b9e1f1-5d2f-44b8-b30c-9ec2958abac1/ovn-openstack-openstack-networker/0.log" Mar 14 10:37:57 crc kubenswrapper[5129]: I0314 10:37:57.245750 5129 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8ecea838-e1a9-4aa0-8602-ffe9621ff137/openstack-network-exporter/0.log" Mar 14 10:37:57 crc kubenswrapper[5129]: I0314 10:37:57.326370 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8ecea838-e1a9-4aa0-8602-ffe9621ff137/ovsdbserver-nb/0.log" Mar 14 10:37:57 crc kubenswrapper[5129]: I0314 10:37:57.360796 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_92b6acaf-6100-4355-abb5-08af58c73021/ovsdbserver-nb/0.log" Mar 14 10:37:57 crc kubenswrapper[5129]: I0314 10:37:57.464092 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_92b6acaf-6100-4355-abb5-08af58c73021/openstack-network-exporter/0.log" Mar 14 10:37:57 crc kubenswrapper[5129]: I0314 10:37:57.573715 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa/openstack-network-exporter/0.log" Mar 14 10:37:57 crc kubenswrapper[5129]: I0314 10:37:57.663529 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_0f7f65bf-8ebe-48e5-933e-8fa422a7dbfa/ovsdbserver-nb/0.log" Mar 14 10:37:57 crc kubenswrapper[5129]: I0314 10:37:57.678239 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c3371490-9878-4c69-8390-4b2aed82dd1d/openstack-network-exporter/0.log" Mar 14 10:37:57 crc kubenswrapper[5129]: I0314 10:37:57.857533 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c3371490-9878-4c69-8390-4b2aed82dd1d/ovsdbserver-sb/0.log" Mar 14 10:37:57 crc kubenswrapper[5129]: I0314 10:37:57.898625 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_e80a5513-d89e-43d3-9d57-d95ee6b3295c/openstack-network-exporter/0.log" Mar 14 10:37:57 crc kubenswrapper[5129]: I0314 10:37:57.990345 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-1_e80a5513-d89e-43d3-9d57-d95ee6b3295c/ovsdbserver-sb/0.log" Mar 14 10:37:58 crc kubenswrapper[5129]: I0314 10:37:58.150913 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_db6190b7-2620-4a00-b2bd-ce56d2c94069/ovsdbserver-sb/0.log" Mar 14 10:37:58 crc kubenswrapper[5129]: I0314 10:37:58.192512 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_db6190b7-2620-4a00-b2bd-ce56d2c94069/openstack-network-exporter/0.log" Mar 14 10:37:58 crc kubenswrapper[5129]: I0314 10:37:58.323717 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7b7d77b964-ktb5v_e9b25878-27e7-4295-8d72-0011b2031ba3/placement-api/0.log" Mar 14 10:37:58 crc kubenswrapper[5129]: I0314 10:37:58.543035 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cpwtc4_e5aeecff-523e-415e-bb7c-121a3ca25973/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Mar 14 10:37:58 crc kubenswrapper[5129]: I0314 10:37:58.584772 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7b7d77b964-ktb5v_e9b25878-27e7-4295-8d72-0011b2031ba3/placement-log/0.log" Mar 14 10:37:58 crc kubenswrapper[5129]: I0314 10:37:58.692776 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-nhf944_8719eb25-616a-4a7d-9f70-ebf7f6216b59/pre-adoption-validation-openstack-pre-adoption-openstack-networ/0.log" Mar 14 10:37:58 crc kubenswrapper[5129]: I0314 10:37:58.789857 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_58775514-86d8-43ef-8b77-d25a3a2e0380/init-config-reloader/0.log" Mar 14 10:37:58 crc kubenswrapper[5129]: I0314 10:37:58.964666 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_prometheus-metric-storage-0_58775514-86d8-43ef-8b77-d25a3a2e0380/init-config-reloader/0.log" Mar 14 10:37:58 crc kubenswrapper[5129]: I0314 10:37:58.965935 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_58775514-86d8-43ef-8b77-d25a3a2e0380/thanos-sidecar/0.log" Mar 14 10:37:59 crc kubenswrapper[5129]: I0314 10:37:59.002662 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_58775514-86d8-43ef-8b77-d25a3a2e0380/config-reloader/0.log" Mar 14 10:37:59 crc kubenswrapper[5129]: I0314 10:37:59.009750 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_58775514-86d8-43ef-8b77-d25a3a2e0380/prometheus/0.log" Mar 14 10:37:59 crc kubenswrapper[5129]: I0314 10:37:59.149799 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1c3e6ea4-bab0-434c-82b1-d5301345b1ac/setup-container/0.log" Mar 14 10:37:59 crc kubenswrapper[5129]: I0314 10:37:59.349366 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1c3e6ea4-bab0-434c-82b1-d5301345b1ac/rabbitmq/0.log" Mar 14 10:37:59 crc kubenswrapper[5129]: I0314 10:37:59.368107 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1c3e6ea4-bab0-434c-82b1-d5301345b1ac/setup-container/0.log" Mar 14 10:37:59 crc kubenswrapper[5129]: I0314 10:37:59.410898 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b433fc5a-ef90-4dc7-9648-f081946560f4/setup-container/0.log" Mar 14 10:37:59 crc kubenswrapper[5129]: I0314 10:37:59.826503 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b433fc5a-ef90-4dc7-9648-f081946560f4/rabbitmq/0.log" Mar 14 10:37:59 crc kubenswrapper[5129]: I0314 10:37:59.859446 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_b433fc5a-ef90-4dc7-9648-f081946560f4/setup-container/0.log" Mar 14 10:37:59 crc kubenswrapper[5129]: I0314 10:37:59.871770 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-qkvqk_dd1ca20d-4a40-4708-a05d-645ce7f315a2/reboot-os-openstack-openstack-cell1/0.log" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.012204 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-networker-8hjzt_e24d1244-561f-4d4a-a744-dd7255702ffb/reboot-os-openstack-openstack-networker/0.log" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.096487 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-fdmgm_d4f670da-3d07-4a74-a292-c7359810c516/run-os-openstack-openstack-cell1/0.log" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.116962 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-networker-7dm4w_7e48906b-10f5-468a-97d2-be4873be5eaa/run-os-openstack-openstack-networker/0.log" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.139027 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558078-m7mj2"] Mar 14 10:38:00 crc kubenswrapper[5129]: E0314 10:38:00.139512 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bacfacc-5c03-4550-a863-dfb174bb7049" containerName="extract-utilities" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.139526 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bacfacc-5c03-4550-a863-dfb174bb7049" containerName="extract-utilities" Mar 14 10:38:00 crc kubenswrapper[5129]: E0314 10:38:00.139563 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bacfacc-5c03-4550-a863-dfb174bb7049" containerName="extract-content" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.139569 5129 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4bacfacc-5c03-4550-a863-dfb174bb7049" containerName="extract-content" Mar 14 10:38:00 crc kubenswrapper[5129]: E0314 10:38:00.139578 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bacfacc-5c03-4550-a863-dfb174bb7049" containerName="registry-server" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.139586 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bacfacc-5c03-4550-a863-dfb174bb7049" containerName="registry-server" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.139804 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bacfacc-5c03-4550-a863-dfb174bb7049" containerName="registry-server" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.140493 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558078-m7mj2" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.142625 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.142645 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.143038 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.152165 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558078-m7mj2"] Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.282911 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx2wx\" (UniqueName: \"kubernetes.io/projected/6435d505-cd76-4e43-b41c-7eb92c46b41e-kube-api-access-mx2wx\") pod \"auto-csr-approver-29558078-m7mj2\" (UID: 
\"6435d505-cd76-4e43-b41c-7eb92c46b41e\") " pod="openshift-infra/auto-csr-approver-29558078-m7mj2" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.288775 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-49tl9_72f9865d-1bad-481b-8303-99a656a45ea5/ssh-known-hosts-openstack/0.log" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.384217 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx2wx\" (UniqueName: \"kubernetes.io/projected/6435d505-cd76-4e43-b41c-7eb92c46b41e-kube-api-access-mx2wx\") pod \"auto-csr-approver-29558078-m7mj2\" (UID: \"6435d505-cd76-4e43-b41c-7eb92c46b41e\") " pod="openshift-infra/auto-csr-approver-29558078-m7mj2" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.414228 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx2wx\" (UniqueName: \"kubernetes.io/projected/6435d505-cd76-4e43-b41c-7eb92c46b41e-kube-api-access-mx2wx\") pod \"auto-csr-approver-29558078-m7mj2\" (UID: \"6435d505-cd76-4e43-b41c-7eb92c46b41e\") " pod="openshift-infra/auto-csr-approver-29558078-m7mj2" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.491550 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558078-m7mj2" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.519069 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5679cc5964-ql2sl_fd82fc6b-2f78-4b93-9c8a-2135600be3e0/proxy-server/0.log" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.585444 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-tbxqz_b53ff87d-b7f7-4d68-834c-7fa95020d95a/swift-ring-rebalance/0.log" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.799688 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5679cc5964-ql2sl_fd82fc6b-2f78-4b93-9c8a-2135600be3e0/proxy-httpd/0.log" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.822404 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74005c3b-1ed5-4e99-ae27-26b92fdee7a1/account-auditor/0.log" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.846825 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74005c3b-1ed5-4e99-ae27-26b92fdee7a1/account-replicator/0.log" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.849313 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74005c3b-1ed5-4e99-ae27-26b92fdee7a1/account-reaper/0.log" Mar 14 10:38:00 crc kubenswrapper[5129]: I0314 10:38:00.982036 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74005c3b-1ed5-4e99-ae27-26b92fdee7a1/container-auditor/0.log" Mar 14 10:38:01 crc kubenswrapper[5129]: I0314 10:38:01.056280 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74005c3b-1ed5-4e99-ae27-26b92fdee7a1/account-server/0.log" Mar 14 10:38:01 crc kubenswrapper[5129]: I0314 10:38:01.056945 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_74005c3b-1ed5-4e99-ae27-26b92fdee7a1/container-updater/0.log" Mar 14 10:38:01 crc kubenswrapper[5129]: I0314 10:38:01.158639 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74005c3b-1ed5-4e99-ae27-26b92fdee7a1/container-replicator/0.log" Mar 14 10:38:01 crc kubenswrapper[5129]: I0314 10:38:01.223820 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558078-m7mj2"] Mar 14 10:38:01 crc kubenswrapper[5129]: I0314 10:38:01.326064 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74005c3b-1ed5-4e99-ae27-26b92fdee7a1/object-auditor/0.log" Mar 14 10:38:01 crc kubenswrapper[5129]: I0314 10:38:01.377137 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74005c3b-1ed5-4e99-ae27-26b92fdee7a1/object-expirer/0.log" Mar 14 10:38:01 crc kubenswrapper[5129]: I0314 10:38:01.402489 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74005c3b-1ed5-4e99-ae27-26b92fdee7a1/object-replicator/0.log" Mar 14 10:38:01 crc kubenswrapper[5129]: I0314 10:38:01.473450 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74005c3b-1ed5-4e99-ae27-26b92fdee7a1/container-server/0.log" Mar 14 10:38:01 crc kubenswrapper[5129]: I0314 10:38:01.602741 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74005c3b-1ed5-4e99-ae27-26b92fdee7a1/object-updater/0.log" Mar 14 10:38:01 crc kubenswrapper[5129]: I0314 10:38:01.681114 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74005c3b-1ed5-4e99-ae27-26b92fdee7a1/swift-recon-cron/0.log" Mar 14 10:38:01 crc kubenswrapper[5129]: I0314 10:38:01.801643 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_40db9059-374b-4b74-8e84-5c360deb5e34/account-auditor/0.log" Mar 14 
10:38:01 crc kubenswrapper[5129]: I0314 10:38:01.875042 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74005c3b-1ed5-4e99-ae27-26b92fdee7a1/object-server/0.log" Mar 14 10:38:01 crc kubenswrapper[5129]: I0314 10:38:01.917762 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_40db9059-374b-4b74-8e84-5c360deb5e34/account-reaper/0.log" Mar 14 10:38:01 crc kubenswrapper[5129]: I0314 10:38:01.991348 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74005c3b-1ed5-4e99-ae27-26b92fdee7a1/rsync/0.log" Mar 14 10:38:02 crc kubenswrapper[5129]: I0314 10:38:02.046782 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_40db9059-374b-4b74-8e84-5c360deb5e34/account-replicator/0.log" Mar 14 10:38:02 crc kubenswrapper[5129]: I0314 10:38:02.147110 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_40db9059-374b-4b74-8e84-5c360deb5e34/container-auditor/0.log" Mar 14 10:38:02 crc kubenswrapper[5129]: I0314 10:38:02.173975 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_40db9059-374b-4b74-8e84-5c360deb5e34/account-server/0.log" Mar 14 10:38:02 crc kubenswrapper[5129]: I0314 10:38:02.215163 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_40db9059-374b-4b74-8e84-5c360deb5e34/container-replicator/0.log" Mar 14 10:38:02 crc kubenswrapper[5129]: I0314 10:38:02.227998 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558078-m7mj2" event={"ID":"6435d505-cd76-4e43-b41c-7eb92c46b41e","Type":"ContainerStarted","Data":"e5de09119791428cb25705e25fd440d72c86765b4f32fda036cb4e5f452c9203"} Mar 14 10:38:02 crc kubenswrapper[5129]: I0314 10:38:02.292980 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-1_40db9059-374b-4b74-8e84-5c360deb5e34/container-updater/0.log" Mar 14 10:38:02 crc kubenswrapper[5129]: I0314 10:38:02.472174 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_40db9059-374b-4b74-8e84-5c360deb5e34/object-auditor/0.log" Mar 14 10:38:02 crc kubenswrapper[5129]: I0314 10:38:02.510911 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_40db9059-374b-4b74-8e84-5c360deb5e34/object-expirer/0.log" Mar 14 10:38:02 crc kubenswrapper[5129]: I0314 10:38:02.534530 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_40db9059-374b-4b74-8e84-5c360deb5e34/object-replicator/0.log" Mar 14 10:38:02 crc kubenswrapper[5129]: I0314 10:38:02.691127 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_40db9059-374b-4b74-8e84-5c360deb5e34/container-server/0.log" Mar 14 10:38:02 crc kubenswrapper[5129]: I0314 10:38:02.719733 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_40db9059-374b-4b74-8e84-5c360deb5e34/object-updater/0.log" Mar 14 10:38:02 crc kubenswrapper[5129]: I0314 10:38:02.886081 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_40db9059-374b-4b74-8e84-5c360deb5e34/swift-recon-cron/0.log" Mar 14 10:38:02 crc kubenswrapper[5129]: I0314 10:38:02.948006 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_9149055f-2604-48d3-85d4-7e109aeb13db/account-auditor/0.log" Mar 14 10:38:03 crc kubenswrapper[5129]: I0314 10:38:03.119270 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_40db9059-374b-4b74-8e84-5c360deb5e34/object-server/0.log" Mar 14 10:38:03 crc kubenswrapper[5129]: I0314 10:38:03.151459 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-2_9149055f-2604-48d3-85d4-7e109aeb13db/account-reaper/0.log" Mar 14 10:38:03 crc kubenswrapper[5129]: I0314 10:38:03.225580 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_9149055f-2604-48d3-85d4-7e109aeb13db/account-replicator/0.log" Mar 14 10:38:03 crc kubenswrapper[5129]: I0314 10:38:03.225702 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-1_40db9059-374b-4b74-8e84-5c360deb5e34/rsync/0.log" Mar 14 10:38:03 crc kubenswrapper[5129]: I0314 10:38:03.237673 5129 generic.go:334] "Generic (PLEG): container finished" podID="6435d505-cd76-4e43-b41c-7eb92c46b41e" containerID="bfcd4769cc567bd4313440d1e97a7460a0698bdf36833a5f3848fd7a10fe083a" exitCode=0 Mar 14 10:38:03 crc kubenswrapper[5129]: I0314 10:38:03.237714 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558078-m7mj2" event={"ID":"6435d505-cd76-4e43-b41c-7eb92c46b41e","Type":"ContainerDied","Data":"bfcd4769cc567bd4313440d1e97a7460a0698bdf36833a5f3848fd7a10fe083a"} Mar 14 10:38:03 crc kubenswrapper[5129]: I0314 10:38:03.343062 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_9149055f-2604-48d3-85d4-7e109aeb13db/container-auditor/0.log" Mar 14 10:38:03 crc kubenswrapper[5129]: I0314 10:38:03.475818 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_9149055f-2604-48d3-85d4-7e109aeb13db/container-replicator/0.log" Mar 14 10:38:03 crc kubenswrapper[5129]: I0314 10:38:03.479276 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_9149055f-2604-48d3-85d4-7e109aeb13db/container-updater/0.log" Mar 14 10:38:03 crc kubenswrapper[5129]: I0314 10:38:03.539176 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_9149055f-2604-48d3-85d4-7e109aeb13db/account-server/0.log" Mar 14 10:38:03 crc 
kubenswrapper[5129]: I0314 10:38:03.638961 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_9149055f-2604-48d3-85d4-7e109aeb13db/object-auditor/0.log" Mar 14 10:38:03 crc kubenswrapper[5129]: I0314 10:38:03.791427 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_9149055f-2604-48d3-85d4-7e109aeb13db/object-expirer/0.log" Mar 14 10:38:03 crc kubenswrapper[5129]: I0314 10:38:03.816906 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_9149055f-2604-48d3-85d4-7e109aeb13db/object-replicator/0.log" Mar 14 10:38:03 crc kubenswrapper[5129]: I0314 10:38:03.874322 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_9149055f-2604-48d3-85d4-7e109aeb13db/object-updater/0.log" Mar 14 10:38:03 crc kubenswrapper[5129]: I0314 10:38:03.939752 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_9149055f-2604-48d3-85d4-7e109aeb13db/container-server/0.log" Mar 14 10:38:04 crc kubenswrapper[5129]: I0314 10:38:04.020445 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_9149055f-2604-48d3-85d4-7e109aeb13db/swift-recon-cron/0.log" Mar 14 10:38:04 crc kubenswrapper[5129]: I0314 10:38:04.303285 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-9zmjr_5e83070c-b8b4-468d-bf05-414509537764/telemetry-openstack-openstack-cell1/0.log" Mar 14 10:38:04 crc kubenswrapper[5129]: I0314 10:38:04.345954 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_9149055f-2604-48d3-85d4-7e109aeb13db/object-server/0.log" Mar 14 10:38:04 crc kubenswrapper[5129]: I0314 10:38:04.424850 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_4883291a-4eb0-4a7b-82e9-f0caa4f72148/tempest-tests-tempest-tests-runner/0.log" Mar 14 10:38:04 crc 
kubenswrapper[5129]: I0314 10:38:04.577421 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-2_9149055f-2604-48d3-85d4-7e109aeb13db/rsync/0.log" Mar 14 10:38:04 crc kubenswrapper[5129]: I0314 10:38:04.587851 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_912cd5e3-94b8-4559-a20e-470b124e747a/test-operator-logs-container/0.log" Mar 14 10:38:04 crc kubenswrapper[5129]: I0314 10:38:04.728882 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-bnnch_78038caa-d464-4b63-a304-057a4393be51/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Mar 14 10:38:04 crc kubenswrapper[5129]: I0314 10:38:04.926257 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-networker-kqp4q_93cba590-03e6-448b-a8d0-d61dbe971a6f/tripleo-cleanup-tripleo-cleanup-openstack-networker/0.log" Mar 14 10:38:05 crc kubenswrapper[5129]: I0314 10:38:05.050211 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-zrt44_62bc5cd6-1f77-4220-ba45-f5376a5b0dec/validate-network-openstack-openstack-cell1/0.log" Mar 14 10:38:05 crc kubenswrapper[5129]: I0314 10:38:05.137440 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-networker-x4xn2_5f757610-030c-48d6-ae59-dfe99c5edb1a/validate-network-openstack-openstack-networker/0.log" Mar 14 10:38:05 crc kubenswrapper[5129]: I0314 10:38:05.607317 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558078-m7mj2" Mar 14 10:38:05 crc kubenswrapper[5129]: I0314 10:38:05.735523 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx2wx\" (UniqueName: \"kubernetes.io/projected/6435d505-cd76-4e43-b41c-7eb92c46b41e-kube-api-access-mx2wx\") pod \"6435d505-cd76-4e43-b41c-7eb92c46b41e\" (UID: \"6435d505-cd76-4e43-b41c-7eb92c46b41e\") " Mar 14 10:38:05 crc kubenswrapper[5129]: I0314 10:38:05.742857 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6435d505-cd76-4e43-b41c-7eb92c46b41e-kube-api-access-mx2wx" (OuterVolumeSpecName: "kube-api-access-mx2wx") pod "6435d505-cd76-4e43-b41c-7eb92c46b41e" (UID: "6435d505-cd76-4e43-b41c-7eb92c46b41e"). InnerVolumeSpecName "kube-api-access-mx2wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:38:05 crc kubenswrapper[5129]: I0314 10:38:05.837909 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx2wx\" (UniqueName: \"kubernetes.io/projected/6435d505-cd76-4e43-b41c-7eb92c46b41e-kube-api-access-mx2wx\") on node \"crc\" DevicePath \"\"" Mar 14 10:38:06 crc kubenswrapper[5129]: I0314 10:38:06.265306 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558078-m7mj2" event={"ID":"6435d505-cd76-4e43-b41c-7eb92c46b41e","Type":"ContainerDied","Data":"e5de09119791428cb25705e25fd440d72c86765b4f32fda036cb4e5f452c9203"} Mar 14 10:38:06 crc kubenswrapper[5129]: I0314 10:38:06.265347 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5de09119791428cb25705e25fd440d72c86765b4f32fda036cb4e5f452c9203" Mar 14 10:38:06 crc kubenswrapper[5129]: I0314 10:38:06.265352 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558078-m7mj2" Mar 14 10:38:06 crc kubenswrapper[5129]: I0314 10:38:06.670226 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558072-bch5s"] Mar 14 10:38:06 crc kubenswrapper[5129]: I0314 10:38:06.682302 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558072-bch5s"] Mar 14 10:38:08 crc kubenswrapper[5129]: I0314 10:38:08.053891 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bb00404-30aa-431f-a70b-6d62565f1526" path="/var/lib/kubelet/pods/1bb00404-30aa-431f-a70b-6d62565f1526/volumes" Mar 14 10:38:19 crc kubenswrapper[5129]: I0314 10:38:19.573800 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:38:19 crc kubenswrapper[5129]: I0314 10:38:19.574306 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:38:19 crc kubenswrapper[5129]: I0314 10:38:19.574353 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 10:38:19 crc kubenswrapper[5129]: I0314 10:38:19.575113 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d5a7a7f050827fb797d412717f100d7b334050f76e813160cf2256327e999ae1"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 10:38:19 crc kubenswrapper[5129]: I0314 10:38:19.575166 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://d5a7a7f050827fb797d412717f100d7b334050f76e813160cf2256327e999ae1" gracePeriod=600 Mar 14 10:38:20 crc kubenswrapper[5129]: I0314 10:38:20.245390 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="d5a7a7f050827fb797d412717f100d7b334050f76e813160cf2256327e999ae1" exitCode=0 Mar 14 10:38:20 crc kubenswrapper[5129]: I0314 10:38:20.245660 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"d5a7a7f050827fb797d412717f100d7b334050f76e813160cf2256327e999ae1"} Mar 14 10:38:20 crc kubenswrapper[5129]: I0314 10:38:20.245687 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553"} Mar 14 10:38:20 crc kubenswrapper[5129]: I0314 10:38:20.245703 5129 scope.go:117] "RemoveContainer" containerID="d98b69b2941cdb149de10970bbaa120e392270875dde2b052ac0bc5c2ff3c194" Mar 14 10:38:39 crc kubenswrapper[5129]: I0314 10:38:39.014814 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c_5f53d8e2-e38d-425d-948c-bd705f3c273f/util/0.log" Mar 14 10:38:39 crc kubenswrapper[5129]: I0314 10:38:39.265438 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c_5f53d8e2-e38d-425d-948c-bd705f3c273f/util/0.log" Mar 14 10:38:39 crc kubenswrapper[5129]: I0314 10:38:39.315836 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c_5f53d8e2-e38d-425d-948c-bd705f3c273f/pull/0.log" Mar 14 10:38:39 crc kubenswrapper[5129]: I0314 10:38:39.319946 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c_5f53d8e2-e38d-425d-948c-bd705f3c273f/pull/0.log" Mar 14 10:38:39 crc kubenswrapper[5129]: I0314 10:38:39.731695 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c_5f53d8e2-e38d-425d-948c-bd705f3c273f/extract/0.log" Mar 14 10:38:39 crc kubenswrapper[5129]: I0314 10:38:39.772640 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c_5f53d8e2-e38d-425d-948c-bd705f3c273f/util/0.log" Mar 14 10:38:39 crc kubenswrapper[5129]: I0314 10:38:39.778847 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298kvp2c_5f53d8e2-e38d-425d-948c-bd705f3c273f/pull/0.log" Mar 14 10:38:40 crc kubenswrapper[5129]: I0314 10:38:40.033515 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-d47688694-s8dzz_ff95ea02-39f3-4d5c-85c7-c194f3ee2ef5/manager/0.log" Mar 14 10:38:40 crc kubenswrapper[5129]: I0314 10:38:40.289192 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-rkcbz_689ffa0e-d6ac-4dfb-bd55-3a957aeb1cb6/manager/0.log" Mar 14 10:38:40 crc 
kubenswrapper[5129]: I0314 10:38:40.524015 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-rzk97_33a84627-b97d-4f3b-84ec-d81a54c1e56c/manager/0.log" Mar 14 10:38:40 crc kubenswrapper[5129]: I0314 10:38:40.676238 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-gmxp8_0ece0928-a8a0-46b8-98bd-8b35c8d07fca/manager/0.log" Mar 14 10:38:40 crc kubenswrapper[5129]: I0314 10:38:40.890665 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-fjnr7_a56d7e4a-22ff-40c9-b6f6-b070fc628880/manager/0.log" Mar 14 10:38:41 crc kubenswrapper[5129]: I0314 10:38:41.315652 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bc894d9b-cwhxs_2c7b2dc9-e244-426d-b611-ee2629816c17/manager/0.log" Mar 14 10:38:41 crc kubenswrapper[5129]: I0314 10:38:41.751109 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-n2qxq_0bc8a10a-075b-4eb1-96cd-081c4ce39a30/manager/0.log" Mar 14 10:38:41 crc kubenswrapper[5129]: I0314 10:38:41.954567 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-57b484b4df-fb6px_04e8856d-e262-463a-9162-cb7ceef75a38/manager/0.log" Mar 14 10:38:42 crc kubenswrapper[5129]: I0314 10:38:42.011056 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-54dc5b8f8d-8b67l_5b7ec223-c34a-45c1-926c-e957e8cd3086/manager/0.log" Mar 14 10:38:42 crc kubenswrapper[5129]: I0314 10:38:42.470821 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-wzdf7_cb4dadcd-075b-4d15-b6e8-90baf37ff7d0/manager/0.log" Mar 14 
10:38:42 crc kubenswrapper[5129]: I0314 10:38:42.504292 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5b6b6b4c9f-kvm84_cc8b7c57-6f95-4a5c-ba46-2d6099c2d2b5/manager/0.log" Mar 14 10:38:42 crc kubenswrapper[5129]: I0314 10:38:42.894651 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-2dmkf_f447b14b-2d09-416b-96b4-126ab3dc2515/manager/0.log" Mar 14 10:38:43 crc kubenswrapper[5129]: I0314 10:38:43.025780 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7f84474648-lprxs_91e2b49a-9bac-44a1-9b90-22f62a1ce727/manager/0.log" Mar 14 10:38:43 crc kubenswrapper[5129]: I0314 10:38:43.180258 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f7958d7744kbgv_e789f354-e686-4cc9-a705-3af685a25988/manager/0.log" Mar 14 10:38:43 crc kubenswrapper[5129]: I0314 10:38:43.389072 5129 scope.go:117] "RemoveContainer" containerID="abde1e518bdb5bf06ae20d23cf74e90b5663ce2a8a8061252994f2b5d5201fbb" Mar 14 10:38:43 crc kubenswrapper[5129]: I0314 10:38:43.608269 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6dc56d8cd6-r5vg4_ee2defc7-719a-4d20-94ee-0ce74a6015c6/operator/0.log" Mar 14 10:38:44 crc kubenswrapper[5129]: I0314 10:38:44.127133 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-9z48t_e21e58a4-940c-4131-9d23-645393687367/manager/0.log" Mar 14 10:38:44 crc kubenswrapper[5129]: I0314 10:38:44.197782 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-56jrj_39018862-672e-47fd-85bb-c1baa5a8db7b/registry-server/0.log" Mar 14 10:38:44 crc kubenswrapper[5129]: I0314 10:38:44.270181 5129 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-6vg2z_3b985683-b068-4b76-b702-927b15cc10ff/manager/0.log" Mar 14 10:38:44 crc kubenswrapper[5129]: I0314 10:38:44.383874 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-h5zmn_1bf2bb51-4fd1-4d88-b663-3d41e4236ecd/manager/0.log" Mar 14 10:38:44 crc kubenswrapper[5129]: I0314 10:38:44.523897 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xgnfq_0be8f03b-22f7-421f-9da8-a3653e323613/operator/0.log" Mar 14 10:38:44 crc kubenswrapper[5129]: I0314 10:38:44.761730 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-7f9cc5dd44-4tm5b_3a4f21bd-d487-4af5-b741-4971cf4b11d1/manager/0.log" Mar 14 10:38:45 crc kubenswrapper[5129]: I0314 10:38:45.058416 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-nmvb4_1d8c0991-2d49-42c4-bed5-62c86ef72f24/manager/0.log" Mar 14 10:38:45 crc kubenswrapper[5129]: I0314 10:38:45.192798 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6854b8b9d9-kpb92_15bf3262-1e7c-42a9-bf65-f8507856d922/manager/0.log" Mar 14 10:38:45 crc kubenswrapper[5129]: I0314 10:38:45.612542 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-58xnz_b1d9e89a-c41d-44a3-8fae-5a6b69f0f99f/manager/0.log" Mar 14 10:38:46 crc kubenswrapper[5129]: I0314 10:38:46.434574 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6484b7b757-jwxjp_4484eedf-8b6d-45a2-af19-09ada3258a22/manager/0.log" Mar 14 10:39:09 crc kubenswrapper[5129]: I0314 
10:39:09.207920 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-rrt62_67f3717d-a52d-46b9-9132-044239d564c3/control-plane-machine-set-operator/0.log" Mar 14 10:39:09 crc kubenswrapper[5129]: I0314 10:39:09.414509 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vtzj7_095d17bf-b9c3-42e6-b8c9-39b929c52d50/kube-rbac-proxy/0.log" Mar 14 10:39:09 crc kubenswrapper[5129]: I0314 10:39:09.453837 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vtzj7_095d17bf-b9c3-42e6-b8c9-39b929c52d50/machine-api-operator/0.log" Mar 14 10:39:27 crc kubenswrapper[5129]: I0314 10:39:27.466761 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-bjkz4_a08bceb1-f043-47d5-adea-41adb89f8acc/cert-manager-controller/0.log" Mar 14 10:39:27 crc kubenswrapper[5129]: I0314 10:39:27.686242 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-z95qp_38b33329-e9fa-4b4c-8287-e85ece3d93d6/cert-manager-cainjector/0.log" Mar 14 10:39:27 crc kubenswrapper[5129]: I0314 10:39:27.806243 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-kgd5m_8fef9f0a-fc01-4562-ab16-343197009953/cert-manager-webhook/0.log" Mar 14 10:39:44 crc kubenswrapper[5129]: I0314 10:39:44.187011 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-zpxkf_1a4817ce-3ee7-4738-a1f2-5ae751ad564f/nmstate-console-plugin/0.log" Mar 14 10:39:44 crc kubenswrapper[5129]: I0314 10:39:44.422178 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-jlwnk_d2f313f4-3ae1-4c6e-b8ff-69e4e9c189d3/nmstate-handler/0.log" Mar 14 10:39:44 crc kubenswrapper[5129]: I0314 
10:39:44.689945 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-tqfq9_684cf0e0-3f2e-4904-a9a3-257015dd0f03/kube-rbac-proxy/0.log" Mar 14 10:39:44 crc kubenswrapper[5129]: I0314 10:39:44.690388 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-tqfq9_684cf0e0-3f2e-4904-a9a3-257015dd0f03/nmstate-metrics/0.log" Mar 14 10:39:44 crc kubenswrapper[5129]: I0314 10:39:44.924794 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-zkslc_4400f598-1fed-45cc-b987-ee2190cef8b4/nmstate-operator/0.log" Mar 14 10:39:45 crc kubenswrapper[5129]: I0314 10:39:45.062657 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-4rmxr_ea3fee1e-d109-4b58-86a8-919ace67ad6a/nmstate-webhook/0.log" Mar 14 10:39:56 crc kubenswrapper[5129]: I0314 10:39:56.738597 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6cdjh"] Mar 14 10:39:56 crc kubenswrapper[5129]: E0314 10:39:56.739473 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6435d505-cd76-4e43-b41c-7eb92c46b41e" containerName="oc" Mar 14 10:39:56 crc kubenswrapper[5129]: I0314 10:39:56.739484 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="6435d505-cd76-4e43-b41c-7eb92c46b41e" containerName="oc" Mar 14 10:39:56 crc kubenswrapper[5129]: I0314 10:39:56.739759 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="6435d505-cd76-4e43-b41c-7eb92c46b41e" containerName="oc" Mar 14 10:39:56 crc kubenswrapper[5129]: I0314 10:39:56.742260 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6cdjh" Mar 14 10:39:56 crc kubenswrapper[5129]: I0314 10:39:56.751149 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6cdjh"] Mar 14 10:39:56 crc kubenswrapper[5129]: I0314 10:39:56.833218 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pbw8\" (UniqueName: \"kubernetes.io/projected/150003e3-0c5e-4b1a-8370-df181d086f5b-kube-api-access-7pbw8\") pod \"redhat-operators-6cdjh\" (UID: \"150003e3-0c5e-4b1a-8370-df181d086f5b\") " pod="openshift-marketplace/redhat-operators-6cdjh" Mar 14 10:39:56 crc kubenswrapper[5129]: I0314 10:39:56.833379 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/150003e3-0c5e-4b1a-8370-df181d086f5b-utilities\") pod \"redhat-operators-6cdjh\" (UID: \"150003e3-0c5e-4b1a-8370-df181d086f5b\") " pod="openshift-marketplace/redhat-operators-6cdjh" Mar 14 10:39:56 crc kubenswrapper[5129]: I0314 10:39:56.833404 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/150003e3-0c5e-4b1a-8370-df181d086f5b-catalog-content\") pod \"redhat-operators-6cdjh\" (UID: \"150003e3-0c5e-4b1a-8370-df181d086f5b\") " pod="openshift-marketplace/redhat-operators-6cdjh" Mar 14 10:39:56 crc kubenswrapper[5129]: I0314 10:39:56.935274 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/150003e3-0c5e-4b1a-8370-df181d086f5b-utilities\") pod \"redhat-operators-6cdjh\" (UID: \"150003e3-0c5e-4b1a-8370-df181d086f5b\") " pod="openshift-marketplace/redhat-operators-6cdjh" Mar 14 10:39:56 crc kubenswrapper[5129]: I0314 10:39:56.935324 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/150003e3-0c5e-4b1a-8370-df181d086f5b-catalog-content\") pod \"redhat-operators-6cdjh\" (UID: \"150003e3-0c5e-4b1a-8370-df181d086f5b\") " pod="openshift-marketplace/redhat-operators-6cdjh" Mar 14 10:39:56 crc kubenswrapper[5129]: I0314 10:39:56.935406 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pbw8\" (UniqueName: \"kubernetes.io/projected/150003e3-0c5e-4b1a-8370-df181d086f5b-kube-api-access-7pbw8\") pod \"redhat-operators-6cdjh\" (UID: \"150003e3-0c5e-4b1a-8370-df181d086f5b\") " pod="openshift-marketplace/redhat-operators-6cdjh" Mar 14 10:39:56 crc kubenswrapper[5129]: I0314 10:39:56.935744 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/150003e3-0c5e-4b1a-8370-df181d086f5b-utilities\") pod \"redhat-operators-6cdjh\" (UID: \"150003e3-0c5e-4b1a-8370-df181d086f5b\") " pod="openshift-marketplace/redhat-operators-6cdjh" Mar 14 10:39:56 crc kubenswrapper[5129]: I0314 10:39:56.935984 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/150003e3-0c5e-4b1a-8370-df181d086f5b-catalog-content\") pod \"redhat-operators-6cdjh\" (UID: \"150003e3-0c5e-4b1a-8370-df181d086f5b\") " pod="openshift-marketplace/redhat-operators-6cdjh" Mar 14 10:39:56 crc kubenswrapper[5129]: I0314 10:39:56.955059 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pbw8\" (UniqueName: \"kubernetes.io/projected/150003e3-0c5e-4b1a-8370-df181d086f5b-kube-api-access-7pbw8\") pod \"redhat-operators-6cdjh\" (UID: \"150003e3-0c5e-4b1a-8370-df181d086f5b\") " pod="openshift-marketplace/redhat-operators-6cdjh" Mar 14 10:39:57 crc kubenswrapper[5129]: I0314 10:39:57.086250 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6cdjh" Mar 14 10:39:57 crc kubenswrapper[5129]: I0314 10:39:57.888295 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6cdjh"] Mar 14 10:39:58 crc kubenswrapper[5129]: I0314 10:39:58.257033 5129 generic.go:334] "Generic (PLEG): container finished" podID="150003e3-0c5e-4b1a-8370-df181d086f5b" containerID="40545c6d237a5b8bc99b154c3c7f2a7b89429dada9fd0fa79d4a50802922b175" exitCode=0 Mar 14 10:39:58 crc kubenswrapper[5129]: I0314 10:39:58.257109 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6cdjh" event={"ID":"150003e3-0c5e-4b1a-8370-df181d086f5b","Type":"ContainerDied","Data":"40545c6d237a5b8bc99b154c3c7f2a7b89429dada9fd0fa79d4a50802922b175"} Mar 14 10:39:58 crc kubenswrapper[5129]: I0314 10:39:58.257310 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6cdjh" event={"ID":"150003e3-0c5e-4b1a-8370-df181d086f5b","Type":"ContainerStarted","Data":"a3f6100f1a1a48af09ac4a4bf49467d0788a04d5c1f81e0b4fcc1a306fcb647f"} Mar 14 10:39:59 crc kubenswrapper[5129]: I0314 10:39:59.272105 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6cdjh" event={"ID":"150003e3-0c5e-4b1a-8370-df181d086f5b","Type":"ContainerStarted","Data":"22f48016f1b4adf5603c9dc2fd909130210e88f1333da8f3c4bc573646eaad13"} Mar 14 10:40:00 crc kubenswrapper[5129]: I0314 10:40:00.160909 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558080-vhsf9"] Mar 14 10:40:00 crc kubenswrapper[5129]: I0314 10:40:00.162853 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558080-vhsf9" Mar 14 10:40:00 crc kubenswrapper[5129]: I0314 10:40:00.164729 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:40:00 crc kubenswrapper[5129]: I0314 10:40:00.165692 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:40:00 crc kubenswrapper[5129]: I0314 10:40:00.165908 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:40:00 crc kubenswrapper[5129]: I0314 10:40:00.182284 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558080-vhsf9"] Mar 14 10:40:00 crc kubenswrapper[5129]: I0314 10:40:00.310353 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p5lg\" (UniqueName: \"kubernetes.io/projected/b0c52b71-8ab2-4690-a0ae-900bbc1bddac-kube-api-access-2p5lg\") pod \"auto-csr-approver-29558080-vhsf9\" (UID: \"b0c52b71-8ab2-4690-a0ae-900bbc1bddac\") " pod="openshift-infra/auto-csr-approver-29558080-vhsf9" Mar 14 10:40:00 crc kubenswrapper[5129]: I0314 10:40:00.412958 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p5lg\" (UniqueName: \"kubernetes.io/projected/b0c52b71-8ab2-4690-a0ae-900bbc1bddac-kube-api-access-2p5lg\") pod \"auto-csr-approver-29558080-vhsf9\" (UID: \"b0c52b71-8ab2-4690-a0ae-900bbc1bddac\") " pod="openshift-infra/auto-csr-approver-29558080-vhsf9" Mar 14 10:40:00 crc kubenswrapper[5129]: I0314 10:40:00.457656 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p5lg\" (UniqueName: \"kubernetes.io/projected/b0c52b71-8ab2-4690-a0ae-900bbc1bddac-kube-api-access-2p5lg\") pod \"auto-csr-approver-29558080-vhsf9\" (UID: \"b0c52b71-8ab2-4690-a0ae-900bbc1bddac\") " 
pod="openshift-infra/auto-csr-approver-29558080-vhsf9" Mar 14 10:40:00 crc kubenswrapper[5129]: I0314 10:40:00.505136 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558080-vhsf9" Mar 14 10:40:01 crc kubenswrapper[5129]: I0314 10:40:01.186259 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558080-vhsf9"] Mar 14 10:40:01 crc kubenswrapper[5129]: W0314 10:40:01.188896 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0c52b71_8ab2_4690_a0ae_900bbc1bddac.slice/crio-5b2410ca8f0b03e4c9f23d0f8a782bbec5d6603edae9b5ec71836c39fff4a00f WatchSource:0}: Error finding container 5b2410ca8f0b03e4c9f23d0f8a782bbec5d6603edae9b5ec71836c39fff4a00f: Status 404 returned error can't find the container with id 5b2410ca8f0b03e4c9f23d0f8a782bbec5d6603edae9b5ec71836c39fff4a00f Mar 14 10:40:01 crc kubenswrapper[5129]: I0314 10:40:01.292083 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558080-vhsf9" event={"ID":"b0c52b71-8ab2-4690-a0ae-900bbc1bddac","Type":"ContainerStarted","Data":"5b2410ca8f0b03e4c9f23d0f8a782bbec5d6603edae9b5ec71836c39fff4a00f"} Mar 14 10:40:03 crc kubenswrapper[5129]: I0314 10:40:03.665622 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-rst7l_cc3d7fc0-4c50-42d1-984a-822d52e9ce6f/prometheus-operator/0.log" Mar 14 10:40:03 crc kubenswrapper[5129]: I0314 10:40:03.884659 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-75c9666696-dscrs_b056df4d-914e-45a6-8b07-fb2565d30c6a/prometheus-operator-admission-webhook/0.log" Mar 14 10:40:03 crc kubenswrapper[5129]: I0314 10:40:03.970008 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-75c9666696-zx67t_e92945a5-6bd7-41ad-a62e-97f681d79bef/prometheus-operator-admission-webhook/0.log" Mar 14 10:40:04 crc kubenswrapper[5129]: I0314 10:40:04.325756 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558080-vhsf9" event={"ID":"b0c52b71-8ab2-4690-a0ae-900bbc1bddac","Type":"ContainerStarted","Data":"9247530f996a6e3c58d678f6b9f461d343db8e9dcd962914257d3b9e360402c2"} Mar 14 10:40:04 crc kubenswrapper[5129]: I0314 10:40:04.345114 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558080-vhsf9" podStartSLOduration=2.149973554 podStartE2EDuration="4.345091005s" podCreationTimestamp="2026-03-14 10:40:00 +0000 UTC" firstStartedPulling="2026-03-14 10:40:01.192655561 +0000 UTC m=+13263.944570755" lastFinishedPulling="2026-03-14 10:40:03.387773022 +0000 UTC m=+13266.139688206" observedRunningTime="2026-03-14 10:40:04.335700241 +0000 UTC m=+13267.087615425" watchObservedRunningTime="2026-03-14 10:40:04.345091005 +0000 UTC m=+13267.097006179" Mar 14 10:40:04 crc kubenswrapper[5129]: I0314 10:40:04.381063 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-xp2x8_bc6e5aa8-2e36-4b86-b83c-b68182ffe0f7/operator/0.log" Mar 14 10:40:04 crc kubenswrapper[5129]: I0314 10:40:04.425086 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-v6wgl_22479fe6-fd03-45b0-8cab-a7b641134b30/perses-operator/0.log" Mar 14 10:40:05 crc kubenswrapper[5129]: I0314 10:40:05.335394 5129 generic.go:334] "Generic (PLEG): container finished" podID="b0c52b71-8ab2-4690-a0ae-900bbc1bddac" containerID="9247530f996a6e3c58d678f6b9f461d343db8e9dcd962914257d3b9e360402c2" exitCode=0 Mar 14 10:40:05 crc kubenswrapper[5129]: I0314 10:40:05.335479 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29558080-vhsf9" event={"ID":"b0c52b71-8ab2-4690-a0ae-900bbc1bddac","Type":"ContainerDied","Data":"9247530f996a6e3c58d678f6b9f461d343db8e9dcd962914257d3b9e360402c2"} Mar 14 10:40:07 crc kubenswrapper[5129]: I0314 10:40:07.356141 5129 generic.go:334] "Generic (PLEG): container finished" podID="150003e3-0c5e-4b1a-8370-df181d086f5b" containerID="22f48016f1b4adf5603c9dc2fd909130210e88f1333da8f3c4bc573646eaad13" exitCode=0 Mar 14 10:40:07 crc kubenswrapper[5129]: I0314 10:40:07.356521 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6cdjh" event={"ID":"150003e3-0c5e-4b1a-8370-df181d086f5b","Type":"ContainerDied","Data":"22f48016f1b4adf5603c9dc2fd909130210e88f1333da8f3c4bc573646eaad13"} Mar 14 10:40:07 crc kubenswrapper[5129]: I0314 10:40:07.687917 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558080-vhsf9" Mar 14 10:40:07 crc kubenswrapper[5129]: I0314 10:40:07.859415 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p5lg\" (UniqueName: \"kubernetes.io/projected/b0c52b71-8ab2-4690-a0ae-900bbc1bddac-kube-api-access-2p5lg\") pod \"b0c52b71-8ab2-4690-a0ae-900bbc1bddac\" (UID: \"b0c52b71-8ab2-4690-a0ae-900bbc1bddac\") " Mar 14 10:40:07 crc kubenswrapper[5129]: I0314 10:40:07.868106 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0c52b71-8ab2-4690-a0ae-900bbc1bddac-kube-api-access-2p5lg" (OuterVolumeSpecName: "kube-api-access-2p5lg") pod "b0c52b71-8ab2-4690-a0ae-900bbc1bddac" (UID: "b0c52b71-8ab2-4690-a0ae-900bbc1bddac"). InnerVolumeSpecName "kube-api-access-2p5lg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:40:07 crc kubenswrapper[5129]: I0314 10:40:07.962251 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p5lg\" (UniqueName: \"kubernetes.io/projected/b0c52b71-8ab2-4690-a0ae-900bbc1bddac-kube-api-access-2p5lg\") on node \"crc\" DevicePath \"\"" Mar 14 10:40:08 crc kubenswrapper[5129]: I0314 10:40:08.369641 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558080-vhsf9" Mar 14 10:40:08 crc kubenswrapper[5129]: I0314 10:40:08.370098 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558080-vhsf9" event={"ID":"b0c52b71-8ab2-4690-a0ae-900bbc1bddac","Type":"ContainerDied","Data":"5b2410ca8f0b03e4c9f23d0f8a782bbec5d6603edae9b5ec71836c39fff4a00f"} Mar 14 10:40:08 crc kubenswrapper[5129]: I0314 10:40:08.370121 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b2410ca8f0b03e4c9f23d0f8a782bbec5d6603edae9b5ec71836c39fff4a00f" Mar 14 10:40:08 crc kubenswrapper[5129]: I0314 10:40:08.373591 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6cdjh" event={"ID":"150003e3-0c5e-4b1a-8370-df181d086f5b","Type":"ContainerStarted","Data":"8ccffb5d8e3bcd07e21a68fb58cde1f7b85372af33b6d749e68ee513b7a6d984"} Mar 14 10:40:08 crc kubenswrapper[5129]: I0314 10:40:08.398056 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6cdjh" podStartSLOduration=2.676344683 podStartE2EDuration="12.398038591s" podCreationTimestamp="2026-03-14 10:39:56 +0000 UTC" firstStartedPulling="2026-03-14 10:39:58.26652082 +0000 UTC m=+13261.018436004" lastFinishedPulling="2026-03-14 10:40:07.988214728 +0000 UTC m=+13270.740129912" observedRunningTime="2026-03-14 10:40:08.389432467 +0000 UTC m=+13271.141347651" watchObservedRunningTime="2026-03-14 
10:40:08.398038591 +0000 UTC m=+13271.149953765" Mar 14 10:40:08 crc kubenswrapper[5129]: I0314 10:40:08.766305 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558074-jhqvj"] Mar 14 10:40:08 crc kubenswrapper[5129]: I0314 10:40:08.776491 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558074-jhqvj"] Mar 14 10:40:10 crc kubenswrapper[5129]: I0314 10:40:10.046179 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="598c8171-4e5b-4d59-a1bb-ace53444a2a5" path="/var/lib/kubelet/pods/598c8171-4e5b-4d59-a1bb-ace53444a2a5/volumes" Mar 14 10:40:17 crc kubenswrapper[5129]: I0314 10:40:17.087510 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6cdjh" Mar 14 10:40:17 crc kubenswrapper[5129]: I0314 10:40:17.088045 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6cdjh" Mar 14 10:40:18 crc kubenswrapper[5129]: I0314 10:40:18.156028 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6cdjh" podUID="150003e3-0c5e-4b1a-8370-df181d086f5b" containerName="registry-server" probeResult="failure" output=< Mar 14 10:40:18 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:40:18 crc kubenswrapper[5129]: > Mar 14 10:40:19 crc kubenswrapper[5129]: I0314 10:40:19.666683 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:40:19 crc kubenswrapper[5129]: I0314 10:40:19.668873 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" 
podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:40:21 crc kubenswrapper[5129]: I0314 10:40:21.413076 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-56jrj" podUID="39018862-672e-47fd-85bb-c1baa5a8db7b" containerName="registry-server" probeResult="failure" output=< Mar 14 10:40:21 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:40:21 crc kubenswrapper[5129]: > Mar 14 10:40:21 crc kubenswrapper[5129]: I0314 10:40:21.474294 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-56jrj" podUID="39018862-672e-47fd-85bb-c1baa5a8db7b" containerName="registry-server" probeResult="failure" output=< Mar 14 10:40:21 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:40:21 crc kubenswrapper[5129]: > Mar 14 10:40:21 crc kubenswrapper[5129]: I0314 10:40:21.522069 5129 patch_prober.go:28] interesting pod/oauth-openshift-675f5cc7c5-j7vnw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.69:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 10:40:21 crc kubenswrapper[5129]: I0314 10:40:21.522422 5129 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-675f5cc7c5-j7vnw" podUID="91e5e202-0a96-46be-b241-a97d49eb2619" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.69:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 10:40:25 crc kubenswrapper[5129]: I0314 10:40:25.742931 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-7bb4cc7c98-77prk_434ee85e-a99b-416e-b96c-9c019d77850a/kube-rbac-proxy/0.log" Mar 14 10:40:26 crc kubenswrapper[5129]: I0314 10:40:26.093712 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-77prk_434ee85e-a99b-416e-b96c-9c019d77850a/controller/0.log" Mar 14 10:40:26 crc kubenswrapper[5129]: I0314 10:40:26.136380 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88nmz_a42d927d-7fdc-49e1-9c20-eb3f410a3b9b/cp-frr-files/0.log" Mar 14 10:40:26 crc kubenswrapper[5129]: I0314 10:40:26.537955 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88nmz_a42d927d-7fdc-49e1-9c20-eb3f410a3b9b/cp-frr-files/0.log" Mar 14 10:40:26 crc kubenswrapper[5129]: I0314 10:40:26.561029 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88nmz_a42d927d-7fdc-49e1-9c20-eb3f410a3b9b/cp-reloader/0.log" Mar 14 10:40:26 crc kubenswrapper[5129]: I0314 10:40:26.639482 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88nmz_a42d927d-7fdc-49e1-9c20-eb3f410a3b9b/cp-metrics/0.log" Mar 14 10:40:26 crc kubenswrapper[5129]: I0314 10:40:26.650670 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88nmz_a42d927d-7fdc-49e1-9c20-eb3f410a3b9b/cp-reloader/0.log" Mar 14 10:40:26 crc kubenswrapper[5129]: I0314 10:40:26.901999 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88nmz_a42d927d-7fdc-49e1-9c20-eb3f410a3b9b/cp-frr-files/0.log" Mar 14 10:40:26 crc kubenswrapper[5129]: I0314 10:40:26.905136 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88nmz_a42d927d-7fdc-49e1-9c20-eb3f410a3b9b/cp-metrics/0.log" Mar 14 10:40:26 crc kubenswrapper[5129]: I0314 10:40:26.935877 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-88nmz_a42d927d-7fdc-49e1-9c20-eb3f410a3b9b/cp-reloader/0.log" Mar 14 10:40:27 crc kubenswrapper[5129]: I0314 10:40:27.022367 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88nmz_a42d927d-7fdc-49e1-9c20-eb3f410a3b9b/cp-metrics/0.log" Mar 14 10:40:27 crc kubenswrapper[5129]: I0314 10:40:27.301044 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88nmz_a42d927d-7fdc-49e1-9c20-eb3f410a3b9b/cp-reloader/0.log" Mar 14 10:40:27 crc kubenswrapper[5129]: I0314 10:40:27.334836 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88nmz_a42d927d-7fdc-49e1-9c20-eb3f410a3b9b/cp-frr-files/0.log" Mar 14 10:40:27 crc kubenswrapper[5129]: I0314 10:40:27.351479 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88nmz_a42d927d-7fdc-49e1-9c20-eb3f410a3b9b/cp-metrics/0.log" Mar 14 10:40:27 crc kubenswrapper[5129]: I0314 10:40:27.367144 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88nmz_a42d927d-7fdc-49e1-9c20-eb3f410a3b9b/controller/0.log" Mar 14 10:40:27 crc kubenswrapper[5129]: I0314 10:40:27.623370 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88nmz_a42d927d-7fdc-49e1-9c20-eb3f410a3b9b/frr-metrics/0.log" Mar 14 10:40:27 crc kubenswrapper[5129]: I0314 10:40:27.737599 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88nmz_a42d927d-7fdc-49e1-9c20-eb3f410a3b9b/kube-rbac-proxy/0.log" Mar 14 10:40:28 crc kubenswrapper[5129]: I0314 10:40:28.063860 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88nmz_a42d927d-7fdc-49e1-9c20-eb3f410a3b9b/kube-rbac-proxy-frr/0.log" Mar 14 10:40:28 crc kubenswrapper[5129]: I0314 10:40:28.134708 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6cdjh" 
podUID="150003e3-0c5e-4b1a-8370-df181d086f5b" containerName="registry-server" probeResult="failure" output=< Mar 14 10:40:28 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:40:28 crc kubenswrapper[5129]: > Mar 14 10:40:28 crc kubenswrapper[5129]: I0314 10:40:28.215978 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88nmz_a42d927d-7fdc-49e1-9c20-eb3f410a3b9b/reloader/0.log" Mar 14 10:40:28 crc kubenswrapper[5129]: I0314 10:40:28.338624 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-wwj7v_4c5ce029-cd89-4615-95e5-26e366269bc1/frr-k8s-webhook-server/0.log" Mar 14 10:40:28 crc kubenswrapper[5129]: I0314 10:40:28.906694 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5dd66c8db4-67bnn_2c206b3c-9ece-49e5-8227-1bb654d4635d/manager/0.log" Mar 14 10:40:29 crc kubenswrapper[5129]: I0314 10:40:29.132024 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-767958b5d7-fxpkc_b375b429-b57b-46b8-b616-a6a6723cf3c6/webhook-server/0.log" Mar 14 10:40:29 crc kubenswrapper[5129]: I0314 10:40:29.318170 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-79pdg_509b7fd3-32d6-4c74-a48c-0e2ed674175d/kube-rbac-proxy/0.log" Mar 14 10:40:30 crc kubenswrapper[5129]: I0314 10:40:30.253091 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-79pdg_509b7fd3-32d6-4c74-a48c-0e2ed674175d/speaker/0.log" Mar 14 10:40:31 crc kubenswrapper[5129]: I0314 10:40:31.503946 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88nmz_a42d927d-7fdc-49e1-9c20-eb3f410a3b9b/frr/0.log" Mar 14 10:40:38 crc kubenswrapper[5129]: I0314 10:40:38.138409 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6cdjh" 
podUID="150003e3-0c5e-4b1a-8370-df181d086f5b" containerName="registry-server" probeResult="failure" output=< Mar 14 10:40:38 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s Mar 14 10:40:38 crc kubenswrapper[5129]: > Mar 14 10:40:43 crc kubenswrapper[5129]: I0314 10:40:43.515686 5129 scope.go:117] "RemoveContainer" containerID="0569d9f35d84e09f7f0162296da1f20d5ba425c78509af7a6942bef423bd0d7b" Mar 14 10:40:43 crc kubenswrapper[5129]: I0314 10:40:43.565494 5129 scope.go:117] "RemoveContainer" containerID="458eb38c5bcb0112756de448bf0a6f01e1580a74b67c04bedc417134aaeb934b" Mar 14 10:40:44 crc kubenswrapper[5129]: I0314 10:40:44.613184 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b_196abf24-e768-4d07-ba20-6df2e2c3ec9f/util/0.log" Mar 14 10:40:44 crc kubenswrapper[5129]: I0314 10:40:44.928718 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b_196abf24-e768-4d07-ba20-6df2e2c3ec9f/pull/0.log" Mar 14 10:40:44 crc kubenswrapper[5129]: I0314 10:40:44.932940 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b_196abf24-e768-4d07-ba20-6df2e2c3ec9f/util/0.log" Mar 14 10:40:45 crc kubenswrapper[5129]: I0314 10:40:45.014218 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b_196abf24-e768-4d07-ba20-6df2e2c3ec9f/pull/0.log" Mar 14 10:40:45 crc kubenswrapper[5129]: I0314 10:40:45.250278 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b_196abf24-e768-4d07-ba20-6df2e2c3ec9f/pull/0.log" Mar 14 10:40:45 crc kubenswrapper[5129]: I0314 10:40:45.256816 5129 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b_196abf24-e768-4d07-ba20-6df2e2c3ec9f/util/0.log" Mar 14 10:40:45 crc kubenswrapper[5129]: I0314 10:40:45.310389 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874wsj5b_196abf24-e768-4d07-ba20-6df2e2c3ec9f/extract/0.log" Mar 14 10:40:45 crc kubenswrapper[5129]: I0314 10:40:45.504664 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m_2afae4f4-1faf-44cf-81ee-5f11553a1407/util/0.log" Mar 14 10:40:45 crc kubenswrapper[5129]: I0314 10:40:45.942854 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m_2afae4f4-1faf-44cf-81ee-5f11553a1407/util/0.log" Mar 14 10:40:46 crc kubenswrapper[5129]: I0314 10:40:46.027028 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m_2afae4f4-1faf-44cf-81ee-5f11553a1407/pull/0.log" Mar 14 10:40:46 crc kubenswrapper[5129]: I0314 10:40:46.048740 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m_2afae4f4-1faf-44cf-81ee-5f11553a1407/pull/0.log" Mar 14 10:40:46 crc kubenswrapper[5129]: I0314 10:40:46.307102 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m_2afae4f4-1faf-44cf-81ee-5f11553a1407/extract/0.log" Mar 14 10:40:46 crc kubenswrapper[5129]: I0314 10:40:46.342289 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m_2afae4f4-1faf-44cf-81ee-5f11553a1407/util/0.log" Mar 14 10:40:46 crc kubenswrapper[5129]: I0314 10:40:46.399870 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1s9r9m_2afae4f4-1faf-44cf-81ee-5f11553a1407/pull/0.log" Mar 14 10:40:46 crc kubenswrapper[5129]: I0314 10:40:46.528485 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6_322f38ef-7ed1-4282-8ac5-59fcd232784d/util/0.log" Mar 14 10:40:46 crc kubenswrapper[5129]: I0314 10:40:46.740722 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6_322f38ef-7ed1-4282-8ac5-59fcd232784d/util/0.log" Mar 14 10:40:46 crc kubenswrapper[5129]: I0314 10:40:46.748549 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6_322f38ef-7ed1-4282-8ac5-59fcd232784d/pull/0.log" Mar 14 10:40:46 crc kubenswrapper[5129]: I0314 10:40:46.774438 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6_322f38ef-7ed1-4282-8ac5-59fcd232784d/pull/0.log" Mar 14 10:40:46 crc kubenswrapper[5129]: I0314 10:40:46.969065 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6_322f38ef-7ed1-4282-8ac5-59fcd232784d/util/0.log" Mar 14 10:40:46 crc kubenswrapper[5129]: I0314 10:40:46.971080 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6_322f38ef-7ed1-4282-8ac5-59fcd232784d/extract/0.log" Mar 14 
10:40:47 crc kubenswrapper[5129]: I0314 10:40:47.003262 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e549gj6_322f38ef-7ed1-4282-8ac5-59fcd232784d/pull/0.log" Mar 14 10:40:47 crc kubenswrapper[5129]: I0314 10:40:47.136257 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6cdjh" Mar 14 10:40:47 crc kubenswrapper[5129]: I0314 10:40:47.190070 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6cdjh" Mar 14 10:40:47 crc kubenswrapper[5129]: I0314 10:40:47.291226 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs_23d1faf4-a154-444b-93c5-9ea1eb9e43ea/util/0.log" Mar 14 10:40:47 crc kubenswrapper[5129]: I0314 10:40:47.375719 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6cdjh"] Mar 14 10:40:47 crc kubenswrapper[5129]: I0314 10:40:47.434982 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs_23d1faf4-a154-444b-93c5-9ea1eb9e43ea/pull/0.log" Mar 14 10:40:47 crc kubenswrapper[5129]: I0314 10:40:47.441708 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs_23d1faf4-a154-444b-93c5-9ea1eb9e43ea/util/0.log" Mar 14 10:40:47 crc kubenswrapper[5129]: I0314 10:40:47.481571 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs_23d1faf4-a154-444b-93c5-9ea1eb9e43ea/pull/0.log" Mar 14 10:40:47 crc kubenswrapper[5129]: I0314 10:40:47.673646 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs_23d1faf4-a154-444b-93c5-9ea1eb9e43ea/extract/0.log" Mar 14 10:40:47 crc kubenswrapper[5129]: I0314 10:40:47.710206 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs_23d1faf4-a154-444b-93c5-9ea1eb9e43ea/util/0.log" Mar 14 10:40:47 crc kubenswrapper[5129]: I0314 10:40:47.773071 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08tgtfs_23d1faf4-a154-444b-93c5-9ea1eb9e43ea/pull/0.log" Mar 14 10:40:47 crc kubenswrapper[5129]: I0314 10:40:47.889465 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-strv7_215cccb9-4c4f-4dc4-9c7a-607a4d8079b6/extract-utilities/0.log" Mar 14 10:40:48 crc kubenswrapper[5129]: I0314 10:40:48.080253 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-strv7_215cccb9-4c4f-4dc4-9c7a-607a4d8079b6/extract-content/0.log" Mar 14 10:40:48 crc kubenswrapper[5129]: I0314 10:40:48.113016 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-strv7_215cccb9-4c4f-4dc4-9c7a-607a4d8079b6/extract-content/0.log" Mar 14 10:40:48 crc kubenswrapper[5129]: I0314 10:40:48.134284 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-strv7_215cccb9-4c4f-4dc4-9c7a-607a4d8079b6/extract-utilities/0.log" Mar 14 10:40:48 crc kubenswrapper[5129]: I0314 10:40:48.342934 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-strv7_215cccb9-4c4f-4dc4-9c7a-607a4d8079b6/extract-content/0.log" Mar 14 10:40:48 crc kubenswrapper[5129]: I0314 10:40:48.465733 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-strv7_215cccb9-4c4f-4dc4-9c7a-607a4d8079b6/extract-utilities/0.log" Mar 14 10:40:48 crc kubenswrapper[5129]: I0314 10:40:48.556643 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fmnvg_3f7b1d09-8c5d-41b6-80fa-2de4b81d6912/extract-utilities/0.log" Mar 14 10:40:48 crc kubenswrapper[5129]: I0314 10:40:48.955787 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fmnvg_3f7b1d09-8c5d-41b6-80fa-2de4b81d6912/extract-utilities/0.log" Mar 14 10:40:48 crc kubenswrapper[5129]: I0314 10:40:48.966706 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fmnvg_3f7b1d09-8c5d-41b6-80fa-2de4b81d6912/extract-content/0.log" Mar 14 10:40:48 crc kubenswrapper[5129]: I0314 10:40:48.972766 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fmnvg_3f7b1d09-8c5d-41b6-80fa-2de4b81d6912/extract-content/0.log" Mar 14 10:40:48 crc kubenswrapper[5129]: I0314 10:40:48.991286 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6cdjh" podUID="150003e3-0c5e-4b1a-8370-df181d086f5b" containerName="registry-server" containerID="cri-o://8ccffb5d8e3bcd07e21a68fb58cde1f7b85372af33b6d749e68ee513b7a6d984" gracePeriod=2 Mar 14 10:40:49 crc kubenswrapper[5129]: I0314 10:40:49.285426 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fmnvg_3f7b1d09-8c5d-41b6-80fa-2de4b81d6912/extract-utilities/0.log" Mar 14 10:40:49 crc kubenswrapper[5129]: I0314 10:40:49.308336 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fmnvg_3f7b1d09-8c5d-41b6-80fa-2de4b81d6912/extract-content/0.log" Mar 14 10:40:49 crc kubenswrapper[5129]: I0314 10:40:49.573747 5129 
patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:40:49 crc kubenswrapper[5129]: I0314 10:40:49.573795 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:40:49 crc kubenswrapper[5129]: I0314 10:40:49.684925 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-j8zqt_dc3dfa46-c5d8-40d0-8f3b-b0522341edd6/marketplace-operator/0.log" Mar 14 10:40:50 crc kubenswrapper[5129]: I0314 10:40:50.001158 5129 generic.go:334] "Generic (PLEG): container finished" podID="150003e3-0c5e-4b1a-8370-df181d086f5b" containerID="8ccffb5d8e3bcd07e21a68fb58cde1f7b85372af33b6d749e68ee513b7a6d984" exitCode=0 Mar 14 10:40:50 crc kubenswrapper[5129]: I0314 10:40:50.001224 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6cdjh" event={"ID":"150003e3-0c5e-4b1a-8370-df181d086f5b","Type":"ContainerDied","Data":"8ccffb5d8e3bcd07e21a68fb58cde1f7b85372af33b6d749e68ee513b7a6d984"} Mar 14 10:40:50 crc kubenswrapper[5129]: I0314 10:40:50.070930 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gkxcz_312716da-06d0-4ad7-9edf-f659d31db550/extract-utilities/0.log" Mar 14 10:40:50 crc kubenswrapper[5129]: I0314 10:40:50.292869 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gkxcz_312716da-06d0-4ad7-9edf-f659d31db550/extract-utilities/0.log" Mar 14 10:40:50 crc 
kubenswrapper[5129]: I0314 10:40:50.372443 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-strv7_215cccb9-4c4f-4dc4-9c7a-607a4d8079b6/registry-server/0.log" Mar 14 10:40:50 crc kubenswrapper[5129]: I0314 10:40:50.442456 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gkxcz_312716da-06d0-4ad7-9edf-f659d31db550/extract-content/0.log" Mar 14 10:40:50 crc kubenswrapper[5129]: I0314 10:40:50.553653 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gkxcz_312716da-06d0-4ad7-9edf-f659d31db550/extract-content/0.log" Mar 14 10:40:50 crc kubenswrapper[5129]: I0314 10:40:50.635140 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6cdjh" Mar 14 10:40:50 crc kubenswrapper[5129]: I0314 10:40:50.757896 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gkxcz_312716da-06d0-4ad7-9edf-f659d31db550/extract-content/0.log" Mar 14 10:40:50 crc kubenswrapper[5129]: I0314 10:40:50.758834 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/150003e3-0c5e-4b1a-8370-df181d086f5b-catalog-content\") pod \"150003e3-0c5e-4b1a-8370-df181d086f5b\" (UID: \"150003e3-0c5e-4b1a-8370-df181d086f5b\") " Mar 14 10:40:50 crc kubenswrapper[5129]: I0314 10:40:50.759018 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pbw8\" (UniqueName: \"kubernetes.io/projected/150003e3-0c5e-4b1a-8370-df181d086f5b-kube-api-access-7pbw8\") pod \"150003e3-0c5e-4b1a-8370-df181d086f5b\" (UID: \"150003e3-0c5e-4b1a-8370-df181d086f5b\") " Mar 14 10:40:50 crc kubenswrapper[5129]: I0314 10:40:50.759314 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/150003e3-0c5e-4b1a-8370-df181d086f5b-utilities\") pod \"150003e3-0c5e-4b1a-8370-df181d086f5b\" (UID: \"150003e3-0c5e-4b1a-8370-df181d086f5b\") " Mar 14 10:40:50 crc kubenswrapper[5129]: I0314 10:40:50.782899 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/150003e3-0c5e-4b1a-8370-df181d086f5b-utilities" (OuterVolumeSpecName: "utilities") pod "150003e3-0c5e-4b1a-8370-df181d086f5b" (UID: "150003e3-0c5e-4b1a-8370-df181d086f5b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:40:50 crc kubenswrapper[5129]: I0314 10:40:50.790309 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/150003e3-0c5e-4b1a-8370-df181d086f5b-kube-api-access-7pbw8" (OuterVolumeSpecName: "kube-api-access-7pbw8") pod "150003e3-0c5e-4b1a-8370-df181d086f5b" (UID: "150003e3-0c5e-4b1a-8370-df181d086f5b"). InnerVolumeSpecName "kube-api-access-7pbw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:40:50 crc kubenswrapper[5129]: I0314 10:40:50.864181 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/150003e3-0c5e-4b1a-8370-df181d086f5b-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 10:40:50 crc kubenswrapper[5129]: I0314 10:40:50.864229 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pbw8\" (UniqueName: \"kubernetes.io/projected/150003e3-0c5e-4b1a-8370-df181d086f5b-kube-api-access-7pbw8\") on node \"crc\" DevicePath \"\"" Mar 14 10:40:50 crc kubenswrapper[5129]: I0314 10:40:50.898260 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/150003e3-0c5e-4b1a-8370-df181d086f5b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "150003e3-0c5e-4b1a-8370-df181d086f5b" (UID: "150003e3-0c5e-4b1a-8370-df181d086f5b"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:40:50 crc kubenswrapper[5129]: I0314 10:40:50.966720 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/150003e3-0c5e-4b1a-8370-df181d086f5b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 10:40:51 crc kubenswrapper[5129]: I0314 10:40:51.081804 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6cdjh" event={"ID":"150003e3-0c5e-4b1a-8370-df181d086f5b","Type":"ContainerDied","Data":"a3f6100f1a1a48af09ac4a4bf49467d0788a04d5c1f81e0b4fcc1a306fcb647f"} Mar 14 10:40:51 crc kubenswrapper[5129]: I0314 10:40:51.081860 5129 scope.go:117] "RemoveContainer" containerID="8ccffb5d8e3bcd07e21a68fb58cde1f7b85372af33b6d749e68ee513b7a6d984" Mar 14 10:40:51 crc kubenswrapper[5129]: I0314 10:40:51.082064 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6cdjh" Mar 14 10:40:51 crc kubenswrapper[5129]: I0314 10:40:51.108559 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gkxcz_312716da-06d0-4ad7-9edf-f659d31db550/extract-utilities/0.log" Mar 14 10:40:51 crc kubenswrapper[5129]: I0314 10:40:51.121309 5129 scope.go:117] "RemoveContainer" containerID="22f48016f1b4adf5603c9dc2fd909130210e88f1333da8f3c4bc573646eaad13" Mar 14 10:40:51 crc kubenswrapper[5129]: I0314 10:40:51.178220 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6cdjh"] Mar 14 10:40:51 crc kubenswrapper[5129]: I0314 10:40:51.185780 5129 scope.go:117] "RemoveContainer" containerID="40545c6d237a5b8bc99b154c3c7f2a7b89429dada9fd0fa79d4a50802922b175" Mar 14 10:40:51 crc kubenswrapper[5129]: I0314 10:40:51.189892 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6cdjh"] Mar 14 
10:40:51 crc kubenswrapper[5129]: I0314 10:40:51.307133 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fmnvg_3f7b1d09-8c5d-41b6-80fa-2de4b81d6912/registry-server/0.log" Mar 14 10:40:51 crc kubenswrapper[5129]: I0314 10:40:51.764314 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gkxcz_312716da-06d0-4ad7-9edf-f659d31db550/registry-server/0.log" Mar 14 10:40:51 crc kubenswrapper[5129]: I0314 10:40:51.878732 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-86hcf_65592ebe-e824-4d7d-9385-9073d54404e0/extract-utilities/0.log" Mar 14 10:40:52 crc kubenswrapper[5129]: I0314 10:40:52.047676 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="150003e3-0c5e-4b1a-8370-df181d086f5b" path="/var/lib/kubelet/pods/150003e3-0c5e-4b1a-8370-df181d086f5b/volumes" Mar 14 10:40:52 crc kubenswrapper[5129]: I0314 10:40:52.104171 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-86hcf_65592ebe-e824-4d7d-9385-9073d54404e0/extract-utilities/0.log" Mar 14 10:40:52 crc kubenswrapper[5129]: I0314 10:40:52.159866 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-86hcf_65592ebe-e824-4d7d-9385-9073d54404e0/extract-content/0.log" Mar 14 10:40:52 crc kubenswrapper[5129]: I0314 10:40:52.171769 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-86hcf_65592ebe-e824-4d7d-9385-9073d54404e0/extract-content/0.log" Mar 14 10:40:52 crc kubenswrapper[5129]: I0314 10:40:52.346359 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-86hcf_65592ebe-e824-4d7d-9385-9073d54404e0/extract-utilities/0.log" Mar 14 10:40:52 crc kubenswrapper[5129]: I0314 10:40:52.413254 5129 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-86hcf_65592ebe-e824-4d7d-9385-9073d54404e0/extract-content/0.log" Mar 14 10:40:53 crc kubenswrapper[5129]: I0314 10:40:53.656187 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-86hcf_65592ebe-e824-4d7d-9385-9073d54404e0/registry-server/0.log" Mar 14 10:41:07 crc kubenswrapper[5129]: I0314 10:41:07.942364 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-rst7l_cc3d7fc0-4c50-42d1-984a-822d52e9ce6f/prometheus-operator/0.log" Mar 14 10:41:07 crc kubenswrapper[5129]: I0314 10:41:07.957918 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-75c9666696-dscrs_b056df4d-914e-45a6-8b07-fb2565d30c6a/prometheus-operator-admission-webhook/0.log" Mar 14 10:41:07 crc kubenswrapper[5129]: I0314 10:41:07.970398 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-75c9666696-zx67t_e92945a5-6bd7-41ad-a62e-97f681d79bef/prometheus-operator-admission-webhook/0.log" Mar 14 10:41:08 crc kubenswrapper[5129]: I0314 10:41:08.179259 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-v6wgl_22479fe6-fd03-45b0-8cab-a7b641134b30/perses-operator/0.log" Mar 14 10:41:08 crc kubenswrapper[5129]: I0314 10:41:08.211792 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-xp2x8_bc6e5aa8-2e36-4b86-b83c-b68182ffe0f7/operator/0.log" Mar 14 10:41:19 crc kubenswrapper[5129]: I0314 10:41:19.574212 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Mar 14 10:41:19 crc kubenswrapper[5129]: I0314 10:41:19.574754 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:41:19 crc kubenswrapper[5129]: I0314 10:41:19.574802 5129 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" Mar 14 10:41:19 crc kubenswrapper[5129]: I0314 10:41:19.575545 5129 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553"} pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 10:41:19 crc kubenswrapper[5129]: I0314 10:41:19.575586 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" containerID="cri-o://85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" gracePeriod=600 Mar 14 10:41:19 crc kubenswrapper[5129]: E0314 10:41:19.700895 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:41:20 crc kubenswrapper[5129]: I0314 
10:41:20.410461 5129 generic.go:334] "Generic (PLEG): container finished" podID="58bd6165-e663-4c4e-82ae-6009ff348000" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" exitCode=0 Mar 14 10:41:20 crc kubenswrapper[5129]: I0314 10:41:20.410522 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerDied","Data":"85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553"} Mar 14 10:41:20 crc kubenswrapper[5129]: I0314 10:41:20.410771 5129 scope.go:117] "RemoveContainer" containerID="d5a7a7f050827fb797d412717f100d7b334050f76e813160cf2256327e999ae1" Mar 14 10:41:20 crc kubenswrapper[5129]: I0314 10:41:20.411774 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:41:20 crc kubenswrapper[5129]: E0314 10:41:20.412174 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:41:35 crc kubenswrapper[5129]: I0314 10:41:35.036975 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:41:35 crc kubenswrapper[5129]: E0314 10:41:35.038876 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:41:43 crc kubenswrapper[5129]: I0314 10:41:43.656488 5129 scope.go:117] "RemoveContainer" containerID="741e86bae75bc824cb15782135de32af33240d667bbcba25bb193c5dda262d14" Mar 14 10:41:50 crc kubenswrapper[5129]: I0314 10:41:50.037195 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:41:50 crc kubenswrapper[5129]: E0314 10:41:50.038184 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:42:00 crc kubenswrapper[5129]: I0314 10:42:00.157110 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558082-5cvw9"] Mar 14 10:42:00 crc kubenswrapper[5129]: E0314 10:42:00.158015 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="150003e3-0c5e-4b1a-8370-df181d086f5b" containerName="extract-content" Mar 14 10:42:00 crc kubenswrapper[5129]: I0314 10:42:00.158031 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="150003e3-0c5e-4b1a-8370-df181d086f5b" containerName="extract-content" Mar 14 10:42:00 crc kubenswrapper[5129]: E0314 10:42:00.158062 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0c52b71-8ab2-4690-a0ae-900bbc1bddac" containerName="oc" Mar 14 10:42:00 crc kubenswrapper[5129]: I0314 10:42:00.158068 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c52b71-8ab2-4690-a0ae-900bbc1bddac" containerName="oc" Mar 14 10:42:00 crc kubenswrapper[5129]: E0314 10:42:00.158076 5129 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="150003e3-0c5e-4b1a-8370-df181d086f5b" containerName="extract-utilities" Mar 14 10:42:00 crc kubenswrapper[5129]: I0314 10:42:00.158082 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="150003e3-0c5e-4b1a-8370-df181d086f5b" containerName="extract-utilities" Mar 14 10:42:00 crc kubenswrapper[5129]: E0314 10:42:00.158101 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="150003e3-0c5e-4b1a-8370-df181d086f5b" containerName="registry-server" Mar 14 10:42:00 crc kubenswrapper[5129]: I0314 10:42:00.158106 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="150003e3-0c5e-4b1a-8370-df181d086f5b" containerName="registry-server" Mar 14 10:42:00 crc kubenswrapper[5129]: I0314 10:42:00.158321 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0c52b71-8ab2-4690-a0ae-900bbc1bddac" containerName="oc" Mar 14 10:42:00 crc kubenswrapper[5129]: I0314 10:42:00.158333 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="150003e3-0c5e-4b1a-8370-df181d086f5b" containerName="registry-server" Mar 14 10:42:00 crc kubenswrapper[5129]: I0314 10:42:00.159023 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558082-5cvw9" Mar 14 10:42:00 crc kubenswrapper[5129]: I0314 10:42:00.162007 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:42:00 crc kubenswrapper[5129]: I0314 10:42:00.162014 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:42:00 crc kubenswrapper[5129]: I0314 10:42:00.163030 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:42:00 crc kubenswrapper[5129]: I0314 10:42:00.169179 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558082-5cvw9"] Mar 14 10:42:00 crc kubenswrapper[5129]: I0314 10:42:00.198880 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spd28\" (UniqueName: \"kubernetes.io/projected/27cddf18-c261-4270-b8be-c2f2a14250d7-kube-api-access-spd28\") pod \"auto-csr-approver-29558082-5cvw9\" (UID: \"27cddf18-c261-4270-b8be-c2f2a14250d7\") " pod="openshift-infra/auto-csr-approver-29558082-5cvw9" Mar 14 10:42:00 crc kubenswrapper[5129]: I0314 10:42:00.300107 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spd28\" (UniqueName: \"kubernetes.io/projected/27cddf18-c261-4270-b8be-c2f2a14250d7-kube-api-access-spd28\") pod \"auto-csr-approver-29558082-5cvw9\" (UID: \"27cddf18-c261-4270-b8be-c2f2a14250d7\") " pod="openshift-infra/auto-csr-approver-29558082-5cvw9" Mar 14 10:42:00 crc kubenswrapper[5129]: I0314 10:42:00.329971 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spd28\" (UniqueName: \"kubernetes.io/projected/27cddf18-c261-4270-b8be-c2f2a14250d7-kube-api-access-spd28\") pod \"auto-csr-approver-29558082-5cvw9\" (UID: \"27cddf18-c261-4270-b8be-c2f2a14250d7\") " 
pod="openshift-infra/auto-csr-approver-29558082-5cvw9" Mar 14 10:42:00 crc kubenswrapper[5129]: I0314 10:42:00.488148 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558082-5cvw9" Mar 14 10:42:01 crc kubenswrapper[5129]: I0314 10:42:01.254284 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558082-5cvw9"] Mar 14 10:42:01 crc kubenswrapper[5129]: I0314 10:42:01.260269 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 10:42:01 crc kubenswrapper[5129]: I0314 10:42:01.891363 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558082-5cvw9" event={"ID":"27cddf18-c261-4270-b8be-c2f2a14250d7","Type":"ContainerStarted","Data":"568fb69b23a56261ac3ab97a7a740afcebcaa846d88fbb0e35a06a34cf776307"} Mar 14 10:42:02 crc kubenswrapper[5129]: I0314 10:42:02.037561 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:42:02 crc kubenswrapper[5129]: E0314 10:42:02.037952 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:42:02 crc kubenswrapper[5129]: I0314 10:42:02.903199 5129 generic.go:334] "Generic (PLEG): container finished" podID="27cddf18-c261-4270-b8be-c2f2a14250d7" containerID="d595119d31c9998ff570446064f9119bcc8d28db644f39020118fe304c80c932" exitCode=0 Mar 14 10:42:02 crc kubenswrapper[5129]: I0314 10:42:02.903303 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29558082-5cvw9" event={"ID":"27cddf18-c261-4270-b8be-c2f2a14250d7","Type":"ContainerDied","Data":"d595119d31c9998ff570446064f9119bcc8d28db644f39020118fe304c80c932"} Mar 14 10:42:05 crc kubenswrapper[5129]: I0314 10:42:05.343571 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558082-5cvw9" Mar 14 10:42:05 crc kubenswrapper[5129]: I0314 10:42:05.508902 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spd28\" (UniqueName: \"kubernetes.io/projected/27cddf18-c261-4270-b8be-c2f2a14250d7-kube-api-access-spd28\") pod \"27cddf18-c261-4270-b8be-c2f2a14250d7\" (UID: \"27cddf18-c261-4270-b8be-c2f2a14250d7\") " Mar 14 10:42:05 crc kubenswrapper[5129]: I0314 10:42:05.514732 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27cddf18-c261-4270-b8be-c2f2a14250d7-kube-api-access-spd28" (OuterVolumeSpecName: "kube-api-access-spd28") pod "27cddf18-c261-4270-b8be-c2f2a14250d7" (UID: "27cddf18-c261-4270-b8be-c2f2a14250d7"). InnerVolumeSpecName "kube-api-access-spd28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:42:05 crc kubenswrapper[5129]: I0314 10:42:05.611905 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spd28\" (UniqueName: \"kubernetes.io/projected/27cddf18-c261-4270-b8be-c2f2a14250d7-kube-api-access-spd28\") on node \"crc\" DevicePath \"\"" Mar 14 10:42:05 crc kubenswrapper[5129]: I0314 10:42:05.937222 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558082-5cvw9" event={"ID":"27cddf18-c261-4270-b8be-c2f2a14250d7","Type":"ContainerDied","Data":"568fb69b23a56261ac3ab97a7a740afcebcaa846d88fbb0e35a06a34cf776307"} Mar 14 10:42:05 crc kubenswrapper[5129]: I0314 10:42:05.937287 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="568fb69b23a56261ac3ab97a7a740afcebcaa846d88fbb0e35a06a34cf776307" Mar 14 10:42:05 crc kubenswrapper[5129]: I0314 10:42:05.937365 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558082-5cvw9" Mar 14 10:42:06 crc kubenswrapper[5129]: I0314 10:42:06.416414 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558076-p87t5"] Mar 14 10:42:06 crc kubenswrapper[5129]: I0314 10:42:06.431834 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558076-p87t5"] Mar 14 10:42:08 crc kubenswrapper[5129]: I0314 10:42:08.061491 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39c7a1a6-bfca-4c41-a034-80f600602fc5" path="/var/lib/kubelet/pods/39c7a1a6-bfca-4c41-a034-80f600602fc5/volumes" Mar 14 10:42:15 crc kubenswrapper[5129]: I0314 10:42:15.039078 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:42:15 crc kubenswrapper[5129]: E0314 10:42:15.039838 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:42:28 crc kubenswrapper[5129]: I0314 10:42:28.052492 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:42:28 crc kubenswrapper[5129]: E0314 10:42:28.053646 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:42:41 crc kubenswrapper[5129]: I0314 10:42:41.036518 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:42:41 crc kubenswrapper[5129]: E0314 10:42:41.037562 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:42:43 crc kubenswrapper[5129]: I0314 10:42:43.762827 5129 scope.go:117] "RemoveContainer" containerID="65176f0fbd0d0a6f9fb995aaaed4e12bb118cc5ac76f519c304ce529524710ec" Mar 14 10:42:54 crc kubenswrapper[5129]: I0314 10:42:54.036195 5129 scope.go:117] "RemoveContainer" 
containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:42:54 crc kubenswrapper[5129]: E0314 10:42:54.036869 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:43:06 crc kubenswrapper[5129]: I0314 10:43:06.037030 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:43:06 crc kubenswrapper[5129]: E0314 10:43:06.037886 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:43:20 crc kubenswrapper[5129]: I0314 10:43:20.037520 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:43:20 crc kubenswrapper[5129]: E0314 10:43:20.038537 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:43:34 crc kubenswrapper[5129]: I0314 10:43:34.041088 5129 scope.go:117] 
"RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:43:34 crc kubenswrapper[5129]: E0314 10:43:34.042316 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:43:47 crc kubenswrapper[5129]: I0314 10:43:47.036852 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:43:47 crc kubenswrapper[5129]: E0314 10:43:47.037740 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:43:59 crc kubenswrapper[5129]: I0314 10:43:59.036547 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:43:59 crc kubenswrapper[5129]: E0314 10:43:59.037512 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:44:00 crc kubenswrapper[5129]: I0314 10:44:00.158970 
5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558084-dkpbc"] Mar 14 10:44:00 crc kubenswrapper[5129]: E0314 10:44:00.159377 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27cddf18-c261-4270-b8be-c2f2a14250d7" containerName="oc" Mar 14 10:44:00 crc kubenswrapper[5129]: I0314 10:44:00.159389 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="27cddf18-c261-4270-b8be-c2f2a14250d7" containerName="oc" Mar 14 10:44:00 crc kubenswrapper[5129]: I0314 10:44:00.159597 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="27cddf18-c261-4270-b8be-c2f2a14250d7" containerName="oc" Mar 14 10:44:00 crc kubenswrapper[5129]: I0314 10:44:00.160266 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558084-dkpbc" Mar 14 10:44:00 crc kubenswrapper[5129]: I0314 10:44:00.163916 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:44:00 crc kubenswrapper[5129]: I0314 10:44:00.165009 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:44:00 crc kubenswrapper[5129]: I0314 10:44:00.165694 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:44:00 crc kubenswrapper[5129]: I0314 10:44:00.186326 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558084-dkpbc"] Mar 14 10:44:00 crc kubenswrapper[5129]: I0314 10:44:00.226337 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g46l9\" (UniqueName: \"kubernetes.io/projected/94b63db9-916b-4f5e-8342-c6a660e2da0b-kube-api-access-g46l9\") pod \"auto-csr-approver-29558084-dkpbc\" (UID: \"94b63db9-916b-4f5e-8342-c6a660e2da0b\") " 
pod="openshift-infra/auto-csr-approver-29558084-dkpbc" Mar 14 10:44:00 crc kubenswrapper[5129]: I0314 10:44:00.328021 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g46l9\" (UniqueName: \"kubernetes.io/projected/94b63db9-916b-4f5e-8342-c6a660e2da0b-kube-api-access-g46l9\") pod \"auto-csr-approver-29558084-dkpbc\" (UID: \"94b63db9-916b-4f5e-8342-c6a660e2da0b\") " pod="openshift-infra/auto-csr-approver-29558084-dkpbc" Mar 14 10:44:00 crc kubenswrapper[5129]: I0314 10:44:00.369875 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g46l9\" (UniqueName: \"kubernetes.io/projected/94b63db9-916b-4f5e-8342-c6a660e2da0b-kube-api-access-g46l9\") pod \"auto-csr-approver-29558084-dkpbc\" (UID: \"94b63db9-916b-4f5e-8342-c6a660e2da0b\") " pod="openshift-infra/auto-csr-approver-29558084-dkpbc" Mar 14 10:44:00 crc kubenswrapper[5129]: I0314 10:44:00.481115 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558084-dkpbc" Mar 14 10:44:01 crc kubenswrapper[5129]: I0314 10:44:01.261317 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558084-dkpbc"] Mar 14 10:44:01 crc kubenswrapper[5129]: I0314 10:44:01.317696 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558084-dkpbc" event={"ID":"94b63db9-916b-4f5e-8342-c6a660e2da0b","Type":"ContainerStarted","Data":"cad0c645b74e55a720ac4d789af9661d76f1998cbb52d46af40b527e7de841b7"} Mar 14 10:44:02 crc kubenswrapper[5129]: I0314 10:44:02.179567 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dwzlt"] Mar 14 10:44:02 crc kubenswrapper[5129]: I0314 10:44:02.181826 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwzlt" Mar 14 10:44:02 crc kubenswrapper[5129]: I0314 10:44:02.237677 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwzlt"] Mar 14 10:44:02 crc kubenswrapper[5129]: I0314 10:44:02.273121 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2be89d4-1797-4586-b1b5-0d0831c2c2ca-utilities\") pod \"redhat-marketplace-dwzlt\" (UID: \"b2be89d4-1797-4586-b1b5-0d0831c2c2ca\") " pod="openshift-marketplace/redhat-marketplace-dwzlt" Mar 14 10:44:02 crc kubenswrapper[5129]: I0314 10:44:02.273240 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2be89d4-1797-4586-b1b5-0d0831c2c2ca-catalog-content\") pod \"redhat-marketplace-dwzlt\" (UID: \"b2be89d4-1797-4586-b1b5-0d0831c2c2ca\") " pod="openshift-marketplace/redhat-marketplace-dwzlt" Mar 14 10:44:02 crc kubenswrapper[5129]: I0314 10:44:02.273312 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vld5\" (UniqueName: \"kubernetes.io/projected/b2be89d4-1797-4586-b1b5-0d0831c2c2ca-kube-api-access-7vld5\") pod \"redhat-marketplace-dwzlt\" (UID: \"b2be89d4-1797-4586-b1b5-0d0831c2c2ca\") " pod="openshift-marketplace/redhat-marketplace-dwzlt" Mar 14 10:44:02 crc kubenswrapper[5129]: I0314 10:44:02.375825 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2be89d4-1797-4586-b1b5-0d0831c2c2ca-catalog-content\") pod \"redhat-marketplace-dwzlt\" (UID: \"b2be89d4-1797-4586-b1b5-0d0831c2c2ca\") " pod="openshift-marketplace/redhat-marketplace-dwzlt" Mar 14 10:44:02 crc kubenswrapper[5129]: I0314 10:44:02.375900 5129 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7vld5\" (UniqueName: \"kubernetes.io/projected/b2be89d4-1797-4586-b1b5-0d0831c2c2ca-kube-api-access-7vld5\") pod \"redhat-marketplace-dwzlt\" (UID: \"b2be89d4-1797-4586-b1b5-0d0831c2c2ca\") " pod="openshift-marketplace/redhat-marketplace-dwzlt" Mar 14 10:44:02 crc kubenswrapper[5129]: I0314 10:44:02.376015 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2be89d4-1797-4586-b1b5-0d0831c2c2ca-utilities\") pod \"redhat-marketplace-dwzlt\" (UID: \"b2be89d4-1797-4586-b1b5-0d0831c2c2ca\") " pod="openshift-marketplace/redhat-marketplace-dwzlt" Mar 14 10:44:02 crc kubenswrapper[5129]: I0314 10:44:02.376432 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2be89d4-1797-4586-b1b5-0d0831c2c2ca-utilities\") pod \"redhat-marketplace-dwzlt\" (UID: \"b2be89d4-1797-4586-b1b5-0d0831c2c2ca\") " pod="openshift-marketplace/redhat-marketplace-dwzlt" Mar 14 10:44:02 crc kubenswrapper[5129]: I0314 10:44:02.376428 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2be89d4-1797-4586-b1b5-0d0831c2c2ca-catalog-content\") pod \"redhat-marketplace-dwzlt\" (UID: \"b2be89d4-1797-4586-b1b5-0d0831c2c2ca\") " pod="openshift-marketplace/redhat-marketplace-dwzlt" Mar 14 10:44:02 crc kubenswrapper[5129]: I0314 10:44:02.393489 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vld5\" (UniqueName: \"kubernetes.io/projected/b2be89d4-1797-4586-b1b5-0d0831c2c2ca-kube-api-access-7vld5\") pod \"redhat-marketplace-dwzlt\" (UID: \"b2be89d4-1797-4586-b1b5-0d0831c2c2ca\") " pod="openshift-marketplace/redhat-marketplace-dwzlt" Mar 14 10:44:02 crc kubenswrapper[5129]: I0314 10:44:02.514979 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwzlt" Mar 14 10:44:03 crc kubenswrapper[5129]: I0314 10:44:03.253635 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwzlt"] Mar 14 10:44:03 crc kubenswrapper[5129]: I0314 10:44:03.369067 5129 generic.go:334] "Generic (PLEG): container finished" podID="94b63db9-916b-4f5e-8342-c6a660e2da0b" containerID="7a746d823d9f87d9fb284255add3f6676172e7e58547ab99e16029610ac0d788" exitCode=0 Mar 14 10:44:03 crc kubenswrapper[5129]: I0314 10:44:03.369134 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558084-dkpbc" event={"ID":"94b63db9-916b-4f5e-8342-c6a660e2da0b","Type":"ContainerDied","Data":"7a746d823d9f87d9fb284255add3f6676172e7e58547ab99e16029610ac0d788"} Mar 14 10:44:03 crc kubenswrapper[5129]: I0314 10:44:03.371758 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwzlt" event={"ID":"b2be89d4-1797-4586-b1b5-0d0831c2c2ca","Type":"ContainerStarted","Data":"334cce3442578d6638d0f1ad49f591a6d6f5af94f82c4f05d1fa4ce0a37b2de9"} Mar 14 10:44:04 crc kubenswrapper[5129]: I0314 10:44:04.387714 5129 generic.go:334] "Generic (PLEG): container finished" podID="b2be89d4-1797-4586-b1b5-0d0831c2c2ca" containerID="434d39f3f3430f589993661080ea5a8547a262e8d3ed713639116649b394ccbd" exitCode=0 Mar 14 10:44:04 crc kubenswrapper[5129]: I0314 10:44:04.389181 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwzlt" event={"ID":"b2be89d4-1797-4586-b1b5-0d0831c2c2ca","Type":"ContainerDied","Data":"434d39f3f3430f589993661080ea5a8547a262e8d3ed713639116649b394ccbd"} Mar 14 10:44:05 crc kubenswrapper[5129]: I0314 10:44:05.404170 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwzlt" 
event={"ID":"b2be89d4-1797-4586-b1b5-0d0831c2c2ca","Type":"ContainerStarted","Data":"63b28397cd171a54688206cd986aa8c933deadf653c22c4a31e266f7b1f18631"} Mar 14 10:44:06 crc kubenswrapper[5129]: I0314 10:44:06.022879 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558084-dkpbc" Mar 14 10:44:06 crc kubenswrapper[5129]: I0314 10:44:06.060307 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g46l9\" (UniqueName: \"kubernetes.io/projected/94b63db9-916b-4f5e-8342-c6a660e2da0b-kube-api-access-g46l9\") pod \"94b63db9-916b-4f5e-8342-c6a660e2da0b\" (UID: \"94b63db9-916b-4f5e-8342-c6a660e2da0b\") " Mar 14 10:44:06 crc kubenswrapper[5129]: I0314 10:44:06.089795 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b63db9-916b-4f5e-8342-c6a660e2da0b-kube-api-access-g46l9" (OuterVolumeSpecName: "kube-api-access-g46l9") pod "94b63db9-916b-4f5e-8342-c6a660e2da0b" (UID: "94b63db9-916b-4f5e-8342-c6a660e2da0b"). InnerVolumeSpecName "kube-api-access-g46l9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:44:06 crc kubenswrapper[5129]: I0314 10:44:06.165139 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g46l9\" (UniqueName: \"kubernetes.io/projected/94b63db9-916b-4f5e-8342-c6a660e2da0b-kube-api-access-g46l9\") on node \"crc\" DevicePath \"\"" Mar 14 10:44:06 crc kubenswrapper[5129]: I0314 10:44:06.414385 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558084-dkpbc" event={"ID":"94b63db9-916b-4f5e-8342-c6a660e2da0b","Type":"ContainerDied","Data":"cad0c645b74e55a720ac4d789af9661d76f1998cbb52d46af40b527e7de841b7"} Mar 14 10:44:06 crc kubenswrapper[5129]: I0314 10:44:06.414419 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cad0c645b74e55a720ac4d789af9661d76f1998cbb52d46af40b527e7de841b7" Mar 14 10:44:06 crc kubenswrapper[5129]: I0314 10:44:06.414467 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558084-dkpbc" Mar 14 10:44:06 crc kubenswrapper[5129]: I0314 10:44:06.417506 5129 generic.go:334] "Generic (PLEG): container finished" podID="b2be89d4-1797-4586-b1b5-0d0831c2c2ca" containerID="63b28397cd171a54688206cd986aa8c933deadf653c22c4a31e266f7b1f18631" exitCode=0 Mar 14 10:44:06 crc kubenswrapper[5129]: I0314 10:44:06.417569 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwzlt" event={"ID":"b2be89d4-1797-4586-b1b5-0d0831c2c2ca","Type":"ContainerDied","Data":"63b28397cd171a54688206cd986aa8c933deadf653c22c4a31e266f7b1f18631"} Mar 14 10:44:07 crc kubenswrapper[5129]: I0314 10:44:07.108190 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558078-m7mj2"] Mar 14 10:44:07 crc kubenswrapper[5129]: I0314 10:44:07.118141 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-infra/auto-csr-approver-29558078-m7mj2"] Mar 14 10:44:07 crc kubenswrapper[5129]: I0314 10:44:07.428440 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwzlt" event={"ID":"b2be89d4-1797-4586-b1b5-0d0831c2c2ca","Type":"ContainerStarted","Data":"8b497b59d44c157880abb1f01ab51d212f539dc75d8b68f1ea38c4c29d2d8ea2"} Mar 14 10:44:07 crc kubenswrapper[5129]: I0314 10:44:07.453095 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dwzlt" podStartSLOduration=3.058971858 podStartE2EDuration="5.453076s" podCreationTimestamp="2026-03-14 10:44:02 +0000 UTC" firstStartedPulling="2026-03-14 10:44:04.389962709 +0000 UTC m=+13507.141877893" lastFinishedPulling="2026-03-14 10:44:06.784066851 +0000 UTC m=+13509.535982035" observedRunningTime="2026-03-14 10:44:07.443679754 +0000 UTC m=+13510.195594938" watchObservedRunningTime="2026-03-14 10:44:07.453076 +0000 UTC m=+13510.204991184" Mar 14 10:44:08 crc kubenswrapper[5129]: I0314 10:44:08.061258 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6435d505-cd76-4e43-b41c-7eb92c46b41e" path="/var/lib/kubelet/pods/6435d505-cd76-4e43-b41c-7eb92c46b41e/volumes" Mar 14 10:44:12 crc kubenswrapper[5129]: I0314 10:44:12.515591 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dwzlt" Mar 14 10:44:12 crc kubenswrapper[5129]: I0314 10:44:12.516104 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dwzlt" Mar 14 10:44:12 crc kubenswrapper[5129]: I0314 10:44:12.568629 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dwzlt" Mar 14 10:44:13 crc kubenswrapper[5129]: I0314 10:44:13.541882 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-dwzlt" Mar 14 10:44:13 crc kubenswrapper[5129]: I0314 10:44:13.592490 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwzlt"] Mar 14 10:44:14 crc kubenswrapper[5129]: I0314 10:44:14.036354 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:44:14 crc kubenswrapper[5129]: E0314 10:44:14.036731 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:44:15 crc kubenswrapper[5129]: I0314 10:44:15.495950 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dwzlt" podUID="b2be89d4-1797-4586-b1b5-0d0831c2c2ca" containerName="registry-server" containerID="cri-o://8b497b59d44c157880abb1f01ab51d212f539dc75d8b68f1ea38c4c29d2d8ea2" gracePeriod=2 Mar 14 10:44:16 crc kubenswrapper[5129]: I0314 10:44:16.507512 5129 generic.go:334] "Generic (PLEG): container finished" podID="b2be89d4-1797-4586-b1b5-0d0831c2c2ca" containerID="8b497b59d44c157880abb1f01ab51d212f539dc75d8b68f1ea38c4c29d2d8ea2" exitCode=0 Mar 14 10:44:16 crc kubenswrapper[5129]: I0314 10:44:16.507587 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwzlt" event={"ID":"b2be89d4-1797-4586-b1b5-0d0831c2c2ca","Type":"ContainerDied","Data":"8b497b59d44c157880abb1f01ab51d212f539dc75d8b68f1ea38c4c29d2d8ea2"} Mar 14 10:44:17 crc kubenswrapper[5129]: I0314 10:44:17.059527 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwzlt" Mar 14 10:44:17 crc kubenswrapper[5129]: I0314 10:44:17.093371 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2be89d4-1797-4586-b1b5-0d0831c2c2ca-catalog-content\") pod \"b2be89d4-1797-4586-b1b5-0d0831c2c2ca\" (UID: \"b2be89d4-1797-4586-b1b5-0d0831c2c2ca\") " Mar 14 10:44:17 crc kubenswrapper[5129]: I0314 10:44:17.093624 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vld5\" (UniqueName: \"kubernetes.io/projected/b2be89d4-1797-4586-b1b5-0d0831c2c2ca-kube-api-access-7vld5\") pod \"b2be89d4-1797-4586-b1b5-0d0831c2c2ca\" (UID: \"b2be89d4-1797-4586-b1b5-0d0831c2c2ca\") " Mar 14 10:44:17 crc kubenswrapper[5129]: I0314 10:44:17.093667 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2be89d4-1797-4586-b1b5-0d0831c2c2ca-utilities\") pod \"b2be89d4-1797-4586-b1b5-0d0831c2c2ca\" (UID: \"b2be89d4-1797-4586-b1b5-0d0831c2c2ca\") " Mar 14 10:44:17 crc kubenswrapper[5129]: I0314 10:44:17.097024 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2be89d4-1797-4586-b1b5-0d0831c2c2ca-utilities" (OuterVolumeSpecName: "utilities") pod "b2be89d4-1797-4586-b1b5-0d0831c2c2ca" (UID: "b2be89d4-1797-4586-b1b5-0d0831c2c2ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:44:17 crc kubenswrapper[5129]: I0314 10:44:17.106861 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2be89d4-1797-4586-b1b5-0d0831c2c2ca-kube-api-access-7vld5" (OuterVolumeSpecName: "kube-api-access-7vld5") pod "b2be89d4-1797-4586-b1b5-0d0831c2c2ca" (UID: "b2be89d4-1797-4586-b1b5-0d0831c2c2ca"). InnerVolumeSpecName "kube-api-access-7vld5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:44:17 crc kubenswrapper[5129]: I0314 10:44:17.178283 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2be89d4-1797-4586-b1b5-0d0831c2c2ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2be89d4-1797-4586-b1b5-0d0831c2c2ca" (UID: "b2be89d4-1797-4586-b1b5-0d0831c2c2ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:44:17 crc kubenswrapper[5129]: I0314 10:44:17.196834 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vld5\" (UniqueName: \"kubernetes.io/projected/b2be89d4-1797-4586-b1b5-0d0831c2c2ca-kube-api-access-7vld5\") on node \"crc\" DevicePath \"\"" Mar 14 10:44:17 crc kubenswrapper[5129]: I0314 10:44:17.196863 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2be89d4-1797-4586-b1b5-0d0831c2c2ca-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 10:44:17 crc kubenswrapper[5129]: I0314 10:44:17.196873 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2be89d4-1797-4586-b1b5-0d0831c2c2ca-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 10:44:17 crc kubenswrapper[5129]: I0314 10:44:17.518044 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwzlt" event={"ID":"b2be89d4-1797-4586-b1b5-0d0831c2c2ca","Type":"ContainerDied","Data":"334cce3442578d6638d0f1ad49f591a6d6f5af94f82c4f05d1fa4ce0a37b2de9"} Mar 14 10:44:17 crc kubenswrapper[5129]: I0314 10:44:17.518126 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwzlt" Mar 14 10:44:17 crc kubenswrapper[5129]: I0314 10:44:17.518348 5129 scope.go:117] "RemoveContainer" containerID="8b497b59d44c157880abb1f01ab51d212f539dc75d8b68f1ea38c4c29d2d8ea2" Mar 14 10:44:17 crc kubenswrapper[5129]: I0314 10:44:17.536905 5129 scope.go:117] "RemoveContainer" containerID="63b28397cd171a54688206cd986aa8c933deadf653c22c4a31e266f7b1f18631" Mar 14 10:44:17 crc kubenswrapper[5129]: I0314 10:44:17.567770 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwzlt"] Mar 14 10:44:17 crc kubenswrapper[5129]: I0314 10:44:17.591191 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwzlt"] Mar 14 10:44:17 crc kubenswrapper[5129]: I0314 10:44:17.599292 5129 scope.go:117] "RemoveContainer" containerID="434d39f3f3430f589993661080ea5a8547a262e8d3ed713639116649b394ccbd" Mar 14 10:44:18 crc kubenswrapper[5129]: I0314 10:44:18.048807 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2be89d4-1797-4586-b1b5-0d0831c2c2ca" path="/var/lib/kubelet/pods/b2be89d4-1797-4586-b1b5-0d0831c2c2ca/volumes" Mar 14 10:44:29 crc kubenswrapper[5129]: I0314 10:44:29.036903 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:44:29 crc kubenswrapper[5129]: E0314 10:44:29.037613 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:44:42 crc kubenswrapper[5129]: I0314 10:44:42.036453 5129 scope.go:117] "RemoveContainer" 
containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:44:42 crc kubenswrapper[5129]: E0314 10:44:42.037175 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:44:43 crc kubenswrapper[5129]: I0314 10:44:43.852165 5129 scope.go:117] "RemoveContainer" containerID="bfcd4769cc567bd4313440d1e97a7460a0698bdf36833a5f3848fd7a10fe083a" Mar 14 10:44:56 crc kubenswrapper[5129]: I0314 10:44:56.036294 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:44:56 crc kubenswrapper[5129]: E0314 10:44:56.037112 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.160575 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h"] Mar 14 10:45:00 crc kubenswrapper[5129]: E0314 10:45:00.161448 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2be89d4-1797-4586-b1b5-0d0831c2c2ca" containerName="extract-content" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.161459 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2be89d4-1797-4586-b1b5-0d0831c2c2ca" 
containerName="extract-content" Mar 14 10:45:00 crc kubenswrapper[5129]: E0314 10:45:00.161474 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b63db9-916b-4f5e-8342-c6a660e2da0b" containerName="oc" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.161480 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b63db9-916b-4f5e-8342-c6a660e2da0b" containerName="oc" Mar 14 10:45:00 crc kubenswrapper[5129]: E0314 10:45:00.161493 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2be89d4-1797-4586-b1b5-0d0831c2c2ca" containerName="registry-server" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.161501 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2be89d4-1797-4586-b1b5-0d0831c2c2ca" containerName="registry-server" Mar 14 10:45:00 crc kubenswrapper[5129]: E0314 10:45:00.161510 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2be89d4-1797-4586-b1b5-0d0831c2c2ca" containerName="extract-utilities" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.161517 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2be89d4-1797-4586-b1b5-0d0831c2c2ca" containerName="extract-utilities" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.161805 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="94b63db9-916b-4f5e-8342-c6a660e2da0b" containerName="oc" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.161822 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2be89d4-1797-4586-b1b5-0d0831c2c2ca" containerName="registry-server" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.162552 5129 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.164928 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.165075 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.182066 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h"] Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.327269 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gkxb\" (UniqueName: \"kubernetes.io/projected/77b10f35-6fb9-4fe3-b84a-f1704c3f29c8-kube-api-access-2gkxb\") pod \"collect-profiles-29558085-mcq6h\" (UID: \"77b10f35-6fb9-4fe3-b84a-f1704c3f29c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.327666 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77b10f35-6fb9-4fe3-b84a-f1704c3f29c8-secret-volume\") pod \"collect-profiles-29558085-mcq6h\" (UID: \"77b10f35-6fb9-4fe3-b84a-f1704c3f29c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.327700 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77b10f35-6fb9-4fe3-b84a-f1704c3f29c8-config-volume\") pod \"collect-profiles-29558085-mcq6h\" (UID: \"77b10f35-6fb9-4fe3-b84a-f1704c3f29c8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.429544 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77b10f35-6fb9-4fe3-b84a-f1704c3f29c8-secret-volume\") pod \"collect-profiles-29558085-mcq6h\" (UID: \"77b10f35-6fb9-4fe3-b84a-f1704c3f29c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.429593 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77b10f35-6fb9-4fe3-b84a-f1704c3f29c8-config-volume\") pod \"collect-profiles-29558085-mcq6h\" (UID: \"77b10f35-6fb9-4fe3-b84a-f1704c3f29c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.429805 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gkxb\" (UniqueName: \"kubernetes.io/projected/77b10f35-6fb9-4fe3-b84a-f1704c3f29c8-kube-api-access-2gkxb\") pod \"collect-profiles-29558085-mcq6h\" (UID: \"77b10f35-6fb9-4fe3-b84a-f1704c3f29c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.431594 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77b10f35-6fb9-4fe3-b84a-f1704c3f29c8-config-volume\") pod \"collect-profiles-29558085-mcq6h\" (UID: \"77b10f35-6fb9-4fe3-b84a-f1704c3f29c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.436256 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/77b10f35-6fb9-4fe3-b84a-f1704c3f29c8-secret-volume\") pod \"collect-profiles-29558085-mcq6h\" (UID: \"77b10f35-6fb9-4fe3-b84a-f1704c3f29c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.446949 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gkxb\" (UniqueName: \"kubernetes.io/projected/77b10f35-6fb9-4fe3-b84a-f1704c3f29c8-kube-api-access-2gkxb\") pod \"collect-profiles-29558085-mcq6h\" (UID: \"77b10f35-6fb9-4fe3-b84a-f1704c3f29c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h" Mar 14 10:45:00 crc kubenswrapper[5129]: I0314 10:45:00.485175 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h" Mar 14 10:45:01 crc kubenswrapper[5129]: I0314 10:45:01.162779 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h"] Mar 14 10:45:01 crc kubenswrapper[5129]: I0314 10:45:01.975954 5129 generic.go:334] "Generic (PLEG): container finished" podID="77b10f35-6fb9-4fe3-b84a-f1704c3f29c8" containerID="344a0cea91df816ac95eb1b43ee47ea435fe0510318b78e80868fd672a3c98d7" exitCode=0 Mar 14 10:45:01 crc kubenswrapper[5129]: I0314 10:45:01.976061 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h" event={"ID":"77b10f35-6fb9-4fe3-b84a-f1704c3f29c8","Type":"ContainerDied","Data":"344a0cea91df816ac95eb1b43ee47ea435fe0510318b78e80868fd672a3c98d7"} Mar 14 10:45:01 crc kubenswrapper[5129]: I0314 10:45:01.976207 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h" 
event={"ID":"77b10f35-6fb9-4fe3-b84a-f1704c3f29c8","Type":"ContainerStarted","Data":"ecaf5ca35c72185d3979836a9e88c77a38b2311ec4424942478aaf0b1dd59ec7"} Mar 14 10:45:04 crc kubenswrapper[5129]: I0314 10:45:04.390263 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h" Mar 14 10:45:04 crc kubenswrapper[5129]: I0314 10:45:04.504796 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77b10f35-6fb9-4fe3-b84a-f1704c3f29c8-config-volume\") pod \"77b10f35-6fb9-4fe3-b84a-f1704c3f29c8\" (UID: \"77b10f35-6fb9-4fe3-b84a-f1704c3f29c8\") " Mar 14 10:45:04 crc kubenswrapper[5129]: I0314 10:45:04.505969 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77b10f35-6fb9-4fe3-b84a-f1704c3f29c8-secret-volume\") pod \"77b10f35-6fb9-4fe3-b84a-f1704c3f29c8\" (UID: \"77b10f35-6fb9-4fe3-b84a-f1704c3f29c8\") " Mar 14 10:45:04 crc kubenswrapper[5129]: I0314 10:45:04.506656 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gkxb\" (UniqueName: \"kubernetes.io/projected/77b10f35-6fb9-4fe3-b84a-f1704c3f29c8-kube-api-access-2gkxb\") pod \"77b10f35-6fb9-4fe3-b84a-f1704c3f29c8\" (UID: \"77b10f35-6fb9-4fe3-b84a-f1704c3f29c8\") " Mar 14 10:45:04 crc kubenswrapper[5129]: I0314 10:45:04.505741 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77b10f35-6fb9-4fe3-b84a-f1704c3f29c8-config-volume" (OuterVolumeSpecName: "config-volume") pod "77b10f35-6fb9-4fe3-b84a-f1704c3f29c8" (UID: "77b10f35-6fb9-4fe3-b84a-f1704c3f29c8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:45:04 crc kubenswrapper[5129]: I0314 10:45:04.531416 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77b10f35-6fb9-4fe3-b84a-f1704c3f29c8-kube-api-access-2gkxb" (OuterVolumeSpecName: "kube-api-access-2gkxb") pod "77b10f35-6fb9-4fe3-b84a-f1704c3f29c8" (UID: "77b10f35-6fb9-4fe3-b84a-f1704c3f29c8"). InnerVolumeSpecName "kube-api-access-2gkxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:45:04 crc kubenswrapper[5129]: I0314 10:45:04.546819 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77b10f35-6fb9-4fe3-b84a-f1704c3f29c8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "77b10f35-6fb9-4fe3-b84a-f1704c3f29c8" (UID: "77b10f35-6fb9-4fe3-b84a-f1704c3f29c8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:45:04 crc kubenswrapper[5129]: I0314 10:45:04.549143 5129 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77b10f35-6fb9-4fe3-b84a-f1704c3f29c8-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 10:45:04 crc kubenswrapper[5129]: I0314 10:45:04.651136 5129 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77b10f35-6fb9-4fe3-b84a-f1704c3f29c8-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 10:45:04 crc kubenswrapper[5129]: I0314 10:45:04.651185 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gkxb\" (UniqueName: \"kubernetes.io/projected/77b10f35-6fb9-4fe3-b84a-f1704c3f29c8-kube-api-access-2gkxb\") on node \"crc\" DevicePath \"\"" Mar 14 10:45:05 crc kubenswrapper[5129]: I0314 10:45:05.004149 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h" 
event={"ID":"77b10f35-6fb9-4fe3-b84a-f1704c3f29c8","Type":"ContainerDied","Data":"ecaf5ca35c72185d3979836a9e88c77a38b2311ec4424942478aaf0b1dd59ec7"} Mar 14 10:45:05 crc kubenswrapper[5129]: I0314 10:45:05.004199 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecaf5ca35c72185d3979836a9e88c77a38b2311ec4424942478aaf0b1dd59ec7" Mar 14 10:45:05 crc kubenswrapper[5129]: I0314 10:45:05.004213 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558085-mcq6h" Mar 14 10:45:05 crc kubenswrapper[5129]: I0314 10:45:05.455807 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs"] Mar 14 10:45:05 crc kubenswrapper[5129]: I0314 10:45:05.467633 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558040-tnfxs"] Mar 14 10:45:06 crc kubenswrapper[5129]: I0314 10:45:06.048085 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38749edd-4240-47f4-b744-227e1cfee8e4" path="/var/lib/kubelet/pods/38749edd-4240-47f4-b744-227e1cfee8e4/volumes" Mar 14 10:45:09 crc kubenswrapper[5129]: I0314 10:45:09.036307 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:45:09 crc kubenswrapper[5129]: E0314 10:45:09.036966 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:45:19 crc kubenswrapper[5129]: I0314 10:45:19.384883 5129 generic.go:334] "Generic (PLEG): 
container finished" podID="f8038849-2f50-4aa8-a28e-9a6b346f12e9" containerID="ae27cc552a65078897ff027698bfea5c4d2bd188b407d2bfaed3967ba151d4ab" exitCode=0 Mar 14 10:45:19 crc kubenswrapper[5129]: I0314 10:45:19.384972 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4f4t/must-gather-zcqk2" event={"ID":"f8038849-2f50-4aa8-a28e-9a6b346f12e9","Type":"ContainerDied","Data":"ae27cc552a65078897ff027698bfea5c4d2bd188b407d2bfaed3967ba151d4ab"} Mar 14 10:45:19 crc kubenswrapper[5129]: I0314 10:45:19.386190 5129 scope.go:117] "RemoveContainer" containerID="ae27cc552a65078897ff027698bfea5c4d2bd188b407d2bfaed3967ba151d4ab" Mar 14 10:45:19 crc kubenswrapper[5129]: I0314 10:45:19.950783 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g4f4t_must-gather-zcqk2_f8038849-2f50-4aa8-a28e-9a6b346f12e9/gather/0.log" Mar 14 10:45:21 crc kubenswrapper[5129]: I0314 10:45:21.036282 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:45:21 crc kubenswrapper[5129]: E0314 10:45:21.036725 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:45:32 crc kubenswrapper[5129]: I0314 10:45:32.770551 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-g4f4t/must-gather-zcqk2"] Mar 14 10:45:32 crc kubenswrapper[5129]: I0314 10:45:32.771375 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-g4f4t/must-gather-zcqk2" podUID="f8038849-2f50-4aa8-a28e-9a6b346f12e9" containerName="copy" 
containerID="cri-o://e06f467eafe274ec8e5cc7181f490aebb536e0b420b9a00c0bd8e2f08bc15c48" gracePeriod=2 Mar 14 10:45:32 crc kubenswrapper[5129]: I0314 10:45:32.790858 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-g4f4t/must-gather-zcqk2"] Mar 14 10:45:33 crc kubenswrapper[5129]: I0314 10:45:33.037186 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:45:33 crc kubenswrapper[5129]: E0314 10:45:33.037519 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:45:33 crc kubenswrapper[5129]: I0314 10:45:33.557046 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g4f4t_must-gather-zcqk2_f8038849-2f50-4aa8-a28e-9a6b346f12e9/copy/0.log" Mar 14 10:45:33 crc kubenswrapper[5129]: I0314 10:45:33.557759 5129 generic.go:334] "Generic (PLEG): container finished" podID="f8038849-2f50-4aa8-a28e-9a6b346f12e9" containerID="e06f467eafe274ec8e5cc7181f490aebb536e0b420b9a00c0bd8e2f08bc15c48" exitCode=143 Mar 14 10:45:34 crc kubenswrapper[5129]: I0314 10:45:34.283139 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g4f4t_must-gather-zcqk2_f8038849-2f50-4aa8-a28e-9a6b346f12e9/copy/0.log" Mar 14 10:45:34 crc kubenswrapper[5129]: I0314 10:45:34.283576 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g4f4t/must-gather-zcqk2" Mar 14 10:45:34 crc kubenswrapper[5129]: I0314 10:45:34.408616 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k24t9\" (UniqueName: \"kubernetes.io/projected/f8038849-2f50-4aa8-a28e-9a6b346f12e9-kube-api-access-k24t9\") pod \"f8038849-2f50-4aa8-a28e-9a6b346f12e9\" (UID: \"f8038849-2f50-4aa8-a28e-9a6b346f12e9\") " Mar 14 10:45:34 crc kubenswrapper[5129]: I0314 10:45:34.409080 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f8038849-2f50-4aa8-a28e-9a6b346f12e9-must-gather-output\") pod \"f8038849-2f50-4aa8-a28e-9a6b346f12e9\" (UID: \"f8038849-2f50-4aa8-a28e-9a6b346f12e9\") " Mar 14 10:45:34 crc kubenswrapper[5129]: I0314 10:45:34.431758 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8038849-2f50-4aa8-a28e-9a6b346f12e9-kube-api-access-k24t9" (OuterVolumeSpecName: "kube-api-access-k24t9") pod "f8038849-2f50-4aa8-a28e-9a6b346f12e9" (UID: "f8038849-2f50-4aa8-a28e-9a6b346f12e9"). InnerVolumeSpecName "kube-api-access-k24t9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:45:34 crc kubenswrapper[5129]: I0314 10:45:34.511647 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k24t9\" (UniqueName: \"kubernetes.io/projected/f8038849-2f50-4aa8-a28e-9a6b346f12e9-kube-api-access-k24t9\") on node \"crc\" DevicePath \"\"" Mar 14 10:45:34 crc kubenswrapper[5129]: I0314 10:45:34.569044 5129 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g4f4t_must-gather-zcqk2_f8038849-2f50-4aa8-a28e-9a6b346f12e9/copy/0.log" Mar 14 10:45:34 crc kubenswrapper[5129]: I0314 10:45:34.569663 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g4f4t/must-gather-zcqk2" Mar 14 10:45:34 crc kubenswrapper[5129]: I0314 10:45:34.569555 5129 scope.go:117] "RemoveContainer" containerID="e06f467eafe274ec8e5cc7181f490aebb536e0b420b9a00c0bd8e2f08bc15c48" Mar 14 10:45:34 crc kubenswrapper[5129]: I0314 10:45:34.594277 5129 scope.go:117] "RemoveContainer" containerID="ae27cc552a65078897ff027698bfea5c4d2bd188b407d2bfaed3967ba151d4ab" Mar 14 10:45:34 crc kubenswrapper[5129]: I0314 10:45:34.691748 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8038849-2f50-4aa8-a28e-9a6b346f12e9-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f8038849-2f50-4aa8-a28e-9a6b346f12e9" (UID: "f8038849-2f50-4aa8-a28e-9a6b346f12e9"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:45:34 crc kubenswrapper[5129]: I0314 10:45:34.714812 5129 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f8038849-2f50-4aa8-a28e-9a6b346f12e9-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 14 10:45:36 crc kubenswrapper[5129]: I0314 10:45:36.047690 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8038849-2f50-4aa8-a28e-9a6b346f12e9" path="/var/lib/kubelet/pods/f8038849-2f50-4aa8-a28e-9a6b346f12e9/volumes" Mar 14 10:45:43 crc kubenswrapper[5129]: I0314 10:45:43.965201 5129 scope.go:117] "RemoveContainer" containerID="1278758130bfc7c6566c3fcb95afbe01ccc6940e11bc660cee618d8eefb4e058" Mar 14 10:45:48 crc kubenswrapper[5129]: I0314 10:45:48.046181 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:45:48 crc kubenswrapper[5129]: E0314 10:45:48.047311 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:46:00 crc kubenswrapper[5129]: I0314 10:46:00.161300 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558086-grqgn"] Mar 14 10:46:00 crc kubenswrapper[5129]: E0314 10:46:00.162299 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8038849-2f50-4aa8-a28e-9a6b346f12e9" containerName="gather" Mar 14 10:46:00 crc kubenswrapper[5129]: I0314 10:46:00.162312 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8038849-2f50-4aa8-a28e-9a6b346f12e9" containerName="gather" Mar 14 10:46:00 crc kubenswrapper[5129]: E0314 10:46:00.162335 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8038849-2f50-4aa8-a28e-9a6b346f12e9" containerName="copy" Mar 14 10:46:00 crc kubenswrapper[5129]: I0314 10:46:00.162343 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8038849-2f50-4aa8-a28e-9a6b346f12e9" containerName="copy" Mar 14 10:46:00 crc kubenswrapper[5129]: E0314 10:46:00.162361 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77b10f35-6fb9-4fe3-b84a-f1704c3f29c8" containerName="collect-profiles" Mar 14 10:46:00 crc kubenswrapper[5129]: I0314 10:46:00.162367 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="77b10f35-6fb9-4fe3-b84a-f1704c3f29c8" containerName="collect-profiles" Mar 14 10:46:00 crc kubenswrapper[5129]: I0314 10:46:00.162574 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8038849-2f50-4aa8-a28e-9a6b346f12e9" containerName="copy" Mar 14 10:46:00 crc kubenswrapper[5129]: I0314 10:46:00.162588 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8038849-2f50-4aa8-a28e-9a6b346f12e9" containerName="gather" Mar 14 10:46:00 
crc kubenswrapper[5129]: I0314 10:46:00.162626 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="77b10f35-6fb9-4fe3-b84a-f1704c3f29c8" containerName="collect-profiles" Mar 14 10:46:00 crc kubenswrapper[5129]: I0314 10:46:00.163296 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558086-grqgn" Mar 14 10:46:00 crc kubenswrapper[5129]: I0314 10:46:00.165248 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:46:00 crc kubenswrapper[5129]: I0314 10:46:00.165678 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:46:00 crc kubenswrapper[5129]: I0314 10:46:00.170477 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:46:00 crc kubenswrapper[5129]: I0314 10:46:00.177769 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558086-grqgn"] Mar 14 10:46:00 crc kubenswrapper[5129]: I0314 10:46:00.279459 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klpd5\" (UniqueName: \"kubernetes.io/projected/206199a6-f6d1-47e7-8f45-605535b31856-kube-api-access-klpd5\") pod \"auto-csr-approver-29558086-grqgn\" (UID: \"206199a6-f6d1-47e7-8f45-605535b31856\") " pod="openshift-infra/auto-csr-approver-29558086-grqgn" Mar 14 10:46:00 crc kubenswrapper[5129]: I0314 10:46:00.382390 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klpd5\" (UniqueName: \"kubernetes.io/projected/206199a6-f6d1-47e7-8f45-605535b31856-kube-api-access-klpd5\") pod \"auto-csr-approver-29558086-grqgn\" (UID: \"206199a6-f6d1-47e7-8f45-605535b31856\") " pod="openshift-infra/auto-csr-approver-29558086-grqgn" Mar 14 10:46:00 crc kubenswrapper[5129]: I0314 
10:46:00.405894 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klpd5\" (UniqueName: \"kubernetes.io/projected/206199a6-f6d1-47e7-8f45-605535b31856-kube-api-access-klpd5\") pod \"auto-csr-approver-29558086-grqgn\" (UID: \"206199a6-f6d1-47e7-8f45-605535b31856\") " pod="openshift-infra/auto-csr-approver-29558086-grqgn" Mar 14 10:46:00 crc kubenswrapper[5129]: I0314 10:46:00.484367 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558086-grqgn" Mar 14 10:46:01 crc kubenswrapper[5129]: I0314 10:46:01.037169 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:46:01 crc kubenswrapper[5129]: E0314 10:46:01.037815 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:46:01 crc kubenswrapper[5129]: I0314 10:46:01.450496 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558086-grqgn"] Mar 14 10:46:01 crc kubenswrapper[5129]: W0314 10:46:01.459189 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod206199a6_f6d1_47e7_8f45_605535b31856.slice/crio-9a0d4a0ace32f3ba9a52cea918f4a99e6e1d0807742b8b4e9e908bbb2452f356 WatchSource:0}: Error finding container 9a0d4a0ace32f3ba9a52cea918f4a99e6e1d0807742b8b4e9e908bbb2452f356: Status 404 returned error can't find the container with id 9a0d4a0ace32f3ba9a52cea918f4a99e6e1d0807742b8b4e9e908bbb2452f356 Mar 14 10:46:01 crc kubenswrapper[5129]: I0314 
10:46:01.863013 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558086-grqgn" event={"ID":"206199a6-f6d1-47e7-8f45-605535b31856","Type":"ContainerStarted","Data":"9a0d4a0ace32f3ba9a52cea918f4a99e6e1d0807742b8b4e9e908bbb2452f356"} Mar 14 10:46:02 crc kubenswrapper[5129]: I0314 10:46:02.873065 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558086-grqgn" event={"ID":"206199a6-f6d1-47e7-8f45-605535b31856","Type":"ContainerStarted","Data":"46d73682941d1e786c86b6a9bee5ab150906052338884a976b759ca98bb03676"} Mar 14 10:46:02 crc kubenswrapper[5129]: I0314 10:46:02.898145 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558086-grqgn" podStartSLOduration=1.9031750509999998 podStartE2EDuration="2.898129126s" podCreationTimestamp="2026-03-14 10:46:00 +0000 UTC" firstStartedPulling="2026-03-14 10:46:01.461958344 +0000 UTC m=+13624.213873538" lastFinishedPulling="2026-03-14 10:46:02.456912429 +0000 UTC m=+13625.208827613" observedRunningTime="2026-03-14 10:46:02.892809901 +0000 UTC m=+13625.644725105" watchObservedRunningTime="2026-03-14 10:46:02.898129126 +0000 UTC m=+13625.650044310" Mar 14 10:46:03 crc kubenswrapper[5129]: I0314 10:46:03.884480 5129 generic.go:334] "Generic (PLEG): container finished" podID="206199a6-f6d1-47e7-8f45-605535b31856" containerID="46d73682941d1e786c86b6a9bee5ab150906052338884a976b759ca98bb03676" exitCode=0 Mar 14 10:46:03 crc kubenswrapper[5129]: I0314 10:46:03.884553 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558086-grqgn" event={"ID":"206199a6-f6d1-47e7-8f45-605535b31856","Type":"ContainerDied","Data":"46d73682941d1e786c86b6a9bee5ab150906052338884a976b759ca98bb03676"} Mar 14 10:46:06 crc kubenswrapper[5129]: I0314 10:46:06.346477 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558086-grqgn" Mar 14 10:46:06 crc kubenswrapper[5129]: I0314 10:46:06.407571 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klpd5\" (UniqueName: \"kubernetes.io/projected/206199a6-f6d1-47e7-8f45-605535b31856-kube-api-access-klpd5\") pod \"206199a6-f6d1-47e7-8f45-605535b31856\" (UID: \"206199a6-f6d1-47e7-8f45-605535b31856\") " Mar 14 10:46:06 crc kubenswrapper[5129]: I0314 10:46:06.414231 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/206199a6-f6d1-47e7-8f45-605535b31856-kube-api-access-klpd5" (OuterVolumeSpecName: "kube-api-access-klpd5") pod "206199a6-f6d1-47e7-8f45-605535b31856" (UID: "206199a6-f6d1-47e7-8f45-605535b31856"). InnerVolumeSpecName "kube-api-access-klpd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:46:06 crc kubenswrapper[5129]: I0314 10:46:06.510532 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klpd5\" (UniqueName: \"kubernetes.io/projected/206199a6-f6d1-47e7-8f45-605535b31856-kube-api-access-klpd5\") on node \"crc\" DevicePath \"\"" Mar 14 10:46:06 crc kubenswrapper[5129]: I0314 10:46:06.923741 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558086-grqgn" event={"ID":"206199a6-f6d1-47e7-8f45-605535b31856","Type":"ContainerDied","Data":"9a0d4a0ace32f3ba9a52cea918f4a99e6e1d0807742b8b4e9e908bbb2452f356"} Mar 14 10:46:06 crc kubenswrapper[5129]: I0314 10:46:06.924048 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a0d4a0ace32f3ba9a52cea918f4a99e6e1d0807742b8b4e9e908bbb2452f356" Mar 14 10:46:06 crc kubenswrapper[5129]: I0314 10:46:06.923847 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558086-grqgn" Mar 14 10:46:07 crc kubenswrapper[5129]: I0314 10:46:07.435491 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558080-vhsf9"] Mar 14 10:46:07 crc kubenswrapper[5129]: I0314 10:46:07.452402 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558080-vhsf9"] Mar 14 10:46:08 crc kubenswrapper[5129]: I0314 10:46:08.057091 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0c52b71-8ab2-4690-a0ae-900bbc1bddac" path="/var/lib/kubelet/pods/b0c52b71-8ab2-4690-a0ae-900bbc1bddac/volumes" Mar 14 10:46:14 crc kubenswrapper[5129]: I0314 10:46:14.037039 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:46:14 crc kubenswrapper[5129]: E0314 10:46:14.038018 5129 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lf9lh_openshift-machine-config-operator(58bd6165-e663-4c4e-82ae-6009ff348000)\"" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" Mar 14 10:46:25 crc kubenswrapper[5129]: I0314 10:46:25.036485 5129 scope.go:117] "RemoveContainer" containerID="85c64989780c9729c536db55f547973bcc9435faef5186fa2693d47e28dbd553" Mar 14 10:46:26 crc kubenswrapper[5129]: I0314 10:46:26.188564 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" event={"ID":"58bd6165-e663-4c4e-82ae-6009ff348000","Type":"ContainerStarted","Data":"d55d1beb8d93bfe755cf7f8ed03463941f0cfaf5979f6d0b2f94ea95daa236b9"} Mar 14 10:46:44 crc kubenswrapper[5129]: I0314 10:46:44.078787 5129 scope.go:117] "RemoveContainer" 
containerID="9247530f996a6e3c58d678f6b9f461d343db8e9dcd962914257d3b9e360402c2" Mar 14 10:47:20 crc kubenswrapper[5129]: I0314 10:47:20.856821 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2nggq"] Mar 14 10:47:20 crc kubenswrapper[5129]: E0314 10:47:20.857813 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="206199a6-f6d1-47e7-8f45-605535b31856" containerName="oc" Mar 14 10:47:20 crc kubenswrapper[5129]: I0314 10:47:20.857827 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="206199a6-f6d1-47e7-8f45-605535b31856" containerName="oc" Mar 14 10:47:20 crc kubenswrapper[5129]: I0314 10:47:20.858069 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="206199a6-f6d1-47e7-8f45-605535b31856" containerName="oc" Mar 14 10:47:20 crc kubenswrapper[5129]: I0314 10:47:20.859507 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2nggq" Mar 14 10:47:20 crc kubenswrapper[5129]: I0314 10:47:20.874924 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2nggq"] Mar 14 10:47:21 crc kubenswrapper[5129]: I0314 10:47:21.036800 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vql8\" (UniqueName: \"kubernetes.io/projected/b282534b-b730-42a6-922a-c69ad5ab1b43-kube-api-access-4vql8\") pod \"community-operators-2nggq\" (UID: \"b282534b-b730-42a6-922a-c69ad5ab1b43\") " pod="openshift-marketplace/community-operators-2nggq" Mar 14 10:47:21 crc kubenswrapper[5129]: I0314 10:47:21.036883 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b282534b-b730-42a6-922a-c69ad5ab1b43-utilities\") pod \"community-operators-2nggq\" (UID: \"b282534b-b730-42a6-922a-c69ad5ab1b43\") " 
pod="openshift-marketplace/community-operators-2nggq" Mar 14 10:47:21 crc kubenswrapper[5129]: I0314 10:47:21.037197 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b282534b-b730-42a6-922a-c69ad5ab1b43-catalog-content\") pod \"community-operators-2nggq\" (UID: \"b282534b-b730-42a6-922a-c69ad5ab1b43\") " pod="openshift-marketplace/community-operators-2nggq" Mar 14 10:47:21 crc kubenswrapper[5129]: I0314 10:47:21.139018 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vql8\" (UniqueName: \"kubernetes.io/projected/b282534b-b730-42a6-922a-c69ad5ab1b43-kube-api-access-4vql8\") pod \"community-operators-2nggq\" (UID: \"b282534b-b730-42a6-922a-c69ad5ab1b43\") " pod="openshift-marketplace/community-operators-2nggq" Mar 14 10:47:21 crc kubenswrapper[5129]: I0314 10:47:21.139102 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b282534b-b730-42a6-922a-c69ad5ab1b43-utilities\") pod \"community-operators-2nggq\" (UID: \"b282534b-b730-42a6-922a-c69ad5ab1b43\") " pod="openshift-marketplace/community-operators-2nggq" Mar 14 10:47:21 crc kubenswrapper[5129]: I0314 10:47:21.139194 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b282534b-b730-42a6-922a-c69ad5ab1b43-catalog-content\") pod \"community-operators-2nggq\" (UID: \"b282534b-b730-42a6-922a-c69ad5ab1b43\") " pod="openshift-marketplace/community-operators-2nggq" Mar 14 10:47:21 crc kubenswrapper[5129]: I0314 10:47:21.139677 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b282534b-b730-42a6-922a-c69ad5ab1b43-catalog-content\") pod \"community-operators-2nggq\" (UID: \"b282534b-b730-42a6-922a-c69ad5ab1b43\") " 
pod="openshift-marketplace/community-operators-2nggq" Mar 14 10:47:21 crc kubenswrapper[5129]: I0314 10:47:21.139782 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b282534b-b730-42a6-922a-c69ad5ab1b43-utilities\") pod \"community-operators-2nggq\" (UID: \"b282534b-b730-42a6-922a-c69ad5ab1b43\") " pod="openshift-marketplace/community-operators-2nggq" Mar 14 10:47:21 crc kubenswrapper[5129]: I0314 10:47:21.160816 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vql8\" (UniqueName: \"kubernetes.io/projected/b282534b-b730-42a6-922a-c69ad5ab1b43-kube-api-access-4vql8\") pod \"community-operators-2nggq\" (UID: \"b282534b-b730-42a6-922a-c69ad5ab1b43\") " pod="openshift-marketplace/community-operators-2nggq" Mar 14 10:47:21 crc kubenswrapper[5129]: I0314 10:47:21.190375 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2nggq" Mar 14 10:47:21 crc kubenswrapper[5129]: W0314 10:47:21.962844 5129 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb282534b_b730_42a6_922a_c69ad5ab1b43.slice/crio-f760d224f72bbcee2137f3eba03af4d4e1da496cc94f46de117fe91ae9b5f8f4 WatchSource:0}: Error finding container f760d224f72bbcee2137f3eba03af4d4e1da496cc94f46de117fe91ae9b5f8f4: Status 404 returned error can't find the container with id f760d224f72bbcee2137f3eba03af4d4e1da496cc94f46de117fe91ae9b5f8f4 Mar 14 10:47:21 crc kubenswrapper[5129]: I0314 10:47:21.978407 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2nggq"] Mar 14 10:47:22 crc kubenswrapper[5129]: I0314 10:47:22.853058 5129 generic.go:334] "Generic (PLEG): container finished" podID="b282534b-b730-42a6-922a-c69ad5ab1b43" containerID="76db5b3366bff9608f8f65e374e9e35cfadacf9aa2f44d5698546fe7e6a5de26" exitCode=0 Mar 14 
10:47:22 crc kubenswrapper[5129]: I0314 10:47:22.853160 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nggq" event={"ID":"b282534b-b730-42a6-922a-c69ad5ab1b43","Type":"ContainerDied","Data":"76db5b3366bff9608f8f65e374e9e35cfadacf9aa2f44d5698546fe7e6a5de26"} Mar 14 10:47:22 crc kubenswrapper[5129]: I0314 10:47:22.853431 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nggq" event={"ID":"b282534b-b730-42a6-922a-c69ad5ab1b43","Type":"ContainerStarted","Data":"f760d224f72bbcee2137f3eba03af4d4e1da496cc94f46de117fe91ae9b5f8f4"} Mar 14 10:47:22 crc kubenswrapper[5129]: I0314 10:47:22.856923 5129 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 10:47:23 crc kubenswrapper[5129]: I0314 10:47:23.866837 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nggq" event={"ID":"b282534b-b730-42a6-922a-c69ad5ab1b43","Type":"ContainerStarted","Data":"b526dd8c424916755923fdbd81f61bdadaff09ca083ac7841e35c3103ddd80fb"} Mar 14 10:47:25 crc kubenswrapper[5129]: I0314 10:47:25.900809 5129 generic.go:334] "Generic (PLEG): container finished" podID="b282534b-b730-42a6-922a-c69ad5ab1b43" containerID="b526dd8c424916755923fdbd81f61bdadaff09ca083ac7841e35c3103ddd80fb" exitCode=0 Mar 14 10:47:25 crc kubenswrapper[5129]: I0314 10:47:25.900870 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nggq" event={"ID":"b282534b-b730-42a6-922a-c69ad5ab1b43","Type":"ContainerDied","Data":"b526dd8c424916755923fdbd81f61bdadaff09ca083ac7841e35c3103ddd80fb"} Mar 14 10:47:26 crc kubenswrapper[5129]: I0314 10:47:26.913165 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nggq" 
event={"ID":"b282534b-b730-42a6-922a-c69ad5ab1b43","Type":"ContainerStarted","Data":"4e5df2a1afc6e7c6d361986bd2ec57fedf57b1ed3e7f64fc60f4a5d4effefb16"} Mar 14 10:47:26 crc kubenswrapper[5129]: I0314 10:47:26.941045 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2nggq" podStartSLOduration=3.478708327 podStartE2EDuration="6.941025053s" podCreationTimestamp="2026-03-14 10:47:20 +0000 UTC" firstStartedPulling="2026-03-14 10:47:22.856582177 +0000 UTC m=+13705.608497371" lastFinishedPulling="2026-03-14 10:47:26.318898913 +0000 UTC m=+13709.070814097" observedRunningTime="2026-03-14 10:47:26.938667719 +0000 UTC m=+13709.690582903" watchObservedRunningTime="2026-03-14 10:47:26.941025053 +0000 UTC m=+13709.692940247" Mar 14 10:47:31 crc kubenswrapper[5129]: I0314 10:47:31.190825 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2nggq" Mar 14 10:47:31 crc kubenswrapper[5129]: I0314 10:47:31.191478 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2nggq" Mar 14 10:47:31 crc kubenswrapper[5129]: I0314 10:47:31.251799 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2nggq" Mar 14 10:47:32 crc kubenswrapper[5129]: I0314 10:47:32.071974 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2nggq" Mar 14 10:47:32 crc kubenswrapper[5129]: I0314 10:47:32.144591 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2nggq"] Mar 14 10:47:33 crc kubenswrapper[5129]: I0314 10:47:33.991684 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2nggq" podUID="b282534b-b730-42a6-922a-c69ad5ab1b43" containerName="registry-server" 
containerID="cri-o://4e5df2a1afc6e7c6d361986bd2ec57fedf57b1ed3e7f64fc60f4a5d4effefb16" gracePeriod=2 Mar 14 10:47:35 crc kubenswrapper[5129]: I0314 10:47:35.003664 5129 generic.go:334] "Generic (PLEG): container finished" podID="b282534b-b730-42a6-922a-c69ad5ab1b43" containerID="4e5df2a1afc6e7c6d361986bd2ec57fedf57b1ed3e7f64fc60f4a5d4effefb16" exitCode=0 Mar 14 10:47:35 crc kubenswrapper[5129]: I0314 10:47:35.004011 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nggq" event={"ID":"b282534b-b730-42a6-922a-c69ad5ab1b43","Type":"ContainerDied","Data":"4e5df2a1afc6e7c6d361986bd2ec57fedf57b1ed3e7f64fc60f4a5d4effefb16"} Mar 14 10:47:35 crc kubenswrapper[5129]: I0314 10:47:35.767471 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2nggq" Mar 14 10:47:35 crc kubenswrapper[5129]: I0314 10:47:35.786588 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b282534b-b730-42a6-922a-c69ad5ab1b43-utilities\") pod \"b282534b-b730-42a6-922a-c69ad5ab1b43\" (UID: \"b282534b-b730-42a6-922a-c69ad5ab1b43\") " Mar 14 10:47:35 crc kubenswrapper[5129]: I0314 10:47:35.786757 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b282534b-b730-42a6-922a-c69ad5ab1b43-catalog-content\") pod \"b282534b-b730-42a6-922a-c69ad5ab1b43\" (UID: \"b282534b-b730-42a6-922a-c69ad5ab1b43\") " Mar 14 10:47:35 crc kubenswrapper[5129]: I0314 10:47:35.786842 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vql8\" (UniqueName: \"kubernetes.io/projected/b282534b-b730-42a6-922a-c69ad5ab1b43-kube-api-access-4vql8\") pod \"b282534b-b730-42a6-922a-c69ad5ab1b43\" (UID: \"b282534b-b730-42a6-922a-c69ad5ab1b43\") " Mar 14 10:47:35 crc kubenswrapper[5129]: I0314 
10:47:35.787557 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b282534b-b730-42a6-922a-c69ad5ab1b43-utilities" (OuterVolumeSpecName: "utilities") pod "b282534b-b730-42a6-922a-c69ad5ab1b43" (UID: "b282534b-b730-42a6-922a-c69ad5ab1b43"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:47:35 crc kubenswrapper[5129]: I0314 10:47:35.794717 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b282534b-b730-42a6-922a-c69ad5ab1b43-kube-api-access-4vql8" (OuterVolumeSpecName: "kube-api-access-4vql8") pod "b282534b-b730-42a6-922a-c69ad5ab1b43" (UID: "b282534b-b730-42a6-922a-c69ad5ab1b43"). InnerVolumeSpecName "kube-api-access-4vql8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:47:35 crc kubenswrapper[5129]: I0314 10:47:35.855491 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b282534b-b730-42a6-922a-c69ad5ab1b43-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b282534b-b730-42a6-922a-c69ad5ab1b43" (UID: "b282534b-b730-42a6-922a-c69ad5ab1b43"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 10:47:35 crc kubenswrapper[5129]: I0314 10:47:35.889909 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b282534b-b730-42a6-922a-c69ad5ab1b43-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 10:47:35 crc kubenswrapper[5129]: I0314 10:47:35.889937 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vql8\" (UniqueName: \"kubernetes.io/projected/b282534b-b730-42a6-922a-c69ad5ab1b43-kube-api-access-4vql8\") on node \"crc\" DevicePath \"\"" Mar 14 10:47:35 crc kubenswrapper[5129]: I0314 10:47:35.889948 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b282534b-b730-42a6-922a-c69ad5ab1b43-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 10:47:36 crc kubenswrapper[5129]: I0314 10:47:36.025968 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nggq" event={"ID":"b282534b-b730-42a6-922a-c69ad5ab1b43","Type":"ContainerDied","Data":"f760d224f72bbcee2137f3eba03af4d4e1da496cc94f46de117fe91ae9b5f8f4"} Mar 14 10:47:36 crc kubenswrapper[5129]: I0314 10:47:36.026026 5129 scope.go:117] "RemoveContainer" containerID="4e5df2a1afc6e7c6d361986bd2ec57fedf57b1ed3e7f64fc60f4a5d4effefb16" Mar 14 10:47:36 crc kubenswrapper[5129]: I0314 10:47:36.026190 5129 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2nggq" Mar 14 10:47:36 crc kubenswrapper[5129]: I0314 10:47:36.064430 5129 scope.go:117] "RemoveContainer" containerID="b526dd8c424916755923fdbd81f61bdadaff09ca083ac7841e35c3103ddd80fb" Mar 14 10:47:36 crc kubenswrapper[5129]: I0314 10:47:36.070260 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2nggq"] Mar 14 10:47:36 crc kubenswrapper[5129]: I0314 10:47:36.083141 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2nggq"] Mar 14 10:47:36 crc kubenswrapper[5129]: I0314 10:47:36.093002 5129 scope.go:117] "RemoveContainer" containerID="76db5b3366bff9608f8f65e374e9e35cfadacf9aa2f44d5698546fe7e6a5de26" Mar 14 10:47:38 crc kubenswrapper[5129]: I0314 10:47:38.054840 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b282534b-b730-42a6-922a-c69ad5ab1b43" path="/var/lib/kubelet/pods/b282534b-b730-42a6-922a-c69ad5ab1b43/volumes" Mar 14 10:47:56 crc kubenswrapper[5129]: I0314 10:47:56.292138 5129 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jvcc8"] Mar 14 10:47:56 crc kubenswrapper[5129]: E0314 10:47:56.300051 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b282534b-b730-42a6-922a-c69ad5ab1b43" containerName="extract-content" Mar 14 10:47:56 crc kubenswrapper[5129]: I0314 10:47:56.300070 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b282534b-b730-42a6-922a-c69ad5ab1b43" containerName="extract-content" Mar 14 10:47:56 crc kubenswrapper[5129]: E0314 10:47:56.300081 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b282534b-b730-42a6-922a-c69ad5ab1b43" containerName="registry-server" Mar 14 10:47:56 crc kubenswrapper[5129]: I0314 10:47:56.300087 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b282534b-b730-42a6-922a-c69ad5ab1b43" containerName="registry-server" 
Mar 14 10:47:56 crc kubenswrapper[5129]: E0314 10:47:56.300106 5129 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b282534b-b730-42a6-922a-c69ad5ab1b43" containerName="extract-utilities" Mar 14 10:47:56 crc kubenswrapper[5129]: I0314 10:47:56.300113 5129 state_mem.go:107] "Deleted CPUSet assignment" podUID="b282534b-b730-42a6-922a-c69ad5ab1b43" containerName="extract-utilities" Mar 14 10:47:56 crc kubenswrapper[5129]: I0314 10:47:56.300758 5129 memory_manager.go:354] "RemoveStaleState removing state" podUID="b282534b-b730-42a6-922a-c69ad5ab1b43" containerName="registry-server" Mar 14 10:47:56 crc kubenswrapper[5129]: I0314 10:47:56.304855 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jvcc8" Mar 14 10:47:56 crc kubenswrapper[5129]: I0314 10:47:56.312544 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jvcc8"] Mar 14 10:47:56 crc kubenswrapper[5129]: I0314 10:47:56.501824 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbecac7-4314-4b9f-83fe-8ad31e00a241-utilities\") pod \"certified-operators-jvcc8\" (UID: \"6dbecac7-4314-4b9f-83fe-8ad31e00a241\") " pod="openshift-marketplace/certified-operators-jvcc8" Mar 14 10:47:56 crc kubenswrapper[5129]: I0314 10:47:56.501958 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbecac7-4314-4b9f-83fe-8ad31e00a241-catalog-content\") pod \"certified-operators-jvcc8\" (UID: \"6dbecac7-4314-4b9f-83fe-8ad31e00a241\") " pod="openshift-marketplace/certified-operators-jvcc8" Mar 14 10:47:56 crc kubenswrapper[5129]: I0314 10:47:56.502005 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v2ws\" (UniqueName: 
\"kubernetes.io/projected/6dbecac7-4314-4b9f-83fe-8ad31e00a241-kube-api-access-2v2ws\") pod \"certified-operators-jvcc8\" (UID: \"6dbecac7-4314-4b9f-83fe-8ad31e00a241\") " pod="openshift-marketplace/certified-operators-jvcc8" Mar 14 10:47:56 crc kubenswrapper[5129]: I0314 10:47:56.603816 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbecac7-4314-4b9f-83fe-8ad31e00a241-utilities\") pod \"certified-operators-jvcc8\" (UID: \"6dbecac7-4314-4b9f-83fe-8ad31e00a241\") " pod="openshift-marketplace/certified-operators-jvcc8" Mar 14 10:47:56 crc kubenswrapper[5129]: I0314 10:47:56.603916 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbecac7-4314-4b9f-83fe-8ad31e00a241-catalog-content\") pod \"certified-operators-jvcc8\" (UID: \"6dbecac7-4314-4b9f-83fe-8ad31e00a241\") " pod="openshift-marketplace/certified-operators-jvcc8" Mar 14 10:47:56 crc kubenswrapper[5129]: I0314 10:47:56.603957 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v2ws\" (UniqueName: \"kubernetes.io/projected/6dbecac7-4314-4b9f-83fe-8ad31e00a241-kube-api-access-2v2ws\") pod \"certified-operators-jvcc8\" (UID: \"6dbecac7-4314-4b9f-83fe-8ad31e00a241\") " pod="openshift-marketplace/certified-operators-jvcc8" Mar 14 10:47:56 crc kubenswrapper[5129]: I0314 10:47:56.604347 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbecac7-4314-4b9f-83fe-8ad31e00a241-utilities\") pod \"certified-operators-jvcc8\" (UID: \"6dbecac7-4314-4b9f-83fe-8ad31e00a241\") " pod="openshift-marketplace/certified-operators-jvcc8" Mar 14 10:47:56 crc kubenswrapper[5129]: I0314 10:47:56.604413 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6dbecac7-4314-4b9f-83fe-8ad31e00a241-catalog-content\") pod \"certified-operators-jvcc8\" (UID: \"6dbecac7-4314-4b9f-83fe-8ad31e00a241\") " pod="openshift-marketplace/certified-operators-jvcc8" Mar 14 10:47:56 crc kubenswrapper[5129]: I0314 10:47:56.626588 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v2ws\" (UniqueName: \"kubernetes.io/projected/6dbecac7-4314-4b9f-83fe-8ad31e00a241-kube-api-access-2v2ws\") pod \"certified-operators-jvcc8\" (UID: \"6dbecac7-4314-4b9f-83fe-8ad31e00a241\") " pod="openshift-marketplace/certified-operators-jvcc8" Mar 14 10:47:56 crc kubenswrapper[5129]: I0314 10:47:56.637480 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jvcc8" Mar 14 10:47:57 crc kubenswrapper[5129]: I0314 10:47:57.381340 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jvcc8"] Mar 14 10:47:58 crc kubenswrapper[5129]: I0314 10:47:58.258938 5129 generic.go:334] "Generic (PLEG): container finished" podID="6dbecac7-4314-4b9f-83fe-8ad31e00a241" containerID="5adee4a67d144d3e0ca9810ab0478aabb65a77d60b76196e870b74200d1a2ce8" exitCode=0 Mar 14 10:47:58 crc kubenswrapper[5129]: I0314 10:47:58.260431 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvcc8" event={"ID":"6dbecac7-4314-4b9f-83fe-8ad31e00a241","Type":"ContainerDied","Data":"5adee4a67d144d3e0ca9810ab0478aabb65a77d60b76196e870b74200d1a2ce8"} Mar 14 10:47:58 crc kubenswrapper[5129]: I0314 10:47:58.260490 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvcc8" event={"ID":"6dbecac7-4314-4b9f-83fe-8ad31e00a241","Type":"ContainerStarted","Data":"b2dea07b15644cc91555c927237c5cf8d809386dd3f375dd4d696633462b09cd"} Mar 14 10:48:00 crc kubenswrapper[5129]: I0314 10:48:00.161425 5129 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29558088-zn69k"] Mar 14 10:48:00 crc kubenswrapper[5129]: I0314 10:48:00.163882 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558088-zn69k" Mar 14 10:48:00 crc kubenswrapper[5129]: I0314 10:48:00.165959 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:48:00 crc kubenswrapper[5129]: I0314 10:48:00.166161 5129 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mnntf" Mar 14 10:48:00 crc kubenswrapper[5129]: I0314 10:48:00.166307 5129 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:48:00 crc kubenswrapper[5129]: I0314 10:48:00.175244 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558088-zn69k"] Mar 14 10:48:00 crc kubenswrapper[5129]: I0314 10:48:00.314462 5129 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm496\" (UniqueName: \"kubernetes.io/projected/3f6d89d0-7c26-48d3-9075-5a731e30c0a8-kube-api-access-vm496\") pod \"auto-csr-approver-29558088-zn69k\" (UID: \"3f6d89d0-7c26-48d3-9075-5a731e30c0a8\") " pod="openshift-infra/auto-csr-approver-29558088-zn69k" Mar 14 10:48:00 crc kubenswrapper[5129]: I0314 10:48:00.416629 5129 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm496\" (UniqueName: \"kubernetes.io/projected/3f6d89d0-7c26-48d3-9075-5a731e30c0a8-kube-api-access-vm496\") pod \"auto-csr-approver-29558088-zn69k\" (UID: \"3f6d89d0-7c26-48d3-9075-5a731e30c0a8\") " pod="openshift-infra/auto-csr-approver-29558088-zn69k" Mar 14 10:48:00 crc kubenswrapper[5129]: I0314 10:48:00.436954 5129 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm496\" (UniqueName: 
\"kubernetes.io/projected/3f6d89d0-7c26-48d3-9075-5a731e30c0a8-kube-api-access-vm496\") pod \"auto-csr-approver-29558088-zn69k\" (UID: \"3f6d89d0-7c26-48d3-9075-5a731e30c0a8\") " pod="openshift-infra/auto-csr-approver-29558088-zn69k"
Mar 14 10:48:00 crc kubenswrapper[5129]: I0314 10:48:00.483210 5129 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558088-zn69k"
Mar 14 10:48:01 crc kubenswrapper[5129]: I0314 10:48:01.148589 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558088-zn69k"]
Mar 14 10:48:03 crc kubenswrapper[5129]: I0314 10:48:03.318525 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558088-zn69k" event={"ID":"3f6d89d0-7c26-48d3-9075-5a731e30c0a8","Type":"ContainerStarted","Data":"47af5191929ef5b2a72781b5587485d401139ac3d03f033d13bc18e0db56b94b"}
Mar 14 10:48:04 crc kubenswrapper[5129]: I0314 10:48:04.332977 5129 generic.go:334] "Generic (PLEG): container finished" podID="6dbecac7-4314-4b9f-83fe-8ad31e00a241" containerID="5a4baed6c01a745ddf4c8c217dd5a8fb81bf9db54c6b5f51690e6752536be4b9" exitCode=0
Mar 14 10:48:04 crc kubenswrapper[5129]: I0314 10:48:04.333056 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvcc8" event={"ID":"6dbecac7-4314-4b9f-83fe-8ad31e00a241","Type":"ContainerDied","Data":"5a4baed6c01a745ddf4c8c217dd5a8fb81bf9db54c6b5f51690e6752536be4b9"}
Mar 14 10:48:04 crc kubenswrapper[5129]: I0314 10:48:04.342126 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558088-zn69k" event={"ID":"3f6d89d0-7c26-48d3-9075-5a731e30c0a8","Type":"ContainerStarted","Data":"50b6b53b3ea06649db868e07ba1d22669352dd23fc9207ce5d7f97a36d278968"}
Mar 14 10:48:04 crc kubenswrapper[5129]: I0314 10:48:04.391554 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558088-zn69k" podStartSLOduration=3.497109918 podStartE2EDuration="4.391534949s" podCreationTimestamp="2026-03-14 10:48:00 +0000 UTC" firstStartedPulling="2026-03-14 10:48:03.01416384 +0000 UTC m=+13745.766079024" lastFinishedPulling="2026-03-14 10:48:03.908588871 +0000 UTC m=+13746.660504055" observedRunningTime="2026-03-14 10:48:04.383104701 +0000 UTC m=+13747.135019915" watchObservedRunningTime="2026-03-14 10:48:04.391534949 +0000 UTC m=+13747.143450133"
Mar 14 10:48:05 crc kubenswrapper[5129]: I0314 10:48:05.356104 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvcc8" event={"ID":"6dbecac7-4314-4b9f-83fe-8ad31e00a241","Type":"ContainerStarted","Data":"0da27352503ab7b9caea79773128eba78c3bf9882add437abda0ff5176f96c6a"}
Mar 14 10:48:05 crc kubenswrapper[5129]: I0314 10:48:05.360195 5129 generic.go:334] "Generic (PLEG): container finished" podID="3f6d89d0-7c26-48d3-9075-5a731e30c0a8" containerID="50b6b53b3ea06649db868e07ba1d22669352dd23fc9207ce5d7f97a36d278968" exitCode=0
Mar 14 10:48:05 crc kubenswrapper[5129]: I0314 10:48:05.360332 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558088-zn69k" event={"ID":"3f6d89d0-7c26-48d3-9075-5a731e30c0a8","Type":"ContainerDied","Data":"50b6b53b3ea06649db868e07ba1d22669352dd23fc9207ce5d7f97a36d278968"}
Mar 14 10:48:05 crc kubenswrapper[5129]: I0314 10:48:05.435738 5129 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jvcc8" podStartSLOduration=2.725874502 podStartE2EDuration="9.435717968s" podCreationTimestamp="2026-03-14 10:47:56 +0000 UTC" firstStartedPulling="2026-03-14 10:47:58.263035719 +0000 UTC m=+13741.014950913" lastFinishedPulling="2026-03-14 10:48:04.972879185 +0000 UTC m=+13747.724794379" observedRunningTime="2026-03-14 10:48:05.375751098 +0000 UTC m=+13748.127666282" watchObservedRunningTime="2026-03-14 10:48:05.435717968 +0000 UTC m=+13748.187633152"
Mar 14 10:48:06 crc kubenswrapper[5129]: I0314 10:48:06.637845 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jvcc8"
Mar 14 10:48:06 crc kubenswrapper[5129]: I0314 10:48:06.638287 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jvcc8"
Mar 14 10:48:07 crc kubenswrapper[5129]: I0314 10:48:07.698059 5129 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-jvcc8" podUID="6dbecac7-4314-4b9f-83fe-8ad31e00a241" containerName="registry-server" probeResult="failure" output=<
Mar 14 10:48:07 crc kubenswrapper[5129]: timeout: failed to connect service ":50051" within 1s
Mar 14 10:48:07 crc kubenswrapper[5129]: >
Mar 14 10:48:07 crc kubenswrapper[5129]: I0314 10:48:07.848342 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558088-zn69k"
Mar 14 10:48:07 crc kubenswrapper[5129]: I0314 10:48:07.885535 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm496\" (UniqueName: \"kubernetes.io/projected/3f6d89d0-7c26-48d3-9075-5a731e30c0a8-kube-api-access-vm496\") pod \"3f6d89d0-7c26-48d3-9075-5a731e30c0a8\" (UID: \"3f6d89d0-7c26-48d3-9075-5a731e30c0a8\") "
Mar 14 10:48:07 crc kubenswrapper[5129]: I0314 10:48:07.894942 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f6d89d0-7c26-48d3-9075-5a731e30c0a8-kube-api-access-vm496" (OuterVolumeSpecName: "kube-api-access-vm496") pod "3f6d89d0-7c26-48d3-9075-5a731e30c0a8" (UID: "3f6d89d0-7c26-48d3-9075-5a731e30c0a8"). InnerVolumeSpecName "kube-api-access-vm496". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 10:48:07 crc kubenswrapper[5129]: I0314 10:48:07.988410 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm496\" (UniqueName: \"kubernetes.io/projected/3f6d89d0-7c26-48d3-9075-5a731e30c0a8-kube-api-access-vm496\") on node \"crc\" DevicePath \"\""
Mar 14 10:48:08 crc kubenswrapper[5129]: I0314 10:48:08.399970 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558088-zn69k" event={"ID":"3f6d89d0-7c26-48d3-9075-5a731e30c0a8","Type":"ContainerDied","Data":"47af5191929ef5b2a72781b5587485d401139ac3d03f033d13bc18e0db56b94b"}
Mar 14 10:48:08 crc kubenswrapper[5129]: I0314 10:48:08.400273 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47af5191929ef5b2a72781b5587485d401139ac3d03f033d13bc18e0db56b94b"
Mar 14 10:48:08 crc kubenswrapper[5129]: I0314 10:48:08.400057 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558088-zn69k"
Mar 14 10:48:08 crc kubenswrapper[5129]: I0314 10:48:08.936132 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558082-5cvw9"]
Mar 14 10:48:08 crc kubenswrapper[5129]: I0314 10:48:08.947099 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558082-5cvw9"]
Mar 14 10:48:10 crc kubenswrapper[5129]: I0314 10:48:10.067928 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27cddf18-c261-4270-b8be-c2f2a14250d7" path="/var/lib/kubelet/pods/27cddf18-c261-4270-b8be-c2f2a14250d7/volumes"
Mar 14 10:48:16 crc kubenswrapper[5129]: I0314 10:48:16.691658 5129 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jvcc8"
Mar 14 10:48:16 crc kubenswrapper[5129]: I0314 10:48:16.752712 5129 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jvcc8"
Mar 14 10:48:16 crc kubenswrapper[5129]: I0314 10:48:16.859757 5129 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jvcc8"]
Mar 14 10:48:16 crc kubenswrapper[5129]: I0314 10:48:16.935224 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-strv7"]
Mar 14 10:48:16 crc kubenswrapper[5129]: I0314 10:48:16.935625 5129 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-strv7" podUID="215cccb9-4c4f-4dc4-9c7a-607a4d8079b6" containerName="registry-server" containerID="cri-o://ec152d5d802793035ccc850a9e49efc93814f47cd39385fb1e9fb174a720eb2f" gracePeriod=2
Mar 14 10:48:17 crc kubenswrapper[5129]: I0314 10:48:17.518634 5129 generic.go:334] "Generic (PLEG): container finished" podID="215cccb9-4c4f-4dc4-9c7a-607a4d8079b6" containerID="ec152d5d802793035ccc850a9e49efc93814f47cd39385fb1e9fb174a720eb2f" exitCode=0
Mar 14 10:48:17 crc kubenswrapper[5129]: I0314 10:48:17.518715 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-strv7" event={"ID":"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6","Type":"ContainerDied","Data":"ec152d5d802793035ccc850a9e49efc93814f47cd39385fb1e9fb174a720eb2f"}
Mar 14 10:48:18 crc kubenswrapper[5129]: I0314 10:48:18.529803 5129 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-strv7" event={"ID":"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6","Type":"ContainerDied","Data":"13ad314c505d7e6a5c5f2a4b3ffb72245cc636c14ba1292d090a5f2e9b616843"}
Mar 14 10:48:18 crc kubenswrapper[5129]: I0314 10:48:18.530105 5129 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13ad314c505d7e6a5c5f2a4b3ffb72245cc636c14ba1292d090a5f2e9b616843"
Mar 14 10:48:18 crc kubenswrapper[5129]: I0314 10:48:18.601082 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-strv7"
Mar 14 10:48:18 crc kubenswrapper[5129]: I0314 10:48:18.720718 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6-catalog-content\") pod \"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6\" (UID: \"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6\") "
Mar 14 10:48:18 crc kubenswrapper[5129]: I0314 10:48:18.720820 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkvkp\" (UniqueName: \"kubernetes.io/projected/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6-kube-api-access-jkvkp\") pod \"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6\" (UID: \"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6\") "
Mar 14 10:48:18 crc kubenswrapper[5129]: I0314 10:48:18.721019 5129 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6-utilities\") pod \"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6\" (UID: \"215cccb9-4c4f-4dc4-9c7a-607a4d8079b6\") "
Mar 14 10:48:18 crc kubenswrapper[5129]: I0314 10:48:18.722127 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6-utilities" (OuterVolumeSpecName: "utilities") pod "215cccb9-4c4f-4dc4-9c7a-607a4d8079b6" (UID: "215cccb9-4c4f-4dc4-9c7a-607a4d8079b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 10:48:18 crc kubenswrapper[5129]: I0314 10:48:18.726796 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6-kube-api-access-jkvkp" (OuterVolumeSpecName: "kube-api-access-jkvkp") pod "215cccb9-4c4f-4dc4-9c7a-607a4d8079b6" (UID: "215cccb9-4c4f-4dc4-9c7a-607a4d8079b6"). InnerVolumeSpecName "kube-api-access-jkvkp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 10:48:18 crc kubenswrapper[5129]: I0314 10:48:18.777389 5129 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "215cccb9-4c4f-4dc4-9c7a-607a4d8079b6" (UID: "215cccb9-4c4f-4dc4-9c7a-607a4d8079b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 10:48:18 crc kubenswrapper[5129]: I0314 10:48:18.823147 5129 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkvkp\" (UniqueName: \"kubernetes.io/projected/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6-kube-api-access-jkvkp\") on node \"crc\" DevicePath \"\""
Mar 14 10:48:18 crc kubenswrapper[5129]: I0314 10:48:18.823182 5129 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 10:48:18 crc kubenswrapper[5129]: I0314 10:48:18.823192 5129 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 10:48:19 crc kubenswrapper[5129]: I0314 10:48:19.540761 5129 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-strv7"
Mar 14 10:48:19 crc kubenswrapper[5129]: I0314 10:48:19.592558 5129 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-strv7"]
Mar 14 10:48:19 crc kubenswrapper[5129]: I0314 10:48:19.605634 5129 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-strv7"]
Mar 14 10:48:20 crc kubenswrapper[5129]: I0314 10:48:20.051939 5129 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="215cccb9-4c4f-4dc4-9c7a-607a4d8079b6" path="/var/lib/kubelet/pods/215cccb9-4c4f-4dc4-9c7a-607a4d8079b6/volumes"
Mar 14 10:48:44 crc kubenswrapper[5129]: I0314 10:48:44.214652 5129 scope.go:117] "RemoveContainer" containerID="d595119d31c9998ff570446064f9119bcc8d28db644f39020118fe304c80c932"
Mar 14 10:48:44 crc kubenswrapper[5129]: I0314 10:48:44.285655 5129 scope.go:117] "RemoveContainer" containerID="12a6be0b93ad7fcc33126034ae9eb6e1fdf358c4b571963506c4b0e9b556e847"
Mar 14 10:48:44 crc kubenswrapper[5129]: I0314 10:48:44.318760 5129 scope.go:117] "RemoveContainer" containerID="ab8da096f25973d88d30b9c7cd076e230953e47ab295242f1f1613b05a251df4"
Mar 14 10:48:44 crc kubenswrapper[5129]: I0314 10:48:44.361789 5129 scope.go:117] "RemoveContainer" containerID="ec152d5d802793035ccc850a9e49efc93814f47cd39385fb1e9fb174a720eb2f"
Mar 14 10:48:49 crc kubenswrapper[5129]: I0314 10:48:49.574051 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 10:48:49 crc kubenswrapper[5129]: I0314 10:48:49.574568 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 10:49:19 crc kubenswrapper[5129]: I0314 10:49:19.574368 5129 patch_prober.go:28] interesting pod/machine-config-daemon-lf9lh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 10:49:19 crc kubenswrapper[5129]: I0314 10:49:19.574976 5129 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lf9lh" podUID="58bd6165-e663-4c4e-82ae-6009ff348000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"